Notes from the Expo Floor
GTC 2014 and the Power of GPU Computing
Brian Kirk
MAR 04, 2014, 5:22 PM

Registration for Nvidia’s GPU Technology Conference (GTC 2014) is now open. As an IEEE Computer Society reader, you can receive a 20% discount off the list price by registering with the coupon code “GM20IEEE”: http://www.gputechconf.com/page/registration-pricing.html.

To this day, friends outside the industry occasionally ask me why GPUs are important to science in general and computational science in particular. I usually tell a story: dollars attracted the hardware, creative content (specifically movie special effects and video game graphics) drove the industry, and then astute scientists repurposed GPUs for the compute-intensive parts of their applications and started doing science.

The actual story is more nuanced, and I’m sure I don’t have the entire history of how GPUs became such an important factor in scientific computation, but the above story is more or less correct. (To be fair, I’m sure that there were some monetary restrictions and funding issues involved somewhere along the way, driving researchers to look for efficient alternatives to expensive CPU clusters.)

Regardless of the details, it came down to this: given the flops-per-dollar ratio possible with GPU clusters, scientists began to realize that GPU computing could make scientific computation far more cost-efficient.

Nvidia took note. In 2007, it began streamlining the process for researchers by releasing CUDA, a parallel computing platform that lets scientists program GPUs from C, C++, and Fortran. The company has since added a hands-on, cloud-based computing lab for those interested in learning best practices in C, C++, Fortran, and Python: https://nvidia.qwiklab.com/.

While the Computer Society (and Computing in Science & Engineering specifically) has discussed at length some of the ways GPUs help in scientific computing, Nvidia has done an impressive job of getting the word out about how best to use them, increasing the value and applicability of GPUs in general.

This year, CiSE is a media sponsor of the Nvidia GPU conference for one main reason: Nvidia has been making inroads into the scientific community, enabling cutting-edge research, and giving researchers a platform for connecting with one another. From a personal point of view, it’s exciting to see so much work in computational physics, bioinformatics, and analytics showcased by the company responsible for the hardware itself (maybe I’ve just grown accustomed to seeing it studied by nonprofits, but that’s another issue entirely).

GTC 2014 will be held March 24–27 in San Jose, California, and will feature more than 400 sessions, tutorials, labs, and posters. More information is available at www.nvidia.com/gtc.
