Meet the Supercharged Future of Big Data: GPU Databases

By Gidon Ben-Zvi

5.29.2017

In case you missed it, the GPU Technology Conference (GTC), held May 8-11, showcased some of the most vital work in the computing industry today, including deep learning, Big Data analytics, and self-driving cars. The event brought together the brightest scientists, developers, graphic artists, designers, researchers, engineers, and IT managers who use GPUs to tackle computational and graphics challenges.

Frenemies: GPUs Vs. CPUs

Unlike central processing units (CPUs), graphics processing units (GPUs) can run thousands of parallel processing tasks simultaneously, allowing large amounts of data to be processed much faster. And even though the GPU was originally created for 3D game rendering, its capabilities now extend well beyond image rendering.

Time is Money: A growing number of data scientists are using GPUs for Big Data analytics to make better, real-time business decisions

SQream GPU TPC-xx Benchmarks

GPUs also deliver significantly greater analytics acceleration on a fraction of the hardware required by CPU-only solutions. A GPU can be compared to a coin press that punches out 100 coins in one operation from a single sheet of metal, whereas a CPU is a coin press that punches out 10 coins at a time from a strip of metal. The GPU is therefore well suited to operations that apply the same instruction to large amounts of data at once, as the sketch below illustrates.
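To make the coin-press analogy concrete, here is a minimal CUDA sketch of that pattern; the kernel name, sizes, and data are illustrative, not taken from any particular product. Each GPU thread applies one and the same instruction to its own element of a million-element array, so the whole "sheet" is stamped in a single parallel launch:

    #include <cstdio>
    #include <cuda_runtime.h>

    // Each thread applies the same instruction to one array element --
    // the GPU "punches out" many results in one parallel operation.
    __global__ void scale_kernel(const float* in, float* out, float factor, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) out[i] = in[i] * factor;
    }

    int main() {
        const int n = 1 << 20;                 // ~1 million elements
        size_t bytes = n * sizeof(float);

        float* h_in  = (float*)malloc(bytes);
        float* h_out = (float*)malloc(bytes);
        for (int i = 0; i < n; ++i) h_in[i] = (float)i;

        float *d_in, *d_out;
        cudaMalloc(&d_in, bytes);
        cudaMalloc(&d_out, bytes);
        cudaMemcpy(d_in, h_in, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements at once.
        int threads = 256, blocks = (n + threads - 1) / threads;
        scale_kernel<<<blocks, threads>>>(d_in, d_out, 2.0f, n);
        cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);

        printf("out[12345] = %.1f\n", h_out[12345]);  // expect 24690.0

        cudaFree(d_in); cudaFree(d_out); free(h_in); free(h_out);
        return 0;
    }

A CPU would typically walk the same array a few elements at a time; here the GPU covers it with 4,096 blocks of 256 threads in one launch, which is the essence of the coin-press advantage.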

As a result, companies and coders are now moving workloads off the main CPU and onto a wide range of alternative processors.

 

Analytics on Steroids: Where GPUs are Being Injected

Leveraging GPUs for analytical workloads is on the rise, particularly among telcos, ad-tech companies, financial services firms, and retail organizations, which often deal with extremely large data volumes and demand high scalability and real-time processing.

Nowadays, GPUs are being pushed into the massive data centers that underpin the likes of Google, Facebook, Microsoft, and Amazon. Because their sweeping online services can no longer run all tasks on CPUs alone, these companies are moving major processing loads onto GPUs, programmable chips called FPGAs, and even custom-built chips.

Indeed, the shift to GPUs is sending ripples across the worldwide chip market. The fortunes of companies such as NVIDIA, the world's largest GPU manufacturer, and AMD are on the upswing. And Intel, which isn't a big player in GPUs, has spent billions acquiring companies that make FPGAs and various AI chips.

Growth Mode: Fundamental shifts in how workloads are structured have resulted in GPU performance growth far outpacing memory capacity

 

Need for Speed: How GPUs are Power-Boosting Big Data

Big Data systems are built to handle data-intensive applications. Now, as large-scale machine learning and streaming play a growing role in the enterprise, these systems need more computational power. Tools such as those offered by SQream Technologies provide a high-performing, next-generation GPU-powered SQL database designed to quickly relieve Big Data and complex analytics pains, so businesses can interact dynamically with their digital assets. With minimal cost, hardware, and infrastructure changes required, SQream's patented GPU database software enables businesses to easily ingest, store, and analyze enormously large datasets in near real-time. The advanced GPU database is a remarkably high-performing, cost-effective Big Data Analytics platform that can be deployed on-premises, in the cloud, or as a hybrid of the two.
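To picture how a GPU database can turn a SQL query into data-parallel work, here is a hedged CUDA sketch of a filtered aggregation, roughly what a query like "SELECT SUM(amount) FROM sales WHERE amount > 90" could map to on the device. This is an illustration only, not SQream's actual engine; the kernel name, column name, and data are invented for the example:

    #include <cstdio>
    #include <cuda_runtime.h>

    // Illustrative only (not SQream's engine): every thread tests the SQL
    // predicate on its own row, and qualifying values are combined into a
    // single running total with an atomic add.
    __global__ void filtered_sum(const float* amount, int n, float threshold, float* total) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n && amount[i] > threshold)
            atomicAdd(total, amount[i]);
    }

    int main() {
        const int n = 1 << 20;
        size_t bytes = n * sizeof(float);

        float* h_amount = (float*)malloc(bytes);
        for (int i = 0; i < n; ++i) h_amount[i] = (float)(i % 100);  // synthetic "sales" column

        float *d_amount, *d_total;
        cudaMalloc(&d_amount, bytes);
        cudaMalloc(&d_total, sizeof(float));
        cudaMemcpy(d_amount, h_amount, bytes, cudaMemcpyHostToDevice);
        cudaMemset(d_total, 0, sizeof(float));

        // One thread per row: predicate and aggregation run across the
        // whole column at once.
        int threads = 256, blocks = (n + threads - 1) / threads;
        filtered_sum<<<blocks, threads>>>(d_amount, n, 90.0f, d_total);

        float h_total = 0.0f;
        cudaMemcpy(&h_total, d_total, sizeof(float), cudaMemcpyDeviceToHost);
        printf("sum of rows with amount > 90: %.0f\n", h_total);

        cudaFree(d_amount); cudaFree(d_total); free(h_amount);
        return 0;
    }

A production engine would typically use per-block reductions rather than one atomic counter, but the shape is the same: one thread per row, with the whole column scanned in a single parallel pass.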

 

Conclusion: See You at the Next GPU Technology Conference…

GPU technologies are driving remarkable breakthroughs in important fields. While the annual GTC attracts developers, researchers, and technologists from top companies, universities, research firms, and government agencies around the world, the GPU's potential to disrupt Big Data is especially intriguing. Some 2.5 exabytes of digital data are produced daily, and that volume is expected to double every three years. The key question: how do we triage data arriving at that velocity and make sense of its content? The answer: machines that can see, hear, read, and reason at superhuman levels and superhuman pace. The means: GPUs.