
Beyond the Delta: Compression is a Must for Big Data

Submitted on October 9, 2018 – 3:07 am

In an era of big data, high-speed, reliable, cheap and scalable databases are no luxury. Our friends over at SQream Technologies invest a lot of time and effort into providing their customers with the best performance at scale. As such, SQream DB (the GPU data warehouse) uses state-of-the-art HPC techniques. Some of these techniques adapt existing algorithms to recent technological advances, while others are developed in-house.


Dr. Benjamin C. van Zuiden of SQream wrote a special report, “Beyond the Delta: Compression is a Must for Big Data,” that focuses on the compression algorithms that make big data at scale possible. In data and signal processing, data compression is the process of encoding information using fewer bits than the original representation. Compression saves disk space and reduces the I/O or bandwidth needed to move data (e.g., over the internet, or from storage to RAM).
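To make the idea concrete, here is a minimal sketch (not SQream's implementation) of the kind of delta-plus-compression scheme the report's title alludes to: a column of monotonically increasing timestamps is delta-encoded before being handed to a general-purpose compressor. The sample data and sizes are purely illustrative.

```python
import struct
import zlib

# Hypothetical column of monotonically increasing timestamps (seconds).
timestamps = list(range(1_500_000_000, 1_500_100_000, 5))

# Raw representation: 8 bytes per 64-bit value.
raw = struct.pack(f"<{len(timestamps)}q", *timestamps)

# Delta encoding: keep the first value, then store differences between
# consecutive values. Small, repetitive deltas compress far better.
deltas = [timestamps[0]] + [b - a for a, b in zip(timestamps, timestamps[1:])]
delta_bytes = struct.pack(f"<{len(deltas)}q", *deltas)

# General-purpose compression on both representations.
print("raw size:          ", len(raw))
print("zlib(raw):         ", len(zlib.compress(raw)))
print("zlib(delta-coded): ", len(zlib.compress(delta_bytes)))
```

On data like this, the delta-coded stream compresses to a small fraction of the raw column, which is why columnar databases lean so heavily on encodings of this kind before anything ever touches disk or the network.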

To read the entire article, please visit https://insidebigdata.com/2018/10/08/beyond-delta-compression-must-big-data/