
Has Hadoop Outlived Its Usefulness?

Submitted on October 31, 2017 – 12:48 am

Hadoop has been the analytics platform of choice for big data workloads in areas such as genomics, machine learning, artificial intelligence (AI), and nearly anything else that requires sifting through massive amounts of data. These applications depend on large data sets that continue to grow in size and number, demanding extremely high throughput to process the data as quickly as possible.


Data must first be loaded into Hadoop’s file system (HDFS) from a central data store, which generates large amounts of network traffic and adversely affects network performance for other applications and users. Besides being inefficient, moving data also requires additional administrative oversight. While still a powerful platform, Hadoop and HDFS are beginning to show their age, and they may quickly fade as IT organizations adopt new software-based storage architectures, deploy inexpensive NVM Express (NVMe) flash devices, and upgrade networks to run 10 to 100 times faster.
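The data-movement step described above is the standard HDFS ingest workflow; a minimal sketch using the stock `hdfs dfs` shell on a running Hadoop cluster (the paths and file name here are hypothetical, chosen for illustration):

```shell
# Hypothetical layout: /data/source is the central store mounted locally,
# /user/analytics is the target HDFS directory. Every load copies the
# full dataset over the network into HDFS before any job can touch it.
hdfs dfs -mkdir -p /user/analytics                      # create the target directory
hdfs dfs -put /data/source/events.log /user/analytics/  # copy the file into HDFS
hdfs dfs -ls /user/analytics                            # verify the file landed
```

At genomics or machine-learning scale, that `-put` step can mean terabytes crossing the network before each run, which is exactly the traffic burden the article describes.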

To read the entire article, please click on this link.