
Big data application development: An introduction to Hadoop

Submitted on August 12, 2017 – 10:16 am

Big data application development is challenging, however. With the growth of mobile, social media, and the internet of things, the volume of data that enterprises collect has been increasing. “Traditional database management systems (DBMSs) do not easily scale to support very large data sets,” noted Geneva Lake, vice president of worldwide alliances at MapR Technologies Inc. Old-school systems also do not work well with unstructured information such as video.

Big Data

A new generation of DBMS technology emerged to fill the gaps. Hadoop began as the Google File System, an idea first discussed in the fall of 2003. By early 2006, the work had evolved into an open source project, and development was turned over to the Apache Software Foundation. Hadoop is an open source database management system for processing large data sets using the MapReduce programming model. The software runs on clusters of commodity hardware. Leading Hadoop distributions come from vendors such as Cloudera Inc., Hortonworks Inc. and MapR Technologies, all of which run partner programs for channel companies.
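To make the MapReduce programming model mentioned above concrete, here is a minimal sketch of its three phases (map, shuffle, reduce) applied to the classic word-count example. This is a plain-Python illustration of the model, not the Hadoop API itself; in a real Hadoop job, the framework distributes the map and reduce tasks across the cluster and performs the shuffle between them.

```python
from collections import defaultdict

def map_phase(document):
    # Map: emit a (word, 1) pair for every word in the input split.
    for word in document.split():
        yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group all emitted values by key, as the framework
    # does automatically between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: combine the grouped values into one result per key.
    return {key: sum(values) for key, values in groups.items()}

# Each string stands in for one input split on the cluster.
splits = ["big data needs big clusters", "hadoop processes big data"]
pairs = [pair for doc in splits for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
print(counts["big"])   # 3
print(counts["data"])  # 2
```

Because each map call touches only its own split and each reduce call touches only one key's values, both phases parallelize naturally across commodity machines, which is what lets Hadoop scale where traditional DBMSs struggle.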

To read the entire article, follow this link: http://searchitchannel.techtarget.com/feature/Big-data-application-development-An-introduction-to-Hadoop