For many, Hadoop is synonymous with big data. Hadoop is not a single application but a set of open source tools whose ultimate goal is to analyze large volumes of structured and unstructured data. The research firm IDC (International Data Corporation) conducted a study to learn how companies were combining big data analysis systems such as open source Hadoop with other solutions to get more value from their data. The survey, commissioned by Red Hat and entitled "What trends for Hadoop deployments in business," reveals that 32 percent of companies surveyed have already completed a Hadoop deployment, 31 percent intend to deploy Hadoop in the next 12 months, and 36 percent expect to deploy Hadoop more than a year from now.
Nearly 4 out of 10 managers surveyed indicated that they use big data technologies to innovate in products and services, in the context of modeling data to test scenarios. Less frequent uses of Hadoop include deployments that work in conjunction with SQL technologies. A significant proportion of companies use Hadoop to replace traditional data warehouse technologies. Finally, enterprises use Hadoop to analyze the large volumes of data generated by the Web.
The three main benefits of using Hadoop cited by the report are improved customer satisfaction, reduced development time, and reduced operating costs. The difficulties encountered in implementing Hadoop include costs, a lack of available skills, and the difficulty of choosing among technologies.
A recent Markets and Markets research report predicted that the worldwide Hadoop and big data analytics market will grow to about $13.9 billion by 2017, at a CAGR of 54.9% from 2012 to 2017. The market is expected to see growth in four Hadoop segments: Hadoop performance monitoring software, Hadoop management software, Hadoop application software, and Hadoop packaged software.
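As a quick sanity check of those projection figures, we can work backwards: assuming the 54.9% CAGR compounds annually over the five years from 2012 to 2017, the $13.9 billion 2017 figure implies a 2012 market of roughly $1.6 billion. A minimal sketch of that arithmetic (the report itself does not publish the 2012 base number; this is only what the stated CAGR implies):

```python
# Work backwards from the projected 2017 figure using the stated CAGR.
# Implied base = future value / (1 + CAGR) ** years
market_2017 = 13.9          # billions of USD (from the report)
cagr = 0.549                # 54.9% compound annual growth rate
years = 5                   # 2012 -> 2017

implied_2012 = market_2017 / (1 + cagr) ** years
print(f"Implied 2012 market: ${implied_2012:.2f} billion")  # about $1.56 billion
```

The same formula forwards (base × 1.549⁵ ≈ 8.9×) shows how aggressive a 54.9% CAGR is: the market is projected to grow almost ninefold in five years.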