When you hear about Big Data, Hadoop hype follows almost automatically. But people often ask me what Hadoop actually does. Developed at Yahoo and modeled on Google’s published approach to indexing the Internet, Hadoop is not a data warehouse or storage solution: it’s a tool that’s useful when information can be broken up, analyzed in pieces, and put back together.
For example, if a chain of convenience stores needs to find out how many customers used MasterCard, Visa, American Express, or cash at the pump in the past year, it can use Hadoop to retrieve that information, because the work can be divided up and managed in pieces, location by location, without affecting the big picture.
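To see why that kind of question splits so cleanly, here is a toy, single-machine sketch of the map-and-reduce pattern Hadoop applies at scale; the store IDs and payment types below are invented for illustration.

```python
from collections import Counter

# Toy per-store transaction logs; store IDs and payment types are made up.
stores = {
    "store_0001": ["visa", "cash", "visa"],
    "store_0002": ["mastercard", "amex", "cash"],
}

def map_store(payments):
    """'Map' step: one store's log can be counted without seeing any other store."""
    return Counter(payments)

def reduce_counts(partials):
    """'Reduce' step: partial counts simply add up, so where or when each piece
    was processed makes no difference to the final answer."""
    total = Counter()
    for partial in partials:
        total += partial
    return total

partial_counts = [map_store(log) for log in stores.values()]  # independent pieces
print(reduce_counts(partial_counts))
# Counter({'visa': 2, 'cash': 2, 'mastercard': 1, 'amex': 1})
```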
However, if you’re working with data that requires examining the relationships and dependencies within it, you can’t just look at it in pieces and still get the “big picture” of what the data is telling you. Back to the previous example: this approach would fail if the chain wants to know which food and beverages are being purchased together in rural vs. urban locations, and how weather affects those buying patterns.
The hype around Hadoop makes it seem like a one-size-fits-all solution for leveraging big data, but the reality is that not all problems are Hadoop-able, and more and more business users are learning that. Jaikumar Vijayan of Computerworld wrote, “Hadoop isn’t enough anymore for enterprises that need new and faster ways to extract business value from massive datasets.”
Time is a major factor, but what about requiring an IT army to run Hadoop? Steve Rosenbush wrote in The Wall Street Journal about how GameStop CIO Jeff Donaldson “picked a more traditional approach for analyzing large amounts of customer data, because he didn’t want to manage the complexity of having his engineers learn Hadoop, or have to call in consultants for help.”
While Hadoop is an effective and low-cost tool for some companies, it is not an application, and it does not get business users any closer to the most critical part of Big Data: getting to the insight. Hadoop chops and dices and stores, but it does not make a consumable dish! That leaves many users still waiting on the value of Big Data, including:
- Law enforcement and intelligence agencies seeking insight from their data to mitigate threats to public safety.
- Healthcare institutions trying to predict disease outbreaks or customize treatments to diseases.
- Retailers wanting insight into demand trends and customer buying patterns to serve their markets more profitably.
- Supply chain professionals wanting insight into their data to understand cause-and-effect across the nodes of the chain.
Take healthcare, for example. You have nodes for people, medicines, symptoms, and side effects. To determine the type of person most likely to have the fewest side effects from a certain medication, one needs to leverage the patterns of connections within the data rather than break those connections apart into disparate clusters. This is the kind of analysis that does not lend itself to being partitioned in a Hadoop-able way.
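As a rough illustration of what “leveraging the patterns of connections” can look like, here is a tiny sketch over an invented graph; the patients, traits, drug name, and side effects are all hypothetical.

```python
from collections import defaultdict

# A toy graph over hypothetical nodes: patients, traits, one medication, side effects.
edges = [
    ("patient_1", "trait:age_30s"), ("patient_1", "drug_x"),
    ("patient_2", "trait:age_60s"), ("patient_2", "drug_x"),
    ("patient_2", "effect:nausea"), ("patient_2", "effect:dizziness"),
    ("patient_3", "trait:age_30s"), ("patient_3", "drug_x"),
    ("patient_3", "effect:nausea"),
]

graph = defaultdict(set)
for a, b in edges:
    graph[a].add(b)
    graph[b].add(a)

# Walk the connections: for everyone linked to drug_x, relate their traits to how
# many side-effect nodes they touch. The answer lives in the pattern of edges,
# so the graph is queried as a whole rather than shard by shard.
effects_by_trait = defaultdict(list)
for patient in graph["drug_x"]:
    n_effects = sum(1 for n in graph[patient] if n.startswith("effect:"))
    for trait in (n for n in graph[patient] if n.startswith("trait:")):
        effects_by_trait[trait].append(n_effects)

for trait, counts in sorted(effects_by_trait.items()):
    print(trait, sum(counts) / len(counts))   # lower average = fewer reported side effects
```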
Law enforcement agencies have data composed of people, organizations, places, and words. These connection points, when kept intact rather than Hadoop-ed apart, can reveal networks of interest and the key influencers within those networks. But the value lies in the data that’s sparse, so it needs to be assessed all at the same time instead of as distributed fragments.
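A small sketch of the “key influencer” idea, again over made-up people, organizations, and places: even a basic degree-centrality count only becomes meaningful once every edge, including the rare cross-cluster links, sits in the same graph.

```python
from collections import defaultdict

# Invented, illustrative links between people, organizations, and places.
links = [
    ("alice", "org_a"), ("bob", "org_a"), ("alice", "place_1"),
    ("carol", "org_b"), ("dave", "org_b"), ("carol", "place_1"),
    ("alice", "carol"),   # a single sparse link tying two clusters together
]

graph = defaultdict(set)
for a, b in links:
    graph[a].add(b)
    graph[b].add(a)

# Degree centrality: how many distinct connections each node has. The bridging
# nodes (here "alice" and "carol") only stand out when every edge, including
# the rare cross-cluster ones, is loaded into the same graph.
centrality = {node: len(neighbors) for node, neighbors in graph.items()}
for node, degree in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(node, degree)
```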
Though Hadoop is getting a lot of attention, it may not always be the best approach to crunching Big Data for strategic insights. “[Hadoop] is simply too slow for companies that need sub-millisecond query response times,” wrote Vijayan. Whether it’s healthcare, manufacturing, retail, or banking, companies with data that can be represented as a “big picture” need effective solutions beyond the methods they are likely using now, such as spreadsheets built on guesstimates and intuition.
Original Source: http://smartdatacollective.com/radhikaatemcien/104206/de-mystifying-hadoop-not-all-problems-are-hadoop-able