3 Best Practices For Big Data Projects
Submitted by Heather Clancy
The Opportunity
Before Think Big Analytics even existed, Facebook approached its founder, Ron Bodkin, to make sense of Hadoop. Now, the big data integrator's client list includes eBay, EMC, Intel, Johnson & Johnson, NASDAQ, NetApp and Western Digital. It is also aligned with pretty much every big-name analytics and enterprise integration platform, including Cloudera, Greenplum, Hortonworks, MapR, Pivotal and many others.
Bodkin and his team of engineers had been thinking about big data since before we called it that. Many worked together at data science service provider Quantcast and, before that, at consulting company C-bridge. "We've always been big believers in the idea that Big Data is going to have a huge impact on organizations and on the economy," Bodkin said, "that it's going to allow for more efficiency in a whole range of areas: better customer interactions, more thoughtful product innovation and feedback, driving new efficiencies in financial matters."
To put the size of this market in perspective: global revenue for servers that support high-performance data analysis is expected to grow at a rate of 23.5 percent between now and 2018, reaching $2.7 billion by the end of the forecast period, according to a June 2014 IDC forecast. The related storage market is expected to reach $1.6 billion in the same time frame. With that in mind, I asked Think Big to share some tips on how businesses of all sizes can unleash more value from the data they've already collected.
(Image courtesy of Think Big Analytics)