
Is There a Big Data Bubble?

The hype and new investment dollars flowing toward Big Data are both vast. Will it live up to the anticipation?

Other examples include Google finding that search queries about the flu are a quicker way to predict where the flu is spreading than previous methods (such as hospital admission records); and data analytics firm Evolv learning that employees with a criminal record actually perform slightly better in the workplace than everyone else.

One place where “unknown unknowns” are a major problem is security. There’s a reason zero-day threats are such a concern in the security community: if you don’t know what a threat is, how can you block it?

However, by applying Big-Data-driven pattern recognition to threat analysis, security companies are quickly closing the zero-day window.

“When a security incident happens, one question customers always have is about what happened in the moments leading up to a security event. In the past, we couldn’t always tell them, at least not right away,” said Mike Hrabik, president and CTO of managed security services provider Solutionary.

One problem with legacy security systems is that they can’t pull both structured and unstructured data into a single platform for analysis. Thus, when an event happens, it can take days or weeks of forensics work to figure out what happened.
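To make that contrast concrete, here is a minimal, hypothetical sketch (not Solutionary’s actual pipeline) of normalizing one structured source and one unstructured source into a shared record format, so a single query can cover both:

```python
import re
from datetime import datetime, timezone

def normalize_firewall_csv(line):
    """Structured source: 'timestamp,src_ip,dst_ip,action' CSV rows."""
    ts, src, dst, action = line.strip().split(",")
    return {"time": ts, "src_ip": src, "dst_ip": dst,
            "action": action, "source": "firewall"}

def normalize_syslog(line):
    """Unstructured source: free-text syslog; pull out any IPv4 address."""
    match = re.search(r"\b(\d{1,3}(?:\.\d{1,3}){3})\b", line)
    return {"time": datetime.now(timezone.utc).isoformat(),
            "src_ip": match.group(1) if match else None,
            "dst_ip": None, "action": "log", "source": "syslog",
            "raw": line.strip()}

# Both record types now share a schema, so one query spans them.
events = [
    normalize_firewall_csv("2014-02-10T12:00:01Z,10.0.0.5,198.51.100.7,DENY"),
    normalize_syslog("Feb 10 12:00:03 host sshd: failed login from 10.0.0.5"),
]
suspicious = [e for e in events if e["src_ip"] == "10.0.0.5"]
print(len(suspicious))  # 2 -- one structured hit, one unstructured hit
```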

To address this problem, Solutionary deployed a Hadoop platform from MapR and Cisco’s Unified Computing System for high-performance computing.

Hadoop has significantly increased the amount of data analysis and contextual data that Solutionary can access, which provides a greater view of attack indicators and a better understanding of attackers’ goals and techniques. This capability also enables Solutionary to more quickly identify global patterns across its client base.

When new threats are discovered, Solutionary can now detect and analyze activity across all clients within milliseconds. With its previous infrastructure, even this seemingly simple task was considerably more difficult and costly, taking as long as 30 minutes even in a preplanned, optimized environment.
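As an illustration only: once events from every client live in one indexed store, checking a newly discovered indicator of compromise becomes a lookup rather than a per-client forensics exercise. A toy sketch follows, with invented field names and in-memory data standing in for a Hadoop-scale store:

```python
from collections import defaultdict

# Hypothetical index: indicator value -> events observed across clients.
# In a real deployment this would live in Hadoop/HBase or similar, not in
# process memory; the sample events below are invented for illustration.
indicator_index = defaultdict(list)

events = [
    {"client": "acme-corp", "src_ip": "203.0.113.9",  "detail": "beacon to C2"},
    {"client": "globex",    "src_ip": "203.0.113.9",  "detail": "beacon to C2"},
    {"client": "initech",   "src_ip": "198.51.100.4", "detail": "port scan"},
]
for event in events:
    indicator_index[event["src_ip"]].append(event)

def clients_matching(indicator):
    """Given a newly discovered indicator, return every affected client."""
    return sorted({e["client"] for e in indicator_index.get(indicator, [])})

print(clients_matching("203.0.113.9"))  # ['acme-corp', 'globex']
```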

“In the past, due to the sheer amount of data, our analysts were often limited to examining log data, which misses a lot. Now, our experts are unshackled. They can see the big picture and can put the incident into context,” Hrabik said.

Big-Data-driven innovation

Big Data is teaching us more about ourselves each and every day. Facebook may now know when you’ll be entering a romantic relationship before you do, simply based on your online activities. After the Super Bowl, Pornhub proudly told us what we already could have guessed: the fans of the losing team had to find other ways to entertain themselves, since the party for them was over. Less frivolously, analysis of search engine queries can now discover unknown drug side effects.

Many of the Big Data insights you’ll hear about on the news are just noise. They’re publicity tools that add little meaning to our lives.

But that probably tells us more about ourselves than about the capabilities of Big Data. (The Pornhub study showed up everywhere. Conversely, I don’t recall a single mention of the drug study; I just learned about it doing research for this story.)

Moreover, Big Data is already spawning startups looking to do everything from identifying the key influencers in social networks (in order to focus marketing efforts on them) to understanding social value in gaming.

The main difference I see between the dotcom boom and the Big Data one is that most dotcom companies targeted consumers first (as did laptops, WiFi, smartphones, tablets, etc.). The Consumerization of IT trend has taught us that consumer dollars are often easier to capture than business ones, especially for new, unproven tech products.

Big Data is the opposite. There are no real Big Data tools for consumers. The game is in the enterprise, yet the enterprise is far more cautious than your average consumer.

What will this mean?

My guess is that it means Big Data will evolve more slowly and sanely than many preceding trends. Fewer science projects will get funded. We’ll skip a dotcom-sized bubble, and companies will soon know a heck of a lot more about us than we know about them.

History may not be repeating, but I can certainly hear plenty of rhymes.

##

Jeff Vance is a technology journalist based in Santa Monica, California. Connect with him on LinkedIn, follow him on Twitter @JWVance, or add him to your cloud computing circle on Google Plus.



Originally published on www.datamation.com.




