
Is There a Big Data Bubble?

The hype and new investment dollars flowing toward Big Data are both vast. Will it live up to the anticipation?

Other examples include Google finding that search queries about the flu are a quicker way to predict where the flu is spreading than previous methods (such as hospital admission records); and data analytics firm Evolv learning that employees with a criminal record actually perform slightly better in the workplace than everyone else.

One of the places “unknown unknowns” are a major problem is security. There’s a reason zero-day threats are such a concern in the security community: if you don’t know what it is, how can you block it?

However, by applying Big-Data-driven pattern recognition to threat analysis, security companies are quickly closing the zero-day window.

“When a security incident happens, one question customers always have is about what happened in the moments leading up to a security event. In the past, we couldn’t always tell them, at least not right away,” said Mike Hrabik, president and CTO of managed security service provider Solutionary.

A problem with legacy security systems is that they can’t pull both structured and unstructured data into a single platform for analysis. Thus, when an event happens, it can take days or weeks of forensics work to figure out what happened.

To address this problem, Solutionary deployed a Hadoop platform from MapR and Cisco’s Unified Computing System for high-performance computing.

Hadoop has significantly increased the amount of data analysis and contextual data that Solutionary can access, which provides a greater view of attack indicators and a better understanding of attackers’ goals and techniques. This capability also enables Solutionary to more quickly identify global patterns across its client base.

When a new threat is discovered, Solutionary can now detect and analyze related activity across all clients within milliseconds. In the previous environment, even this seemingly simple task was considerably more difficult and costly, taking as long as 30 minutes even with a preplanned, optimized setup.
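The idea behind that speedup can be sketched in a few lines. This is a hypothetical illustration, not Solutionary’s actual system: once logs from every client, structured or unstructured, are normalized into a single indexed store, checking all clients for a newly discovered threat indicator collapses into one fast lookup instead of a per-client forensics pass.

```python
# Hypothetical sketch of a "unified platform" for mixed log data.
# All client names, log lines, and indicators below are invented examples.
from collections import defaultdict

# Inverted index: token seen in a log record -> set of clients that logged it.
index = defaultdict(set)

def ingest(client, record):
    """Normalize a structured or unstructured log record into the shared index."""
    for token in record.lower().split():
        index[token].add(client)

# Mixed log data from several clients (structured fields and free text alike).
ingest("client-a", "GET /login.php user=admin status=200")
ingest("client-b", "firewall alert: outbound connection to badhost.example")
ingest("client-c", "GET /index.html status=404")

def clients_with_indicator(indicator):
    """Which clients show a newly discovered indicator? A single index lookup."""
    return sorted(index[indicator.lower()])

print(clients_with_indicator("badhost.example"))  # ['client-b']
```

The design choice is the point: the cost of building the index is paid once at ingest time, so a new zero-day indicator can be checked across the entire client base without re-reading any raw logs.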

“In the past, due to the sheer amount of data, our analysts were often limited to examining log data, which misses a lot. Now, our experts are unshackled. They can see the big picture and can put the incident into context,” Hrabik said.

Big-Data-driven innovation

Big Data is teaching us more about ourselves each and every day. Facebook may now know when you’ll be entering a romantic relationship before you do, simply based on your online activities. After the Super Bowl, Pornhub proudly told us what we already could have guessed: the fans of the losing team had to find other ways to entertain themselves, since the party for them was over. Less frivolously, analysis of search engine queries can now discover unknown drug side effects.

Many of the Big Data insights you’ll hear about on the news are just noise. They’re publicity tools that add little meaning to our lives.

But that probably tells us more about ourselves than about the capabilities of Big Data. (The Pornhub study showed up everywhere. Conversely, I don’t recall a single mention of the drug study; I just learned about it doing research for this story.)

Moreover, Big Data is already spawning startups looking to do everything from identifying the key influencers in social networks (in order to focus marketing efforts on them) to understanding social value in gaming.

The main difference I see between the dotcom boom and the Big Data one is that most dotcom companies targeted consumers first (as did laptops, WiFi, smartphones, tablets, etc.). The Consumerization of IT trend has taught us that consumer dollars are often easier to capture than business ones, especially for new, unproven tech products.

Big Data is the opposite. There are no real Big Data tools for consumers. The game is in the enterprise, yet the enterprise is far more cautious than your average consumer.

What will this mean?

My guess is that it means Big Data will evolve more slowly and sanely than many preceding trends. Fewer science projects will get funded. We’ll skip a dotcom-sized bubble, and companies will soon know a heck of a lot more about us than we know about them.

History may not be repeating, but I can certainly hear plenty of rhymes.


Jeff Vance is a technology journalist based in Santa Monica, California. Connect with him on LinkedIn, follow him on Twitter @JWVance, or add him to your cloud computing circle on Google Plus.


Originally published on www.datamation.com.
