IT in 2014: One Big Cloud
As the managing editor of Datamation, I track the dizzying change-fest that is the current IT scene. It's a lot to keep up with. Over the last few years, the model of slowly overlapping tech eras – as mainframes gave way to client-server – has vanished. Instead of one monolithic technology, we now see a semi-chaotic patchwork of trends that push and pull each other in countless ways.
Call it the Great Tech Mash-Up. VMware, the virtualization leader, created its own social media platform. Search king Google promotes an operating system, Android, that runs mobile phones. Citrix, an enterprise software outfit, sells an app that allows IT pros to log in using a smartphone. Everything bridges with everything else; this is the age of the API, which allows software to speak to software without assistance from us dumb humans. These days the stand-alone is fading fast, and every last holdout realizes it. Microsoft, which once fought Linux as the Great Scourge, now boasts about how fast its Azure platform can run Linux boxes.
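That "software speaking to software" is worth making concrete. A minimal sketch, in Python: one service hands another a structured JSON payload over an API, and the consuming program parses and acts on it directly, with no human retyping anything in between. The payload and field names here are invented for illustration, not drawn from any particular vendor's API.

```python
import json

# A hypothetical JSON response, as one cloud service might return it to
# another over a REST API. The fields are made up for this example.
response_body = '{"instance_id": "i-1234", "state": "running", "zone": "us-east-1a"}'

# The consuming program parses the structured payload directly --
# machine-readable data, no human in the loop.
instance = json.loads(response_body)

if instance["state"] == "running":
    print("Instance", instance["instance_id"], "is up in", instance["zone"])
```

Multiply that tiny exchange by millions of calls a day and you get the bridging described above: every product exposing its data in a form every other product can consume.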
Welcome to the Cloud Era
Adding to the crazy patchwork: the earlier IT eras never really went away. It turns out that the evolution of IT is about addition, not replacement. Even as the cloud gets all the headlines, IBM rakes in billions from mainframe technology, and client-server is more intensively networked than ever. Looking out across the global IT market – which Gartner says hit $3.7 trillion in 2013 – can easily cause vertigo.
Oh sure, if you need a single unifying theme, our current era is indisputably the Cloud Era. And at one level the Cloud Era is not so different from earlier times. It has a clear market leader (Amazon), facing stiff competition from a second tier (Microsoft, Google, IBM, HP, Dell), with a crowded third tier all jockeying for position (Rackspace, Eucalyptus, RightScale, Joyent, NetApp). That type of market setup is pretty much business as usual.
But look deeper – into the technology itself – and you’ll see that cloud computing is the ultimate disrupter. As a concept, the cloud is about as open-ended as, well, a cloud. There are private, public and hybrid clouds, remote and in-house clouds. There's IaaS, PaaS and SaaS, which are separate ideas, though they're closely interrelated and may be used in tandem. There are bundled pre-built vendor solutions, like VCE, a joint effort from Cisco, EMC, VMware and Intel, and there are DIY clouds based on commodity hardware, like those built by Cloudscaling and other firms.
If you think you can count all the permutations, just wait. New possibilities will soon spring up. The constant growth of the cloud, and its closely related tech cousin, virtualization, has led to a new trend: Software Defined. Be it networking, storage or the entire data center, Software Defined promises to fully abstract the software from the underlying hardware, and, like the cloud, to abstract data and compute from their physical location. Start-ups like Simplivity and Pluribus Networks mine this new sector, which, in essence, turns the data center into a single pooled cloud of resources. As tech analyst Stu Miniman said to me, "The cloud is the ultimate 'software defined'."
Like the cloud, Software Defined is platform agnostic and prizes interoperability. (Or at least it does in theory; some vendors have stuck the label Software Defined on their proprietary solutions.)
Today's emphasis on open standards and crazy-quilt heterogeneity is fueling the phenomenal success of OpenStack, an open source platform for building private and public clouds that touts its vendor neutrality. Launched in 2010, the project already has more than 200 member companies, including blue chips like IBM, Oracle, Dell, HP and Red Hat. Critics say that OpenStack is still immature, not quite ready for prime time; yet the project's growth speaks for itself. Interoperability is big business.
Dissolving the Boundaries Between In-House and Remote
So what’s the net effect of all this openness, all of today's concurrently evolving cloud-related technologies?
A great dissolving of boundaries. Cloud dissolves boundaries between in-house and a remote data center. Virtualization dissolves boundaries between hardware stacks. Open source dissolves boundaries in ownership between individual and community. Big Data – a mega-trend that VCs are investing in heavily – dissolves boundaries between the insiders in the C-suite and workers who access the database with low-end phablets. BYOD dissolves the boundaries between corporate and personal. Social media dissolves boundaries between public and private.
A hot IT buzzword is converged infrastructure. The term typically refers to some form of bundled solution: servers, storage, networking, software. But in a larger sense, our current era could be called the Converged Era. Today's many technologies prompt us, our physical environments as well as our personal selves, toward a remarkable degree of convergence.
In the Converged Era, we are all connected, all the time – to each other, to our neighbors, to our employers, to the world around us, to the constant flickering fibrillations in the always-on virtual network. Like it or not, technology is acting to intertwine everything, to make everything porous. An inflamed crowd tweets, as in the Arab Spring, and a dictator falls. A movement dubbed the Internet of Things works to connect every last microwave, refrigerator and wearable tech device to the Web. A rack of servers goes out in Seattle and 90,000 people sit idle across the country. A single hacker gets lucky and two million people get their passwords exposed. Ultimately, everything is part of one big...cloud. It is clearly time to learn to love our neighbors, for we'll surely be far more closely converged with them than ever before.