Public cloud rumblings gathering force

Using GE’s decision to shift to public cloud as an important anecdotal marker, and market data from a number of research houses as quantitative evidence of the public future, Kevin Fogarty examines in the article below the strategies adopted by several of the large IT vendors to capture the coming wave. From software providers looking for additional SaaS license revenue or the ability to integrate new capabilities, to infrastructure players that may be looking to return to their services roots, all, he argues, must be prepared to offer enterprise-class services. In Fogarty’s view, it’s not “the empty skate park after dark” but rather the “well-equipped data centre,” with tools for monitoring and management, that will push the last enterprise holdouts into the public cloud fold. (ed.)

 

Kevin Fogarty, freelance IT journalist and frequent contributor to InsightaaS

The technology business — in which a well-articulated vision can be worth more than a fat balance sheet, and new tech can develop so fast that waiting until a product exists before trying to sell it means missing the boat — has lost its glammiest, airiest, least accountable computing platform and investment opportunity.

In the process, however, it may be about to gain something that’s been completely missing from public clouds until now — the tools, applications, access points, APIs, protocols, governance and other software that make corporate data centres more than just stacks of hardware that all happen to be in the same room.

Global corporate giant and technology-use touchstone General Electric Co. has shown the most dramatic high-profile turnaround on cloud computing of any major enterprise IT customer to date. As recently as 2011, GE “thumbed its nose at outsourcing” in the cloud by committing to the construction of a new world-class, LEED-certified data centre in Louisville, Ky., to drive its appliance and lighting businesses. Last week, however, the company announced that it likes the cloud after all and has decided to use public cloud services to replace 90 percent of its data centre capacity.

The decision was a turnaround for GE, but also for the computer industry — both the vendors that sell it and the end-user companies that buy it — which has been inching slowly toward full commitment to the cloud since at least 2008, when cloud appeared at the top of Gartner’s Hype Cycle, alongside video telepresence and green IT, as an overhyped, underutilized technology.

From a CIO or data centre manager’s perspective, the problem with cloud hasn’t been the technology. Their objections have centred on the need to trust placement of corporate data in a facility owned by someone else — one that may lack the tools corporate IT would normally use to monitor the use, security, location, volume and health of its data and applications. Those abilities have been as slow to come to public cloud platforms as large enterprise IT organizations have been to come to the cloud in more than a tentative way.

The use of external cloud services has been “exploratory” for corporations over the past several years, according to Forrester analyst James Staten. Writing in April about a Forrester survey predicting that the $58 billion spent on cloud services during 2013 would grow to top $191 billion in 2020, Staten argued that “strong growth and maturity” would bring public cloud close to par with private cloud in performance and spending by then.

By 2016, the bulk of new IT spending will go to cloud providers rather than to traditional IT companies, according to an October 2013 study from Gartner.

By 2017, enterprise spending on cloud will triple the amount spent in 2011, topping out at $235.1 billion, according to IHS Technology. Worldwide, companies will increase the amount they spend on IT (primarily cloud) infrastructure by 20 percent, according to IDC, which also found that companies currently using cloud computing for anything will move further cloudward over the next two years, eventually spending 53.7 percent of their total IT budgets on cloud.

Amazon Web Services revenue increased 62 percent between 2012 and the end of 2013, a good sign of the health of full-bore cloud services, according to one study.

Still, $36 billion of the $58 billion businesses spent on cloud during 2013 went toward Software-as-a-Service (SaaS), not for Platform- or Infrastructure-as-a-Service, according to Forrester’s Staten.

SaaS is still the favourite flavour of cloud, and will continue to be, expanding beyond its current role as a supplement to on-premise software by starting to replace licensed on-premise software with the online variety. Public-cloud platforms will become so popular for “systems of engagement” that they will rival the markets for traditional middleware in supplying that function, Staten wrote.

“Systems of engagement” are systems used by corporations to provide social networks, interactive web sites, online billing and customer support, and other means to interact with customers online, according to business-process guru Geoffrey Moore. “Systems of record” are traditional applications with data repositories, application development tools and analytic systems that either monitor or make up the infrastructure of apps that run most big corporations.

The companies building and maintaining those SaaS and platform cloud services will continue to be big — Amazon, Microsoft, Google — but will increasingly include vendors whose names are more often associated with hardware, software or IT services than with the cloud.

Of the big IT vendors, IBM’s shift cloudward is probably the most dramatic. In January, the company announced it would spend $1.2 billion to build and equip 15 data centres worldwide this year and build another 15 next year. All the new glass houses will focus on the agile, flexible SoftLayer data centre-management and cloud-services platform rather than on IBM’s own legacy data centre setups, which have less support for OpenStack and other open cloud standards and are less flexible in the way they can provide pay-for-use virtual servers and other data centre assets.

The shift is an attempt to compete more directly with Amazon, the unquestioned leader according to a 2013 Morgan Stanley survey, which also predicted that the number of workloads in public clouds will increase by half over the next three years, while the percentage of companies using public cloud nearly doubles, from 28 percent to 51 percent.

IBM’s shift was made partly to pick up some of that business but, more profoundly, to move the company back toward the data centre-services-centric business model of its glory days during the ’60s and ’70s, and away from the model of more recent years, when it was spending far too much effort for far too little money (judging from Lenovo’s later success) selling mid-market and low-end PCs and servers — a business it sold off at the same time it announced the shift to the cloud.

Though revenue for the division that owns cloud — Global Business Services — grew only 2 percent during Q2 2014, to $4.5 billion, IBM’s cloud revenue for the quarter was up 50 percent over the same period last year.

IBM doesn’t appear to be chasing SaaS providers like Salesforce.com or Oracle; instead, it is focusing on the data centre-services market that has been the core of its business, expanding its data centre footprint to give itself better locations and better facilities for data centre outsourcing and cloud services than companies that can’t afford to build and run 40 or 50 data centres at a time.

Cisco’s cloud-invasion announcement focused on something completely different. With a US$1 billion investment and a set of 30 telecom and network-service partners, Cisco’s Intercloud will focus on interoperability and connectivity among clouds, according to the company’s March 24 announcement. It will be a flat-out cloud platform service based on OpenStack, designed to “support any workload, on any hypervisor, and interoperate with any cloud,” according to a March 24 blog post by Robert Lloyd, Cisco’s president of development and sales.

Rather than offer standalone cloud, or expand its server products with an offer to replace internal IT infrastructure, Cisco’s Intercloud network will set itself apart by providing interconnect networks and middleware platforms that let customers — typically CSPs or ISPs — connect incompatible cloud platforms while paying for premium private networking service between data centres, according to a Sept. 28 WSJ story, which also estimates that Cisco partners own a total of 250 data centres worldwide — by far the largest single network of data centres.

Other vendors — SAP, EMC, Oracle, Unisys and others — have either launched their own cloud initiatives or partnered with IBM or other major cloud-platform providers to extend their own technology into the public cloud in a way that makes it easy for their current customers to expand to public-cloud sites without having to leave their existing LoB applications and application vendors, SAP’s head of business development Kevin Ichhpurani told Forbes in October.

They’re also expanding the number and sophistication of the applications available in the cloud — adding content management, multichannel marketing automation, social media and analytic tools to public clouds, according to a March story in The Hub detailing cloud incursions by Adobe, Oracle and Salesforce.

If customers are going to the cloud, even the often-abhorred public cloud, the big vendors that have been selling them specialized, expensive products for decades have to be where those customers are going. The result isn’t a mass conversion of enterprise on-premise software vendors to SaaS; it’s more like an additional channel through which they can sell additional licenses and services, and try to expand their own customer bases by combining analytic or marketing tools online in ways that are too difficult with packaged software.

The vendor migration is unquestionably inspired by and timed to predate a similar expansion into cloud by customers, but the trend will change public cloud platforms from generic virtual-server-rental spaces into something more customized and enterprise-focused, according to InfoWorld cloud blogger David Linthicum. Serving the enterprise from the cloud will require more comprehensive, more intuitive security, tiered data and thousands of special services and APIs that enterprise customers can use to access, monitor and manage applications and data that might live in a specific cloud only long enough to support a specific campaign, or may become a permanent fixture connected to the home planet. Tools for automation, security and monitoring of systems, software and management will help make the public cloud look to a CIO or data centre manager a lot less like an empty skate park after dark and a lot more like a well-equipped data centre — one that is not as scary as a public cloud and may very well be perfectly usable, even if it isn’t actually theirs.

 
