Tuesday, February 8, 2011

Cloud computing drives need for open standards to define and describe a new enterprise environment

This guest post comes courtesy of Mark Skilton of Capgemini Global Applications, and The Open Group.

By Mark Skilton

I recently looked back at some significant papers that influenced my thinking on cloud computing as part of a review of current strategic trends. A February 2009 paper from the University of California, Berkeley, "Above the Clouds: A Berkeley View of Cloud Computing," stands out as the first of many to lay out both the promise of cloud computing and the technology barriers to achieving secure, elastic services.

The key issue unfolding at that time was the transfer of risk that comes with moving to a cloud environment, and the obstacles around security, performance, and licensing that would need to be overcome. But the genie was out of the bottle: successful early adopters could already see the cost savings and rapid one-to-many monetization benefits of on-demand services. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

A second key moment was the realization that the exchange of services was no longer a simple request and response. Social networks had demonstrated huge communities of collaboration and online “personas” changing individual and business network interactions, but something else had happened -- less obvious but more profound.

This change was most evident in the proliferation of mobile computing, which greatly expanded the original move from on-premise to off-premise services. A key paper by Intel Research titled "CloneCloud," published around the same time, exemplified this shift. Services could be cloned and moved into the cloud, demonstrating new possibilities for redefining how work gets done using cloud computing.

Remote services

The key point was that storage, processing transactions, media streaming, and complex calculations no longer had to be executed on a physical device. They could be provided as a service from a remote source: a virtual cloud service.

But more significant was the term "multiplicity" in this concept. We see it every day as we download apps, stream video, and transact orders. You could perform not just a few tasks but many, simultaneously, and pick and choose among the services and results.
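This "multiplicity" idea, fanning a task out to several remote services at once and collecting the results, can be sketched with ordinary concurrent code. The service functions below are invented stand-ins for illustration, not real cloud APIs:

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-ins for remote cloud services; in practice each would be a
# network call to a different provider or endpoint.
def storage_service(item):
    return f"stored:{item}"

def transcode_service(item):
    return f"transcoded:{item}"

def analytics_service(item):
    return f"scored:{item}"

def fan_out(item):
    """Run several independent service tasks simultaneously and
    collect all of their results, in submission order."""
    services = [storage_service, transcode_service, analytics_service]
    with ThreadPoolExecutor(max_workers=len(services)) as pool:
        futures = [pool.submit(svc, item) for svc in services]
        return [f.result() for f in futures]

results = fan_out("video-42")
print(results)
```

The caller does not wait for one service before starting the next; it launches all of them and then picks what it needs from the combined results, which is the behavioral shift the paragraph above describes.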

This signaled a big shift away from the old style of thinking about business services, which had conditioned us to think of service-oriented requests in static, tiered, rigid ways. Those business processes and services missed the new, bigger picture. Just look at the phenomenon of "hyperlocal services" that offer location-specific, on-demand information, or at how crowdsourcing can dramatically transform purchasing choices and collaboration incentives.

Traditional ways of measuring, modeling and running business operations are under-utilizing this potential and under-valuing what can be possible in these new collaborative networks. The new multiplicity-based world of cloud-enabled networks means you can augment yourself and your company’s assets in ways that change the shape of your industry.

What is needed is a new language to describe how this shift feels and works. Realizing these advances in your business portfolio often means re-examining current methods and standards of strategy visualization, metrics, and design to evolve a new expression of this potential.

Some two years have passed, and what has been achieved? Certainly we have seen a huge proliferation of services into cloud hosting environments, and large strategic moves by private data centers to develop private cloud services that bring together social media and social networking through cloud technologies.

But what's needed now is a new connection between the potential of these technologies and the vision of the Internet, the growth of social graph associations, and the wider communities and ecosystems that are emerging in the movement’s wake.

With every new significant disruptive change, there is also the need for a new language to help describe this new world. Open standards and industry forums will help drive this. The old language focuses on the previous potential, and so a new way to visualize, define, and use the new realities can help the big shift toward the potential above the cloud.


EMC Greenplum releases Community Edition of MPP database product, big data analysis gets cheaper still

EMC recently introduced a free Community Edition of the EMC Greenplum Database, its massively parallel processing (MPP) database, along with free analytic algorithms and data mining tools.

Building on earlier Greenplum “big data” releases, like the EMC Greenplum Data Computing Appliance, the Community Edition lowers the cost barrier to entry for big data power tools for more developers, data scientists, and other data professionals.

The tools help developers better understand data, find new uses for it, gain deeper insights, and better visualize those insights. The release was made at the 2011 O'Reilly Strata Conference by Scott Yara, vice president, EMC Data Computing Products Division. EMC acquired Greenplum last summer. [Disclosure: Greenplum is a sponsor of BriefingsDirect podcasts.]

With the easily accessible Community Edition stack, developers can build complex applications to collect, analyze, and operationalize big data, leveraging best-of-breed big data tools, including the Greenplum Database with its in-database analytic processing capabilities.

“Our new Community Edition provides a parallel-everything 'big data' stack with unequaled speed that enables analysts to perform next-generation data analytics and experiment with real-world data, and most importantly -- innovate,” explained Luke Lonergan, CTO and vice president, EMC Data Computing Products Division and co-founder of Greenplum. “This project is about empowering developers. They can program using the most popular tools and they have a place to contribute open source extensions to the stack.”

The free EMC Greenplum Community Edition includes:
  • Greenplum Database CE, an industry-leading MPP database product for large-scale analytics and next-gen data warehousing.
  • MADlib, an open source analytic algorithms library, providing data-parallel implementations of mathematical, statistical and machine learning methods for structured and unstructured data.
  • Alpine Miner, an intuitive visual data mining modeler that delivers rapid "modeling to scoring" capabilities, leverages in-database analytics, and is purpose-built for "big data" applications.
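The "data-parallel implementations" MADlib provides follow a common MPP pattern: each data segment computes a partial statistic, and the partials are merged, so no single node ever has to see all the rows. A minimal sketch of that merge pattern in plain Python (illustrative only, not MADlib's actual API):

```python
# Data-parallel mean: each segment computes a partial (count, sum),
# and the partials are merged -- the same shape MPP analytics use.
def partial_stats(segment):
    return (len(segment), sum(segment))

def merge(a, b):
    return (a[0] + b[0], a[1] + b[1])

def parallel_mean(segments):
    count, total = (0, 0)
    for seg in segments:
        count, total = merge((count, total), partial_stats(seg))
    return total / count

# Rows spread across three hypothetical database segments.
segments = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]
print(parallel_mean(segments))
```

Because the merge step is associative, the partials can be computed on any number of segments in any order and still yield the same answer as a single-node computation, which is what makes the approach scale.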
Community benefits

The initial release of the Community Edition is designed for both first-time users and experienced Greenplum customers. First-time users gain access to a comprehensive, purpose-built business analytics environment that lets them view, modify and enhance included demo data files, and experiment with "big data" analytical tools within the Greenplum database. Existing users can download an upgraded version of Greenplum Database CE and analytic tools for integration into their development and research environments.

The Community Edition can be downloaded free of charge from http://community.greenplum.com as a pre-configured VMware virtual appliance for use on laptops and desktops, or as a set of packages for deployment on user machines. All users are free to participate in the new Greenplum community forums to get support, collaborate, post ideas, and test enhancements developed independently by other users.

Regular Community Edition updates will be made available online. The Community Edition is intended for experimentation, development and research purposes only. Current single-node edition users can deploy the new Community Edition in their single-node production environments. Greenplum commercial licenses must be purchased prior to using code for internal data processing or for any commercial or production purpose.


Cyber security top of mind for enterprise architects at Open Group conference

SAN DIEGO -- The Open Group 2011 conference opened here yesterday with a focus on cyber security, showing how the risk management aspects of IT, architecture, and business stand as a high priority and global imperative for enterprises.

It's hard to plan any strategy for business and the IT forces that drive it, if the continuity of those services is suspect. Social media and the accelerating uses of mobile devices and networks are only adding more questions to the daunting issues around privacy and access. And, the Wikileaks affair has clearly illustrated how high the stakes can be. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

Three cyber security thought leaders plunged into the issues for the attendees: Bruce McConnell, Cybersecurity Counselor, National Protection and Programs Directorate (NPPD), US Department of Homeland Security; James Stikeleather, Chief Innovation Officer, Dell; and Ben Calloni, Lockheed Martin Fellow for Software Security, Lockheed Martin Corp. Each speaker shared his thoughts on the current state of cyber security and where he sees the industry heading. Top of mind: the importance of trust, frameworks, and their impact on the security of critical infrastructure systems.

Following a brief introduction from Allen Brown, President and CEO of The Open Group, McConnell set the stage by discussing the current state of the security ecosystem.

Computing systems today often consist of numerous security hardware and software implementations working completely independently of each other. An improved security ecosystem would not only improve computing performance, but would also create an environment where interoperability would usher in governance and completeness. Facilitating information sharing between security systems would improve overall security by enabling systems to react in a more efficient manner when addressing security threats, he said.

The Department of Homeland Security (DHS) protects the federal executive branch, and works with critical infrastructure (gas, oil, electricity, telecom, etc.) to help them better protect themselves. DHS is currently working on a cyber security awareness campaign.

Stop, Think, Connect

Last year, DHS launched the “Stop, think, connect” campaign, which is directed at teens, young adults and parents of teens. With increased awareness, DHS believes that the threat of cyber security attacks will be lessened. For more information on the campaign, please go to http://www.dhs.gov/files/events/stop-think-connect.shtm.

McConnell mentioned that President Obama spoke on the importance of private-sector innovation earlier yesterday. He also stated that cyberspace is a new domain that is vital to our way of life, and therefore needs to be made more secure. Government must play an important role in this process, but since cyber security is a civilian space, no one actor can secure it alone.

Given the global market of cyberspace, McConnell argued that the U.S. should continue to lead the security effort working together with consumers to achieve security. He then went on to suggest that an open, broad interoperability regime online would be able to validate attributes for online systems, but also emphasized that anonymity must be preserved.

McConnell concluded his keynote by speaking about a forthcoming white paper on the health of the cyber ecosystem, premised on a more secure cyberspace in which participants can work together in real time to counter attacks. This cyber ecosystem would require automation, authentication, and interoperability, enabling participating devices at any edge of a network to communicate with each other under policy established by the system owner. The ultimate purpose of the white paper is to encourage discussion and participation in a more secure ecosystem.

Dell innovation guru Stikeleather continued the plenary by emphasizing the need for a “Law of the Commons.” Like every other function in IT, security, too, needs to be clearly defined in order to move forward, he said. Clear definitions will enable the transparency and the common understanding needed for organizations and governments to communicate and discuss what goals the cyber community should strive to attain. This would not only lead to increased security, but it would also lead to improved trust, when addressing the growing concern of consumer privacy.

Co-evolution

The consequence of the Web's evolution is actually a co-evolution, he said: people depend more on technology and are restructuring how they see data (augmented reality), while technology is becoming contextual, dependent on who is making a request, how and when they are making it, and what their intentions are in making it.

In such a fluid environment, trust is essential, but can there realistically be trust? We have created an untrustworthy environment, Stikeleather said, and the tipping point will be smartphones in the enterprise. This technology, in particular, is creating greater cracks in a complex environment that is destined ultimately to fail.

Additionally, government and enterprise can’t agree on what the world should look like from a security perspective, due to differing cultural concepts in cyberspace, creating the need for a "Law of the Commons." We’ve created rules for shared international usage of the world’s oceans and for outer space, and cyberspace should be no different.

At the end of the day, everything is an economic survival issue, Stikeleather said. The real value of the Web has been network effects. If we were to lose trust in privacy and security, we'd lose the currency of that global network exchange and the associated economic model, which in turn could actually mean the collapse of the global economy, he said. A catastrophic event is likely to happen, he predicted. What will the world without trust look like? A feudal cyber world with white lists, locked clients, fixed communication routes, locked and bound desktops, limited transactions, pre-established trading partners, information hoarders, towers of Babel.

Underlying structure

We have a unique opportunity with cloud, Stikeleather said, to get it right early and put thought into what the underlying structure of cloud needs to look like, and how to conduct the contextual nature of evolving technology. Meantime, people should own the right to their own identity and control their information, and we need to secure data by protecting it within content.

There were a lot of car analogies during the plenary, whether intentional or not, and my favorite of the day came from Calloni of Lockheed Martin: "security needs to be built-in, not bolt-on." I've thought of this analogy many times before when discussing IT, especially in regard to enterprise architecture.

Calloni said that given human nature's tendency to use technology to engineer ways to make our lives easier, better, and more functional, we increase risk by increasing exposure. Drawing a comparison to the Ford Pinto, he stated that if organizations could focus purely on security, their probability of success would increase exponentially. But as we add functionality and focus becomes more distributed, security decreases as the attack surface grows.

He outlined key questions that each organization should ask when determining security:
  • Who has access?
  • What are the criteria for gaining access/clearance?
  • Who has controls?
  • What function is most important? Is being balanced key?
  • What type of security do you need?
Security is expensive, so reducing an organization's attack surface is critical when establishing a security policy. To build a security policy that will protect your organization, Calloni argued, you must be able to look at what areas or parts of your system and network are available for an assailant to compromise. Five key areas must be examined:
  • Vulnerability -- a weakness in the system; for it to matter, an attacker must be able to access it
  • Threats -- any potential harm to the data, systems, or environment that an individual could cause by taking advantage of a vulnerability
  • Risk -- the probability of threats exploiting vulnerabilities; risk rises with more vulnerabilities and increased threats
  • Exposure -- the damage done when a threat takes advantage of a vulnerability
  • Countermeasures -- processes and standards used to combat and mitigate the risks
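One way to see how these five areas interact is a toy scoring model: risk grows with vulnerabilities and threats, while countermeasures mitigate it. The formula and weights below are invented for illustration and are not from Calloni's talk:

```python
def risk_score(vulnerabilities, threats, countermeasures):
    """Toy model: risk grows with the product of vulnerability and
    threat counts, and each countermeasure halves what remains.
    The exact formula is an arbitrary illustration."""
    raw = vulnerabilities * threats
    return raw / (2 ** countermeasures)

# More vulnerabilities and threats mean higher risk...
assert risk_score(4, 3, 0) > risk_score(2, 3, 0)
# ...and countermeasures reduce the exposure that remains.
assert risk_score(4, 3, 2) < risk_score(4, 3, 0)
print(risk_score(4, 3, 2))
```

The point of the sketch is the shape of the relationship, not the numbers: shrinking the attack surface (fewer vulnerabilities) and adding countermeasures both pull the score down, which is exactly the argument for attack-surface reduction above.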
Like a car's drivetrain, security needs to be built-in, not bolted-on. Security frameworks need to provide a solid foundation on which organizations can build in order to address ever-changing cyber threats. Bolt-ons provide only temporary band-aids that will leave your organization vulnerable, he emphasized.

As organizations move toward the cloud and cyber threats become more commonplace, it will be interesting to see what importance organizations place on the themes discussed yesterday. They definitely apply to the remaining conference tracks. I'm especially looking forward to seeing how the enterprise architecture and cloud speakers address these topics.

If you want a real-time view of the 2011 San Diego Conference, please search for the Twitter hashtag #ogsdg.


Monday, February 7, 2011

Take The Open Group survey to measure the true enterprise impact of cloud computing

This guest post comes from Dave Lounsbury, Chief Technical Officer at The Open Group.

By Dave Lounsbury

Everyone in the IT industry knows by now that cloud computing is exploding. Gartner said cloud computing was its number-one area of inquiry in 2010, and hype for the popular computing movement peaked last summer according to its 2010 Hype Cycle for Cloud Computing.

Regardless of what the media says, cloud is now a real option for delivering IT services to business, and organizations of all sizes need to determine how they can generate real business benefit from using cloud. Industry discussion about its benefits still tends to focus more on IT advantages, such as capacity and utilization, than on business impacts, such as competitive differentiation and profit margin. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

The Open Group’s Cloud Work Group has created a series of White Papers to help clarify the business impacts of using cloud and, as a next step, The Open Group at our San Diego Conference today launched a public survey that will examine the measurable business drivers and ROI to be gained from the cloud. We encourage you to spend a few minutes completing the online survey.

We’re specifically looking for input from end-user organizations about their business requirements, outcomes and initial experience measuring ROI around their cloud projects. The survey both builds on the work already done by the Cloud Work Group, and the results of the survey will help guide its future development on the financial and business impact of cloud computing.

The survey will be open until Monday, March 7, after which we’ll publish the findings. Please help us spread the word by sharing the link -- http://svy.mk/ogcloud -- to our survey with others in your network who are either direct buyers of cloud services or have influence over their organization’s cloud-related investments.


Sunday, February 6, 2011

Android gaining as enterprises ramp up mobile app development across platforms and business models

It's a post-PC world, and mobile development is the name of the game. According to a report from Appcelerator and IDC, businesses and developers are racing to define a winning mobile strategy, while keeping an eye on platforms and business models.

The first-quarter report, garnered from a survey of 2,235 Appcelerator Titanium developers, shows that while the iPhone and iPad are still the leaders of the pack, Android smartphones and tablets are gaining large amounts of developer interest. Google has nearly caught up to Apple in smartphones and is closing the gap on tablets.

The report also shows that for enterprises the days of mobile app exploration are drawing to a close and companies are moving, or have moved, into an acceleration phase, with an eye toward greater innovation. This year, developers and businesses expect to triple their app development efforts, and the average developer is now building for four different devices.


In addition, there is a dramatic increase in the integration of geo-location, social, and cloud connectivity services, along with increased plans to integrate advertising and in-app purchase business models.

With the growth in the market, Appcelerator and IDC have developed a "Mobile Maturity Model" to describe the three phases of mobility adoption -- exploration, acceleration, and innovation.

Last year, most respondents (44 percent) said they were in the exploration phase of their mobile strategy. A simple app or two -- typically on iPhone -- and a focus on free brand-affinity apps was standard practice. This year, 55 percent of respondents said they are shifting into the "acceleration" phase.

Summary of findings

Other findings from the report:
  • On average, each respondent said they plan to develop 6.5 apps this year, up 183 percent over last year.

  • Businesses are increasingly taking a multi-platform approach. On average, respondents said they plan to deploy apps on at least four different devices (iPhone, iPad, Android Phone, Android Tablet) this year, up two-fold over 2010.

  • Ubiquitous cloud-connectivity: 87 percent of developers said their apps will connect to a public or private cloud this year, up from only 64 percent deploying cloud-connected apps last year.

  • Always connected, personal, and contextual: in addition to cloud services, integration of social and location services will explode in 2011 and will define the majority of mobile experiences this year. Interest in commerce apps is also on the rise, with PayPal beating Apple as the most preferred method for payments.

  • Business models are evolving along with these more engaging mobile app experiences. Developers are shifting away from free brand affinity apps and becoming less reliant on 99-cent app sales. Increasingly, the focus is on user engagement models such as in-app purchasing and advertising, with mobile commerce on the horizon.

  • Outsource goes in-house: the enterprise takes control of its mobile destiny. Eighty-one percent of respondents said they insource their development, with the majority saying they have an integrated in-house web and mobile team.
A mobile strategy

What do Appcelerator and IDC recommend for businesses trying to develop a mobile strategy? It's a four-pronged approach:
  • Platforms: Cross-platform is mandatory, as is deploying to multiple form factors like tablets. In the third innovation phase, a business is thinking about possibilities across all major platforms and devices.

  • Customer: This perspective considers the shift away from simple content-based apps that inform or entertain, toward more complex and engaging applications that make use of location, social, and cloud services, and on to transactional applications such as mobile commerce. As the customer experience evolves, so do application sophistication, customer expectations, business transformation opportunities, and the underlying business models. Free branded apps and a reliance purely on app store sales give way to advertising, in-application purchasing, and mobile commerce.

  • People: There is an increasing shift from outsourcing to in-house development. What starts as a tactical outsourcing of development “to get an app done fast” quickly turns into a more strategic discussion around competitive advantage, control over a sustainable long-term mobile strategy, and rapid time-to-market considerations.

  • Technology: In order to meet the demand for more apps, new devices, frequent updates, and deeper customer engagement, a business needs to drive down costs, time-to-market, and complexity by developing and leveraging reusable components. Ultimately, this results in the need for a cross-platform, fully integrated mobile architecture that spans a company’s entire app portfolio.
A copy of the full report is available from the Appcelerator site.


Wednesday, February 2, 2011

Open Group conference next week focuses on role and impact of enterprise architecture amid shifting sands for IT and business

Next week's The Open Group conference in San Diego comes at an important time in the evolution of IT and business. And it's not too late to attend the conference, especially if you're looking for an escape from the snow and ice.

From Feb. 7 through 9 at the Marriott San Diego Mission Valley, the 2011 conference is organized around three key themes: architecting cyber security, enterprise architecture (EA) and business transformation, and the business and financial impact of cloud computing. CloudCamp San Diego will be held in conjunction with the conference on Wednesday, Feb. 9. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

Registration is open to both members and non-members of The Open Group. For more information, or to register for the conference in San Diego please visit: http://www.opengroup.org/sandiego2011/register.htm. Registration is free for members of the press and industry analysts.

The Open Group is a vendor- and technology-neutral consortium, whose vision of Boundaryless Information Flow™ will enable access to integrated information within and between enterprises based on open standards and global interoperability.

I've found these conferences over the past five years an invaluable venue for meeting and collaborating with CIOs, enterprise architects, standards stewards and thought leaders on enterprise issues. It's one of the few times when the mix of technology, governance and business interests mingle well for mutual benefit.

The Security Practitioners Conference, being held on Feb. 7, provides guidelines on how to build trusted solutions; take into account government and legal considerations; and connects architecture and information security management. Confirmed speakers include James Stikeleather, chief innovation officer, Dell Services; Bruce McConnell, cybersecurity counselor, National Protection and Programs Directorate, U.S. Department of Homeland Security; and Ben Calloni, Lockheed Martin Fellow, Software Security, Lockheed Martin Corp.

Change management processes requiring an advanced, dynamic and resilient EA structure will be discussed in detail during The Enterprise Architecture Practitioners Conference on Feb. 8. The Cloud Computing track, on Feb. 9, includes sessions on the business and financial impact of cloud computing; cloud security; and how to architect for the cloud -- with confirmed speakers Steve Else, CEO, EA Principals; Pete Joodi, distinguished engineer, IBM; and Paul Simmonds, security consultant, the Jericho Forum.

General conference keynote presentation speakers include Dawn Meyerriecks, assistant director of National Intelligence for Acquisition, Technology and Facilities, Office of the Director of National Intelligence; David Mihelcic, CTO, the U.S. Defense Information Systems Agency; and Jeff Scott, senior analyst, Forrester Research.

I'll be moderating an on-stage panel on Wednesday on the considerations that must be made when choosing a cloud solution -- custom or "shrink-wrapped" -- and whether different forms of cloud computing are appropriate for different industry sectors. The tension between plain cloud offerings and enterprise demands for customization is bound to build, and we'll work to find a better path to resolution.

I'll also be hosting and producing a set of BriefingsDirect podcasts at the conference, on such topics as the future of EA groups, EA maturity and future roles, security risk management, and on the new Trusted Technology Forum (TTF) established in December. Look for those podcasts, blog summaries and transcripts here over the next few days and weeks.

For the first time, The Open Group Photo Contest will encourage the members and attendees to socialize, collaborate and share during Open Group conferences, as well as document and share their favorite experiences. Categories include best photo on the conference floor, best photo of San Diego, and best photo of the conference outing (dinner aboard the USS Midway in San Diego Harbor). The winner of each category will receive a $25 Amazon gift card. The winners will be announced on Monday, Feb. 14 via social media communities.

It's not too late to join in, or to plan to look for the events and presentations online.


Tuesday, January 25, 2011

HP enters public cloud market, puts muscle behind hybrid computing value and management for enterprises

HP today fully threw its hat into the public cloud-computing ring, joining the likes of Amazon Web Services (AWS) and IBM, to provide a full range of infrastructure as a service (IaaS) offerings hosted on HP data centers.

Targeting enterprises, independent software vendors (ISVs), service providers, and the global HP channel and partner ecosystem, the new HP Enterprise Cloud Services-Compute (ECS-Compute) bundles server, storage, network and security resources for consumption as pure services.

ECS-Compute is an HP-hosted compute fabric that's governed via policies for service, performance, security, and privacy requirements. The fabric is available next month via bursting with elasticity provisioning that rapidly adjusts infrastructure capacity, as enterprise demands shift and change, said HP. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

HP CloudSystem, a new private-hybrid cloud enablement offering that automates private cloud provisioning, uses HP Cloud Service Automation (CSA) solutions and HP Converged Infrastructure physical assets so that enterprises, governments, and service providers can better build, manage, and consume hybrid cloud services, said HP.

HP CloudSystem supports a broad spectrum of applications while speeding and simplifying the buying, deployment and support of cloud environments, said HP. CloudSystem brings "cloud maps" into play so that more applications can be quickly "ported" to a cloud or hybrid environment.

The ECS-Compute and CloudSystem announcements significantly deepen HP's cloud strategy, building on earlier announcements around the CSA and Cloud Assure offerings. HP, however, comes to the public cloud space from a hosting and multi-tenancy heritage, in large part through its EDS acquisition. That, HP expects, will make its cloud models more appealing to large businesses, governments and applications providers. HP is also emphasizing the security and management capabilities of these offerings.

As a new public cloud provider, HP is competing more directly with IBM, Rackspace, AWS, and Microsoft, and very likely over time, with private and hybrid cloud products from EMC/VMware, Oracle, Cisco, Red Hat, TIBCO and Google. There will be more overlap with burgeoning software-as-a-service (SaaS) providers like Salesforce.com, as they seek to provide more cloud-based infrastructure services.

Yet even among that wide field, HP is seeking to differentiate itself with a strong emphasis on hybrid computing over assemblages of plain-vanilla public cloud services. HP sees its long-term strategic value in a governance path for computing resources and services from a variety of sources and models (including legacy IT) that together add up to IT as a service.

"This is a hybrid services delivery capability, and you can manage it all as a service," said Rebecca Lawson, director of cloud initiatives at HP. The services are designed to help organizations "grow and manage the applications," regardless of the applications' heritage, production model, or technology, said Lawson.

"We're now saying, 'welcome to our data center' ... but we're ecumenical and agnostic on platform and applications," she said.

Also part of the Jan. 25 news, HP Hybrid Delivery will help businesses and governments build, manage, and consume services using a combination of traditional, outsourced and cloud services best suited to them. It consists of HP Hybrid Delivery Strategy Service, to provide a structured understanding of the programs, projects, and main activities required to move to a hybrid delivery model; and HP Hybrid Delivery Workload Analysis Service, to analyze enterprise workloads to determine the best fits for hybrid environments.
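HP has not detailed how the Hybrid Delivery Workload Analysis Service actually evaluates workloads, but the general idea of analyzing enterprise workloads for hybrid fit can be sketched as follows. The criteria, field names, and weights are purely illustrative assumptions, not HP's methodology.

```python
# Hypothetical sketch of the kind of scoring a workload-analysis step
# might perform; criteria and weights are illustrative, not HP's.

def hybrid_fit_score(workload: dict) -> int:
    """Score 0-10: higher means a better fit for cloud or hybrid delivery."""
    score = 0
    if workload.get("demand") == "variable":
        score += 4    # elastic demand benefits most from cloud bursting
    if not workload.get("data_residency_constrained", False):
        score += 3    # fewer compliance barriers to off-premises hosting
    if workload.get("architecture") == "stateless":
        score += 3    # easiest to move and scale horizontally
    return score

web_tier = {"demand": "variable", "architecture": "stateless"}
ledger   = {"demand": "steady", "data_residency_constrained": True}
print(hybrid_fit_score(web_tier))  # 10: strong hybrid candidate
print(hybrid_fit_score(ledger))    # 0: keep on traditional infrastructure
```

In practice such an analysis would weigh many more dimensions (licensing, latency, integration coupling), but the shape of the exercise is the same: score each workload, then place it on the traditional-to-cloud spectrum accordingly.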

Professional services

HP sees these as enabling a "journey" to cloud and hybrid computing, with a strong emphasis on the professional services component of learning how to efficiently leverage cloud models.

HP's vision for the cloud -- part of its solution set for the demands of the "Instant-On Enterprise" -- clearly emphasizes openness and neutrality when it comes to operating systems, platforms, middleware, virtual machines, cloud stacks, SaaS providers, and applications, said Lawson. HP will support all major workloads and platforms from its new cloud hosting services, and help to govern and manage across them via HP's hybrid computing and private cloud capabilities as well, said Lawson.

The achievement of the instant-on enterprise, said Sandeep Johri, vice president of strategy and industry solutions at HP, comes from an increasing ability to automate, orchestrate, secure and broker services -- regardless of their origins: traditional IT, or public or private clouds.

In other words, hybrid computing (perhaps even more than cloud itself) will become a core enabling competency for enterprises for the foreseeable future. HP is banking on that, expecting that the platform and lock-in wars will push customers to an alternative lower-risk partner that emphasizes inclusion and open standards over singular cloud stacks.

HP therefore has a rare opportunity to appeal to the many organizations and governments that fear cloud lock-in, as well as the costs and complexity of following a SaaS or software platform vendor's isolated path to cloud. Those vendors often come from a heritage of on-premises platform or proprietary stack lock-in, rather than from support of heterogeneity and a heritage of myriad hosted services.

Whereas some vendors, such as VMware, Oracle, Microsoft, Cisco, Red Hat and Citrix, are cobbling together so-called integrated cloud stacks, and then building hosting services that will most likely favor their stacks and installed bases, HP is working at the higher abstraction of management and governance across many stacks and models. Hence the emphasis on hybrid capabilities. And, where some SaaS and business applications vendors are working to bring cloud infrastructure services and/or SaaS delivery to their applications, HP is working to give its users an open cloud home and/or hybrid support for all their applications, including those hosted elsewhere.

HP's cloud strategy, then, closely follows (for now) its on-premises data center infrastructure strategy, with many options on software and stack, and an emphasis on overall and holistic management and cost-efficiency.

Less complex path

Some analysts I've heard recently say that HP is coming late to public cloud. But coming from a hosting and single- and multi-tenancy applications support heritage may very well mean that HP already has a lot of cloud and hosted services DNA, and that the transition from global hosting for Fortune 500 enterprises to full cloud offerings is a less tortured and complex path than the one facing other vendors, such as traditional on-premises OS, platform, middleware, and infrastructure license providers, as well as SaaS-to-cloud providers.

HP may be able to effectively position itself as more IT transformation-capable and mission-critical support-ready -- and stack-neutral and applications-inclusive -- to provide a spectrum of hybrid cloud services at global scale with enterprise-calibre response, security and reliability. And because HP does not have a proprietary middleware stack of its own to protect, it can support the requirements of more of its customers across more global regions.

Enterprise-mature from the get-go, rather than late to the cloud-hype party, might be a better way to describe HP's timing on cloud sourcing and support services. The value HP seems to be eyeing comes from agility and total cost reduction for IT -- not from technology, license or skills lock-in.

By allowing a large spectrum of applications support -- and the ability to pick and choose (and change) the sourcing for applications over time -- the risk of lock-in, and of unwillingly paying high IT prices, goes down. Hybrid, says HP, offers the best long-term IT value and overall cost-efficiencies, and can save 30-40 percent of the cost of traditional IT, though HP offered few specifics on how quickly such savings would accrue.

"You can now run mission-critical applications with the economics of cloud," said Patrick Harr, vice president of cloud strategy and solutions at HP. "It's a hybrid world."

HP is also thinking hybrid when it comes to go-to-market strategies. It expects to appeal to ISVs, resellers, and system integrators/outsourcers with the newest cloud offerings. By being hybrid-focused, open, and agnostic to underlying platforms, HP may draw channel partners with less strategic angst -- and less potential for later direct competition -- than they might face with an Oracle or Microsoft.

And, HP is putting a lot of consulting and professional services around the hybrid push, including HP Cloud Discovery Workshops that help enterprises develop a holistic cloud strategy, with a focus on cloud economics, applications and cloud security.

HP ECS-Compute will be available in the US and EMEA countries in February, and in Asia-Pacific countries in June.

“To create an Instant-On Enterprise, organizations need to close the gap between what customers and citizens expect and what the enterprise can deliver,” said Ann Livermore, executive vice president, HP Enterprise Business. “With HP’s cloud solutions, clients can determine the right service delivery models to deliver the right results, in the right time frame, at the right price.”

These new offerings will not be the last chapter in HP's cloud and IT transformation drive. Looking back to last month's ALM 11 announcements, and HP's long heritage of SaaS test and development services, one can easily envision a more end-to-end set of applications lifecycle and hybrid cloud operations capabilities. Think of it as a coordinated, hybrid services approach to applications definition, build, test, deployment and brokering -- all as an open, managed lifecycle.

That means joining PaaS and hybrid computing on an automated and managed continuum, for ISVs, service providers, governments and enterprises. I can easily see where a choice of tools and frameworks, openness in workload and operations environments, and a coordinated spectrum of managed services and hybrid hosting would be very appealing.

Such a flexible cloud support horizon -- from cradle to grave for applications and data -- could drive the total cost of IT down, while reducing complexity and allowing businesses to focus on their core processes, innovation and customer value, rather than on a never-ending litany of IT headaches.

Platform ISF 2.1 improves use, breadth of private cloud management, sets stage for use of public cloud efficiencies

Platform Computing on Tuesday released Platform ISF 2.1, which improves ease of use and automation for building and managing enterprise private clouds.

Platform's cloud management software helps enterprises transition from internal IT to more productive and efficient private cloud infrastructure services that support multi-tier applications.

New in Platform ISF 2.1 is a dynamic “single cloud pane” for cloud administration; expanded definitions for support of multi-tier application environments such as Hadoop, JBoss, Tomcat and WebSphere; and enhanced business policy-driven automation that spans multiple data centers. [Disclosure: Platform Computing is a sponsor of BriefingsDirect podcasts.]

By automating delivery of complex enterprise infrastructure and production applications across heterogeneous virtual, physical and public cloud resources, Platform ISF also helps reduce electricity and cooling requirements while freeing up capacity in data centers. The management layer provides improved monitoring, policy management, and workload management across multiple, heterogeneous cloud and traditional IT stacks. By capturing corporate standards and business policies within the automation engine, companies can improve both compliance and security, said Platform Computing.

Via the single-pane administration capabilities, what Toronto-based Platform calls a "cloud cockpit," users can self-select approved services to support a wide variety of applications. Enhanced end-user portals are also new, including drag-and-drop portlet-based dashboards and customizable application instantiation pages.

What's more, the applications can be monitored from both private and public clouds, such as Amazon Web Services (AWS). This degree of management allows for future planning and capacity management, to help exploit hybrid computing benefits and cut the total overall cost of supporting applications.

Enhancing agility

“Enterprises looking to take advantage of the cloud do so for many reasons but one of the key ones is to enhance their agility in response to changing business dynamics,” said Cameron Haight, Research Vice President, Gartner, in a release. “This means that the technology used to manage cloud environments should be similarly agile and act to facilitate and not impede this industry movement. IT organizations should look for tools that can address the various cloud usage scenarios without demanding excessive investments in management infrastructure or staff support.”

Key capabilities in Platform ISF 2.1 include: self-service and chargeback, policy-based automated provisioning of applications, dynamic scaling of applications to meet service level agreements (SLAs), and unification of distributed and mixed-vendor resource pools for sharing. A unique “Active-Active” multiple-data-center capability supports higher availability and scalability by leveraging Oracle GoldenGate.
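Platform has not published its chargeback model, but the basic mechanics of metering-plus-chargeback -- the first capability in the list above -- can be sketched simply. The unit rates, resource names, and usage records below are invented for illustration only.

```python
# Illustrative chargeback sketch; rates and usage records are invented,
# not drawn from Platform ISF itself.

RATES = {"cpu_hours": 0.08, "gb_storage": 0.10}  # hypothetical unit rates, USD

def chargeback(usage_by_dept: dict) -> dict:
    """Roll metered resource usage up into a per-department bill."""
    return {
        dept: round(sum(RATES[res] * qty for res, qty in usage.items()), 2)
        for dept, usage in usage_by_dept.items()
    }

usage = {
    "marketing":   {"cpu_hours": 120, "gb_storage": 50},
    "engineering": {"cpu_hours": 900, "gb_storage": 400},
}
print(chargeback(usage))  # {'marketing': 14.6, 'engineering': 112.0}
```

The point of chargeback in a private cloud is exactly this visibility: once consumption is metered and billed back, departments have a direct incentive to right-size their requests.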

Ease of use benefits in the new release, which is available now, include account management and delegation based on applications or business processes. Such delegation can occur for such cloud-supported functions as platform as a service (PaaS), infrastructure as a service (IaaS), and hierarchical applications and their supporting components and services. Also included is self-service hierarchical account and resource management (including Active Directory for 10,000+ users) supporting an unlimited number of organizational tiers.
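Platform's actual account model is not public, so the following is only a sketch of how hierarchical account delegation of a shared resource quota might work across organizational tiers; the class, account names, and quota units are hypothetical.

```python
# Hypothetical sketch of hierarchical quota delegation across account
# tiers; Platform ISF's real account model is not public.

class Account:
    def __init__(self, name: str, quota: int):
        self.name, self.quota, self.children = name, quota, []

    def delegate(self, name: str, quota: int) -> "Account":
        """Carve a child account's quota out of this account's own."""
        if quota > self.quota:
            raise ValueError("cannot delegate more than the parent holds")
        self.quota -= quota
        child = Account(name, quota)
        self.children.append(child)
        return child

org  = Account("IT", quota=1000)       # top tier, e.g. 1000 vCPU-hours
dept = org.delegate("Finance", 300)    # department-level tier
team = dept.delegate("Payroll", 100)   # application/team tier
print(org.quota, dept.quota, team.quota)  # 700 200 100
```

Because each tier can only hand down what it actually holds, resource limits stay consistent no matter how many organizational levels sit between the top-level administrator and an individual application team.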

Business benefits include less downtime for applications, even as they are supported by hybrid resources, SLA-driven shared services, less need for specialized administrators, higher availability and creation of richer applications services catalogs. Use of Platform ISF 2.1 for private cloud activities clearly puts the users in a better position to use, exploit and manage public clouds, and to move quickly to the hybrid computing model. The goal is to manage the heterogeneous applications lifecycle, not just multiple cloud instances, said Jay Muelhoefer, VP Enterprise Marketing, Platform Computing.

A free 30-day trial of Platform ISF 2.1 can be downloaded at www.platform.com/privatecloud. Platform Computing is also hosting a webinar, “Building A Private Cloud Strategy – Best Practices” on Feb. 16. For more information about Platform ISF or the webinar, visit www.platform.com/privatecloud.
