Monday, June 6, 2011

HP at Discover releases converged infrastructure products and services aimed at helping IT migrate rapidly to the future

LAS VEGAS -- Cloud computing and mobility are redefining how people live, how organizations operate and how the world interacts. Enterprises must constantly adapt to meet the changing needs of users, customers and the public by driving innovation and agility through technology.

Yet IT sprawl and outdated IT models and processes are causing enterprise complexity and crippling organizations’ abilities to keep pace with enterprise demands. Enterprises know they need to change, and they also have a pretty good idea of the IT operations and support they'd like to have. Now, it's a matter of getting there.

To help mobilize IT for the new order, HP today at HP Discover announced several Converged Infrastructure solutions that improve enterprise agility by simplifying deployment and speeding IT delivery. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

Designed to be key elements of an Instant-On Enterprise, these offerings work in coordination to help reduce IT sprawl and turn technology assets into interoperable, shared pools of resources with a common management platform. They include:
  • Converged Systems, a new portfolio of turnkey, optimized and converged hardware, software, tailored consulting and HP Solution Support services that enable users to be up and running with new applications in hours vs. months.
  • Converged Storage architecture and portfolio, which integrates HP Store360 scale-out software with HP BladeSystem and HP ProLiant hardware to reduce storage administration, speed time to service delivery, increase energy efficiency and improve access for any data type or application workload. The offerings are complemented by new Storage Consulting services.

    The new solutions announced today extend the benefits of convergence to deliver new levels of speed, simplicity and efficiency, enabling clients to anticipate and capitalize on change.


  • Converged Data Center, a new class of HP Performance Optimized Data Centers (PODs) that can be deployed in just 12 weeks – and at a quarter of the cost when compared to a traditional brick-and-mortar data center. The HP POD 240a, also referred to as the “HP EcoPOD,” uses 95 percent less facilities energy. See separate blog on the EcoPOD.
  • HP Server Automation 9.1, which provides heterogeneous, automated server life cycle management for enterprise servers and applications in both converged and traditional environments. With new integrated database automation, it enables IT to significantly reduce the time it takes to achieve full application life cycle automation.
Research conducted on behalf of HP found that 95 percent of private and public sector executives consider agility important to the success of their organizations. Plus, more than two-thirds of C-suite executives believe that enterprise agility is driven by technology solutions.

For me, enterprise IT strategists now basically know what they need for their data centers to meet the coming hybrid and cloud requirements. They will be using more virtualization, relying on standard hardware, managing their servers, storage and networks with increased harmony, supporting big data business intelligence, and dealing with more mobile devices.

More ways to move to modernization

HP is coming out with data center assets and services that provide IT with more on-ramps than ever to modernizing all core IT infrastructure. The new and augmented products can be used by many types of organizations -- and at any stage of maturity -- to meet modern, competitive IT requirements. And they can do so knowing the capital and operating costs can be measured, managed and contained. These total IT costs are also being driven down by advancements in utilization, management, modular data center growth and pervasive energy conservation.

Only a few vendors can supply an end-to-end data center transformation portfolio across the major domains of servers, storage, networking and operational management. HP is providing these globally, with holistic and strategic integration, so that operational resilience and the flexibility to scale and adapt become givens.

Most vendors are either hardware-heavy or software-centric, or lack depth in a major category like networking. HP has stated it plans to augment its software, and in the meantime it is supporting best-of-breed choice across heterogeneous platforms, middleware and business applications -- including open source.

Additions to HP Technology Services were also announced, aimed at a life cycle of consulting support including strategy, assessment, design, test, implementation and training. HP Solution Support provides single-point-of-contact services for the entire turnkey solution, including third-party software.

Converged Storage for rapid response

Legacy monolithic and unified storage architectures were designed to address predictable workloads, structured data and dedicated access. Today’s requirements, however, are exactly the opposite, with unpredictable application workloads from cloud, virtualization and big data applications, said Martin Whittaker, Vice President for Systems and Solutions Engineering, Enterprise Servers, Storage and Networking (ESSN), HP Enterprise Business.

HP’s Converged Storage architecture changes how data is accessed by integrating scale-out storage software with converged server and storage hardware platforms. Advanced management tools that span the architecture help speed IT service delivery. As a result, users can deploy and grow storage 41 percent faster while reducing administration time by up to 90 percent.

New solutions include:
  • HP X5000 G2 Network Storage Systems, which are built on HP BladeSystem technology. They can be deployed in minutes and reduce power requirements up to 58 percent, cooling needs up to 63 percent and storage footprint up to 50 percent.
  • HP X9000 IBRIX Network Storage Systems that optimize retention of unstructured data with new compliance features and the capacity for more than one million snapshots. The solution provides insight to the enterprise on trends, market dynamics and other pertinent facts by simplifying management of massive data sets. Policy management capabilities automate the movement of data to optimize resources.
Enhanced storage systems and services

HP’s fifth-generation Enterprise Virtual Array (EVA) family includes the new HP P6000 EVA, offering thin provisioning and Dynamic LUN Migration software along with 8 Gb Fibre Channel, 10 Gb iSCSI and Fibre Channel over Ethernet (FCoE) support. This enables users to consolidate application data to speed administration and reduce total cost of ownership.

HP Enterprise Services has integrated HP 3PAR Storage into the HP Data Center Storage Package, which offers storage management services that allow users to “flex” their storage needs up and down with changes in demand.

HP also offers new Storage Consulting Services that help users to design and deploy a Converged Storage environment, optimize the storage infrastructure, reduce costs, and protect and align data while preparing storage for cloud computing.

HP VirtualSystem

Virtualization technology has been widely adopted to consolidate servers, gain flexibility and optimize return on investment. However, virtualized environments can be difficult to plan, deploy and operate due to the proliferation of management tools, uncertain performance characteristics and unaddressed security concerns.

The new HP VirtualSystem portfolio, based on HP Converged Infrastructure, consists of turnkey server and client virtualization solutions. See more news on the HP AppSystems portfolio.

The offerings are built on the HP BladeSystem platform, HP LeftHand/3PAR storage and HP FlexFabric networking technologies. As a result, they support up to three times more virtual machines per server, three times the I/O bandwidth and twice the memory of competing offerings. Also, the HP VirtualSystem portfolio is heterogeneous, supporting existing IT investments, multiple hypervisor strategies and operating systems with a common architecture, management and security model.

In a world where enterprises must instantly react to changing markets, clients are turning to HP Converged Infrastructure to dramatically improve their agility.



Further, HP VirtualSystem provides a path to cloud computing by utilizing similar hardware infrastructure and management environments as HP CloudSystem. To extend their environments and evolve to cloud computing, users follow a simple, rapid upgrade process, said James Jackson, Vice President for Marketing Strategy, ESSN in HP Enterprise Business.

HP offers three scalable deployment systems for small, midsize and large enterprises. Each includes leading hypervisor technologies from Microsoft or VMware, as well as the leading operating systems and applications.

The open, modular design of HP VirtualSystem simplifies management with a single-pane-of-glass view into each layer of the virtualized stack. HP TippingPoint security can be added for comprehensive threat protection of both physical and virtual platforms.

Users can increase the availability, performance, capacity allocation and real-time recovery of their HP VirtualSystem solutions with HP SiteScope, HP Data Protector, HP Insight Control and HP Storage Essentials software extensions.

The Client Virtualization Reference Architecture for Enterprise, which includes Citrix or VMware software, and the HP Mission Critical Virtualization Reference Architecture complement the HP VirtualSystem solutions. The reference architecture resources contain a consistent set of architectural best practices, which enable users to rapidly deploy virtualized systems, improve security and performance, and reduce operating costs.

Life cycle support and consulting

To further reduce the complexity of virtual environments, HP Technology Services provides a full life cycle of consulting services spanning strategy, assessment, design, test, implementation and training, followed by a transition to HP Solution Support for ongoing peace of mind.

HP VirtualSystem solutions are expected to begin shipping in the third quarter of this year. Availability of HP Client Virtualization Reference Architecture for Enterprise is expected in June and HP Mission Critical Virtualization Reference Architecture is expected in the third quarter of this year.

On-demand replays of the HP Discover press conference are available at www.hp.com/go/agileIT. Additional information about HP’s announcements at HP DISCOVER is available at www.hp.com/go/agility2011.

You may also be interested in:

HP delivers application appliance solutions that leverage converged infrastructure for virtualization and data management

LAS VEGAS -- As part of its new Converged Infrastructure offerings, HP today at the DISCOVER 2011 event here rolled out the AppSystems portfolio, which offers a fully integrated, appliance-like technology stack that includes hardware, management software, applications, tailored consulting and HP Solution Support services.

The new HP AppSystems portfolio is designed to improve application performance and reduce implementation from months to a matter of minutes. New application deployments can be complex, taking up to 18 months to roll out and optimize for the business.

The complexity of maintaining and integrating these environments often results in missed deadlines, incomplete projects, increased costs and lost opportunities. In fact, only 32 percent of application deployments are rated as “successful” by organizations in a recent HP survey. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

HP AppSystem solutions can be rapidly deployed, support a choice of applications, and are built on open standards to integrate seamlessly within existing infrastructures. The portfolio includes the following:

The complexity of maintaining and integrating these environments often results in missed deadlines, incomplete projects, increased costs and lost opportunities.


  • The HP Business Data Warehouse Appliance, which reduces the complexities and costs faced by many midmarket users when deploying data warehouses. Optimized for Microsoft SQL Server 2008 R2, the system can be implemented 66 percent faster than competing solutions for less than $13,000 per terabyte. Jointly engineered with Microsoft, the solution results in up to 50 percent faster input/output bandwidth to speed data load and query response.
  • The HP Database Consolidation Solution, optimized for Microsoft SQL Server 2008 R2, which simplifies the management of the virtualized infrastructures associated with the proliferation of SQL Server databases. It consolidates hundreds of transactional databases into a single, virtual environment while enabling applications to access data. Once installed, new high-performance SQL Server databases can be provisioned in minutes, and migrations can be accomplished with near-zero downtime.
  • The new HP VirtualSystem portfolio, also based on HP Converged Infrastructure, consists of turnkey server and client virtualization solutions. The offerings are built on the HP BladeSystem platform, HP LeftHand/3PAR storage and HP FlexFabric networking technologies. As a result, they support up to three times more virtual machines per server, three times the I/O bandwidth and twice the memory of competing offerings. Also, the HP VirtualSystem portfolio is heterogeneous, supporting existing IT investments, multiple hypervisor strategies and operating systems with a common architecture, management and security model.
These new solutions expand HP’s line of turnkey appliances, which also includes the HP Enterprise Data Warehouse Appliance and the HP Business Decision Warehouse Appliance.

HP Technology Services also provides a full life cycle of consulting services, from strategy, assessment, design, test, implementation, training and HP Solution Support.

Vertica Analytics: Exadata Killer?

A key component of the new Converged Infrastructure offerings is the HP Vertica Analytics System, a potential Exadata killer for real-time Big Data analytics. Traditional relational database management systems (RDBMS) and enterprise data warehouse (EDW) systems were designed for the business needs of nearly 20 years ago. Today, vast amounts of structured and unstructured data are being created everywhere, every instant and from a variety of sources.

Built on the HP Converged Infrastructure, the new HP Vertica Analytics System provides an appliance-like, integrated technology stack that includes hardware, management software, applications, consulting and HP Solution Support services.

The scale-out cluster architecture of the HP Vertica Analytics System utilizes columnar storage and massively parallel processing (MPP). This enables users to load data up to 1,000 times faster than traditional row-stored databases and to support hundreds of nodes as well as petabytes of data efficiently without performance degradation, said HP. Further, because the system is able to query data directly in compressed form, clients can store more data, achieve faster results, and use less hardware.
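To make the columnar, query-on-compressed idea concrete, here is a minimal sketch in plain Python -- not Vertica code, and with the data and function names invented purely for illustration -- of run-length encoding a sorted column and answering a count query directly on the compressed form:

```python
# Conceptual sketch (not Vertica code): a sorted column compresses well with
# run-length encoding (RLE), and some aggregates can be computed on the
# compressed representation without expanding it back into rows.

from itertools import groupby

def rle_encode(column):
    """Run-length encode a column into [(value, run_length), ...]."""
    return [(value, sum(1 for _ in run)) for value, run in groupby(column)]

def count_matching(encoded, predicate):
    """Count rows satisfying a predicate directly on the RLE form."""
    return sum(length for value, length in encoded if predicate(value))

# A column of 1,000,000 status codes, dominated by repeated values.
status = ["OK"] * 900_000 + ["WARN"] * 90_000 + ["FAIL"] * 10_000

encoded = rle_encode(status)                              # only 3 (value, run) pairs
failures = count_matching(encoded, lambda v: v == "FAIL") # 10000

print(len(encoded), failures)
```

The same intuition scales: long runs of repeated values collapse to a handful of pairs, and aggregates over them never need to touch the original rows.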

Only 32 percent of application deployments are rated as “successful” by organizations.



The HP Vertica Analytics System provides:
  • A data-compression technology that delivers a 50 to 90 percent reduction in database storage requirements with 12 separate compression algorithms.
  • An integrated, next-generation analytics database engine that can be deployed in minutes.
  • An optimized physical database design for user query needs with the Database Designer tool.
  • Faster response, generating results in seconds versus hours, and in real time.
  • Ease of integration with existing business analytics applications, reporting tools and open source software frameworks that support data-intensive distributed applications, such as Apache Hadoop.
The HP Vertica Analytics System is available immediately, in quarter-, half- and full-rack configurations. The HP Vertica Analytics Platform (software) may also be deployed on existing x86 hardware running the Linux operating system.

When HP acquired Vertica early in 2011, I wondered whether this was its path to an Exadata killer. Exadata, you may recall, was a joint data warehouse appliance effort between Oracle and HP before Oracle bought Sun. The HP hardware part of the Exadata line fizzled out once Sun hardware was used instead.

But now, Vertica plus HP Converged Infrastructure is architected to deliver in-memory data analytics on Big Data in the petabyte range. Oracle has its OLTP strengths, but for real-time analytics at scale and at affordable cost, HP is betting big on Vertica. It's a critical element at the heart of HP’s growth strategy. These announcements around ease of deployment and support should go a long way toward helping users explore and adopt it.

The HP Vertica real-time analytics platform already has more than 350 clients in a variety of industries, including finance, communications, online web and gaming, healthcare, consumer marketing and retail, said HP.

"We're winning deals against Exadata," said Martin Whittaker, Vice President for Systems and Solutions Engineering, Enterprise Servers, Storage and Networking (ESSN), HP Enterprise Business.

You may also be interested in:

HP rolls out EcoPOD modular data center, provides high-density converged infrastructure with extreme energy efficiency

LAS VEGAS – HP today at Discover here unveiled what it says is the world’s most efficient modular data center, a compact and self-contained Performance Optimized Data Center (POD) that supports more than 4,000 servers in 10 percent of the space and with 95 percent less energy than conventional data centers.

The HP POD 240a also costs 25 percent of what traditional data centers cost up front, and it can be deployed in 12 weeks, said HP. It houses up to 44 industry standard racks of IT equipment.

The EcoPOD joins a spectrum of other modular data center offerings, filling a gap among the shipping-container-sized Custom PODs, the HP POD 20c and 40c, and the larger brick-and-mortar HP Flexible Data Center facilities. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

The EcoPOD can be filled with HP blade servers and equipment, but it also supports servers from third parties. It is optimized for HP Converged Infrastructure components, however. HP says the EcoPOD can be ordered and delivered in three months, and then requires only electric power and network connections to become operational.

The modular design, low capital and operating costs and rapid deployment will be of interest to cloud providers, Web 2.0 application providers, government and oil industry users. I was impressed with its potential role in business continuity and disaster recovery. The design and attributes will also help organizations that need physical servers in a certain geography or jurisdiction for compliance and legal reasons, but at low cost despite the redundancy of the workloads.

The HP EcoPOD also provides maximum density for data center expansion or as temporary capacity during data center renovations or migrations, given that it streamlines a 10,000-square-foot data center into a compact, modular package in one-tenth the space, said HP.

The design allows for servers to be added and subtracted physically or virtually, and the cooling and energy use can be dialed up and down automatically based on load and climate, as well as via set policies. It can use outside air when appropriate for cooling ... like my house most of the year.
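As a purely illustrative sketch -- not HP's Environmental Control System, whose logic and thresholds HP has not published -- the kind of load- and climate-driven cooling policy described above might look like this, with the temperature threshold, load cutoff and mode names all hypothetical:

```python
# Illustrative only: prefer free (outside) air cooling when the climate allows,
# and scale mechanical cooling with IT load otherwise.

def cooling_mode(outside_temp_c: float, it_load_kw: float,
                 free_air_max_c: float = 24.0) -> str:
    """Pick a cooling mode from outside temperature and current IT load."""
    if outside_temp_c <= free_air_max_c:
        return "free-air"               # pull in outside air
    if it_load_kw < 200:
        return "partial-mechanical"     # modest mechanical assist
    return "full-mechanical"

print(cooling_mode(18.0, 450.0))   # free-air
print(cooling_mode(30.0, 150.0))   # partial-mechanical
```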

The HP POD 240a is complemented by a rich management capability, the HP EcoPOD Environmental Control System, which has its own APIs, remote dashboards and control suite, as well as remote client access from tablet computers, said HP.

The cost savings are eye-popping. HP says an HP POD 240a costs $552,000 a year to operate, versus $15.4 million in energy costs for traditional systems.
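A quick back-of-envelope check shows those two figures are consistent with the roughly 95 percent energy-reduction claim made above:

```python
# Back-of-envelope check of the operating-cost figures quoted above.
eco_pod_annual = 552_000          # USD per year (HP POD 240a, per HP)
traditional_annual = 15_400_000   # USD per year (traditional comparison, per HP)

ratio = eco_pod_annual / traditional_annual
print(f"EcoPOD runs at {ratio:.1%} of the traditional cost ({1 - ratio:.0%} lower)")
# -> EcoPOD runs at 3.6% of the traditional cost (96% lower)
```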

Built at a special HP facility in Houston, HP POD-Works, the EcoPODs will be available in Q4 of this year in North America, rolling out globally into 2012.

HP is also offering a leasing arrangement, whereby the costs of the data center become operating expenses, with little up-front cost.

You may also be interested in:

Friday, June 3, 2011

MuleSoft takes full-service integration to the cloud with iON iPaaS ESB platform

Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Read a full transcript or download a copy. Join the iON beta program. Sponsor: MuleSoft.

Enterprise application integration (EAI) as a function is moving out of the enterprise and into the cloud. So-called integration platform as a service (iPaaS) has popped up on the edge of the enterprise. But true cloud integration as a neutral, full-service, and entirely cloud-based offering has been mostly only a vision.

Yet, if businesses need to change rapidly as the cloud era unfolds, to gain and use new partners and new services, then new and flexible integration capabilities across and between extended applications and services are essential.

The older point-to-point methods of IT integration, even for internal business processes, are slow, brittle, costly, complex and hard to manage. Into this opportunity for a new breed of cloud integration services steps MuleSoft, a market-leading open-source enterprise service bus (ESB) provider, which aims to create a true cloud integration platform called Mule iON. [Disclosure: MuleSoft is a sponsor of BriefingsDirect podcasts.]

MuleSoft proposes nothing short of an iPaaS service that spans software as a service (SaaS) to legacy, SaaS to SaaS, and cloud to cloud integration. In other words, all of the above, when it comes to integrations outside of the enterprise.

BriefingsDirect recently learned more about MuleSoft iON, how it works and its pending general availability in the summer of 2011. There's also the potential for an expanding iON marketplace that will provide integration patterns as shared cloud applications, with the likelihood of spawning constellations of associated business-to-business ecosystems.

Explaining the reality of a full-service cloud-based integration platform solution are two executives from MuleSoft, Ross Mason, Chief Technology Officer and Founder, and Ali Sadat, the Vice President of Mule iON at MuleSoft. They are interviewed by Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:
Gardner: It strikes me that the number of integrations that need to be supported are further and further toward the edge -- and then ultimately outside the organization.

Like it or not, any company of any size has some if not most of its data now outside of the firewall.



Mason: We describe it internally as the center of enterprise gravity shifting. The web is the most powerful computing resource we’ve had in the information age, and it’s starting to drag the data away from the enterprise and out into the platform itself. What this means for enterprises is that, like it or not, any company of any size has some, if not most, of its data now outside of the firewall.

I'm not talking about the Fortune 2000. They still have 95 percent of their data behind the firewall, but they’re also changing. But, for all of the enterprises and for forward-thinking CIOs, this is a very big and important difference in the way that you run your IT infrastructure and data management and security and everything else.

It turns a lot of things on its head. The firewall is constructed to keep everything within. What’s happening is the rest of the world is innovating at a faster speed and we need to take advantage of that inside enterprises in order to compete and win in our respective businesses.

There are a number of drivers in the marketplace pushing us toward integration as a service and particularly iPaaS. First of all, if we look back 15 years, integration became a focal point for enterprises because applications were siloing their data in their own databases, and for businesses to be more effective, they had to get that data out of those silos and into a more operational context, where they could run extended business processes, etc.

What we're seeing with cloud, and in particular the new wave of SaaS applications, is that we're making a lot of the same mistakes and repeating the same behaviors that we did 10 years ago in the enterprise. Every new SaaS application becomes a new data silo in the cloud, and it’s creating an integration challenge for anyone that has data across multiple SaaS providers.

New computing models

And it's not just SaaS. The adoption of SaaS is one key thing, but also the adoption of cloud and hybrid computing models means that our data also no longer lives behind the firewall. Couple that with the drivers around mobile computing that are enabling our workforce and consumers, when they are on the go, again, outside of the firewall.

Add to that social media networks, and you have a wealth of new information about your employees, customers, and consumers available through things like LinkedIn and Facebook. You've also got the big data explosion. The rise of things like Hadoop for managing unstructured data has meant that we end up pushing more data outside of our firewalls to third-party services that help us understand our data better.

There are four key drivers: the adoption of SaaS applications; the movement toward more cloud and hybrid models; mobile, which is driving a need to have data outside of the enterprise; and then social media and big data together, which are redefining where we find and how we read our information.

Gardner: It also appears that there will be a reinforcing effect here. The more that enterprises use cloud services, the more they’ll need to integrate. The more they integrate, the more capable and powerful the cloud services, and so on and so on. I guess we could anticipate a fairly rapid uptake in the need for these external integrations.

Mason: We think we might be a bit early in carving out the iPaaS market, but the response we're hearing, even from our largest organizations, is that most have lots of needs around cloud integration, even if it's just to help homogenize departmental applications. We’ve been blown away at MuleSoft by the demand for iON already. [Join the iON beta program.]

New open enterprises

The open-source model is absolutely critical, and the reason is that one of the biggest concerns for anyone adopting technology is: who am I getting into bed with? If I buy from Amazon, ultimately I'm getting into bed with Amazon and their whole computing model, and it’s not an easy thing to get out of.

With integration, it’s even more of a concern for people. We’ve lived through the vendor lock-in of the 1990s and 2000s, and people are a little bit gun-shy from the experiences they had with product vendors like Atria and IBM and Oracle.

When they start thinking about IaaS or the cloud, then having a platform that’s open and freely available and that they can migrate off or on to and manage themselves is extremely important. Open source, and particularly MuleSoft and the Mule ESB, provides that platform.

Gardner: Ali, how do you see iPaaS process enablement happening?

Sadat: It’s a pretty interesting problem that comes up. The patterns and the integrations that you need to do now are getting, in a sense, much more complex, and it’s definitely a challenge for a lot of folks to deal with it.

We’re talking not only about cloud-to-cloud or enterprise-to-enterprise integration, but about extending it beyond the enterprise to various clouds, with data flowing either from the enterprise to the cloud or from the cloud to the enterprise. The problems are getting a little more challenging to solve.

The other thing that we’re seeing out there is that a lot of different application programming interfaces (APIs) are popping up -- more and more every day. There are all kinds of different technologies being exposed as either traditional web services or REST-based web services.

We’re seeing quite a few APIs; by some accounts, we're in the thousands or tens of thousands right now. They're going to be exposed out there for folks who are trying to figure out how to integrate. [Join the iON beta program.]

Gardner: What do you propose for that?

Hybrid world

Sadat: It’s somewhat of a hybrid world, and I think the answer to that is a hybrid model, but it needs to be very seamless from the IT perspective.

If I want to do a real-time integration between Salesforce and SAP, how do I enable that? If I poke holes in my firewall, that’s going to expose all kinds of security risks that my network security folks are not going to like. So how do I enable that? This is where iON comes into play.

We’re going to sit there in a cloud, open up a secure public channel where Salesforce can post events to iON, and then, via a secured connection back to the enterprise, we can deliver that directly to SAP. We can do the reverse, too. This is something that the traditional TIBCOs and webMethods of the world weren’t designed to solve; they weren’t even thinking about this problem when they designed and developed those applications. [Join the iON beta program.]
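As a rough illustration of that pattern -- and emphatically not MuleSoft iON code -- here is a minimal Python sketch of a cloud-hosted relay that accepts SaaS events and forwards them toward an on-premise system. The route and gateway URL are hypothetical; a production setup would authenticate both hops and typically have the enterprise side connect outbound to the cloud broker rather than expose an inbound endpoint:

```python
# Conceptual sketch only: cloud endpoint receives a SaaS event (e.g. Salesforce)
# and relays it toward an on-premise system (e.g. SAP) over a secured channel.

from flask import Flask, request, jsonify
import requests

app = Flask(__name__)

ENTERPRISE_GATEWAY = "https://gateway.example.internal/sap/orders"  # hypothetical

@app.route("/events/salesforce", methods=["POST"])
def relay_salesforce_event():
    event = request.get_json(force=True)
    # Forward the event over a secured (TLS, authenticated) connection.
    resp = requests.post(ENTERPRISE_GATEWAY, json=event, timeout=10)
    return jsonify({"forwarded": resp.ok}), (202 if resp.ok else 502)

if __name__ == "__main__":
    app.run(port=8080)
```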

The difference between integration running on-premise or in the cloud shouldn't matter as much, and the tooling should be the same. So, it should support cloud-based management and also be able to manage and drive integrations in the enterprise with on-premise tools.

One of the things you’ll see about iON is a lot of familiar components. If you have been a Mule or Mule ESB user, you will see that at the heart of iON itself. What we're providing now is the ability to deploy your solutions -- your integration applications -- to the cloud and manage them there, but we're also going to give you the capability to integrate back into the enterprise.

Gardner: Why not just use what Salesforce provides you and let that be the integration point? Why would you separate the integration cloud capability?

Sadat: Integration, as a whole, is much better served by a neutral party than by any one of the application vendors. You can certainly write custom code to do it, and people have been doing that, but they've seen over and over that it doesn’t necessarily work.

Having a neutral platform that speaks to APIs on both sides is very important. You’re not going to find Salesforce, for example, adopting SAP APIs, and vice versa. So, having that neutral platform is very important. Then, having that platform and being able to carry out all the kinds of different integration patterns that you need is also important.

We do understand the on-premise side of it. We understand the cloud side of the problem. We're in a unique position to bring those two together.

Having that platform and being able to carry out all the kinds of different integration patterns that you need is also important.



Gardner: Ross, please define for me what you consider the top requirements for this unique new neutral standalone integration cloud?

Mason: I'll start with the must-haves on the PaaS itself. In my mind the whole point of working with a PaaS is not just to do integration, but it’s for a provider, such as MuleSoft, to take all the headache and hard work out of the architecture as well.

For me, a true PaaS would allow a customer to buy a service level agreement (SLA) for the integration applications. That means they are not thinking about CPUs, architecture, I/O or memory usage, but just defining the characteristics they want from their application. That, to me, is the Holy Grail of why a PaaS is so much better.

For integration, you need that, plus deep expertise in the integration itself. Ali just mentioned that people do a lot of their own point-to-point integrations, and SaaS providers do their own point integrations as well.

We spend a lot of money in the enterprise to integrate applications. You do want a specialist there, and you want someone who is independent and will adopt any API that makes sense for the enterprise in a neutral way.

We’re never going to be pushing our own customer relationship management (CRM) application. We're not going to be pushing our own enterprise resource planning (ERP). So, we’re a very good choice for being able to pull data from whichever application you're using. Neutrality is very important.

Hugely important

Finally, going back to the open-source thing again, open source is hugely important, because I want to know that if I build an integration on a Switzerland platform, I can still take that away and run it behind my firewall and still get support. Or, I just want to take it away and run it and manage it myself.

With iON, that’s the promise. You’ll be able to take these integration apps and the integration flows that you build, and run them anywhere. We're trying to be very transparent on how you can use the platform and how you can migrate on as well as off. That’s very important. [Join the iON beta program.]

Gardner: You came out on May 23 with the announcement about iON, describing what you intend to deliver.

Sadat: That’s correct. We started with our private beta, which is coming to an end. As you mentioned, we’re now releasing a public beta. Pretty much anybody can come in, sign up, and get going in a true cloud fashion. [Join the iON beta program.]

We're allowing ourselves a couple of months before the general availability to take in feedback during the beta release. We’re going to be actively working with the beta community members as they use the product and tell us what they think and what they'd like changed.

One of the other things we’re doing soon after the general availability is releasing a series of iON applications that we'll be building. These will be offered both as ways to monetize certain integrations and as reference applications for partners and developers to look at, mimic, and build their own applications on top of.

Gardner: What is it they are going to get?

Sadat: At the core of it, they get Mule. That’s pretty essential, and there’s a whole lot of reasons why they do that. They get a whole series of connectors and various transports they can use. One of the things that they do get with iON is the whole concept of this virtual execution environment sitting in the cloud. They don’t have to worry about downloading and installing Mule ESB. It’s automatically provided. We'll scale it out, monitor it, and provide all that capability in the cloud for them.

Once they’re ready, they push it out to iON, and they execute it. They can then manage and monitor all the various flows that are going through the process.



They just need to focus on their application, the integration problems that they want to solve, and use our newly released Mule Studio to orchestrate these integrations in a graphical environment. Once they’re ready, they push it out to iON, and they execute it. They can then manage and monitor all the various flows that are going through the process.

The platform itself will have a pretty simple pricing model. It’s going to be composed of a couple of different dimensions. As you need to scale out your application, you can run more of these units of work. You'll be able to handle the volume and throughput that you need, but we are also going to be tracking events. This is, in Mule terminology, the equivalent of a transaction. Platform users will be able to buy those in set quantities and then be charged for any overage.
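To illustrate that usage-based model -- with all prices and quantities invented purely for the example, since MuleSoft had not published iON pricing at the time -- a block-plus-overage charge might be computed like this:

```python
# Illustration only: buy event capacity in blocks, pay per extra event.
# All figures are hypothetical, not MuleSoft pricing.

def monthly_charge(events_used, events_included, block_price, overage_per_event):
    overage = max(0, events_used - events_included)
    return block_price + overage * overage_per_event

# Hypothetical plan: 1M events included for $500/month, $0.001 per extra event.
print(monthly_charge(1_250_000, 1_000_000, 500.0, 0.001))   # 750.0
```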

Also, partners and ISVs today don’t have a whole lot of choices in terms of being able to build and embed OEM services in a cloud fashion into various applications or technologies that they are building. So, iON is going to provide that platform for them.

Embeddable platform

One of the key things about the platform itself is that it is very embeddable. Everything is going to be exposed as a series of APIs. SIs and SaaS providers can easily embed that in their own applications and even put their own UI on top of it, so that underneath it is iON, but on top it’s their own look and feel, seamlessly integrated into their own applications and solutions. This is going to be a huge part of iON.

Gardner: Looking at the future how does the mobile trend in particular affect the need for a neutral third-party integration capability?

Mason: Mobile consumers are consuming data, basically. The mobile application model has changed, because now you get data from the server and render it on the device itself. That’s pretty different from the way we’ve been building applications until fairly recently.

What that means is that you need to have that data available as a service somewhere for those applications to pick it up. An iPaaS is a perfect way of doing that, because it becomes the focal point where it can bring data in, combine it in different ways, publish it, scrub it, and push it out to any type of consumer. It's not just mobile, but it’s also point-of-sale devices, the browser, and other applications consuming that data.

Mobile is one piece, because it must have an API to grab the data from, but it’s not the only piece. There are lots of other embedded devices in cars, medical equipment, and everything else.

If you think about that web of devices, it needs to talk to a centralized location, which is not any one enterprise. The enterprise needs to be able to share its data through integration outside of its own firewall in order to create these applications.
Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Read a full transcript or download a copy. Join the iON beta program. Sponsor: MuleSoft.

You may also be interested in:

Wednesday, June 1, 2011

HP's IT Performance Suite empowers IT leaders with unified view into total operations, costs

The IT cobbler's kids finally have their own shoes.

Some 20 years into enterprise resource planning (ERP), whereby IT made business performance metrics and business intelligence (BI) a science for driving the modern corporation, ERP and BI for IT is finally at hand.

HP today unveiled a new suite of software designed to measure and improve IT performance at a comprehensive level via an IT system of record approach. The HP IT Performance Suite gives CIOs the ability to optimize application development, infrastructure and operations management, security, information management, financial planning, and overall IT administration.

The suite and its views into operations via accurate metrics, rather than a jumble of spreadsheets, also set up the era of grasping true IT costs at the business process level, and therefore begin empirical cost-benefit analysis to properly evaluate hybrid computing models. Knowing your current costs -- in total and by discrete domain -- allows executives to pick the degree to which SaaS, cloud and outsourcing form the best bet for their company.

Included with the suite is an IT Executive Scorecard that provides visibility into critical performance indicators at cascading levels of IT leadership. It's founded on an open IT data model with built-in capabilities to integrate data from multiple sources, including third parties, to deliver a single, holistic view of ongoing IT metrics. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

HP has identified 150 standard, best-in-class key performance indicators (KPIs), 50 of which are included in the Executive Scorecard as a starting dashboard. KPIs are distributed in customizable dashboards that provide real-time, role-based performance insights to technology leadership, allowing alignment across common goals for an entire IT organization.

"With this we can leverage our human capital better," said Alexander Pasik, PhD, CIO at the IEEE, based in Piscataway, NJ. "The better automation you can apply to IT operations, the better. It frees us up to focus on the business drivers."

Lifecycle approach

The Performance Suite uses HP's lifecycle approach to software development and management and integrates industry standards such as ITIL.

The first solution to be offered in the suite is the CIO Standard Edition, which includes the Executive Scorecard, along with financial planning and analysis, project and portfolio management (PPM), and asset manager modules. This edition automatically integrates data from the modules to provide more than 20 best-practice KPIs covering financial and project health, enabling the optimization of IT performance from a business investment point of view.

"Our use of IT is about driving the actual business," said Pasik, who is adopting elements of the suite and looks forward to putting the scorecard to use soon. "We need to measure IT overall. We will have legitimate metrics on internal operations."

The scorecard can be very powerful at this time in computing, said Piet Loubser, HP senior director of product marketing, because the true capital expenses versus operations expenses for IT can be accurately identified. This, in turn, allows for better planning, budgeting and transitioning to IT shared services and cloud models. Such insight also allows IT to report to the larger organization with authority on its costs and value.

Using the scorecard, said Loubser, IT executives can quickly answer with authority two heretofore vexing questions: Is IT on budget, and is IT on time?
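As a purely illustrative sketch -- not HP's KPI definitions, which the suite encodes in its own data model -- the kind of roll-up behind those two questions might look like this, with the project figures invented for the example:

```python
# Illustrative scorecard-style metrics over a hypothetical project portfolio:
# share of projects on budget and share delivered on time.

projects = [
    # (name, budget_usd, actual_usd, planned_days, actual_days) -- sample data
    ("ERP upgrade",    1_200_000, 1_150_000, 180, 200),
    ("Portal refresh",   300_000,   360_000,  90,  85),
    ("DW migration",     750_000,   740_000, 120, 118),
]

on_budget = sum(actual <= budget for _, budget, actual, _, _ in projects)
on_time   = sum(act_d <= plan_d for _, _, _, plan_d, act_d in projects)

print(f"On budget: {on_budget}/{len(projects)}  On time: {on_time}/{len(projects)}")
# -> On budget: 2/3  On time: 2/3
```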

What's especially intriguing for me is the advent of deeper BI for IT, whereby data warehouses of the vast store of IT data can be assembled and analyzed. There is a treasure trove of data and insights into how businesses operate inside of the IT complex.

Applying BI best practices, pre-built data models, and ongoing reference metrics on business processes to the IT systems that increasingly reflect the business operations themselves portends great productivity and agility benefits. Furthermore, getting valid analysis on IT operations allows for far better planning of future data center needs, modernization efforts, applications lifecycle management, and the comparing and contrasting needed for hybrid model adoption ... or not.

For more information, visit the suite's HP website, www.hp.com/go/itperform.

The announcement of the HP IT Performance Suite comes less than a week before HP's massive Discover conference in Las Vegas, where additional significant news is expected. I'll be doing a series of on-site podcasts from the conference on HP user case studies and on the implications and analysis of the news and trends. Look for them on this blog or on the BriefingsDirect partner site.

You may also be interested in: