Thursday, April 12, 2012

SAP and databases no longer an oxymoron

This guest post comes courtesy of Tony Baer’s OnStrategies blog. Tony is a senior analyst at Ovum.

By Tony Baer

In its rise to leadership of the ERP market, SAP shrewdly placed bounds around its strategy: it would stick to its knitting on applications and rely on partnerships with systems integrators to get critical mass implementation across the Global 2000. When it came to architecture, SAP left no doubt of its ambitions to own the application tier, while leaving the data tier to the kindness of strangers (or in Oracle’s case, the estranged).

Times change in more ways than one – and one of those ways is in the data tier. The headline acquisition of Sybase (primarily for its mobile assets) and the subsequent emergence of HANA, SAP's new in-memory data platform, placed SAP in the database market. And so it was that at an analyst meeting last December, SAP made the audacious declaration that it wanted to become the #2 database player by 2015.

Of course, none of this occurs in a vacuum. SAP's declaration that it will become a front-line player in the database market threatens to destabilize existing relationships with Microsoft and IBM, as longtime SAP observer Dennis Howlett commented in a ZDNet post. OK, sure, SAP is sick of leaving money on the table for Oracle. But if the database is the thing, says Howlett, then to meet its stretch goal SAP and Sybase would have to grow that part of the business by a cool 6x – 7x.

But SAP would be treading down a ridiculous path if it were just trying to become a big player in the database market for the heck of it. Fortunately, during SAP's press conference announcing its new mobile and database strategies, chief architect Vishal Sikka tamped down the #2 aspirations, because market rank isn't really the point: it's the apps that count, and increasingly, it's the database that makes the apps. Once again.

Main point

Back to our main point: IT innovation goes in waves. During the emergence of client/server, innovation focused on the database, where the challenge was mastering SQL and relational table structures; during the latter stages of client/server and the subsequent waves of Web 1.0 and 2.0, activity shifted to the app tier, which grew more distributed.

With the emergence of Big Data and Fast Data, energy has shifted back to the data tier, given the efficiencies of processing data – big or fast – inside the data store itself. Not surprisingly, when you hear SAP speak about HANA, it describes the ability to tackle more complex analytic problems or compound operational transactions. It's no coincidence that SAP now states that it's in the database business.
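To make that point concrete, here is a minimal, hypothetical sketch of what "processing inside the data store" means in practice: pushing an aggregation down to the engine instead of dragging every row up to the application tier. SQLite stands in for any SQL data store; nothing here is SAP- or HANA-specific.

    import sqlite3

    # Toy comparison of app-tier vs. in-database aggregation. SQLite stands in
    # for any SQL engine; the table and values are made up for illustration.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?)",
        [("EMEA", 120.0), ("EMEA", 80.0), ("APJ", 200.0), ("AMER", 50.0)],
    )

    # App-tier approach: ship every row to the application, then aggregate there.
    totals = {}
    for region, amount in conn.execute("SELECT region, amount FROM orders"):
        totals[region] = totals.get(region, 0.0) + amount

    # In-database approach: push the aggregation down to the engine and move
    # only the much smaller result set.
    pushed_down = dict(
        conn.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
    )

    assert totals == pushed_down  # same answer, far less data movement at scale

At toy scale the difference is invisible; at Big Data or Fast Data scale, the data movement implied by the first approach is exactly what in-memory and in-database platforms are built to avoid.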

So how will SAP execute its new database strategy? Given the hype over HANA, how does SAP convince Sybase ASE, IQ, and SQL Anywhere customers that they're not headed down a dead-end street?

That was the point of the SAP announcements, which, in the press release, laid out the near-term roadmap but shed little light on how SAP would get there. Specifically, the announcements were:
  • SAP HANA is now going GA and, at the low (SMB) end, comes out with aggressive pricing: roughly $3,000 for SAP Business One on HANA and $40,000 for HANA Edge.

    It’s no coincidence that SAP now states that it’s in the database business.


  • Ending a 15-year saga, SAP will finally port its ERP applications to Sybase ASE, with a tentative target date of year end. HANA will play a supporting role as the real-time reporting adjunct for ASE customers.
  • Sybase SQL Anywhere would be positioned as the mobile front end database atop HANA, supporting real-time mobile applications.
  • Sybase’s event stream (CEP) offerings would have optional integration with HANA, providing convergence between CEP and BI – where rules are used for stripping key event data for persistence in HANA. In so doing, analysis of event streams could be integrated or directly correlating with historical data.
  • Integrations are under way between HANA, IQ, and Hadoop.
  • Sybase is extending its PowerDesigner data modeling tools to address each of its database engines.
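The CEP-to-HANA item above is the most concrete of these, so here is a rough sketch of the pattern it describes: a rule applied to each event in a stream strips out the key events, persists them, and correlates them against the history accumulated so far. Everything in it, from the threshold to the in-memory list standing in for a HANA table, is hypothetical illustration rather than SAP or Sybase API.

    from datetime import datetime, timezone

    LARGE_TRADE_THRESHOLD = 1_000_000   # the CEP-style rule: keep only large trades

    historical_store = []               # stand-in for a HANA table of persisted key events

    def persist_event(event):
        """Persist a stripped-down record of the event (stand-in for a HANA insert)."""
        historical_store.append(
            {"symbol": event["symbol"], "value": event["value"], "ts": event["ts"]}
        )

    def historical_average(symbol):
        """Correlate against history: average value of previously persisted events."""
        values = [e["value"] for e in historical_store if e["symbol"] == symbol]
        return sum(values) / len(values) if values else None

    def on_event(event):
        """Rule applied to each event as it flows through the stream."""
        if event["value"] >= LARGE_TRADE_THRESHOLD:
            baseline = historical_average(event["symbol"])
            persist_event(event)
            if baseline and event["value"] > 2 * baseline:
                print(f"{event['symbol']}: {event['value']} is more than 2x the historical average")

    # Feed a toy stream through the rule.
    for ev in [
        {"symbol": "XYZ", "value": 1_200_000, "ts": datetime.now(timezone.utc)},
        {"symbol": "XYZ", "value": 50_000,    "ts": datetime.now(timezone.utc)},
        {"symbol": "XYZ", "value": 3_000_000, "ts": datetime.now(timezone.utc)},
    ]:
        on_event(ev)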
Most of the announcements, like HANA going GA or Sybase ASE supporting SAP Business Suite, were hardly surprises. Aside from go-to-market issues, which are many and significant, we'll direct our focus to the technology roadmaps.

We’ve maintained that if SAP were serious about its database goals, that it had to do three basic things:
  1. Unify its database organization. The good news is that it started down that path as of January 1 of this year. Of course, org charts are only the first step; ultimately it comes down to people.
  2. Branding. Although long eclipsed in the database market, Sybase still has an identifiable brand and would be the logical choice; for now SAP has punted.
  3. Cross-fertilize technology. Here, SAP can learn lessons from IBM, which, despite (or because of) acquiring multiple products that fall under different brands, freely blends technologies. For instance, Cognos BI reporting capabilities are embedded into Rational and Tivoli reporting tools.
Heavy lifting

The third part is the heavy lift. For instance, given that data platforms increasingly employ advanced caching, it would at first glance seem logical to blend some of HANA's in-memory capabilities into the ASE platform; architecturally, however, that would be extremely difficult, as one of HANA's strengths – dynamic indexing – would be hard to replicate in ASE.

On the other hand, given that HANA can index or restructure data on the fly (e.g., organize data into columnar structures on demand), the question is: does that make IQ obsolete? The short answer is that while memory keeps getting cheaper, it will never be as cheap as disk, and IQ could therefore evolve into near-line storage for HANA.
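For readers who don't live in column stores, here is a toy sketch of what "organize data into columnar structures on demand" means conceptually: pivoting row-oriented records into one contiguous list per column, so that an analytic scan touches only the columns it needs. This is illustration only, not HANA's actual mechanism.

    # Row-oriented records, as an operational system might hold them.
    rows = [
        {"order_id": 1, "region": "EMEA", "amount": 120.0},
        {"order_id": 2, "region": "APJ",  "amount": 200.0},
        {"order_id": 3, "region": "EMEA", "amount":  80.0},
    ]

    def to_columnar(records):
        """Pivot a list of row dicts into one list per column."""
        columns = {key: [] for key in records[0]}
        for record in records:
            for key, value in record.items():
                columns[key].append(value)
        return columns

    cols = to_columnar(rows)

    # An analytic scan now reads only the column it needs: summing "amount"
    # never touches order_id or region, which is the point of a column store.
    print(sum(cols["amount"]))   # 400.0

The economics question then becomes where those columnar structures live: in memory (HANA), on disk as near-line storage (IQ), or, eventually, in Hadoop.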

Of course, that raises the question of whether Hadoop could eventually perform the same function. SAP maintains that Hadoop is too slow and therefore should be reserved for offline cases; that's certainly true today, but given developments with HBase, it could easily become fast and cheap enough for SAP to revisit the IQ question a year or two down the road.

Not that SAP Sybase is sitting still with Hadoop integration. It is adding MapReduce and R capabilities to IQ (SAP Sybase is hardly alone here, as most Advanced SQL platforms offer similar support). SAP Sybase is also providing the ability to map IQ tables into Hadoop's Hive, slotting IQ in as an alternative to HBase.

In effect, that’s akin to a number of strategies to put SQL layers inside Hadoop (in a way, similar to what the lesser-known Hadapt is doing). And of course, like most of the relational players, SAP Sybase is also support the bulk ETL/ELT load from HDFS to HANA or IQ.

On SAP’s side for now is the paucity of Hadoop talent, so pitching IQ as an alternative to HBase may help soften the blow for organizations seeking to get a handle. But in the long run, we believe that SAP Sybase will have to revisit this strategy. Because, if it’s serious about the database market, it will have to amplify its focus to add value atop the new realities on the ground.

This guest post comes courtesy of Tony Baer’s OnStrategies blog. Tony is a senior analyst at Ovum.


Tuesday, April 10, 2012

Top 10 ways HP is different and better when it comes to cloud computing

Today HP filled out its cloud computing strategy with a broad-based set of products and services, and set a date (finally!) for its public cloud debut: May 10.

See my separate earlier blog on all the news. Disclosure: HP is a long-time sponsor of my BriefingsDirect podcasts. I've come to know HP very well in the past 20 years of covering them as a journalist, analyst and content producer. And that's why I'm an unabashed booster of HP now, despite its well-documented cascade of knocks and foibles.

By waiting to tip its hand on how it will address the massive cloud opportunity, HP has clearly identified Cloud as a Business (CaaB) as the real, long-term opportunity. And HP appreciates that the more it can propel CaaB forward for businesses, organizations and individuals, the more successful it will be, too.

Top 10 reasons

Here's why the cloud, as we now know it, is the best thing that could have happened to HP, and why HP is poised to excel from its unique position to grow right along with the global cloud market for many years. These are the top 10 reasons HP is different and better when it comes to cloud computing:

  1. Opportune legacy. HP is not wed to a profits-sustaining operating system platform, integration middleware platform, database platform, business applications suite, hypervisor, productivity applications suite, development framework, or any other software infrastructure that would limit its ability to rapidly pursue cloud models without badly -- or fatally -- hurting itself financially.

  2. HP has been ecumenical in supporting all the major operating environments -- Unix, Linux, Windows -- as well as all major open-source and commercial IT stacks, middleware, virtual machines and applications suites across development and deployment, longer and more broadly than anyone, anywhere. This includes products, technology, and services. Nothing prevents HP from doing the same as new innovators arrive -- or adjusting as incumbents leave. HP's robust support of OpenStack and KVM continues this winning score.

  3. Cloud-value software. The legacy computing software products that HP is deeply entrenched with -- application development lifecycle, testing and quality assurance, performance management, systems management, portfolio management, business services management, universal configuration management databases, enterprise service bus, SOA registry, IT financial management suite (to name a few) -- are all cloud-enablement value-adds. And HP has a long software-as-a-service (SaaS) heritage in test and development and other applications delivery. These are not millstones on the path to full cloud business model adoption, they are core competencies.

  4. The right hardware. HP has big honking Unix and high-performance computing platforms, yes, but it bet big and rightly on energy- and space-efficient blades and x86-architecture racks, rooms, pods and advanced containers. HP also correctly saw the future in virtualization for servers, storage and networking, and its various lines of converged infrastructure hardware and storage acquisitions are very much designed of, by, and for super-efficient, fit-for-purpose uses like cloud.

  5. Non-sacred cash cows. HP has a general dependency on revenue from PCs and printers, for sure. But, unlike other cash-cow dependencies from other large IT vendors, these are not incompatible with large, robust and ruthless cloud capitalization. PCs and printers may not be growing like they used to, but high growth in cloud businesses won't be of a zero-sum nature with traditional hardware clients either. As with item number 1 above, the interdependencies do not prohibit rapid cloud model pursuit.

  6. Security, still the top inhibitor to cloud adoption, is not a product but a process born of experience, knowledge, and implementation at many technology points. HP wisely built, bought and partnered to architect security and protection into its products and services broadly. As the role of public cloud provider as first line of defense and protection to all its users grows, HP is in excellent shape to combine security services across hybrid cloud implementations and cloud ecosystem customers.

  7. Data and unstructured information. Because HP supports many databases, commercial and open source, it can act as a neutral partner in mixed environments. Its purchase of Autonomy gives it unique strength in making unstructured data as analyzed, controlled and managed as structured, relational data -- even combining the analytics value between and among them. The business value of data is in using it at higher, combined abstractions across clouds, a role HP can do more in, but nothing should hold it back. It's already providing MySQL cloud data services.

  8. Technology services and professional services. HP, again the partner more than the interloper, developed technology services that support IT, help desks, and solutions-level build requirements, but it did not become a global systems integrator like IBM, where channel conflict and "coopetition" work against ecosystem-level synergies. It is these process synergies now -- a harmonic supply chain among partners for IT services delivery (not systems-level integration) -- that cloud providers need in order to grow. HP can foster the technology services and new kinds of services integration along a cloud business process continuum that please both enterprise customers and vertical cloud providers.

  9. Management, automation, remote services. Those enterprises and small and medium businesses (SMBs) making the transition from virtualization and on-premises data centers to cloud and hybrid models want to keep their hands firmly on the knobs of their IT, while seeing less and less of the actual systems. Remote management, unified management, and business service management are keystones of hybrid computing, and HP is a world leader in them. Again, they are core competencies for advanced cloud use by both enterprises and cloud providers. And HP's performance management insights and continuous improvement across cloud and business services become a key differentiator.

  10. Neutrality, trust, penetration and localization. While HP is in bed with everyone in global IT ecosystems, they are not really married. The relationship with Microsoft is a perfect example. HP is large enough not to be bullied (we'll see how Oracle does with that), but not so aggressive that it makes enemies and loses the ability to deliver solutions to end users because of conflict in the channel or ecosystem. Cloud-of-clouds services and CaaB values will depend on trust and neutrality, because the partner to the cloud providers is the partner to the users. Both need to feel right about the relationship. HP may be far ahead of all but a few companies in all but a few markets in this role.
Full cloud continuum

The breadth and depth of HP's global ambitions evident from today's news shows its intent to provide a full cloud kit continuum -- from code to countless hybrid cloud options. Most striking for me is HP's strategy of enabling cloud provisioning and benefits for any enterprise, large or small. There are as many on-ramps to cloud benefits realization as there are types of users, as there should be. Yet all the cloud providers need to be competitive in their offerings, and they themselves will look to outsource that which is not core.

As Ali Shadman, vice president and chief technologist in HP's Technology Consulting group, told me, HP's cloud provider customers need to deliver Apple "iCloud-like services," and they need help to get there fast. They need help knowing how to bill and invoice, to provide security, to find the right facilities. HP then is poised to become the trusted uncle with a host of cloud strengths for these myriad cloud providers worldwide as they grow and prosper.

This is different from selling piecemeal the means to virtualized private-cloud implementations, or putting a different meter on hosted IT services. This is not one-size-fits-all cloud APIs, or Heathkits for cloud hackers. This is not cloud in a box. HP is approaching cloud adoption as a general business benefit, as a necessary step in the evolution of business as an on-demand architectural advancement. It knows that clouds themselves are made up of lots of services, and HP wants a big piece of that supply chain role -- as partner, not carnivore.

Shadman said that HP is producing the type of public cloud that appeals to serious and technical providers and enterprises, not hobbyists and -- dare I say it -- Silicon Valley startups on a shoestring. This is business-to-business cloud services for established global, regional and SMB enterprises.

HP is showing that it can have a public cloud, but still be a partner with those building their own targeted public clouds. By viewing clouds as a supply chain of services, HP seeks to empower ecosystems at most every turn, recognizing that 1,000 points of cloud variability is the new norm.

We should expect managed service providers, independent software vendors, and legacy enterprise operators all to want a type of cloud that suits their needs and their heritage, and that best serves their end customers. They will be very careful about whom they align with, and to whom they outsource their futures.

Of course, HP is seeking to differentiate itself from Amazon, Google, Microsoft, IBM, Oracle, VMware, Citrix and Rackspace. It seems to be doing so through a maturity-model approach to cloud, not positioning itself as the only choice, or as one size fits all. HP wants to grow the entire cloud pie. HP seems confident that if cloud adoption grows, it will grow well, too, perhaps even better.

In other words, HP is becoming a booster of cloud models for any type of organization or business or government, helping them to not only build or acquire cloud capabilities, but seeding the business and productivity rationale for cloud as a strategy … indefinitely.

This is very much like Google's strategy over the past 10 years: anything that propels the Web forward for as many businesses, organizations and individuals as possible makes Google more successful with its search and advertising model. This has worked well for Google, and continues to. It places a higher abstraction on the mission -- more than just selling more ads, it grows the whole pie.

I believe we're early enough in the cloud game that the emphasis should be on safely enabling the entire cloud enterprise, on growing the pie for everyone. It's too soon to try to carve off a platform or application lock-in and expect to charge a toll to those caught in some sort of trap. Those days may actually be coming to an end.


HP takes its cloud support approach public while converging its offerings around OpenStack, KVM

HP today announced major components and details of its HP Converged Cloud strategy, one that targets enterprises, service providers and small- and medium-sized businesses (SMBs) with a comprehensive yet flexible approach designed to "confidently" grow the cloud computing market fast around the globe.

HP has clearly exploited the opportunity to step back and examine how the cloud market has actually evolved, and taken pains to provide those who themselves provide cloud-enabled services with an architected path to business-ready cloud services. From its new Cloud Maps to HP Service Virtualization 2.0, HP is seeking to hasten and automate how other people's cloud services come to market.

And finally, HP has put a hard date on its own public cloud offering, called HP Cloud Services, which enters public beta on May 10. See my separate analysis blog on the news. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

HP's public cloud is geared more as a cloud provider support service set, rather than a componentized infrastructure-as-a-service (IaaS) depot for end-user engineers, start-ups and shadow IT groups. HP Cloud Services seems designed more to appeal to mainstream IT and developers in ISVs, service providers and enterprises, a different group than early public cloud entrant and leader Amazon Web Services has attracted.

There's also an urgency in HP's arrangement of products and services enabling a hastened path to hybrid computing values, built on standards and resisting lock-in, with an apparent recognition that very little in cloud's future will be either fully public or fully private. HP has built a lot of its cloud architecture around "hardened open source technology from OpenStack," and chose the GNU-licensed KVM hypervisor for its implementation.

The OpenStack allegiance puts HP squarely in an open-source ecosystem of cloud providers, including Rackspace, Red Hat and Cisco. IBM is said to be on board with OpenStack too. And former OpenStack charter contributor Citrix recently threw its efforts behind CloudStack under the Apache Foundation.

HP's OpenStack and KVM choices also keep it at odds with otherwise-partners Microsoft and VMware.

There were more details today, too, on HP's data-services-in-the-cloud strategy, apparently built on MySQL. We should expect more data services from HP, including information services from its recent Autonomy acquisition. The data and "information-as-a-service" support trend could put big wind in HP's cloud sails, and undercut its competitors' on-premises cash flow.

Data drives on-premises infrastructure spend now, but it will increasingly drive cloud spend in the future. As for the latter, what the data engine/platform under the cloud hood is matters less than whether the skills already in the market -- like SQL -- can use it readily, as is the case with the xSQL crowd.

Furthermore, the economics of data services hosting may favor HP. If HP can help cloud providers to store, manage and analyze MySQL and related databases and information as a service with the most efficiency, then those providers using HP cloud support not only beat out on-premises data services on cost, they beat out other non-HP cloud providers, too. Could data analytics services then become a commodity? Yes, and HP could make it happen, while making good revenue on the infrastructure and security beneath the data services.

The announcement

But back to today's news:
  • Beginning May 10, HP Cloud Services will deliver its initial offering, HP's public IaaS services, as a public beta. These include elastic compute instances (virtual machines), online storage capacity, and accelerated delivery of cached content. Still in private beta are a relational database service for MySQL and a block storage service that supports moving data from one compute instance to another. (A rough sketch of what provisioning against this kind of API might look like follows the list below.)

  • HP’s new Cloud Maps extends HP CloudSystem by adding prepackaged templates that create a catalogue of application services, cutting the time to create new cloud services for enterprise applications from months to minutes, says HP.

  • HP Service Virtualization 2.0, which tests the quality and performance of cloud or mobile applications without disrupting production systems and includes access to restricted services in a simulated, virtualized environment.

  • HP Virtual Application Networks, which help speed application deployment, automate management and ensure network service-level agreements (SLAs) in delivering cloud and virtualized applications across the HP FlexNetwork architecture.

  • HP Virtual Network Protection Service adds security at the network virtualization management layer to help mitigate common threats.

  • HP Network Cloud Optimization Service helps clients enhance their network to improve cloud-based service delivery up to 93 percent compared to traditional download techniques, says HP.

  • HP Enterprise Cloud Services, which provide outsourced cloud management for private clouds, business continuity and disaster recovery services, and unified communications.

  • In targeting a vertical industry capability, Engineering Cloud Transformation Services are designed to help product development and engineering design teams move to the cloud.
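Since HP says its cloud is built on OpenStack, provisioning against HP Cloud Services should presumably look like any OpenStack-style request: authenticate against an identity endpoint, then ask the compute API to boot a server. The sketch below is just that and nothing more; the endpoints, credentials, and image/flavor IDs are placeholders rather than HP's actual URLs or catalog entries.

    import requests

    # Placeholder endpoints -- not HP Cloud Services' actual URLs.
    AUTH_URL = "https://identity.example-cloud.com/v2.0/tokens"
    COMPUTE_URL = "https://compute.example-cloud.com/v2/TENANT_ID"

    def get_token(username, password, tenant_name):
        """Authenticate against a Keystone v2-style identity service."""
        payload = {
            "auth": {
                "passwordCredentials": {"username": username, "password": password},
                "tenantName": tenant_name,
            }
        }
        resp = requests.post(AUTH_URL, json=payload)
        resp.raise_for_status()
        return resp.json()["access"]["token"]["id"]

    def boot_server(token, name, image_ref, flavor_ref):
        """Ask a Nova-style compute API to launch a virtual machine."""
        payload = {"server": {"name": name, "imageRef": image_ref, "flavorRef": flavor_ref}}
        resp = requests.post(
            COMPUTE_URL + "/servers",
            json=payload,
            headers={"X-Auth-Token": token},
        )
        resp.raise_for_status()
        return resp.json()["server"]["id"]

    if __name__ == "__main__":
        token = get_token("demo-user", "demo-password", "demo-tenant")
        server_id = boot_server(token, "web-01", "IMAGE_ID_PLACEHOLDER", "FLAVOR_ID_PLACEHOLDER")
        print("requested server:", server_id)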
As part of the announcement, HP also delivered new Cloud Security Alliance training courses.

More information about HP’s new cloud solutions and services is available at http://www.hp.com/go/convergedcloud2012.


Monday, April 9, 2012

Learn why Ducati races ahead with private cloud and a virtualization rate approaching 100 percent

Listen to the podcast. Find it on iTunes/iPod. Read a full transcript or download a copy. Sponsor: VMware.

High-performance motorcycle designer and manufacturer Ducati Motor Holding has greatly expanded its use of virtualization and is speeding toward increased private cloud architectures.

With a server virtualization rate approaching 100 percent, Ducati has embraced virtualization rapidly in just the past few years, with resulting benefits of application flexibility and reduced capital costs. As a result, Ducati has also embraced private cloud models now across both its racing and street bike businesses.

To learn more about the technical and productivity benefits of virtualization and private clouds, we're joined by Daniel Bellini, the CIO at Ducati Motor Holding in Bologna, Italy. He's interviewed by Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: VMware is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Bellini: Probably most people know about Ducati and the fact that Ducati is a global player in sports motorcycles. What some people may not know is that Ducati is not a very big company. It's a relatively small company, selling little more than 40,000 units a year and has around 1,000 employees.

At the same time, we have all the complexities of a multinational manufacturing company in terms of product configuration, supply chain, or distribution network articulation. Virtualization makes it possible to match all these business requirements with available human and economical resources.

Gardner: Some people like to gradually move into virtualization, but you've moved in very rapidly and are at almost 98 percent. Why so fast?

Bellini: Because of the company’s structure. Ducati is a privately owned company. When I joined the company in 2007, we had a very aggressive strategic plan that covered business, process, and technology. Given the targets we would face in just three to four years, it was absolutely a necessity to move quickly into virtualization to enable all the other products.

Gardner: Of course, you have many internal systems. You have design, development, manufacturing, and supply chain, as you mentioned. So, there's great complexity, if not very large scale. What sort of applications didn’t make sense for virtualization?

Legacy applications

Bellini: The only applications that didn't make sense for virtualization are legacy applications, applications that I'm going to decommission. Looking at the application footprint, I don't think there is any application that is not going into virtualization.

Gardner: And now to this notion of public cloud versus private cloud. Are you doing both or one versus the other, and why the mix that you’ve chosen?

Bellini: Private cloud is already a reality in Ducati. Over our private cloud, we supply services to all our commercial subsidiaries. We supply services to our assembly plant in Thailand or to our racing team at racing venues. So private cloud is already a reality.

In terms of public cloud, honestly, I haven't seen any real benefit in the public cloud yet for Ducati. My expectation of the public cloud would be to have something with virtually unlimited scalability, both up and down.

My idea is something that can provide virtually unlimited power when required and can go down to zero immediately, when not required. This is something that hasn't happened yet. At least it’s not something that I've received as a proposal from a partner yet.

I wouldn’t say that there's a specific link between the private cloud and security, but we take always charge of the security as part of any design we bring to production.



Gardner: How about security? Are there benefits for the security and control of your intellectual property in the private cloud that are attractive for you?

Bellini: Security is something that is common to all applications. I wouldn't say that there's a specific link between the private cloud and security, but we always take charge of security as part of any design we bring to production, be it in the private cloud or just for internal use.

Gardner: And because Ducati is often on the cutting edge of design and technology when it comes to your high-performance motorcycles, specifically in the racing domain, you need to be innovative. So with new applications and new technologies, has virtualization in a private cloud allowed you to move more rapidly to be more agile as a business in the total sense?

Bellini: This was benefit number one. Flexibility and agility was benefit number one. What we've done in the past years is absolutely incredible as compared to what technology was before that. We've been able to deploy applications, solutions, services, and new architectures in an incredibly short time. The only requirement before that was careful order and infrastructure planning, but having done that, all the rest has been incredibly quick, compared to that previous period.

Gardner: It’s also my understanding that you’re producing more than 40,000 motorcycles per year and that being efficient is important for you. How has virtualization helped you be conservative when it comes to managing costs?

Limited investment

Bellini: Virtualization has enabled us to support the business in very complex projects and rollouts, in delivering solution infrastructures in a very short time with very limited initial investment, which is always one thing that we have to consider when we do something new. In a company like Ducati, being efficient, being very careful and sensitive about cash flows, is a very important priority.

The private cloud, and virtualization especially, have enabled us to support the business and to support the growth of the company.

Gardner: Let’s look a little bit to the future, Daniel. How about applying some of these same values and benefits to how you deliver applications to the client itself, perhaps desktop virtualization, perhaps mobile clients in place of PCs or full fat clients. Any thoughts about where the cloud enables you to be innovative in how you can produce better client environments for your users?

Bellini: Client desktop virtualization and the new mobile devices are a few things on our agenda. Actually, we have already been using desktop virtualization for a few years, but now we're looking into providing services to users who are away and in high demand.

The second thing is mobile devices. We're seeing a lot of development and new ideas there. It's something that we're following carefully and closely, and something that I expect will turn into something real probably in the next 12-18 months at Ducati.

Gardner: Any thoughts or words of wisdom for those who are undertaking virtualization now? If you could do this over again, is there anything that you might do differently and that you could share with others as they approach this?

Bellini: My suggestion would be just embrace it, test it, design it wisely, and believe in virtualization. Looking back, there is nothing that I would change with respect to what we've done in the last few years. My last advice would be to not be scared by the initial investment, which is something that is going to be repaid in an incredibly short time.
Listen to the podcast. Find it on iTunes/iPod. Read a full transcript or download a copy. Sponsor: VMware.
