Friday, March 23, 2012

Join me for a deep-dive live discussion on why IT's role may never be the same

Now is a fascinating time for businesses the world over, and the role and impact of IT are a big part of what's causing the changes that all of us are feeling and adjusting to.

Many of the changes involve the sheer speed of change and rapid adaptation to dynamic markets. The pace of business has never been faster, and it continues to accelerate. Those that can't keep up are in a perilous position.

Register now for this free online discussion.

It's therefore up to IT to help companies move at the pace of the market, or IT risks being passed over by cloud-based outside providers.

The stakes have never been higher for keeping applications and businesses up and running.


One way IT can improve its responsiveness to business requirements is by delivering applications to PCs faster, better, and cheaper.

Join me and Embarcadero Technologies next Tuesday, March 27, for a deep-dive, live discussion (registration required) on how the major business and IT trends of the day -- cloud computing, mobile, social networks, and Big Data -- are changing the role of IT, and ways that IT can fight back. [Disclosure: Embarcadero Technologies is a sponsor of BriefingsDirect podcasts.]

Register now for this free online discussion.


Wednesday, March 21, 2012

Study: Cloud computing becoming pervasive, and IT needs to take control now

Cloud computing may be taking the business world by storm, but its success could mean a "perfect storm" that endangers the role of IT.

As a result, IT needs to step up now and change its approach to cloud services. This includes building trust with the lines of business, beginning to manage public cloud services, and pursuing increased automation for service provisioning and operations.

These are the key findings of a survey commissioned by BMC Software and conducted by Forrester Research. The study, "Delivering on High Cloud Expectations," shows that business units' demand for speed and agility is leading them to circumvent IT and acquire cloud services, more than half of them from what were termed "unmanaged" clouds.

Brian Singer, Lead Solutions Marketing Manager for BMC, said his company commissioned the survey in an effort to confirm what the company was hearing anecdotally from customers. "Cloud and software as a service (SaaS) are in enterprises in a big way," Singer said, "and we wanted to see how IT was dealing with them."

Cloud and SaaS are in enterprises in a big way and we wanted to see how IT was dealing with them.



For the study, researchers polled 327 enterprise infrastructure executives and architects in the United States, Europe, and Asia-Pacific. Among the key findings:
  • Today, 58 percent run mission-critical workloads in unmanaged public clouds, regardless of policy. The researchers use "unmanaged" to describe clouds that are managed by the cloud operators, but not by the company buying the service.
  • In the next two years, 79 percent plan to run mission-critical workloads on unmanaged cloud services.
  • Nearly three out of four respondents, 71 percent, thought that IT should be responsible for public cloud services.
  • Seventy-two percent of CIOs believe that the business sees cloud computing as a way to circumvent IT.
Wake-up call

"This is a wake-up call," Singer said. "They know that this is going on and they understand that cloud is a way to go around monolithic IT." According to the survey, 81 percent of respondents said that a comprehensive cloud strategy is a high priority for the next year.

While cost is a major driver in the C-suite, line-of-business respondents put cost far down on their list of priorities. Instead, they are seeking higher availability, faster delivery of services, more agility, and more options and flexibility.

The researchers suggested a three-pronged approach for IT to get a handle on this:
  • Build trust with the users and create a better user experience -- have an honest conversation about needs of the business, incorporate business requirements into a cloud strategy, and demonstrate progress toward them.

    They know that this is going on and they understand that cloud is a way to go around monolithic IT.


  • Shift from unmanaged to managed public cloud services. Many cloud vendors allow IT operations to monitor and manage services. This will help mitigate the risk and complexity that unmanaged clouds now introduce.
  • Develop ways to provision and operate internal services so that users get experiences similar to those they get from outside providers. This requires more automation to rapidly deploy solutions; a minimal sketch of what such self-service automation might look like follows this list.
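
For illustration only, here is a minimal sketch of the kind of self-service provisioning automation the researchers point toward, written against the public AWS EC2 API via boto3. The function name, AMI ID, instance type, and tags are invented placeholders, not anything prescribed by BMC or Forrester; the point is simply that a request can be fulfilled in minutes while IT retains tagging and visibility.

```python
# Minimal sketch of self-service provisioning automation (illustrative only).
# Assumes AWS credentials are already configured; the AMI ID, instance type,
# and tag values are placeholders, not anything from the Forrester study.
import boto3

def provision_dev_server(requester: str, ami_id: str = "ami-0123456789abcdef0") -> str:
    """Launch a tagged instance on request, so IT keeps visibility and control."""
    ec2 = boto3.client("ec2")
    response = ec2.run_instances(
        ImageId=ami_id,
        InstanceType="t3.medium",
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [
                {"Key": "RequestedBy", "Value": requester},
                {"Key": "ManagedBy", "Value": "central-IT"},
            ],
        }],
    )
    return response["Instances"][0]["InstanceId"]

if __name__ == "__main__":
    print(provision_dev_server("line-of-business-user"))
```

In practice, the same pattern could sit behind a service catalog or request workflow, so the line of business gets cloud-like speed without going around IT.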
The full study results will be announced April 26 at 11 a.m. CST as part of a BMC webinar (registration required).


Tuesday, March 20, 2012

New HP application transformation offerings help enterprises tackle growing use of mobile computing and social media

HP this week announced a suite of products and services aimed at overcoming the challenges presented by the convergence of some of today's hottest business trends: mobile computing and social media.

The four software products and three services are designed to help enterprises leverage traditional systems of record, while creating an improved and extended presence and engaging better with customers, partners, and even employees.

IT-driven systems of record, while they serve up useful information, are usually commodities and in themselves don't create business differentiation. The new offerings, however, provide systems of engagement, which change the way enterprises interact with those people now using tablets and smartphones in increasing numbers, as well as through social media. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

"Unless we worry about a system of engagement, then we're in trouble," said Paul Evans, HPs Worldwide Lead for Application Transformation. "It's a question of how to create a competitive advantage. We still require traditional apps on the back end, and will still look for the opportunity to reduce cost, but our focus is going to change to systems of engagement. We've had a lot of clients say, 'How do I do that?' It's not a traditional skill."

Evans also pointed out that time-frames have collapsed as well. "Apps have to be built in a time-frame that clients aren't used to. It's got to be a quality app, and it has to be well-tested. But it also has to be done in six days."

New offerings


In an effort to meet this mounting challenge, HP is now offering:
  • HP Application Lifecycle Intelligence (ALI), which improves collaboration among delivery teams and reduces cycle times. While traditional systems of record are typically built and updated on six-month cycles, ALI can automate the process and reduce that time to days.
  • HP Unified Functional Testing, offered with Perfecto Mobile, allows developers to emulate and test the user experience of mobile applications across devices as well as different networks. This is to ensure that all users have the same experience no matter what mobile carrier they're using.

    We still require traditional apps on the back end and will still look for the opportunity to reduce cost, but our focus is going to change to systems of engagement.


  • HP Anywhere, which enables clients to manage IT on the go. Mobile-based applications perform operations such as portfolio request management, defect tracking, service health monitoring, and the composition of an executive scorecard, where IT managers can view all of these attributes in real time.
  • HP Enterprise Collaboration enhances knowledge sharing and accelerates application development through a social collaboration environment that enables real-time, context-based conversations traced back to actions and work items. This allows developers to communicate in new ways that take previously unstructured conversations and give them structure.
In addition to the software, HP announced three accompanying services.

Thursday, March 15, 2012

HP engineers support to better target multi-vendor, cloud environments

In a move to help enterprises address problems before they arise, HP this week rolled out IT support services architected for modern IT infrastructures.

Dubbed HP Always On Support, the new services integrate technology built into the HP Converged Infrastructure with support services to help enterprises achieve 95 percent first-time resolution rates and speed recovery from system interruptions. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

HP is taking a different approach from the break/fix model that legacy support services offer for individual pieces of equipment in technology silos. With Always On Support Services, HP is acknowledging the new data center paradigm: virtualized, multivendor, and cloud-based, with interdependencies across the IT infrastructure.

With HP Always On Support Services, the tech giant said it is taking a proactive stance, which includes providing a single point of contact for problem resolution -- problems that could otherwise lead to downtime the firm estimates costs the average enterprise $10 million an hour.

Move to innovation

“Organizations can’t survive in today’s business climate without shifting time and resources from problem resolution to innovation,” says Antonio Neri, senior vice president and general manager of HP’s Technology Services group. “The traditional reactive IT support model is no longer effective -- the industry needs to change the way it delivers support to offer proactive solutions and customized service offerings -- and HP is leading that charge.”

Here’s an example of HP Always On Support Services in action: The service continuously monitors the 1,600 diagnostic data points HP ProActive Insight architecture collects. (This architecture is embedded in HP ProLiant Gen8 servers and is soon to be integrated across the HP Converged Infrastructure.)
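
To make the idea of continuously watching diagnostic telemetry a little more concrete, here is a deliberately simplified, hypothetical sketch of threshold-based proactive monitoring. It is not HP's ProActive Insight architecture or any HP API; the metric names and thresholds are invented for illustration.

```python
# Hypothetical sketch of threshold-based proactive monitoring -- not HP's
# ProActive Insight architecture. Metric names and thresholds are invented.
from typing import Dict, List

THRESHOLDS = {
    "fan_speed_rpm": (1_000, 12_000),        # (min, max) acceptable range
    "inlet_temp_c": (10, 35),
    "ecc_corrected_errors_per_hr": (0, 50),
}

def check_server(metrics: Dict[str, float]) -> List[str]:
    """Return alerts for any metric that is missing or outside its range."""
    alerts = []
    for name, (low, high) in THRESHOLDS.items():
        value = metrics.get(name)
        if value is None:
            alerts.append(f"{name}: no reading (possible sensor fault)")
        elif not low <= value <= high:
            alerts.append(f"{name}={value} outside [{low}, {high}]")
    return alerts

# Example: a low fan-speed reading that should raise an alert before failure.
print(check_server({"fan_speed_rpm": 800, "inlet_temp_c": 28,
                    "ecc_corrected_errors_per_hr": 3}))
```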

Through its HP Foundation Care feature, HP Always On offers direct communication with an expert who already knows the client, the details of the client’s environment, and what the client’s system is experiencing. By leveraging relationships with leading independent software vendors (ISVs), HP promises to expedite problem resolution and eliminate the finger pointing typical of legacy support models.

Organizations can’t survive in today’s business climate without shifting time and resources from problem resolution to innovation.



Meanwhile, HP Proactive Care works to minimize downtime and optimize performance by addressing problems before they occur. HP Datacenter Care offers customized support for a client's multivendor environment with a single point of contact at HP. Finally, HP Lifecycle Event Services augment the HP Care portfolio with HP expertise throughout the technology life cycle of clients' IT projects, including strategy, design, implementation, and education services, allowing clients to select services a la carte.

BriefingsDirect contributor Jennifer LeClaire provided editorial assistance and research on this post. She can be reached at http://www.linkedin.com/in/jleclaire and http://www.jenniferleclaire.com.


Tuesday, March 13, 2012

Join HP security expert Tari Schreider for a deep-dive live chat on cloud protection

Business leaders want to exploit cloud computing values fast, but they also fear the security risks in moving to cloud models too quickly.

Just at the time that companies want to leverage the cloud, they know that security threats are growing. Indeed, according to recent HP-sponsored research, the volume and complexity of security threats have continued to escalate. Analyst firms such as Forrester place security and privacy as the top reasons for not adopting cloud.

Yet by better understanding cloud security risks, gaining detailed understanding of your own infrastructure and following proven reference architectures and methods, security can move from an inhibitor of cloud adoption to an enabler.

Indeed, CIOs must find ways to make the use of extended services secure for their operations, data, processes, intellectual property, employees and customers – even as security threats ramp up.

These cloud services, from both inside and outside the enterprise, are rapidly compelling companies to rethink how they use and exploit technology securely. Cloud computing trends are now driving the need for a better approach to security.

Live discussion

On March 22, in a free, online, live multimedia "Expert Chat," I'll be interviewing Tari Schreider, Chief Security Architect, HP Technology Consulting. We'll also be taking live questions from the online audience. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

The stakes have never been higher for keeping applications and businesses up and running.

Register now as seats are limited for this free HP Expert Chat.

In this free discussion (registration required), hear recommendations from Schreider on how to prevent security concerns from holding up cloud adoption. We'll further explore what it takes to mitigate risks and prepare your organization for secure hybrid cloud computing.

Moreover, the entire presentation, as well as the questions and answers, will be automatically translated into 13 major languages, and viewers can easily select the language of their choice. They will also be able to download the presentation and listen to and watch Schreider's chat on-demand after the live event.

Register now as seats are limited for this free HP Expert Chat.


Tuesday, March 6, 2012

HP announces first servers in the Gen8 family for improving management, ROI, and energy conservation in the data center

HP today announced general availability of the first batch of servers in its ProLiant Gen8 series, which the company unveiled last month.

The new generation of servers, part of a two-year, $300-million effort, benefits from the HP ProActive Insight architecture, including lifecycle automation, dynamic workload acceleration, automated energy optimization, and proactive service and support. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

Manual operations and facilities management can cost companies more than $24 million over three years, while unplanned downtime is estimated to cost companies as much as $10 million an hour. As a result, data center managers must rely on more intelligent, self-sufficient technologies that eliminate the time-consuming, error-prone processes that consume valuable resources and can cause both failure and data loss.

ProLiant now includes the Intel Xeon E5-2600 processor family, which beta testing has indicated can provide a return on investment (ROI) in as little as five months, while tripling administrators' productivity.

HP developed the new generation of servers in collaboration with HP Labs, the company’s central research arm, to offer data center-wide, end-to-end energy management, including HP 3D Sea of Sensors and HP Location Discovery Services. As a result, the new energy-optimized technology included in the servers delivers nearly double the data center capacity per watt.

To accelerate virtualized and data-intensive application performance, engineers balanced the system architecture while unifying storage, I/O, and compute resources to create a converged system platform designed for the needs of virtualized environments, cloud deployments, and the most demanding data center workloads.

Lots of excitement

"We've seen lots of excitement among customers during the beta testing and not only in performance," said John Gromala, Director, Product Marketing, Industry Standard Servers and Software for HP. "We focused on their IT needs and we're helping them deal with runaway costs. That's the piece that people are more excited about than anything. If they can get those costs in line, then IT can make it."

The comprehensive portfolio of HP ProLiant Gen8 servers includes:
  • ProLiant BL460c, the world’s leading server blade, is ideal for a data center’s transition to the cloud

    That's the piece that people are more excited about than anything. If they can get those costs in line, then IT can make it.


  • ProLiant DL360p, versatile, rack-optimized server that offers high performance, efficiency and reliability in a 1U space
  • ProLiant DL380p, supports the broadest spectrum of workloads and provides for future scalability as needs change
  • ProLiant ML350p, expandable tower server well suited for remote and branch offices
  • ProLiant SL230s, a multi-node server for maximum performance of high density scale-out and high performance computing (HPC) environments
  • ProLiant SL250s, ideal for heterogeneous and custom data centers focused on graphics processing unit (GPU) computing
  • ProLiant DL160, ultra-dense rack server designed for web serving and memory-intensive applications.
Tested in more than 100 data centers by real-world customers, the new platforms are expected to ship worldwide in late March. Starting prices will range from $1,723 to $2,878 and vary based on configurations.


Open Group's OTTF 'snapshot' addresses risks from counterfeit and tainted products

The Open Group has announced the publication of the Open Trusted Technology Provider Standard (O-TTPS) Snapshot, a preview of what is intended to become the first standard developed by The Open Group Trusted Technology Forum (OTTF).

Geared toward global providers and acquirers of commercial off-the-shelf (COTS) information and communication technology (ICT) products, the O-TTPS is designed to provide an open standard for organizational best practices to enhance the security of the global supply chain and help assure the integrity of COTS ICT products worldwide. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

Standards such as O-TTPS will have a significant impact on how organizations procure COTS ICT products over the next few years.



The snapshot provides an early look at the standard so providers, suppliers and integrators can begin planning how to implement the standard in their organizations, and so customers, including government acquirers, can differentiate those providers who adopt the standard's practices.

Version 1.0 of the standard is expected to be published in late 2012. The Open Group is planning an accreditation program to help provide assurance that providers conform to the standard.

Increasing threats

"With the increasing threats posed by cyberattacks worldwide, technology buyers at large enterprises and government agencies across the globe need assurance the products they source come from trusted technology suppliers and providers who have met set criteria for securing their supply chains," said David Lounsbury, chief technology officer, The Open Group. "Standards such as O-TTPS will have a significant impact on how organizations procure COTS ICT products over the next few years and how business is done across the global supply chain."

The Trusted Technology Forum was formed in late 2010 under the auspices of The Open Group to help technology companies, customers, government and supplier organizations create and promote guidelines for manufacturing, sourcing and integrating trusted, secure technology products as they move through the global supply chain.

The two risks being addressed in the snapshot are tainted and counterfeit products. Each poses a significant risk to organizations because altered or non-genuine products introduce the possibility of untracked malicious behavior or poor performance. Both product risks can damage customers and suppliers, resulting in failed or inferior products, loss of revenue and brand equity, and disclosure of intellectual property.

Additional resources are available online:
  • For more information on the O-TTPS Snapshot, or to download it from The Open Group Bookstore, click here.
  • For more information on The Open Group Trusted Technology Forum, click here.
  • To view a video featuring OTTF Co-Chair and Cisco's chief security strategist for the Global Value Chain Edna Conway discussing the work of the OTTF, click here.
  • To attend a webinar on the O-TTPS Snapshot entitled "Developing Standards that Secure the Global Supply Chain, Enabling Suppliers Globally to Raise the Bar on Security and Integrity," on March 15, 2012, register here.

Thursday, February 23, 2012

Informatica's stretch goal

This guest post comes courtesy of Tony Baer's OnStrategies blog. Tony is senior analyst at Ovum.

Informatica is within a year or two of becoming a $1 billion company, and CEO Sohaib Abbasi's stretch goal is to get to $3 billion.

Informatica has been on a decent tear: a string of roughly 30 consecutive growth quarters, growth over the last six years averaging 20%, and 2011 revenues nearing $800 million. Abbasi took charge back in 2004, lifting Informatica out of its midlife crisis by ditching an abortive foray into analytic applications and instead expanding from the company’s data transformation roots into broader data integration.

Getting the company to its current level came largely through a series of acquisitions that then expanded the category of data integration itself. While master data management (MDM) has been the headliner, other recent acquisitions have targeted information lifecycle management (ILM), complex event processing (CEP), and low-latency messaging (Ultra Messaging), along with filling gaps in its B2B and data quality offerings. While some of those pieces were obvious additions, others, such as Ultra Messaging or event processing, were not.

Abbasi is talking about a stretch goal of $3 billion in revenue. The obvious chunk is to deepen the company’s share of existing customer wallets. We’re not at liberty to say how much, but Informatica had a significant number of six-figure deals. Getting more $1m+ deals will help, but on their own they won’t triple revenue.

So how to get to $3 billion?


Obviously, two strategies: deepen the existing business while taking the original formula to expand the footprint of what counts as data integration.

First, the existing business. Of the current portfolio, MDM is likely best primed to allow Informatica to more deeply penetrate the installed base. Most of its data integration clients haven’t yet done MDM, and it is not a trivial investment. And for MDM clients who may have started with a customer or product domain, there are always more domains to tackle. During Q&A, Abbasi listed MDM as having as much potential addressable market as the traditional ETL and data quality segments.

The addition of SAP and Oracle veteran Dennis Moore to the Informatica MDM team points to the classic tightrope for any middleware vendor that claims it’s not in the applications game – build more “solutions” or jumpstart templates to confront the same generic barrier that packaged applications software was designed to surmount: provide customers an alternative to raw toolsets or custom programming. For MDM, think industry-specific “solutions” like counter-party risk, or horizontal patterns like social media profiles. If you’re Informatica, don’t think analytic applications.

That’s part of a perennial debate (or rant) on whether middleware is the new enterprise application: you implement for a specific business purpose as opposed to a technology project, such as application or data integration, and you implement with a product that offers patterns of varying granularity as a starting point.

This prompts the question: just because you can do it, should you?



Informatica MDM product marketing director Ravi Shankar argues it’s not an application because applications have specific data models and logic that become their own de facto silos, whereas MDM solutions reuse the same core metadata engine for different domains (e.g., customer, product, operational process). Our contention? If it solves a business problem and it’s more than a raw programming toolkit, it’s a de facto application. If anybody else cares about this debate, raise your hand.
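
To illustrate Shankar's point about reusing one engine across domains, here is a highly simplified sketch of a domain-neutral match-and-merge step, where only the match rule changes between a customer domain and a product domain. This is not Informatica's MDM implementation; the function and field names are invented.

```python
# Hypothetical sketch of a domain-neutral MDM merge step: one engine,
# different match keys per domain. Not Informatica's implementation.
from collections import defaultdict
from typing import Callable, Dict, List

Record = Dict[str, str]

def merge_golden_records(records: List[Record],
                         match_key: Callable[[Record], str]) -> List[Record]:
    """Group records that share a match key and merge each group into one survivor."""
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec)
    golden = []
    for _, recs in groups.items():
        merged: Record = {}
        for rec in recs:                      # later sources only fill in gaps
            for field, value in rec.items():
                merged.setdefault(field, value)
        golden.append(merged)
    return golden

# The same engine serves a customer domain...
customers = merge_golden_records(
    [{"email": "a@x.com", "name": "A. Smith"},
     {"email": "a@x.com", "phone": "555-0100"}],
    match_key=lambda r: r["email"].lower())

# ...and a product domain, with only the match rule changed.
products = merge_golden_records(
    [{"sku": "P-1", "desc": "Widget"},
     {"sku": "P-1", "weight_kg": "0.4"}],
    match_key=lambda r: r["sku"])
print(customers, products)
```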

MDM is typically a very dry subject, but a demo of a social MDM straw man, showing a commerce application integrated into Facebook, sparked Twitter debate among the analysts in the room. The operable notion is that such a use of MDM could update the customer’s (some might say, victim’s) profile based on the associations they make in social networks. An existing Informatica higher education client that shall remain anonymous actually used MDM to mine LinkedIn to prove that its grads got jobs.

This prompts the question: just because you can do it, should you? When a merchant knows just a bit too much about you – and your friends (who may not have necessarily opted in) – that more than borders on creepy. Informatica’s Facebook MDM integration was quite effective; as a pattern for social business, well, we’ll see.

Staking new ground

So what about staking new ground? When questioned, Abbasi stated that Informatica had barely scratched the surface with productizing around several megatrend areas that it sees impacting its market: cloud, social media, mobile, and big data. More specifically:

  • Cloud continues to be a growing chunk of the business. Informatica doesn’t have all of its tooling up in the cloud, but it’s getting there. Consumption of services from the Informatica Cloud continues to grow at a 100 – 150% annual run rate. Most of the 1,500 cloud customers are new to Informatica. Among recent introductions are a wizard-driven Contact Validation service that verifies and corrects postal addresses from over 240 countries and territories. A new rapid connectivity framework further eases the ability of third parties to OEM Informatica Cloud services.

  • Social media – there were no individual product announcements here per se, just that Informatica’s tools must increasingly parse data coming from social feeds. That covers MDM, data profiling, and data quality. Much of it leverages HParser, the new Hadoop data parsing tool released late last year.


  • Mobile – for now this is mostly a matter of making Informatica tools and apps (we’ll use the term) consumable on small devices. On the back end, there are opportunities for optimizing, virtualizing, and replicating data on demand to the edges of highly distributed networks. Aside from newly announced features such as iPhone and Android support for monitoring the Informatica Cloud, for now Informatica is making a statement of product direction.

    Informatica, like other major BI and database vendors, has discovered big data with a vengeance over the past year.



  • Big Data – Informatica, like other major BI and database vendors, has discovered big data with a vengeance over the past year. The ability to extract data from Hadoop is nothing special – other vendors have that – but Informatica took a step ahead with the release of HParser last fall. In general, there’s a growing opportunity for tooling in a variety of areas touching Hadoop, with Informatica’s data integration focus being one of them.

    We expect to see Informatica’s core tools extended not only to parse or extract data from Hadoop but, increasingly, to work natively inside HDFS, on the assumption that customers are no longer simply using it as a staging platform. We also see opportunities in refinements to HParser that provide templates or other shortcuts for deciphering sensor data. ILM, for instance, is another obvious one.

    While Facebook et al. might not archive or deprecate their Hadoop data, mere mortal enterprises will have to bite the bullet. Data quality in Hadoop in many cases may not demand the same degree of vigilance as SQL data warehouses, creating demand for lighter-weight data profiling and cleansing tooling. And for other real-time, web-centric use cases, alternative stores like MongoDB, Couchbase, and Cassandra may become new Informatica data platform targets. (A generic sketch of this kind of lightweight, Hadoop-side parsing and profiling follows this list.)
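
As a rough illustration of the lightweight, Hadoop-side parsing and profiling discussed above, here is a generic Hadoop Streaming mapper sketch. It is not HParser or any Informatica product; the assumption of JSON-per-line feed records and the field names are invented for the example.

```python
#!/usr/bin/env python
# Generic Hadoop Streaming mapper: lightweight parsing and profiling of
# JSON-per-line feed records. Illustrative only -- not HParser or any
# Informatica tool; the field names are invented.
import json
import sys

REQUIRED = ("user_id", "timestamp", "text")

def main():
    for line in sys.stdin:
        try:
            record = json.loads(line)
        except ValueError:
            print("malformed\t1")            # profile parse failures
            continue
        missing = [f for f in REQUIRED if f not in record]
        if missing:
            print("missing:%s\t1" % ",".join(missing))
        else:
            print("ok\t1")

if __name__ == "__main__":
    main()
```

A companion reducer would simply sum the counts per key; the mapper can be smoke-tested locally with `cat feed.json | python mapper.py` before being submitted to a cluster via Hadoop Streaming.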
What, no exit talk?


Abbasi commented at the end of the company’s annual IT analyst meeting that this was the first time in recent memory that none of the analysts asked who would buy Informatica, and when. Buttonholing him after the session, we got his take, which, very loosely translated into Survivor terms, is that Informatica has avoided getting voted off the island.

At this point, Informatica’s main rivals – Oracle and IBM – have bulked up their data integration offerings to the point where an Informatica acquisition would no longer be gap filling; it would simply be a strategy of taking out a competitor – and with Informatica’s growth, an expensive one at that.

One could then point to dark horses like EMC, Tibco, Teradata, or SAP (for obvious reasons we’ve omitted HP). A case might be made for EMC, or for SAP if it remains serious about raising its profile as a database player – but we believe both have bigger fish to fry. Never say never. But otherwise, the common thread is that data integration will not differentiate these players and therefore is not strategic to their growth plans.

This guest post comes courtesy of Tony Baer's OnStrategies blog. Tony is senior analyst at Ovum.
