Friday, August 7, 2009

Information management targets e-discovery, compliance, legal risks while delivering long-term BI and governance benefits

Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Download or view the transcript. Learn more. Sponsor: Hewlett Packard.

Losing control over information sprawl at enterprises can cause long-term inefficiencies. But it's the short-term legal headaches of being unprepared for e-discovery requests that have caught many firms off-guard.

Governing and managing information in advance can head off legal discovery fulfillment problems -- and the potentially massive costs that come with them. In a sponsored podcast, I recently examined how a well-managed -- versus haphazard -- approach to information oversight reduces legal risks. These same lifecycle management approaches also bring long-term payoffs through better analytics and regulatory compliance, while reducing the cost of data storage and archiving.

Better understand the perils and promise around information management with guests Jonathan Martin, Vice President and General Manager for Information Management at HP, and Gaynelle Jones, Discovery Counsel at HP. The discussion is moderated by me, BriefingsDirect's Dana Gardner.

Here are some excerpts:
Martin: Over the last five to 10 years, we've become increasingly addicted to information, both at home and at work. ... and the sheer volume of it is beginning to really impact businesses. Information tends to double about every year in developing countries, and about every 18 months in developed ones. Today, we're creating more information than we have ever created before, and we tend to be using limited tools to manage that information.

We're getting less business value from the information that we create. ... Unfortunately, in the last 18 months or so, the economy has begun to slow down, so that concept of just throwing more and more capacity at the problem is causing challenges for organizations. Today, organizations aren't looking at expanding the amount of information that's stored. They're actually looking at new ways to reduce the amount of information.

Coming into 2010, both in the US and in Europe, there is going to be a new wave of regulation that organizations are going to have to take on board about how they manage their business information.

Jones: Because we have black-letter law that computerized data is discoverable if relevant, and because of the enormous amount of electronic information that we are dealing with, litigants have to be concerned with discovery, in identifying and producing it, and making sure it's admissible.

I'm charged here [at HP] with developing and working with both the IT and the litigation teams around making sure that we are compliant, and that we respond quickly to identify our electronically stored information, and that we get it in a form that can be produced in the litigation.

There are horror stories in the news in recent years of major companies such as Morgan Stanley, Enron, Qualcomm, and a host of others being sanctioned for not properly complying with the discovery rules. ... In each case, the company's failure to follow the litigation rules pointed directly to its failure to properly manage its electronic information. So the sanctions and the penalties can be enormous if we don't get a hold of this and comply.

We've seen, over the last few years, organizations move from taking a very reactive approach on these kinds of issues to a more proactive or more of a robust approach.

Martin: You have to be able to identify and manage the information and think ahead about where you're likely to have to pull it in and produce it, and make a plan for addressing these issues before you have to actually respond. When you're managing a lot of litigation, you have to respond in a quick timeframe, by law. You don't have time to then sit down and draw up your plan.

[Not being prepared] makes the process at least twice as expensive as it would be if you had planned ahead, strategized, and known where your information was, so that when the time comes, you can find it, preserve it, and produce it.

Over the last two to three years, organizations have begun to take a more proactive approach. They're gathering the content that's most likely to be used in an audit, or that's most likely to be used in a legal matter, and consolidating that into one location. They're indexing it in the same way and setting a retention schedule for it, so that when they're required to respond to litigation or are required to respond to an audit or a governance request, all the information is in one place. They can search it very quickly.
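The consolidate-index-retain cycle Martin describes can be sketched in a few lines. This is a hypothetical illustration, not HP's product; the categories, retention periods, and field names are all invented for the example:

```python
import hashlib
from datetime import date, timedelta

# Illustrative retention schedule: category -> days to retain (invented values)
RETENTION_DAYS = {"email": 365 * 3, "sales_order": 365 * 7, "memo": 365}

class ComplianceArchive:
    """One consolidated store: every item is indexed the same way and
    stamped with a disposal date on ingest, per its retention schedule."""

    def __init__(self):
        self.items = {}

    def ingest(self, text, category, ingested=None):
        ingested = ingested or date.today()
        # Content hash doubles as a stable document ID
        doc_id = hashlib.sha256(text.encode("utf-8")).hexdigest()[:12]
        self.items[doc_id] = {
            "text": text,
            "category": category,
            "dispose_after": ingested + timedelta(days=RETENTION_DAYS[category]),
        }
        return doc_id

    def search(self, term):
        # One indexed location: a single query serves legal, audit,
        # or governance requests alike
        return [i for i, d in self.items.items() if term.lower() in d["text"].lower()]

archive = ComplianceArchive()
archive.ingest("Q3 sales order for Acme Corp", "sales_order")
archive.ingest("Re: Acme contract terms", "email")
print(archive.search("acme"))  # both items match
```

The point of the sketch is the single choke point: because everything flows through `ingest`, classification and retention are applied uniformly, which is what makes the later search fast and defensible.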

At first, the problem statement may look absolutely enormous. ... What we're seeing, though, is that organizations that went through this shift from reactive to proactive two to three years ago have actually generated a new asset within the organization. ... They ultimately end up with a brand-new repository in the organization that can help them make better business decisions, leveraging the majority of the content that the organization creates.

If you logically think through the process, as an organization, you are taking a more proactive stance. You're capturing all of those emails, you're capturing content from file systems and your [Microsoft] SharePoint systems. You're pulling sales orders. You're getting purchase requests from your database environment. You're consolidating maybe miles and miles of paper into digital form and bringing all of this content into one compliance archive.

This information is in one place. If you're able to add better classification of the content -- a layer of meaning on top of it -- suddenly you have a tool in place that allows you to analyze the information in the organization, model it, and use it to make better business decisions.

The final step, once you've got all that content in one place, is to add a layer of analytic or modeling capability to allow you to manipulate that content and respond quickly to a subpoena or an audit request.

Jones: We're working right now on putting an evidence repository in place, so that we can collect information that's been identified, bring it over, and then search on it. So, you can do early electronic searches, do some of the de-duping that Jonathan has talked about, and get some early case assessment.

Our counsel can find out quickly what kind of emails we've got and get a sense of what the case is worth long before we have to collect it and turn it over to our outside vendors for processing. That's where we're moving at this point.
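The de-duping step Jones mentions is, at its simplest, a content-hash pass over the collected documents. A minimal sketch follows; real e-discovery tools also do near-duplicate and email-thread detection, which this does not attempt:

```python
import hashlib

def dedupe(documents):
    """Drop exact duplicates by content hash -- a common first pass
    before early case assessment, since the same email is often
    collected from several custodians' mailboxes."""
    seen = set()
    unique = []
    for doc in documents:
        digest = hashlib.sha256(doc.encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

collected = [
    "Re: contract terms -- see attached",
    "Re: contract terms -- see attached",  # same message, second custodian
    "Fwd: revised schedule",
]
print(len(dedupe(collected)))  # 2
```

Even this crude pass can shrink the volume handed to outside processing vendors considerably, which is where Jones locates the cost savings.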

We think it's going to have tremendous benefit for us in terms of getting on top of our litigation early on, reducing the cost of the data that we end up sending outside for processing, and, of course, saving cost across the board, because we can do so much of it through our own management systems once they're in place. We're really eager and excited to see how this is going to help us in our overall litigation strategy and in our costs.

Martin: Increasingly, we're seeing more and more content move into the cloud. This may come from a top-down initiative, or from a cost or capability perspective. Organizations are saying, "Maybe it's no longer cost-effective for us to run an email environment internally. What we'd like to do is put that into the cloud -- to manage email in the cloud, or have our email managed in the cloud."

Or, it may come from the grassroots, bottom up, where employees, when they come to work, are beginning to act more and more like consumers. They bring consumer-type technology with them -- things like Facebook or other social networking sites. They come to the organization, set up a project team, set up a Facebook community, and collaborate using that.

So we're seeing either top-down or grassroots-up content moving into the cloud. From a regulatory perspective, a governance perspective, or a legal perspective, this has new implications for the organizations. A lot of organizations are struggling a little bit with how to respond to that.

... How do you discover this content? How are you required to capture it -- and are the legal obligations the same as for content that sits inside your own data center? How do you address applications, such as mashups, where content may be spread across 20 to 30 different data centers? It's a whole new vista of issues that are beginning to appear as content moves into the cloud.

Jones: The courts haven't yet addressed the cloud era, but it's going to definitely be one for which we're going to have to have a plan in place. The sooner you start being aware of it, asking the questions, and developing a strategy, the better. Once again, you're not being reactive and, hopefully, you're saving money in the process.

Martin: Probably one of the best ways to learn is from the experience of others. We've invested quite heavily over the last year in building a community for the uses of our products, as well as the potential use of our products, to share best practices and ideas around this concept of information and governance that we've been talking about today, as well as just broader information management issues.

There is a website, www.hp.com/go/imhub. If you go there, you'll see lots of information from users about how they're using the technology.

Thursday, August 6, 2009

Letterman's job remains safe as HP goes to top 10 list hijinks on server virtualization management

Is server virtualization sprawl a laughing matter? Do the pains of IT platform architects and administrators matter so little that the world's largest technology company by revenue can poke fun at their daily challenges?

Apparently so. Taking a page from late-night comedians -- and banking on the viral repurposing effects of blogs like the Huffington Post -- HP's virtualization marketers have swapped speeds-and-feeds brochures for self-deprecating cheap shots at corporate polyester ties.

It's all in the name of educating the IT community on virtualization best practices, and for the most part it works. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.] Not as well as my podcasts, mind you, but it works.

Posted on YouTube, HP's "HPEN Top Ten" clip spoofs the satirists. Usually, Top Ten lists apply to politics or entertainment -- but, honestly, most of the IT departments I've visited have plenty of both. So it's actually quite appropriate after all.



The next thing you know chubby, white, middle-aged bald guys will be stereotyped as IT industry analysts.

Starring Shay Mowlem, Strategic Marketing Lead, HP Software & Solutions, (as the sidekick musician), the video also features Mark Leake, Director, Portfolio & Executive Content, HP Software Products.

And what makes our dynamic duo fun and interesting to watch? None other than the "Top Ten Reasons that Customers Need HP Virtualization Management Solutions" ...

10) They have no idea how many virtual machines (VMs) they have

9) They have more management tools than staff

8) It takes 2 minutes to provision a VM and 2 weeks to provision its storage

7) They're drowning in new platforms and technologies

6) Virtual machines are up; user satisfaction is down

5) They are experiencing backup traffic jams in their network

4) Their VMs have gone prime time

3) They can't tell which business service their VM is supporting

2) Their auditors are starting to ask questions

1) Because VMware, Citrix, Microsoft and all the other partners say so

The video is better than the read, I have to say, but only virtually so. Check it out. And let me know, honestly, wouldn't you prefer the speeds-and-feeds brochures again?

Monday, August 3, 2009

WebLayers sets sights on cloud governance with updated WebLayers Center 5.0

For all the buzz about cloud computing, there remains a key challenge for companies: regulatory compliance and governance issues.

Left unaddressed, these issues could derail the long-term growth of cloud adoption. That's why more companies are coming to market with ever-evolving solutions that aim to take the pain out of controlling the cloud.

Now, Cambridge, Mass.-based WebLayers just updated its flagship automated governance software, WebLayers Center 5.0, with new features that aim to mitigate the risk and cut the costs of developing in the cloud.

From extended support for Eclipse-based integrated development environments (IDEs) to a deeper policy library, and from "what if" scenarios to tighter integration with IBM and HP governance software, WebLayers Center 5.0 offers a solution worth exploring for policy enforcement across service-oriented architecture (SOA) and related IT architectures.

Consistent, Measurable, Auditable

John Favazza, vice president of engineering at WebLayers, said governance is transforming from an option to a necessity. It's a necessity because governance can proactively identify and address policy violations in the software development life cycle (SDLC) before they make a negative impact on business operations.

"This necessity is due to the realization that the cost of fixing software code after it’s been deployed can be 50 to 200 times higher than if the issues were addressed as the code was being written by the software developer," Favazza said. "WebLayers Center 5.0 mitigates these risks and reduces unnecessary development costs, resulting in greater cost efficiencies."

WebLayers' approach centralizes policy management and distributes policy enforcement to support automated governance at every point in the infrastructure and across all platforms throughout the SDLC. As WebLayers sees it, distributed governance is key to breeding a distributed-centric environment, and intelligent automation via rules-based logic is key to reducing errors while meeting business goals. I might call it federated governance, but the point is the same, and mastering it will be critical going forward.

WebLayers 5.0 in Action

WebLayers 5.0's intelligent automated governors work to pinpoint all of the artifacts throughout the infrastructure that are related to a low security score and let the software developer and the SOA architect know specifically where code violates development policies or business rules.

WebLayers Center then picks up where the automated governors leave off, guiding the developer on the path to address issues no matter where they occur or in what phase of the SDLC they appear. WebLayers Center includes an auto-correct feature that fixes the violations so developers don't have to review entire applications. It's easy to see how this capability would reduce overall software development errors, flatten learning curves, and even accelerate the return on SOA investments.

On the policy distribution front, WebLayers' new node director captures a snapshot of each governance point within the enterprise, so a manager can distribute recommended policies to any governed system – including each developer's desktop – for knowledge sharing and time savings across the SDLC.

While the debate on the best way to achieve cloud governance continues, the progress toward automated identification and correction and stronger distribution capabilities is a step in the right direction. With a growing list of competitors seeking to solve a real cloud computing challenge, we should see plenty of innovation in this space in the years ahead.

BriefingsDirect contributor Jennifer LeClaire provided editorial assistance and research on this post. She can be reached here and here.

Wednesday, July 29, 2009

Is your SOA hammer looking for a nail?

This guest post comes courtesy of ZapThink. Jason Bloomberg is managing partner at ZapThink. You can reach him here.

By Jason Bloomberg

It sounds so obvious when you get right down to it: you need to know what problem you're solving before you can solve it. Common sense tells you to start with the problem before you can find the solution to the problem. If you start with a solution without knowing what the problem is, then there's no telling if the solution you have will be the right one for the problems you're facing.

Obvious, yes, but it never ceases to amaze us at ZapThink that when it comes to service-oriented architecture (SOA) projects, time and again we run into technology teams who don't have a grasp as to what business problems they're looking to solve. Now, it might be tempting to dismiss this disconnect to "techies being techies" or some other superficial explanation, but the problem is so pervasive and so common that there must be more to it. As a result, we took a closer look at why so many SOA projects have unclear business drivers. What we found is that the underlying issue has little to do with SOA, and everything to do with the way large businesses are run.

The wrong question

This story begins with the SOA architect. Architects frequently call in ZapThink when they're stuck on some part of their SOA initiative; we're SOA fire fighters, so to speak. Frequently, the first question we get when we sit down with the architecture team is "how do I sell SOA to the business?" Well, if that's the first question they ask, they're already on the wrong foot. That's the wrong question! The correct question is "how do we best solve the business problems we're facing?"

SOA is not a panacea, after all; it only helps organizations solve certain problems typically having to do with business change in the face of IT heterogeneity. It's best to solve other problems with other approaches as appropriate, a classic example of the right tool for the job.

For this reason, ZapThink considers the SOA business case as an essential SOA artifact. Architects must have a clear picture of the business motivations for SOA, not only at the beginning of the initiative, but also as the architecture rolls out. After all, business issues evolve over time, partly because of the changing nature of business, but also because properly iterative SOA efforts solve problems during the course of the overall initiative.

Even when architects ask the right question, however, there is still frequently a disconnect between the business problems and the SOA approach. The challenge here is that the architects -- or more broadly, the entire SOA team -- are only one part of the bigger picture, especially in large organizations. In the enterprise context, how the business asks for IT capabilities in the broad sense is often at the root of the issue.

SOA Business Driver Pitfalls

Here are some real-world examples of how we've seen the issue of unclear business drivers for SOA play out in various enterprises. We've generalized a few of the details to protect the innocent (and the guilty!).
The SOA Mandate at the US Department of Defense (DoD) -- In aggregate, the DoD is perhaps the largest implementer of SOA in the world, in large part because they have an organization-wide SOA mandate. It's true they have high-level drivers for this mandate, including increased business agility and a more cost-effective approach to siloed IT resources. But those general drivers don't help much when a defense contractor must create a specific implementation.

This particular DoD project involved a contractor who delivered a perfectly functional SOA implementation to their client. The client, however, found it to be entirely unsatisfactory, and the entire multi-million dollar effort was canceled, the money down the tubes. What happened? The problem was a disconnect between high-level business drivers, like business agility, and the specific business problems the agency in question wanted to solve. The fact that the client wanted to "do SOA," and the contractor therefore "delivered SOA" to the client, only made the situation worse.

The most important take-away from this fiasco is that SOA wasn't the problem. From all the information we gathered, the contractor properly designed and implemented the architecture, that is, followed the technical best practices that constitute the practice of SOA. In essence they built a perfectly functioning tool for the wrong job. Fundamentally, the client should have specified the problems they were looking to solve, instead of specifying SOA as their requirement.

SOA for the Enterprise Morass -- ZapThink recently received a SOA requirements document from a financial services client who asked us to review it and provide our recommendations. This document contained several pages of technical requirements, but when we got to the business requirements section, it was simply marked "to be determined." Naturally, we pointed out that among the risks this project faced was the lack of a clear business case.

Our recommendation was to flesh out the business case by discussing business drivers for the SOA initiative with business stakeholders. In response, the client told us that it wasn't feasible to speak with the business side. The entire SOA initiative focused on the IT organization, and the business wasn't directly involved.

We pressed them on this point, explaining how critical having clear business drivers is for the success of the SOA initiative. Why is communicating with the business such an issue for them anyway? Were they afraid of such interactions? Did they not know who the stakeholders were? Or perhaps there was no business motivation for the project at all?

The problem, as it turned out, was more subtle. They described the challenge they faced as the "enterprise morass." As happens in so many very large organizations, there are no clear communication patterns connecting business and IT. Yes, there are business stakeholders, and yes, business requirements drive IT projects in a broad sense, but there are so many players on both sides of the business/IT equation that associating individual business sponsors with specific IT projects is a complex, politically charged challenge. As a result, the SOA team can only look to the management hierarchy within IT for direction, under the assumption that at some executive level, IT speaks to the business in order to get specific drivers for various IT initiatives, including the SOA effort. Speaking directly to business sponsors, however, is off limits.

The SOA Capability Conundrum -- The chief SOA architect at this European firm was in one of our recent Licensed ZapThink Architect Bootcamps, and when we got to the Business Case exercise, he explained that in their organization, the motivation for SOA was to build out the SOA capability. His argument was as follows: if we implement SOA, deploying a range of Services that can meet a variety of business needs in the context of a flexible infrastructure that supports loose coupling and reusability, then we'll be well-positioned for any requirements the business might throw at us in the future.

The reasoning behind this argument makes sense, at least on a superficial level. Let's build several Business Services that represent a broad range of high-value IT capabilities in such a way that when the business comes to us with requirements, we're bound to be able to meet those requirements with the Service capabilities we've deployed ahead of time. The business is bound to be ecstatic that we planned ahead like this, anticipating their needs before they gave us specific requirements!

While this organization might very well get lucky and find that they built cost-effective capabilities that the business will need, taking this "fire-ready-aim" approach to SOA dramatically increases the risks of the initiative. After all, it's difficult enough to build reusable Services when you have clear initial requirements for those Services. Building such Services in the absence of any specific requirements is just asking for trouble.
The ZapThink Take

If you take a close look at the three scenarios above, you'll notice that the stories don't really have to be about SOA at all. You could take SOA out of the equation and replace it with any other IT effort aimed at tackling enterprise-level issues and you might still have the same pitfalls. Master data management, customer relationship management, or business process management are all examples of cross-organizational initiatives that might succumb to the same sorts of disconnects between business and IT.

At the root of all of these issues is the dreaded phrase "business/IT alignment." It seems that the larger the organization, the more difficult it is to align IT capabilities with business drivers. Sometimes the problem is that the business asks for a particular solution without understanding the problem (like the DoD's SOA mandate), or perhaps a combination of politics and communication issues interfere with business/IT alignment (the enterprise morass), or in other cases IT jumps the gun and delivers what it thinks the business will want (the capability conundrum). In none of these instances is the problem specific to SOA.

SOA, however, can potentially be part of the solution. As ZapThink has written about before, the larger trend of which SOA is a part is a movement toward formalizing the relationship between business and IT with flexible abstractions, including Business Services, Cloud capabilities, and more. If you confuse this broader trend with some combination of technologies, however, you're falling for the straw man that gave rise to the "SOA is dead" meme. On the other hand, if you take an architectural approach to aligning business drivers with IT capabilities that moves toward this new world of formalized abstraction, then you are taking your first steps on the long road to true, enterprise-level business/IT alignment.

This guest post comes courtesy of ZapThink. Jason Bloomberg is managing partner at ZapThink. You can reach him here.


SPECIAL PARTNER OFFER

SOA and EA Training, Certification,
and Networking Events

In need of vendor-neutral, architect-level SOA and EA training? ZapThink's Licensed ZapThink Architect (LZA) SOA Boot Camps provide four days of intense, hands-on architect-level SOA training and certification.

Advanced SOA architects might want to enroll in ZapThink's SOA Governance and Security training and certification courses. Or, are you just looking to network with your peers, interact with experts and pundits, and schmooze on SOA after hours? Join us at an upcoming ZapForum event. Find out more and register for these events at http://www.zapthink.com/eventreg.html.

Thursday, July 23, 2009

Cloud security depends on the human element

This BriefingsDirect guest post comes to you courtesy of Andras Robert Szakal, director of software architecture for the U.S. Federal Software Group at IBM and a member of The Open Group board of directors. He can be contacted here.

By Andras Robert Szakal

I’m spending a considerable amount of time these days thinking about cyber security. Not just contemplating the needs of my customers but working with other industry thought leaders within IBM and externally on how to address this convoluted, politically charged, complex and possibly lucrative domain. Much of our thinking has centered around new government programs and game changing technologies that seek to solve the increasing cyber security challenge.

Yesterday, another cyber security report was released, and instead of new technology, it suggested the problem was with us – the people – as in us humans. The study suggests that we need more educated, smart folks to thwart those evil hackers and prevent nation-state attacks. Oh, and embedded in the report is the notion that technical folks would like to have some runway in their careers. Gee, that’s a novel idea that we have been pushing as part of the professional certification programs within The Open Group over the last five years. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

I’m sitting in the cloud computing stream today at the 23rd Open Group Enterprise Architecture Practitioners Conference in Toronto. Not surprisingly, every presenter and panel eventually settles on the subject of cloud resiliency and securability, and the need for skilled architects capable of effectively applying these technologies.

I can’t help but think that most of the industry’s challenges with implementing shared services and effectively protecting the infrastructure stem from organizations not having the proper architectural skills. Most vulnerabilities are seeded in bad architectural decisions. Likewise, many poorly constructed services – SOA, cloud, or otherwise – are the result of poor design and a lack of architectural skill. The challenge here is that high-value architects are difficult to grow and often more difficult to retain.

Back to the cyber report released July 22.

The fact is that most organizations have created an artificial barrier between IT professionals and business professionals. The line-of-business professionals, management, and executives are more valued than the techies running the IT shop. Some headway has been made in the integration between IT and the business. But for the most part, they still exist as separate entities. No wonder the cyber report suggests that prospective high-value cyber security specialists and architects don’t see a future in a cyber security career.

How do we address this challenge?

Let me offer a few ideas. First, ensure these folks have architectural as well as cyber security skills. This will allow them to think in the context of the business and find opportunities to move from IT to a line-of-business position as their careers grow. Ultimately, the IT teams must be integrated into the business itself. As the report suggests, it’s necessary to establish a career path for technologists, but more importantly, technical career paths must be integrated into the overall career framework of the business.

This BriefingsDirect guest post comes to you courtesy of Andras Robert Szakal, director of software architecture for the U.S. Federal Software Group at IBM and a member of The Open Group board of directors. He can be contacted here.

Tuesday, July 21, 2009

Enterprises seek better ways to discover, manage and master their information explosion headaches

Listen to the podcast. Download the podcast. Find it on iTunes/iPod and Podcast.com.

Read a full transcript. Download the transcript. Learn more. Sponsor: Hewlett-Packard.

Join a free HP Solutions Virtual Event on July 28 on four main IT themes. Learn more. Register.

Businesses of all stripes need better means of access, governance, and data lifecycle best practices, given the vast ocean of new information coming from many different directions.

By getting a better handle on information explosion, enterprises can gain clarity in understanding what is really going on within the businesses, and, especially these days, across dynamic market environments.

The immediate solution approach requires capturing, storing, managing, finding, and using information better. We’ve all seen a precipitous drop in the cost of storage and a dramatic rise in the incidents of data from all kinds of devices and across more kinds of business processes, from sensors to social media.

To help better understand how to best manage and leverage information, even as it’s exploding around us, I recently spoke with Suzanne Prince, worldwide director of information solutions marketing at Hewlett-Packard (HP).

Here are some excerpts:
Prince: We’re noticing major shifts going on in the business environment, which are partially driven by the economy, but they were already happening anyway.

We’re moving more into the collaboration age, with flatter organizations. And the way information is consumed is changing rapidly. We live in the always-on age, and we all expect and want instant access, instant gratification for whatever we want. It’s just compounding the problems.

We did a survey in February of this year in several countries around the world. It was both for IT and line-of-business decision makers. The top business priority for those people that we talked to, way above everything else, was having the right information at the right time, when needed. It was above reducing operating costs, and even above reducing IT costs. So what it’s telling us is how business managers see this need for information as business critical.

[Yet] you often hear people saying that information is life -- it’s the lifeblood of an organization. But, in reality, that analogy breaks down pretty quickly, because it does not run smoothly through veins. It’s sitting in little pockets everywhere, whether it’s the paper files ... that get lost, on your or my memory sticks, on our laptops, or in the data center.

The growth in unstructured content is double the growth that’s going on in the structured world. ... For the longest time now, IT has really focused on the structured side of data, stuff that’s in databases. But, with the growth of content that was just mentioned -- whether it's videos, Twitter tweets, or whatever -- we’re seeing a massive uptick in the problems around content storage.

The whole category of information governance really comes into play when you start talking about cloud computing, because we’ve already talked about the fact that we’ve got disparate sources, pockets of information throughout an organization. That’s already there now. Now, you open it up with cloud and you’ve got even more.

There are quality issues, security issues, and data integration issues, because you most likely want to pull information from your cloud applications or services and integrate that within something like a customer relationship management (CRM) system to be able to pull business intelligence (BI) out.

You also need to have a governance plan that brings together business and IT. This is not just an IT problem, it’s a business problem, and all parties need to be at the table.

Another area to look at is content repository consolidation, or data mart consolidation. I’m talking about consolidating the content and data stores.

You really need to look at deleting what I would call "nuisance information," so that you’re not storing things you don’t need to. In other words, if I’m emailing you to see if you’d like to come have a cup of coffee, that doesn’t need to be stored. So, it’s about optimizing storage and optimizing your data center infrastructure.
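The idea of purging "nuisance information" usually gets implemented as retention rules: each category of content is kept for a defined period and becomes eligible for deletion afterward. Here is a minimal sketch of that logic; the categories and retention periods are hypothetical examples, not any particular product's policy.

```python
from datetime import datetime, timedelta

# Hypothetical retention rules: content category -> how long to keep it.
RETENTION = {
    "contract": timedelta(days=365 * 7),   # long-lived legal material
    "invoice": timedelta(days=365 * 3),
    "chitchat": timedelta(days=30),        # "want to grab a coffee?" emails
}

def is_expired(category: str, created: datetime, now: datetime) -> bool:
    """Return True if an item has outlived its retention period."""
    keep_for = RETENTION.get(category, timedelta(days=365))  # default: 1 year
    return now - created > keep_for

now = datetime(2009, 7, 21)
print(is_expired("chitchat", datetime(2009, 5, 1), now))  # True: past 30 days
print(is_expired("contract", datetime(2005, 1, 1), now))  # False: within 7 years
```

In practice the hard part is classification, i.e., deciding which bucket an item falls into, which is why the deletion decision is kept separate from the rules table here.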

People are now looking at information as the issue. Before they would look at the applications as the issue. Now, there's the realization that, when we talk about IT, there is an "I" there that says "Information." In reality, the work product of IT is information. It’s not applications. Applications are what move it around, but, at the end of the day, information is what is produced for the business by IT.

In HP Labs, we have eight major focus areas, and I would categorize six of them as being focused on information -- the next set of technology challenges. It ranges all the way from content transformation, which is the complete convergence of the physical and digital information, to having intelligent information infrastructure. So, it’s the whole gamut. But, six out of eight of our key projects are all based on information, information processing, and information management.
Listen to the podcast. Download the podcast. Find it on iTunes/iPod and Podcast.com.

Read a full transcript. Download the transcript. Learn more. Sponsor: Hewlett-Packard.

Join a free HP Solutions Virtual Event on July 28 on four main IT themes. Learn more. Register.

TIBCO joins governance ecology support with HP's SOA lifecycle software

TIBCO Software is expanding its governance solutions for service-oriented architecture (SOA) and will now provide support for Hewlett-Packard (HP) SOA Systinet lifecycle governance software.

ActiveMatrix Policy Manager and Service Performance Manager from TIBCO combined with HP SOA Systinet are designed to reduce risk and improve management of SOA environments, including identifying and defining appropriate services, managing the lifecycle of service assets, and measuring effectiveness. [Disclosure: TIBCO and HP are sponsors of BriefingsDirect podcasts.]

Governance has proved important in SOA adoption because it prevents delays in software delivery caused by compliance and interoperability issues, security risks, and poor service quality. The topic has been a hot one at this week's Open Group conference in Toronto.

Companies that have been early adopters of end-to-end governance have seen significant results. One global telecom company saw a 327 percent return on investment (ROI) over three years, something it attributed to well-managed SOA governance, according to TIBCO.

We should expect to see more governance ecology cooperation like this one. And these same vendors should support standards and collaboration efforts like the Jericho Forum and Cloud Security Alliance, if the promise of cloud computing is to be realized. Vendors and/or cloud providers that try to provide it all their own way will only delay or sabotage the benefits that cloud can provide.

Open Group conference shows how security standards and governance hold keys to enterprise cloud adoption

This BriefingsDirect guest post comes courtesy of Jim Hietala, vice president of security, The Open Group. You can reach him here.

By Jim Hietala

Spending the early part of this week in The Open Group Security Forum meetings, I have been struck by the commonality of governance, risk, compliance, and audit issues between physical IT infrastructure today, and virtual and cloud environments in the (very) near future. Issues such as:
  • Moving away from manual compliance processes, toward automated test, measurement, and reporting on compliance status for large IT infrastructure. When you are talking about physical infrastructure, manual compliance is difficult, expensive in labor cost, and sub-optimal -- given that many organizations choose to sample just a few representative systems for compliance, rather than actually testing the entire environment. When you are talking about virtual environments and cloud services, manual compliance processes just won’t work; automation will be key.

  • Incompatible log formats output by physical devices continue to be a problem for the industry, one that manifests itself in problems for security information and event management systems, log management systems, and auditors. Ditto for virtual and cloud environments, at much larger scale.

  • Managing security configurations across physical versus virtual and cloud environments provides similar challenges. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]
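The first bullet above, replacing manual sampling with automated fleet-wide checks, boils down to comparing each host's reported configuration against a policy baseline and flagging drift. A minimal sketch follows; the policy settings and host inventory are invented for illustration and stand in for whatever a real configuration-collection agent would report.

```python
# Hypothetical policy baseline: setting name -> required value.
POLICY = {
    "ssh_root_login": "disabled",
    "password_min_length": 12,
    "audit_logging": "enabled",
}

def check_host(config):
    """Return a list of (setting, expected, actual) violations for one host."""
    return [(key, want, config.get(key))
            for key, want in POLICY.items()
            if config.get(key) != want]

# Invented inventory standing in for data gathered from every system,
# rather than from a hand-picked "representative" sample.
fleet = {
    "web01": {"ssh_root_login": "disabled", "password_min_length": 12, "audit_logging": "enabled"},
    "db01":  {"ssh_root_login": "enabled",  "password_min_length": 8,  "audit_logging": "enabled"},
}

for host, cfg in fleet.items():
    violations = check_host(cfg)
    status = "PASS" if not violations else "FAIL: " + ", ".join(v[0] for v in violations)
    print(f"{host}: {status}")
```

Because the check runs over the entire inventory, the report covers every system instead of a sample, which is exactly the property that manual processes lose at scale.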
Emerging-standards work from the Security Forum, originally conceived to solve some of these issues in traditional IT environments (in-house, physical servers), will have important applications in cloud and virtualization scenarios. In fact, given the scale and agility these environments provide, it is hard to imagine adequately addressing audit and compliance concerns without standards that provide for “scalable automation.”

The Automated Compliance Expert Markup Language standards initiative will address issues of security configuration and compliance alerting and reporting across physical, virtual, and cloud environments. The revised XDAS standard from The Open Group will address audit incompatibility issues. Both of these standards efforts are works in progress, and our standards process is truly an open one. If your organization is a customer grappling with these issues, or a vendor whose product might benefit from implementing these standards, we invite you to learn more.
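The audit-incompatibility problem XDAS targets is that every device invents its own log format. The usual fix is a normalized event record with a common set of fields. The sketch below shows the general shape of such a record; the field names are a hypothetical illustration of the idea, not the actual XDAS schema.

```python
import json

# Hypothetical normalized audit record, illustrating the kind of common
# fields an audit standard defines. This is NOT the real XDAS schema.
def audit_event(timestamp, initiator, action, target, outcome):
    return {
        "timestamp": timestamp,  # when the event occurred (ISO 8601)
        "initiator": initiator,  # who performed the action
        "action": action,        # what was attempted
        "target": target,        # what it was performed on
        "outcome": outcome,      # e.g. "success", "denied", "failure"
    }

event = audit_event("2009-07-21T12:00:00Z", "alice", "login", "vm-042", "denied")
print(json.dumps(event))
```

Once each source's native log lines are translated into one record shape like this, SIEM tools and auditors can correlate events across vendors without per-device parsers.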
