Friday, February 11, 2011

Some thoughts on the Microsoft and Nokia tag team on mobile or bust news

Given what they are and where they have been, there's little logical reason for Microsoft not to be dominating the mobile smartphone computing landscape. And it should have been a done deal in many major global markets at least four years ago.

The only reason that Microsoft is now partnering with Nokia on mobile -- clearly not the client giant's first and primary strategy for winning the market -- is a lack of execution. I recall speaking to Microsofties as many as 10 years ago, and they were all-in on the importance of, and imperative for, mobile platforms. Windows CE's heritage is long and deep. Nokia just as well knew the stakes, knew the technology directions, knew the competition.

Now. In the above two paragraphs, swap the words "Microsoft" and "Nokia." Still works. Both had huge wind in their sails (sales?) to steer into the mobile category for keeps, nay, to define and deliver the mobile category to a hungry world and wireless-provider landscape ... on their, the platform providers', terms!

So now here we have two global giants, each of which had a lead, one may even say a monopoly or monopoly-adjacency, in mobile and in platforms and tools for mobile. And now it is together, somehow federated -- rather than separately or in a traditional OEM partnership -- that they will rear up and gallop toward the front of the mobile device pack: the iOS, Android, RIM and HP-Palm pack. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

How exactly is their respective inability, Microsoft's and Nokia's, to execute separately amid huge market-position advantages enhanced by now trying to execute in cahoots ... loosely, based mostly on a common set of foes? I'll point you to the history of such business alliances, often based on fear: it's no better than the history of big technology mergers and acquisitions. It stinks. It stinks for end users, investors, partners and employees.

But why not reward the leadership of these laggards with some more perks and bonuses? Works in banking.

A developer paradise

And talk about an ace in the hole. Not long ago, hordes of developers and ISVs -- an entire global ecosystem -- were begging Microsoft to show them the mobile way, how to use their Visual Studio skills to skin the new cat of mobile apps. They were sheep waiting to be led (and not to slaughter). The shepherd, it turned out, was out to lunch. Wile E. Coyote, super genius.

And execution is not the only big reason these companies have found themselves scrambling as the world around them shifts mightily away. Both Microsoft and Nokia clearly had innovator's dilemma issues in droves. But these were no secret. (See reason one above on execution again ... endless loop).

Microsoft had the fat PC business to protect, which, as usual, divided the company on how to proceed on any other course, Titanic-like. Nokia had the mobile voice business and the mobile telecom provider channel to protect. So many masters, so many varieties of handsets and localizations to cough up. Motorola had a tough time with that one, too. Yes, it was quite a distraction.

But again, how do these pressures to remain inert inside of older models change by the two giants teaming up? Unless they spin off the right corporate bits and re-assemble them together under a shared brand, and go after the market anew, the financial pressures not to change fast remain steadfast. (See reason one above on execution again ... endless loop).

What's more, there's no time to pull off such a corporate shell game. The developers are leaving (or have left), the app store model is solidifying elsewhere, and the carriers are being pulled by end-users' expectations (and soon enterprises'). This Microsoft-Nokia mashup is an eighth-inning change in the lineup, and there's no time to go back to spring training and create a new team.

Too little, too late

Nope, I just can't see how these synergies signal anything but a desperation play. Too little, too late, too complex, too hard to execute. Too much baggage.

At best, the apps created for a pending Nokia-Microsoft channel, née platform, will be fourth down the list for native app support. More likely, HTML 5 and mobile web support (standards, not native) may prove enough to cover whatever market Microsoft and Nokia muster together. But that won't be enough to reverse their lackluster mobile position, or to get them the synergies they crave.

Both Microsoft and Nokia were dark horses in the race for mobile devices and associated cloud services. Attempting to hitch the two horses together with baling wire and press releases doesn't get them any kind of leg up on the competition. It may even hobble them for good.

Wednesday, February 9, 2011

The golden thread of interoperability runs deep at Open Group conference

This guest post comes courtesy of Dr. Chris Harding, Director for Interoperability and SOA at The Open Group.

By Dr. Chris Harding

SAN DIEGO -- There are so many things going on at every Conference by The Open Group that it is impossible to keep track of all of them, and this week’s conference here is no exception. The main themes are cybersecurity, enterprise architecture, SOA and cloud computing. Additional topics range from real-time and embedded systems to quantum lifecycle management.

But there are a number of common threads running through all of those themes, relating to value delivered to IT customers through open systems. One of those threads is interoperability.

Interoperability panel session

The interoperability thread showed strongly in several sessions on the opening day of the conference, Monday Feb. 7, starting with a panel session on Interoperability Challenges for 2011 that I was fortunate to have been invited to moderate. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

The panelists were Arnold van Overeem of Capgemini, chair of the Architecture Forum’s Interoperability project, Ron Schuldt, the founder of UDEF-IT and chair of the Semantic Interoperability Work Group’s UDEF project, TJ Virdi of Boeing, co-chair of The Open Group Cloud Computing Work Group, and Bob Weisman of Build-the-Vision, chair of The Open Group Architecture Forum’s Information Architecture project. The audience was drawn from many companies, both members and non-members of The Open Group, and made a strong contribution to the debate.

What is interoperability? The panel described several essential characteristics:
  • Systems with different owners and governance models work together;
  • They exchange and understand data automatically;
  • They form an information-sharing environment in which business information is available in the right context, to the right person, and at the right time; and
  • This environment enables processes, as well as information, to be shared.
Interoperability is not just about the IT systems. It is also about the ecosystem of user organizations, and their cultural and legislative context.

Semantics is an important component of interoperability. It is estimated that 65 percent of data warehouse projects fail because of their inability to cope with a huge number of data elements, differently defined.
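That failure mode is easy to make concrete. Here is a minimal sketch, with invented system names, field names, and values, of two systems describing the same business fact differently, and the kind of hand-maintained mapping into a shared vocabulary that warehouse projects end up multiplying across thousands of elements:

```python
# Two systems describe the same business fact with different names,
# units, and encodings -- the classic semantic-mismatch problem.
# All names and figures here are illustrative.
crm_record = {"cust_name": "Acme Corp", "annual_rev": 12.5}      # revenue in $M
erp_record = {"customerName": "Acme Corp", "revenue": 12500000}  # revenue in $

def to_canonical(record, mapping, transforms):
    """Rename each source field to its canonical name, applying any
    unit/encoding transform registered for that canonical field."""
    canonical = {}
    for src_field, canon_field in mapping.items():
        value = record[src_field]
        transform = transforms.get(canon_field)
        canonical[canon_field] = transform(value) if transform else value
    return canonical

# One mapping (plus transforms) per source system -- the maintenance
# burden that grows with every differently defined data element.
crm_map = {"cust_name": "customer.name", "annual_rev": "customer.annual_revenue_usd"}
crm_transforms = {"customer.annual_revenue_usd": lambda millions: millions * 1_000_000}
erp_map = {"customerName": "customer.name", "revenue": "customer.annual_revenue_usd"}

a = to_canonical(crm_record, crm_map, crm_transforms)
b = to_canonical(erp_record, erp_map, {})
assert a == b  # the two records now agree
```

Multiply those per-system mappings by hundreds of feeds and thousands of elements, and the 65 percent failure estimate becomes easy to believe.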

There is a constant battle for interoperability. Systems that lock customers in by refusing to interoperate with those of other vendors can deliver strong commercial profit. This strategy is locally optimal but globally disastrous; it gives benefits to both vendors and customers in the short term, but leads in the longer term to small markets and siloed systems. The front line is shifting constantly. There are occasional resounding victories – as with the introduction of the Internet – but the normal state is trench warfare with small and painful gains and losses.

Blame for lack of interoperability is often put on the vendors, but this is not really fair. Vendors must work within what is commercially possible. Customer organizations can help the growth of interoperability by applying pressure and insisting on support for standards. This is in their interests; integration required by lack of interoperability is currently estimated to account for over 25 percent of IT spend.

SOA has proved a positive force for interoperability. By embracing SOA, a customer organization can define its data model and service interfaces, and tender for competing solutions that conform to its interfaces and meet its requirements. Services can be shared processing units forming part of the ecosystem environment.

The latest IT phenomenon is cloud computing. This is in some ways reinforcing SOA as an interoperability enabler. Shared services can be available on the cloud, and the ease of provisioning services in a cloud environment speeds up the competitive tendering process.

But there is one significant area in which cloud computing gives cause for concern: lack of interoperability between virtualization products. Virtualization is a core enabling technology for cloud computing, and virtualization products form the basis for most private cloud solutions. These products are generally vendor-specific and without interoperable interfaces, so that it is difficult for a customer organization to combine different virtualization products in a private cloud, and easy for it to become locked in to a single vendor.

There is a need for an overall interoperability framework within which standards can be positioned, to help customers express their interoperability requirements effectively. This framework should address cultural and legal aspects, and architectural maturity, as well as purely technical aspects. Semantics will be a crucial element.

Such a framework could assist the development of interoperable ecosystems, involving multiple organizations. But it will also help the development of architectures for interoperability within individual organizations – and this is perhaps of more immediate concern.

The Open Group can play an important role in the development of this framework, and in establishing it with customers and vendors.

SOA/TOGAF practical guide


SOA is an interoperability enabler, but establishing SOA within an enterprise is not easy to do. There are many stakeholders involved, with particular concerns to be addressed. This presents a significant task for enterprise architects.

TOGAF has long been established as a pragmatic framework that helps enterprise architects deliver better solutions. The Open Group is developing a practical guide to using TOGAF for SOA, as a joint project of its SOA Work Group and The Open Group Architecture Forum.

This work is now nearing completion. Ed Harrington of Architecting-the-Enterprise had overcome the considerable difficulty of assembling and adding to the material created by the project to form a solid draft. This was discussed in detail by a small group, with some participants joining by teleconference. As well as Ed, this group included Mats Gejnevall of Capgemini and Steve Bennett of Oracle, and it was led by project co-chairs Dave Hornford of Integritas and Awel Dico of the Bank of Montreal.

The discussion resolved all the issues, enabling the preparation of a draft for review by The Open Group, and we can expect to see this valuable guide published at the conclusion of the review process.

UDEF deployment workshop

The importance of semantics for interoperability was an important theme of the interoperability panel discussion. The Open Group is working on a specific standard that is potentially a key enabler for semantic interoperability: the Universal Data Element Framework (UDEF).
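The core idea behind the UDEF is that systems stop reconciling field names pairwise and instead tag each data element with an ID from a shared, tree-structured vocabulary. The sketch below illustrates that idea only; the IDs and element names are invented for illustration and are not real UDEF assignments:

```python
# Illustrative sketch of the UDEF idea: each system tags its local
# field names with a shared framework ID, so any two systems can
# "meet in the middle" without a bespoke pairwise mapping.
# The IDs and names below are invented, not real UDEF values.
UDEF_TAGS = {
    "a.2_5.35": "Person.Employee_Family-Name",
    "a.2_5.36": "Person.Employee_Given-Name",
}

# Each system publishes its own field-name -> framework-ID tagging.
crm_elements = {"lastName": "a.2_5.35", "firstName": "a.2_5.36"}
hr_elements = {"surname": "a.2_5.35", "forename": "a.2_5.36"}

def translate(field, source_elements, target_elements):
    """Map a field name in one system to its counterpart in another
    via the shared framework ID, with no direct mapping between them."""
    tag = source_elements[field]
    for target_field, target_tag in target_elements.items():
        if target_tag == tag:
            return target_field
    raise KeyError(f"no element tagged {tag} in target system")

assert translate("lastName", crm_elements, hr_elements) == "surname"
```

With N systems, each needs only one tagging against the framework, rather than N-1 pairwise mappings, which is the scaling argument for this style of semantic interoperability.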

It had been decided at the previous conference, in Amsterdam, that the next stage of UDEF development should be a deployment workshop. This was discussed by a small group, under the leadership of UDEF project chair Ron Schuldt, again with some participation by teleconference.

The group included Arnold van Overeem of Capgemini, Jayson Durham of the US Navy, and Brand Niemann of the Semantic Community. Jayson is a key player in the Enterprise Lexicon Services (ELS) initiative, which aims to provide critical information-interoperability capabilities through common lexicon and vocabulary services. Brand is a major enthusiast for semantic interoperability, with connections to many US semantic initiatives, and currently to the Air Force OneSource project in particular, which is evolving a data analysis tool used internally by the USAF Global Cyberspace Integration Center (GCIC) Vocabulary Services Team and made available to the general data management community. The participation of Jayson and Brand provided an important connection between the UDEF and other semantic projects.

As a result of the discussions, Ron will draft an interoperability scenario that can be the basis of a practical workshop session at the next conference, which is in London.

Complex cloud environments

Cloud Computing is the latest hot technology, and its adoption is having some interesting interoperability implications, as came out clearly in the Interoperability panel session. In many cases, an enterprise will use, not a single cloud, but multiple services in multiple clouds. These services must interoperate to deliver value to the enterprise. The Complex Cloud Environments conference stream included two very interesting presentations on this.

The first, by Mark Skilton and Vladimir Baranek of Capgemini, showed how new notations for cloud can create better understanding and adoption of new cloud-enabled services, and can clarify the impact of social and business networks. As cloud environments become increasingly complex, the need to explain them clearly grows.

Consumers and vendors of cloud services must be able to communicate. Stakeholders in consumer organizations must be able to discuss their concerns about the cloud environment. The work presented by Mark and Vladimir grew from discussions in a CloudCamp that was held at a previous Conference by The Open Group. We hope that it can now be developed by The Open Group Cloud Computing Work Group to become a powerful and sophisticated language to address this communication need.

The second presentation, from Soobaek Jang of IBM, addressed the issue of managing and coordinating across a large number of instances in a cloud computing environment. He explained an architecture for “Multi-Node Management Services” that acts as a framework for auto-scaling in a SaaS lifecycle, putting structure around self-service activity, and providing a simple and powerful web service orientation that allows providers to manage and orchestrate deployments in logical groups.
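The detailed design of those Multi-Node Management Services wasn't published with the talk, but the general pattern it describes -- grouping instances logically and letting a management pass scale each group against its own load -- can be sketched roughly as follows. The group names, thresholds, and scaling policy here are all hypothetical:

```python
# Toy sketch of auto-scaling over logical instance groups.
# Group names, thresholds, and step sizes are invented for
# illustration; this is not IBM's actual architecture.
from dataclasses import dataclass

@dataclass
class InstanceGroup:
    name: str
    instances: int
    min_instances: int = 1
    max_instances: int = 10

    def scale(self, avg_load: float) -> int:
        """Return instances to add (positive) or remove (negative)."""
        if avg_load > 0.8 and self.instances < self.max_instances:
            return 1
        if avg_load < 0.2 and self.instances > self.min_instances:
            return -1
        return 0

groups = {"web": InstanceGroup("web", 4), "workers": InstanceGroup("workers", 2)}

def reconcile(loads: dict) -> dict:
    """One management pass: decide and apply a scaling action per
    logical group, so each group scales independently of the others."""
    actions = {}
    for name, group in groups.items():
        delta = group.scale(loads[name])
        group.instances += delta
        actions[name] = delta
    return actions

actions = reconcile({"web": 0.9, "workers": 0.1})
assert actions == {"web": 1, "workers": -1}
```

The point of the structure is the one Jang made: self-service activity happens inside policy boundaries per logical group, rather than instance by instance.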

SOA conference stream

The principal presentation in this stream picked up on one of the key points from the Interoperability panel session in a very interesting way. It showed how a formal ontology can be a practical basis for common operation of SOA repositories. Semantic interoperability is at the cutting edge of interoperability, and is more often the subject of talk than of action. The presentation included a demonstration, and it was great to see the ideas put to real use.

The presentation was given jointly by Heather Kreger, SOA Work Group Co-chair, and Vince Brunssen, Co-chair of SOA Repository Artifact Model and Protocol (S-RAMP) at OASIS. Both presenters are from IBM. S-RAMP is an emerging standard from OASIS that enables interoperability between tools and repositories for SOA. It uses the formal SOA Ontology that was developed by The Open Group, with extensions to enable a common service model as well as an interoperability protocol.

This presentation illustrated how S-RAMP and the SOA Ontology work in concert with The Open Group SOA Governance Framework to enable governance across vendors. It contained a demonstration that included defining new service models with the S-RAMP extensions in one SOA repository and communicating with another repository to augment its service model.

To conclude the session, I gave a brief presentation on SOA in the Cloud – the Next Challenge for Enterprise Architects. This discussed how the SOA architectural style is widely accepted as the style for enterprise architecture, and how cloud computing is a technical possibility that can be used in enterprise architecture. Architectures using cloud computing should be service-oriented, but this poses some key questions for the architect. Architecture governance must change in the context of cloud-based ecosystems. It may take some effort to keep to the principles of the SOA style – but it will be important to do this. And the organization of the infrastructure – which may migrate from the enterprise to the cloud – will present an interesting challenge.

Enabling semantic interoperability

The day was rounded off by an evening meeting, held jointly with the local chapter of the IEEE, on semantic interoperability. The meeting featured a presentation by Ron Schuldt, UDEF Project Chair, on the history, current state, and future goals of the UDEF.

The importance of semantics as a component of interoperability was clear in the morning’s panel discussion. In this evening session, Ron explained how the UDEF can enable semantic interoperability, and described the plans of the UDEF Project Team to expand the framework to meet the evolving needs of enterprises today and in the future.

This meeting was arranged through the good offices of Jayson Durham, and it was great that local IEEE members could join conference participants for an excellent session.

This guest post comes courtesy of Dr. Chris Harding, Director for Interoperability and SOA at The Open Group.

You may also be interested in:

Examining the current state of the enterprise architecture profession with The Open Group's Steve Nunn

Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Read a full transcript or download a copy. Sponsor: The Open Group.

Join an executive from The Open Group to examine the current state of enterprise architecture (EA) as part of The Open Group Conference in San Diego the week of Feb. 7, 2011. In this podcast summary blog, learn how EA is becoming more business-oriented and how organizing groups for the EA profession are consolidating and adjusting. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

Get an update on The Association of Open Group Enterprise Architects (AOGEA) and learn more about its recent merger with the Association of Enterprise Architects. What's more, receive an assessment of the current maturity levels and overall professionalism drive of EA, and learn more about what to expect from the EA field and these organizing groups over the next few years.

To delve into the current state of EA, we've interviewed Steve Nunn, Chief Operating Officer of The Open Group and CEO of The Association of Open Group Enterprise Architects. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some Q&A excerpts:
Gardner: Is EA dead or outmoded as a professional category?

Nunn: Absolutely not. EA is very much the thing of the moment, but it's also something that’s going to be with us for the foreseeable future too. Both inside The Open Group and the AOGEA, we're seeing significant growth and interest in the area of EA. In the association, it’s individuals becoming certified and wanting to join a professional body for their own purposes and to help the push to professionalize EA.

Within The Open Group it’s entities and organizations. Whether they be commercial, government, or academic, they are regularly joining The Open Group Architecture Forum. So, it's far from dead; in terms of importance to the business overall, EA is highly relevant.

A plenary session here at the conference is a good example. It's about using EA for business transformation. It's about using EA to tie IT into the business. There is no point in doing IT for IT's sake. It's there to support the business, and people are finding that one way of doing that is EA.

Gardner: Are the major trends around mobile, security, and cyber risk putting wind in your sails?

Nunn: Absolutely. We're seeing increasingly that you can't just look at EA in some kind of silo. It's more about how it fits. It's so central to an organization, and to the way organizations are built, that all of the factors you mentioned come into play. Security is a good one, as well as cloud. They're all impacted by EA, and EA has a role to play in all of those.

Inside the Open Group, what's happening is a lot of cross-functional working groups between the Architecture Forum, the Security Forum, and the Cloud Work Group, which is just recognition of that fact. But, the central tool of it is EA.

Gardner: What's important about certification for enterprise architects?

Nunn: Everyone seems to want to be an enterprise architect or an IT architect right now. It's the label to have on your business card. What we're trying to do is separate the true architects from the pretenders, and certification is a key part of that.

If you're an employer and you're looking to take somebody on to help in the EA role, then it’s having some means to assess whether somebody really has any experience of EA, whether they know any frameworks, and what projects they've led that involve EA. All those things are obviously important to know.

There are various certification programs, particularly in The Open Group, that help with that. The TOGAF Certification Program is focused on the TOGAF framework. At the other end of the spectrum is the ITAC Program, which is a skills and experience based program that assesses by peer review an individual’s experience in EA.

There are those, there are others out there, and there are more coming. One of the great things we see is the general acceptance of certification as a means to telling the wood from the trees.

Gardner: It was three years ago at this very event that the AOGEA was officially launched. Tell us what’s happened since then.

Nunn: Three years ago, we launched the association with 700 members. We were delighted to have that many at the start. As we sit here today, we have over 18,000 members. Over that period, we added members through more folks becoming certified, not only through The Open Group programs but through other programs as well. For example, we acknowledged the FIAC Certification Program as a valid path to full membership of the association.

We also embraced the Global Enterprise Architecture Organization (GEAO), and those folks, relevant to your earlier question, really have a particular business focus. We've also embraced the Microsoft Certified Architect individuals. Microsoft stopped its own program about a year ago now, and one of the things they encouraged their individuals who were certified to do was to join the association. In fact, Microsoft would help them pay to be members of the association, which was good.

So, the growth in membership reflects the interest in the area of EA, and the interest of individuals in advancing their own careers by being part of a profession.

Valuable resource

Enterprise architects are a highly valuable resource inside an organization, and we are promoting that message to the outside world. For our members as individuals, we're focusing on delivering the latest thinking in EA, moving toward best practices and white papers, and trying to give them, at this stage, a largely virtual community in which to engage with each other.

Where we have turned it into a real community is through local chapters. We now have about 20 local chapters around the world. The members have formed those. They meet at varying intervals, but the idea is to get face time with each other and talk about issues that concern enterprise architects and the advancement of the profession. It’s all good stuff. It’s growing by the week, by the month, in terms of the number of folks who want to do that. We're very happy with what has happened in three years.

Gardner: There are several EA organizations, and several communities have evolved around them. Now the AOGEA has announced its merger with the Association of Enterprise Architects (AEA). How does that shape up?

Nunn: Well, it is certainly a melding of the two. The two organizations actually became one in late fall last year, and obviously we have the usual post-merger integration things to take care of.

But, I think it’s not just a melding. The whole is greater than the sum of the parts. We have two different communities. We have the AOGEA folks, who have come primarily through the certification route, and we also have the AEA folks, who haven’t been so focused on certification, but they bring to the table something very important. They have chapters in different areas than the AOGEA folks, by and large.

Also, they have a highly respected quarterly publication called The Journal of Enterprise Architecture, along the lines of an academic journal, but with a leaning toward practitioners as well. The great thing is that it's now a membership benefit to the merged association's membership of over 18,000, rather than just the subscriber base before the merger.

As we develop, we're getting closer to our goal of being able to really promote the profession of EA in a coherent way. There are other groups beyond that, and there are the early signs of co-operation and working together to try to achieve one voice for the profession going forward.

Gardner: This also followed the GEAO merger with the AOGEA a year ago. It seems as if we're getting the definitive global EA organization. Tell me about this new über organization.

Nunn: Well, the first part of that is the easy part. We have consulted the membership multiple times now actually, and we are going to name the merged organization, The Association of Enterprise Architects. So that will keep things nice and simple and that will be the name going forward. It does encompass so far GEAO, AOGEA and AEA. It's fair to say that, as a membership organization, it is the leading organization for enterprise architects.

Role to play

There are other organizations in the ecosystem who are, for example, advocacy groups, training organizations, or certification groups, and they all have a role to play in the profession. But, where we're going with AEA in the future is to make that the definitive professional association for enterprise architects. It's a non-profit 501(c)(6) incorporated organization, which is there to act as the professional body for its members.

Gardner: Let’s get back to the notion of the enterprise architect as an entity. Where are we on a scale of 1 to 10?

Nunn: There's a long way to go, and to measure it on a scale of 1 to 10, I'd like to say higher, but it's probably about 2 right now, just because a lot of the things that need to be done to create a profession are partly done by one group or another, but not done in a unified way or with anything like one voice for the profession.

It's interesting. We did some research on how long we might expect it to take to achieve the status of a profession. Certainly, in the US at least, the shortest period of time taken so far was 26 years, by librarians, but typically it was closer to 100 years and, in fact, the longest was 170-odd years. So, we're doing pretty well. We're going pretty quickly compared to those organizations.

We're trying to do it on a global basis, which to my knowledge is the first time that's been done for any profession. If anything, that will obviously make things a little more complicated, but I think there is a lot of will in the EA world to make this happen, and a lot of support from all sorts of groups. Press and analysts are keen to see it happen, from the talks that we've had and the articles we've read. So, where there is a will there is a way. There's a long way to go, but we've made good progress in a short number of years, really.

Gardner: What's in these groups for the enterprise? What does a group like the AEA do for them?

Nunn: It's down to giving them the confidence that the folks that they are hiring or the folks that they are developing to do EA work within their enterprise are qualified to do that, knowledgeable to do that, or on a path to becoming true professionals in EA.

Certainly if you were hiring into your organization an accountant or a lawyer, you'd be looking to hire one that was a member of the relevant professional body with the appropriate certifications. That's really what we're promoting for EA. That’s the role that the association can play.

Confidence building

We'll have achieved success with the association when folks hiring enterprise architects will only look at those who are members of the association, because to do anything else would be like hiring an unqualified lawyer or accountant. It's about risk minimization and confidence building in your staff.

Gardner: You wear two hats, Chief Operating Officer at The Open Group and CEO of the AEA. How do these two groups relate?

Nunn: It's something I get asked periodically. The fact is that the association, whilst a separately incorporated body, was started by The Open Group. With these things, somebody has to start them and The Open Group's membership was all you needed for this to happen. So, very much the association has its roots in The Open Group and today still it works very closely with The Open Group in terms of how it operates and certain infrastructure things for the association are provided by The Open Group.

The support is still there, but increasingly the association is becoming a separate body. I mentioned the journal that’s published in the association's name; the association also has its own website and its own membership.

So, little by little, there will be more separation between the two, but the aims and interests of both are served by EA becoming recognized as a profession. It just couldn't have happened without The Open Group, and we intend to pay a lot of attention to what goes on inside The Open Group in EA. It's one of the leading organizations in the EA space, and a group that the association would be foolish not to pay attention to, in terms of the direction of certifications and what the members, who are enterprise architects, are saying, experiencing, and needing for the future.

It's a very close partnership, alongside partnerships with other groups. The association is not looking to take anyone's turf or tread on anyone’s toes, but to partner with the other groups in the ecosystem. Because if we work together, we'll get to this profession status a lot quicker. But certainly a key partner will be The Open Group.
Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Read a full transcript or download a copy. Sponsor: The Open Group.

You may also be interested in:

Deployit 3.0 enhances deployment automation

XebiaLabs recently launched a new version of its deployment automation software. Deployit 3.0 promises to take away the risks associated with manual or scripted Java deployments, as well as out-of-date procedures. The end result: more productive DevOps teams.

Coert Baart, CEO, XebiaLabs, rightly points out that the escalating adoption of virtualization and cloud computing is causing increasingly complex application deployments -- especially with roles of development and operations teams crossing over.

“Issues of virtual sprawl, lengthy scripts, and error-prone manual processes become all the more burdensome for companies tackling deployments without the right technologies in place,” Baart says. “With Deployit 3.0, companies can relieve these burdens and achieve the productivity they need in today’s increasingly dynamic markets.”

Removing the red tape

Deployit 3.0 works to remove the complexity of managing deployments. Here’s the secret sauce: the ability to work with large numbers of applications and servers and manage large environments. XebiaLabs says the additional insight into configuration items and deployments empowers companies to implement secure, self-service deployments that set the stage for continuous deployment scenarios.

Deployit 3.0 executes multiple deployments at the same time with one mouse click. Additional new features in Deployit 3.0 include a new, user-friendly interface, Python command-line interface, and Maven plugin, as well as the ability to compare packages, servers and applications.
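Executing many deployments at once is essentially a fan-out problem. The sketch below is generic Python, not Deployit's actual API; the `deploy` function is a hypothetical stand-in for a real deployment step (push artifact, restart, verify):

```python
from concurrent.futures import ThreadPoolExecutor

def deploy(package, server):
    """Hypothetical stand-in for one deployment step against one server."""
    return f"{package} deployed to {server}"

def deploy_everywhere(package, servers, max_workers=4):
    """Fan one package out to many servers concurrently, preserving server order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(lambda server: deploy(package, server), servers))

results = deploy_everywhere("petstore-1.2.war", ["app-01", "app-02", "app-03"])
```

The point of the sketch is the shape of the win: with a concurrent fan-out, total wall-clock time approaches the slowest single deployment rather than the sum of all of them.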

“At BGPI, part of Crédit Agricole, we constantly aim to deliver the best private banking services to our clients at competitive cost levels. However, too many errors in the deployment phase caused an increase in downtime for critical applications, leading to higher costs for our IT organization,” says Xavier Daguerre, Development Manager at BGPI. “With Deployit, we managed to significantly reduce the number of errors, as well as the time needed to deploy new applications or features. This contributed to lower deployment costs and overall, a higher business responsiveness.”
BriefingsDirect contributor Jennifer LeClaire provided editorial assistance and research on this post. She can be reached at http://www.linkedin.com/in/jleclaire and http://www.jenniferleclaire.com.

Tuesday, February 8, 2011

Cloud computing drives need for open standards to define and describe a new enterprise environment

This guest post comes courtesy of Mark Skilton of Capgemini Global Applications, and The Open Group.

By Mark Skilton

I recently looked back at some significant papers that had influenced my thinking on cloud computing as part of a review of current strategic trends. A February 2009 paper from the University of California, Berkeley, “Above the Clouds: A Berkeley View of Cloud Computing,” stands out as the first of many papers to draw out the issues around the promise of cloud computing and the technology barriers to achieving secure, elastic services.

The key issue unfolding at that time was the transfer of risk that resulted from moving to a cloud environment and the obstacles to security, performance and licensing that would need to evolve. But the genie was out of the bottle, as successful early adopters could see cost savings and rapid one-to-many monetization benefits of on-demand services. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

A second key moment was the realization that the exchange of services was no longer a simple request and response. Social networks had demonstrated huge communities of collaboration and online “personas” changing individual and business network interactions, but something else had happened -- less obvious but more profound.

This change was made most evident in the proliferation of mobile computing that greatly expanded the original on-premise move to off-premise services. A key paper by Intel Research titled “CloneCloud,” published around that same time period, exemplified this shift. Services could be cloned and moved into the cloud, demonstrating the possible new realities in redefining the real potential of how work gets done using cloud computing.

Remote services

The key point was that storage, processing transactions, media streaming, and complex calculations no longer had to be executed within a physical device. They could be provided as a service from a remote source: a virtual cloud service.
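This remote-execution idea, the core of approaches like CloneCloud, can be sketched as a toy in a few lines of Python (purely illustrative; `run_in_cloud` is a stub standing in for a real remote service call):

```python
def run_locally(data):
    """Cheap path: execute the task on the device itself."""
    return sum(data)

def run_in_cloud(data):
    """Stub for shipping the same task to a cloned cloud instance; a real
    system would serialize the task and invoke a remote service here."""
    return sum(data)

def execute(data, local_limit=1000):
    """Offload to the cloud clone only when the workload exceeds the device's budget."""
    if len(data) <= local_limit:
        return "local", run_locally(data)
    return "cloud", run_in_cloud(data)
```

The device runs small workloads itself and transparently offloads large ones, which is the essence of cloning a task into the cloud.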

But more significant was the term “multiplicity” in this concept. We see it every day as we download apps, stream video, and transact orders. You could run not just a few tasks but many simultaneously, picking and choosing among the services and results.

This signaled a big shift away from the old style of thinking about business services that had conditioned us to think of service-oriented requests in static, tiered, rigid ways. Those business processes and services missed this new, bigger picture. Just look at the phenomenon called "hyperlocal services," which offer location-specific, on-demand information, or at how crowdsourcing can dramatically transform purchasing choices and collaboration incentives.

Traditional ways of measuring, modeling and running business operations are under-utilizing this potential and under-valuing what can be possible in these new collaborative networks. The new multiplicity-based world of cloud-enabled networks means you can augment yourself and your company’s assets in ways that change the shape of your industry.

What is needed is a new language to describe how this shift feels and works, and how advances in your business portfolio can be realized with these modern ideas, often by examining current methods and standards of strategy visualization, metrics, and design to evolve a new expression of this potential.

Some two years have passed, and what has been achieved? Certainly we have seen a huge proliferation of services into cloud hosting environments. Large strategic movements in private data centers seek to develop private cloud services, bringing together social media and social networking through cloud technologies.

But what's needed now is a new connection between the potential of these technologies and the vision of the Internet, the growth of social graph associations, and the wider communities and ecosystems that are emerging in the movement’s wake.

With every new significant disruptive change, there is also the need for a new language to help describe this new world. Open standards and industry forums will help drive this. The old language focuses on the previous potential, and so a new way to visualize, define, and use the new realities can help the big shift toward the potential above the cloud.

This guest post comes courtesy of Mark Skilton of Capgemini Global Applications, and The Open Group.


EMC Greenplum releases Community Edition of MPP database product, big data analysis gets cheaper still

EMC recently introduced a free Community Edition of the EMC Greenplum Database, its massively parallel processing (MPP) database, along with free analytic algorithms and data mining tools.

Building on earlier Greenplum “big data” releases, like the EMC Greenplum Data Computing Appliance, the Community Edition lowers the cost barrier to entry for big data power tools for more developers, data scientists, and other data professionals.

The tools help developers better understand data and find new uses for it, as well as derive deeper insights and better visualize them. The release was made at the 2011 O'Reilly Strata Conference by Scott Yara, vice president, EMC Data Computing Products Division. EMC acquired Greenplum last summer. [Disclosure: Greenplum is a sponsor of BriefingsDirect podcasts.]

With the easily accessible Community Edition stack, developers can build complex applications to collect, analyze, and operationalize big data, leveraging best-of-breed big data tools, including the Greenplum Database with its in-database analytic processing capabilities.

“Our new Community Edition provides a parallel-everything 'big data' stack with unequaled speed that enables analysts to perform next-generation data analytics and experiment with real-world data, and most importantly -- innovate,” explained Luke Lonergan, CTO and vice president, EMC Data Computing Products Division and co-founder of Greenplum. “This project is about empowering developers. They can program using the most popular tools and they have a place to contribute open source extensions to the stack.”

The free EMC Greenplum Community Edition includes:
  • Greenplum Database CE, an industry-leading MPP database product for large-scale analytics and next-gen data warehousing.
  • MADlib, an open source analytic algorithms library, providing data-parallel implementations of mathematical, statistical and machine learning methods for structured and unstructured data.
  • Alpine Miner, an intuitive visual data mining modeler that delivers rapid "modeling to scoring" capabilities, leverages in-database analytics, and is purpose-built for "big data" applications.
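The "data-parallel" phrasing in the MADlib bullet describes the standard MPP pattern: each database segment computes a partial result over its local slice of the data, and a master node combines the partials. A toy illustration in plain Python (not Greenplum or MADlib code):

```python
def partial_stats(segment):
    """Each segment computes (row_count, row_sum) over only its local rows."""
    return len(segment), sum(segment)

def combine(partials):
    """The master node merges per-segment partials into a global mean."""
    partials = list(partials)
    total_rows = sum(n for n, _ in partials)
    total_sum = sum(s for _, s in partials)
    return total_sum / total_rows

# Rows of one numeric column, distributed across three segments
segments = [[1.0, 2.0], [3.0], [4.0, 5.0, 6.0]]
mean = combine(partial_stats(seg) for seg in segments)  # 3.5
```

Because only the small (count, sum) partials travel between nodes, the heavy scanning work stays where the data lives, which is what makes in-database, massively parallel analytics fast at scale.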
Community benefits

The initial release of the Community Edition is designed for both first-time users and experienced Greenplum customers. First-time users gain access to a comprehensive, purpose-built business analytics environment where they can view, modify, and enhance the included demo data files, enabling experimentation with “big data” analytical tools within the Greenplum database. Existing users can download an upgraded version of Greenplum Database CE and analytic tools for integration into their development and research environments.

The Community Edition can be downloaded free of charge from http://community.greenplum.com as a pre-configured VMWare virtual appliance for use on laptops and desktops, or as a set of packages for deployment on user machines. All users are free to participate in new Greenplum community forums to get support, collaborate, post ideas, and test enhancements developed by various users independently.

Regular Community Edition updates will be made available online. The Community Edition is intended for experimentation, development and research purposes only. Current single-node edition users can deploy the new Community Edition in their single-node production environments. Greenplum commercial licenses must be purchased prior to using code for internal data processing or for any commercial or production purpose.


Cyber security top of mind for enterprise architects at Open Group conference

SAN DIEGO -- The Open Group 2011 conference opened here yesterday with a focus on cyber security, showing how the risk management aspects of IT, architecture, and business stand as a high priority and global imperative for enterprises.

It's hard to plan any strategy for business and the IT forces that drive it, if the continuity of those services is suspect. Social media and the accelerating uses of mobile devices and networks are only adding more questions to the daunting issues around privacy and access. And, the Wikileaks affair has clearly illustrated how high the stakes can be. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

Three cyber security thought leaders plunged into the issues for the attendees: Bruce McConnell, Cybersecurity Counselor, National Protection and Programs Directorate (NPPD), US Department of Homeland Security; James Stikeleather, Chief Innovation Officer, Dell; and Ben Calloni, Lockheed Martin Fellow for Software Security, Lockheed Martin Corp. Each speaker shared his thoughts on the current state of cyber security and where they see the industry heading in the future. Top of mind: The importance of trust, frameworks, and their impact on the security of critical infrastructure systems.

Following a brief introduction from Allen Brown, President and CEO of The Open Group, McConnell set the stage by discussing the current state of the security ecosystem.

Computing systems today often consist of numerous security hardware and software implementations working completely independently of each other. An improved security ecosystem would not only improve computing performance, but would also create an environment where interoperability would usher in governance and completeness. Facilitating information sharing between security systems would improve overall security by enabling systems to react in a more efficient manner when addressing security threats, he said.

The Department of Homeland Security (DHS) protects the federal executive branch, and works with critical infrastructure (gas, oil, electricity, telecom, etc.) to help them better protect themselves. DHS is currently working on a cyber security awareness campaign.

Stop, Think, Connect

Last year, DHS launched the “Stop, think, connect” campaign, which is directed at teens, young adults and parents of teens. With increased awareness, DHS believes that the threat of cyber security attacks will be lessened. For more information on the campaign, please go to http://www.dhs.gov/files/events/stop-think-connect.shtm.

McConnell mentioned that President Obama spoke on the importance of private sector innovation earlier yesterday. He also stated that cyberspace is a new domain that is vital to our way of life, and therefore needs to be made more secure. Of course, government must play an important role in this process, but since cyber security is a civilian space, no one actor can secure it alone.

Given the global market of cyberspace, McConnell argued that the U.S. should continue to lead the security effort working together with consumers to achieve security. He then went on to suggest that an open, broad interoperability regime online would be able to validate attributes for online systems, but also emphasized that anonymity must be preserved.

McConnell concluded his keynote by speaking about a future white paper on the health of the cyber ecosystem, which will be based on the premise of a more secure cyberspace where participants can work together in real-time to defend against attacks. This cyber ecosystem would require automation, authentication, and interoperability, enabling participating devices at any edge of a network to communicate with each other under policy established by the system owner. The ultimate purpose of the white paper is to encourage discussion and participation in a more secure ecosystem.

Dell innovation guru Stikeleather continued the plenary by emphasizing the need for a “Law of the Commons.” Like every other function in IT, security, too, needs to be clearly defined in order to move forward, he said. Clear definitions will enable the transparency and the common understanding needed for organizations and governments to communicate and discuss what goals the cyber community should strive to attain. This would not only lead to increased security, but it would also lead to improved trust, when addressing the growing concern of consumer privacy.

Co-evolution

The consequence of the Web’s evolution is actually a co-evolution, he said, in which people depend more on technology and we are restructuring how we see data (augmented reality), while technology is becoming contextual: dependent on who is making the request, how and when they are making it, and what their intentions are in making it.

In such a fluid environment trust is essential, but can there realistically be trust? We have created an untrustworthy environment, Stikeleather said, and the tipping point will be smart phones in the enterprise. This technology, in particular, is creating greater cracks in a complex environment that is destined to ultimately fail.

Additionally, government and enterprise can’t agree on what the world should look like from a security perspective, due to differing cultural concepts in cyberspace, creating the need for a "Law of the Commons." We’ve created rules for shared international usage of the world’s oceans and for outer space, and cyberspace should be no different.

At the end of the day, everything is an economic survival issue, Stikeleather said. The real value of the Web has been network effects. If we were to lose trust in privacy and security, we'd lose the currency of that global network exchange and the associated economic model, which in turn could actually mean the collapse of the global economy, he said. A catastrophic event is likely to happen, he predicted. What will the world without trust look like? A feudal cyber world with white lists, locked clients, fixed communication routes, locked and bound desktops, limited transactions, pre-established trading partners, information hoarders, towers of Babel.

Underlying structure

We have a unique opportunity with cloud, Stikeleather said, to get it right early and put thought into what the underlying structure of cloud needs to look like, and how to conduct the contextual nature of evolving technology. Meantime, people should own the right to their own identity and control their information, and we need to secure data by protecting it within content.

There were a lot of car analogies during the plenary, whether intentional or not, and my favorite of the day came from Calloni of Lockheed Martin: “security needs to be built-in, not bolt-on.” I’ve thought of this analogy many times before when discussing IT, especially in regard to enterprise architecture.

Calloni said that given human nature’s tendency to use technology to engineer ways to make our lives easier, better, and more functional, we increase risk by increasing exposure. Drawing a comparison to the Ford Pinto, he stated that if organizations could focus purely on security, their probability of success would increase exponentially. However, as we add functionality, focus becomes more distributed and security decreases as the attack surface grows.

He outlined key questions that each organization should ask when determining security:
  • Who has access?
  • What are the criteria for gaining access/clearance?
  • Who has controls?
  • What function is most important? Is being balanced key?
  • What type of security do you need?
Security is expensive, so reducing an organization’s attack surface is critical when establishing a security policy. To build a security policy that will protect your organization, Calloni argued, you must be able to look at which areas or parts of your system or network are available for an assailant to compromise. Five key areas must be examined:
  • Vulnerability -- to have it, an attacker must be able to access it
  • Threats -- any potential hazard of harm to the data, systems, or environment through the leveraging of a vulnerability; an individual or agent taking advantage of a vulnerability
  • Risk -- the probability of the threats using the vulnerabilities; higher risks come with more vulnerabilities and increased threats
  • Exposure -- the damage done through a threat taking advantage of a vulnerability
  • Countermeasures -- processes and standards that are used to combat and mitigate the risks
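The relationships among these five areas, where more vulnerabilities and threats raise risk and countermeasures reduce it, can be captured in a deliberately simplistic scoring sketch (illustrative only; real risk assessment frameworks weigh likelihood and impact far more carefully):

```python
def risk_score(vulnerabilities, threats, countermeasures):
    """Toy model: risk rises with exposed vulnerabilities and active threats;
    each countermeasure is assumed to mitigate one vulnerability's exposure."""
    exposed = max(vulnerabilities - countermeasures, 0)
    return exposed * threats

# More countermeasures shrink the attack surface, and with it the score
high = risk_score(vulnerabilities=5, threats=3, countermeasures=2)  # 9
low = risk_score(vulnerabilities=5, threats=3, countermeasures=4)   # 3
```

Even this toy makes the economics visible: since risk scales with the exposed surface, shrinking the attack surface is usually cheaper than countering every threat individually.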
Like a car's drivetrain, security needs to be built-in, not bolted-on. Security frameworks need to provide a solid foundation on which organizations can build in order to address ever-changing cyber threats. Bolt-ons will only provide temporary band-aids that leave your organization vulnerable to cyber threats, he emphasized.

As organizations move toward the cloud and cyber threats become more commonplace, it will be interesting to see what importance organizations place on the themes discussed yesterday. They definitely apply to the remaining conference tracks. I’m especially looking forward to seeing how the enterprise architecture and cloud speakers address these topics.

If you want a real-time view of the 2011 San Diego Conference, please search for the Twitter hashtag #ogsdg.


Monday, February 7, 2011

Take The Open Group survey to measure the true enterprise impact of cloud computing

This guest post comes from Dave Lounsbury, Chief Technical Officer at The Open Group.

By Dave Lounsbury

Everyone in the IT industry knows by now that cloud computing is exploding. Gartner said cloud computing was its number-one area of inquiry in 2010, and hype for the popular computing movement peaked last summer according to its 2010 Hype Cycle for Cloud Computing.

Regardless of what the media says, cloud is now a real option for delivery of IT services to business, and organizations of all sizes need to determine how they can generate real business benefit from using cloud. Industry discussion about its benefits still tends to focus more on IT advantages, such as capacity and utilization, than on business impacts such as competitive differentiation and profit margin. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

The Open Group’s Cloud Work Group has created a series of White Papers to help clarify the business impacts of using cloud and, as a next step, The Open Group at our San Diego Conference today launched a public survey that will examine the measurable business drivers and ROI to be gained from the cloud. We encourage you to spend a few minutes completing the online survey.

We’re specifically looking for input from end-user organizations about their business requirements, outcomes, and initial experience measuring ROI around their cloud projects. The survey builds on the work already done by the Cloud Work Group, and its results will help guide the group's future work on the financial and business impact of cloud computing.

The survey will be open until Monday, March 7, after which we’ll publish the findings. Please help us spread the word by sharing the link -- http://svy.mk/ogcloud -- to our survey with others in your network who are either direct buyers of cloud services or have influence over their organization’s cloud-related investments.

This guest post comes from Dave Lounsbury, Chief Technical Officer at The Open Group.
