Friday, May 1, 2009

rPath offers free management tool for applications aspiring to the cloud

rPath would like to be your applications' path to cloud computing.

The Raleigh, N.C.-based startup, founded by Red Hat refugees Tim Buckley, executive chairman of the board, and Erik Troan, CTO, recently released a free downloadable version of rBuilder for managing application deployment to virtual or cloud-based environments as well as traditional glass houses. [Disclosure: rPath is a sponsor of BriefingsDirect podcasts.]

For IT managers looking at cloud deployment, rPath’s approach is to embrace as many flavors of the cloud as possible to deal with the fact that what is commonly called the cloud is really a bunch of non-standard environments varying from vendor to vendor.

rPath lists support for three clouds: Amazon EC2, Globus Alliance, and Bluelock. rBuilder also supports hypervisors, including VMware ESX, Citrix Xen, and Microsoft Hyper-V.

As a startup with a limited budget for hardware, rPath eats its own cloud dog food. The company uses Amazon EC2 for some of its own applications, as Billy Marshall, chief strategy officer, explained in a Q&A interview with SearchSOA last fall. We also did an interview with Marshall on BriefingsDirect.

The new free version of rBuilder differs from the free rBuilder Online community version in that you can download it and run it behind your own firewall. And it differs from the commercial version in that it is restricted to 20 running system instances in production.

Once a user reaches 21 running instances, they have to “establish a commercial relationship with rPath.”

Also, users of the free version can only get support through the rBuilder Online community.

For shops looking to explore cloud computing, the free version of rBuilder appears to be a viable option. You can check out the system requirements and download instructions at rPathQuickStart.

Rich Seeley provided research and editorial assistance to BriefingsDirect on this blog. He can be reached at Writer4Hire.

Follow me on Twitter at http://twitter.com/Dana_Gardner.

Wednesday, April 29, 2009

PC 'security as a service' gains global cloud footprint with free Panda anti-virus offering

Cloud computing's utility and power in everyday life reached a notable new milestone today with Panda Security's free PC security service.

This delivery and two-way malware detection-access model makes a ton of sense, so much so that I expect we'll soon see the cloud model deliver more than PC security and anti-virus/anti-spam services. The era of remote services for a slew of device support and maintenance -- of everything from cars to cell phones to home appliances -- is upon us.

Essentially anything that uses software and has network access can be supported efficiently and powerfully based on the Panda Security cloud model. Making the service free to home-based users is especially brilliant because it gains the Metcalfe's Law benefits of a valuable community to detect the malware, along with the means to then sell the detection and prevention services to business and professional users. [Disclosure: Panda Security is a sponsor of BriefingsDirect podcasts.]

Here's how it works, from Panda's release:
Consumers can download the free client protection product from http://www.cloudantivirus.com. ... The Panda Cloud Antivirus thin-client agent introduces a new philosophy for on-access asynchronous cloud-scanning. It combines local detection technologies with real-time cloud-scanning to maximize protection while minimizing resource consumption. This optimized model blocks malicious programs as they attempt to execute, while managing less dangerous operations via non-intrusive background scans.

Using Panda's proprietary cloud computing technology, called Collective Intelligence, Panda Cloud Antivirus harnesses the knowledge of Panda's global community of millions of users to automatically identify and classify new malware strains in almost real-time. Each new file received by Collective Intelligence is automatically classified in under six minutes. Collective Intelligence servers automatically receive and classify over 50,000 new samples every day. In addition, Panda's Collective Intelligence system correlates malware information data collected from each PC to continually improve protection for the community of users.
Panda says the model demands far less of a PC's resources: 5% versus 9% for other fat-client AV software approaches. That means older PCs can be protected better, cheaper, and longer. Far fewer people will need to upgrade their PC hardware just to keep it free from viruses. It's about time! Poor security should not be a business model for sellers of new computers and software.
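Conceptually, the on-access, asynchronous cloud-scanning model Panda describes can be sketched in a few lines: check a small local signature cache first, consult the cloud synchronously only for risky operations, and defer everything else to a background scan. Everything here is illustrative -- the function names, the local signature set, and the dict standing in for the Collective Intelligence service are my own stand-ins, not Panda's actual API:

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Identify a file by a stable content hash."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical stand-ins: a small client-side signature cache, and a dict
# playing the role of the remote Collective Intelligence lookup service.
LOCAL_SIGNATURES = {file_hash(b"known-bad-sample")}
CLOUD_VERDICTS = {file_hash(b"new-malware-sample"): "blocked"}

def scan(data: bytes, dangerous: bool) -> str:
    """Block risky executions synchronously; defer low-risk files to a
    non-intrusive background scan, per the model in Panda's release."""
    h = file_hash(data)
    if h in LOCAL_SIGNATURES:        # fast local check first
        return "blocked"
    if dangerous:                    # on-access path: consult the cloud now
        return CLOUD_VERDICTS.get(h, "allowed")
    return "queued"                  # low-risk path: scan asynchronously later
```

The point of the split is that only files about to execute pay the cost of a cloud round trip; everything else is handled lazily, which is where the low resource footprint would come from.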

I'm going to try this service on Windows XP Home running on Parallels on my iMac Leopard. I'll report back on how it works.

As I said, I hope this model succeeds because it really is a harbinger of how cloud-based services can improve and solve thorny problems in a highly efficient manner that combines the power of community with scale and automation. This may also go far in dissuading the creators of malware, because the bad things will be squelched so fast, if a Panda model gets critical mass, that the effort is useless and therefore moot.

Panda Security, a privately held company based in Spain, could well see its services expand to include PC maintenance, remote and automated support, and even more SaaS applications and productivity services. I expect this burgeoning ball of PC services from the cloud ecology to become the real software-plus-services model. It will be very interesting to see which vendors and/or providers or partnerships can assemble the best solutions package first and best.

Incidentally, don't expect Microsoft to do this cloud-based security thing. It can't afford to kill off or alienate the third-party malware security providers by doing it all itself. Those days are long gone. The third parties, however, can now stretch their wings and fly. And they are.

Follow me on Twitter at http://twitter.com/Dana_Gardner.

Tuesday, April 28, 2009

Can software development aspire to the cloud?

This guest post comes courtesy of Tony Baer's OnStrategies blog. Tony is a senior analyst at Ovum. His profile is here. You can reach him here.

As we’re all too aware, the tech field has always been all too susceptible to the fad of the buzzword, which of course gave birth to Gartner’s Hype Cycles. But in essence the tech field is no different from the worlds of fashion or the latest wave in electronic gizmos – there’s always going to be some new gimmick on the block.

But when it comes to cloud, we’re just as guilty as the next would-be prognosticator, as it figured into several of our top predictions for 2009. In a year of batten-down-the-hatches psychology, anything that saves or postpones costs and avoids long-term commitment, while preserving all options (to scale up or ramp down), is going to be quite popular, and under certain scenarios, cloud services support all that.

And so it shouldn’t be surprising that roughly a decade after Salesforce.com re-popularized the concept (remember, today’s cloud is yesterday’s time-sharing), the cloud is beginning to shake up how software developers approach application development. But in studying the extent to which the cloud has impacted software development for our day job at Ovum, we came across some interesting findings that in some cases had their share of surprises.

ALM vendors, like their counterparts on the applications side, are still figuring out how the cloud will impact their business. While there is no shortage of hosted tools addressing different tasks in the software development lifecycle (SDLC), major players such as IBM/Rational have yet to show their cards. In fact, there was a huge gulf in cloud-readiness between IBM and HP, whose former Mercury unit has been offering hosted performance testing capabilities for seven or eight years, and is steadily expanding hosted offerings to much of the rest of its BTO software portfolio.

More surprising was the difficulty of defining what Platform-as-a-Service (PaaS) actually means. There is the popular definition and then the purist one. For instance, cloud service providers such as Salesforce.com employ the term PaaS liberally in promoting their Force.com development platform; in actuality, development for the Force.com platform uses coding tools that run not on Salesforce’s servers but locally on the developer’s own machines. Only once the code is compiled is it migrated to the developer’s Force.com sandbox, where it is tested and staged prior to deployment. For now, the same principle applies to Microsoft Azure.

That throws plenty of ambiguity on the term PaaS – does it refer to development inside the cloud, or development of apps that run in the cloud? The distinction is important, not only to resolve marketplace confusion and realistically manage developer expectations, but also to highlight the reality that apps designed for running inside a SaaS provider’s cloud are going to be architecturally different from those deployed locally. Using the Salesforce definition of PaaS, apps that run in its cloud are designed based on the fact that the Salesforce engine handles all the underlying plumbing. In this case, it also highlights the very design of Salesforce’s Apex programming language, which is essentially a stored procedures variant of Java. It’s a style of development popular from the early days of client/server, where the design pattern of embedding logic inside the database was viewed as a realistic workaround to the bottlenecks of code running from fat clients. Significantly, it runs against common design patterns for highly distributed applications, and of course against the principles of SOA, which is to loosely couple the logic and abstract it from the physical implementation. In plain English, this means that developers of apps to run in the cloud may have to make some very stark architectural choices.

The confusion over PaaS could be viewed as a battle over vendor lock-in. It would be difficult to port an application running in the Salesforce cloud to another cloud provider, or transition it to on premises, because the logic is tightly coupled to Salesforce’s plumbing. This also sets the stage for future differentiation of players like Microsoft, whose Software + Services is supposed to make the transition between cloud and on premises seamless; in actuality, that will prove more difficult unless the applications are written in a strictly loosely coupled, service-oriented manner. But that’s another discussion that applies to all cloud software, not just ALM tools.

But the flipside of this issue is that there are very good reasons why much of what passes for PaaS involves on-premises development. And that in turn provides keen insights as to which SDLC tasks work best in the cloud and which do not.

The main don’ts consist of anything having to do with source code, for two reasons: network latency and IP protection. The first one is obvious: who wants to write a line of code and wait until it gets registered into the system, only to find out that the server or network connection went down and you have to retype your code all over again? Imagine how aggravating that would be with highly complex logic; obviously no developer, sane or otherwise, would have such patience. And ditto for code check-in/check-out, or for running the usual array of static checks and debugs. Developers have enough things to worry about without having to wait for the network to respond.

Of more concern, however, is the issue of IP protection: while your program is in source code and not yet compiled or obfuscated, anybody can get to it. The code is naked; it’s in a language that any determined hacker can intercept. Now consider that unless you’re automating a lowly task like queuing up a line of messages or printers, your source code is business logic that represents in software how your company does business. Would any developer wishing to remain on the payroll the following week dare place code in an online repository that, no matter how rigorous the access control, could be blown away by determined hackers for whatever nefarious purpose?

If you keep your logic innocuous or sufficiently generic (such as using hosted services like Zoho or Bungee Connect), developing online may be fine (we’ll probably get hate mail on that). Otherwise, it shouldn’t be surprising that no ALM vendor has yet placed, or is likely to place, code-heavy IDEs or source code control systems online. OK, Mozilla has opened the Bespin project, but just because you could write code online doesn’t mean you should.

Conversely, anything that is resource-intensive, like performance testing, does well with the cloud because, unless you’re a software vendor, you don’t produce major software releases constantly. You need lots of resources occasionally to load- and performance-test those apps (whose code is compiled by that point anyway). That’s a great use of the cloud, as HP’s Mercury has been doing since around 2001.

Similarly, anything having to do with the social or collaboration aspects of software development lends itself well to the cloud. Project management, scheduling, task lists, requirements, and defect management all suit the cloud well, as these are at their core group functions, where communication is essential to keeping projects in sync and all members of the team – wherever they are located – literally on the same page. Of course, there is a huge caveat here: if your company designs embedded software that goes into products, it is not a good candidate for the cloud. Imagine someone getting hold of Apple’s project plans for the next version of the iPhone.

This guest post comes courtesy of Tony Baer's OnStrategies blog. Tony is a senior analyst at Ovum. His profile is here. You can reach him here.

Wednesday, April 22, 2009

Progress gives CEP a performance boost with multi-core support on Apama

Progress Software this week announced the release of an enhanced Parallel Correlator for its Apama Complex Event Processing (CEP) platform so it can take advantage of multi-core, multi-processor hardware.

Progress claims a seven-fold increase in CEP performance on an eight-core server in the company’s internal benchmark testing of this version of the Parallel Correlator in what the company described as “real-world customer scenarios.” [Disclosure: Progress is a sponsor of BriefingsDirect podcasts.]
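As a quick sanity check on that claim: by Amdahl's law, a seven-fold speedup on eight cores implies the correlator's workload is roughly 98 percent parallelizable. Here is the back-of-envelope arithmetic (my own, not Progress's benchmark methodology):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Speedup predicted by Amdahl's law for a workload whose
    `parallel_fraction` can be spread across `cores` cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

def required_parallel_fraction(speedup: float, cores: int) -> float:
    """Invert Amdahl's law: the parallel fraction needed to hit a
    target speedup on a given number of cores."""
    # Solve speedup = 1 / ((1 - p) + p / cores) for p.
    return (1.0 - 1.0 / speedup) / (1.0 - 1.0 / cores)

p = required_parallel_fraction(7.0, 8)  # 48/49, about 0.98
```

In other words, a 7x result on eight cores is only possible if almost none of the event-processing path is serialized, which is what makes the claim notable.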

John Bates, who founded Apama in 1999 to build out technology based on his research at Cambridge University in the UK, believes CEP is an easier technology sell to business users than service oriented architecture (SOA) because a clear case can be made for the ability of a product like Apama to execute high-speed transactions based on the identification of millisecond movements in the business environment. It can also provide business managers and executives with split-second snapshots of how they are doing in their markets.

I guess we can think of CEP as SOA for high-octane business intelligence (BI), for transactional and real-time insights and inferences from tremendously complex and often massive streams of services (and more). Most traditional BI comes from read-only agglomerations of fairly static SQL data, some of which needs a lot of handholding before it gives up its analytics gems.

Incidentally, I also spoke this week with Cory Isaacson at CodeFutures, busy at the MySQL show, who has a lot to say these days about database sharding and how applying it to OSS databases like MySQL gets, among other things, more BI love from transactional read/write SQL data. More here.
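For background on that technique: hash-based sharding, in its simplest form, routes each row to one of several independent databases by hashing its key, so both writes and reads scale out across shards. A toy illustration, with in-memory dicts standing in for the backing databases (my own sketch, not CodeFutures' product):

```python
import hashlib

class ShardedTable:
    """Toy hash-based sharding: route each key to one of N backing stores."""

    def __init__(self, num_shards: int):
        # Real shards would be separate MySQL instances; dicts stand in here.
        self.shards = [dict() for _ in range(num_shards)]

    def _shard_for(self, key: str) -> dict:
        # A stable hash ensures the same key always lands on the same shard.
        digest = hashlib.md5(key.encode("utf-8")).hexdigest()
        return self.shards[int(digest, 16) % len(self.shards)]

    def put(self, key: str, value) -> None:
        self._shard_for(key)[key] = value

    def get(self, key: str):
        return self._shard_for(key).get(key)
```

Because each shard holds only a slice of the data, transactional read/write load is spread out, which is the property that makes room for the analytics workloads mentioned above.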

Back to CEP ... It is being newly perceived by some as much more tangible to business users than the more nerdy benefits of SOA, such as reuse of services for more agile programming of new applications. Talk about the benefits of CEP and business users' eyes light up. Talk about the benefits of SOA, or even business process management (BPM), and their eyes can glaze over.

I should point out that my buddy at Active Endpoints, Alex Neihaus (another disclosure on their sponsorship of BriefingsDirect podcasts), would argue that CEP and SOA are the real somnolence inducers, and that BPM and visual orchestration form the far better point on the business-value arrow around service swarms. Talk among yourselves ...

In making the latest Apama announcement, Progress touts an IDC report on CEP (excerpts) that included evaluation of the 2008 version of the Apama platform. IDC gave the Progress product high ratings in the categories of “Low Latency,” the speed of event processing; “Business User Control,” how it works for the business people; and “Deterministic Behavior,” the predictability and repeatability of the event processing programs.

Lo, although it is not mentioned in the Progress announcement, Apama did not get such high scores in the two other IDC categories, “Data Management” and “Complex Event Detection.”

IDC does non-metaphysical squares, rather than Magic Quadrants, we should gather.

In the real world, the major market for CEP appears to be the beleaguered financial services industry and the government watchdog agencies overseeing it. This appears reflected in the Apama customers listed in this week’s announcement, including JP Morgan, Deutsche Bank, and the FSA (Financial Services Authority) of the UK.

Written in the midst of this recession, the IDC report worries: "Because Apama is so closely identified with the financial markets, the current downturn is likely to negatively impact Apama's opportunity and growth prospects in the near term. Therefore, it is incumbent that Progress figure out how to cost effectively apply the technology to new markets with better short-term growth prospects."

At the beginning of last fall’s financial system meltdown, Bates told a reporter that there may be a silver lining for CEP even in the midst of a banking crisis. He foresees potential for greater use of CEP by both government regulators as well as the financial institutions that need to supply more and more detailed data to show how they are complying with new regulations now being formulated, as well as old regulations now being more rigorously enforced.

Too bad they can't apply it to card counting or my wife's algorithmic-rich shuffling of copious coupons for generating a simple groceries list. Just start with the old one, I keep telling her.

Other industries that both Progress and IDC agree might provide new markets for CEP include transportation and inventory control systems based on RFID, and ERP systems for manufacturing. I continue to be intrigued, too, by mobile commerce (Google Voice, anyone?), laced with location services and other variables like weather.

CEP is going to advance the competitive capabilities of a lot of companies. What's less clear is how they will manage that along with their BI, SOA, cloud, and other must-do-somedays on the IT groceries list.

Rich Seeley provided research and editorial assistance to BriefingsDirect on this blog. He can be reached at Writer4Hire.

Monday, April 20, 2009

TIBCO CEO worries about Oracle-Sun deal's impact on IT industry

BriefingsDirect contributor Rich Seeley interviewed Vivek Ranadive, CEO of TIBCO Software, on the day the Oracle-Sun proposed deal was announced. Here's his report.

Will there be confusion and even fear in the Java community? Can Microsoft take advantage of that? Will there be disruption in the hardware server business that works to the advantage of Cisco? Vivek Ranadive, CEO of TIBCO Software, sees a lot of question marks around Oracle’s proposed acquisition of Sun Microsystems.

“I’m sure there’s nervousness in the Java community,” Ranadive said in an exclusive interview with BriefingsDirect. “Can they trust [Oracle Chairman and CEO] Larry Ellison? What’s he going to do with this control? Is he going to manipulate Java so he gets an advantage? Is he going to make it less open? Is he going to find ways to start charging customers for it? There are a lot of question marks.”

[Disclosure: TIBCO is a sponsor of BriefingsDirect podcasts.]

Sun, as a hardware company, was committed to open source and Java, in the TIBCO CEO’s view, whereas Oracle is a software company “that has been notorious at exacting money from customers.”

In Ranadive's view Microsoft is a beneficiary of any fear, uncertainty and doubt about the future of Java and open source.

“It helps Microsoft,” Ranadive said. “If you’re a customer and you’re wondering about Java, you might just say the heck with it, I’ll go with Microsoft.”

Microsoft also has a cloud computing initiative while Oracle has been reticent, he noted.

“Larry Ellison has been on record as saying he doesn’t believe in the cloud,” Ranadive said, “whereas Microsoft jumped on the virtualization bandwagon and is going to head up the parade on that.”

While TIBCO as a middleware vendor maintains “Swiss-like neutrality” between the Java and .NET worlds, Ranadive said he has been impressed with the cloud technology coming out of Redmond.

Noting that in the midst of the recession TIBCO continues to report record earnings, Ranadive said that in an IT market where Java and .NET co-exist in many shops, his company is positioned as the “trusted arbiter in the middle.”

Using the example of the suitor Sun spurned, he notes that TIBCO competes with IBM, works with IBM, and runs on IBM servers.

Ranadive said he is not concerned about what Oracle will do with the Sun hardware servers. But he noted that the “plot thickens” for the other server vendors including IBM, HP, Dell, and now Cisco.

“We don’t know what is happening with the server business,” he said. “Is Oracle going to keep it? Are they going to shut it down? Is Oracle going into the hardware business?”

But he sees Cisco possibly benefiting from any disruption in the hardware market.

“We have customers that are looking at Sun, and they may look at this and say maybe we’ll go with Cisco,” he speculated.

Ranadive also lists his own company as a beneficiary from whatever disruption arises from Oracle’s latest acquisition.

“Our customers are already rebelling against putting more eggs in the Oracle basket,” he said. “As the Swiss-neutral party that integrates everything with everything, it helps us a great deal.”

Noting that he got his own start in the business with a workstation borrowed from Sun, Ranadive did say he was sorry for the loss of what Sun once represented as the home of the innovators who created Java. But he doesn’t believe Silicon Valley has lost its innovative spirit.

“The innovators will show up at other companies, including ours,” he said. “Innovation is here to stay in the Valley.”

On the lighter side, the TIBCO CEO speculated that Ellison might acquire Best Buy next.

“He seems to like commodity products that can be accretive,” Ranadive quipped. “So maybe Best Buy will be his next purchase. He could even start selling servers in his shop.”

In a more serious analogy, Ranadive noted that the accretive model has been used before in the software business, most notably by Computer Associates, which steadily acquired mainframe software companies in the 1980s and the 1990s.

“Oracle has become the CA of the present era,” he said.

Rich Seeley provides research and editorial assistance to BriefingsDirect. He can be reached at Writer4Hire.

Hooray! Oracle acquisition of Sun makes perfect sense

The reported acquisition of Sun Microsystems by Oracle today makes a ton more sense than IBM's earlier failed bid. This new compact, if it succeeds, will bring as good an end to an independent Sun as the pioneering (yet long flagging) IT vendor could have hoped for at this sorry stage in its history.

But there are much larger implications in Oracle's latest super-grab than Sun's demise and assimilation. Among them is the fact that IBM now -- for the first time, really -- has a true, full and global counterweight to its role and influence. Oracle plus Sun, aligned with Hewlett-Packard (which I fully expect), meets and begins to beat IBM at all the important full-service IT games.

This is truly healthy for IT and the global IT marketplace. IBM's earlier purported bid for Sun always smelled bad to me. It was, it turned out, mostly a red herring. Perhaps Oracle needed the IBM roller coaster ride to focus its intentions. Nonetheless, the outcome is optimal. It bodes well for cloud computing, too, as Oracle just about overnight becomes a cloud force to reckon with. I always thought Larry Ellison was just biding his time on this one. The recession has hastened the timetable.

Other than IBM's unassailed hegemony, the other losers in this are Microsoft (actually possibly creeping toward irrelevancy faster than anyone could have imagined three years ago), SAP, and Cisco Systems. Amazon may also get more competition soon on the platform-as-a-service front. Using Sun's cloud investments, implementations and plans, Ellison can also quickly forge his own counterweight to Salesforce.com. No need to buy it now (for a while).

Open source in general, too, may take a hit, as I don't expect Unbreakable Linux to remain Oracle's point on the operating system arrow. Solaris will be the prime Oracle OS for performance, meaning Oracle's channel pipeline to Red Hat will shrink. And MySQL will be a means and not an end for Oracle, which would, of course, prefer an Oracle 11g cloud instead.

Suffice it to say that whatever momentum Sun had behind open source everywhere will be muted to open source sometimes, as a ramp to other Oracle stuff, or to grow the community and keep developers happy.

Like IBM, Oracle will have little interest or need for open source middleware or service oriented architecture (SOA) components. Further, given Oracle's early and deep interest in Eclipse and OSGi, the Java tools will stay free and open (with a lot of Oracle wizards embedded across the database and other middleware). The tussle for influence between Oracle and IBM in Eclipse and the Java Community Process (JCP) will be great fun to watch in coming years. Again, this is healthy. (Good thing Sun opened this up, eh?)

No other company has shown an ability to merge and integrate at the massive scale and complexity that Oracle has. Its acquisition spree that began five years ago is unprecedented in its scope and level of success. We have no reason to suspect that the way it handles Sun will be any different.

Winners on the deal include Java itself in the fullest and broadest sense. Oracle and IBM are the premier Java vendors, and the might of IBM (and its customers and developers) in the market will force Oracle to keep Java open and vibrant, while Oracle's penchant for control and commercial success will keep Java safe and singular. I expect the old BEA WebLogic implementations now at Oracle to gather some minor bundles from Sun's software portfolio, but Sun's enterprise software stack (for all intents and purposes) is history. I can't see GlassFish or NetBeans going anywhere but bye-bye. Same with the Sun SOA stuff.

Most interesting will be the way that Oracle matches the Sun assets against HP's burgeoning partnership with Oracle. Will HP perhaps buy Sun's hardware, storage and integrated circuits intellectual property outright after the Oracle acquisition is final? I'd bet on it. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

The Exadata announcement last fall is a good example of what to expect. Business intelligence is the killer enterprise application of the day (era), and Oracle and HP aim to win. Coupling Oracle BI and business applications is something special ... potentially better than what IBM and SAP can do. Should we expect from this Oracle-Sun merger some more love between IBM and SAP? Oh, yeah!

We should expect to see a major go-to-market push by HP and Oracle, with all kinds of appliances and solutions portfolios. Both Oracle's and HP's love of virtualization allows all kinds of neat packaging. Expect some of the industry's premier on-premises cloud solutions ASAP.

Indeed, we now have a land-grab race for the modernized data center/private cloud between Oracle/Sun/HP and IBM. What's more, HP, with all the old DEC stuff plus Sun's Unix, may keep Unix alive and well while keeping IBM, with its everything-mainframe lust, at bay.

On the blue-sky front, consider what happens if Apple and Google get closer to the Oracle-Sun-HP trifecta. Wow. Cloud city.

Larry Ellison correctly predicted a few years ago that only a few IT companies would remain. Maybe we should just remove the "IT" and say that only a few companies will remain -- and Oracle will be one of them.

Talk about pure irony ... It was when Oracle turned its back on Sun four years ago, with the Unbreakable Linux and Java process business moves (Eclipse over NetBeans, OSGi support, etc.), that Sun's nosedive deepened. In a sense, you could say that Oracle pushed Sun off a cliff in slow motion, only to catch the pieces at fire-sale prices.

Friday, April 17, 2009

HP teams with Microsoft, VMware to expand appeal of desktop virtualization solutions

As the sour economy pushes more companies into the arms of virtual desktop infrastructure (VDI) for cost cutting, the vendor community is eagerly wiping out obstacles to adoption by broadening the appeal of desktops as a service for more users, more types of applications and media.

This became very clear this week with a flurry of announcements that package the various parts of VDI into bundled solutions, focus on the need to make rich applications and media perform well, and expand the types of endpoints that can be on the receiving end of VDI.

Hewlett-Packard (HP) expanded its thin-client portfolio with new offerings designed to extend virtualization across the enterprise, while providing a more secure and reliable user experience. The solutions bundle software from Microsoft and VMware along with HP's own acceleration and performance software, as well as three thin client hardware options.

I can hardly wait for HP to combine the hardware and software on the server side, too. I have no knowledge that HP is working up VDI appliances that could join the hardware configurations on the client side. But it sure makes a lot of sense.

Seriously, there are few companies in a better position to bring VDI to the globe, given what technologies HP gained with Mercury and Opsware, along with internal development ... Oh, and there's EDS to make VDI hosting a service in itself. Look for a grand push from HP into this enterprise productivity solutions area.

Leading the pack in this latest round of VDI enhancements are the three thin clients -- the HP gt7720 Performance Series, and the HP t5730w and t5630w Flexible Series. These offer new rich multimedia deployment and management functionality -- rich Internet applications (RIA), Flash, and streaming media support -- that enhance the Microsoft Windows Embedded Standard. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

The Palo Alto, Calif. company also announced several other new features:
The thin clients feature Microsoft Internet Explorer 7, Windows Media Player 11 and the ability to run applications locally. They also include Microsoft Remote Desktop Protocol 6.1, which enables devices to connect and take advantage of the latest security and enterprise management technologies from Windows Server 2008.

RDP enhancements for multimedia and USB redirection enable users to easily run web applications, videos and other files within a virtual desktop environment, while avoiding frame skipping and audio or video synchronization issues. The software offloads the processing directly to the thin client, creating an enhanced multimedia experience while lowering the load on the server, which results in increased server scalability.

This also creates a near-desktop experience for VMware View environments, including support for the latest VMware View Manager 3 broker with no need for additional employee training. Users simply log in on the thin client to take advantage of its multimedia features, such as training videos, and USB device support.

HP and VMware also are working together to enable VMware View Manager’s universal access feature to leverage RGS for remote desktop sessions.

RGS is designed for customers requiring secure, high-performance, collaborative remote desktop access to advanced multimedia streaming and workstation-class applications. The software includes expanded, real-time collaboration features to allow multiple workers from remote locations to see and share content-rich visualizations, including 2-D design, 3-D solid modeling, rendering, simulation, full-motion video, heavy flash animation and intense Web 2.0 pages.

Not surprisingly, a lot of the technology being used in these VDI bundles originated with secure CAD/CAM virtual workstation implementations, where graphics and speed are essential. If it works for developers in high-security areas, it should work for bringing ERP apps and help desk apps to the masses of workers who don't need a full PC on every desktop. They just need an interactive window into the apps and data.

Expected to be available in early May, the new thin clients will be priced from $499 to $799. More information is available through HP or authorized resellers or from http://www.hp.com/go/virtualization. I would expect that EDS is going to have some packages that drive the total cost down even more.

Research and editorial assistance by Carlton Vogt.

Tuesday, April 14, 2009

CollabNet rebrands ALM product to better support distributed development and cloud applications

As companies are being drawn -- or nudged -- into cloud computing, tools are emerging to make distributed services lifecycles more secure and efficient. The latest entry into the field is CollabNet's newly rebranded TeamForge 5.2, which greases the skids for Internet-based software development and deployment.

Formerly known as SourceForge Enterprise, the Brisbane, Calif. company's flagship application lifecycle management (ALM) product now helps developers define and modify profiles and software stacks and provision these profiles on both physical and virtualized build-and-test servers, including from public or private clouds.

Users can access servers from CollabNet's OnDemand Cloud, Amazon's EC2, as well as their own private cloud implementation. TeamForge 5.2 also includes Hudson's continuous integration capability, allowing Hudson users to provision and access build-and-test servers from any of these clouds.

CollabNet also announced Tuesday a relationship with VMware to help deliver an integrated development environment, so independent software vendors (ISVs) and developers can use TeamForge and VMware Studio to create applications for deployment in internal and external clouds.

CollabNet said it renamed its product to reflect the company's support of modern software development, and elevated Subversion management, across widely distributed project teams. We can expect that the cloud shift will move development to multiple cloud development and deployment environments, and so require heightened management and security capabilities. As platform as a service (PaaS) gains traction, complexity could well skyrocket.

Just as complexity in traditional development projects has benefited from Subversion and ALM, so too will the cloud-impacted aspects of development and deployment. I wonder when the business process management (BPM) functions and ALM functions will intersect, and perhaps integrate. Oh, and how about a feedback loop or two to services governance and a refined requirements-update workflow stream. Now that's a lifecycle.

TeamForge 5.2 also provides role-based access control for distributed teams via increased management visibility, governance, and control of mission-critical software in Subversion repositories. The new release provides granular, path-based permissions for flexible Subversion access control, said CollabNet.
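Granular, path-based permissions of this kind are what Subversion's standard authorization (authz) file format expresses. The sketch below is a generic illustration of that format, not TeamForge's own configuration; the repository, group, and user names are invented for the example.

```ini
# Hypothetical Subversion authz file showing path-based access control.
# Repository, group, and user names below are illustrative only.

[groups]
core-devs = alice, bob
contractors = carol

# Core developers get read-write access to the trunk;
# contractors may read but not commit.
[myproject:/trunk]
@core-devs = rw
@contractors = r

# Only core developers may touch the release branch;
# everyone else is denied entirely.
[myproject:/branches/release-1.0]
@core-devs = rw
* =
```

Rules match on repository path, so access can be tightened branch by branch without splitting the repository -- which is the kind of control a management layer like TeamForge surfaces for distributed teams.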

Lastly, the Agile software development method gets a nod with the integration of the Hudson continuous integration engine through a CollabNet plug-in. TeamForge also supports a wide variety of development methods, environments, and technologies.

TeamForge 5.2 is available for download as a free trial at http://www.collab.net/downloadctf.

Research and editorial assistance by Carlton Vogt.