Thursday, September 23, 2010

Sonoa becomes Apigee, offers new and rebranded API management and analysis product lines

Sonoa Systems, a provider of application programming interface (API) solutions, has changed its name this week to Apigee.

While Sonoa originally offered a free API tools and management platform, Apigee now offers three product lines for enterprises, developers, and API providers of all sizes. The company now serves more than 7,000 developers and some 140 enterprises with API management services. [Disclosure: Sonoa Systems is a past sponsor of BriefingsDirect podcasts.]

“By unifying the company under one brand and launching our premium line, we can better serve the full spectrum of companies and developers using APIs to power their apps, mobile and multichannel strategies and business partnerships,” said Chet Kapoor, CEO, Apigee.

Traffic on the platform has been brisk: Apigee technology currently processes some 2,500 GB of data per month and 25,000 messages per second, the firm says.

As I heard more about the role of APIs, and about managing and defining that traffic and its use patterns -- both incoming and outgoing -- I was reminded, too, of the Big Data analysis value so many companies are building out.

What if you could analyze real-time data alongside real-time API activity? This may not be for everyone, but many mobile, e-commerce and service providers -- and a boatload of web-focused start-ups -- could develop some super insights.

Joining the analysis from APIs, system logs, and data could be a killer business intelligence benefit. It might also spur new revenue from selling that analysis if you happen to find yourself at the juncture of APIs, data, and either business or consumer behavior. Viva real-time analytics at scale!
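What that joined view might look like in miniature: a hypothetical stream of API-call events keyed by developer key, joined against a business-side segment table, with a rolling per-minute count kept per endpoint and segment. All field names and the window size here are illustrative assumptions, not Apigee's actual schema.

```python
from collections import defaultdict, deque
import time

WINDOW_SECS = 60  # rolling window size; an assumption for illustration

# Hypothetical business-side lookup: developer API key -> customer segment
segments = {"key-123": "e-commerce", "key-456": "mobile"}

events = defaultdict(deque)  # (endpoint, segment) -> recent event timestamps

def record(api_key, endpoint, ts=None):
    """Join one API event with business data and track it in a rolling window."""
    ts = ts if ts is not None else time.time()
    seg = segments.get(api_key, "unknown")
    q = events[(endpoint, seg)]
    q.append(ts)
    # Evict events that fell out of the window
    while q and q[0] < ts - WINDOW_SECS:
        q.popleft()
    return len(q)  # calls in the last minute for this endpoint + segment

record("key-123", "/orders", ts=100.0)
record("key-123", "/orders", ts=130.0)
print(record("key-123", "/orders", ts=150.0))  # 3 -- all three calls fall inside the 60s window
```

From counts like these it is a short hop to the kind of segment-level, real-time dashboards the juncture of APIs and business data makes possible.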

Among the new and rebranded Apigee products:
  • Apigee Premium: Announced on Wednesday, Apigee Premium provides advanced features on top of the Apigee Free platform, including unlimited API traffic, advanced rate limiting and analytics, and developer key provisioning. Visit https://app.apigee.com/sign_up to sign up for the preview.

  • Apigee Free: A free tools platform launched last year for developers and providers to learn, test, and debug APIs, get analytics on API performance and usage, and apply basic rate-limits to protect their services.

  • Apigee Enterprise: An industrial-grade API platform for enterprises using APIs to fuel their mobile, multichannel, application and cloud strategies. Previously Sonoa Systems’ core product ServiceNet, Apigee Enterprise provides API visibility, control, management and security.
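The rate limiting both the Free and Premium tiers advertise is classically implemented as a token bucket, which permits short bursts while enforcing a steady average rate. A minimal sketch of the idea (not Apigee's actual implementation):

```python
import time

class TokenBucket:
    """Allow up to `rate` requests per second, with bursts up to `capacity`."""
    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at bucket capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # caller would typically return HTTP 429 here

bucket = TokenBucket(rate=5, capacity=10)
results = [bucket.allow() for _ in range(12)]
print(results.count(True))  # 10 on a typical run: the burst capacity, before refill matters
```

An "advanced" tier would layer per-developer-key buckets, quotas, and analytics on top of this basic gate.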
You may also be interested in:

Wednesday, September 22, 2010

Data center transformation requires more than new systems, there's also secure data removal, recycling, server disposal

Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Read a full transcript or download a copy. Sponsor: HP.

An often-overlooked aspect of data center transformation (DCT) is what to do with the older assets as newer systems come online. Much of the retiring IT equipment can hold sensitive data, may be a source of significant economic return, or at the least needs to be recycled according to various regulations.

Improperly disposing of data and other IT assets can cause embarrassing security breaches, increase costs, and pose the risk of regulatory penalties. Indeed, many IT organizations are largely unaware of the hazards and risks of selling older systems into auction sites, secondary markets or via untested suppliers.

Compliance and recycling issues, as well as data security concerns and proper software disposition, should therefore be top of mind early in the DCT process, not an afterthought.

In a recent podcast discussion, I tapped two HP executives on how best to manage productive transitions of data center assets -- from security and environmental impact, to recycling and resale, and even to rental of transitional systems during a managed upgrade process. I spoke with Helen Tang, Worldwide Data Center Transformation Lead for HP Enterprise Business, and Jim O'Grady, Director of Global Life Cycle Asset Management Services with HP Financial Services.

Here are some excerpts:
Helen Tang: Today there are the new things coming about that everybody is really excited about, such as virtualization, and private cloud. ... This time around, enterprises don’t want to repeat past mistakes, in terms of buying just piles of stuff that are disconnected. Instead, they want a bigger strategy that is able to modernize their assets and tie into a strategic growth enablement asset for the entire business.

Yet throughout the entire DCT process, there's a lot to think about when you look at existing hardware and software assets that are probably aged, and won’t really meet today’s demands for supporting modern applications.

How to dispose of those assets? Most people don’t really think about it nor understand all of the risks involved. ... Even experienced IT professionals, who have been in the business for maybe 10, 20 years, don’t quite have the skills and understanding to grasp all of this.

We're starting to see sort of this IT hybrid role called the IT controller, that typically reports to the CIO, but also dot-lines into the CFO, so that the two organizations can work together from the very beginning of a data center project to understand how best to optimize both the technology, as well as the financial aspects.

Jim O'Grady: We see that a lot of companies try to manage this themselves, and they don’t have the internal expertise to do it. Often, it’s done in a very disconnected way in the company. Because it’s disconnected and done in many different ways, it leads to more risks than people think.

You are putting your company’s brand at stake, through improper environmental recycling compliance, or exposing your clients, customers, or patients’ data to a security breach. This is definitely one of those areas you don’t want to read about in a newspaper to figure out what went wrong.

One of the most common areas where our clients are caught unaware is the complexity of data security and of the e-waste legislation requirements out there, and especially the pace at which they change.

We suggest that they have a well thought-out plan for destroying or clearing data prior to the asset decommissioning and/or prior to the asset leaving the physical premise of the site. Use your outsource partner, if you have one, as a final validation for data security. So, do it on site, as well as do it off site.

Have a well-established plan and budget up-front, one that’s sponsored by a corporate officer, to handle all of the end-of-use assets well before the end-of-use period comes.
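O'Grady's two-step advice -- clear the data on site, then validate again off site -- comes down to overwrite-then-verify. A toy sketch of that step for a single file; real decommissioning uses certified multi-pass or cryptographic-erase tooling against whole devices, not anything this simple:

```python
import os

def wipe_file(path, passes=3):
    """Overwrite a file's contents in place, then remove it (illustration only)."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # overwrite with random bytes
            f.flush()
            os.fsync(f.fileno())       # push the overwrite to the device
    os.remove(path)

def verify_gone(path):
    """Off-site-style validation: confirm the asset no longer carries the file."""
    return not os.path.exists(path)

# Demo on a throwaway file standing in for sensitive data
import tempfile
fd, path = tempfile.mkstemp()
os.write(fd, b"customer records")
os.close(fd)
wipe_file(path)
print(verify_gone(path))  # True
```

The point of the structure, per O'Grady, is that the wipe and the validation are separate steps that can be performed by separate parties.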

Reams of regulations

E-waste legislation resides at the state, local, national, and regional levels, and it all differs. Some of it conflicts, and some of it is aligned. So it's very difficult to understand what your legislative requirements are and how to comply. Your best bet is to hold yourself to the highest standard and pick someone who knows and has experience in meeting these legislative requirements.

There are tremendous amounts of global complexities that customers are trying to overcome, especially when they try to do data center consolidation and transformation, throughout their enterprise across different geographies and country borders.

You're talking about a variety of regulatory practices and directives, especially in the EU, that are emerging and restrict how you move used and non-working product across borders. There are a variety of different data-security practices and environmental waste laws that you need to be aware of.

Partner beware

A lot of our clients choose to outsource this work to a partner. But they need to keep in mind that they are sharing risk with whomever they partner with. So they have to be very cautious and be extremely picky about who they select as a partner.

This may sound a bit self-serving, but I always suggest for enterprises to resist smaller local vendors. ... If you don’t kick the tires with your partner and you don’t find out that the partner consists of a man, a dog, and a pickup truck, you just may have a hard time defending yourself as to why you selected that partner.

Also, develop a very strong vendor audit qualification and ongoing inspection process. Visit that vendor prior to the selection and know where your waste stream is going to end up. Whatever they do with the waste stream, it’s your waste stream. You are a part of the chain of custody, so you are responsible for what happens to that waste stream, no matter what that vendor does with it.

You need to create rigorous documented end-to-end controls and audit processes to provide audit trails for any future legal issues. And finally, select a partner with a brand name and reputation for trust and integrity. Essentially, share the risk.

Total asset management

Enterprises should carefully consider how they retire and recover value for all of their end-of-use IT equipment, whether it's a PDA or a supercomputer, HP or non-HP product. Most data center transformations and consolidations typically end with a lot of excess or end-of-use product.

We can help educate customers on the hidden risks of dispositioning that end-of-use equipment into the secondary market. This is a strength of HP Financial Services (HPFS).

Typically, what we find with companies trying to recover value for product is that they give it to their facilities guys or the local business units. These guys love to put it on eBay and try to advertise for the best price. But, that’s not always the best way to recover the best value for your data center equipment.

We're now seeing it migrate into the procurement arm. These guys typically put it out for bid and select the highest bid from a lot of the open market brokers. A better strategy to recover value, but not the best.

Your best bet is to work with a disposition provider that has a very, very strong re-marketing reach into the global markets, and especially a strong demonstrative recovery process.

From a financial asset ownership model, HPFS has the ability to come in and work with a client, understand their asset management strategy, and help them to personalize the financial asset ownership model that makes sense for them.

For example, if you look at a leasing organization, when you lease a product, it's going to come back. A key strength in terms of managing your residual is to recover the value for the product as it comes back, and we do that on a worldwide basis.

We have the ability to reach emerging markets or find the market of highest recovery to be able to recover the value for that product. As we work with clients and they give us their equipment to remarket on their behalf, we bring it into the same process.

When you think about it, an asset recovery program is really the same thing as a lease return. It's really a lot of reverse logistics -- bring it into a technical center, where it's audited, the data is wiped, the product is tested, there’s some level of refurbishment done, especially if we can enhance the market value. Then, we bring it into our global markets to recover value for that product.

We have skilled product traders within our product families who know how to hold product, and wait for the right time to release it into the secondary market. If you take a lot of product and sell it in one day, you increase the supply, and all of the recovery rates for the brokers drop overnight. So, you have to be pretty smart. You have to know when to release product in small lot sizes to maximize that recovery value for the client.

Legacy support as core competency

We're seeing a big uptick in the need to support legacy product, especially in DCT. We're able to provide highly customized, pre-owned, authentic legacy HP product solutions, sometimes going back 20 years or more. An increasing need we see from clients is temporary equipment for scaling out legacy-locked data center hardware platform capacity.

Clients also need to ensure their product is legally licensed and they do not encounter intellectual property right infringements. Lastly, they want to trust that the vendor has the right technical skills to deal with the legacy configuration and compatibility issues.

Our short-term rental program covers new or legacy products. Again, many customers need access to temporary product to prove out some concepts, or just to test some software application on compatibility issues. Or, if you're in the midst of a transformation, you may need access to temporary swing gear to enable the move.

We also help clients understand strategies to recover the best value for decommissioned assets, as well as how to evaluate and how to put in place a good data-security plan.

We help them understand whether data security should be done on-site versus off-site, or is it worth the cost to do it on-site and off-site. We also help them understand the complexities of data wiping enterprise product, versus just the plain PC.

Most of the local vendors and providers out there are skilled in wiping data for PCs, but when you get into enterprise products, it can get really complex. You need to make sure that you understand those complexities, so you can secure the data properly.

Lastly, the one thing we help customers understand -- and it's the real hidden complexity -- is how to set up an effective reverse logistics strategy, especially on a global basis. How do you get the timing down for all the products coming back on a return basis?

Tang: We reach out to our customers in various interactions to talk them through the whole process from beginning to end.

One of the great starting points we recommend is something we call the Data Center Transformation Experience Workshop, where we actually bring together your financial side, your operations people, and your CIOs -- all the key stakeholders in the same room -- and walk through these common issues that you may or may not have thought about to begin with. You can walk out of that room with consensus, with a shared vision, as well as a roadmap that's customized for your success.
Listen to the podcast. Find it on iTunes/iPod and Podcast.com. Read a full transcript or download a copy. Sponsor: HP.

Tuesday, September 21, 2010

IBM acquires Netezza as big data market continues to consolidate around appliances, middle market, new architecture

IBM is snapping up yet another business analytics player. After purchasing OpenPages last week, Big Blue is now laying down $1.7 billion in an all-cash deal to acquire Netezza.

Netezza provides high-performance analytics in a data warehousing appliance that, the company claims, handles complex analytic queries 10 to 100 times faster than traditional systems. Netezza appliances put analytics into the hands of business users in sales, marketing, product development, human resources and other departments that need actionable insights to drive decision-making.

With its latest business analytics acquisition, Steve Mills, senior vice president and group executive of IBM Software and Systems, says the company is bringing analytics to the masses.

“We continue to evolve our capabilities for systems integration, bringing together optimized hardware and software, in response to increasing demand for technology that delivers true business value,” Mills says. “Netezza is a perfect example of this approach.”

Big Blue’s long haul

Netezza fits in with IBM’s maturing business analytics strategy. Big Blue has long put an emphasis on data analysis and business intelligence (BI) as key drivers of IT infrastructure needs. The company has demonstrated a clear understanding that data analysis and BI can also be easily applied to business issues.

IBM’s relational database, DB2, also fits into the big picture. Over the years, IBM has built a strong family of database-driven products around DB2. Essentially, IBM has successfully worked to tie the data equation together with the needs of enterprises and the strength of their IT departments.

While DB2 reaches into the past and supports the data needs of legacy and distributed systems and applications, new architectures around in-memory and optimized platforms for persistence-driven tasks are in vogue. While Netezza's strengths are in analytics, this architecture has other uses, ones we'll be seeing more of.

Fast-forward to the Netezza acquisition. The $1.7 billion grab shows that IBM is well aware that big data sets don’t lend themselves to traditional data-crunching architectures. IBM, along with its competitors, has been developing or acquiring new architectures that focus more on in-memory solutions.

Rather than moving the entire database or large caches around on disk or tape, then, new architectures have emerged where the data and logic reside closer together -- and the data is accessed from high-performing persistence.
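That shift -- ship the logic to the data rather than haul the data to the logic -- is easy to see in miniature with predicate pushdown. A small sqlite3 illustration of the contrast (a sketch of the principle, not of Netezza's engine):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, price REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?)",
                 [("IBM", 140.0), ("IBM", 142.0), ("HPQ", 45.0)])

# Slow pattern: pull every row over to the client, then aggregate in application code
rows = conn.execute("SELECT symbol, price FROM trades").fetchall()
ibm_prices = [p for s, p in rows if s == "IBM"]
avg_client = sum(ibm_prices) / len(ibm_prices)

# Appliance-style pattern: push the filter and the aggregation down to the engine,
# so only the one-number answer crosses the wire
avg_pushed = conn.execute(
    "SELECT AVG(price) FROM trades WHERE symbol = 'IBM'").fetchone()[0]

print(avg_client, avg_pushed)  # 141.0 141.0 -- same answer, far less data movement
```

At warehouse scale the difference between the two patterns is the difference between moving terabytes and moving a scalar.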

For example, with Netezza appliances, NYSE Euronext has slashed the time it takes to load and extract massive amounts of historical data so it can run analytic queries more securely and efficiently, while reducing run times from hours to seconds. Virgin Media, a UK provider of TV, broadband, phone and mobile services with millions of subscribers, uses Netezza across its product marketing, revenue assurance and credit services departments to proactively plan, forecast, and respond to the effects of pricing and tariff changes, enabling it to respond quickly with competitive offerings.

Business analytics consolidation

With the Netezza acquisition, the business analytics market is seeing consolidation as major players begin preparing to tap into a growing big data opportunity. Much the same as the BI market saw consolidation a few years ago -- IBM acquired Cognos, Oracle bought Hyperion, and SAP snapped up Business Objects -- vendors are now seeing big data analytics as an area that should be embedded into the total infrastructure of solutions. That requires a different architecture.

The competition is heating up. EMC purchased Greenplum, an enabler of big data clouds and self-service analytics, in July. Both companies plan to sell the hardware and software together in appliances. The vendors tune and optimize the hardware and software to offer the benefits of big data crunching, taking advantage of in-memory architecture and high-performance hardware.

Expect to see more consolidation, although there aren’t too many players left in the Netezza space. Acquisition candidates include data management and analysis software company Aster Data Systems and Teradata with its enterprise analytics technologies, among others. [Disclosure: Aster Data is a sponsor of BriefingsDirect podcasts.]

Meanwhile, Oracle this week at OpenWorld is pushing against the market with its new Exadata product. The battle is on. My take is that these purchases are for more than the engines that drive analytics -- they are for the engines that drive SaaS, cloud, mobile, web and what we might call the more modern workloads ... data-intensive, high-scaling, fast-changing and services-oriented.
BriefingsDirect contributor Jennifer LeClaire provided editorial assistance and research on this post. She can be reached at http://www.linkedin.com/in/jleclaire and http://www.jenniferleclaire.com.
You may also be interested in:

Monday, September 20, 2010

Morphlabs eases way for companies to build private cloud infrastructures, partners with Zend

Morphlabs, a provider of enterprise cloud architecture platforms, has simplified the process of building and managing an internal cloud for enterprise environments -- enabling companies to create their own private cloud infrastructure.

The Manhattan Beach, Calif. company today announced a significant upgrade to its flagship product, mCloud Controller. The enhanced version introduces Enterprise Cloud Architecture (ECA), a new approach that provides enterprises with immediate access to the building blocks and binding components of a fault tolerant, elastic, and highly automated platform.

Morphlabs also announced a partnership with Zend Technologies Ltd., whose Zend Server will be shipped as part of the mCloud Enterprise, said Winston Damarillo, CEO at Morphlabs.

mCloud Controller is a comprehensive cloud computing platform, delivered as an appliance or virtual appliance, that also provides open mCloud APIs (you can manage the ECA cloud from an iPad, for example). To support the leading platforms, mCloud Controller will have built-in, ECA-compliant support for Java, Ruby on Rails, and PHP.

Fittingly for enterprise private clouds, the Morph offering also provides direct integration with mainstream middleware via standards-based connectors. It also supports a plethora of hypervisors -- KVM, Xen, and VMware -- and allows other cluster managers to be used as well.

Look for Morphlabs to sell to both service providers and enterprises on the benefits of compatible hybrids. Of course, we're hearing the same from Citrix, VMware, Novell, HP, etc. It's a horse race out there for a de facto hybrid cloud standard, all right.

Productivity gains

“PHP has been broadly adopted for the productivity gains it brings to Web application development, and because it can provide the massive scalability that e-commerce, social networking and media sites require,” said Matt Elson, vice president of business development at Zend. “Integrating Zend Server into Morphlabs’ mCloud Controller enables IT organizations to leverage the elasticity of cloud computing and automate the process of deploying highly reliable PHP applications in the cloud.”

Key features of the mCloud Controller with ECA include:
  • Uniform environments from development to production to help users simplify system configuration. Applications can grow as needed, while maintaining a standardized infrastructure for ease of growth and replacement.

  • Simplified system administration with automated monitoring and self-healing out of the box to avoid complicated system tuning. mCloud Controller also comes with graphical tools for viewing system-wide performance.

  • Self-service resource provisioning, which frees the IT department from numerous application provisioning requests. Without any system administration skills, authorized users can start and stop compute instances and provision applications as needed. Billing is also included within the system.

  • Streamlined application management automates the process of deploying, monitoring and backing-up applications. Users do not have to deal with configuration files and server settings.
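Self-service provisioning of this kind is typically driven through a REST API; Morphlabs cites its open mCloud APIs, though the endpoint, payload, and authentication scheme below are purely hypothetical, for illustration only:

```python
import json
import urllib.request

def build_provision_request(base_url, api_key, app_name, instances):
    """Build (but do not send) a hypothetical self-service provisioning request."""
    payload = {"app": app_name, "instances": instances}
    return urllib.request.Request(
        url=f"{base_url}/v1/apps",  # hypothetical endpoint, not a documented mCloud route
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )

req = build_provision_request("https://mcloud.example.com", "secret-key", "storefront", 3)
print(req.get_full_url())  # https://mcloud.example.com/v1/apps
```

An API shaped like this is what lets "authorized users without system administration skills" -- or an iPad app -- drive provisioning, while the controller handles placement, monitoring, and billing behind the scenes.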
The mCloud Controller v2.5 is available now in the United States, Japan and South East Asia. For more information contact Morphlabs at info@mor.ph.

You may also be interested in:

Wednesday, September 15, 2010

HP Business Service Automation portfolio gives IT the tools it needs to compete with clouds

HP is pushing the automation card again with new tools for hybrid IT environments. The company today announced “enhanced automation solutions” that set the stage for lower-cost business application deployment -- whether those apps are deployed traditionally, virtually or via a cloud.

HP’s latest Business Service Automation (BSA) enhancements beef up its solutions for hybrid IT environments, which the company defines as any combination of on-premise, off-premise, physical and virtual scenarios, including cloud computing. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

HP has identified a strong need in the enterprise, which is why it’s moving so fast on the BSA front. Although hybrid IT environments can increase a business’s agility and speed time to market, they also increase complexity, risk and costs by creating IT silos -- if the environment isn’t holistically managed. HP’s new BSA software enhancements work to take the “if” out of the equation.

A 360-degree hybrid solution

Today’s BSA announcement builds on HP’s recent cloud announcements for hybrid IT environments. The just-announced software enhances HP’s BSA portfolio to offer unified server, network, storage, and application management. The goal is to break down IT silos to simplify application development and hybrid IT management.

HP is promising financial returns for companies that adopt its solutions. According to a June 2010 ExpertROI Spotlight conducted by IDC on behalf of HP, organizations that deploy HP BSA solutions can realize up to $4.82 in benefits for every IT dollar invested, reduce annual IT costs by up to $24,000 per 100 end users, and reduce outsourcing costs by 40 percent to 80 percent.

“Organizations are seeking solutions that deliver business applications and services with greater agility, speed and at the lowest cost to the enterprise, regardless of their IT environment,” says Erik Frieberg, vice president of Marketing, Software and Solutions at HP. “Clients can achieve up to 382 percent ROI by deploying HP’s leading automation software and leverage the benefits of new hybrid delivery models.”

HP’s acquisition of Stratavia has strengthened its automation portfolio by adding deployment, configuration and management solutions for enterprise databases, middleware and packaged applications. These solutions aim to bridge the gap between application development and operational teams. With Stratavia’s technology in its portfolio, HP said it can now provision all of the components, rapidly deploy changes and manage the ongoing configuration and compliance management.

Under the BSA hood

HP’s BSA portfolio now offers new capabilities in application deployment and risk mitigation, as well as better efficiency and productivity. For example, HP Server Automation 9.0 helps clients automate the entire server life cycle, control virtualization sprawl, and provide more flexible provisioning and deployment of applications. New Application Deployment Manager (ADM) functionality lets IT organizations automate the release process to bridge the gap between development, quality assurance and operations teams. HP said these enhancements can accelerate application deployment by up to 86 percent.

What’s more, HP Network Automation 9.0 now helps clients contain costs, mitigate risk and improve efficiency of the network by automating error-prone tasks, reducing outages and enforcing policies in real-time regardless of the environment. And HP Operations Orchestration 9.0 helps clients faced with constant alerts and siloed teams improve service quality across hybrid environments. It gives clients the ability to automate the IT processes required to support cloud computing initiatives.

HP Operations Orchestration software can help manage a hybrid infrastructure through a single view while HP Client Automation 7.8 helps clients reduce administration costs for managing physical and virtual machines through a single tool. And HP Storage Essentials 6.3 helps clients reduce complexity in hybrid environments, while improving storage utilization and controlling capacity growth.

IT needs to play the productivity game better

The BSA offerings come at a crossroads for enterprise IT. The fact is that IT can no longer compete just against its own past practices and cost structures. There's a looming gulf between what it costs the IT department to provide services and what a small army of outside hosts is coming to market with. IT now needs to compete against the cost structures of pure-play cloud and SaaS providers and hosts.

The solution for IT to remain competitive, and to pick and choose what to retain and what to outsource, is to make all of its systems and apps perform better and more efficiently. And it also needs the governance and management to automate those apps and systems to keep complexity and costs in line.

Visibility, automation and management are essential for IT to stay in the game against hosts, MSPs, clouds, SaaS providers, etc. And the same management allows IT to function as the best broker of services, regardless of where the servers reside. This is clearly the target HP's BSA portfolio has in its sights.
BriefingsDirect contributor Jennifer LeClaire provided editorial assistance and research on this post. She can be reached at http://www.linkedin.com/in/jleclaire and http://www.jenniferleclaire.com.
You may also be interested in: