Wednesday, December 4, 2013

Identity and access management as a service gets boost with SailPoint's IdentityNow cloud

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: SailPoint Technologies.

Business trends like bring your own device (BYOD) are forcing organizations to safely allow access to all kinds of applications and resources anytime, anywhere, and from any device.

According to research firm MarketsandMarkets, the demand for improved identity and access management (IAM) technology is estimated to grow from more than $5 billion this year to over $10 billion in 2018.

The explosive growth -- a doubling of the market in five years -- will also fuel the move to more pervasive use of identity and access management as a service (IDaaS). The cloud variety of IAM will be driven by the need for pervasive access and management over other cloud, mobile, and BYOD activities, as well as by the consumerization of IT and broader security concerns.
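
As a quick sanity check on those figures, doubling from roughly $5 billion to $10 billion over five years implies a compound annual growth rate of about 15 percent. The short calculation below is back-of-the-envelope arithmetic on the numbers cited above, nothing more.

```python
# Back-of-the-envelope check of the growth projection cited above:
# a market going from roughly $5B (2013) to roughly $10B (2018).
start_value = 5.0      # $ billions, 2013
end_value = 10.0       # $ billions, 2018
years = 2018 - 2013    # five-year horizon

# Compound annual growth rate: (end / start) ** (1 / years) - 1
cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # about 14.9% per year to double in five years
```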

To explore the why and how of IDaaS, BriefingsDirect recently sat down with Paul Trulove, Vice President of Product Marketing at SailPoint Technologies in Austin, Texas, to examine the changing needs for -- and heightened value around -- improved IAM.

We also discover how new IDaaS offerings are helping companies far better protect and secure their information assets. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: SailPoint is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: The word "control" comes up so often when I talk to people about security and IT management issues, and companies seem to feel that they are losing control, especially with such trends as BYOD. How do companies regain that control, or do we need to think about this differently?

Trulove: The reality in today's market is that a certain level of control will always be required. But as we look at the rapid adoption of new corporate enterprise resources, things like cloud-based applications or mobile devices where you could access corporate information anywhere in the world at any time on any device, the reality is that we have to put a base level of controls in place that allow organizations to protect the most sensitive assets. But you have to also provide ready access to the data, so that the organizations can move at the pace of what the business is demanding today.

Gardner: The expectations of users have changed; they're used to having more of their own freedom. How do we balance that -- allow them to get the best of their opportunity and productivity benefits, while at the same time keeping risk to the enterprise as low as possible?

Trulove: That's where each organization has to find the right balance for its particular business, one that meets internal demands, external regulatory requirements, and the expectations of its customer base. While the productivity aspect can't be ignored, taking a blind approach -- allowing an individual end user to migrate structured data out of something like SAP or another enterprise resource planning (ERP) system up to a personal Box.com account -- is something most organizations are just not going to allow.

Each organization has to step back, redefine the different types of policies that it's trying to put in place, and then put in place the right kinds of controls to mitigate the risk of inappropriate access to critical enterprise resources and data, while also allowing the end user a little bit more control and a little bit more freedom to do the things that make them most productive.

Uptake in SaaS

Gardner: We've seen a significant uptake in SaaS, certainly at the level of the number of apps, communications, and email, but it seems as if some of the infrastructure services around IAM are lagging. Is there a maturity issue here, or is it just the natural way that markets evolve? How do we understand why the applications have moved so fast, while we're only now embarking on IDaaS?

Trulove: We're seeing a common trend in IT if you look back over time, where a lot of the front-end business applications were the first to move to a new paradigm. Things like ERP and service resource management (SRM)-type applications have all migrated fairly quickly.

Over the last decade, we've seen a lot of the sales management applications, like Salesforce and NetSuite, come on in full force. Now, things like Workday and even some of the workforce management applications are becoming very popular. The infrastructure, however, has generally lagged for a variety of reasons.

In the IAM space, this is a critical aspect of enterprise security and risk management, because it guards the critical assets of the organization. Security practitioners are going to look at new technology very thoroughly before they move something like IAM out to a new delivery paradigm such as SaaS.

The other thing is that organizations right now are still fundamentally protecting internal applications. So there's less of a need to move your infrastructure out into the cloud until you begin to change the overall delivery paradigm for your internal applications.

What we're seeing in the market, and definitely from a customer perspective, is that as customers implement more and more of their software out in the cloud, that's a good time for them to begin to explore IDaaS.

Look at some of the statistics being thrown around. In some cases, we've seen that 80 percent of new software purchases are being pushed to a SaaS model. Those kinds of companies are much more likely to embrace moving infrastructure to support that large cloud investment with fewer applications to be managed back in the data center.

Gardner: The notion of mobile-first applications now has picked up in just the last two or three years. I have to imagine that's another accelerant to looking at IAM differently when you get to the devices. How does the mobile side of things impact this?

Trulove: Mobile plays a huge part in why organizations are looking at IDaaS, and the reason is that you're moving the device that interacts with the identity management service outside the bounds of the firewall and the network. Having a point of presence in the cloud gives you a very easy way to deliver all of the content out to devices that operate outside the traditional bounds of the IT organization, which generally covered the PCs, laptops, and other machines on the network itself.

Moving to IDaaS

Gardner: I'd like to get into what hurdles organizations need to overcome to move in to IDaaS, but let's define this a little better for folks that might not be that familiar with it. How does SailPoint define IDaaS? What are we really talking about?

Trulove: SailPoint looks at IDaaS as a set of capabilities across compliance and governance, access request and provisioning, password management, single sign-on (SSO), and Web access management that allow an organization to run fundamentally the same types of business processes and activities that it does with internal IAM systems, but delivered from the cloud.

We also believe that it's critical, when you talk about IDaaS, to talk not only about the cloud applications that are being managed by that service, but just as importantly about the internal applications behind the firewall that still have to be part of that IAM program.

Gardner: So, this is not just green field. You have to work with what's already in place, and it has to work pretty much right the first time.

Trulove: Yes, it does. We really caution organizations against looking at cloud applications in a siloed manner from all the things that they're traditionally managing in the data center. Bringing up a secondary IAM system to only focus on your cloud apps, while leaving everything that is legacy in place, is a very dangerous situation. You lose visibility, transparency, and that global perspective that most organizations have struggled to get with the current IAM approaches across all of those areas that I talked about.

Gardner: So, we recognize that these large trends are forcing a change: users want their freedom, there are more mobile devices and more services from more places, and security is as important as ever, if not more so. What is holding organizations back from moving toward IDaaS, given that it can help accommodate this very complex set of requirements?

Trulove: It can. The number one area, and it's really made up of several different things, is the set of data security, data privacy, and data export concerns. Obviously, the degree to which each of those contributes to the concern within a particular organization has a lot to do with where the company is physically located. We see a little bit less of the data export concerns with companies here in the US, but it's a much bigger concern for companies in Europe and Asia in particular.

Data security and privacy are the two that are very common and are probably at the top of every IT security professional’s list of reasons why they're not looking at IDaaS.

Gardner: It would seem that just three or four years ago, when we were talking about the advent of cloud services, quite a few people thought that cloud was less secure. But I’ve certainly been mindful of increased and improved security as a result of cloud, particularly when the cloud organization is much more comprehensive in how they view security.

They're able to implement patches with regularity. In fact, many of them have just better processes than individual enterprises ever could. So, is that the case here as well? Are we dealing with perceptions? Is there a case to be made for IDaaS being, in fact, a much better solution overall?

IAM as secure

Trulove: Much like organizations have come to recognize the other categories of SaaS as being secure, the same thing is happening within the context of IAM. Even a lot of the cloud storage services, like Box.com, are now signing up large organizations that have significant data security and privacy concerns. But they're able to provide the service in a way that gives those organizations assurance that they still have control over the environment.

And so, I think the same thing will happen with identity. It's one of the areas where SailPoint is very focused on delivering capabilities and assurances to the customers that are looking at IDaaS, so that they feel comfortable putting that kind of information in the cloud and operating the different types of IAM components there, and get over that fear of the unknown.

One of the biggest benefits of moving from a traditional IAM approach to something that is delivered as IDaaS is the rapid time to value. It's also one of the biggest changes that the organization has to be prepared to make, much like it did when moving from a Siebel- to a Salesforce-type model back in the day.
IAM delivered as a service needs to be much more about configuration, versus that customized solution where you attempt to map the product and technology directly back to existing business processes.

One of the biggest changes from a business perspective is that the business has to be ready to make investments in business process management, and the changes that go along with that, so that they can accommodate the reality of something that's being delivered as a service, versus completely tailoring a solution to every aspect of their business.

The benefit that they get out of that is a much lower total cost of ownership (TCO), especially around the deployment aspects of IDaaS.

Gardner: It's interesting that you mentioned business process and business process management. It seems to me that by elevating to the cloud for a number of services and then having the access and management controls follow that path, you’re able to get a great deal of flexibility and agility in how you define who it is you’re working with, for how long, for when.

It seems to me that you can use policies and create rules that can be extended far beyond your organization’s boundaries, defining workgroups, defining access to assets, creating and spinning up virtualized companies, and then shutting them down when you need. So, is there a new level of consideration about a boundaryless organization here as well?

Trulove: There is. One of the things that is going to be very interesting is the opportunity to essentially bring up multiple IDaaS environments for different constituents. As an organization, I may have two or three fundamentally distinct user bases for my IAM services.

Separate systems

I may have an internal population made up of employees and contractors that essentially work for the organization and need access to a certain set of systems. So I may bring up a particular environment to manage those employees, with specific policies, workflows, and controls. Then, I may bring up a separate system that allows business partners or individual customers to have access to very different environments within the context of either cloud or on-prem IT resources.

The advantage is that I can deploy these services uniquely across those populations. I can vary the services that are deployed. Maybe I provide only SSO and basic provisioning services for my external user populations. But for those internal employees, I not only do that, but add access certifications and segregation of duties (SOD) policy management. I need much better controls over my internal accounts, because they really do guard the keys to the kingdom in terms of data and application access.
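
To make that idea concrete, the per-population policy map below is a purely illustrative sketch; the service names and configuration structure are invented for this example and do not reflect SailPoint's actual configuration model or API.

```python
# Hypothetical sketch: enabling different IDaaS services per user population.
# Service names and structure are illustrative only, not a product interface.
IDAAS_SERVICES = {
    "sso",                   # single sign-on
    "provisioning",          # basic account provisioning
    "password_management",   # self-service password reset/change
    "access_certification",  # periodic access reviews
    "sod_policy",            # segregation-of-duties policy management
}

# Internal employees get the full set of controls; external partners and
# customers get only SSO and basic provisioning, as described above.
POPULATION_POLICIES = {
    "employees": IDAAS_SERVICES,
    "partners":  {"sso", "provisioning"},
    "customers": {"sso", "provisioning"},
}

def services_for(population: str) -> set:
    """Return the IAM services enabled for a given user population."""
    return POPULATION_POLICIES.get(population, set())

if __name__ == "__main__":
    for pop in POPULATION_POLICIES:
        print(pop, "->", sorted(services_for(pop)))
```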

Gardner: We began this conversation talking about balance. It certainly seems to me that that level of agility and the ability to define new types of business benefits far outweigh the issues around risk and security that organizations are bound to have to solve one way or the other. So, it strikes me as a very compelling and interesting set of benefits to pursue.

You've delivered the SailPoint IdentityNow suite. You have a series of capabilities, and there are more to come. As you were defining and building out this set of services, what were some of the major requirements that you had, that you needed to check off before you brought this to market?

Trulove: The number one capability that we talk to a lot of customers about is an integrated set of IAM services that span everything from compliance and governance, to access request, provisioning, and password management, all the way to access management and SSO.

One of the things that we found as a critical driver for the success of these types of initiatives within organizations is that they don't become siloed, and that as you implement a single service, you get to take advantage of a lot of the work that you've done as you bring on the second, third, or fourth services.

The other big thing is that it needs to be ready immediately. Unlike a traditional IAM solution, where you might have deployment environments to stand up and software to purchase, deploy, and configure, customers really expect IDaaS to be ready for them to start implementing the day they buy it.

It's a quick time-to-value, where the organization deploying it can start immediately. They can get value out of it, not necessarily on day one, but within weeks, as opposed to months. Those things were very critical in deploying the service.

The third thing is that it has to be ready for enterprise-level requirements. It needs to meet the use cases that a large enterprise would have across those different capabilities, but just as important, it needs to meet the data security, privacy, and export requirements that a large enterprise would have as it begins to move infrastructure out to the cloud.

Even as a cloud service, it needs a very secure way to get back into the enterprise and still manage the on-prem resources that aren't going away anytime soon. On one hand, we would talk to customers about managing things like Google Apps, Salesforce, and Workday. In the same breath, they would also talk about still needing to manage the mainframe and the on-premises enterprise ERP system that they have in place.

So, being able to span both of those environments to provide that secure connectivity from the cloud back into the enterprise apps was really a key design consideration for us as we brought this product to market.

Hybrid model

Gardner: It sounds as if it's a hybrid model from the get-go. We hear about public cloud, private cloud, and then hybrid. It sounds as if hybrid is really both a starting point and an end point for you right away.

Trulove: It's hybrid only in that it's designed to manage both cloud and on-prem applications. The service itself all runs in the cloud. All of the functionality, the data repositories, all of those things are 100 percent deployed as a service within the cloud. The hybrid nature of it is more around the application that it's designed to manage.

Gardner: You support a hybrid environment, but given what you've just said, that means all of the stock-in-trade benefits of an as-a-service offering are there: no hardware or software to own, a move from a CAPEX to an OPEX model, and probably far lower cost over time, all built in.

Trulove: Exactly. The deployment model is very much that classic SaaS, a multitenant application where we basically run a single version of the service across all of the different customers that are utilizing it.

Obviously, we've put a lot of time, energy, and focus on data protection, so that everybody's data is protected uniquely for their organization. But we get the benefits of that SaaS deployment model, where we can push a single version of the application out for everybody to use when we add a new service or add new capabilities to existing services. We take care of the upgrade process and give the customers that subscribe to the services the option of when and how they want to turn new things on.
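
A minimal sketch of what per-tenant data isolation in a multitenant service can look like is shown below; the storage layer and tenant model are hypothetical stand-ins, not a description of IdentityNow internals.

```python
# Hypothetical sketch of tenant-scoped data access in a multitenant service.
# Every read and write is keyed by tenant, so one customer's identity data
# is never returned by another customer's lookup.
class TenantStore:
    def __init__(self):
        self._data = {}  # {tenant_id: {record_id: record}}

    def put(self, tenant_id, record_id, record):
        self._data.setdefault(tenant_id, {})[record_id] = record

    def get(self, tenant_id, record_id):
        # A lookup never crosses tenant boundaries.
        return self._data.get(tenant_id, {}).get(record_id)

store = TenantStore()
store.put("acme", "user:paul", {"roles": ["finance"]})
store.put("globex", "user:paul", {"roles": ["engineering"]})

# The same record_id under different tenants stays isolated.
print(store.get("acme", "user:paul"))    # {'roles': ['finance']}
print(store.get("globex", "user:paul"))  # {'roles': ['engineering']}
```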

The IdentityNow suite is made up of multiple individual services that can be deployed distinctly from one another, but all leverage a common back-end governance foundation and common data repository.

The first service is SSO, and it empowers users to sign on to cloud, mobile, and web applications from a single platform. It provides central visibility for end users into all the different application environments that they may be interacting with on a daily basis -- either from a launch-pad type of environment, where I can go to a single dashboard and sign on to any application that I'm authorized to use.

Or I may be using back-end Integrated Windows Authentication, where as soon as I sign in to my desktop at work in the morning, I'm automatically signed in to all my applications as I use them during the day, and I don't have to do anything else.
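
The launch-pad idea can be sketched roughly as below; the user names, app names, and URLs are entirely hypothetical, and a real SSO service would sit on federation standards such as SAML or OpenID Connect rather than a lookup table.

```python
# Hypothetical sketch of an SSO "launch pad": show a signed-in user only the
# applications they are authorized to use, each behind a federated sign-on URL.
AUTHORIZED_APPS = {
    "paul": ["salesforce", "workday", "google-apps"],
    "dana": ["google-apps"],
}

SSO_URLS = {
    "salesforce":  "https://sso.example.com/launch/salesforce",
    "workday":     "https://sso.example.com/launch/workday",
    "google-apps": "https://sso.example.com/launch/google-apps",
}

def launchpad(user: str) -> dict:
    """Return the dashboard of apps this user can single-sign-on into."""
    return {app: SSO_URLS[app] for app in AUTHORIZED_APPS.get(user, [])}

print(launchpad("paul"))
```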

The second service is around password management. This enables an end-user self-service capability. When end users need to change their passwords or, more commonly, reset them because they've forgotten them over a long weekend, they don't have to call the help desk.

Strong authentication

They can go through a process of authenticating through challenge questions or other mechanisms and then gain access to reset that password. They can even use strong authentication mechanisms, like one-time password tokens, that are issued to let the user get in and then change the password to something they will use on an ongoing basis.
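
Conceptually, that self-service reset flow looks something like the sketch below. All function and field names are illustrative; a production service would rate-limit attempts, hash answers and passwords, and deliver the one-time password out of band.

```python
# Hypothetical sketch of a self-service password reset flow:
# 1) verify challenge answers, 2) issue a one-time password (OTP),
# 3) let the user set a new password once the OTP checks out.
import secrets

CHALLENGES = {"paul": {"first pet": "rex", "birth city": "austin"}}
pending_otps = {}

def verify_challenges(user, answers):
    expected = CHALLENGES.get(user, {})
    return bool(expected) and all(answers.get(q, "").lower() == a for q, a in expected.items())

def issue_otp(user):
    otp = secrets.token_hex(4)  # delivered out of band in a real system
    pending_otps[user] = otp
    return otp

def reset_password(user, otp, new_password):
    if pending_otps.get(user) != otp:
        raise PermissionError("invalid one-time password")
    del pending_otps[user]
    print(f"password for {user} updated")  # a real system would hash and store it

if verify_challenges("paul", {"first pet": "Rex", "birth city": "Austin"}):
    otp = issue_otp("paul")
    reset_password("paul", otp, "correct-horse-battery-staple")
```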

The third service is around access certifications, and this automates that process of allowing organizations to put in place controls through which managers or other users within the organization are reviewing who has access to what on a regular basis. It's a very business-driven process today, where an application owner or business manager is going to go in, look at the series of accounts and entitlements that a user has, and fundamentally make a decision whether that access is correct at a point in time.

One of the key things that we're providing as part of the access certification service is the ability to automatically revoke application accounts that are no longer required. There's a direct tie into the provisioning capabilities: being able to say, "Paul doesn't need access to this particular Active Directory group or this particular capability within the ERP system. I'm going to revoke it." Then the system automatically connects to that application and terminates or disables that account, so the user no longer has access.
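
That certification-plus-revocation loop can be sketched roughly as follows; the connector function is a stand-in for whatever directory or application API would actually disable the account, and nothing here is an actual IdentityNow interface.

```python
# Hypothetical sketch: a reviewer's decisions on a user's entitlements drive
# automatic revocation of anything marked as no longer required.
def disable_entitlement(app, user, entitlement):
    # Stand-in for a real connector call (Active Directory, ERP, etc.).
    print(f"[{app}] revoked '{entitlement}' for {user}")

def run_certification(user, entitlements, reviewer_decisions):
    """Keep approved access; revoke everything the reviewer did not approve."""
    for app, entitlement in entitlements:
        decision = reviewer_decisions.get((app, entitlement), "revoke")
        if decision == "approve":
            print(f"[{app}] '{entitlement}' for {user} re-certified")
        else:
            disable_entitlement(app, user, entitlement)

entitlements = [("active-directory", "finance-admins"), ("erp", "post-journal-entries")]
decisions = {("active-directory", "finance-admins"): "approve"}  # ERP access not approved
run_certification("paul", entitlements, decisions)
```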

The final two services are around access request and provisioning, and advanced policy and analytics. The access request and provisioning side is all about streamlining how users get access. It can be the automated birthright provisioning of user accounts when a new employee or contractor joins the organization, reconciling what a user should or should not have when they move to a new role, or terminating access on the back end when a user leaves the organization.

All of those capabilities are provided in an automated provisioning model. Then we have that self-service access request, where a user can come in on an ad-hoc basis and say, "I'm starting a new project on Monday and I need some access to support that. I'm going to go in, search for that access. I'm going to request it." Then, it can go through a flexible approval model before it actually gets provisioned out into the infrastructure.
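
A rough sketch of that joiner/mover/leaver automation is below; the role names, entitlements, and event model are all hypothetical and only illustrate the reconciliation logic described above.

```python
# Hypothetical sketch of lifecycle-driven ("birthright") provisioning:
# joiners get the access defined for their role, movers are reconciled
# against their new role, and leavers have everything removed.
BIRTHRIGHT = {
    "sales":   {"email", "crm"},
    "finance": {"email", "erp", "expense-reports"},
}

def provision(user, apps):
    print(f"grant  {user}: {sorted(apps)}")

def deprovision(user, apps):
    print(f"revoke {user}: {sorted(apps)}")

def handle_event(user, event, old_role=None, new_role=None):
    if event == "joiner":
        provision(user, BIRTHRIGHT[new_role])
    elif event == "mover":
        old, new = BIRTHRIGHT[old_role], BIRTHRIGHT[new_role]
        provision(user, new - old)      # access the new role adds
        deprovision(user, old - new)    # access the old role no longer justifies
    elif event == "leaver":
        deprovision(user, BIRTHRIGHT[old_role])

handle_event("paul", "joiner", new_role="sales")
handle_event("paul", "mover", old_role="sales", new_role="finance")
handle_event("paul", "leaver", old_role="finance")
```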

The final service around advanced policy and analytics is a set of deeper capabilities around identifying where risks lie within the organization, where people might have inappropriate access around a segregation of duty violation.

It puts an extra level of control in place, first of a detective nature, looking at the actual environment and the potentially conflicting accounts that people already have. More importantly, it puts preventive controls in place, so that you can attach them to an access request or provisioning event and determine whether a policy violation would exist before a provisioning action is actually taken.
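
A preventive segregation-of-duties check can be sketched as below; the conflict rules and entitlement names are invented for illustration and are not drawn from any actual policy catalog.

```python
# Hypothetical sketch of a preventive segregation-of-duties (SOD) check:
# before a provisioning action runs, test whether the requested entitlement,
# combined with what the user already holds, triggers a conflict rule.
SOD_RULES = [
    ("create-vendor", "approve-payments"),    # classic procure-to-pay conflict
    ("submit-expenses", "approve-expenses"),
]

def sod_violations(current, requested):
    """Return the conflict rules that granting `requested` would trigger."""
    combined = set(current) | {requested}
    return [rule for rule in SOD_RULES if set(rule) <= combined]

def provision_if_clean(user, current, requested):
    violations = sod_violations(current, requested)
    if violations:
        print(f"blocked '{requested}' for {user}: conflicts {violations}")
    else:
        print(f"granted '{requested}' to {user}")

provision_if_clean("paul", {"create-vendor"}, "approve-payments")  # blocked
provision_if_clean("paul", {"create-vendor"}, "view-invoices")     # granted
```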

Gardner: What are your customers finding now that they are gaining as a result of moving to IDaaS as well, as the opportunity for specific services within the suite? What do you get when you do this right?

Trulove: What most customers see, as they begin to deploy IDaaS, is the ability to get value very quickly. Most of our customers are starting with a single service and using it as a launching pad into a broader deployment over time.

So you could take SSO as a distinct project. We have customers that are implementing that SSO capability to get rapid time to value that is very distinct and very visible to the business and the end users within their organization.

Password management

Once they have that deployed and up and running, they're leveraging that to go back in and add something like password management or access certification or any combination thereof.

We're not stipulating how a customer starts. We're giving them a lot of flexibility to start with very small, distinct projects, get the system up and running quickly, show demonstrable value to the business, and then continue to build out over time both the breadth of capabilities they're using and the depth of functionality within each capability.

Mobile is driving a significant increase in why customers are looking at IDaaS. The main reason is that mobile devices operate outside of the corporate network in most cases. If you're on a smartphone and you are on a 3G, 4G, LTE type network, you have to have a very secure way to get back into those enterprise resources to perform particular operations or access certain kinds of data.

One of the benefits that an IDaaS service gives you is a point of presence in the cloud, something mobile devices can reach from wherever they are. Then there is a direct and very secure connection back into those on-prem enterprise resources, as well as out to the other cloud applications that you're managing.

The reality in a lot of cases is that, as organizations adopt those BYOD-type policies and the number of mobile devices trying to access corporate data increases significantly, providing an IAM infrastructure delivered from the cloud is a very convenient way to bring a lot of those mobile devices under control across your compliance, governance, provisioning, and access request activities.

The other big thing we're seeing in addition to mobile devices is just the adoption of cloud applications. As organizations go out and acquire multiple cloud applications, having a point of presence to manage those in the cloud makes a big difference.

In fact, we've seen several deployment projects for something like Workday actually gated by the need to put the identity infrastructure in place before the business would allow its end users to begin to use that service. So the combination of mobile and cloud adoption is driving a renewed focus on IDaaS.

If you look at the road map that we have for the IdentityNow product, the first three services are available today, and that’s SSO, password management, and access certification. Those are the key services that we're seeing businesses drive into the cloud as early adopters. Behind that, we'll be deploying the access request and provisioning service and the advanced policy and analytic services in the first half of 2014.

Continued maturation

Beyond that, what we're really looking at is continued maturation of the individual services to address a lot of the emerging requirements that we're seeing from customers, not only across the cloud and mobile application environments, but as importantly as they begin to deploy the cloud services and link back to their on-prem identity and access management infrastructure, as well as the applications that they are continuing to run and manage from the data center.

Gardner: So, more inclusive, and therefore more powerful, in terms of the agility, when you can consider all the different aspects of what falls under the umbrella of IAM.

Trulove: We're also looking at new and innovative ways to reduce deployment timeframes by building a lot of capabilities that are defined out of the box. These are things like business processes, where there will be a catalog of the best practices that we see a majority of customers implement. That becomes a drop-down for an admin to go in and pick from as they configure the application.

We'll be investing very heavily in areas like that, where we can take the learning as we deploy and build that back in as a set of best practices as a default to reduce the time required to set up the application and get it deployed in a particular environment.

Tuesday, December 3, 2013

BI and big data analytics force an overdue reckoning between IT and business interests

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: Dell Software.

The relationship between enterprise IT and lines of business leadership has not always been rosy. Sometimes IT holds the upper hand, and sometimes the business does an end-run around IT to use new tools or processes. They might even call it innovation.

Today, with the push toward big data and business intelligence (BI), a new chasm is growing between enterprise IT groups and business units. But, in this case, it could be disastrous because IT should be a big part of the big data execution.

The next BriefingsDirect discussion therefore examines how the ebb and flow between IT centralization and decentralization -- which now swings in the direction of business groups, and even shadow IT -- runs the risk of neglecting essential management, security, and scalability requirements.

Indeed, big data and analytics should actually force more collaboration and lifecycle-based relationships among and between business and IT groups. For those organizations -- where innovation is being divorced from IT discipline -- we'll explore ways that a comprehensive and virtuous adoption of rigorous and protected data insights can both make the business stronger and make IT more valued.

To get to the bottom of why, BriefingsDirect recently sat down with John Whittaker, Senior Director of Marketing for Dell Software's Information Management Solutions Group. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: Dell Software is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: John, we seem to go back and forth between resources in organizations being tightly controlled and governed by IT, and then resources and control resting largely with the line of business or even, as I mentioned, with a shadow IT group of some sort. So over the past 20 or more years, why has this problem been so difficult to overcome? Why is it persistent? Why do we keep going back and forth?

Whittaker: That's an interesting question, and I agree. I've been in IT for longer than 20 years, and if you study the history, you can see that this ebb and flow toward centralized management -- to gain some constraints or controls around governance and security -- has been one of the primary motivators of IT. It's one of the big benefits IT provides, but in the backdrop, you have lines of business that want to innovate and want to go in new directions.

We're entering one of those times right now with big data and the advent of analytics, and it's driving lines of business to push into these new technologies, maybe in ways that IT isn't ready for just yet.

This, as you mentioned, has been going on for some time. The last iteration where this occurred was back in the ’90s when e-commerce and the Web captured the imagination of business. We saw a lot of similarities to what's occurring today.

Big-data push

It ultimately caused some problems back in the '90s around e-commerce and leveraging this great new innovation of the Internet, but doing it in a way that was more decentralized. It was a bit more of a Wild West approach, and it ultimately led to some pretty significant issues -- the kind I think we're going to see come out of the big data and analytics push that's occurring right now.

Gardner: I suppose, to be fair to each constituency here, it's the job of IT to be cautious and to try to dot all the i's and cross the t's. There were a lot of people in 1996-97 who didn't necessarily think the Internet was going to be that big of a thing; it seemed to have lots of risk associated with it. So, I suppose due diligence needed to be brought to bear.

On the other hand, if the businesses didn't recognize that this could be a huge opportunity and that they needed to take those risks -- create a website, and enter into a direct dialogue with customers through a new channel -- they would have missed a big opportunity. So these are natural roles, but they can't be too brittle.

Whittaker: You're absolutely right. At their core, both groups had, and have, good motivations. IT lives in a world of constraints, of governance and security, and of needing to deliver something that's going to be stable, that's going to scale, that's going to be secure, and that's not going to break governance.

Those are laudable goals to have in mind. From the line-of-business perspective, the business wants to innovate and doesn’t want to be outmoded by its competitors. They rightfully see that all these great innovations are coming, and analysts, pundits, and experts are talking about how this is going to make a huge difference for businesses.

So they inevitably want to embrace those, and you have this cognitive dissonance occurring between the IT goals around constraints and the desire to keep things running in a clean and efficient manner. IT is seeing this new technology and saying, “Hold on. We don’t necessarily want to jump into this. This is going to break our model.”

Ultimately, IT gets to a point where maybe they suggest we shouldn’t do it or we should push it off for some time. That’s where the chasm between the two gets started. From the business perspective, the answer “no” is unacceptable, if they feel that’s what they need to do to achieve success in business. They own the profit and loss responsibilities. That’s where these problems come from.

Nobody in either group is trying to harm the business or anything close to it. They just have different motivations and perspectives on how to approach something, and when one gets wildly far apart from the other, that’s where these problems tend to occur. Again, when these big innovation cycles happen, you’re more likely to see a lot of these problems start to occur.

I definitely remember back in 1996-1997. We didn’t call it shadow IT at the time, but you saw IT-like personnel being hired into functional business areas to institute these new technologies, and that ultimately led to a pretty serious hangover at the end of that innovation cycle.

Gardner: What’s the risk of ignoring IT, doing an end-run around them, or downplaying the role? What form does it take?

On their own

Whittaker: Ignoring IT can have some pretty serious consequences. It all starts with the fact that, by and large, businesses can embrace these new technologies without the aid of IT. Cloud-based implementations have made it possible for lines of business to rapidly deploy some of these new big data technologies, and in some cases you have vendors telling them they don't need IT's help. So it's not all that difficult for lines of business to go out on their own and implement a big data technology.

But they don’t typically have the discipline to apply across-the-board governance capabilities and discipline into their deployment and that leads to potential issues with regulatory requirements. It also leads to security issues, and ultimately can lead to problems where you have seriously bad data management issues.

You have data sunk in silos, and maybe the CEO wants to know how much business we’re doing with x, y, and z. No one can deliver that, because we call x, y, and z, something in one system, a different name in another system, and a different name in the third system. Trying to pull that data together becomes really difficult. When you have lines of business independently operating disparate solutions, those core governance issues tend to break down.

Additionally, although they are great at spotting innovation opportunities, line of business people are not necessarily in the business of building scalable, secure, stable environments. That’s not the core of, say, marketing. They need to understand how the technology can be leveraged, but maintaining and managing it is not core to their charter. It tends to be ignored.

Gardner: John, it strikes me that there are some examples within IT that help us understand this potential problem and even find some remediation, and that's in software development. We've seen the complexity when groups work without a lot of coordination and shared process insight, and projects have run aground.

For many years, we saw a very high failure rate among software development projects, but more recently, we’ve seen improvements -- agile, scrum, opening up the process, small iterative steps that then revert back to an opportunity to take stock and know what everyone is doing, checking in, checking out with centralization -- but without stifling innovation. Is there really a lesson here in what’s happened within software development that could be brought to the whole organization?

Whittaker: Absolutely. In fact, within Dell Software itself we embrace agile and use scrum internally. There are a lot of lessons that can be learned from the concept of working closely together, iterating rapidly, and being open to innovation and the idea that changes occur.

Particularly in these major innovation cycles, it’s important to go with the flow and implement some of these new technologies and new capabilities early, so you can have that brain trust built internally among the broad team. You don’t want IT to hold the reins entirely, and at the same time, you don’t want line of business to do it.

We really need to break that model, that back and forth, centralization-decentralization swing that keeps occurring. We need to get to a point where we really are partnering and have good collaboration, where innovation can be embraced and adopted, and the business can meet its goals. But it has to be done in a way that IT can implement sound governance and implement solutions that can scale, are stable, are reliable, and are going to lead to long-term success.

Back-and-forth

Gardner: What’s different this time, John? Are the stakes higher because we’re talking about data analysis? That’s basically intelligence about what’s going on within your markets, your organization, your processes, your supply chain, your ecosystem, all of which could have a huge bearing.

We have the ability now to tackle massive amounts of data very rapidly, but if we don’t bring this together holistically, it seems as if there is a larger risk. I’m thinking about a competitive risk. Others that do this well could enter your market and really disrupt.

Whittaker: You're absolutely right. There's great potential benefit for organizations in leveraging big data and analytics: being able to determine predictively what is going to occur in their business, what the most efficient routes to market are, and where improvements can be made.

The businesses that leverage this are going to outmode, outperform, and ultimately win in the markets currently dominated by organizations who aren’t paying attention and who aren’t implementing solutions today. They’re getting a little bit ahead of this cycle so that they are ready and are able to be successful down the road.

We’re really moving into an era where the context of what’s happening is critically important. A data-driven management model is going to be embraced and it’s ultimately going to lead to more successful organizations. Companies and organizations that embrace this today are going to be the winners tomorrow.

If you’re ignoring this or putting this off, you’re really taking a tremendous risk, because this next iteration of innovation that’s occurring around analytics applies to large data sources. It’s being able to build the correlations and determine that this is a more efficient approach, or conversely, that we have a problem with this outlier that’s going to give us issues down the road.

If you’re not doing that as an organization, you really are running a pretty tremendous risk that somebody else is going to walk in and be able to make smarter decisions, faster.

Gardner: At the same time, your customers are gaining insights into how to procure all the better. So any rewards that might be out there, if you're in a sales role of any kind, become much more apparent.

Whittaker: That's definitely true as well. The construct and the conversation have really shifted. With the advent of social media and the pace at which information is shared and opinions are formed, it's no longer the company that is the primary voice about its products and capabilities or its positions and points of view.

Customers more empowered

It needs to have those. It needs to get them out. It needs to push them. But in this new world we live in, the customers are so much more empowered than they have ever been before, and it should be a good thing. For companies that are delivering great products and solving real problems for their customers, this should be great news.

If you’re not listening to what your customers are saying in social media and if you’re not paying attention to the ongoing story line and conversation of your firm in the social sphere, you’re really putting yourself at risk. You’re missing out on a tremendous opportunity to engage with your customers in a new, interesting, and very useful way.

That’s a lot of what we built. We have a lot of capabilities here at Dell Software around data management, data integration, and data analysis. On the analysis side, we spend a great deal of time with products like Kitenga and our social networking analytics platforms to do that semantic analysis and look into that form of big data.

But big data is more than just social. It's also sensor data. The Internet of Things is another area where businesses should be innovating and organizations should be pushing to take advantage of it. That's where line of business should be saying, "We need to get out into this area, or if we don't, we're going to be outmoded by our competitors." And IT should be encouraging it. They should be pushing for more innovation, bringing new ideas, and being a real partner and collaborator at the table within the business and organization. That's the right way to do this.

And IT itself should be applying some of these technologies. In fairness to line of business, there is a bit of a crisis of confidence in IT, and there's really no better way to push back against that than to be able to run analytics on the solutions you're providing. How well is IT performing? Are you benchmarking against past performance? How do you benchmark against your industry?
That’s another component. Big-data analytics can be utilized by IT not just to deliver capabilities to the organization or push out and help with connecting to the customer. IT could use big data analytics to improve its own environment and to answer this crisis of confidence that exists.

You could turn these tools internally and look at rates of response as compared to your industry, how your network is performing, how your database is performing, or how the code you write is performing. Are your developers efficient in building clean code?

Everybody has been watching the major shift in the healthcare environment in North America. A big component of that probably should have been more benchmark analysis, analytics on code quality, and things of that nature. That’s a great current and topical example of how IT should be utilizing some of these technologies, not just externally, not just bringing it to line of business, but within its own environment, to prove that it’s building systems that are going to be scalable, secure, and stable.

Gardner: What needs to take place in order for this higher level of coordination and collaboration to take place? Are there any key steps that you have in mind for embarking on this?

Four key areas

Whittaker: I think that there are four key areas that need to occur for this collaboration to happen. Number one, senior executives need to be aligned to what the organization is trying to achieve. They need to articulate a common vision that accounts for the shared interest of both IT and line of business and make it clear that they expect collaboration. That should come at the top of the organization.

We need to get out of the smoke-stacked, completely siloed, organizational approaches and get to something that’s far, far more collaborative, and that needs to come from the top. The current approach is not acceptable. These groups need to work together. That’s a key component. If you don’t have buy-in at the top, it makes it really hard for this collaboration to occur.

Number two, IT needs to get its house in order. This means many things, but primarily it means overcoming the crisis of confidence that line of business has in IT by coming to the table with an approach that works for the line of business -- something the business aligns with, so that it feels IT is involved and is buying into the future the business wants to head toward. IT needs to show that it has a plan that does not compromise the innovations the business needs.

IT absolutely can no longer just say no. That’s not an acceptable position. Certainly, if you look back, there were IT organizations that were saying, “No, we’re not going to connect to the Internet. It’s not secure. The answer is just going to be no.”

That didn’t work out for them and it’s not going to work out here either. They should be embracing this shift. We shouldn’t perpetuate this cycle by driving more shadow IT and creating ultimately more for IT down the road as inevitable problems start to emerge.

Number three, clear the air and put the executive plan in place. Tensions between IT and line of business have gotten to the point where they can’t be ignored any more. Put the stakeholders together in a room, air out the difficulties, and move forward with a clean slate. This is a tremendous opportunity to build a plan that meets both parties’ needs and allows them to start executing on something that’s really going to make a huge impact for the business.

Finally, the fourth point, seek solutions that emphasize collaboration between IT and the business. Many vendors today are encouraging groups to go rogue and operate in silos, and that’s causing a lot of the problem. At Dell, we’re much more about pushing a more collaborative approach. We think IT is terrific, but business has a point. They need innovation and they need IT to step up. And the business needs to embrace IT.

Instead of conflicting with each other and doing your own thing, back up your commitment to collaboration and utilize tools that empower it. That’s where we’re going to win, and that’s how business is going to succeed in the future.

This isn't something that only the G20, the Fortune 500, or the Fortune 2000 can benefit from. This goes way down in the hierarchy, in the stack, certainly down to the small- and medium-sized business (SMB) level, and maybe even lower. If you're a data-intensive small business, you probably need to start implementing, or at least taking a look at, big data and the analytics-based approaches and data-driven decision-making opportunities that exist within your organization, or you will be outmoded by organizations that do embrace them.

Cloud-based approach

More and more, we’re seeing, particularly in the mid-market, embracing of a cloud-based approach. It's important to point out that that approach is fine and terrific. We love the cloud and we’re big proponents of it, but using a cloud-based solution doesn’t free line of business from the need to collaborate with IT. It will not eliminate this problem.

We’re seeing terrific IT departments and leadership starting to take a larger role, starting to ultimately become drivers of innovation. That’s really what we want to see. All businesses want the same thing. They want to find sustainable competitive advantages. They want to control spending. They want to reduce risk to the business.
And the most effective and efficient path to achieving all three is getting IT and the business aligned and allowing that collaboration to occur. That’s really at the crux of how businesses are going to gain competitive advantage out of technology in the future.

Embrace new technology

The big points are, embrace the new technology that’s coming out. The innovation is going to make your business far more successful, and your organization will prosper from these new innovations that will occur.

Number two, do it in a manner that is collaborative between IT and line of business. The CIO, the CMO, the CFO, the CEO, and the heads of all of the functional departments -- whether you're in sales, marketing, finance, operations, or anywhere else -- should be aligning with their IT counterparts. It's the combined, collaborative approach that's going to win the day.

And finally, this should really be driven top-down. Senior executives, this is an opportunity to get everybody on the same page to go after and leverage a pretty enormous opportunity before it becomes a huge problem. Let’s get out there right now. We’re still in the early days, but that doesn’t mean there’s not a lot to be gained. And ultimately, in the long-term, we’re going to have more successful organizations able to achieve even greater output through this collaboration and the leveraging of big data analytics.

Monday, November 18, 2013

Enterprise Tablets: Choose a Device Nature That Best Supports Cloud Nurturing

Now that server hardware decisions are no-brainers (thanks to virtualization and the ubiquity of multi-core 64-bit x86), deciding on the enterprise-wide purchase of tablet computer types will be the biggest hardware choice many IT leaders make.

So what guides these tablet decisions? Do the attributes of the mobile device and platform (the nature of the thing) count most? Or is it more important that it conforms to the fast-changing needs of the back-end services and cloud ecosystem? Can the tablet be flexible and adaptive, to act really as many client types in one (the nurture)?

Given how the requirements from enterprise to enterprise vary so much, this is a hugely complex issue. We've seen a rapidly maturing landscape of new means to the desired enterprise tablet ends in recent years: mobile device management (MDM), containerization and receiver technology flavors, native apps, web-centric apps, recasting virtual desktop infrastructure (VDI). It is still quite messy, really, despite the fact that this is a massive global market, the progeny of the PC market of the past 25 years.

Some think that bring your own device (BYOD) will work using these approaches on the user’s choice of tablet. If so, IT will be left supporting a dozen or more mobile client device types and/or versions. You and I know that can’t happen. The list of supported device types needs to be under six, preferably far less, whether it’s BYOD or quasi-BYOD.

Ticking time bomb

Yet enterprises must act. Users are buying and making favorites. Mobility is an imperative. These tablet hardware decisions must be made.

Think of it. You’re an IT leader at a competitive enterprise and rap, rap, rapping on your Windows to get in ASAP are BYOD, mobile apps dev, Android apps, iOS apps, and hybrid-cloud processes.

You have a lot to get done fast amid complex overlaps and interdependencies from your choices that could haunt you — or bless you — for years. And, of course, you have a tight budget as you fight to keep operating costs in check, even as scale requirements keep rising.

Somewhere in this 3D speed chess match against the future there are actual hardware RFPs. You will be buying client hardware for the still large (if not predominant) portion of the workforce that won’t be candidates for BYOD alone. And a sizable portion of these workers are going to need an enterprise tablet, perhaps for the first time. They want you to give it to them.

This cost-benefit analysis vortex is where I decided to break from my primary focus on enterprise software and data-center infrastructure to consider the implications of the mobile client hardware. My dearly held bias is that the back-end strategy and procurement decisions count more than at any time in the last 12 years.

Better not brick

But at the end of the network hops, there still needs to be a physical object, on which the user will get and put in the work that matters most. This object cannot, under any circumstances, become a weak link in the hard-won ecosystem of services that support, deliver, and gather the critical apps and data. This productivity symphony you are now conducting from amid your legacy, modern data center, and cloud/SaaS services must work on every level — right out to those greasy fingertips on the smart tablet glass.

Yes, the endpoint must be as good as the services stream behind them, yet not hugely better, not a holy shiny object that tends to diminish the rest, not just a pricey status symbol — but a workhorse that can be nurtured and that can adapt as demanded.

So I recently received and evaluated a Lenovo ThinkPad Tablet 2 running Windows 8, as well as an iPad Air running iOS 7. I wanted to get a sense of what the enterprise decisions will be like as enterprises seek the de facto standard mass-deployed tablet for their post-PC workforce. [Disclosure: Intel sent me, for free, a Lenovo ThinkPad as a trial, and I bought my own iPad Air. I do not do any business with Apple, Lenovo, or Intel.]

Let’s be clear, I’m a long-time Apple user by choice, but still run one instance of Windows 7 on a PC just in case there are Windows-only apps or games I need or want access to. This also keeps up my knowledge on Windows in general.

Good enough is plenty

Here's what I found. I personally love the iPad Air, but the Lenovo ThinkPad Tablet 2 was surprisingly good, certainly good enough for enterprise use. I will quibble with the efficacy of the stylus, note that the Google Chrome browser is better on it than Microsoft IE, that downloads for both are a pain, and that battery life is a weakness on the Lenovo — but these are not deal breakers and will almost certainly get better.

What's key here is that the apps I wanted were easily accessed. There's a store for that, regardless of the device. Netflix just runs. The cloud services and my data, profile, and preferences were all gained quickly and easily. Syncing across devices was up and running quickly. Never having used Windows 8, though familiar with Windows 7, was not an issue. I picked it up quickly, very quickly.

Any long-time Windows user, the predominant enterprise worker, will adapt to an Intel-powered Lenovo device running Windows quite well. And enterprise IT departments already know the strengths and weaknesses of Windows, be it 7 or 8, and they know they will have to pay Microsoft its use taxes for years to come in any event, given their dependence on Microsoft apps, servers, services, and middleware.

But that same enterprise tablet user will graft well to an Android device, an iOS device (thanks to the market penetration of the iPod, iTunes, and the iPhone), or perhaps a Kindle Fire. Users will have their personal cloud affiliations, and those services can be brought to any of these devices and platforms. A tablet can be both a work and a personal device. Or you could easily carry two, especially if the company pays for one of them. As has been stated better elsewhere, these tablets are pretty much the same.

So the nature of the device is not the major factor, not a point of lock-in, or even a decision guide. Because of the single-sign-on APIs from cloud and social media providers, you can now go from tablet to tablet and find your cloud of choice — be it Google, Apple, Microsoft, Facebook, Yahoo, or Amazon. You know how you can rent a bicycle in many cities now, just ride it, and drop it off? Same for everyone. This is the future of tablet devices too. Quite soon, actually. Rent it, log in, use it, move on.

Perhaps enterprises should just lease these things?

Enterprises must still choose

Which tablets, then, will connect back best to the enterprise? Will the business's private cloud services be as easily assimilated as the public cloud ones? What of containerization support, isolation and security features, and app-receiver technology flavors? Apple's iOS 7 goes a long way toward helping enterprises run their own identity and access management (IAM), isolate apps, and run a virtual private connection. Windows 8 has done this all along. Google and Amazon are happy to deliver cloud services just as well. Those are the three or four flavors.

After using the Lenovo ThinkPad Tablet 2 running Windows 8, it astounds me that Microsoft lost this market and now has to claw back from such low penetration in mobile. This should have been theirs by any reckoning. Years ago.

Now it’s too late for the device and client platform alone to dictate the market direction. It’s now a function of how the business cloud services can best co-exist with a personal device instance. Because this coexistence will be a must-have capability, it doesn’t really matter what the device is. Any of the top three or four will do.

The ability of the device to best nurture the business and the end user -- separate yet equal in the same hardware -- that's the ticket. The rest is standard feature check-offs.


Wednesday, November 13, 2013

Cardlytics on HP Vertica powers millions of swiftly tailored marketing offers to bank card consumers

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: HP.

The next edition of the HP Discover Podcast Series delivers an innovation case study interview that highlights how data-intensive credit- and debit-card marketing services provider, Cardlytics, delivers millions of highly tailored marketing offers to banking consumers across the United States.

Cardlytics, in adopting a new analytics platform, gained huge data analysis capacity, vastly reduced query times, and swiftly met customer demands at massive scale.

To learn how, we sat down with Craig Snodgrass, Senior Vice President for Analytics and Product at Cardlytics Inc., based in Atlanta. The discussion, which took place at the recent HP Vertica Big Data Conference in Boston, is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: At some point, you must have had a data infrastructure or legacy setup that wasn't meeting your requirements. Tell us a little bit about the journey that you've been on gaining better analytic results for your business.

Snodgrass: As with any other company, our data was growing and growing and growing. Also growing at the same time was the number of advertisers that we were working with. Since our advertisers spanned multiple categories -- they range from automotive, to retail, to restaurants, to quick-serve -- the types of questions they were asking were different.

Snodgrass
So we had this intersection of more data and different questions happening at a vertical level. Using our existing platform, we just couldn't answer those questions in a timely manner, and we couldn't iterate around being able to give our advertisers even more insights, because it was just taking too long.

First, we weren't able to even get answers. Then, when there was the back-and-forth of wanting to understand more or get more insight, it just ended up taking longer and longer. So, at the end of the day, it came down to multiple and unstructured questions, and we just couldn't get our old systems to respond fast enough.

Gardner: Who are your customers, and what do you do for them?

Growing the business

Snodgrass: Our customers are essentially anybody who wants to grow their business. That's probably a common answer, but they are advertisers. They're folks who are used to traditional media where, when they do a TV or radio ad, they're hitting everybody -- people who were going to come to their store anyway and people who probably weren't going to come to their store.

We're able to target who they want to bring into their store by looking at both debit-card and credit-card purchase data, all in an anonymized manner. We're able to look at past spending behavior and say, based on those spending behaviors, that these are the types of customers who are most likely to come to your store and, more importantly, most likely to be long-term customers for you.

We can target those customers, and we can deliver the advertising in the form of a reward, meaning the customer actually gets something for the advertising experience. We deliver that through their bank.

The bank is able to do this for their customers as well. The reward comes from the bank, and the advertiser gets a new channel to go bring in business. Then, we can track for them over time what their return on ad-spend is. That’s not an advantage they’ve had before with the traditional advertising they’ve been doing.
It works inside of retail, just as well as restaurants, subscriptions, and the other categories that are out there as well.

Gardner: So it sounds like a win, win, win. As a consumer, I'm going to get offers that are something more than a blanket pitch -- offers targeted to me. The bank that's providing the credit card is going to earn loyalty by having a rewards effort that works. Then, of course, those people selling goods and services have a new way of reaching and marketing those goods and services in a way they can measure.

Snodgrass: Yeah, and back to this idea of the multiple verticals. It works inside of retail, just as well as restaurants, subscriptions, and the other categories that are out there as well. So it's not just a one-category type reward.

A customer will know quickly when something is not relevant. If you bring in a customer for whom the offer isn't relevant, or who wasn't the right customer, they're not going to return.
The advertiser isn't going to get their return on ad-spend. So it's actually in both our interests to make sure we choose the right customers, because we want to get that return on ad-spend for the advertisers as well.

Gardner: Craig, what sort of volume of data are we talking about here?

Intersecting growth

Snodgrass: We're doing roughly 10 terabytes a year. From a volume standpoint, it's a combination of not just the number of transactions we're bringing in, but the number of requests, queries, and answers that we’re having to go against it. That intersection of growth in volume and growth in questions is happening at the same time.

For us right now, our data is structured. I know a lot of companies are working on the unstructured piece. We're in a world where, in the payment and banking systems, the data is relatively structured, and that's what we get, which is great. Our questions are unstructured. They're everything from corporate real-estate types of questions, to loyalty, to just random questions that they've never known the answers to before.

One key thing that we can do for advertisers is, at a minimum, answer two large questions. What is my market share in an area? Typically, advertisers only know when customers come into their store with that transaction. They don't know where that customer goes and, obviously, they don't know when people don’t come into their store.

We have that full 360-degree view of what happens at the customer level, so we can answer, for a geographic area or whatever area that an advertiser wants, what is their market share and how is their market share trending week-to-week.

The other piece is that when we do targeting, there could be somebody who visits a location three times over a certain time period. You don't know whether they're somebody who shops the category 30 times or someone who only shops it three times. We can actually answer share-of-wallet for a customer, and you can use that in targeting, in designing your campaigns, and, more importantly, in analysis: What's going on with these customers?
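[For illustration only: here is a minimal SQL sketch of the two questions Snodgrass describes -- weekly market share and per-customer share-of-wallet -- using hypothetical table and column names (card_transactions, merchant_id, category, region, txn_week, amount, customer_id) rather than Cardlytics' actual schema.]

    -- Weekly market share for one merchant within a category and region
    SELECT txn_week,
           SUM(CASE WHEN merchant_id = 12345 THEN amount ELSE 0 END)
             / NULLIF(SUM(amount), 0) AS market_share
    FROM card_transactions
    WHERE category = 'quick-serve'
      AND region = 'Atlanta metro'
    GROUP BY txn_week
    ORDER BY txn_week;

    -- Share-of-wallet per anonymized customer for the same merchant
    SELECT customer_id,
           SUM(CASE WHEN merchant_id = 12345 THEN amount ELSE 0 END)
             / NULLIF(SUM(amount), 0) AS share_of_wallet
    FROM card_transactions
    WHERE category = 'quick-serve'
    GROUP BY customer_id;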
For us, with Vertica, one of the key components isn't just the speed, but how quick we can scale if the number of queries goes up.

Gardner: So the better job you do, the more queries will be generated.

Snodgrass: It's a self-fulfilling prophecy. For us, with Vertica, one of the key components isn't just the speed, but how quickly we can scale if the number of queries goes up. It's relatively easy to predict what our growth in data volume is going to be. It is not easy for me to predict what the growth in queries is going to be. Again, as advertisers understand what types of questions we can answer, it's unfortunately a ratio of 10 to 1. Once they understand something, there are 10 other questions that come out of it.

We can quickly add nodes and scalability to manage the increase in volumes of queries, and it's cheap. This is not expensive hardware that you have to put in. That is one of the main decision points we had. Most people understand HP Vertica on the speed piece, but that and the quick scalability of the infrastructure were critical for us.
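[For illustration only: the scale-out step Snodgrass alludes to is largely an administrative operation rather than a query. A rough, version-dependent sketch: new hosts are added to the cluster with Vertica's admin tools, and existing data is then redistributed across the enlarged node set. The exact commands and arguments vary by Vertica release.]

    -- After new hosts have been added to the database with the admin tools,
    -- redistribute segmented data across the enlarged node set
    SELECT REBALANCE_CLUSTER('WAIT');

    -- Confirm the node set and state afterward
    SELECT node_name, node_state FROM nodes;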

Gardner: Just as your marketing customers want to be able to predict their spend and the return on investment (ROI) from it, do you sense that you can predict and appreciate, when you scale with HP Vertica what your costs will be? Is there a big question mark or do you have a sense of, I do this and I have to pay that?

Snodgrass: It is the "I do this and I'll have to pay that," the linearness. For those who understand Vertica, that’s a bit of a pun, but the linear relationship is that if we need to scale, all we need to do is this. It's very easy to forecast. I may not know the date for when I need to add something, but I definitely know what the cost will be when we need to add it.

Compare and contrast

Gardner: How do you measure, in addition to that predictability of cost, your benefits? Are there any speeds and feeds that you can share that compare and contrast and might help us better understand how well this works?

Snodgrass: There are two numbers. During the POC phase, we had a set of 10 to 15 different queries that we used as a baseline. We saw anywhere from 500x to 1,000x or 1,500x speedups in getting that data back. So that's the first bullet point.

The second is that there were queries that we just couldn't get to finish. At some point, when you let it go long enough, you just don't know if it is going to converge. With Vertica, we haven't hit that limit yet.

Vertica has also allowed us to have varying degrees of analyst capability when it comes to SQL writing. Some are elegant and write fantastic, very efficient queries. Others are still learning the best way to put queries together. Either way, their queries will still always return with Vertica. In the legacy world prior to Vertica, those are the ones that just wouldn't return.
In a SAS shop, there are a lot of things that you're going to do in SAS that you are not going to do in SQL

I don't know the exact number for how much more productive they are, but the fact that their queries are always returning, and returning in a timely manner, obviously has dramatically increased their productivity. It's a hard one to measure but, never mind how fast the queries return, the productivity of our analysts has gone up dramatically.
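[For illustration only: the gap between an elegant query and a learner's query often comes down to set-based thinking. The two hypothetical queries below answer the same question -- redemptions per campaign -- and the point Snodgrass makes is that a column store such as Vertica will generally return both. Table names are invented for the example.]

    -- A row-at-a-time style: a correlated subquery evaluated per campaign
    SELECT c.campaign_id,
           (SELECT COUNT(*) FROM redemptions r
             WHERE r.campaign_id = c.campaign_id) AS redemption_count
    FROM campaigns c;

    -- A set-based rewrite: one join plus aggregation
    SELECT c.campaign_id, COUNT(r.campaign_id) AS redemption_count
    FROM campaigns c
    LEFT JOIN redemptions r ON r.campaign_id = c.campaign_id
    GROUP BY c.campaign_id;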

Gardner: What could an analytics platform do better for you? What would you like to see coming down the pipeline in terms of features, function, and performance?

Snodgrass: If you can do something in SQL, Vertica is fantastic. We'd like more integration with R, more integration with SAS, more integration with these kinds of sophisticated tools. If you get all the data into their systems, maybe they can manipulate it in a certain way, but then you are managing two systems.

Vertica is working on somewhat better integration with R through distributed R, but there's also SAS. In a SAS shop, there are a lot of things that you're going to do in SAS that you are not going to do in SQL. That next level of analytics integration is where we would love to see the product go.
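[For illustration only: one mechanism Vertica exposes for pushing R logic to the data is the R user-defined extension, where an R function is registered in the database and invoked from SQL. The sketch below uses hypothetical file, factory, function, and table names, and the exact syntax depends on the Vertica version in use.]

    -- Register an R library and expose one of its functions to SQL
    CREATE OR REPLACE LIBRARY r_lib AS '/home/dbadmin/cluster_udx.R' LANGUAGE 'R';
    CREATE OR REPLACE TRANSFORM FUNCTION cluster_customers
      AS LANGUAGE 'R' NAME 'clusterFactory' LIBRARY r_lib;

    -- Invoke the R routine per category, without moving data out of the database
    SELECT cluster_customers(customer_id, visit_count, total_spend)
           OVER (PARTITION BY category)
    FROM customer_features;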

Gardner: Do you expect that there will be different types of data and information that you could bring to bear on this? Perhaps camera or sensor data, point-of-sale information, or mobile and geospatial information? How important is it for you to have a platform that can accommodate seemingly any number of different information types and formats?

Snodgrass: The best way to answer that one is that we don't ever want to tell business development that the reason they can't pursue a path is because we don't have a platform that can support that.

Different paths

Today, I don't know what the future holds along these different paths, but there are so many different paths we can go down. It's not just the Vertica component, but the HP HAVEn components and the fact that they can integrate with a lot of the unstructured data -- I think they call it "the human data versus the machine data."

It's having the human data pathway open to us. We don't want to be the limiting factor for why somebody would want to do something. That's another bullet point for HP Vertica in our camp. If a business model comes out, we can support it.
Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: HP.


Wednesday, November 6, 2013

Efficient big data capabilities help Cerner drive needed improvements into healthcare outcomes

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: HP.

The next edition of the HP Discover Podcast Series delves into how a healthcare solutions provider leverages big-data capabilities. We’ll see how Cerner has deployed the HP Vertica Analytics platform to help their customers better understand healthcare trends, as well as to help them better run their own systems.

To learn more about how high-performing and cost-effective big data processing forms a foundational element to improving healthcare quality and efficiency, join Dan Woicke, Director of Enterprise Systems Management at Cerner Corp. based in Kansas City, Missouri.

The discussion, which took place at the recent HP Vertica Big Data Conference in Boston, is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: We're going through some major transitions in how healthcare payments are going to be made -- and how good care is defined. We're moving from pay for procedures to more pay for outcomes. So tell me about Cerner, and why big data is such a big deal.

Woicke: The key element here is that the payment structure is changing to more of an outcome model. In order for that to happen, we need to get all the sources of data from many, many disparate systems, bring them in, and let our analysts work on what the right trends are and predict quality outcomes, so that you can repeat those and stay profitable in the new system.

My direct responsibility is to bring in massive amounts of performance data on how our Cerner Millennium systems are running.
We have hundreds of clients, both in our data centers and those that manage their own systems with their own database administrators (DBAs). The challenge is just to keep a huge system like that running with tens of thousands of clinicians on it.

We need to make sure that we have the right data in place in order to measure how systems are running and then be able to predict how those systems will run in the future. If things are happening that might be going negative, how can we take the massive amounts of data that are coming into our new analytical platform, correlate those parameters, predict what’s going to happen, and then take action before there is a negative?

Effect change

We want to be able to predict what’s happening, so that we can effect change before there is a negative impact on the system.

Gardner: How does big data and the ability to manage big data get you closer to the real-time and then, ultimately, proactive results your clients need?

Woicke: Since January, we've begun to bring in what we call Response Time Measurement System (RTMS) records. For example, when a doctor or a nurse in our electronic medical record (EMR) system is signing an order, I can tell you how long it took to log into the system. I can tell you how long you were in the charting module.

Woicke
All those transactions produce 10 billion timers, per month, across all of our clients. We bring those all into our HP Vertica Data Warehouse. Right now, it’s about a two-hour response time, but my goal, within the next 12 months, is to get it down to 10 minutes.

I can see in real time when trends are happening, either positive or negative, and be able to take action before there is an issue.
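[For illustration only: a rough sketch of the load-and-watch loop Woicke describes, using hypothetical table, column, and file names rather than Cerner's actual RTMS schema. Timer records are bulk-loaded, then aggregated into short time buckets to watch for emerging trends.]

    -- Bulk-load a batch of incoming timer records
    COPY rtms_timers (client_id, module, event_ts, response_ms)
      FROM '/data/rtms/incoming/batch_0001.csv' DELIMITER ',' DIRECT;

    -- Average response time per module in 10-minute buckets over the last two hours
    SELECT TIME_SLICE(event_ts, 10, 'MINUTE') AS bucket,
           module,
           AVG(response_ms) AS avg_response_ms
    FROM rtms_timers
    WHERE event_ts > NOW() - INTERVAL '2 hours'
    GROUP BY TIME_SLICE(event_ts, 10, 'MINUTE'), module
    ORDER BY bucket DESC, module;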

Gardner: Tell us more about Cerner -- what you do in IT.

Woicke: We run the largest EMR in the world. We have well over 400 of what we call domains to manage, and each domain allows us to hook up multiple facilities. Once multiple facilities are connected to those domains, there are tens of thousands of clinicians on the system at any given time.

We have two data centers in Kansas City, Missouri, and we host more than half of our clients in those data centers. The trend is moving toward being remote-hosted and managed like that. We still have a couple of hundred clients that are managing their own Millennium domains. As I said before, we need to make sure that we provide the same quality of service to both of those sets of clients.

Single database

Cerner Millennium is a suite of products or solutions. Millennium is a platform where the EMR is placed into a single database. Then, we have about 55 different solutions that go on top of that platform, starting with ambulatory solutions. This year was really neat. We were able to launch our first ambulatory iPad application.

There are about 55 different solutions, and it's growing all the time with surgery and lab that fit into the Cerner Millennium system. So we do have a cohesive set of data all within one database, which makes us unique.

Gardner: Where does the data come from primarily, and how much data we are talking about?

Woicke: We're talking about quite a bit of data, and that's why we had to move away from a traditional OLTP database to an MPP-type database, because of all the systems that are now sending data to Cerner.

We have claims data and HL7 messages. We're going to get all our continuous care records from Millennium. We have other EMRs. So this is pretty much the first time that we're bringing in other EMR records.

You'll have claims data that comes in from multiple sources and multiple EMRs, but the whole goal of population health is to get a population to manage its own health. That means we need to give them the tools in their hands. And those tools need to be accurate, so that they can make the right decisions in the future. What that's going to do is bring the total cost of your healthcare down, which is really the goal.
What that's going to do is bring the total cost of your healthcare down, which is really the goal.

We have health-plan enrollments, and then of course, within Millennium, we're going to drill down into outcomes, re-admissions, diagnosis, and allergies. That’s the data that we need to be able to predict what kind of care we are going to have in the future.

Gardner: So it seems to me that we talk about the "Internet of things," and we're also heading toward an "Internet of people." More information from them about their health comes back and benefits you and the healthcare providers. But ultimately, it can also provide great insights to the patients themselves.

Do you see, in the not too distant future, applications where certain data -- well-protected and governed of course -- is made into services and insights that allow for a better proactive approach to health?

Proactive approach

Woicke: Without a doubt. We're actually endorsing this internally within the company by launching our own weight-loss challenges, where we're taking our medical records and putting them on the web, so that we have access to them from home.

I can go on the site right now and manage my own health. I can track the number of steps I'm doing. Those are the types of tools that we need to launch to the population, so that they endorse that good behavior, which will ultimately change their quality of life.

Right now, we're in production with the operations side that we talked about a little bit earlier. Then, we're in production with what we call Health Facts, a huge set of blinded data. We hire a team of analysts and scientists to go through this data and look for trends.
You can see what that’s going to do for the speed of the amount of analysis we could do on the same amount of data. It’s game changing.

It's something we haven't been able to do until recently, until we got HP Vertica. I'll give you a good example. We had analysts write a SQL query to do an exploratory type of analysis on the data. They would issue it at 5 p.m., and hopefully, by the time they came back at 8 a.m. the next day, that query would be done.

In Vertica, we've timed those queries at between two and five seconds. So you can see what that’s going to do for the speed of the amount of analysis we could do on the same amount of data. It’s game changing.

There were a lot of competitors that would have worked out, but we had a set of criteria that we drilled down on. We were trying to make it as scientific as possible and very, very thorough. So we built a score sheet, and each of us from the operation side and Health Facts side graded and weighted each of those categories that we were going to judge during the proof of concept (POC). We ended up doing six POCs.
We got down to two, and it was a hard choice. But with the throughput that we got from Vertica, their performance, and the number of simultaneous users on the system at a given period of time, it was the right choice for us.

Gardner: And because we're talking about healthcare, costs are super important. Was there a return on investment (ROI) or cost benefit involved as well?

Extremely competitive

Woicke: Absolutely. You can imagine that this would be one of the top two categories weighted on our score sheet, but certainly HP Vertica is extremely competitive compared to some of the others that we looked at.

Gardner: Dan, looking to the future, what do you expect your requirements to be, say, two years from now? Is there a trajectory that you need to take as an organization, and how does that compare to where you see Vertica going?

Woicke: Having Vertica as a partner, we navigate that together. They invited me here to Boston to sit on the user board. It was really neat to sit right there with [HP Vertica General Manager] Colin Mahony at the same table and be able to say, "This is what we need. These are our needs coming around the corner," and have him listen and be able to take action on that. That was pretty impressive.

To answer your question, though, it's more and more data. I was describing the operations side, where we bring in 10 billion RTMS records. There's going to be another 10 billion records of other types coming in from other sources -- CPU, memory, disk I/O; everything can be measured.

We want to bring it all into Vertica, because then I'm going to be able to do some correlation against what we were just talking about. If I know that the RTMS records show a negative performance trend coming within the next 10-15 minutes, I can figure out which of those operational parameters is most affecting that performance, and then send an analyst directly in to mitigate the problem.
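[For illustration only: one simple way to surface which operational parameter tracks a response-time trend is a correlation over matching time buckets. The sketch below uses hypothetical tables and columns (rtms_timers, host_metrics, cpu_pct, response_ms) and is not Cerner's actual pipeline.]

    WITH rtms AS (
      SELECT client_id,
             TIME_SLICE(event_ts, 10, 'MINUTE') AS bucket,
             AVG(response_ms) AS avg_response_ms
      FROM rtms_timers
      GROUP BY client_id, TIME_SLICE(event_ts, 10, 'MINUTE')
    ), hosts AS (
      SELECT client_id,
             TIME_SLICE(sample_ts, 10, 'MINUTE') AS bucket,
             AVG(cpu_pct) AS avg_cpu_pct
      FROM host_metrics
      GROUP BY client_id, TIME_SLICE(sample_ts, 10, 'MINUTE')
    )
    -- Which clients show the strongest link between CPU load and response time?
    SELECT r.client_id,
           CORR(h.avg_cpu_pct, r.avg_response_ms) AS cpu_vs_response
    FROM rtms r
    JOIN hosts h ON h.client_id = r.client_id AND h.bucket = r.bucket
    GROUP BY r.client_id
    ORDER BY cpu_vs_response DESC;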
By bringing in more and more data and being able to correlate it, we're going to show all the clients, as well as the providers, how their system is doing.

On the EMR side, it’s more data as well. On the operations side, we're going to apply this to other enterprises to bring in more data to connect to the experts. So there is always somebody out there. That’s the expert. What we're going to do is connect the provider with the payers and the patient to complete that triangle in population health. That’s where we're going in the next few months.

Gardner: I certainly think that managing data effectively is a huge component of our healthcare challenge here in the United States, and of course, you're operating in about 19 countries. So this is something that will be a benefit to almost any market where efficiency, productivity, and quality of care come to bear.

Woicke: At Cerner Corp., we're really big on transparency. We have a system right now called the Lights On Network, where we are taking these parameters and bringing them into a website. We show everything to the client, how they're performing and how the system is doing. By bringing in more and more data and being able to correlate it, we're going to show all the clients, as well as the providers, how their system is doing.
Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: HP.
