Friday, July 12, 2013

The Open Group conference emphasizes healthcare as a key sector for ecosystem-wide interaction improvements

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: The Open Group.

This latest BriefingsDirect discussion, leading into The Open Group Conference on July 15 in Philadelphia, brings together a panel of experts to explore how new IT trends are empowering improvements, specifically in the area of healthcare. We'll learn how healthcare industry organizations are seeking large-scale transformation and some of the paths they're taking to realize it.

We'll see how improved cross-organizational collaboration and such trends as big data and cloud computing are helping to make healthcare more responsive and efficient.

The panel: Jason Uppal, Chief Architect and Acting CEO at clinicalMessage; Larry Schmidt, Chief Technologist at HP for the Health and Life Sciences Industries; and Jim Hietala, Vice President of Security at The Open Group. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

This special BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference, which is focused on enterprise transformation in the finance, government, and healthcare sectors. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL. [Disclosure: The Open Group and HP are sponsors of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: Let’s take a look at this very interesting and dynamic healthcare sector. What, in particular, is so special about healthcare and why do things like enterprise architecture and allowing for better interoperability and communication across organizational boundaries seem to be so relevant here?

Hietala: There’s general acknowledgement in the industry that, inside of healthcare and inside the healthcare ecosystem, information either doesn’t flow well or it only flows at a great cost in terms of custom integration projects and things like that.

Fertile ground

From The Open Group’s perspective, it seems that the healthcare industry and the ecosystem really is fertile ground for bringing to bear some of the enterprise architecture concepts that we work with at The Open Group in order to improve, not only how information flows, but ultimately, how patient care occurs.

Gardner: Larry Schmidt, similar question to you. What are some of the unique challenges that are facing the healthcare community as they try to improve on responsiveness, efficiency, and greater capabilities?

Schmidt: There are several things that have not really kept up with what technology is able to do today.

For example, the whole concept of personal observation comes into play in what we would call "value chains" that exist right now between a patient and a doctor. We look at things like mobile technologies and want to be able to leverage them to provide additional observation of an individual, so that the doctor can make a more complete diagnosis of some sickness or possibly some medication that a person is on.

We want to be able to see that observation in real life, as opposed to having to take that in at the office, which typically winds up happening. I don’t know about everybody else, but every time I go see my doctor, oftentimes I get what’s called white coat syndrome. My blood pressure will go up. But that’s not giving the doctor an accurate reading from the standpoint of providing great observations.

Technology has advanced to the point where we can do that in real time using mobile and other technologies, yet the communication flow, that information flow, doesn't exist today, or is, at best, not easily communicated between doctor and patient.

If you look at the ecosystem, as Jim offered, there are plenty of places that additional collaboration and communication can improve the whole healthcare delivery model.

That’s what we're about. We want to be able to find the places where the technology has advanced, where standards don’t exist today, and just fuel the idea of building common communication methods between those stakeholders and entities, allowing us to then further the flow of good information across the healthcare delivery model.

Gardner: Jason Uppal, let’s think about what, in addition to technology, architecture, and methodologies can bring to bear here? Is there also a lag in terms of process thinking in healthcare, as well as perhaps technology adoption?

Uppal: I'm going to refer to a presentation that I watched from a very well-known surgeon from Harvard, Dr. Atul Gawande. His point was that, in the last 50 years, the medical industry has made great strides in identifying diseases, drugs, procedures, and therapies, but one thing he was alluding to was that medicine forgot about cost, that everything has a cost.

At what price?

Today, in his view, we can cure a lot of diseases and a lot of issues, but at what price? Can anybody actually afford it?

His view is that if healthcare is going to change and improve, it has to come from outside the medical industry. The tools we have today are better, like the collaborative tools that are available for us to use, and those are the ones he was recommending we explore further.

That is where enterprise architecture is a powerful methodology to use and say, "Let’s take a look at it from a holistic point of view of all the stakeholders. See what their information needs are. Get that information to them in real time and let them make the right decisions."

Therefore, there is no reason for health information to be stuck in organizations. It can go wherever the patient and providers are, and let them make the best decision, based on the best practices that are available to them, as opposed to having siloed information.

So enterprise-architecture methods are most suited for developing a very collaborative environment. Dr. Gawande was pointing out that, if healthcare is going to improve, it has to think about it not as medicine, but as healthcare delivery.

Gardner: And it seems that not only are there challenges in terms of technology adoption and even operating more like an efficient business in some ways. We also have very different climates from country to country, jurisdiction to jurisdiction. There are regulations, compliance, and so forth.

Going back to you, Larry, how important of an issue is that? How complex does it get because we have such different approaches to healthcare and insurance from country to country?

Schmidt: There are definitely complexities that occur based on the different insurance models and how healthcare is delivered across and between countries, but some of the basic, fundamental activities involved in delivering healthcare are consistent across countries.

As Jason has offered, enterprise architecture can provide us the means to explore what the art of the possible might be today. It can allow us to see how innovation can occur if we enable better communication flow between the stakeholders within any healthcare delivery model, giving us the opportunity to improve the overall health of the population.

After all, that’s what this is all about. We want to be able to enable a collaborative model throughout the stakeholders to improve the overall health of the population. I think that’s pretty consistent across any country that we might work in.

Ongoing work

Gardner: Jim Hietala, maybe you could help us better understand what’s going on within The Open Group and, even more specifically, at the conference in Philadelphia. There is the Population Health Working Group and there is work towards a vision of enabling the boundaryless information flow between the stakeholders. Any other information and detail you could offer would be great. [Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.]

Hietala: On Tuesday of the conference, we have a healthcare focus day. The keynote that morning will be given by Dr. David Nash, Dean of the Jefferson School of Population Health. He'll give what’s sure to be a pretty interesting presentation, followed by a reactors' panel, where we've invited folks from different stakeholder constituencies.

We're going to have clinicians there. We're going to have some IT folks and some actual patients to give their reaction to Dr. Nash's presentation. We think that will be an interesting and entertaining panel discussion.

For the balance of the day, in terms of the healthcare content, we have a workshop. Larry Schmidt is giving one of the presentations there, and Jason, I, and some other folks from our working group are involved in helping to facilitate and carry out the workshop.

The goal of it is to look into healthcare challenges, desired outcomes, the extended healthcare enterprise, and the extended healthcare IT enterprise, to gather the pain points that are out there around things like interoperability, surface them, and develop a work program coming out of this.

So we expect it to be an interesting day. If you're in the healthcare IT field, or just the healthcare field generally, it would definitely be a day well spent to check it out.

Gardner: Larry, you're going to be talking on Tuesday. Without giving too much away, maybe you can help us understand the emphasis that you're taking, the area that you're going to be exploring.

Schmidt: I've titled the presentation "Remixing Healthcare through Enterprise Architecture." Jason offered some thoughts as to why we want to leverage enterprise architecture to bring discipline to healthcare. My thoughts are that we want to be able to make sure we understand how the collaborative model would work in healthcare, taking into consideration all the constituents and stakeholders that exist within the complete ecosystem of healthcare.

This is not just collaboration across the doctors, patients, and maybe the payers in a healthcare delivery model. This could extend as far as the drug companies, getting them to a point where they can reorder their raw materials to produce new drugs in the case of an epidemic that might be occurring.


Real-time model

It would be a real-time model that allows us the opportunity to understand what's truly happening, both to an individual and to a country or a region within a country, from a healthcare standpoint. This remixing of healthcare through enterprise architecture is the introduction to that concept of leveraging enterprise architecture in this collaborative model.

Then, I would like to talk about some of the technologies that I've had the opportunity to explore and what is available today. I believe we need some type of standardized messaging or collaboration model to allow us to further facilitate the ability of that technology to provide value in healthcare delivery, or the betterment of healthcare, to individuals. I'll talk about that a little bit within my presentation and give some good examples.
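Schmidt doesn't name a particular standard here, but one way to make the idea of standardized observation messaging concrete is a minimal sketch using HL7 FHIR, an existing healthcare messaging standard, to package a home blood-pressure reading. The server URL and patient identifier below are hypothetical; the LOINC codes shown are the standard ones for a blood-pressure panel.

```python
# A minimal sketch of a standardized observation message, using the HL7 FHIR
# "Observation" resource as one real-world example of the kind of messaging
# standard the panel is calling for. FHIR_BASE and the patient id are hypothetical.
import json
import requests

FHIR_BASE = "https://ehr.example.org/fhir"  # hypothetical FHIR server

def send_blood_pressure(patient_id, systolic, diastolic, taken_at):
    """Package a home blood-pressure reading as a FHIR Observation and post it."""
    observation = {
        "resourceType": "Observation",
        "status": "final",
        "category": [{"coding": [{
            "system": "http://terminology.hl7.org/CodeSystem/observation-category",
            "code": "vital-signs"}]}],
        "code": {"coding": [{"system": "http://loinc.org", "code": "85354-9",
                             "display": "Blood pressure panel"}]},
        "subject": {"reference": f"Patient/{patient_id}"},
        "effectiveDateTime": taken_at,
        "component": [
            {"code": {"coding": [{"system": "http://loinc.org", "code": "8480-6",
                                  "display": "Systolic blood pressure"}]},
             "valueQuantity": {"value": systolic, "unit": "mmHg"}},
            {"code": {"coding": [{"system": "http://loinc.org", "code": "8462-4",
                                  "display": "Diastolic blood pressure"}]},
             "valueQuantity": {"value": diastolic, "unit": "mmHg"}},
        ],
    }
    resp = requests.post(f"{FHIR_BASE}/Observation",
                         data=json.dumps(observation),
                         headers={"Content-Type": "application/fhir+json"})
    resp.raise_for_status()
    return resp.json()

# Example (requires a reachable FHIR server):
# send_blood_pressure("patient-42", 128, 84, "2013-07-12T08:30:00Z")
```

Because the message follows a published standard rather than a one-off format, any clinician's system that speaks the same standard can consume the reading without a custom integration project.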

It's really interesting. I just traveled from my company's home base back to my home base, and I thought about something like the body scanner that you get into at the airport. I know we're in the process of eliminating some of those scanners from the airports' security model now, but could that possibly become an element within healthcare delivery? Every time your body is scanned, there's a possibility you can gather information from that and allow it to become part of your electronic medical record.

Hopefully, that was forward thinking, but that kind of thinking is going to play into the art of the possible, with what we are going to be doing, both in this presentation and talking about that as part of the workshop.

Gardner: Larry, we've been having some other discussions with The Open Group around what they call Open Platform 3.0, which is the confluence of big data, mobile, cloud computing, and social.

One of the big issues today is this avalanche of data, the Internet of things, but also the Internet of people. It seems that the more work that's done to bring Open Platform 3.0 benefits to bear on business decisions, the more impactful it could be for sensor readings and other data that come from patients, regardless of where they are, to a medical establishment, regardless of where it is.

So do you think we're really on the cusp of a significant shift in how medicine is actually conducted?

Schmidt: I absolutely believe that. There is a lot of information available today that could be used in helping our population to be healthier. And it really isn't only the challenge of the communication model that we've been speaking about so far. It's also understanding the information that's available to us to take that and make that into knowledge to be applied in order to help improve the health of the population.

As we explore this from an as-is model in enterprise architecture to something that we believe we can first enable through a great collaboration model, through standardized messaging and things like that, I believe we're going to get into even deeper detail around how information can truly provide empowered decisions to physicians and individuals around their healthcare.

So it will carry forward into the big data and analytics challenges that we have talked about and currently are talking about with The Open Group.

Healthcare framework

Gardner: Jason Uppal, we've also seen how in other business sectors, industries have faced transformation and have needed to rely on something like enterprise architecture and a framework like TOGAF in order to manage that process and make it something that's standardized, understood, and repeatable.

It seems to me that healthcare can certainly use that, given the pace of change, but that the impact on healthcare could be quite a bit larger in terms of actual dollars. This is such a large part of the economy that even small incremental improvements can have dramatic effects when it comes to dollars and cents.

So is there a benefit to bringing enterprise architecture to healthcare that is larger and greater than in other sectors because of these economics and issues of scale?

Uppal: That's a great way to think about this. In other industries, such as banking and insurance, the benefits of applying enterprise architecture may be easily measured in terms of dollars and cents, but healthcare is a fundamentally different economy and industry.

It's not about dollars and cents. It's about people’s lives, and loved ones who are sick, who could very easily be treated, if they're caught in time and the right people are around the table at the right time. So this is more about human cost than dollars and cents. Dollars and cents are critical, but human cost is the larger play here.

Secondly, when we think about applying enterprise architecture to healthcare, we're not talking about just the U.S. population. We're talking about global population here. So whatever systems and methods are developed, they have to work for everybody in the world. If the U.S. economy can afford an expensive healthcare delivery, what about the countries that don't have the same kind of resources? Whatever methods and delivery mechanisms you develop have to work for everybody globally.

That's one of the things that a methodology like TOGAF brings out. It says to look at it from every stakeholder's point of view, and unless you have dealt with every stakeholder's concerns, you don't have an architecture; you have a system that's designed for a specific audience.

The cost is not just the 18 percent of U.S. gross domestic product that healthcare represents. It's the human cost, which is many multitudes of that. That's one of the areas where we could really start to think about how we affect that part of the economy, not the 18 percent of it, but the larger part of the economy, to improve the health of the population, not only in North America, but globally.

If that's the case, then the real impact on our greater world economy will come from improving population health, and population health is probably becoming the biggest problem in our economy.

We'll be testing these methods at a greater international level, as opposed to just at an organization and industry level. This is a much larger challenge. A methodology like TOGAF is proven, and it can be stressed and tested to that level. This is a great opportunity for us to apply our tools and science to a problem that is larger than just dollars. It's about humans.

All "experts"

Gardner: Jim Hietala, in some ways, we're all experts on healthcare. When we're sick, we go for help and interact with a variety of different services to maintain our health and to improve our lifestyle. But in being experts, I guess that also means we are witnesses to some of the downside of an unconnected ecosystem of healthcare providers and payers.

One of the things I've noticed in that vein is that I have to deal with different organizations that don't seem to communicate well. If there's no central process organizer, it's really up to me as the patient to pull the lines together between the different services -- tests, clinical observations, diagnosis, going back for test results, sharing the information, and so forth.

Have you done any studies, or do you have anecdotal information, about how that boundaryless information flow would still be relevant, even with more of a centralized repository that all the players could draw on, a sort of shared collaboration resource? I know that's worked in other industries. Is this not a perfect opportunity for that boundarylessness to be managed?

Hietala: I would say it is. We all have experiences with going to see a primary physician, maybe getting sent to a specialist, and getting some tests done, and the boundaryless information that's flowing tends to be on paper, delivered by us as patients, in almost every case.

So the opportunity to improve that situation is pretty obvious to anybody who's been in the healthcare system as a patient. I think it’s a great place to be doing work. There's a lot of money flowing to try and address this problem, at least here in the U.S. with the HITECH Act and some of the government spending around trying to improve healthcare.

You've got healthcare information exchanges that are starting to develop, and you've got lots of pain points for organizations in terms of trying to share information without standards that enable them to do it. It seems like an area with a great opportunity to bring a lot of improvement.

Gardner: Let's look for some examples of where this has been attempted and what the success brings about. I'll throw this out to anyone on the panel. Do you have any examples that you can point to, either named organizations or anecdotal use-case scenarios, of a better organization or an architectural approach, leveraging IT efficiently and effectively, allowing data to flow, and putting in processes that are repeatable, centralized, organized, and understood? How does that work out?

Uppal: I'll give you an example. One of the things that happens when a patient is admitted to the hospital is that they get what's called high-voltage care. There is staff around them 24x7. There are lots of people around, and every specialty that you can think of is available to them. So the patient, in about two or three days, starts to feel much better.

When that patient gets discharged, they get discharged to home most of the time. They go from very high-voltage care to next to no care. This is one of the areas where one of the organizations we work with is able to discharge the patient and, instead of discharging them to the primary care doc, who may not receive any records from the hospital for several days, discharge them into a virtual team. So if the patient is at home, the virtual team is available to them through their mobile phone 24x7.

Connect with provider

If, at 3 o’clock in the morning, the patient doesn't feel right, instead of having to call an ambulance to go to hospital once again and get readmitted, they have a chance to connect with their care provider at that time and say, "This is what the issue is. What do you want me to do next? Is this normal for the medication that I am on, or this is something abnormal that is happening?"

When that information is available to that care provider who may not necessarily have been part of the care team when the patient was in the hospital, that quick readily available information is key for keeping that person at home, as opposed to being readmitted to the hospital.

We all know that the cost of being in a hospital is 10 times more than it is being at home. But there's also inconvenience and human suffering associated with being in a hospital, as opposed to being at home.

Those are some of the examples that we have, but they are very limited, because our current health ecosystem is very organization-specific, not patient- and provider-specific. This is an area where there is huge room for opportunity in healthcare delivery: thinking about health information not in the context of the organization where the patient happens to be, but in a cloud, where it's an association between the patient, the provider, and the health information that's there.

In the past, we used to have email that was within our four walls. All of a sudden, with Gmail and Yahoo Mail, we have email available to us anywhere. A similar thing could happen for the healthcare record. This could be somewhere in a cloud ecosystem, where it's securely protected and used only by people who have been granted access to it.
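To make that patient-and-provider-centric model concrete, here is a minimal, purely illustrative sketch (not any particular vendor's product) of a record that follows the patient and is readable only by providers the patient has authorized. All class and identifier names are assumptions for the example.

```python
# A minimal sketch of patient-granted access to a shared health record:
# the record lives with the patient, and providers see it only after the
# patient grants access. Names here are illustrative, not a real system.
from dataclasses import dataclass, field

@dataclass
class SharedHealthRecord:
    patient_id: str
    entries: list = field(default_factory=list)          # clinical notes, labs, etc.
    granted_providers: set = field(default_factory=set)  # providers the patient authorized

    def grant_access(self, provider_id: str):
        """Patient explicitly authorizes a provider (e.g., the virtual care team)."""
        self.granted_providers.add(provider_id)

    def revoke_access(self, provider_id: str):
        self.granted_providers.discard(provider_id)

    def read(self, requester_id: str):
        """Only the patient or an authorized provider can read the record."""
        if requester_id == self.patient_id or requester_id in self.granted_providers:
            return list(self.entries)
        raise PermissionError(f"{requester_id} has not been granted access")

# Example: a discharged patient shares the record with the on-call virtual team.
record = SharedHealthRecord(patient_id="patient-42")
record.entries.append("Discharge summary: post-operative, on anticoagulants")
record.grant_access("virtual-team-oncall")
print(record.read("virtual-team-oncall"))    # succeeds
record.revoke_access("virtual-team-oncall")  # patient can withdraw consent at any time
```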

Those are some of the examples where extending that model will bring infinite value, not only in reducing the cost, but also in improving the quality of care.

Schmidt: Jason touched upon the home healthcare scenario and being able to provide touch points at home. Another place that we see evolving right now in the industry is the whole concept of mobile offices. Both developing countries, as well as rural places within developed countries, are actually getting rural hospitals and rural healthcare offices dropped in by helicopter to allow the people who live in those communities the opportunity to talk to a doctor via satellite technologies and so on.

The whole concept of an architecture around, and being able to deal with, an extension of what truly ends up being telemedicine is something that we're seeing today. It would be wonderful if we could point to things like standards that allow us to facilitate both the communication protocols, as well as the information flows, in that type of setting.

Many corporations can jump on the bandwagon to help the rural communities get the healthcare information and capabilities that they need via the whole concept of telemedicine.

That's another area where enterprise architecture comes into play. Now that we see examples of that working in the industry today, I am hoping that, as part of this working group, we'll get to the point where we're able to facilitate that much better, enabling innovation to occur for multiple companies via some of the architecture work we are planning on producing.

Single view

Gardner: It seems that we've come a long way on the business side in many industries of getting a single view of the customer, as it’s called, the customer relationship management, big data, spreading the analysis around among different data sources and types. This sounds like a perfect fit for a single view of the patient across their life, across their care spectrum, and then of course involving many different types of organizations. But the government also needs to have a role here.

Jim Hietala, at The Open Group Conference in Philadelphia, you're focusing on not only healthcare, but finance and government. Regarding the government and some of the agencies that you all have as members on some of your panels, how well do they perceive this need for enterprise architecture level abilities to be brought to this healthcare issue?

Hietala: We've seen encouraging signs from folks in government as we bring this work to the forefront. There is a recognition that there needs to be better data flowing throughout the extended healthcare IT ecosystem, and I think generally they are supportive of initiatives like this to make that happen.
Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: The Open Group.



HP-fueled application delivery transformation pays ongoing dividends for McKesson

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: HP.

The next edition of the HP Discover Performance Podcast Series examines how McKesson Corp. accomplished a multi-year, pan-IT management transformation. We’ll learn how McKesson's performance journey, from 2005 to the present, has enabled it to better leverage an agile, hybrid cloud model.

The discussion comes from the recent HP Discover 2013 Conference in Las Vegas.

Andy Smith, Vice President of Applications Hosting Services at McKesson, joins host Dana Gardner, Principal Analyst at Interarbor Solutions, to explore how McKesson adopted a standardized services orientation to gain agility in deploying its many active applications. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: It's hard to believe it's been a full year since we last spoke. What's changed in the last year in how McKesson had been progressing and maturing its applications delivery capabilities?

Smith: Probably one of the things that have changed in the last year is that our performance metrics have continued to improve. We're continuing to see a drop in the number of outages from the standardization and automation. The reliability of the systems has increased, the utilization of the systems has increased, and our system admin ratios have increased. So everything, all the key performance indicators (KPIs) are going in the right direction.

That allowed us to make the next shift, which was to focus on how we can do better at providing capabilities to our customers, and how we do it faster and better through provisioning, because now it's taking less time to do the support side of it.

Gardner: It's really interesting to me that a big part of all this is the provisioning aspect going from fewer manual processes and multiple points of touch to more self-provisioning. How has that worked out?

Smith: It's been very well received. We've been in production now roughly two-and-a-half months. Rather than taking an average of six months to fulfill a business request to add compute capacity, we're down to less than four days. I think we can get it down to less than 10 minutes by the time we hit the end of summer.
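Smith doesn't walk through the mechanics here, but the shift he describes, from custom six-month requests to standardized self-service, can be sketched generically. The catalog items and function below are illustrative assumptions and do not represent HP Cloud Service Automation's actual interfaces.

```python
# A generic sketch of a self-service provisioning flow: a standardized catalog
# item is requested, validated, and fulfilled automatically instead of going
# through a months-long custom request. Illustrative only.
import uuid
from datetime import datetime

CATALOG = {
    # standardized offerings replace one-off custom requests
    "small-linux-vm":  {"vcpus": 2, "ram_gb": 8,  "disk_gb": 100},
    "medium-linux-vm": {"vcpus": 4, "ram_gb": 16, "disk_gb": 250},
}

def provision(offering: str, requester: str) -> dict:
    """Validate the request against the catalog and record the fulfillment."""
    if offering not in CATALOG:
        raise ValueError(f"'{offering}' is not a standard catalog item")
    spec = CATALOG[offering]
    # In a real system this step would hand off to orchestration and server
    # automation tooling; here we just record what would be built.
    return {
        "request_id": str(uuid.uuid4()),
        "requester": requester,
        "offering": offering,
        "spec": spec,
        "fulfilled_at": datetime.utcnow().isoformat() + "Z",
    }

print(provision("small-linux-vm", "line-of-business-app-team"))
```

The point of the sketch is the design choice: because only standardized offerings are requestable, fulfillment can be automated end to end, which is what collapses the lead time.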

Well received

It's been a challenge to get people to think differently about their processes internal to IT that would allow us to do the automation, but it's been very well received.

Gardner: What were some of the hurdles in terms of trying to get standardized and creating that operating procedure that people could rally behind, self provision, and automate?

Smith: The first piece is just a change in culture. We believe we were customer-centric providers of services. What that really translated to was that we were customer-centric customized providers of services. So every request was a custom request. That resulted in slow delivery, but it also resulted in non-standardized solutions.

One of the most difficult things was getting the architects and engineers to think differently and to understand that standardization would actually be better for the customer: we could get it to them faster, more consistently, and more reliably, and, on the back end, provide support much more cheaply. Getting that mind shift was the hard part.

But we were successful. I think everybody still likes to customize, but we haven't had to do that.

Gardner: Just for the edification of our listeners, tell us a bit about McKesson. You’re not just a small mom-and-pop shop.

Smith: No, I think we're Fortune 14 now, with more than $122 billion in revenue and more than 43,500 employees. We focus specifically on healthcare: how to ensure that whatever healthcare organizations need is there when they need it.

That might be software systems that we write for providers. That could be claims processing that we do for providers. But, the biggest chunk of our business is supply chain, ensuring that the supplies, whether they be medical, surgical, or pharmaceutical, are in the hospital's and providers' hands as soon as they need them.

If a line of business needs to make an improvement in order to capture a need of a customer, with the old way of doing business, it would take me six months to get the computer on the floor. Then they could start their development. Now, you're down to days, less than a week. So they can start their development six months earlier, which really helps us be in a position to capture that new market faster. In turn, this also helps McKesson customers deliver critical healthcare solutions more rapidly to meet today's emerging healthcare needs and enable better health.

Gardner: And there are also some other factors in the market. There's even more talk now about cloud than last year, focusing on hybrid capabilities, where you can pick and choose how to deploy your apps. Then, there's the mobile factor.

Smith: We are recognizing that we have to build that next generation of applications. Part of that is the mobility piece of it, because we have to separate the physical application, the software-as-a-service (SaaS) application, from the display device that the customer is going to use. It might be an Android device, an iPhone, a tablet, or something else.

So we're recognizing the fact that for next-generation of product, we really have to separate that mobile portion from it, because that display device could be almost anything.

Gardner: We’re here at HP Discover. How have the HP products and services come together to help you not only tackle these technical issues, but to foster the right culture?

Smith: When we talked last year, we had a lot of the support tools in place from HP -- operations orchestration, server automation, monitoring tools -- but we were using them to do support better. What we're able to do from the provisioning side is leverage that capability and leverage those existing tools.

All we had to do was purchase one additional tool, Cloud Service Automation (CSA), which sits on top of our existing tools. So it was a very minor investment, and we were able to leverage all the support tools to do the provisioning side of the business. It was very practical for us and relatively quick.

Gardner: Of course, a big emphasis here at HP Discover is HP Converged Cloud and talking about these different hybrid models. How have automation, provisioning, services orientation, and standardization put you in a place to be able to avail yourselves of some of these hybrid models and the efficiencies and speed that come with them? How do they tie together -- what you've done with applications now and what you can perhaps do with cloud?

Smith: We'll be the first to admit that providing the services internally is not necessarily always the best. We may not be the cheapest and we may not be the most capable. Getting better at how we do provisioning and how we run our own internal cloud frees up resources, and those resources can now start thinking about how we work with an external provider.

There's a lot of concern for us right now, because there is that risk factor. Do you put your intellectual property (IP) out there? Do you put your patients' medical records out there? How do you protect it? And so there are a lot of business rules and contracting issues that we have to get through.

From a technology standpoint, we know we can do it. We’ve done it in the labs. We’ve provisioned out to third-party providers. It all works from a technology standpoint with the tools we have. Now we have to get through the business issues.

On the same journey

It's fortunate, in some ways, that HP is on the same journey. We partner on a lot of these things. When we brought CSA in, it was one of the earlier releases, and now we’ve partnered with them through the Customer Advisory Boards (CABs) and other methods. They continue to enhance this to meet our needs, but also to meet their needs.

Gardner: Now that you've been on this journey from 2005, where do you see yourselves in a couple of years?

Smith: Because we’re in healthcare, very similar to banking, we've hit a point where we don't believe we can afford to be down anymore.

Instead of talking about three nines, four nines, or five nines, we're starting to talk about how we ensure the machines are never down, even for planned maintenance. That's taking a different kind of infrastructure, but it's also taking a different kind of application that can tolerate machines being taken offline and continue to run.

That's where our eye is, trying to figure out how to change the environment to be constantly on.

If the application isn't smart enough to tolerate a piece of machine going down, then you have to redesign the application architecture. Our applications are going to have to scale out horizontally across the equipment as the peaks and valleys of the customer demands change through the day or through the week.
The current architecture doesn't scale horizontally. It scales up and down. So you end up with a really big box that's not needed at some times of the day. It would be better if we could spread the load out horizontally.
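As a rough illustration of the scale-out model Smith is describing, here is a minimal sketch of a horizontal-scaling decision for identical, stateless instances. The capacity figures, bounds, and load numbers are invented for the example.

```python
# A minimal sketch of scale-out: instead of sizing one big box for the daily
# peak, capacity is added and removed horizontally as demand changes.
import math

def desired_instances(requests_per_sec: float,
                      capacity_per_instance: float = 200.0,
                      min_instances: int = 2,
                      max_instances: int = 20) -> int:
    """Return how many identical, stateless instances should be running."""
    needed = math.ceil(requests_per_sec / capacity_per_instance)
    return max(min_instances, min(max_instances, needed))

# Simulated load over a day: quiet overnight, a peak mid-morning, tapering off.
for hour, load in [(3, 50), (9, 1800), (13, 900), (20, 300)]:
    print(f"{hour:02d}:00  load={load:>4} req/s  instances={desired_instances(load)}")
```

The prerequisite, as Smith notes, is an application architecture that tolerates instances appearing and disappearing; the scaling arithmetic itself is the easy part.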

Gardner: So just to close out, we have to think about applications now in the context of where they are deployed, in a cloud spectrum or continuum of hybrid types of models. We also have to think about them being delivered out to a variety of different endpoints.

Different end points

What do you think you'll need to be doing differently from an application-development, deployment, and standardization perspective in order to accomplish both that ability to deploy anywhere and be high performance, as well as be out on a variety of different endpoints?

Smith: The reality is that part of our journey over the last several years has been to consolidate the environment, consolidate the data centers, and consolidate and virtualize the servers. That's been great from a customer cost standpoint and standardization standpoint.

But now, when you're starting to deliver that SaaS mobile kind of application, speed of response to the customer, the keystroke, the screen refresh, are really important. You can't do that from a central data center. You've got to be able to push some of the applications and data out to regional locations. We’re not going to build those regional locations. It's just not practical.

That's where we see bringing in these hybrid clouds. We'll host the primary app, let's say, back in our corporate data center, but then the mobile piece, the customer experience piece, is going to have to be hosted in data centers that are scattered throughout the country and are physically much closer to where the customer is.

Gardner: Of course, that’s going to require a different level of performance monitoring and management.

Smith: Exactly, because then you really have to monitor the application, not just the server at the back-end. You’ve got to be watching that performance to know whether you have a local ISP that’s come down, if you have got a local cloud that’s come down. You’re going to really have to be watching the endpoints so you can see that customer experience. So it is a different kind of application monitoring.
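A minimal sketch of that kind of endpoint-aware monitoring follows, assuming hypothetical regional health-check URLs and an illustrative latency budget; a production deployment would use dedicated synthetic-monitoring tooling rather than a loop like this.

```python
# A minimal sketch of application-level, endpoint-focused monitoring: probe the
# customer-facing URL for each region and watch response time, not just
# back-end server health. URLs and the latency budget are hypothetical.
import time
import requests

REGIONAL_ENDPOINTS = {
    "us-east": "https://app-east.example.com/health",
    "us-west": "https://app-west.example.com/health",
    "central": "https://app-core.example.com/health",
}
LATENCY_BUDGET_MS = 500  # illustrative screen-refresh budget

def probe(region: str, url: str) -> dict:
    start = time.monotonic()
    try:
        resp = requests.get(url, timeout=5)
        latency_ms = (time.monotonic() - start) * 1000
        ok = resp.status_code == 200 and latency_ms <= LATENCY_BUDGET_MS
        return {"region": region, "ok": ok, "status": resp.status_code,
                "latency_ms": round(latency_ms, 1)}
    except requests.RequestException as exc:
        # A local ISP or regional cloud outage shows up here even if the
        # central data center is perfectly healthy.
        return {"region": region, "ok": False, "error": str(exc)}

for region, url in REGIONAL_ENDPOINTS.items():
    print(probe(region, url))
```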
Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: HP.


Thursday, July 11, 2013

CSC and HP team up to define the new state needed for comprehensive enterprise cybersecurity

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: HP.

This next edition of the HP Discover Performance Podcast Series focuses on how IT leaders are improving security and reducing risks as they adapt to new and often harsh realities of doing business online.

We’re going to learn from a panel how professional services provider CSC, in a strategic partnership with HP, is helping companies and governments better understand and adapt to the tough cybersecurity landscape. 

Our panel consists of co-host Paul Muller, Chief Software Evangelist at HP Software; Dean Weber, Chief Technology Officer, CSC Global Cybersecurity; and Sam Visner, Vice President and General Manager, CSC Global Cybersecurity. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: What is the real scale of the threat here? Are we only just catching up in terms of the public perception of the reality of cyber-insecurity? How different is the reality from the public perception?

Weber: The difference is night and day. The reality is that we are under attack, and have been for quite some time. We are, as Sam likes to put it, facing a weapons-grade threat.

Visner: When I think about the threat, I think about several things happening at once. The first thing is that we’re asking IT, on which we depend, to do more. It's not just emails, collaboration, documents, and spreadsheets. It isn’t even just enterprise systems.

IT for manufacturing

It extends all the way down to the IT that we use for manufacturing, to control power plants, pipelines, airplanes, centrifuges, and medical devices. So, the first thing is that we’re asking IT to do more, and therefore there's more to defend. Secondly, the stakes are higher. It's not just up to us.

Government has said that the cybersecurity of the private sector is of public concern. If you're a regulated public utility for power, water, healthcare, finance, or transportation, your cybersecurity is an issue of public interest. So, this isn’t just the public cybersecurity, it's the cybersecurity of the private sector, which is in the public interest.

Third is the point that Dean made, and I want to elaborate on it. The threat is very different.

Today, intellectual property, whether it's possessed by the public sector or the private sector, is worth something if it's valuable. It's worth something to a bad guy who wants to steal it. And if you have critical infrastructure that you're trying to manage, a bad guy may want to disrupt it, because their government may want to be able to exercise power.

And the threats are different. The threats are not just technically sophisticated. That's something a hacker, a teenager, can do. In addition to being technically sophisticated, they’re operationally sophisticated.

That means this is foreign governments, or in some cases, foreign intelligence services that have the resources and the patience to study a target, a company, or a government agency over a long period of time, use social networking to figure out who has administrative privileges inside of that organization, and use that social networking to identify people whom they may want to subvert and who may help them in introducing malware.

Then, once they have decided what information they want and who safeguards it, they use their technical sophistication to exploit that operational knowledge. This is what differentiates a group of hackers, who may be technically very bright, from an actual nation-state government that has the resources, the discipline, the time, and the patience to stick with a target and to exploit it over a long, long period of time.

So, when we use the term "weapons grade," what we mean is a cyber threat that's hard to detect, that's been wielded by a foreign government, a foreign armed force, or a foreign intelligence service -- the way a foreign government wields a weapon. That's what we’re really facing today in the way of cybersecurity threats.

Muller: You asked if the headlines are simply reflecting what has always been going on, and I think the answer is, yes. Definitely, there is an increased willingness of organizations to share the fact that they have been breached and to share what some of those vulnerabilities have been.

That's actually a healthy thing for society as a whole, rather than pretending that nothing is going on. Reporting the broken window is good for everybody. But the reality is that the sophistication and the scale of attacks, as we have just heard, have gone up, and have gone up quite measurably.

Cost of cybercrime

Every year we conduct a Cost of Cyber Crime Study with the Ponemon Institute. If we look just at the numbers between 2010 and 2012, from the most recent study in October, the cost impact of cyber crime has gone up 50 percent over that period of time. The number of successful attacks has gone up by two times. And the time to resolve an attack has almost doubled as well. So it has become more expensive, greater in scale, and more difficult to solve.

Gardner: What strikes me as being quite different from the past, too, is when businesses encountered risks, even collective risks, they often had a law enforcement or other regulatory agency that would come to their rescue.

But, in reading the most recent The New Yorker, the May 20 issue, in an article titled Network Insecurity by John Seabrook, Richard McFeely, the Executive Assistant Director of the F.B.I, says quite straightforwardly, "We simply don't have the resources to monitor the mammoth quantity of intrusions that are going on out there."

So, enterprises, corporations, governments even can't really wait for the cavalry to come riding in. We’re sort of left to our own devices, or have I got that a little off-base, Dean?

Weber: The government can provide support in talking about threats and providing information about best practices, but overall, the private sector has a responsibility to manage its own infrastructures. The private sector may have to manage those infrastructures consistent with the public interest. That's what regulation means.

But the government is not going to provide cybersecurity for power companies’ power grid or for pharmaceutical companies’ research program. It can insist that there be good cybersecurity, but those organizations have always had to manage their own infrastructures.

Today, however, the threat to those infrastructures and the stakes of losing control of those infrastructures are much higher than they have ever been. That's what's amplified now.

There is also a tradeoff there in terms of how the government shares its threat intelligence. Today, threat intelligence shared at the highest levels generally requires a very, very high level of security, and that puts it out of reach of some organizations to utilize effectively, even if they wanted to.

So as we migrate ourselves into dealing with this enhanced threat environment, we need to also deal with the issues of enhancing the threat intelligence that we use as the basis of decision.

Gardner: Well, we've defined the fact that the means are there and that the incidences are increasing in scale, complexity, and severity. There is profit motive, the state secrets, and intellectual-property motives. Given all of that, what's wrong with the old methods?

Current threat

Weber: Against the current state-of-the-art threat, our ability to detect attacks, as they are coming in or while they are inside, has almost diminished to the point of non-existence. If we're catching them at all, we're catching them on the way out.

We've got to change the paradigm here. We've got to get better at threat intelligence. We've got to get better at event correlation. We've got to get better at the business of cybersecurity. And it has to be a public-private partnership that actually gets us there, because the public has an interest in the private infrastructure that operates its countries. That's not just the U.S.; that's global.

Visner: Let me add a point to that that's germane to the relationship between CSC and HP Software. It's no longer an issue of finding a magic bullet. If I could just keep my antivirus fully updated, I would have the best signatures and I would be protected from the threat. Or if my firewall were adequately updated, I would be well protected.

Today, the threat is changing and the IT environment that we're trying to protect is changing. The threat, in many cases, doesn't have a known signature and is being crafted by nation-states not to have one. Organizations ought to think twice about trying to do this themselves.

Our approach is to use a managed cybersecurity service that uses an infrastructure, a set of security operation centers, and an architecture of tools. That’s the approach we're using. What we're doing with HP Software is using some key pieces of HP Software technology to act as the glue that assembles the cybersecurity information management architecture that we use to manage the cybersecurity for Global 1000 companies and for key government agencies.

Our security operations centers have a set of tools, some of which we've developed and some of which we've sourced from partners, bound together with HP's ArcSight Security Information and Event Management system. This allows us to add new tools as we need to, and to retire old tools when they are no longer useful.

They do a better job of threat correlation and analysis, so that we can help organizations manage that cybersecurity in a dynamic environment, rather than leave them to the game of playing Whac-A-Mole. I've got a new threat. Let me add a new tool. Oh, I've got another new threat. Let me add another new tool. That's opposed to managing the total environment with total visibility.
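To illustrate what that correlation layer does, here is a small, self-contained sketch that groups normalized events by host and flags a pattern no single feed would reveal on its own. It is illustrative only and does not represent ArcSight's actual interfaces or rule language; the event fields and thresholds are assumptions.

```python
# A minimal sketch of cross-source event correlation: events from different
# sensors are normalized and grouped so that a combined pattern (repeated
# failed logins followed by a large outbound transfer from the same host)
# stands out, even though each feed alone looks unremarkable.
from collections import defaultdict

events = [  # normalized events from different sensors
    {"source": "vpn",     "host": "10.1.4.7", "type": "failed_login"},
    {"source": "vpn",     "host": "10.1.4.7", "type": "failed_login"},
    {"source": "vpn",     "host": "10.1.4.7", "type": "failed_login"},
    {"source": "ad",      "host": "10.1.4.7", "type": "login_success"},
    {"source": "netflow", "host": "10.1.4.7", "type": "outbound_transfer", "mb": 850},
    {"source": "netflow", "host": "10.2.9.3", "type": "outbound_transfer", "mb": 12},
]

def correlate(events, failed_threshold=3, transfer_mb_threshold=500):
    """Flag hosts showing repeated failed logins plus a large outbound transfer."""
    by_host = defaultdict(list)
    for ev in events:
        by_host[ev["host"]].append(ev)

    alerts = []
    for host, evs in by_host.items():
        failures = sum(1 for e in evs if e["type"] == "failed_login")
        big_xfer = any(e["type"] == "outbound_transfer"
                       and e.get("mb", 0) >= transfer_mb_threshold for e in evs)
        if failures >= failed_threshold and big_xfer:
            alerts.append({"host": host,
                           "reason": "possible credential abuse plus exfiltration"})
    return alerts

print(correlate(events))  # flags 10.1.4.7 only
```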

So that managed cybersecurity approach is the approach that we're using, and the role of HP Software here is to provide a key technology that is the sort of binder, that is the backbone for much of that architecture that allows us to manage organically, as opposed to a piece at a time.

Customers, who try to manage a piece at a time, invariably get into trouble, because they can't do it. They're always playing catch up with the latest threat and they are always at least one or two steps behind that threat by trying to figure out what is the latest band-aid to stick over the wound.

Increased sophistication

Muller: The sophistication of the adversary has risen, especially if you're in that awkward position -- you're big enough to be interesting to an attacker, particularly one motivated by money, but you're not large enough to have access to up-to-date threat information from the intelligence agencies of your national government.

You're not large enough to be able to afford the sort of sophisticated resources who can dedicate the time to build and maintain honeypots, and to understand and hang out in all the deep, dark corners of the internet that nobody wants to go to.

Those sorts of things are the types of behaviors you need to exhibit to stay ahead of, or at least not fall behind, that threat landscape. By working with an organization that has those sorts of capabilities, by opting for a managed service, you're able to tap into a skill set that's deeper and broader and that often has an international or global outlook, which is particularly important. When the threat is distributed around the planet, your ability to respond to it needs to be distributed likewise.

Gardner: I'm hearing two things. One that this is a team sport. I'm also hearing that this is a function of better analytics -- of really knowing your systems, knowing your organization, monitoring in real time, and then being able to exploit that. Maybe we could drill down on those. This new end-state of a managed holistic security approach, let's talk about it being a team sport and a function of better analytics. Sam?

Visner: There's no question about it. It is a team sport. Fortunately, in the United States and in a few other countries, people recognize that it's a team sport. More and more, the government has said that the cybersecurity of the private sector is an issue of public interest, either to regulation, standards regulation, or policy.

More and more in the private sector, people have realized that they need threat information from the government, but they are also accruing threat information that they need to share with the government and proliferate around their industries.

That has happened, and you can see, coming out of the original Comprehensive National Cybersecurity Initiative of 2006-2007, all the way to the recent executive order from the President of the United States, that this is a team sport. There is no question about that.

At the same time, a lot of companies are now developing tools that have APIs, programming interfaces that allow them to work together. Tools like ArcSight provide an environment that allows you to integrate a lot of different tools.

What's really changing is that global companies like CSC have become a global cybersecurity provider based on the idea that we will do this as a partner. We're not going to just sell a tool to a customer. We're going to be their partner to manage this environment.

More and more, there is a discussion underway about improved information sharing from the government to the private sector, based on intelligence information that might be provided to it, and about the private sector being given more protected means to share information relating to incidents, events, and investigations with the public sector.

Team sport

At the same time, enterprises themselves know that this has to be a team sport within an enterprise. It used to be that the email system was discrete, or your SAP system was discrete, inside of an enterprise. That might have been 10 years ago. But today, these things are part of a common enterprise, and tomorrow they're going to be part of a common enterprise where these things are provided as a service.

And the day after that, they'll be provided as a common enterprise with these things as a service on a common infrastructure that we call a cloud. And the day after that, that cloud will extend all the way down to the manufacturing systems on the shop floor, or the SCADA systems that control a railway, a pipeline, or the industrial control systems that control a medical device or an elevator, all the way out to 3D manufacturing.

The entire enterprise has to work together. The enterprise has to work together with its cybersecurity partner. The cybersecurity partner and the enterprise have to work together with the public sector and with regulatory and policy authorities. Governments increasingly have to work together to build a secured international ecosystem, because there are bad actors out there who don’t regard the theft of intellectual property as cyber crime.

Now fortunately, people get this increasingly and we're working together. That's why we're finding partners who do the managed cybersecurity, and finding partners who can provide key pieces of technology. CSC and HP are an example of two companies working together in differentiated roles, but for a common and desirable outcome.

Three-step process

Weber: So let me think about how we chop this up, Dana. It's a three-step process: see, understand, and act -- at the risk of trivializing the complexity of approaching the problem. Seeing, as Sam has already pointed out, is just trying to get visibility of intent to attack, attacks in progress, or, worst case, attacks that have already taken place, and finally how we manage the exfiltration process.

Understanding is all about trying to unpack the difference between "bragging rights" attacks, what I call high-intensity but low-grade attacks in terms of cyber threat, and the more serious activity. The low-grade stuff is being done to deface the corporate website. Don't get me wrong, it's important, but in the scheme of things, it's a distraction from some of the other activities that are taking place. Understanding is also about shifting or changing your compliance posture for some sort of further action.

Then, the last part is acting. It's not good enough to simply understand what's going on; it's shutting down attacks in progress. It's being able to take proactive steps to address breaches that may exist, and particularly to address breaches in the underlying software.
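At the risk of oversimplifying, the see-understand-act loop Weber outlines can be sketched as three functions. Every rule, category, and event field below is invented purely to make the steps concrete; real detection, triage, and response are far richer.

```python
# A highly simplified sketch of the "see, understand, act" loop.
def see(raw_feeds):
    """Step 1: gain visibility, collecting candidate events from whatever sensors exist."""
    return [event for feed in raw_feeds for event in feed]

def understand(event):
    """Step 2: separate low-grade 'bragging rights' noise from threats that matter."""
    if event.get("target") == "public_website" and event.get("kind") == "defacement":
        return "low_grade"
    if event.get("kind") in ("data_exfiltration", "lateral_movement"):
        return "high_grade"
    return "triage"

def act(event, severity):
    """Step 3: actually do something, rather than stopping at knowing."""
    if severity == "high_grade":
        return f"isolate {event['host']} and open an incident"
    if severity == "low_grade":
        return "log and restore the defaced content"
    return "queue for analyst review"

feeds = [[{"host": "web01", "target": "public_website", "kind": "defacement"}],
         [{"host": "db02", "kind": "data_exfiltration"}]]
for ev in see(feeds):
    print(act(ev, understand(ev)))
```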

We have always been worried about protecting the perimeter of our organization through the technologies, but we continue to ignore one of the great issues out there, which is that software itself, in many cases, is inherently insecure. People are not scanning for, identifying, and addressing those issues as source-code and binary vulnerabilities.

Gardner: What do you have to do in terms of thinking differently in order to start really positioning yourself to be proactive and aggressive with cybersecurity?

Visner: The first thing is that you've got to make an adequate assessment of the kind of organization you are: the role information and information technology play in your organization, what you use the information for, and what information is most valuable. Or conversely, what would cause you the greatest difficulty if you were to lose either control of that information or confidence in its integrity.

That has to be done not just for one piece of an enterprise, but for all pieces of the enterprise. By the way, there is a tremendous benefit, because you can re-visualize your enterprise. You can sort of business-process reengineer your enterprise, if you know what information you rely on, what information is most valuable, and what information, if it were damaged, would cause you the most difficulty.

That’s the first thing I would do. The second thing is, since as-a-service is the way organizations buy things today and the way organizations provide things today, consider taking a look at cybersecurity as a service.

Rather than trying to manage it yourself, get a competent managed cybersecurity services provider, which is our business at CSC, to do this work, and be sure that they are equipped with the right tools and technologies, such as ArcSight Security Information and Event Management and other key technologies that we are sourcing from HP Software.

Third, if you're not willing to have somebody else manage it for you, get a managed cybersecurity services provider to build up your own internal cybersecurity management capabilities, so that you are your own managed cybersecurity services provider.

Next, if you are part of critical infrastructure -- and there are some 23 critical infrastructure sectors -- be sure you understand what it is that you are required to do and what standards the government believes are pertinent to your business.

Understand what information should be shared with you, what information you are obligated to share, and what regulations are relevant to your business, and be sure that those are things you are prepared to do.

Strategic investment

Next, rather than trying to play Whac-A-Mole, having made these decisions, determine that you're going to make a strategic investment. Don't think of security as something added on, or ask what's the least you need to do. Realize that cybersecurity is as organic to your value proposition as R&D is. It's as organic to your value proposition as electricity is. It's as organic to your value proposition as the good people who do the work. The question is not the least you need to do, but which things contribute value.

Cybersecurity doesn’t just protect value, but in many cases, it can be a discriminator that enhances the value of your business, particularly if your business either relies on information, or information is your principal product, as it is today for many businesses in a knowledge economy. Those are things that you can do.

Lastly, you can get comfortable with the fact that this is a septic environment. There will always be risks. There will always be malware. Your job is not to eliminate it. Your job is to function confidently in the midst of it. You can, in fact, get to the point, both intellectually and emotionally, where that’s a possibility.

The fact that we can have an accident doesn't deter us from driving. The fact that we can catch a cold doesn't deter us from going out to dinner or sending our kids to school.

What it does is make sure that we're vaccinated, that we drive well, that we're competent in our dealings with the rest of society, and that we're prudent, but not frightened. Being prudent, but not frightened, is the stance we need to take.
It's as organic to your value proposition as the good people who do the work.


Our brand name is CSC Global Cybersecurity. The term we use is Cyber Confidence. We're not going to make you threat proof, but we will make you competent and confident enough to be able to operate in the presence of these threats, because they are the new norms. Those are the things you can do.

Weber: In addition to what Sam talked about, I'm a huge fan of data classification. Knowing what to protect gives you the opportunity to decide how much protection is necessary for whatever data classification that is.

Whether it's a risk-management framework like FISMA, the IL series of controls from the UK Government, or something similar in Australia, these frameworks are deterministic about the appropriate level of security. Is this public information, in which case all you have to worry about is whether it's damaged and how to recover if and when it is? Or is it critical -- injurious to life, limb, or the pursuit of profits? If it is, then you need to apply all the protections that you can to it.
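As a rough illustration of how such a framework makes protection deterministic, here is a minimal sketch in Python. The classification levels and control sets are invented for illustration; they are not taken from FISMA, the UK IL series, or any other real framework.

```python
# Minimal sketch: map a data classification to the controls a deterministic
# risk-management framework would demand. Level names and control sets are
# illustrative assumptions, not drawn from FISMA or the UK IL series.
REQUIRED_CONTROLS = {
    "public":       {"integrity-monitoring", "backup-and-recovery"},
    "internal":     {"integrity-monitoring", "backup-and-recovery",
                     "access-control"},
    "confidential": {"integrity-monitoring", "backup-and-recovery",
                     "access-control", "encryption-at-rest",
                     "encryption-in-transit"},
    "critical":     {"integrity-monitoring", "backup-and-recovery",
                     "access-control", "encryption-at-rest",
                     "encryption-in-transit", "egress-filtering",
                     "continuous-monitoring"},
}

def controls_for(classification):
    """Return the protections required for a given classification level."""
    if classification not in REQUIRED_CONTROLS:
        raise ValueError(f"unknown classification: {classification!r}")
    return REQUIRED_CONTROLS[classification]

if __name__ == "__main__":
    print(sorted(controls_for("confidential")))
```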

And last but not least, again, as I pointed out earlier, our ability to detect every intrusion is almost nil today. The state of the threat is so far advanced. Basically, they can get in when they want to, where they want to.

They can be in for a very long period of time without detection. I would encourage organizations to beef up their perimeter controls for egress filtering and enclaving, so that they have the ability to manage the data that is actually leaving their networks.
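To make that egress-monitoring idea concrete, the sketch below flags internal hosts whose outbound volume looks out of line with their peers. The flow-record format, addresses, and threshold are assumptions for illustration only; in practice, egress filtering lives in firewalls, proxies, and data-loss-prevention tooling rather than in a script like this.

```python
# Minimal sketch: flag hosts sending far more data out of the network than
# their peers. Flow records, addresses, and the 10x-median threshold are
# illustrative assumptions, not a production egress-filtering design.
from collections import defaultdict
from statistics import median

# Hypothetical flow records: (source_host, destination, bytes_out)
flows = [
    ("10.0.0.5", "203.0.113.9", 120_000),
    ("10.0.0.7", "198.51.100.4", 95_000),
    ("10.0.0.5", "203.0.113.9", 80_000),
    ("10.0.0.9", "192.0.2.77", 4_800_000),   # one unusually large transfer
]

bytes_by_host = defaultdict(int)
for host, _dest, nbytes in flows:
    bytes_by_host[host] += nbytes

baseline = median(bytes_by_host.values())
for host, total in sorted(bytes_by_host.items()):
    if total > 10 * baseline:
        print(f"review outbound traffic from {host}: {total:,} bytes")
```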

Cultural shift

Muller: It's important to be alert, but not alarmed. Don't let security send you into a sense of panic and inaction. Don't hire an organization to help you write a security policy that then just sits on the shelf. A policy is not going to give you security. It's certainly not going to stop any of the bad guys from exfiltrating the information that you have.

I'll say a couple of things. First, it's not like buying an alarm and locks for your organization. Before, physical security was kind of a process you went through; it had a start, a middle, and an end. This is an ongoing process of continually identifying incoming threats and activities from an adversary that is monetized and has a lot to gain from their success.

It's an ongoing process. As a result, as we said earlier today, security is a team sport. Find a friend who does it really well and is prepared to invest in an ongoing manner to make sure that they're able to stay there.

I'd concur with Dean's point as well. Ultimately, it's about the exfiltration of your data. Put in place processes that help you understand the information that is leaving your organization and take steps to mitigate that as quickly as possible. Those are my highest priorities.
This is an ongoing process of continually identifying incoming threats and activities from an adversary that is monetized and has a lot to gain from their success.

I'd also add that if you're having trouble identifying some of the benefits for your organization, or even having trouble getting a threat assessment prioritized in your organization, have a look at the Cost of Cyber Crime Study that we've conducted around the globe -- in the United Kingdom, Germany, Australia, Japan and, of course, the US. The latest was the third in the series, and we now run it annually. You can go to hpenterprisesecurity.com to get a copy of that report and hopefully shift a few of the more intransigent people in your organization to action.
Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: HP.

You may also be interested in:

Tuesday, July 9, 2013

Want a data-driven business culture? Start sorting out the BI and big data myths now

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: Dell Software.

Debunking myths around big data should be a first step to making better business decisions for improving data analysis and data management capabilities in your company.

As the volume and purpose of data and business intelligence (BI) have dramatically shifted, older notions and misconceptions -- what amount to myths about data infrastructure -- need to be updated and corrected, too.

So we're here to pose some better questions about data, and provide up-to-date answers for running data-driven businesses that can efficiently and repeatedly predict dynamic market trends and customer wants in real time.

As the volume and types of data brought to bear on business analytics advance, the means to manage and exploit that sea of data must be neither too costly nor too complex for mid-size companies to master. There are better ways than traditional data architectures.

To help identify what works best around modern big data management, BriefingsDirect interviews Darin Bartik, Executive Director of Products in the Information Management Group at Dell Software. The discussion is conducted by Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: Dell is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: Are people losing sight of the business value by getting lost in speeds and feeds and technical jargon around big data? Is there some sort of a disconnect between the providers and consumers of big data?

Bartik: You hit the nail on the head with the first question.  We are experiencing a disconnect between the technical side of big data and the business value of big data, and that’s happening because we’re digging too deeply into the technology.

With a term like big data, or any one of the trends that the information technology industry talks about so much, we tend to think about the technical side of it. But with analytics, with the whole conversation around big data -- what we've been stressing with many of our customers -- is that it starts with a business discussion. It starts with the questions that you're trying to answer about the business; not the technology, the tools, or the architecture of solving those problems. It has to start with the business discussion.

That’s a pretty big flip. The traditional approach to BI and reporting has been one of technology frameworks, and a lot of things that were owned more by the IT group. This is part of the reason why a lot of the BI projects of the past struggled, because there was a disconnect between the business goals and the IT methods.

So you're right. There has been a disconnect, and that’s what I've been trying to talk a lot about with customers -- how to refocus on the business issues you need to think about, especially in the mid-market, where you maybe don’t have as many resources at hand. It can be pretty confusing.

I've been a part of Dell Software since the acquisition of Quest Software. I was a part of that organization for close to 10 years. I've been in technology coming up on 20 years now. I spent a lot of time in enterprise resource planning (ERP), supply chain, and monitoring, performance management, and infrastructure management, especially on the Microsoft side of the world.

Most recently, as part of Quest, I was running the database management area -- a business very well-known for its products around Oracle, especially Toad, as well as our SQL Server management capabilities. We leveraged that expertise when we started to evolve into BI and analytics.

I started working with Hadoop back in 2008-2009, when it was still very foreign to most people. When Dell acquired Quest, I came in and had the opportunity to take over the Products Group in the ever-expanding world of information management. We're part of the Dell Software Group, which is a big piece of the strategy for Dell over all, and I'm excited to be here.

Part of the hype cycle

Without disparaging the vendors like us, or anyone else, the current confusion is part of the problem of any hype cycle. Many people jumped on the bandwagon of big data. Just like everyone was talking cloud. Everyone was talking virtualization, bring your own device (BYOD), and so forth.

Everyone jumps on these big trends. So it's very confusing for customers, because there are many different ways to come at the problem. This is why I keep bringing people back to staying focused on what the real opportunity is. It’s a business opportunity, not a technical problem or a technical challenge that we start with.
It’s not a size issue. It's really a trend that has happened as a result of digitizing so much more of the information that we all have already.

Gardner: Even the name "big data" stirs up myths right from the get-go, with "big" being a very relative term. Should we only be concerned about this when we have more data than we can manage? What is the relative position of big data and what are some of the myths around the size issue?

Bartik: That's the perfect one to start with. The first word in the term is actually part of the problem: "big." What does big mean? Is there a certain threshold of petabytes that you have to get to? Or, if you're dealing with petabytes, is it not a problem until you get to exabytes?

It’s not a size issue. When I think about big data, it's really a trend that has happened as a result of digitizing so much more of the information that we all have already and that we all produce. Machine data, sensor data, all the social media activities, and mobile devices are all contributing to the proliferation of data.

It's added a lot more data to our universe, but the real opportunity is to look for small elements of small datasets and look for combinations and patterns within the data that help answer those business questions that I was referencing earlier.

It's not necessarily a scale issue. What is a scale issue is when you get into some of the more complicated analytical processes and you need a certain data volume to make it statistically relevant. But what customers first want to think about is the business problems that they have. Then, they have to think about the datasets that they need in order to address those problems.

Big-data challenge

That may not be huge data volumes. You mentioned mid-market earlier. When we think about some organizations moving from gigabytes to terabytes, or doubling data volumes, that’s a big data challenge in and of itself.

Analyzing big data won't necessarily contribute to solving your business problems if you're not starting with the right questions. If you're just trying to store more data, that's not really the problem at hand. That's something we can all do quite well with current storage architectures and the evolving landscape of hardware that we have.

We all know that we have growing data, but the exact size, the exact threshold that we may cross, that’s not the relevant issue.

Gardner: I suppose this requires prioritization, which has to come from the business side of the house. As you point out, some statistically relevant data might be enough. If you can extrapolate and you have enough to do that, fine. But there might be other areas where you actually want to get every little bit of possibly relevant data or information, because you don't know what you're looking for. Those are the unknown unknowns. Perhaps there's some mythology about all data. It seems to me that what's important is the right data to accomplish what it is the business wants.

Bartik: Absolutely. If your business challenge is an operational efficiency or a cost problem, where you have too much cost in the business and you're trying to pull out operational expense and not spend as much on capital expense, you can look at your operational data.
There's a lot of variability and prioritization that all starts with that business issue that you're trying to address.

Maybe manufacturers are able to do that and analyze all of the sensor, machine, manufacturing line, and operational data. That's a very different type of data and a very different type of approach than looking at it in terms of sales and marketing.

If you're a retailer looking for a new set of customers or new markets to enter in terms of geographies, you're going to want to look at maybe census data and buying-behavior data of the different geographies. Maybe you want datasets that are outside your organization entirely. You may not have the data in your hands today. You may have to pull it in from outside resources. So there's a lot of variability and prioritization that all starts with that business issue that you're trying to address.

Gardner: Perhaps it's better for the business to identify the important data, rather than the IT people saying it’s too big or that big means we need to do something different. It seems like a business term rather than a tech term at this point.

Bartik: I agree with you. The more we can focus on bringing business and IT to the table together to tackle this challenge, the better. And it does start with the executive management in the organization trying to think about things from that business perspective, rather than starting with the IT infrastructure management team. 

Gardner: What’s our second myth?

Bartik: I'd think about the idea of people and the skills needed to address this concept of big data. There is the term "data scientist" that has been thrown out all over the place lately. There’s a lot of discussion about how you need a data scientist to tackle big data. But “big data” isn't necessarily the way you should think about what you’re trying to accomplish. Instead, think about things in terms of being more data driven, and in terms of getting the data you need to address the business challenges that you have. That’s not always going to require the skills of a data scientist.

Data scientists rare

I suspect that a lot of organizations would be happy to hear something like that, because data scientists are very rare today, and they're very expensive, because they are rare. Only certain geographies and certain industries have groomed the true data scientist. That's a unique blend between a data engineer and someone like an applied scientist, who can think quite differently than just a traditional BI developer or BI programmer.

Don’t get stuck on thinking that, in order to take on a data-driven approach, you have to go out and hire a data scientist. There are other ways to tackle it. That’s where you're going to combine people who can do the programming around your information, around the data management principles, and the people who can ask and answer the open-minded business questions. It doesn’t all have to be encapsulated into that one magical person that’s known now as the data scientist.

There are varying degrees of tackling this problem. You can get into very sophisticated algorithms and computations for which a data scientist may be the one to do that heavy lifting. But for many organizations and customers that we talk to everyday, it’s something where they're taking on their first project and they are just starting to figure out how to address this opportunity.
For that, you can use a lot of the people you already have inside your organization, as well as, potentially, consultants who can help you break through some of the old barriers, such as thinking about intelligence strictly in terms of a report and a structured dashboard format.
Often a combination of programming and some open-minded thinking, done with a team-oriented approach, rather than that single keyhole person, is more than enough to accomplish your objectives.

That's not the type of approach we want to take nowadays. So often a combination of programming and some open-minded thinking, done with a team-oriented approach, rather than that single keyhole person, is more than enough to accomplish your objectives.

Gardner: It seems also that you're identifying confusion on the part of some to equate big data with BI and BI with big data. The data is a resource that the BI can use to offer certain values, but big data can be applied to doing a variety of other things. Perhaps we need to have a sub-debunking within this myth, and that is that big data and BI are different. How would you define them and separate them?

Bartik: That's a common myth. If you think about BI in its traditional, generic sense, it’s about gaining more intelligence about the business, which is still the primary benefit of the opportunity this trend of big data presents to us. Today, I think they're distinct, but over time, they will come together and become synonymous.

I equate it back to one of the more recent trends that came right before big data: cloud. In the beginning, most people thought cloud meant the public-cloud concept. What's turned out to be true is that it's more of a private cloud or a hybrid cloud, where not everything moved from a traditional on-premises model to a highly scalable, highly elastic public cloud. It's very much a mix.

They've kind of come together. So while cloud and traditional data centers are the new infrastructure, it’s all still infrastructure. The same is true for big data and BI, where BI, in the general sense of how can we gain intelligence and make smarter decisions about our business, will include the concept of big data.

Better decisions

So while we'll be using new technologies, which would include Hadoop, predictive analytics, and other things that have been driven so much faster by the trend of big data, we’ll still be working back to that general purpose of making better decisions.

One of the reasons they're still different today is because we’re still breaking some of the traditional mythology and beliefs around BI -- that BI is all about standard reports and standard dashboards, driven by IT. But over time, as people think about business questions first, instead of thinking about standard reports and standard dashboards first, you’ll see that convergence.

Gardner: We probably need to start thinking about BI in terms of a wider audience, because all the studies I've seen don't show all that much confidence and satisfaction in the way BI delivers the analytics or the insights that people are looking for. So I suppose it's a work in progress when it comes to BI as well.

Bartik: Two points on that. There has been a lot of disappointment around BI projects in the past. They've taken too long, for one. They've never really been finished, which of course, is a problem. And for many of the business users who depend on the output of BI -- their reports, their dashboard, their access to data -- it hasn’t answered the questions in the way that they may want it to.

One of the things in front of us today is a way of thinking about it differently. Not only is there so much data, and so much opportunity now to look at that data in different ways, but there is also a requirement to look at it faster and to make decisions faster. So it really does break the old way of thinking.
People are trying to make decisions about moving the business forward, and they're being forced to do it faster.

Slowness is unacceptable. Standard reports don't come close to addressing the opportunity in front of us, which is to ask a business question and answer it with a new way of thinking, supported by pulling together different datasets. That's fundamentally different from the way we used to do it.

People are trying to make decisions about moving the business forward, and they're being forced to do it faster. Historical reporting just doesn't cut it. It’s not enough. They need something that’s much closer to real time. It’s more important to think about open-ended questions, rather than just say, "What revenue did I make last month, and what products made that up?" There are new opportunities to go beyond that.

Gardner: When it comes to these technology issues, do you also find, Darin, that there is a lack of creativity as to where the data and information resides or exists, and thinking not so much about being able to run it, but rather how to acquire it? Is there a dissonance between the data I have and the data I need? How are people addressing that?

Bartik: There is and there isn’t. When we look at the data that we have, that’s oftentimes a great way to start a project like this, because you can get going faster and it’s data that you understand. But if you think that you have to get data from outside the organization, or you have to get new datasets in order to answer the question that’s in front of us, then, again, you're going in with a predisposition to a myth.

You can start with data that you already have. You just may not have been looking at the data that you already have in the way that’s required to answer the question in front of you. Or you may not have been looking at it all. You may have just been storing it, but not doing anything with it.
Storing data doesn’t help you answer questions. Analyzing it does.

Storing data doesn’t help you answer questions. Analyzing it does. It seems kind of simple, but so many people think that big data is a storage problem. I would argue it's not about the storage. It’s like backup and recovery. Backing up data is not that important, until you need to recover it. Recovery is really the game changing thing.

Gardner: It's interesting that with these myths, people have tended, over the years, without having the resources at hand, to shoot from the hip and second-guess. People who are good at that, and businesses that have been successful, have depended on some luck and intuition. In order to take advantage of big data -- which should lead you to not having to make educated guesses, but to having really clear evidence -- you can apply the same principle. It's more about how you get big data in place than about how you would use the fruits of big data.

It seems like a cultural shift we have to make. Let’s not jump to conclusions. Let’s get the right information and find out where the data takes us.

Bartik: You've hit on one of the biggest things that’s in front of us over the next three to five years -- the cultural shift that the big data concept introduces.

We looked at traditional BI as more of an IT function, where we were reporting back to the business. The business told us exactly what they wanted, and we tried to give that to them from the IT side of the fence.

Data-driven organization

But being successful today is less about intuition and more about being a data-driven organization, and, for that to happen -- I can't stress this one enough -- you need executives who are ready to make decisions based on data, even if the data may be counterintuitive to what their gut says and what their 25 years of experience have told them.

They're in a position of being an executive primarily because they have a lot of experience and have had a lot of success. But many of our markets are changing so frequently and so fast, because of new customer patterns and behaviors, and because of new ways of customers interacting with us via different devices. Just think of the different ways that markets are changing. So much of that historical precedent no longer really matters. You have to look at the data that's in front of us.

Because things are moving so much faster now, new markets are being penetrated and new regions are open to us. We're so much more of a global economy. Things move so much faster than they used to. If you're depending on gut feeling, you'll be wrong more often than you'll be right. You do have to depend on as much of a data-driven decision as you can. The only way to do that is to rethink the way you're using data.

Historical reports that tell you what happened 30 days ago don't help you make a decision about what's coming out next month, given that your competition just introduced a new product today. It's just a different mindset. So that cultural shift of being data-driven and going out and using data to answer questions, rather than using data to support your gut feeling, is a very big shift that many organizations are going to have to adapt to.

Executives who get that and drive it down into the organization, those are the executives and the teams that will succeed with big data initiatives, as opposed to those that have to do it from the bottom up.
It's fair to say that big data is not just a trend; it's a reality. And it's an opportunity for most organizations that want to take advantage of it.

Gardner: Listening to you Darin, I can tell one thing that isn’t a product of hype is just how important this all is. Getting big data right, doing that cultural shift, recognizing trends based on the evidence and in real-time as much as possible is really fundamental to how well many businesses will succeed or not.

So it's not hype to say that big data is going to be a part of your future and it's important. Let's move toward how you would start to implement, change, or rethink things, so that you don't fall prey to these myths, but actually take advantage of the technologies, the reduction in costs for much of the infrastructure, and perhaps extend and exploit BI and big data.

Bartik: It's fair to say that big data is not just a trend; it's a reality. And it's an opportunity for most organizations that want to take advantage of it. It will be a part of your future. It's either going to be part of your future, or it's going to be a part of your competition’s future, and you're going to be struggling as a result of not taking advantage of it.

The first step that I would recommend -- I've said it a few times already, but it can't be said too often -- is to pick a project that's going to address a business issue that you've been unable to address in the past.

Ask, "What are the questions that you need to ask and answer about your business that will really move you forward?" Not just, "What data do we want to look at?" That's not the question.

What business issue?

The question is what business issue do we have in front of us that will take us forward the fastest? Is it reducing costs? Is it penetrating a new regional market? Is it penetrating a new vertical industry, or evolving into a new customer set?

These are the kinds of questions we need to ask and the dialogue that we need to have. Then let's take the next step, which is getting the data and thinking about the team to analyze it and the technologies to deploy. But that's the first step -- deciding what we want to do as a business.

That sets you up for that cultural shift as well. If you start at the technology layer, if you start at the level of let's deploy Hadoop or some type of new technology that may be relevant to the equation, you're starting backwards. Many people do it, because it's easier to do that than it is to start an executive conversation and to start down the path of changing some cultural behavior. But it doesn’t necessarily set you up for success.

Gardner: It sounds as if you know you're going on a road trip and you get yourself a Ferrari, but you haven't really decided where you're going to go yet, so you didn’t know that you actually needed a Ferrari.

Bartik: Yeah. And it's not easy to get a tent inside a Ferrari. So you have to decide where you're going first. It's a very good analogy.
Get smart by going to your peers and going to your industry influencer groups and learning more about how to approach this.

Gardner: What are some of the other ways when it comes to the landscape out there? There are vendors who claim to have it all, everything you need for this sort of thing. It strikes me that this is more of an early period and that you would want to look at a best-of-breed approach or an ecosystem approach.

So are there any words of wisdom in terms of how to think about the assets, tools, approaches, platforms, what have you, or not to limit yourself in a certain way?

Bartik: There are countless vendors that are talking about big data and offering different technology approaches today. Based on the type of questions that you're trying to answer, whether it's more of an operational issue, a sales market issue, HR, or something else, there are going to be different directions that you can go in, in terms of the approaches and the technologies used.

I encourage the executives, both on the line-of-business side as well as the IT side, to go to some of the events that are the "un-conferences," where we talk about the big-data approach and the technologies. Go to the other events in your industry where they're talking about this and learn what your peers are doing. Learn from some of the mistakes that they've been making or some of the successes that they've been having.

There's a lot of success happening around this trend. Some people certainly are falling into the pitfalls, but get smart by going to your peers and going to your industry influencer groups and learning more about how to approach this.

Technical approaches

There are technical approaches that you can take. There are different ways of storing your data. There are different ways of computing and processing your data. Then, of course, there are different analytical approaches that get more to the open-ended investigation of data. There are many tools and many products out there that can help you do that.

Dell has certainly gone down this road and is investing quite heavily in this area, with both structured and unstructured data analysis, as well as the storage of that data. We're happy to engage in those conversations as well, but there are a lot of resources out there that really help companies understand and figure out how to attack this problem.

Gardner: In the past, with many of the technology shifts, we've seen a tension and a need for decision around best-of-breed versus black box, or open versus entirely turnkey, and I'm sure that's going to continue for some time.

But one of the easier ways or best ways to understand how to approach some of those issues is through some examples. Do we have any use cases or examples that you're aware of, of actual organizations that have had some of these problems? What have they put in place, and what has worked for them?
There are a lot of resources out there that really help companies understand and figure out how to attack this problem.

Bartik: I'll give you a couple of examples from two very different types of organizations, neither of which are huge organizations. The first one is a retail organization, Guess Jeans. The business issue they were tackling was, “How do we get more sales in our retail stores? How do we get each individual that's coming into our store to purchase more?”

We sat down and started thinking about the problem. We asked what data would we need to understand what’s happening? We needed data that helps us understand the buyer’s behavior once they come into the store. We don't need data about what they are doing outside the store necessarily, so let's look specifically at behaviors that take place once they get into the store.

We helped them capture and analyze video-monitoring information. Basically, it followed each person in the store and their geospatial location inside the store, based on their behavior. We tracked that data and then compared it against questions like: did they buy, what did they buy, and how much did they buy? We were able to help them determine that if you get a customer into a dressing room, you're going to be about 50 percent more likely to close a transaction with them.

So rather than trying to give incentives to come into the store or give discounts once they get into the store, they moved towards helping the store clerks, the people who ran the store and interacted with the customers, focus on getting those customers into a dressing room. That itself is a very different answer than what they might have thought of at first. It seems easy after you think about it, but it really did make a significant business impact for them in rather short order.
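As a rough sketch of the kind of comparison described above -- with visitor records and numbers invented for illustration, since the actual analysis ran on video and geospatial tracking feeds -- conversion can be compared across shoppers who did and did not enter a dressing room:

```python
# Minimal sketch: compare purchase conversion for store visitors who did and
# did not enter a dressing room. Records and field names are invented for
# illustration; the real analysis used video and geospatial tracking data.
visits = [
    {"visitor": 1, "entered_dressing_room": True,  "purchased": True},
    {"visitor": 2, "entered_dressing_room": True,  "purchased": False},
    {"visitor": 3, "entered_dressing_room": False, "purchased": False},
    {"visitor": 4, "entered_dressing_room": False, "purchased": True},
    {"visitor": 5, "entered_dressing_room": True,  "purchased": True},
    {"visitor": 6, "entered_dressing_room": False, "purchased": False},
]

def conversion(records, used_room):
    group = [r for r in records if r["entered_dressing_room"] == used_room]
    return sum(r["purchased"] for r in group) / len(group)

with_room = conversion(visits, True)
without_room = conversion(visits, False)
print(f"conversion with dressing room:    {with_room:.0%}")
print(f"conversion without dressing room: {without_room:.0%}")
print(f"relative lift: {with_room / without_room - 1:.0%}")
```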

Now, they're also thinking about other business challenges that they have and other ways of analyzing data and other datasets, based on different business challenges, but that’s one example.

Another example is on the higher-education side. In universities, one of the biggest challenges is students dropping out or reducing their class load. The fewer classes they take -- or if they drop out entirely -- it obviously goes right to the top and bottom line of the organization, because it reduces tuition, as well as the other ancillary expenses that students incur at the university.

Finding indicators

The University of Kentucky undertook an effort to reduce students dropping out of classes or dropping out of school entirely. They looked at a series of datasets, such as demographic data, class data, the grades students were receiving, what their attendance rates were, and so forth. They analyzed many different data points to determine the indicators of a future dropout.

Now, just raising the student retention rate by one percent would in turn mean about $1 million of top-line revenue to the university. So this was pretty important. And in the end, they were able to narrow it down to a couple of variables that strongly indicated which students were at risk, such that they could then proactively intervene with those students to help them succeed.
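A minimal sketch of that kind of indicator analysis appears below. The fields, values, and coefficients are invented for illustration -- the actual University of Kentucky work drew on its own demographic, grade, and attendance datasets -- but it shows the general shape: fit a simple model, see which variables carry the signal, and do the back-of-the-envelope revenue arithmetic.

```python
# Minimal sketch: fit a simple drop-out risk model and inspect which
# indicators matter. Fields and values are invented for illustration; the
# real analysis used the university's own demographic and academic data.
from sklearn.linear_model import LogisticRegression

# columns: attendance_rate, current_gpa, credit_hours
X = [
    [0.95, 3.6, 15], [0.60, 2.1, 9],  [0.88, 3.1, 12],
    [0.40, 1.8, 6],  [0.92, 3.4, 15], [0.55, 2.4, 9],
    [0.85, 2.9, 12], [0.35, 1.5, 6],
]
y = [0, 1, 0, 1, 0, 1, 0, 1]   # 1 = dropped out

model = LogisticRegression(max_iter=1000).fit(X, y)
for name, coef in zip(["attendance", "gpa", "credit_hours"], model.coef_[0]):
    print(f"{name:>13}: {coef:+.2f}")

# Back-of-the-envelope arithmetic from the example: if one point of
# retention is worth roughly $1 million in top-line revenue, an
# intervention that lifts retention by two points is worth about $2 million.
retention_gain_points = 2
print(f"estimated revenue impact: ${retention_gain_points * 1_000_000:,}")
```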

The key is that they started with a very specific problem. They started it from the university's core mission: to make sure that the students stayed in school and got the best education, and that's what they are trying to do with their initiative. It turned out well for them.

These were very different organizations and business types, in two very different verticals, and again, neither is a huge organization with seas of data. But what they did offers much more manageable and tangible examples that many of us can apply to our own businesses.

Gardner: Those really demonstrate how asking the right questions is so important.
What we have today is a set of capabilities that help customers take more of a data-type agnostic view and a vendor agnostic view to the way they're approaching data and managing data.

Darin, we're almost out of time, but I did want to see if we could develop a little bit more insight into the Dell Software road map. Are there some directions that you can discuss that would indicate how organizations can better approach these problems and develop some of these innovative insights in business?

Bartik: A couple of things. We've been in the business of data management, database management, and managing the infrastructure around data for well over a decade. Dell has assembled a group of companies, as well as a lot of organic development, based on their expertise in the data center for years. What we have today is a set of capabilities that help customers take more of a data-type agnostic view and a vendor agnostic view to the way they're approaching data and managing data.

You may have 15 tools around BI. You may have tools to look at your Oracle data, maybe new sets of unstructured data, and so forth. And you have different infrastructure environments set up to house that data and manage it. But the problem is that it's not helping you bring the data together and cross boundaries across data types and vendor toolset types, and that's the challenge that we're trying to help address.

We've introduced tools to help bring data together from any database, regardless of where it may be sitting, whether it's a data warehouse, a traditional database, a new type of database such as Hadoop, or some other type of unstructured data store.

We want to bring that data together and then analyze it. Whether you're looking at more of a traditional structured-data approach and you're exploring data and visualizing datasets that many people may be working with, or doing some of the more advanced things around unstructured data and looking for patterns, we’re focused on giving you the ability to pull data from anywhere.

Using new technologies

We're investing very heavily, Dana, into the Hadoop framework to help customers do a couple of key things. One is helping the people that own data today, the database administrators, data analysts, the people that are the stewards of data inside of IT, advance their skills to start using some of these new technologies, including Hadoop.

It's been something that we have done for a very long time, making your C players B players, and your B players A players. We want to continue to do that, leverage their existing experience with structured data, and move them over into the unstructured data world as well.

The other thing is that we're helping customers manage data in a much more pragmatic way. So if they are starting to use data that is in the cloud, via Salesforce.com or Taleo, but they also have data on-prem sitting in traditional data stores, how do we integrate that data without completely changing their infrastructure requirements? With capabilities that Dell Software has today, we can help integrate data no matter where it sits and then analyze it based on that business problem.
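As a hedged illustration of that integration step -- the REST endpoint, field names, and table schema below are hypothetical placeholders, not Dell, Salesforce, or Taleo APIs -- cloud records can be pulled over HTTP and joined with rows from an on-premises store:

```python
# Minimal sketch: join records from a (hypothetical) cloud REST endpoint with
# rows in an on-premises SQLite table. The URL, fields, and schema are
# placeholders, not a real Salesforce or Taleo integration.
import sqlite3
import requests

CLOUD_URL = "https://example.com/api/customers"   # hypothetical endpoint

def fetch_cloud_customers():
    response = requests.get(CLOUD_URL, timeout=10)
    response.raise_for_status()
    # assume the endpoint returns a list of {"id": ..., "segment": ...}
    return response.json()

def fetch_onprem_orders(db_path="orders.db"):
    with sqlite3.connect(db_path) as conn:
        return conn.execute("SELECT customer_id, total FROM orders").fetchall()

def revenue_by_segment():
    segments = {c["id"]: c["segment"] for c in fetch_cloud_customers()}
    revenue = {}
    for customer_id, total in fetch_onprem_orders():
        segment = segments.get(customer_id, "unknown")
        revenue[segment] = revenue.get(segment, 0) + total
    return revenue

if __name__ == "__main__":
    print(revenue_by_segment())
```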

We help customers approach it from a more pragmatic view, where you're taking a stepwise approach. We don't expect customers to pull out their entire BI and data-management infrastructure and rewrite it from scratch on day one. That's not practical, and it's not something we would recommend. Take a stepwise approach. Maybe change the way you're integrating data. Change the way you're storing data. Change, in some respects, the way you're analyzing data between IT and the business, and have those teams collaborate.
But you don't have to do it all at one time. Take that stepwise approach.

But you don't have to do it all at one time. Take that stepwise approach. Tackle it from the business problems that you're trying to address, not just the new technologies we have in front of us.

There's much more to come from Dell in the information management space. It will be very interesting for us and for our customers to tackle this problem together. We're excited to make it happen.
Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: Dell Software.

You may also be interested in: