It now falls to CIOs not only to adapt rapidly to cloud computing, but also to find ways to protect their employees and customers as they adopt cloud models – even as security threats grow.
This is a serious, but not insurmountable, challenge.
Cloud computing has clearly sparked the imagination of business leaders, who see it as a powerful new way to be innovative and gain first-mover advantages -- with or without traditional IT's consent.
This means that the center of gravity for IT services is shifting toward the enterprise's boundaries – and increasingly beyond its firewalls. So how can companies have it both ways -- exploit the cloud's promise while providing enough security to make the risks acceptable? How can organizations retain rigor and control while pursuing cloud benefits?
In a special sponsored BriefingsDirect HP Expert Chat discussion on how to define and obtain improved security, I recently moderated an in-depth session with Tari Schreider, Chief Architect of HP Technology Consulting and IT Assurance Practice. Tari is a Distinguished Technologist with 30 years of IT and cyber security experience, and he has designed, built, and managed some of the world's largest information protection programs.
In our discussion, you’ll see the latest recommendations for how to enable and protect the many cloud models being considered by companies the world over.
As part of our chat, we're also joined by three other HP experts, Lois Boliek, World Wide Manager in the HP IT Assurance Program; Jan De Clercq, World Wide IT Solution Architect in the HP IT Assurance Program; and Luis Buezo, HP IT Assurance Program Lead for EMEA. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]
If you understand the security risk, gain a detailed understanding of your own infrastructure, and follow proven reference architectures and methods, security can move from an inhibitor of cloud adoption to an enabler.
Here are some excerpts from our discussion on how to make the move:
Schreider: It's always a pleasure to be able to chat about some of the technology issues of the day, and certainly cloud computing protection is the topic that’s top of mind for many of our customers.
I want to begin by talking about the four immutable laws of cloud security. For those of you who have been involved in information security over time, you understand that there is a certain level of immutability inherent in security. These are things that will always be true and will never change.
When we started working on building clouds at HP a few years ago, we were also required to apply data protection and security controls around those platforms we built. We understood that the same immutable laws that apply to security, business continuity, and disaster recovery extended into the cloud world.
First is an understanding that if your data is hosted in the cloud, you no longer directly control its privacy and protection. You're going to have to give up a bit of control, in order to achieve the agility, performance, and cost savings that a cloud ecosystem provides you.
The next immutable law is that when your data is burst into the cloud, you no longer directly control where the data resides or is processed.
One of the benefits of cloud-based computing is that you don’t have to have all of the resources at any one particular time. In order to control your costs, you want to have an infrastructure that supports you for daily business operations, but there are ebbs and flows to that. This is the whole purpose of cloud bursting. For those of you who are familiar with grid-based computing, the models are principally the same.
Different locations
Rather than your data being in one or maybe a secondary location, it could actually be in 5, 10, or maybe 30 different locations, because of bursting, and also be under the jurisdiction of many different rules and regulations, something that we're going to talk about in just a little bit.
The next immutable law is that if your security controls are not contractually committed to, then you may not have any legal standing in terms of the control over your data or your assets. You may feel that you have the most comprehensive security policy that is rigorously reviewed by your legal department, but if that is not ensconced in the terminology of the agreement with a service provider, then you don’t have the standing that you may have thought you had.
The last immutable law is that if you don't extend your current security policies and controls into the cloud computing platform, you're more than likely going to be compromised.
You want to resist creating two entirely separate, disparate security programs and policy manuals. Cloud-based computing is an attribute of the Internet. Your data and your assets are the same; what changes is where they reside and how they're accessed. We strongly recommend that you build cloud into your existing information security program.
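To make that last law concrete, here is a minimal Python sketch of what "building cloud into your existing program" could look like in practice: a single, hypothetical policy baseline evaluated against both on-premises and cloud-hosted assets, rather than a second, cloud-only policy manual. All names, rules, and asset attributes are illustrative assumptions, not HP guidance.

```python
# Hypothetical sketch: one policy baseline applied to both on-premises
# and cloud-hosted assets. Rules and asset fields are illustrative only.

BASELINE = {
    "encryption_at_rest": True,
    "mfa_required": True,
    "max_data_retention_days": 365,
}

def evaluate(asset: dict) -> list:
    """Return the baseline controls this asset fails to meet."""
    gaps = []
    if BASELINE["encryption_at_rest"] and not asset.get("encrypted_at_rest"):
        gaps.append("encryption_at_rest")
    if BASELINE["mfa_required"] and not asset.get("mfa_enabled"):
        gaps.append("mfa_required")
    if asset.get("retention_days", 0) > BASELINE["max_data_retention_days"]:
        gaps.append("max_data_retention_days")
    return gaps

assets = [
    {"name": "hr-db", "location": "on_prem", "encrypted_at_rest": True,
     "mfa_enabled": True, "retention_days": 180},
    {"name": "crm-saas", "location": "public_cloud", "encrypted_at_rest": False,
     "mfa_enabled": True, "retention_days": 730},
]

for asset in assets:
    print(asset["name"], asset["location"], "gaps:", evaluate(asset))
```

The point of the sketch is simply that the same baseline object governs both locations; only the inventory of assets changes as workloads move into the cloud.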
Gardner: Tari, these are clearly some significant building blocks in moving towards cloud activities, but as we think about that, what are the top security threats from your perspective? What should we be most concerned about?
Schreider: Dana, we have the opportunity to work with many of our customers who, from time to time, experience breaches of security. As you might imagine, HP, a very large organization, has literally hundreds of thousands of customers around the world. This gives us a unique vantage point from which to study the morphology of cloud-computing platform security incidents and outages.
One of the things that we also do is take the pulse of our customer base. We want to know what’s keeping them up at night. What are the things that they're most concerned with? Generally, we find that there is a gap between what actually happens and what people believe could happen.
I want to share with you something that we feel is particularly poignant, because it is a direct interlock between what we're seeing actually happening in the industry and also what keeps our clients up late at night.
First and foremost, there's the ensured continuity of the cloud-computing platform. The reason to move to cloud is for making data and assets available anywhere, anytime, and also being able to have people from around the world accept that data and be able to solve business needs.
If the cloud computing platform is not continuously available, then the business justification as to why you went there in the first place is significantly mooted.
Loss of GRC control
Next is the loss of span of governance, risk management, and compliance (GRC) control. In today's environment, we can have a GRC management program with dominion over our assets and our information within our own environment.
Unfortunately, when we start extending this out into a cloud ecosystem, whether private, public, or hybrid, we don’t necessarily have the same span of control that we have had before. This requires some delicate orchestration between multiple parties to ensure that you have the right governance controls in place.
The next is data privacy. Much has been written on data privacy and protection across the cloud ecosystem. Today, you may have a data privacy program that’s designed to address the security and privacy laws of your specific country or your particular state that you might reside in.
However, when you're moving into a cloud environment, that data can now be moved or burst anywhere in the world, which means that you could be violating data-privacy laws in another country unwittingly. This is something that clients want to make sure that they address, so it does not come back in terms of fines or regulatory penalties.
Mobility access is the key to the enablement of the power of the cloud. It could be a bring-your-own-device (BYOD) scenario, or it could be devices that are corporately managed. Basically you want to provide the data and put it in the hands of the people.
Whether they're out on an oil platform and need access to data, or whether it's a sales force that needs access to Salesforce.com data on BlackBerrys, the fact remains that the data in the cloud has to land on those mobile devices, and security is an integral part of that.
You may be the owner of the data, but there are many custodians of the data in a cloud ecosystem. You have to make sure that you have an incident-response plan that recognizes the roles and responsibilities between owner and custodian.
Gardner: Tari, the notion of getting control over your cloud activities is important, but the devil is in the details. We know that cloud regulations and laws change from region to region, country to country, and in many cases, even within companies themselves. What is your advice when we start to look at these detailed issues and all of the variables in the cloud?
Schreider: Dana, that is a central preoccupation of law firms, courts, and regulatory bodies today. What tenets of law apply to data that resides in the cloud? I want to talk about a couple of areas that we think are the most crucial, when putting together a program to secure data from a privacy perspective.
Just as you have to have order in the courts, you have to have order in the clouds. First and foremost, and I alluded to this earlier, is that the terms and conditions of the cloud computing services are really what adjudicates the rights, roles, and responsibilities between a data owner and a data custodian.
Choice of law
However, within that is the concept of choice of law. This means that, wherever the breach of security occurs, the courts can apply the choice of law, meaning the law of the land where the data resides, to determine who is at fault in a breach of security.
This is also true for data privacy. If your data resides in your home location, is that the choice of law by which you follow the data privacy standards? Or if your data is burst, how long does this have to be in that other jurisdiction before it is covered by that choice of law? In either case, it is a particularly tricky situation to ensure that you understand what rules and regulations apply to you.
The next one is transborder data-flow triggers. This is an interesting concept, because when your data moves, if you do a data-flow analysis for a cloud ecosystem, you'll find that the data can actually cross various borders, going from jurisdiction to jurisdiction.
The data may be created in one jurisdiction. It may be sent to another jurisdiction for processing and analysis, and then may be sent to another location for storage, for intermediate use, and yet a fourth location for backup, and then possibly a fifth location for a recovery site.
This is not an atypical example. You could have five triggering events across five different borders. So you have to understand the legal obligations in multiple jurisdictions.
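As a rough illustration of the kind of data-flow analysis Schreider describes, the Python sketch below maps each lifecycle stage to a jurisdiction and flags every hop that crosses a border as a trigger to review with counsel. The stages and jurisdictions are assumptions for the example only.

```python
# Hypothetical transborder data-flow analysis: each stage of the data's
# life is tagged with a jurisdiction, and every border crossing is
# surfaced as a potential legal trigger. Values are illustrative only.

data_flow = [
    ("created",            "DE"),
    ("processed/analyzed", "US"),
    ("stored",             "IE"),
    ("backed up",          "SG"),
    ("recovery site",      "BR"),
]

def border_crossings(flow):
    """Yield (from_stage, to_stage, from_jurisdiction, to_jurisdiction)
    for every hop where the data changes jurisdiction."""
    for (stage_a, juris_a), (stage_b, juris_b) in zip(flow, flow[1:]):
        if juris_a != juris_b:
            yield stage_a, stage_b, juris_a, juris_b

for hop in border_crossings(data_flow):
    print("Review choice-of-law and privacy obligations:", hop)
```

With five stages in five jurisdictions, as in the example above, every hop is a triggering event, which is exactly why the legal obligations have to be understood per jurisdiction.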
The next one is reasonable security, which is, under the law, what would a prudent person do? What is reasonable under the choice of law for that particular country? When you're putting together your own private cloud, in which you may have a federated client base, this ostensibly makes you a cloud service provider (CSP).
Or, in an environment where you are using several CSPs, what are the data integrity disclaimers? The onus is predominantly placed on the owner of the data for the integrity of the data, and after careful crafting of terms and conditions, the CSP basically wants no direct responsibility for maintaining the integrity of that data.
When we talk about who owns the data, there is an interesting concept, and there are a few test cases that are coursing their way through various courts. It’s called the Berne Convention.
In the late 1990s, there were a number of countries that got together and said, "Information is flowing all over the place. We understand copyright protection for works of art and for songs and those types of things, but let’s take it a step further."
In the context of a cloud, could not the employees of an organization be considered authors, and could not the data they produce be considered a work? Therefore, wouldn't it be covered by the Berne Convention, and thus by standard international copyright law? This is an interesting question.
Modify policies
The reason that I bring this to your attention is that it is this kind of analysis that you should do with your own legal counsel to make sure that you understand the full scope of what’s required and modify your existing security policies.
The last point is around electronic evidence and eDiscovery. This is interesting, because in some cases it can be a double-edged sword. If I have custody of the data, then it is open under the rules of discovery; a party can request that I produce that information.
However, if I don’t directly have control of that data, then I don’t have the right, or I don’t have the obligation, to turn it over under eDiscovery. So you have to understand what rules and regulations apply where the data is, and that, in some cases, it could actually work to your advantage.
Different risk profiles
Your risk profile may be different, if you are the custodian, versus the risk profile if you're the owner of the data. This is something that you can very easily put together and present to your executives. It allows you to model the safeguards and controls to protect the cloud ecosystem.
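One way to picture the kind of side-by-side risk profile described here is a simple scoring sketch like the Python below, where the same hypothetical threats are scored (likelihood times impact) once from the owner's view and once from the custodian's view. The threats and scores are made up for illustration and are not based on any HP model.

```python
# Illustrative owner-vs-custodian risk profile for an executive summary.
# Each threat is scored as likelihood x impact on a 1-5 scale; all
# numbers here are placeholders.

threats = {
    #                      owner (L, I)   custodian (L, I)
    "platform outage":      ((3, 4),        (3, 5)),
    "data exfiltration":    ((2, 5),        (2, 4)),
    "privacy-law breach":   ((3, 5),        (2, 3)),
}

print(f"{'threat':<22}{'owner risk':>12}{'custodian risk':>16}")
for name, ((lo_, io_), (lc, ic)) in threats.items():
    print(f"{name:<22}{lo_ * io_:>12}{lc * ic:>16}")
```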
Gardner: We certainly know that there is a great deal of opportunity for cloud models, but unfortunately, there is also significant down side, when things don’t go well. You're exposed. You're branded in front of people. Social media allows people to share issues when they arise. What can we learn from the unfortunate public issues that have cropped up in the past few years that allows us to take steps to prevent that from happening to us?
Schreider: These are all public events. We've all read about these events over the last 16-18 months, and some of them have occurred within just the last 30 days or so. This is not to admonish anybody, but basically to applaud these companies that have come forward in the interest of security. They've shared their postmortem of what worked and what didn’t work.
What goes up can certainly come down. Regardless of the amount of investment that one can put into protecting their cloud computing environment, nobody is immune, whether it’s a significant and pervasive hacking attempt against an organization, where sensitive data is exfiltrated, or whether it is a service-oriented cloud platform that has an outage that prevents people from being able to board a plane.
When an outage happens in your cloud computing environment, it definitely has a reverberation effect. It’s almost a digital quake, because it can affect people from around the world.
One of the things that I mentioned before is that we're very fortunate that we have that opportunity to look at disaster events and breaches of security and study what worked and what didn’t.
I've put together a little model that reanalyzes the storm damage, looking at the types of major events that have occurred. I've looked at the control construct that would exist, or should exist, in a private cloud and the control construct that should exist in a public cloud. A hybrid cloud, of course, is the convergence of the two, where you can mix and match those controls.
If you have a situation where an external threat infiltrates, hacks into, and compromises an application in a private cloud environment, you want to make sure that you have a secure system development lifecycle methodology to ensure that the application is secure and has been tested for all conventional threats and vulnerabilities.
In a public cloud environment, you normally don't have that same avenue available to you. So you want to make sure that the service provider either presents to you, or performs on your behalf, a web-application security review and an external threat and vulnerability test.
In a cloud environment, where you are grouping many different customers and users together, you have to have a basis for segregating data and operations, so that a problem with one tenant doesn't affect everybody.
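A hedged sketch of the control-construct matrix Schreider outlines might look like the Python structure below: for each threat scenario, the control you would expect in a private cloud versus a public cloud, with a hybrid cloud mixing both columns. The entries paraphrase the examples above plus assumptions of my own; this is not HP's reference architecture.

```python
# Illustrative control matrix: expected control construct per threat
# scenario, by cloud deployment model. Entries are examples, not a
# definitive catalog.

control_matrix = {
    "application compromised by external attacker": {
        "private": "secure SDLC with threat and vulnerability testing",
        "public":  "CSP-provided web-app security review and external pen test",
    },
    "tenant data exposed to another tenant": {
        "private": "network and storage segmentation between business units",
        "public":  "contractually committed multi-tenant isolation controls",
    },
}

def controls_for(threat: str, deployment: str) -> str:
    """Look up the expected control for a threat in a given deployment
    model; for 'hybrid', combine the private and public constructs."""
    row = control_matrix[threat]
    if deployment == "hybrid":
        return f"{row['private']} + {row['public']}"
    return row[deployment]

print(controls_for("application compromised by external attacker", "hybrid"))
```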
The level of investment that you make in protecting your cloud environment should be commensurate with the value of the assets that are being burst or hosted in that cloud environment.
Protection through layers
Whether it's cloud or just security in general, we're big believers in an information technology architecture that's defined by layers. What is the business rationale for the cloud, and what are we trying to protect? How should it work together functionally? Technically, what types of products and services will we use, and then how will it all be implemented?
We also have a suite of products that we can bring to our cloud computing environment to ensure that we're securing and providing governance, securing applications, and then also trying to detect breaches of security. I've talked about our reference architecture.
Something that's also unique is our P5 Model, where basically we look at the cloud computing controls and we have an abstraction of five characteristics that should be true to ensure that they are deployed correctly.
As I mentioned before, we're either a principal member, contributing member, or founding member of virtually every cloud security standards organization out there. Once again, we can't do it by ourselves, and that's why we have strategic partners such as the VMwares and Symantecs of the world.
Gardner: There's a question here about key challenges regarding data lifecycle specifically. How do you view that? What are some of the issues about secure data, even across the data lifecycle?
Key challenges
Luis Buezo: Based on CSA recommendations, we're not only talking about data security related to confidentiality, integrity, and availability. There are other key challenges in the cloud, such as data location, to guarantee that the geographical locations used are permitted by regulations.
There's data permanence, to guarantee that data is effectively removed, for example, when moving from one CSP to a new one, and there are data backup and recovery schemes. Don't assume that cloud-based data is backed up by default.
There are also data discovery capabilities to ensure that all data requested by authorities can be retrieved.
Another example is data aggregation and inference issues, where controls must be implemented to prevent revealing protected information. So there are many issues in data lifecycle management.
Gardner: Our next question is about being cloud ready for dealing with confidential company data, how do you come down on that?
Jan De Clercq: HP's vision is that many cloud services today are not always ready for organizations to store their confidential or important data. That's why we recommend that organizations always do a very thorough risk assessment before they consider moving data into the cloud.
They should make sure that they clearly understand the value of their data, and also the risks to that data in the cloud provider's environment. Then, based on that assessment, they can determine whether they should move their data into the cloud.
We also recommend that consumers get clear insight from the CSP on exactly where their organization's data is stored and processed, and where it travels inside the network environment of the cloud provider.
As a consumer, you need a complete view of what's done with your data and how the CSP is protecting it.
Gardner: Okay, Jan, what are essential data protection security controls that they should look for from their provider?
De Clercq: It's important that you have security controls in place that protect the entire data lifecycle. By data lifecycle we mean everything from the moment the data is created to the moment it is destroyed.
Data creation
When data is created it’s important that you have a data classification solution in place and that you apply proper access controls to the data. When the data is stored, you need confidentiality, integrity, and availability protection mechanisms in place. Then, you need to look at things like encryption tools, and information rights management tools.
When the data is in use, it's important that you have proper access control in place, so that you can make sure that only authorized people can access the data. When the data is shared, or sent to another environment, it's important that you have things like information rights management or data loss prevention solutions in place.
When the data is archived, it’s important that it is archived in a secured way, meaning that you have proper confidentiality, integrity, and availability protection.
When the data is destroyed, it’s important, as a consumer, that you make sure that the data is really destroyed on the storage systems of your CSP. That’s why you need to look at things like crypto-shredding and other data destruction tools.
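As an illustration of encryption at rest combined with crypto-shredding, the Python sketch below assumes the third-party cryptography package (pip install cryptography): if only ciphertext ever reaches the CSP's storage, destroying the locally held key renders the data unrecoverable even if the provider never physically wipes its disks. This is a minimal sketch, not a production key-management design.

```python
# Minimal crypto-shredding sketch using the 'cryptography' package.
# The key stays under the data owner's control; only ciphertext is
# ever handed to the CSP for storage, bursting, backup, or archive.

from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()          # keep this key, not the CSP
ciphertext = Fernet(key).encrypt(b"confidential customer record")

# ... ciphertext is what gets stored, burst, backed up, and archived ...

print(Fernet(key).decrypt(ciphertext))   # normal use while the key exists

# Crypto-shredding: discard the key (a real system would destroy it in
# its key-management service, not just drop the reference).
key = None

try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)  # any other key fails
except InvalidToken:
    print("data is effectively destroyed: no key, no plaintext")
```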
Gardner: Tari, how does cloud computing change my risk profile? It's a general subject, but do you really reduce or lose risk control when you start doing cloud?
Schreider: An interesting question to be sure, because in some cases, your risk profile could be vastly improved. In other cases, it could be significantly diminished. If you find yourself no longer in a position to be able to invest in a hardened data center, it may be more prudent for you to move your data to a CSP that is already classified as a data-carrier grade, Tier 1 infrastructure, where they have the ability to invest the tens of millions of dollars for a hardened facility that you wouldn’t normally be able to invest yourself.
On the other hand, you may have a scenario where you're using smaller CSPs that don't necessarily have that same level of rigor. From a strategic perspective, when you are looking at application deployment, we always recommend that you consider the application's risk profile, where best to place it, and how it affects your overall threat posture.
Gardner: Lois, how can HP help clients get started, as they determine how and when to implement cloud?
Lois Boliek: We offer a full lifecycle of cloud-related services and we can help clients get started on their transition to the cloud, no matter where they are in that process.
We have the Cloud Discovery Workshop, where we help customers, in a very interactive work session, consider all aspects of the cloud; it results in a high-level strategy and a roadmap for moving forward.
Business/IT alignment
We also offer the Hybrid Delivery Strategy Services. That’s where we drill down into all the necessary components that you need to gain business and IT alignment, and it also results in a well-defined cloud service delivery model.
We also have some fast-start services. One of those is the CloudStart service, where we come in with a pre-integrated architecture to help speed up the deployment of the production-ready private cloud, and we can do that in less than 30 days.
We also offer a Cloud System Enablement service, and in this we can help fast track setting up the initial cloud service catalog development, metering, and reporting.
Gardner: Does HP have the services to implement protection in the cloud?
Boliek: We believe in building security into the cloud environment from the beginning, through our architectures and our services. We offer the HP Cloud Protection Program, which extends the cloud service offerings that I've just mentioned by addressing cloud security threats and vulnerabilities.
We've also integrated a defense-in-depth approach to cloud infrastructure. We address the people, process, policies, and products covered in the P5 Model that Tari mentioned, to help clients confidently and securely build out a hybrid cloud environment.
We have service modules that are available, such as the Cloud Protection Workshop. This is for deep-dive discussions on all the security aspects of cloud, and it results in a high-level cloud security strategy and next steps.
We offer the Cloud Protection Roadmap Service, where we can define the specific control recommendations, also based on our P5 Model, and a roadmap that is very customized and specific to our clients’ risk and compliance requirements.
We have a Foundation Service, also a fast start, specific to implementing a pre-integrated, hardened cloud infrastructure that mitigates the most common cloud security threats and vulnerabilities.
Then, for customers who require very specific custom security, we can do custom design and implementation. All of these services are based on the Cloud Reference Architecture that Jan and Tari mentioned earlier, as well as extensive research that we do ahead of time, in our Cloud Protection Research & Development Center, before going out to customers.
Gardner: Tari, what should users do to determine how good their service provider is when it comes to these security issues?
Incumbent on us
Schreider: I wish we did have a rating system, but unfortunately, it's still incumbent upon us to determine the veracity of the claims of security and continuity of the CSPs.
However, there are a number of accepted methods to gauge whether a CSP is secure. Many organizations have had what's referred to as an attestation. Formerly, most people were familiar with SAS 70, which is now SSAE 16, or you can have an ISO 27000 certification.
Basically, you have an independent attestation body, typically an auditing firm, that will come in and test the operational efficiency and design of your security program to ensure that whatever you have declared as your control schema, maybe ISO, NIST, CSA, is properly deployed.
However, there is a fairly significant caveat here. These attestations can be very narrowly scoped, and many CSPs will apply them to only a small portion of their infrastructure, maybe not their entire facility, and maybe not even the application that you're a customer of.
Also, we've found that many CSPs and application-as-a-service providers don't even own their own data centers. Those facilities are actually provided elsewhere, and there may also be other support mechanisms in place. In some cases, you may have to evaluate three attestations just to have a sense that you, or the CSP, have the right controls in place.