Tuesday, August 23, 2016

Big data and cloud combo spark momentous genomic medicine advances at HudsonAlpha

The next BriefingsDirect Voice of the Customer IT innovation case study explores how the HudsonAlpha Institute for Biotechnology engages in digital transformation for genomic research and healthcare paybacks.

We'll learn how HudsonAlpha leverages modern IT infrastructure and big-data analytics to power a pioneering research project incubator and genomic medicine innovator.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To describe new possibilities for exploiting cutting-edge IT infrastructure and big data analytics for potentially unprecedented healthcare benefits, we're joined by Dr. Liz Worthey, Director of Software Development and Informatics at the HudsonAlpha Institute for Biotechnology in Huntsville, Alabama. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: It seems to me that genomics research and IT have a lot in common. There's not much daylight between them -- two different types of technology, but highly interdependent. Have I got that right?

Worthey: Absolutely. It used to be that the IT infrastructure was fairly far away from the clinic or the research, but now they're so deeply intertwined that it necessitates many meetings a week between the leadership of both in order to make sure we get it right.

Gardner: And you have background in both.

Worthey: My background is primarily on the biology side, although I'm Director of Informatics and I've spent about 20 years working in the software-development and informatics side. I'm not IT Director, but I'm pretty IT savvy, because I've had to develop that skill set over the years. My undergraduate degree was in immunology, and since then, my focus has really been on genetics informatics and bioinformatics.

Gardner: Please describe what genetic informatics or genomic informatics is for our audience.

Worthey: Since 2003, when we received the first version of a human reference genome, there's been a large field involved in the task of extracting knowledge that can be used for society and health from genomic data.

A [human] genome is 3.2 billion nucleotides in length, and in there, there's a lot of really useful information. There's information about which diseases that individual may be more likely to get and which diseases they will get.

There's also information about which drugs they should and shouldn't take, and about which procedures -- surveillance procedures, such as colonoscopies -- they should have. And so, the clinical side of genomics is really about developing the analytical capabilities to extract that data in real time, so that we can use it to help an individual patient.

On top of that, there's also a lot of research. A lot of that is in large-scale studies across hundreds of thousands of individuals to look for signals that are more difficult to extract from a single genome. Genomics, clinical genomics, is all of that together.

Parallel trajectory

Gardner: Where is the societal change potential in terms of what we can do with this information and these technologies?

Worthey: Genomics has existed for maybe 20 years, but the vast majority of that was the first step. Over the last six years, we've taken maybe the second or third step in a journey that’s thousands of steps long.

We're right on the edge. We didn’t used to be able to do this, because we didn't have any data. We didn't have the capability to sequence a genome cheaply enough to sequence lots. We also didn't have the storage capabilities to store that data, even if we could produce it, and we certainly didn't have enough compute to do the analysis, infrastructure-wise. On top of that, we didn’t actually have the analytical know-how or capabilities either. All of that is really coalescing at the same time.
As we've been doing genomics and the sequencing technology has come up, the compute and the computing technologies have come up at the same time. They're feeding each other, and genomics is now driving IT to think about things in a very different way.

Gardner: Let's dive into that a little bit. What are the hurdles technologically for getting to where you want to be, and how do you customize that or need to customize that, for your particular requirements?

Worthey: There are a number of hurdles. Certainly, there are simpler hurdles that we have to get past, like storage -- storage tied with compression. How do you compress that data so that you can store millions of genomes at an affordable price?
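
To make the storage arithmetic concrete: at two bits per base, the 3.2 billion nucleotides of one genome fit in roughly 800 MB, though raw sequencer output with quality scores runs far larger. Below is a minimal sketch of the reference-based idea behind much genomic compression -- storing only where an individual differs from the reference genome. The toy sequences are illustrative assumptions, not HudsonAlpha's pipeline.

```python
# Sketch of reference-based genome compression: store only the positions
# where an individual's sequence differs from a shared reference.
# The two short strings are toy stand-ins for 3.2-billion-base genomes.

reference  = "ACGTACGTACGT"
individual = "ACGTACCTACGA"

def diff_against_reference(ref: str, sample: str):
    """Return (position, ref_base, sample_base) for each mismatch."""
    return [(i, r, s) for i, (r, s) in enumerate(zip(ref, sample)) if r != s]

print(diff_against_reference(reference, individual))
# [(6, 'G', 'C'), (11, 'T', 'A')]

# Back-of-the-envelope sizes: raw bases at 2 bits each, versus storing only
# the roughly 4-5 million variants a typical genome carries vs. the reference.
raw_mb     = 3_200_000_000 * 2 / 8 / 1e6   # ~800 MB for the bases alone
variant_mb = 4_500_000 * 8 / 1e6           # ~36 MB at ~8 bytes per variant
print(f"{raw_mb:.0f} MB raw vs ~{variant_mb:.0f} MB as variants")
```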

A bigger hurdle is the ability to query information at a lot of disparate sites. When we think about genomic medicine, one of the things that we really want to do is share data between institutions that are geographically diverse. And the data that we want to share is millions of data points, each of which has hundreds or thousands of annotations or curations.

Those are fairly complex queries, even when you're doing it in one site, but in order to really change the practice of medicine, we have to be able to do that regionally, nationally, and globally. So, the analytics questions there are large.

We have 3.2 billion data points for each individual. The data is quite broad, but it’s also pretty deep. One of the big problems is that we don’t have all the data that we need to do genomic medicine. There's going to be data mining -- generate the data, form a hypothesis, look at the data, see what you get, come back with a new hypothesis, and so on.

Finally, one of the problems that we have is that a lot of the algorithms that you might use only exist in the brains of MDs, other clinical folks, or researchers. There is really a lot of human-computer interaction work to be done, so that we can extract that knowledge.

There are lots of problems. Another big problem is that we really want to put this knowledge in the hands of the doctor while they have seven minutes to see the patient. So, it’s also delivery of answers at that point in time, and the ability to query the data by the person who is doing the analysis, which ideally will be an MD.

Cloud technology

Gardner: Interestingly, the emergence of cloud methods and technology over the past five or 10 years would address some of those issues about distributing the data effectively -- and also perhaps getting actionable intelligence to a physician in an actual critical-care environment. How important is cloud to this process and what sort of infrastructure would be optimal for the types of tasks that you have in mind?

Worthey: If you had asked me that question two years ago, on the genomic medicine side, I would have said that cloud wasn't really part of the picture -- and for business reasons rather than technical ones. There were a lot of questions around privacy and sharing of healthcare information, and hospitals didn't like the idea.

They were very reluctant to move to the cloud. Over the last two years, that has started to change. Enough of them had to decide to do it before everybody would view it as permissible.

Cloud is absolutely necessary in many ways, because we have periods where lots of data has to be computed and analytics have to be run, and then periods where new information is coming off the sequencer. So, it's that perfect crest and trough.

If you don't have the ability to deal with that sort of fluctuation, if you buy a certain amount of hardware and you only have it available in-house, your pipeline becomes impacted by the crests and then often sits idle for a long time.
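
As a minimal sketch of that crest-and-trough economics -- all the node counts, throughputs, and queue sizes below are made-up numbers, not HudsonAlpha figures -- the bursting decision reduces to how many cloud nodes to rent once the fixed in-house fleet is saturated:

```python
# Sketch of burst-to-cloud capacity planning for a bursty genomics pipeline.
# All numbers (node throughput, fleet sizes, queue depths) are hypothetical.
import math

IN_HOUSE_NODES = 20           # fixed hardware we own
GENOMES_PER_NODE_PER_DAY = 2  # throughput of one analysis node

def cloud_nodes_needed(queued_genomes: int, deadline_days: float) -> int:
    """How many cloud nodes to rent so the queue clears by the deadline."""
    required_rate = queued_genomes / deadline_days          # genomes/day
    in_house_rate = IN_HOUSE_NODES * GENOMES_PER_NODE_PER_DAY
    shortfall = max(0.0, required_rate - in_house_rate)
    return math.ceil(shortfall / GENOMES_PER_NODE_PER_DAY)

# A crest: 600 genomes land at once and are due in 5 days.
print(cloud_nodes_needed(600, 5))   # 40 extra nodes during the crest
# A trough: 50 genomes over 5 days need no cloud capacity at all.
print(cloud_nodes_needed(50, 5))    # 0
```
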
But it’s also important to have stuff in-house, because sometimes, you want to do things in a different way. Sometimes, you want to do things in a more secure manner.

That hybrid approach is kind of our poster child for many of the new technologies that are coming out that look at both of those -- that allow you to run things in-house and then also allow you to run the same jobs on the same data in the cloud as well. So, it's key.

Gardner: That brings me to the next question about this concept of genomics as a service or a platform to support genomics as a service. How do you envision that and how might that come about?

Worthey: When we think about the infrastructure to support that, it has to be something flexible and it has to be provided by organizations that are able to move rapidly, because the field is moving really quickly.

It has to be infrastructure that supports this hypothesis-driven research, and it has to be infrastructure that can deal with these huge datasets. Much of the data is ordered, organized, and well-structured, but because it's healthcare, a lot of the information that we use as part of the interpretation phase of genomic medicine is completely unstructured. There needs to be support for extraction of data from silos.

My dream is that the people who provide these technologies will also help us deal with some of these boundaries, the policy boundaries, to sharing data, because that’s what we need to do for this to become routine.

Data and policy

Gardner: We've seen some of that when it comes to other forms of data, perhaps in the financial sector. More and more, we're seeing tokenization, authentication, and encryption, where data can exist for a period of time with a certain policy attached to it, and then something happens to the data as a result of that policy. Is that what you're referring to?

Worthey: Absolutely. It's really interesting to come to a meeting like HPE Discover, because you get to see what everybody else is doing in different fields. Many of the things that people in my field have regarded as very difficult are actually not that hard at all; they happen all the time in other industries.

A lot of this -- the encryption, the encrypted data sharing, the ability to set access controls in a particular way that lasts only for a certain amount of time for a particular set of users -- seems complex, but it happens all the time in other fields. A big part of this is talking to people who have a lot of experience in a regulated environment -- just not this regulated environment -- learning the language they use to talk to the people who set policy there, transferring that to our policy makers, and ideally getting them together to talk to one another.
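
As a minimal sketch of one such mechanism -- a time-limited, user-scoped access token of the kind common in finance and cloud storage -- the snippet below signs a grant with an expiry using a shared secret. The secret, user ID, and dataset name are all hypothetical, and this illustrates the general pattern, not any specific genomics platform.

```python
# Sketch: time-limited, user-scoped access grants via an HMAC-signed token.
# The secret and identifiers are hypothetical placeholders.
import hashlib
import hmac
import time

SECRET_KEY = b"hypothetical-server-side-secret"

def issue_token(user_id: str, dataset: str, ttl_seconds: int) -> str:
    """Grant `user_id` access to `dataset` until the token expires."""
    expires = int(time.time()) + ttl_seconds
    payload = f"{user_id}:{dataset}:{expires}"
    sig = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{sig}"

def verify_token(token: str) -> bool:
    """Reject tokens that were tampered with or are past their expiry."""
    user_id, dataset, expires, sig = token.rsplit(":", 3)
    payload = f"{user_id}:{dataset}:{expires}"
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected) and time.time() < int(expires)

token = issue_token("clinician-42", "genome-variants", ttl_seconds=3600)
print(verify_token(token))  # True, for the next hour only
```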

Gardner: Liz, you mentioned the interest in getting your requirements to the technology vendors, cloud providers, and network providers. Is that under way? Is that something that's yet to happen? Where is the synergy between the genomic research community and the technology-vendor platform provider community?

Worthey: This is happening fast. For genomics, there's been a shift in the volume of genomic data that we can produce with some new sequencing technology that's coming. If you're a provider of hardware or big-data solutions, genomics is worth watching, because the people here are probably going to overtake many of those other industries in terms of the volume and complexity of the data that we have.

The reason that's really interesting is that you then get invited to come and talk at forums where there are lots of technology companies. You make them aware of the work that has to be done in the field of medicine and in genomic research, and then you can start having those discussions.

A lot of the things that those companies are already doing, the use cases, are similar and maybe need some refinement, but a lot of that capability is already there.

Gardner: It's interesting that you’ve become sort of the “New York” of use cases. If you can make it there, you can make it anywhere. In other words, if we can solve this genomic data issue and use the cloud fruitfully to distribute and gather -- and then control and monitor the data as to where it should be under what circumstances -- we can do just about anything.

Correct me if I am wrong, though. We're using data in the genomic sense for population groups. We're winnowing those groups down into particular diseases. How farfetched is it to think about individuals having their own genomic database that would follow them like an authenticated human design? Is that completely out of bounds? How far off might that be?

Technology is there

Worthey: I've had my genome sequenced, and it's accessible. I could pull it up and look at it, on the tools that I developed, through my phone sitting here on the table. In terms of the ability to do that, a lot of that technology is already here.

The number of people that are being sequenced is increasing rapidly. We're already using genomics to make diagnosis in patients and to understand their drug interactions. So, we are here.

One of the things that we're talking about just now is at what point in a person's life you should sequence their genome. I and a number of other people in the field believe that it should be earlier rather than later -- before they get sick. Then, we have that information to use when they get those first symptoms. You're not waiting until they're really ill before you do that.

I can’t imagine a future where that's not what's going to happen, and I don’t think that future is too far away. We're going to see it in our lifetimes, and our children are definitely going to see it in theirs.

Gardner: The inhibitors, though, would be more of an ethical nature, not a technological nature.

Worthey: And policy, and society; the society impact of this is huge.

The data that we already have, clinical information, is really for that one person, but your genome is shared among your family, even distant relatives that you’ve never met. So, when we think about this, there are many very hard ethical questions that we have to think about. There are lots of experts that are working on that, but we can’t let that get in the way of progress. We have to do it. We just have to make sure we do it right.

Gardner: To come back down a little bit toward the technology side of things, seeing as so much progress has been made and that there is the tight relationship between information technology and some of the fantastic things that can happen with the proper knowledge around genomic information, can you describe the infrastructure you have in place? What’s working? What do you use for big-data infrastructure, and cloud or hybrid cloud as well?

Worthey: I'm not on the IT side, but I can tell you about the other side, and I can talk a little bit on the IT side as well. In terms of the technologies that we use to store all of that variant information, we're currently using Hadoop and MongoDB. We finished our proof of concept with HPE, looking at their Vertica solution.

We have to work out what the next steps might be for our proof of concept. Certainly, we're very interested in looking at the solutions that they have here. They fit with our needs. The issue being addressed on that side is lots of variants and complex queries that you need to answer really fast.
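
As a hedged illustration of the kind of query that setup has to answer fast -- the table name, columns, and connection details below are hypothetical, not HudsonAlpha's actual schema -- a columnar analytics store like Vertica is built for scans like this across billions of variant rows:

```python
# Sketch: find rare, protein-altering variants in one gene across a cohort.
# Assumes a hypothetical `variants` table; connection details are made up.
import vertica_python  # open-source Vertica client library

conn_info = {"host": "analytics.example.org", "port": 5433,
             "user": "analyst", "password": "...", "database": "genomics"}

QUERY = """
    SELECT sample_id, chrom, pos, ref, alt
    FROM variants
    WHERE gene = 'BRCA2'
      AND consequence = 'missense'
      AND population_frequency < 0.001
"""

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    cur.execute(QUERY)
    for sample_id, chrom, pos, ref, alt in cur.fetchall():
        print(sample_id, chrom, pos, ref, alt)
```
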
On the other side, one of the technological hurdles that we have to meet is the unstructured data. We have electronic health record (EHR) information that’s coming in. We want to hook up to those EHRs and we want to use systems to process that data to make it organized, so that we can use it for the interpretation part.

In-house solution

We developed in-house solutions that we're using right now that allow humans to come in, look at that data, and select the terms from it. So, you'd select disease terms. And then, we have in-house solutions to map them to the genomic side. We're looking at things like HPE's IDOL as a proof-of-concept (POC) on that side. We're talking to some EHR companies about how to hook up the EHR, through those solutions, to our software to make it a seamless product.
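
A toy sketch of that term-to-genome mapping step is below. Real pipelines use curated vocabularies such as the Human Phenotype Ontology (HPO) and OMIM; the tiny dictionary here is a hypothetical stand-in that only illustrates the shape of the lookup.

```python
# Sketch: map clinician-selected disease terms to candidate genes.
# The mapping table is a made-up, miniature stand-in for curated
# resources like HPO or OMIM.

TERM_TO_GENES = {
    "long qt syndrome": ["KCNQ1", "KCNH2", "SCN5A"],
    "cystic fibrosis": ["CFTR"],
    "marfan syndrome": ["FBN1"],
}

def candidate_genes(selected_terms: list[str]) -> set[str]:
    """Union of candidate genes for every disease term the clinician picked."""
    genes: set[str] = set()
    for term in selected_terms:
        genes.update(TERM_TO_GENES.get(term.lower(), []))
    return genes

print(candidate_genes(["Long QT syndrome", "Cystic fibrosis"]))
# e.g. {'KCNQ1', 'KCNH2', 'SCN5A', 'CFTR'}
```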

In terms of hardware, we do have HPE hardware in-house. I think we have 12 petabytes of their storage. We also have DataDirect Networks hardware running a General Parallel File System (GPFS) solution. We even have things down to graphics processing units (GPUs) for some of the analysis that we do. We have a large bank of GPUs, because in some cases they're much faster for certain types of problems that we have to solve. So we're pretty IT-rich, with a lot of heavy investment on the IT side.

Gardner: And cloud -- any preference to the topology that works for you architecturally for cloud, or is that still something you are toying with?

Worthey: We're currently looking at three different solutions that are all cloud solutions. We not only do the research and the clinical, but we also have a lab that produces lots of data for other customers, a lab that produces genomic data as a service.

They have a challenge of getting that amount of data returned to customers in a timely fashion. So, there are solutions that we're looking at there. There are also, as we talked about at the start, solutions to help us with that in-flow of the data coming off the sequencers and the compute -- and so we're looking at a number of different solutions that are cloud-based to solve some of those challenges.

Gardner: Before we close, we've talked about healthcare and population impacts, but I should think there's also a commercial aspect to this. That kind of information will lend itself to entrepreneurial activities -- products and services for which there's great demand in the marketplace. Is that something you're involved with as well, and wouldn't that help foot the bill for some of these many costly IT infrastructure investments?

Worthey: The HudsonAlpha Institute was set up with just that model. We have a not-for-profit research side, but we also have a number of affiliate companies that are for-profit, where intellectual property and ideas can cross over and be used to generate revenue that funds the research and keeps us moving on the cutting edge.

We do have a services lab that does genomic sequencing and analytics. You can order that from them. We also serve a lot of people who have government contracts for this type of work. And then, we have an entity called Envision Genomics. For disclosure, I'm one of the founders of that entity. It's focused on empowering people to do genomic medicine and working with lots of different solution providers to get genomic medicine done everywhere it's applicable.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.


Tuesday, August 16, 2016

Cybersecurity crosses the chasm: How IT now looks to the cloud for best security

The next BriefingsDirect cybersecurity innovation and transformation panel discussion explores how cloud security is rapidly advancing, and how enterprises can begin to innovate and prevail over digital disruption by increasingly using cloud-defined security.

We'll examine how a secure content collaboration services provider removes the notion of organizational boundaries so that businesses can better extend processes. And we'll hear how fewer boundaries and cloud-based security together support transformative business benefits.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To share how security technology leads to business innovations, we're joined by Daren Glenister, Chief Technology Officer at Intralinks in Houston, and Chris Steffen, Chief Evangelist for Cloud Security at HPE. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Daren, what are the top three trends driving your need to extend security and thereby preserve trust with your customers?

Glenister: The top thing for us is speed of business -- people being able to do business beyond boundaries, and how they can enable the business rather than just protect it. In the past, security has always been about how we shut things down and stop data. But now it's about how we do it all securely, and how we perform business outside of the organization. So, it's enabling business.

The second thing we've seen is compliance. Compliance is a huge issue for most of the major corporations. You have to be able to understand where the data is and who has access to it, and to know who's using it and make sure that they can be completely compliant.

The third thing is primarily the shift between security inside and outside of the organization. It's been a fundamental shift for us. We've seen security move from people's trust in their own infrastructure to using a third party who can provide that security at a far higher standard, because that's what they do all day, every day. That security shift from on-premises to the cloud is the third big driver for us, and we've seen that in the market.

Gardner: You're in a unique position to be able to comment on this. Tell us about Intralinks, what the company does, and why security at the edge is part of your core competency.

Secure collaboration

Glenister: We're a software-as-a-service (SaaS) provider and we provide secure collaboration for data, wherever that data is, whether it's inside a corporation or shared outside. Typically, once people share data outside -- whether through e-mail or any of the other commercial tools out there -- they've lost control of that data.

We have the ability to actually lock that data down, control it, and put governance and compliance around it: secure that data, know where the high-value intellectual property (IP) is and who has access to it, and still be able to share. And if someone has left the organization and you risk losing data, you can revoke their access.
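
A minimal sketch of how that kind of remote revocation can work is below: envelope encryption, where each document's key stays with a key service, so deleting the key kills access everywhere the file has traveled. The in-memory dict stands in for a real key-management service, and the whole flow is illustrative rather than Intralinks' actual mechanism.

```python
# Sketch: revocable document sharing via envelope encryption.
# Each document gets its own key; the key never travels with the file,
# so deleting it from the key service revokes access everywhere.
from cryptography.fernet import Fernet

key_service: dict[str, bytes] = {}  # stand-in for a real key-management service

def protect(doc_id: str, plaintext: bytes) -> bytes:
    """Encrypt a document and park its key with the key service."""
    key = Fernet.generate_key()
    key_service[doc_id] = key
    return Fernet(key).encrypt(plaintext)

def open_document(doc_id: str, ciphertext: bytes) -> bytes:
    """Succeeds only while the key service still holds the key."""
    key = key_service[doc_id]  # raises KeyError once access is revoked
    return Fernet(key).decrypt(ciphertext)

def revoke(doc_id: str) -> None:
    key_service.pop(doc_id, None)

blob = protect("deal-memo-7", b"high-value IP")
print(open_document("deal-memo-7", blob))  # b'high-value IP'
revoke("deal-memo-7")                      # every copy is now unreadable
```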

Gardner: And these are industries that have security as a paramount concern. So, we’re talking about finance and insurance. Give us a little bit more indication of the type of data we’re talking about.

Glenister: It's anybody with high-value IP or compliance requirements -- banking, finance, healthcare, life sciences, for example, and manufacturing. Even when you’re looking at manufacturing overseas and you have IP going over to China to manufacture your product, your plans are also being shared overseas. We've seen a lot of companies now asking how to protect those plans and therefore, protect IP.
Gardner: Chris, Intralinks seems to be ahead of the curve, recognizing how cloud can be an enabler for security. We're surely seeing a shift in the market, at least I certainly am. In the last six months or so, companies that were saying that security was a reason not to go to the cloud are now saying that security is a reason they're going to the cloud. They can attain security better. What's happened that has made that perspective flip?

Steffen: I don't know exactly what's happened, but you're absolutely right; that flip is going on. We've done a lot of research recently and shown that when you're looking at the inherent barriers to going to a cloud solution, security and compliance considerations are always right there at the top. We commissioned the study through 451 Research, and we kind of knew that's what was going on, but they sure nailed it down: one and two, security and compliance, right there. [Get a copy of the report.]

The reality, though, is that the C-table -- executives, IT managers, those types -- are starting to look at the massive burden of security and hoping to find help somewhere. They can look at a provider like Intralinks, they can look at a provider like HPE, and ask, "How can they help us meet our security requirements?"

They can’t just third-party their security requirements away. That’s not going to cut it with all the regulators that are out there, but we have solutions. HPE has a solution, Intralinks has solutions, a lot of third-party providers have solutions that will help the customer address some of those concerns, so those guys can actually sleep at night.

Gardner: We're hearing so much about digital disruption in so many industries, and we're hearing about why IT can’t wait, IT needs to be agile and have change in the business model to appeal to customers to improve their user experience.

It seems that security concerns have been a governor on that. "We can’t do this because 'blank' security issue arises." It seems to me that it's a huge benefit when you can come to them and say, "We're going to allow you to be agile. We're going to allow you to fight back against disruption because security can, in fact, be managed." How far are we to converting disruption in security into an enabler when you go to the cloud?

Very difficult

Glenister: The biggest thing for most organizations is they're large, and it’s very difficult to transform just the legacy systems and processes that are in-place. It's very difficult for organizations to change quickly. To actually drive that, they have to look at alternatives, and that’s why a lot of people move into cloud. Driving the move to the cloud is, "Can we quickly enable the business? Can we quickly provide those solutions, rather than having to spend 18 months trying to change our process and spend millions of dollars doing it?"

Enablement of the business is actually driving the need to go to the cloud, and obviously will drive security around that. To Chris’s point a few minutes ago, not all vendors are the same. Some vendors are in the cloud and they're not as secure as others. People are looking for trusted partners like HPE and Intralinks, and they are putting their trust and their crown jewels, in effect, with us because of that security. That’s why we work with HPE, because they have a similar philosophy around security as we do, and that’s important.

Steffen: The only thing I would add to that is that security is not only a concern of the big business or the small business; it’s everybody’s concern. It’s one of those things where you need to find a trusted provider. You need to find that provider that will not only understand the requirements that you're looking for, but the requirements that you have.

This is my opinion, but when you're kicking tires and looking at your overall compliance infrastructure, there's a pretty good chance you had to have that compliance for more than a day or two. It’s something that has been iterative; it may change, it may grow, whatever.

So, when you're looking at a partner, a lot of different providers will start to at least try to ensure that you don’t start at square-one again. You don’t want to migrate to a cloud solution and then have all the compliance work that you’ve done previously just wiped away. You want a partner that will map those controls and that really understands those controls.

Perfect examples are in the financial services industry. There are 10 or 11 regulatory bodies that some of the biggest banks in the world all have to be compliant with. It’s extremely complicated. You can’t really expect that Big Bank 123 is going to just throw away all that effort, move to whatever provider, and hope for the best. Obviously, they can’t be that way. So the key is to take a map of those controls, understand those controls, then map those controls to your new environment.
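
As a toy illustration of that control-mapping exercise -- the control IDs and provider mechanisms below are invented, not any real bank's matrix -- the work is essentially building and auditing a lookup from each regulatory control to the mechanism that satisfies it in the new environment, and flagging the gaps before anything moves:

```python
# Sketch: map existing compliance controls to their cloud-side equivalents,
# and flag any control with no mapped mechanism before migrating.
# Control names and provider features are hypothetical examples.

CONTROL_MAP = {
    "audit-logging":      "provider log service with 1-year retention",
    "encryption-at-rest": "provider-managed AES-256 volume encryption",
    "access-review":      "quarterly IAM access report",
    "data-residency":     None,  # gap: no mechanism identified yet
}

gaps = [ctrl for ctrl, mechanism in CONTROL_MAP.items() if mechanism is None]
if gaps:
    print("Unmapped controls, migration blocked:", gaps)
else:
    print("All controls mapped to the new environment.")
```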

Gardner: Let’s get into a little bit of the how ... How this happens. What is it that we can do with security technology, with methodologies, with organizations that allow us to go into cloud, remove this notion of a boundary around your organization and do it securely? What’s the secret sauce, Daren?

Glenister: One of the things for us, being a cloud vendor, is that we can protect data outside. We have the ability to actually embed the security into documents wherever documents go. Instead of just having the control of data at rest within the organization, we have the ability to actually control it in motion inside and outside the perimeter.

You have the ability to control that data. Think about sharing with third parties: quite often people say, "We can't share with a third party, because we don't have compliance, we don't have security around it." Now, they can share, and they can guarantee that the information is secure at rest and in motion.

Typically, if you look at most organizations, they have data at rest covered. Those systems and procedures are relative child's play, and they've been covered for many years. The new challenge is data in motion. How do you actually extend working with third parties and with outside organizations?

Innovative activities

Gardner: It strikes me that we're looking at these capabilities through the lens of security, but isn't it also the case that this enables entirely new innovative activities? When you can control your data, when you can extend where it goes, for how long, to certain people, under certain circumstances, we're applying policies -- bringing intelligence to a document, to a piece of data, not just securing it but getting control over it and extending its usefulness. So why would companies not recognize that security-first brings larger business benefits that extend for years?

Glenister: Historically, security has always been, "No, you can't do this; let's stop." If you look at a finance environment, it's stop using thumb drives, stop using email, stop using anything, rather than providing an easy solution. Over the last six months, you're starting to see a transition where people are saying, "How do we enable? How do we give people control?" As a result, you see new solutions coming out from organizations, and how they can impact the bottom line.

Gardner: Behavior modification has always a big part of technology adoption. Chris, what is it that we can do in the industry to show people that being secure and extending the security to wherever the data is going to go gives us much more opportunity for innovation? To me this is a huge enticing carrot that I don’t think people have perhaps fully grokked.

Steffen: Absolutely. And the reality of it is that it's an educational process. One of the things that I've been doing for quite some time now is trying to educate people. I can talk with a fellow CISSP about Diffie-Hellman key exchange, and I promise that your CEO does not care, and he shouldn't. He shouldn't ever have to care. That's not something that he needs to care about, but he does need to understand total cost of ownership (TCO), and he needs to understand return on investment (ROI). He needs to be able to go to bed at night understanding that his company is going to be okay when he wakes up in the morning and that his company is secure.

It’s an iterative process; it’s something that they have to understand. What is cloud security? What does it mean to have defense in depth? What does it mean to have a matured security policy vision? Those are things that really change the attitudinal barriers that you have at a C-table that you then have to get past.

Security practitioners, those tinfoil hat types -- I classify myself as one of those people, too -- truly believe that they understand how data security works and how the cloud can be secured, and they already sleep well at night. Unfortunately, they're not the ones who are writing the checks.

It's really about shifting that paradigm of education from the practitioner level, where they get it, up to the CIO and the CISO, who hopefully understand, and then up to the C-table and the CFO -- making certain that they can understand and write that check, knowing that going to a cloud solution will allow them to sleep at night and allow the company to innovate. They'll then see security as an enabler to move the business forward.
Gardner: So, perhaps it’s incumbent upon IT and security personnel to start to evangelize inside their companies as to the business benefits of extended security, rather than the glass is always half empty.

Steffen: I couldn't agree more. It's a unique situation. Having your -- again, I'll use the term -- tinfoil-hat people talking to your C-table about security can come across as big and scary. But the reality is that it's critically important that the C-table understands the value that security brings to an organization.

Going back to our original conversation, in the last 6 to 12 months, you're starting to see that paradigm shift a little bit, where C-table executives aren't satisfied with check-box compliance. They want to understand what it takes to be secure, and so they have experts in-house and they want to understand that. If they don't have experts in-house, there are third-party partners out there that can provide that amount of education.

Gardner: I think it’s important for us to establish that the more secure and expert you are at security the more of a differentiator you have against your competition. You're going to clean up in your market if you can do it better than they can.

Step back

Steffen: Absolutely, and even bring that a step further back. People have been talking for two decades now about technology as a differentiator and how you can make a technical decision or embrace and exploit technology to be the differentiator in your vertical, in your segment, so on.

The credit reporting agency that I worked for a long time ago was one of those innovators, and people thought we were nuts for doing some of the stuff that we are doing. Years later, everybody is doing the same thing now.

It really can set a company apart. Security is that new frontier. If you can prove that you're more secure than the next guy, that your customer data is more secure than the next guy's, and that you're willing to protect your customers more than the next guy, maybe it's not something you put on a billboard, but people know.

Would you go to retailer A because they have had a credit-card breach, or do you decide to go to retailer B? It's not a straw man. Talk to Target, talk to Home Depot, talk to some of these big-box stores that have had breaches, and ask how their numbers looked after they had to announce that they had a breach.

Gardner: Daren, let's go to some examples. Can you think of an example of an Intralinks security capability that became a business differentiator or enabler?

Glenister: Think about banks at the moment, where they're working with customers. There's a drive for security. Security people have always known about security and how they can enable and protect the business.

But what’s happening is that the customers are now more demanding because the media is blowing up all of the cyber crimes, threats, and hacks. The consumer is now saying they need their data to be protected.

A perfect example is my daughter, who was applying for a credit card recently. She's going off to college. They asked her to send a copy of her passport, Social Security card, and driver’s license to them by email. She looked at me and said, "What do you think?" It's like, "No. Why would you?"

People have actually voted, saying they're not going to do business with that organization. If you look in the finance organizations now, banks and the credit-card companies are now looking at how to engage with the customer and show that they have been securing and protecting their data to enable new capabilities like loan or credit-card applications and protecting the customer’s data, because customers can vote with their feet and choose not to do business with you.

So, it’s become a business-enabler to say we're protecting your data and we have your concerns at heart.

Gardner: And it’s not to say that that information shouldn’t be made available to a credit card or an agency that’s ascertaining credit, but you certainly wouldn’t do it through email.

Insecure tool

Glenister: Absolutely, because email is the biggest sharing tool on the planet, but it’s also one of the most insecure tools on the planet. So, why would you trust your data to it?

Steffen: We've talked about security awareness, the security awareness culture, and security awareness programs. If you have a vendor-management program, or you're subject to vendor management from some other entity, one of the things they'll also request is that you have a security awareness program.

Even five to seven years ago, people looked at that as drudgery. It was the same thing as all the other nonsensical HR training that you have to look at. Maybe, to some extent, it still is, but the reality is that when I've given those programs before, people are actually excited. It's not only because you get the opportunity to understand security from a business perspective, but a good security professional will then apply that to, "By the way, your email is not secured here, but your email is not secured at home, too. Don’t be stupid here, but don’t be stupid there either."

At work, we're going to fix the router passwords -- you don't need to worry about that -- but if you have a home router, change the default password. Those sound like very simple, straightforward things, but when you share that with your employees and you build that culture, not only do you have more secure employees, but the culture of your business and the culture of security change.

In effect, what's happening is that you're finally getting to see that translate into what goes on outside of corporate America. People are expecting information-security parameters around the businesses they do business with. Whether it's the big-box store, the banks, the hospitals, or anyone else, it really is starting to translate.

Glenister: Security is a culture. I look at a lot of companies that do a once-a-year certification or attestation -- an online test. People click through it, some may have a test at the end, they answer the questions, and that's it; they're done. It's nice, but it has to be a year-round, day-to-day culture, with every organization understanding the implications of security and the risk associated with it.

If you don't do that, if you don't embed that culture, then it becomes a one-time event, and your security is secure once a year.

Steffen: We were talking about this before we started. I'm a firm believer in security awareness. One of the things that I've always done is take advantage of these pretend Hallmark holidays. The latest one was Star Wars Day. Nearly everybody has seen Star Wars or certainly heard of Star Wars at some point or another, and you can’t even go into a store these days without hearing about it.

For Star Wars Day, I created a blog to talk about how information-security failures led to the downfall of the Galactic Empire.
It was a fun blog. It wasn't supposed to be deadly serious, but the kicker is that we talked about key information-security points. You use that holiday to get people engaged with what's going on and educate them on some key concepts of information security, and accidentally, they're learning. That learning then carries over to the next blog that you do, and maybe they pay a little bit more attention to it. Maybe they pay attention to someone simply piggybacking through the door, and maybe they pay attention to not putting something in an e-mail, and so on.

It's still a little iterative thing; it’s not going to happen overnight. It sounds silly talking about information security failures in Star Wars, but those are the kind of things that engage people and make people understand more about information security topics.

Looking to the future

Gardner: Before we sign off, let’s put on our little tinfoil hat with a crystal ball in front. If we've flipped in the last six months or so, people now see the cloud as inherently more secure, and they want to partner with their cloud provider to do security better. Let’s go out a year or two, how impactful will this flip be? What are the implications when we think about this, and we take into consideration what it really means when people think that cloud is the way to go to be secure on the internet?

Steffen: The one that immediately comes to mind for me -- Intralinks is actually starting to do some of this -- is you're going to see niche cloud. Here's what I mean by niche cloud. Let’s just take some random regulatory body that's applicable to a certain segment of business. Maybe they can’t go to a general public cloud because they're regulated in a way that it's not really possible.

What you're going to see is a cloud service that basically says, "We get it, we love your type, and we're going to create a cloud. Maybe it will cost you a little bit more to do it, but we understand from a compliance perspective the hell that you are going through. We want to help you, and our cloud is designed specifically to address your concerns."

When you have niche cloud, all of a sudden, it opens up your biggest inherent barriers. We’ve already talked about security. Compliance is another one, and compliance is a big fat ugly one. So, if you have a cloud provider that’s willing to maybe even assume some of the liability that comes with moving to their cloud, they're the winners. So let’s talk 24 months from now. I'm telling you that that’s going to be happening.

Gardner: All right, we'll check back on that. Daren, your prediction?

Glenister: You're going to see a shift that we're already seeing, and Chris will probably see this as well. It's a shift from discussions around security to transformation. You definitely see security now transforming business, enabling businesses to do things and interact with their customers in ways they've never done before.

You'll see that impacting two ways. One is going to be new business opportunities, with revenue coming in, but it's also going to streamline internal processes, making things easier to do internally. You'll see a transformation of the business inside and outside. That's going to drive a lot of new opportunities, new capabilities, and innovations we've never seen before.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.


Tuesday, August 9, 2016

How software-defined storage translates into just-in-time data center scaling and hybrid IT benefits

The next BriefingsDirect Voice of the Customer case study examines how hosting provider Opus Interactive adopted a software-defined storage approach to better support its thousands of customers.

We'll learn how scaling of customized IT infrastructure for a hosting organization in a multi-tenant environment benefits from flexibility of modern storage, unified management, and elastic hardware licensing. The result is gaining the confidence that storage supply will always meet dynamic hybrid computing demand -- even in cutting-edge hosting environments.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To describe how massive storage and data-center infrastructure needs can be met in a just-in-time manner, we're joined by Eric Hulbert, CEO at Opus Interactive in Portland, Oregon. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What were the major drivers when you decided to re-evaluate your storage, and what were the major requirements that you had?

Hulbert: Our biggest requirement was high availability in a multi-tenant environment. That was number one, because we're a service provider and we have to meet the needs of a lot of customers, not just a single enterprise or even enterprises with multiple business groups.

Hulbert
So we were looking for something that met those requirements. Cost was a concern as well. We wanted it to be affordable, but we needed it to be enterprise-grade with all the appropriate feature sets -- and, most importantly, it had to be a scale-out architecture.

We were tired of the monolithic, controller-bound SANs, where we'd have to buy the next specific size up. We'd start to get close to the boundary, and then we would have to do a lift-and-shift upgrade, which is not easy to do with almost a thousand customers.

Ultimately, we made the choice to go to one of the first software-defined storage architectures, which is a company called LeftHand Networks, later acquired by Hewlett Packard Enterprise (HPE), and then some 3PAR equipment, also acquired by HPE. Those were, by far, the biggest factors while we made that selection on our storage platform.

Gardner: Give us a sense of the scale-out requirements.

Hulbert: We have three primary data centers in the Pacific Northwest and one in Dallas, Texas. We also have the ability for a little bit of space in New York, for some of our East Coast customers, and one in San Jose, California. So, we have five data centers in total.

Gardner: Is there a typical customer, or a wide range of customers?

Big range

Hulbert: We have a pretty big range. Our typical customers are in finance, travel and tourism, and the hospitality industries. There are quite a few in there. Healthcare is a growing vertical for us as well.

That's rounded out with manufacturing and a little bit of retail. One of our verticals, if you could call it that, is the MSPs, IT companies, and even some VARs that are moving into the cloud.

We enable them to do their managed services and be the "boots on the ground" for their customers. That spreads us into the tens of thousands of customers, because we have about 25 to 30 MSPs that work with us throughout the country, using our infrastructure. We just provide the infrastructure as a service, and that's been a fast-growing vertical for us.
Gardner: And then, across that ecosystem, you're doing colocation, cloud hosting, managed services? What's the mix? What’s the largest part of the pie chart in terms of the services you're providing in the market?

Hulbert: We're about 75 percent cloud hosting, specifically a VMware-based private cloud, a multi-tenant private cloud. It's considered public cloud, but we call it private cloud.

We do a lot of hybrid cloud, where we have customers that are doing bursting into Amazon or [Microsoft] Azure. So, we have the ability to get them either Direct Connect Amazon connections or Azure ExpressRoute connections into any of our data centers. Then, 20 percent is colocation and about 5 percent for back-up, and disaster recovery (DR) rounds that out.

Gardner: Everyone, it seems, is concerned about digital disruption these days. For you, disruption is probably about not being able to meet demand. You're in a tight business, a competitive business. What’s the way that you're looking at this disruption in terms of your major needs as a business? What are your threats? What keeps you up at night?

Still redundant

Hulbert: Early on, we wanted a concurrently maintainable infrastructure, which also follows through with the data centers that we're at. So, we needed Tier 3-plus facilities that are concurrently maintainable. We wanted the infrastructure be the same. We're not kept up at night, because we can take an entire section of our solution offline for maintenance. It could be a failure, but we're still redundant.

It's a little bit more expensive, but we're not trying to compete with the commodity hosting providers out there. We're very customized. We're looking for customers that need more of that high-touch level of service, and so we architect these big solutions for them -- and we host with 100 percent uptime.

The infrastructure piece is scalable with scale-out architecture on the storage side. We use only HP blades, so that we just keep stacking in blades as we go. We try to stay a couple of blade chassis ahead, so that we can take pretty large bursts of that infrastructure as needed.

That's the architecture that I would recommend for other service providers looking for a way to make sure they can scale out and not have to do any lift-and-shift on their SAN, or even the stack and rack services, which take more time.

With rack servers, we'd have to cable all of them, versus cabling a single blade chassis once. Then, you can just slot in 16 blades quickly as you're scaling. That allows you to scale quite a bit faster.

Gardner: When it comes to making the choice for software-defined, what has that gotten you? I know people are thinking about that in many cases -- not just service providers, but enterprises. What did software-defined storage get for you, and are you extending your software-defined architecture to more parts of your infrastructure?

Hulbert: We wanted it to be software-defined because we have multiple locations and we wanted one pane of glass. We use HPE OneView to manage that, and it would be very similar for an enterprise. Say it has 30 remote offices, it wants to put equipment there, and the business units need to provision some servers and storage. We don't want to be going to each individual appliance, chassis, or application; we want one place to provision it all.

Since we're dealing now with nearly a thousand customers -- and thousands and thousands of virtual servers, storage nodes, and all of that -- the chunklets of data are distributed across all of these. Being able to do that from one single pane of glass, from a management standpoint, is quite important for us.

So, it's the software-defined aspect -- especially distributing the data into chunklets -- that allows us to grow quicker, along with putting a lot of automation on the back end.

We only have 11 system administrators and engineers on our team managing that many servers, which shows you that our density is pretty high. That only works well if we have really good management tools, and having it software-defined means fewer people walking to and from the data center.

Even though our data centers are manned facilities, our infrastructure is basically lights out. We do everything from remote terminals.
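
As a hedged sketch of what that "one pane of glass" provisioning looks like in practice -- the endpoint, payload fields, and token below are invented stand-ins for a tool like HPE OneView, not its actual API -- the point is that a single management call replaces a walk to the data center:

```python
# Sketch: provision a storage volume through a central management API
# instead of touching individual arrays. The endpoint, fields, and token
# are hypothetical stand-ins for a management plane like HPE OneView.
import requests

MGMT_URL = "https://mgmt.example.net/api/volumes"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <token>"}       # hypothetical auth

def provision_volume(customer: str, size_gb: int, site: str) -> str:
    """Ask the management plane for a new volume; return its ID."""
    payload = {
        "name": f"{customer}-vol",
        "sizeGB": size_gb,
        "site": site,            # e.g., "portland" or "dallas"
        "redundancy": "network-raid-10",
    }
    resp = requests.post(MGMT_URL, json=payload, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()["id"]

# One call, regardless of which array or data center backs the volume.
vol_id = provision_volume("acme-travel", size_gb=500, site="portland")
```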

Gardner: And does this software-defined extend across networking as well? Are you hyper-converged, converged? How would you define where you're going or where you'd like to go?

Converged infrastructure

Hulbert: We're not hyper-converged. For our scale, we can’t get into the prepackaged hyper-converged product. For us, it would be more of a converged infrastructure approach.

As I said, we do use the c-Class blade chassis with Virtual Connect, which is software-defined networking. We do a lot of VLANs and things like that on the software side.

We still have some networking outside of that -- out-of-band network stacks -- because we're not just a cloud provider. We also do colocation and a lot of hybrid computing where people are connecting between the two. So, we have to worry about Fibre Channel and iSCSI and connections into the SAN.

That adds a couple of other layers that are a few extra management steps, but in our scale, it’s not like we're adding tens of thousands of servers a day or even an hour, as I'm sure Amazon has to. So we can take that one small hit to pull that portion of the networking out, and it works pretty good for us.
Gardner: How do you see the evolution of your business in terms of moving past disruption, adopting these newer architectures? Are there types of services, for example, that you're going to be able to offer soon or in the foreseeable future, based on what you're hearing from some of the vendors?

Hulbert: Absolutely. One of the first ones I mentioned earlier was the ability for customers that want to burst into public cloud to do the Amazon Direct Connects. Even over the telecom providers' networks, you're looking at 15 to 25 milliseconds of latency. For some of these applications, that's just too much latency, so it's not going to work.

Now, with the most recent announcement from Amazon, they put a physical Direct Connect node in Oregon, about a mile from our data-center facility. It's from EdgeConneX, who we partnered with.

Now, we can offer the lowest latency for both Amazon and Azure ExpressRoute in the Pacific Northwest, specifically in Oregon. That's really huge for our customers, because we have some that do a lot of public-cloud bursting on both platforms. So that's one new offering we're doing.

Another disruption, as we've heard, is around containers. We're launching a new container-as-a-service platform later this year based on ContainerX. That will allow us to do containers for both Windows and *nix platforms, regardless of what the developers are looking for.

We're targeting developers, DevOps guys, who are looking to do microservices to take their application, old or new, and architect it into the containers. That’s going to be a very disruptive new offering. We've been working on a platform for a while now because we have multiple locations and we can do the geographic dispersion for that.

I think it’s going to take a little bit of the VMware market share over time. We're primarily a VMware shop, but I don’t think it’s going to be too much of an impact to us. It's another vertical we're going to be going after. Those are probably the two most important things we see as big disruptive factors for us.

Hybrid computing

Gardner: As an organization that's been deep into hybrid cloud and hybrid computing, is there anything out there in terms of the enterprises that you think they should better understand? Are there any sort of misconceptions about hybrid computing that you detect in the corporate space that you would like to set them straight on?

Hulbert: The hybrid that people typically hear about is more like having on-premises equipment. Let's say I'm a credit union, and at one of the bank branches we've decided to put three or four cabinets of our equipment in one of the vaults. Maybe they've added one UPS and one generator, but it's not at the enterprise level, and they're bursting to the public cloud for the things that make sense within their security requirements.

To me, that’s not really the best use of hybrid IT. Hybrid IT is where you're putting what used to be on-premises in an actual enterprise-level, Tier 3 or higher data center. Then, you're using either a form of bursting into private dedicated cloud from a provider in one of those data centers or into the public cloud, which is the most common definition of that hybrid cloud. That’s what I would typically define as hybrid cloud and hybrid IT.

Gardner: What I'm hearing is that you should get out of your own data center, use somebody else's, and then take advantage of the proximity in that data center, the other cloud services that you can avail yourself of.

Hulbert: Absolutely. The biggest benefit shows up against their individual location or bank branches -- the credit-union scenario we just used. They're going to have maybe one or two telco providers, with 100 or maybe 200 Mb-per-second circuits.

They're paying a premium for those, and when they get into one of these data centers, they'll have the ability to get 10-gig -- or even 40- or 100-gig -- connected Internet pipes, with a lot more headroom for connectivity at a better price point.

On top of that, they'll have 10-gig connection options into the cloud -- all the different cloud providers. Maybe they have an Oracle stack that they want to put on an Oracle cloud someday, alongside their own on-premises gear. On-premises, the hybrid pieces get more challenging, because they're not going to get the connectivity they need. Maybe they want to get into software as a service, they want to do Amazon or Azure, or maybe they want an Opus cloud.

They need faster connectivity for that, but they have equipment that still has usable life. Why not move that to an enterprise-grade data center and not worry about air-conditioning challenges, electrical problems, or whether it's secure?

All of these facilities, including ours, check every box for the compliance and auditing that happens on an annual basis. Those things that used to be real headaches aren't core to their business anymore; they don't have to do them. They can focus on what's core -- the application and their customers.

Gardner: So proximity still counts, and probably will count for an awfully long time. You get benefits from taking advantage of proximity in these data centers, but you can still have, as you say, what you consider core under your control, under your tutelage and set up your requirements appropriately?

Mature model

Hulbert: It really comes down to the fact that the cloud model is very mature at this point. We've been doing it for over a decade. We started doing cloud before it was even called cloud; it was just virtualization. We launched our platform in late 2005, and it has proved out, time and time again, with 100 percent uptime.

We have one example of a large customer, a travel and tourism operator that brings visitors from outside the US to the US. They do over $1 billion a year in revenue, and we host their entire infrastructure.

It's a lot of infrastructure and it’s a very mature model. We've been doing it for a long time, and that helps them to not worry about what used to be on-premises for them. They moved it all. A portion of it is colocated, and the rest is all on our private cloud. They can just focus on the application, all the transactions, and ultimately on making their customers happy.

Gardner: Going back to the storage equation, Eric, do you have any examples of where the software-defined storage environment gave you the opportunity to satisfy customers on price points, or business or technical metrics that demonstrate how this new approach to storage fills out the cost equation?

Hulbert: In terms of the software-defined storage, the ability to easily provision the different sized data storage we need for the virtual servers that are running on that is absolutely paramount.

We need super-quick provisioning, so we can move things around. When you add in the layers of VMware, like storage vMotion, we can replicate volumes between data centers. Having that software-defined makes it very easy for us, especially with the built-in redundancy that we have and not being controller-bound, as we mentioned earlier.

Those are pretty key attributes, but on top of that, as customers are growing, we can very easily add more volumes for them. Say they have a footprint in our Portland facility and want to add a footprint in our Dallas, Texas facility and do geographic load balancing. It makes it very easy for us to replicate the applications between the two facilities, slowly adding on those layers as customers grow. It makes that easy for them as well.
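
For the VMware piece, here is a hedged sketch of kicking off a storage vMotion with pyVmomi, VMware's Python SDK. The vCenter host, VM name, and datastore name are hypothetical, and this illustrates the general mechanism rather than Opus Interactive's actual tooling; a production version would also handle SSL verification and errors.

```python
# Sketch: storage vMotion -- move a running VM's disks to another datastore
# using pyVmomi. Names and credentials are hypothetical placeholders.
from pyVim.connect import SmartConnect
from pyVmomi import vim

si = SmartConnect(host="vcenter.example.net", user="admin", pwd="...")
content = si.RetrieveContent()

def find_by_name(vimtype, name):
    """Return the first inventory object of `vimtype` with this name."""
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vimtype], True)
    try:
        return next(obj for obj in view.view if obj.name == name)
    finally:
        view.DestroyView()

vm = find_by_name(vim.VirtualMachine, "customer-app-01")
target = find_by_name(vim.Datastore, "dallas-ssd-pool")

# Relocate only the storage; the VM keeps running while its disks move.
task = vm.RelocateVM_Task(spec=vim.vm.RelocateSpec(datastore=target))
```
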
Gardner: One last question: what comes next in terms of containers? What we're seeing is that containers have a lot to do with developers and DevOps, but ultimately I'd think that the envelope gets pushed out into production, especially when you hear about things like composable infrastructure. If you've been composing infrastructure in the earlier part of the process, in development, it takes care of itself in production.

Do you actually see more of these trends accomplishing that where production is lights-out like you are, where more of the definition of infrastructure and applications, productivity, and capabilities is in that development in DevOps stage?

Virtualization

Hulbert: Definitely. Over time, it is going to be very similar to what we saw when customers were moving from dedicated physical equipment into the cloud, which is really virtualization.

This is the next evolution, where we're moving into containers. At the end of the day, the developers and the product managers for the applications -- for whatever they're actually developing -- don't really care what and how it all works. They just want it to work.

They want it to be a utility consumption-based model. They want the composable infrastructure. They want to be able to get all their microservices deployed at all these different locations on the edge, to be close to their customers.

Containers are going to be a great way to do that, because they take away the overhead of dealing with the operations side. So, the developers can just put the little APIs and the different things that they need where they need them. As we see more of that pushed to the edge to get the eyeball traffic, that's going to be a great way to do it. With the ability to burst further into the bigger public clouds worldwide, I think we can get to a really large scale in a great way.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.
