Thursday, December 1, 2016

HPE takes aim at customer needs for speed and agility in age of IoT, hybrid everything

A leaner, more streamlined Hewlett Packard Enterprise (HPE) advanced across several fronts at HPE Discover 2016 in London, making inroads into hybrid IT, Internet of Things (IoT), and on to the latest advances in memory-based computer architecture. All the innovations are designed to help customers address the age of digital disruption with speed, agility, and efficiency.

Addressing a Discover audience for the first time since HPE announced spinning off many software lines to Micro Focus, Meg Whitman, HPE President and CEO, said the company is not only committed to those assets -- becoming a major owner of Micro Focus in the deal -- but is also building on its software investments.

"HPE is not getting out of software but doubling-down on the software that powers the apps and data workloads of hybrid IT," she said Tuesday at London's ExCel exhibit center.

"Massive compute resources need to be brought to the edge, powering the Internet of Things (IoT). ... We are in a world now where everything computes, and that changes everything," said Whitman, who has now been at the helm of HPE and HP for five years.

HPE's new vision: To be the leading provider of hybrid IT, to run today's data centers, and then bridge the move to multi-cloud and empower the intelligent edge, said Whitman. "Our goal is to make hybrid IT simple and to harness the intelligent edge for real-time decisions" to allow enterprises of all kinds to win in the marketplace, she said.

Hyper-converged systems

To that aim, the company this week announced an extension of HPE Synergy's fully programmable infrastructure to HPE's multi-cloud platform and hyper-converged systems, enabling IT operators to deliver software-defined infrastructure as quickly as customers' businesses demand. The new solutions include:
  • HPE Synergy with HPE Helion CloudSystem 10 -- This brings full composability across compute, storage, and fabric to HPE's OpenStack technology-based hybrid cloud platform, enabling customers to run bare-metal, virtualized, containerized, and cloud-native applications on a single infrastructure, and to dynamically compose and recompose resources for unmatched agility and efficiency.
  • HPE Hyper Converged Operating Environment -- The software update leverages composable technologies to deliver new capabilities to the HPE Hyper Converged 380, including new workspace controls that allow IT managers to compose and recompose virtualized resources for different lines of business. This makes it easier and more efficient for IT to act as an internal service provider to its organization.
This move delivers a full-purpose composable infrastructure platform, treating infrastructure as code, enabling developers to accelerate application delivery, says HPE. HPE Synergy has nearly 100 early access customers across a variety of industries, and is now broadly available. [Disclosure: HPE is a sponsor of BriefingsDirect podcasts.]

This year's HPE Discover was strong on showcasing the ecosystem approach to creating and maintaining hybrid IT. Heavy hitters from Microsoft Azure, Arista, and Docker joined Whitman on stage to show their allegiance to HPE's offerings -- along with their own -- as essential ingredients to Platform 3.0 efficiency.

See more on my HPE Discover analysis on The Cube.

HPE also announced plans to expand Cloud28+, an open community of commercial and public sector organizations with the common goal of removing barriers to cloud adoption. Supported by HPE's channel program, Cloud28+ unites service providers, solution providers, ISVs, system integrators, and government entities to share knowledge, resources and services aimed at helping customers build and consume the right mix of cloud solutions for their needs.

Internet of Things

Discover 2016 also saw new innovations designed to help organizations rapidly, securely, and cost-effectively deploy IoT devices in wide-area, enterprise, and industrial settings.
"Cost-prohibitive economics and the lack of a holistic solution are key barriers for mass adoption of IoT," said Keerti Melkote, Senior Vice President and General Manager, HPE. "By approaching IoT with innovations to expand our comprehensive framework built on edge infrastructure solutions, software platforms, and technology ecosystem partners, HPE is addressing the cost, complexity and security concerns of organizations looking to enable a new class of services that will transform workplace and operational experiences."

As organizations integrate IoT into mainstream operations, the onboarding and management of IoT devices remains costly and inefficient, particularly at large scale. Concurrently, the diverse variations of IoT connectivity, protocols, and security prevent organizations from easily aggregating data across a heterogeneous fabric of connected things.

To improve the economies of scale for massive IoT deployments over wide area networks, HPE announced the new HPE Mobile Virtual Network Enabler (MVNE) and enhancements to the HPE Universal IoT (UIoT) Platform.

As the amount of data generated from smart “things” grows and the frequency at which it is collected increases, so will the need for systems that can acquire and analyze the data in real-time. Real-time analysis is enabled through edge computing and the close convergence of data capture and control systems in the same box.

HPE Edgeline Converged Edge Systems converge real-time analog data acquisition with data center-level computing and manageability, all within the same rugged, open-standards chassis. Benefits include higher performance, lower energy use, reduced space, and faster deployment times.
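As a rough illustration of that convergence, an edge node might acquire sensor readings and analyze them locally, forwarding only anomalies upstream rather than streaming raw data back to a data center. The sketch below is purely illustrative -- the window, threshold, and simulated sensor feed are invented, and this is not HPE Edgeline code:

```python
# Conceptual sketch of the edge pattern described above: acquire sensor
# readings and analyze them locally, forwarding only anomalies upstream
# instead of streaming everything to a remote data center. The window,
# threshold, and sensor feed are simulated for illustration.

from statistics import mean

def analyze_at_edge(readings, window=5, threshold=1.5):
    """Yield readings that exceed the rolling mean by more than threshold x."""
    buffer = []
    for value in readings:
        if len(buffer) == window:
            baseline = mean(buffer)
            if baseline and value > threshold * baseline:
                yield value          # anomaly: worth sending upstream
            buffer.pop(0)
        buffer.append(value)

vibration = [1.0, 1.1, 0.9, 1.0, 1.1, 5.2, 1.0]  # simulated sensor feed
print(list(analyze_at_edge(vibration)))           # → [5.2]
```

Only the single anomalous reading leaves the edge node; the routine baseline traffic never consumes wide-area bandwidth, which is the economic point of converging capture and analysis in one box.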

"The intelligent edge is the new frontier of the hybrid computing world," said Whitman. "The edge of the network is becoming a very crowded place, but these devices need to be made more useful."

This means that the equivalent of a big data crunching data center needs to be brought to the edge affordably.

Biggest of big data

"IoT is the biggest of big data," said Tom Bradicich, HPE Vice President and General Manager, Servers and IoT Systems. "HPE Edgeline and [partner company] PTC help bridge the digital and physical worlds for IoT and augmented reality (AR) for fully automated assembly lines."

IoT and data analysis at the edge help companies predict the future, heading off failures and maintenance needs in advance. And the ROI on edge computing will be easy to prove when factory downtime can be greatly reduced using IoT, data analysis, and AR at the edge.

Along these lines, Citrix, together with HPE, has developed a new architecture around HPE Edgeline EL4000 with XenApp, XenDesktop, and XenServer to allow graphically rich, high-performance applications to be deployed right at the edge. They're now working together on next-generation IoT solutions that bring together the HPE Edge IT and Citrix Workspace IoT strategies.

In related news, SUSE has entered into an agreement with HPE to acquire technology and talent that will expand SUSE's OpenStack infrastructure-as-a-service (IaaS) solution and accelerate SUSE's entry into the growing Cloud Foundry platform-as-a-service (PaaS) market.

The acquired OpenStack assets will be integrated into SUSE OpenStack Cloud, and the acquired Cloud Foundry and PaaS assets will enable SUSE to bring to market a certified, enterprise-ready SUSE Cloud Foundry PaaS solution for all customers and partners in the SUSE ecosystem.

As part of the transaction, HPE has named SUSE as its preferred open source partner for Linux, OpenStack IaaS, and Cloud Foundry PaaS.

HPE also put force behind its drive to make high performance computing (HPC) a growing part of enterprise data centers and private clouds. Hot on the heels of buying SGI, HPE has recognized that public clouds leave little room for those workloads that do not perform best in virtual machines.

Indeed, if all companies buy their IT from public clouds, they have little performance advantage over one another. But many companies want to gain the best systems with the best performance for the workloads that give them advantage, and which run the most complex -- and perhaps value-creating -- applications. I predict that HPC will be a big driver for HPE, both in private cloud implementations and in supporting technical differentiation for HPE customers and partners.

Memory-driven computing

Computer architecture took a giant leap forward with the announcement that HPE has successfully demonstrated memory-driven computing, a concept that puts memory, not processing, at the center of the computing platform to realize performance and efficiency gains not possible today.

Developed as part of The Machine research program, HPE's proof-of-concept prototype represents a major milestone in the company's efforts to transform the fundamental architecture on which all computers have been built for the past 60 years.

Gartner predicts that by 2020, the number of connected devices will reach 20.8 billion and generate an unprecedented volume of data, which is growing at a faster rate than the ability to process, store, manage, and secure it with existing computing architectures.

"We have achieved a major milestone with The Machine research project -- one of the largest and most complex research projects in our company's history," said Antonio Neri, Executive Vice President and General Manager of the Enterprise Group at HPE. "With this prototype, we have demonstrated the potential of memory-driven computing and also opened the door to immediate innovation. Our customers and the industry as a whole can expect to benefit from these advancements as we continue our pursuit of game-changing technologies."

The proof-of-concept prototype, which was brought online in October, shows the fundamental building blocks of the new architecture working together, just as they had been designed by researchers at HPE and its research arm, Hewlett Packard Labs. HPE has demonstrated:
  • Compute nodes accessing a shared pool of fabric-attached memory
  • An optimized Linux-based operating system (OS) running on a customized system on a chip (SOC)
  • Photonics/optical communication links, including the new X1 photonics module, online and operational
  • New software programming tools designed to take advantage of abundant persistent memory.
During the design phase of the prototype, simulations predicted the speed of this architecture would improve current computing by multiple orders of magnitude. The company has run new software programming tools on existing products, illustrating improved execution speeds of up to 8,000 times on a variety of workloads. HPE expects to achieve similar results as it expands the capacity of the prototype with more nodes and memory.

In addition to bringing added capacity online, The Machine research project will increase focus on exascale computing. Exascale is a developing area of HPC that aims to create computers several orders of magnitude more powerful than any system online today. HPE's memory-driven computing architecture is incredibly scalable, from tiny IoT devices to the exascale, making it an ideal foundation for a wide range of emerging high-performance compute and data intensive workloads, including big data analytics.

Commercialization

HPE says it is committed to rapidly commercializing the technologies developed under The Machine research project into new and existing products. These technologies currently fall into four categories: Non-volatile memory, fabric (including photonics), ecosystem enablement and security.

Martin Banks, writing in Diginomica, questions whether these new technologies and new architectures represent a new beginning or a last hurrah for HPE. He poses the question to David Chalmers, HPE's Chief Technologist in EMEA, and Chalmers explains HPE's roadmap.

The conclusion? Banks feels that the in-memory architecture has the potential to be the next big step that IT takes. If all the pieces fall into place, Banks says, "There could soon be available a wide range of machines at price points that make fast, high-throughput systems the next obvious choice ... this could be the foundation for a whole range of new software innovations."

Storage initiative

HPE lastly announced a new initiative to address demand for flexible storage consumption models, accelerate all-flash data center adoption, assure the right level of resiliency, and help customers transform to a hybrid IT infrastructure.

Over the past several years, the industry has seen flash storage rapidly evolve from niche application performance accelerator to the default media for critical workloads. During this time, HPE's 3PAR StoreServ Storage platform has emerged as a leader in all-flash array market share growth, performance, and economics. The new HPE 3PAR Flash Now initiative gives customers a way to acquire this leading all-flash technology on-premises starting at $0.03 per usable Gigabyte per month, a fraction of the cost of public cloud solutions.
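To put the quoted price point in perspective, a quick back-of-the-envelope calculation helps. The $0.03-per-usable-GB-per-month rate is the figure quoted above; the capacity and the public-cloud comparison rate in this sketch are assumptions for illustration only:

```python
# Back-of-the-envelope cost comparison for the quoted HPE 3PAR Flash Now
# rate of $0.03 per usable GB per month. The public-cloud rate below is a
# hypothetical placeholder for comparison, not a quoted figure.

def monthly_storage_cost(usable_tb: float, rate_per_gb: float) -> float:
    """Return the monthly cost in dollars for a given usable capacity."""
    return usable_tb * 1000 * rate_per_gb  # 1 TB = 1,000 GB (decimal)

capacity_tb = 100  # example: a 100 TB all-flash deployment
on_prem = monthly_storage_cost(capacity_tb, 0.03)  # quoted Flash Now rate
cloud = monthly_storage_cost(capacity_tb, 0.10)    # assumed cloud SSD rate

print(f"On-premises: ${on_prem:,.0f}/month")           # → On-premises: $3,000/month
print(f"Public cloud (assumed): ${cloud:,.0f}/month")  # → Public cloud (assumed): $10,000/month
```

At the assumed cloud rate, the quoted on-premises figure would come to less than a third of the cost for the same usable capacity, which is the kind of gap the "fraction of the cost" claim implies.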

"Capitalizing on digital disruption requires that customers be able to flexibly consume new technologies," said Bill Philbin, vice president and general manager, Storage, Hewlett Packard Enterprise. "Helping customers benefit from both technology and consumption flexibility is at the heart of HPE's innovation agenda."

Whitman's HPE, given all of the news at HPE Discover, has assembled the right business path to place HPE and its ecosystems of partners and alliances squarely at the very center of the major IT trends of the next five years.

Indeed, I've been attending HPE Discover conferences for more than 10 years now, and this keynote address and the news made more sense for the current and future IT market than any I've seen.


Monday, November 21, 2016

Meet George Jetson – your new AI-empowered chief procurement officer

The next BriefingsDirect technology innovation thought leadership discussion explores how rapid advances in artificial intelligence (AI) and machine learning are poised to reshape procurement -- like a fast-forwarding to a once-fanciful vision of the future.

Whereas George Jetson of the 1960s cartoon portrayed a world of household robots, flying cars, and push-button corporate jobs -- the 2017 procurement landscape has its own impressive retinue of decision bots, automated processes, and data-driven insights.

We won’t need to wait long for this vision of futuristic business to arrive. As we enter 2017, applied intelligence derived from entirely new data analysis benefits has redefined productivity and provided business leaders with unprecedented tools for managing procurement, supply chains, and continuity risks.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about the future of predictive -- and even proactive -- procurement technologies, please welcome Chris Haydon, Chief Strategy Officer at SAP Ariba. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: It seems like only yesterday that we were content to gain a common view of the customer or develop an end-to-end bead on a single business process. These were our goals in refining business in general, but today we've leapfrogged to a future where we're using words like “predictive” and “proactive” to define what business function should do and be about. Chris, what's altered our reality to account for this rapid advancement from visibility into predictive -- and on to proactive?

Haydon: There are a couple of things. The acceleration of the smarts, the intelligence, or the artificial intelligence, whatever the terminology that you identify with, has really exploded. It’s a lot more real, and you see these use-cases on television all the time. The business world is just looking to go in and adopt that.

And then there's this notion of the Lego block -- being able to string multiple processes together via an API -- which is really exciting, coupled with the ability to have insight. The last piece, the ability to make sense of big data, either from a visualization perspective or from a machine-learning perspective, has accelerated things.

These trends are starting to come together in the business-to-business (B2B) world, and today, we're seeing them manifest themselves in procurement.

Gardner: What is it about procurement as a function that’s especially ripe for taking advantage of these technologies?

Transaction intense

Haydon: Procurement is obviously very transaction-intense. Historically, transaction intensity has meant people, processing, and exceptions. The trends we're talking about now -- the ability to componentize services, and the ability to apply big data and machine learning on top of it -- contextualize the intelligence. It's cognitive and predictive by its very nature, draws on a bigger data set, and [improves] historically inefficient human-based processes. That's why procurement is starting to be at the forefront.


Gardner: Procurement itself has changed from the days of when we were highly vertically integrated as corporations. We had long lead times on product cycles and fulfillment. Nowadays, it’s all about agility and compressing the time across the board. So, procurement has elevated its position. Anything more to add?

Haydon: Everyone needs to be closer to the customer, and you need live business. So, procurement is live now. This change in dynamic -- speed and responsiveness -- is closer to your point. It’s also these other dimensions of the consumer experience that now has to be the business-to-business experience. All that means same-day shipping, real-time visibility, and changing dynamically. That's what we have to deliver.

Gardner: If we go back to our George Jetson reference, what is it about this coming year, 2017? Do you think it's an important inception point when it comes to factoring things like the rising role of procurement, the rising role of analytics, and the fact that the Internet of Things (IoT) is going to bring more relevant data to bear? Why now?

Haydon: There are a couple of things. The procurement function is becoming more mature. Procurement leaders have extracted those first and second levels of savings from sourcing and the like. And they have control of their processes.

With cloud-based technologies and more of control of their processes, they're looking now to how they're going to serve their internal customers by being value-generators and risk-reducers.

How do you forward the business, how do you de-risk, how do you get supply continuity, how do you protect your brand? You do that by having better insight, real-time insight into your supply base, and that’s what’s driving this investment.

Gardner: We've been talking about Ariba being a 20-year-old company. Congratulations on your anniversary of 20 years.

Haydon: Thank you.

AI and bots

Gardner: You're also, of course, part of SAP. Not only have you been focused on procurement for 20 years, but you've got a large global player with lots of other technologies and platform of benefits to avail yourselves of. So, that brings me to the point of AI and bots.

It seems to me that right at the time when procurement needs help, when procurement is more important than ever, that we're also in a position technically to start doing some innovative things that get us into those words "predictive" and more "intelligent."

Set the stage for how these things come together.

Haydon: You allude to being part of SAP, and that's really a great strength and advantage for a domain-focused procurement expertise company.

The machine-learning capabilities that are part of a native SAP HANA platform, which we naturally adopt and get access to, put us on the forefront of not having to invest in that part of the platform, but to focus on how we take that platform and put it into the context of procurement.

There are a couple of pretty obvious areas. There's no doubt that when you've got the largest B2B network, billions in spend, and hundreds of millions of transactions on invoicing, you can apply some machine learning to that. We can start doing a lot smarter matching and exception management -- pretty straightforward. That's at one end of the chain.

On the other end of the chain, we have bots. Some people get a little bit wired about the word “bot,” “robotics,” or whatever, maybe it's a digital assistant or it's a smart app. But, it's this notion of helping with decisions, helping with real-time decisions, whether it's identifying a new source of supply because there's a problem, and the problem is identified because you’ve got a live network. It's saying that you have a risk or you have a continuity problem, and not just that it's happening, but here's an alternative, here are other sources of a qualified supply.

Gardner: So, it strikes me that 2017 is such a pivotal year in business. This is the year where we're going to start to really define what machines do well, and what people do well, and not to confuse them. What is it about an end-to-end process in procurement that the machine can do better that we can then elevate the value in the decision-making process of the people?

Haydon: Machines are better at identifying patterns -- clusters, if you want to use a more technical word. That transforms category management and enables procurement to be at the forefront with its internal customers by looking not just at traditional total cost of ownership (TCO), but at total value and use. That's part of that real dynamic change.

What we call end-to-end, or even what SAP Ariba defined in a very loose way when we talked about upstream, it was about outsourcing and contracting, and downstream was about procurement, purchasing, and invoicing. That's gone, Dana. It's not about upstream and downstream, it's about end-to-end process, and re-imagining and reinventing that.

The role of people

Gardner: When we give more power to a procurement professional by having highly elevated and intelligent tools, their role within the organization advances and the amount of improvement they can make financially advances. But I wonder where there's risk if we automate too much and whether companies might be thinking that they still want people in charge of these decisions. Where do we begin experimenting with how much automation to bring, now that we know how capable these machines have been, or is this going to be a period of exploration for the next few years?

Haydon: It will be a period of exploration, just because businesses have different risk tolerances and there are actually different parts of their life cycle. If you're in a hyper growth mode and you're pretty profitable, that's a little bit different than if you're under a very big margin pressure.

For example, maybe if you're in high tech in the Silicon Valley, and some big names that we could all talk about are, you're prepared to be able to go at it, and let it all come.

If you're in a natural-resource environment, every dollar is even more precious than it was a year ago.

That’s also the beauty, though, with technology. If you want to do it for this category, this supplier, this business unit, or this division you can do that a lot easier than ever before and so you go on a journey.

Gardner: That’s an important point that people might not appreciate, that there's a tolerance for your appetite for automation, intelligence, using machine learning, and AI. They might even change, given the context of the certain procurement activity you're doing within the same company. Maybe you could help people who are a little bit leery of this, thinking that they're losing control. It sounds to me like they're actually gaining more control.

Haydon: They gain more control, because they can do more and see more. To me, it’s layered. Does the first bot automatically requisition something -- yes or no? So, you put tolerances on it. I'm okay to do it if it is less than $50,000, $5,000, or whatever the limit is, and it's very simple. If the event is less than $5,000 and it’s within one percent of the last time I did it, go and do it. But tell me that you are going to do it or let’s have a cooling-off period.

If you don't tell me or if you don’t stop me, I'm going to do it, and that’s the little bit of this predictive as well. So you still control the gate, you just don’t have to be involved in all the sub-processes and all that stuff to get to the gate. That’s interesting.
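The tolerance gate Haydon describes -- auto-execute a requisition only when it is small and close to the last price paid, otherwise notify a human and wait out a cooling-off period -- can be sketched roughly as follows. The dollar limit, drift tolerance, field names, and decision labels are illustrative assumptions, not SAP Ariba's API:

```python
# Hypothetical sketch of the requisition "gate" described above: the bot
# auto-executes only when the amount is under a tolerance and close to
# the last price paid; otherwise it defers to a human during a
# cooling-off period. Not SAP Ariba code; all names are invented.

from dataclasses import dataclass

@dataclass
class Requisition:
    amount: float       # requested spend in dollars
    last_amount: float  # what the same item cost last time

def bot_decision(req: Requisition,
                 limit: float = 5_000.0,
                 drift_tolerance: float = 0.01) -> str:
    """Return 'auto-execute', or 'notify-and-wait' for human review."""
    within_limit = req.amount < limit
    drift = abs(req.amount - req.last_amount) / req.last_amount
    if within_limit and drift <= drift_tolerance:
        return "auto-execute"    # small and familiar: bot proceeds
    return "notify-and-wait"     # cooling-off period; a human can stop it

print(bot_decision(Requisition(amount=1_010.0, last_amount=1_000.0)))
# → auto-execute (under $5,000 and within 1% of the last price)
print(bot_decision(Requisition(amount=9_000.0, last_amount=9_000.0)))
# → notify-and-wait (over the limit, so a human stays in the loop)
```

The human still controls the gate, exactly as Haydon says: the tolerances are policy, and everything outside them escalates rather than executes.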

Gardner: What's interesting to me as well, Chris, is that because data is such a core element of how successful this is, companies on a procurement intelligence drive will want more data, so they can make better decisions. Suppliers who want to be competitive in that environment will naturally be incentivized to provide more data, more quickly, with more openness. Tell us some of the implications of intelligence brought to procurement for the supplier. What should we expect suppliers to do differently as a result?

Notion of content

Haydon: There's no doubt that, at a couple of levels, suppliers will need to let the buyers know even more about themselves than they have ever known before.

That goes to the notion of content. There is unique content to be discovered: who am I, what do I do well, and how do I demonstrate that I do well? That's being discovered. Then, there is the notion of being able to transact: what do I need to do to transact with you efficiently, whether that's a payment, a bank account, or just the way in which I can consume this?

Then, there is also this last notion of the content. What content do I need to be able to provide to my customer, aka the end user, for them to be able to initiate the business with them?

These three dimensions -- being discovered, being dynamically transacted with, and providing the content of what you do, even as a material or service, to the end user via the channel -- all have to be right.

That’s why we fundamentally believe that a network-based approach, when it's end to end, meaning a supplier can do it once to all of the customers across the [Ariba] Discovery channel, across the transactional channel, across the content channel is really value adding. In a digital economy, that's the only way to do it.

Gardner: So this idea of the business network, which is a virtual repository for all of this information isn't just quantity, but it's really about the quality of the relationship. We hear about different business networks vying for attention. It seems to me that understanding that quality aspect is something you shouldn't lose track of.

Haydon: It’s the quality. It’s also the context of the business process. If you don't have the context of the business process between a buyer and a seller and what they are trying to affect through the network, how does it add value? The leading-practice networks, and we're a leading-practice network, are thinking about Discovery. We're thinking about content; we're thinking about transactions.

Gardner: Again, going back to the George Jetson view of the future: for organizations that want to see a return on their energy and devotion to these concepts around AI, bots, and intelligence, what sort of low-hanging fruit should we look for to assure them they're on the right path? I'm going to answer my own question, but I want you to illustrate it a bit better: risk and compliance, and being able to adjust to unforeseen circumstances, seem to me an immediate payoff.

Severance of pleadings

Haydon: The United Kingdom is enacting a law before the end of the year for severance of pleadings. It’s the law, and you have to comply. The real question is how you comply.

You eye your brand, you eye your supply chain, and having the supply-chain profile information at hand right now is top of mind. If you're a Chief Procurement Officer (CPO) and you walk into the CEO’s office, the CEO could ask, "Can you tell me that I don’t have any forced labor, I don’t have any denied parties, and I'm Office of Foreign Assets Control (OFAC) compliant? Can you tell me that now?"

You might be able to do it for your top 50 suppliers or top 100 suppliers, and that’s great, but unfortunately, a small, $2,000 supplier who uses some forced labor in any part of the world is potentially a problem in this extended supply chain. We've seen brands boycotted very quickly. These things roll.

So yes, I think that’s just right at the forefront. Then, it's applying intelligence to that to give that risk threshold and to think about where those challenges are. It's being smart and saying, "Here is a high risk category. Look at this category first and all the suppliers in the category. We're not saying that the suppliers are bad, but you better have a double or triple look at that, because you're at high risk just because of the nature of the category."
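That category-first triage could be sketched as a simple prioritization of suppliers by the inherent risk of their category rather than by their spend, which is why the small supplier can outrank the large one in the review queue. The categories, risk weights, and suppliers below are invented for illustration:

```python
# Illustrative triage of a supplier list by category risk, per the idea
# above: review suppliers in inherently risky categories first,
# regardless of supplier size. Categories, weights, and supplier names
# are all made up for this sketch.

CATEGORY_RISK = {           # assumed inherent risk by spend category
    "apparel": 0.9,         # e.g. higher forced-labor exposure
    "raw-materials": 0.7,
    "office-supplies": 0.2,
}

suppliers = [
    {"name": "Acme Textiles", "category": "apparel", "spend": 2_000},
    {"name": "Globex Mining", "category": "raw-materials", "spend": 500_000},
    {"name": "Initech Paper", "category": "office-supplies", "spend": 90_000},
]

# Sort by category risk, not spend: a $2,000 apparel supplier outranks a
# $90,000 office-supplies supplier for review priority.
review_queue = sorted(suppliers,
                      key=lambda s: CATEGORY_RISK[s["category"]],
                      reverse=True)

print([s["name"] for s in review_queue])
# → ['Acme Textiles', 'Globex Mining', 'Initech Paper']
```

A real system would weigh many more signals (geography, audit history, certifications), but the ordering principle is the one Haydon describes: category risk drives where you look first.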

Gardner: Technically, what should organizations be thinking about in terms of what they have in place in order for their systems and processes to take advantage of these business network intelligence values? If I'm intrigued by this concept, if I see the benefits in reducing risk and additional efficiency, what might I be thinking about in terms of my own architecture, my own technologies in order to be in the best position to take advantage of this?

Haydon: You have to question how much of that you think you can build yourself. If you think you're asking different questions than most of your competitors, you're probably not. I'm sure there are specific categories and specific areas on tight supplier relationships and co-innovation development, but when it comes to the core risk questions, more often, they're about an industry, a geography, or the intersection of both.

Our recommendation to corporations is never try and build it yourself. You might need to have some degree of privacy, but look to have it as more industry-based. Think larger than yourself in trying to solve that problem differently. Those cloud deployment models really help you.

Gardner: So it really is less about technical preparation than about being a digital organization: availing yourself of cloud models, being ready to act intelligently, and finding the right demarcation between what machines do best and what people do best.

More visible

Haydon: By making things digital, they actually become more visible. You have to be able to harness that visibility to get at the product; that's the first step. That’s why people are on a digital journey.

Gardner: Machines can’t help you with a paper-based process, right?

Haydon: Not as much. You have to scan it and throw it in. Then you're digitizing it.

Gardner: We heard about Guided Buying last year from SAP Ariba. It sounds like we're going to be getting a sort of "Guided Buying-Plus" next year and we should keep an eye on that.

Haydon: We're very excited. We announced that earlier this year. We're trying to solve two problems quickly through Guided Buying.

One is the nature of the ad-hoc user. We're all ad-hoc users in the business today. I need to buy things, but I don’t want to read the policy, and I don’t want to open the PDF on some corporate portal about some threshold limit that, quite honestly, I only need to know about once or twice a year.

So our Guided Buying has a beautiful consumer-based look and feel, but with embedded compliance. We hide the complexity. We just show the user what they need to know at the time, and the flow is very powerful.

Listen to the podcast. Find it on iTunes. Get the mobile app. Download the transcript. Sponsor: SAP Ariba.

You may also be interested in:

Friday, November 18, 2016

Strategic view across more data delivers digital business boost for AmeriPride

The next BriefingsDirect Voice of the Customer digital transformation case study explores how linen services industry leader AmeriPride Services uses big data to gain a competitive and comprehensive overview of its operations, finances and culture.

We’ll explore how improved data analytics allows for disparate company divisions and organizations to come under a single umbrella -- to become more aligned -- and to act as a whole greater than the sum of the parts. This is truly the path to a digital business.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

Here to describe how digital transformation has been supported by innovations at the big data core, we’re joined by Steven John, CIO, and Tony Ordner, Information Team Manager, both at AmeriPride Services in Minnetonka, Minnesota. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Let’s discuss your path to being a more digitally transformed organization. What were the requirements that led you to become more data-driven, more comprehensive, and more inclusive in managing your large, complex organization?

John: One of the key business drivers for us was that we're a company in transition -- from a very diverse organization to a very centralized organization. Before, it wasn't necessarily important for us to speak the same data language, but now it's critical. We’re developing the lexicon, the Rosetta Stone, that we can all rely on and use to make sure that we're aligned and heading in the same direction.

Gardner: And Tony, when we say “data,” are we talking about just databases and data within applications? Or are we being even more comprehensive -- across as many information types as we can?

Ordner: It’s across all of the different information types. When we embarked on this journey, we discovered that data itself is great to have, but you also have to have processes that are defined in a similar fashion. You really have to drive business change in order to be able to effectively utilize that data, analyze where you're going, and then use that to drive the business. We're trying to institute into this organization an iterative process of learning.

Gardner: For those who are not familiar with AmeriPride Services, tell us about the company. It’s been around for quite a while. What do you do, and how big of an umbrella organization are we talking about?

Long-term investments

John: The company is over 125 years old. It’s family-owned, which is nice, because we're not driven by the quarter. We can make longer-term investments through the family. We can have more of a future view and have ambition to drive change in different ways than a quarter-by-quarter corporation does.

We're in the laundry business. We're in the textiles and linen business. What that means is that for food and beverage, we handle tablecloths, napkins, chef coats, aprons, and those types of things. In oil and gas, we provide the safety garments that are required. We also provide the mats you cross as you walk in the door of various restaurants or retail stores. We're in healthcare facilities and meet the various needs of providing and cleansing the garments and linens coming out of those institutions. We're very diverse. We're the largest company of our kind in Canada, probably about fourth in the US, and growing.
Gardner: And this is a function that many companies don't view as core and they're very happy to outsource it. However, you need to remain competitive in a dynamic world. There's a lot of innovation going on. We've seen disruption in the taxicab industry and the hospitality industry. Many companies are saying, “We don’t want to be a deer in the headlights; we need to get out in front of this.”

Tony, how do you continue to get in front of this, not just at the data level, but also at the cultural level?

Ordner: Part of what we're doing is defining those standards across the company. And we're coming up with new programs and new ways to get in front and to partner with the customers.

As part of our initiative, we're installing a lot of different technology pieces that we can use to be right there with the customers, to make changes with them as partners, and maybe better understand their business and the products that they aren't buying from us today that we can provide. We’re really trying to build that partnership with customers, provide them more ways to access our products, and devise other ways they might not have thought of for using our products and services.

With all of those data points, it allows us to do a much better job.

Gardner: And we have heard from Hewlett Packard Enterprise (HPE) the concept that it's the “analytics that are at the core of the organization,” that then drive innovation and drive better operations. Is that something you subscribe to, and is that part of your thinking?

John: For me, you have to extend it a little bit further. In the past, our company was driven by the experience and judgment of the leadership. But what we discovered is that we really wanted to be more data-driven in our decision-making.

Data creates a context for conversation. In the context of their judgment and experience, our leaders can leverage that data to make better decisions. The data, in and of itself, doesn’t drive the decisions -- it's that experience and judgment of the leadership that's that final filter.

We often forget the human element at the end of that and think that everything is being driven by analytics, when analytics is a tool and will remain a tool that helps leaders lead great companies.

Gardner: Steven, tell us about your background. You were at a startup, a very successful one, on the leading edge of how to do things differently when it comes to apps, data, and cloud delivery.

New ways to innovate

John: Yes, you're referring to Workday. I was actually Workday’s 33rd customer, the first to go global with their product. Then, I joined Workday in two roles: as their Strategic CIO, working very closely with the sales force, helping CIOs understand the cloud and how to manage software as a service (SaaS); and also as their VP of Mid-Market Services, where we were developing new ways to innovate, to implement in different ways and much more rapidly.

And it was a great experience. I've done two things in my life, startups and turnarounds, and I thought that I was kind of stepping back and taking a relaxing job with AmeriPride. But in many ways, it's both; AmeriPride’s both a turnaround and a startup, and I'm really enjoying the experience.

Gardner: Let’s hear about how you translate technology advancement into business advancement. I ask it that way because it seems a bit of a chicken-and-egg problem: they need to be done in parallel -- strategy, ops, culture, as well as technology. How are you balancing that difficult equation?

John: Let me give you an example. Again, it goes back to that idea of, if you just have the human element, they may not know what to ask, but when you add the analytics, then you suddenly create a set of questions that drive to a truth.
We're a route-based business. We have over 1,000 trucks out there delivering our products every day. When we started looking at margin, we discovered that our greatest margin came from those customers that were within a mile of another customer.

Factoring that in changes how we sell, how we don't sell, and how we might actually let some customers go -- and it helps drive up our margin. You have that piece of data, and suddenly we as leaders knew different questions to ask and different ways to orchestrate programs to drive higher margin.
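The proximity analysis John describes can be sketched in a few lines. This is an illustrative example, not AmeriPride's actual implementation; the function names, the sample coordinates, and the one-mile threshold are assumptions for demonstration.

```python
from math import radians, sin, cos, asin, sqrt

def miles_between(a, b):
    """Great-circle (haversine) distance in miles between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(h))  # mean Earth radius ~3,959 miles

def near_another_customer(customers, radius=1.0):
    """Return the IDs of customers within `radius` miles of some other customer."""
    near = set()
    for cid, loc in customers.items():
        for other, oloc in customers.items():
            if other != cid and miles_between(loc, oloc) <= radius:
                near.add(cid)
                break
    return near
```

Joining the resulting flags against per-customer margin data is then an ordinary aggregation step in whatever analytics platform is in use.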

Gardner: Another trend we've seen is that putting data and analytics, very powerful tools, in the hands of more people can have unintended, often very positive, consequences. A knowledge worker isn't just in a cube and in front of a computer screen. They're often in the trenches doing the real physical work, and so can have real process insights. Has that kicked in yet at AmeriPride, and are you democratizing analytics?

Ordner: That’s a really great question. We've been trying to build a power-user base and bring some of these capabilities into the business segments to allow them to explore the data.

You always have to keep an eye on knowledge workers, because sometimes they can come to the wrong conclusions, as well as the right ones. So it's trying to make sure that we maintain that business layer, that final check. It's like, the data is telling me this, is that really where it is?

I liken it to having a flashlight in a dark room. That’s what we are really doing with visualizing this data and allowing them to eliminate certain things, and that's how they can raise the questions, what's in this room? Well, let me look over here, let me look over there. That’s how I see that.

Too much information

John: One of the things I worry about is that if you give people too much information or unstructured information, then they really get caught up in the academics of the information -- and it doesn’t necessarily drive a business process or drive a business result. It can cause people to get lost in the weeds of all that data.

You still have to orchestrate it, you still have to manage it, and you have to guide it. But you have to let people go off and play and innovate using the data. We actually have a competition among our power-users where they go out and create something, and there are judges and prizes. So we do try to encourage the innovation, but we also want to hold the reins in just a little bit.

Gardner: And that gets to the point of having a tight association between what goes on in the core and what goes on at the edge. Is that something that you're dabbling in as well?

John: It gets back to that idea of a common lexicon. If you think about evolution, you don't want a Madagascar or a Tasmania, where groups get cut off and then they develop their own truth, or a different truth, or they interpret data in a different way -- where they create their own definition of revenue, or they create their own definition of customer.

If you think about it as orbits, you have to have a balance. Maybe you only need to touch certain people in the outer orbit once a month, but you have to touch them once a month to make sure they're connected. The thing about orbits and keeping people in the proper orbits is that if you don't, then one of two things happens, based on gravity. They either spin out of orbit or they come crashing in. The idea is to figure out what's the right balance for the right groups to keep them aligned with where we are going, what the data means, and how we're using it, and how often.

Gardner: Let’s get back to the ability to pull together the data from disparate environments. I imagine, like many organizations, that you have SaaS apps -- maybe for human capital management, maybe for sales management. How does that data then get brought to bear with internal apps, some of which may even be on a mainframe still, or virtualized apps from older code bases and so forth? What’s the hurdle, and what words of wisdom might you impart to others who are earlier in this journey about how to make all that data common and usable?

Ordner: That tends to be a hurdle. On the data-acquisition piece, as you set these things up in the cloud, a lot of the time the business units themselves are making the agreements. They don't put into place the data access that we've always needed. That’s been our biggest hurdle. They'll sign the contracts without getting us involved, until they say, "Oh my gosh, now we need the data." We look at it and we say, "Well, it’s not in our contracts, and now it’s going to cost more to access the data." That’s been our biggest hurdle for the cloud services that we've done.

Once you get past that, web services have been a great thing. Once you get the licensing and the contract in place, it becomes a very simple process, and it becomes a lot more seamless.

Gardner: So, maybe something to keep in mind is always think about the data before, during, and after your involvement with any acquisition, any contract, and any vendor?

Ordner: Absolutely.

You own three things

John: With SaaS, at the end of the day, you own three things: the process design, the data, and the integration points. When we construct a contract, one of the things I always insist upon is what I refer to as the “prenuptial agreement.”

What that simply means is, before the relationship begins, you understand how it can end. The key thing in how it ends is that you can take your data with you, that it has a migration path, and that they haven't created a stickiness that traps you there and you don't have the ability to migrate your data to somebody else, whether that’s somebody else in the cloud or on-premise.

Gardner: All right, let’s talk about lessons learned in infrastructure. Clearly, you've had an opportunity to look at a variety of different platforms, different requirements that you have had, that you have tested and required for your vendors. What is it about HPE Vertica, for example, that is appealing to you, and how does that factor into some of these digital transformation issues?

Ordner: There are two things that come to mind right away for me. One is there were some performance implications. We were struggling with our old world and certain processes that ran 36 hours. We did a proof of concept with HPE and Vertica and that ran in something like 17 minutes. So, right there, we were sold on performance changes.

As we got into it and negotiated with them, the other big advantage we discovered is the licensing model, which is based on the amount of data rather than on the per-CPU-core model that everyone else runs. We're able to scale this and provide that service at high speed, so we can maintain that performance without taking penalties against licensing. Those are a couple of things I see. Anything from your end, Steven?

John: No, I think that was just brilliant.

Gardner: How about the acquisition and integration of data? Is there an issue there that you've been able to solve?

Ordner: With acquisition and integration, we're still early in that process. We're still learning about how to put data into HPE Vertica in the most effective manner. So, we're really at our first source of data and we're looking forward to those additional pieces. We have a number of different telematics pieces that we want to include; wash aisle telematics as well as in-vehicle telematics. We're looking forward to that.

There's also scan data that I think will soon be on the horizon. All of our garments and our mats have chips in them. We scan them in and out, so we can see the activity and where they flow through the system. Those are some of our next targets to bring that data in and take a look at that and analyze it, but we're still a little bit early in that process as far as multiple sources. We're looking forward to some of the different ways that Vertica will allow us to connect to those data sources.

Gardner: I suppose another important consideration when you are picking and choosing systems and platforms is that extensibility. RFID tags are important now; we're expecting even more sensors, more data coming from the edge, the information from the Internet of Things (IoT). You need to feel that the systems you're putting in place now will scale out and up. Any thoughts about the IoT impact on what you're up to?

Overcoming past sins

John: We have had several conversations just this week with HPE and their teams, and they are coming out to visit with us on that exact topic. Being about a year into our journey, we've been doing two things. We've been forming the foundation with HPE Vertica and we've been getting our own house in order. So, there's a fair amount of cleanup and overcoming the sins of the past as we go through that process.

But Vertica is a platform; it's a platform where we have only tapped a small percentage of its capability. And in my personal opinion, even HPE is only aware of a portion of its capability. There are a whole set of things that it can do, and I don’t believe that we have discovered all of them.
With that said, we're going to do what you and Tony just described; we're going to use the telematics coming out of our trucks. We're going to track safety and seat belts. We're going to track green initiatives, routes, and the analytics around our routes and fuel consumption. We're going to make the place safer, we're going to make it more efficient, and we're going to get proactive about being able to tell when a machine is going to fail and when to bring in our vendor partners to get it fixed before it disrupts production.

Gardner: It really sounds like there is virtually no part of your business in the laundry services industry that won't be in some way beneficially impacted by more data, better analytics delivered to more people. Is that fair?

Ordner: I think that’s a very fair statement. As I prepared for this conference, one of the things I realized -- and I have been with the company for 17 years -- is that we've made a lot of technology changes, and technology has taken on added significance within our company. When you think of laundry, you certainly don't think of technology, but we've been at the leading edge of implementing technology to get closer to our customers and closer to understanding our products.

[Data technology] has become really ingrained within the industry, at least at our company.

John: It is one of those few projects where everyone is united, everybody believes that success is possible, and everybody is willing to pay the price to make it happen.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in:

Monday, November 7, 2016

Swift and massive data classification advances score a win for better securing sensitive information

The next BriefingsDirect Voice of the Customer digital transformation case study explores how -- in an era when cybersecurity attacks are on the rise and enterprises and governments are increasingly vulnerable -- new data intelligence capabilities are being brought to the edge to provide better data loss prevention (DLP).

We'll learn how Digital Guardian in Waltham, Massachusetts analyzes both structured and unstructured data to predict and prevent loss of data and intellectual property (IP) with increased accuracy.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.
 
To learn how data recognition technology supports network and endpoint forensic insights for enhanced security and control, we're joined by Marcus Brown, Vice President of Corporate Business Development for Digital Guardian. The discussion is moderated by BriefingsDirect's Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What are some of the major trends making DLP even more important, and even more effective?

Brown: Data protection has very much come to the forefront in the last couple of years. Unfortunately, we wake up every morning and read in the newspapers, see on television, and hear on the radio a lot about data breaches. Pretty much every type of company and organization -- government organizations included -- is being hit by this phenomenon at the moment.

So, awareness is very high, and apart from the frequency, a couple of key points are changing. First of all, you have a lot of very skilled adversaries coming into this: criminals, nation-state actors, hacktivists, and many others. All these people are well-trained and very well-resourced to come after your data. That means that companies have a pretty big challenge in front of them. The threat has never been bigger.

In terms of data protection, there are a couple of key trends at the cyber-security level. People have been aware of the so-called insider threat for a long time. This could be a disgruntled employee or it could be someone who has been recruited for monetary gain to help some organization get to your data. That’s a difficult one, because the insider has all the privilege and the visibility and knows where the data is. So, that’s not a good thing.

Then, you have well-meaning employees who just make mistakes. It happens to all of us. We touch something in Outlook, pick a different email address than the one we intended, and it goes out. The well-meaning employees are part of the insider threat as well.

Outside threats

What’s really escalated over the last couple of years are the advanced external attackers or the outside threat, as we call it. These are well-resourced, well-trained people from nation-states or criminal organizations trying to break in from the outside. They do that with malware or phishing campaigns.

About 70 percent of the attacks start with a phishing campaign, when someone clicks on something that looked normal. Then, there's just general hacking, with a lot of people getting in without malware at all. They just hack straight in using different techniques that don’t rely on malware.

People have become so good at developing malware and targeting malware at particular organizations, at particular types of data, that a lot of tools like antivirus and intrusion prevention just don’t work very well. The success rate is very low. So, there are new technologies that are better at detecting stuff at the perimeter and on the endpoint, but it’s a tough time.

There are internal and external attackers. A lot of people outside are ultimately after the two main types of data that companies have. One is customer data -- credit card numbers, healthcare information, and all that stuff. All of this can be sold on the black market for so-and-so many dollars per record. It’s a billion-dollar business. People are very motivated to do this.
Most companies don’t want to lose their customers’ data. That’s seen as a pretty bad thing, a bad breach of trust, and people don’t like that. Then, obviously, for any company that has a product where you have IP, you spent lots of money developing that, whether it’s the new model of a car or some piece of electronics. It could be a movie, some new clothing, or whatever. It’s something that you have developed and it’s a secret IP. You don’t want that to get out, as well as all of your other internal information, whether it’s your financials, your plans, or your pricing. There are a lot of people going after both of those things, and that’s really the challenge.

In general, the world has become more mobile and spread out. There is no more perimeter to stop people from getting in. Everyone is everywhere, private life and work life is mixed, and you can access anything from anywhere. It’s a pretty big challenge.

Gardner: Even though there are so many different types of threats, internal, external, and so forth, one of the common things that we can do nowadays is get data to learn more about what we have as part of our inventory of important assets.

While we might not be able to seal off that perimeter, maybe we can limit the damage that takes place by early detection of problems. The earlier that an organization can detect that something is going on that shouldn’t be, the quicker they can come to the rescue. How does the instant analysis of data play a role in limiting negative outcomes?

Can't protect everything

Brown: If you want to protect something, you have to know it’s sensitive and that you want to protect it. You can’t protect everything. You've got to find which data is sensitive, and we're able to do that on-the-fly -- to recognize sensitive data and nonsensitive data. That’s a key part of the DLP puzzle, the data-protection puzzle.

We work for some pretty large organizations, some of the largest companies and government organizations in the world, as well as lot of medium- and smaller-sized customers. Whatever it is we're trying to protect, personal information or indeed the IP, we need to be in the right place to see what people are doing with that data.

Our solution consists of two main types of agents. Some agents are on endpoint computers, which could be desktops or servers, Windows, Linux, and Macintosh. It’s a good place to be on the endpoint computer, because that’s where people, particularly the insider, come into play and start doing something with data. That’s where people work. That’s how they come into the network and it’s how they handle a business process.

So the challenge in DLP is to support the business process. Let people do with data what they need to do, but don’t let that data get out. The way to do that is to be in the right place. I already mentioned the endpoint agent, but we also have network agents, sensors, and appliances in the network that can look at data moving around.

The endpoint is really in the middle of the business process. Someone is working; they're working with different applications, getting data out of those applications, and doing whatever they need to do in their daily work. That’s where we sit, right in the middle of that, and we can see who the user is and what application they're working with. It could be an engineer working with the computer-aided design (CAD) or product lifecycle management (PLM) system, developing some new automobile or whatever, and that’s a great place to be.

We rely very heavily on the HPE IDOL technology for helping us classify data. We use it particularly for structured data, anything like a credit card number, or alphanumeric data. It could be also free text about healthcare, patient information, and all this sort of stuff.

We use IDOL to help us scan documents. We can recognize regular expressions, that’s a credit card number type of thing, or Social Security. We can also recognize terminology. We rely on the fact that IDOL supports hundreds of languages and many different subject areas. So, using IDOL, we're able to recognize a whole lot of anything that’s written in textual language.
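As a rough illustration of the pattern-based classification Brown describes, here is a minimal Python sketch. It is not IDOL's actual API; the patterns, label names, and Luhn check are simplified assumptions showing how regular expressions plus a checksum can flag structured sensitive data while cutting false positives.

```python
import re

# Illustrative patterns only; a production engine uses far richer,
# language-aware classifiers across hundreds of languages.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def luhn_valid(number):
    """Luhn checksum, used to reject digit runs that are not real card numbers."""
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = sum(digits[0::2]) + sum(sum(divmod(2 * d, 10)) for d in digits[1::2])
    return total % 10 == 0

def classify(text):
    """Return the set of sensitive-data labels found in a piece of text."""
    labels = set()
    for label, pattern in PATTERNS.items():
        for match in pattern.finditer(text):
            # Only accept credit-card matches that pass the checksum.
            if label != "credit_card" or luhn_valid(match.group()):
                labels.add(label)
    return labels
```

A DLP agent would run a classifier like this over documents as they are created or moved, then attach the resulting labels as tags.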

Our endpoint agent also has some of its own intelligence built in that we put on top of what we call contextual recognition or contextual classification. As I said, we see the customer list coming out of Salesforce.com or we see the jet fighter design coming out of the PLM system and we then tag that as well. We're using IDOL, we're using some of our technology, and we're using our vantage point on the endpoint being in the business process to figure out what the data is.

We call that data-in-use monitoring and, once we see something is sensitive, we put a tag on it, and that tag travels with the data no matter where it goes.

An interesting thing is that if you have someone making a mistake -- an unintentional, well-meaning employee accidentally attaching the wrong doc to something that goes out -- obviously we will warn the user of that.

We can stop that

If you have someone who is very, very malicious and is trying to obfuscate what they're doing, we can see that as well. For example, taking a screenshot of some top-secret diagram, embedding that in a PowerPoint and then encrypting the PowerPoint, we're tagging those docs. Anything that results from IP or top-secret information, we keep tagging that. When the guy then goes to put it on a thumb drive, put it on Dropbox, or whatever, we see that and stop that.
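The tag-propagation behavior just described can be modeled in miniature. This is a toy sketch under assumed names (the `FileRegistry` class, the tag vocabulary, and the destination labels are all invented for illustration), not how Digital Guardian's endpoint agent is actually built; a real agent hooks file-system and application events at the OS level.

```python
# Tags considered sensitive enough to block from leaving the endpoint.
SENSITIVE_TAGS = {"top-secret", "ip"}

class FileRegistry:
    def __init__(self):
        self.tags = {}  # path -> set of tags

    def tag(self, path, *tags):
        self.tags.setdefault(path, set()).update(tags)

    def derive(self, src, dst):
        """Screenshot, copy, embed, or encrypt: the new file inherits the source's tags."""
        self.tag(dst, *self.tags.get(src, set()))

    def allow_egress(self, path, destination):
        """Block removable media and cloud sync for files carrying sensitive tags."""
        if self.tags.get(path, set()) & SENSITIVE_TAGS and destination in {"usb", "dropbox"}:
            return False
        return True
```

The key idea is that every derivation step carries the tags forward, so even a screenshot embedded in an encrypted PowerPoint still trips the egress check.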

So those are the two parts of the problem: classifying the data, which is where we rely on IDOL a lot, and stopping it from going out, which is what our agent is responsible for.

Gardner: Let’s talk a little bit about the results here, when behaviors, people and the organization are brought to bear together with technology, because it’s people, process and technology. When it becomes known in the organization that you can do this, I should think that that must be a fairly important step. How do we measure effectiveness when you start using a technology like Digital Guardian? Where does that become explained and known in the organization and what impact does that have?

Brown: Our whole approach is a risk-based approach and it’s based on visibility. You’ve got to be able to see the problem and then you can take steps and exercise control to stop the problems.
When you deploy our solution, you immediately gain a lot of visibility. I mentioned the endpoints and I mentioned the network. Basically, you get a snapshot without deploying any rules or configuring in any complex way. You just turn this on and you suddenly get this rich visibility, which is manifested in reports, trends, and all this stuff. What you get, after a very short period of time, is a set of reports that tell you what your risks are, and some of those risks may be that your HR information is being put on Dropbox.

You have engineers putting the source code onto thumb drives. It could all be well-meaning, they want to work on it at home or whatever, or it could be some bad guy.

One of the biggest points of risk in any company is when an employee resigns and decides to move on. A lot of our customers use the monitoring and reporting we have at that time to actually sit down with the employee and say, "We noticed that you downloaded 2,000 files and put them on a thumb drive. We’d like you to sign this saying that you're going to give us that data back."

That’s a typical use case, and that’s the visibility you get. You turn it on and you suddenly see all these risks, hopefully, not too many, but a certain number of risks and then you decide what you're going to do about it. In some areas you might want to be very draconian and say, "I'm not going to allow this. I'm going to completely block this. There is no reason why you should put the jet fighter design up on Dropbox."

Gardner: That’s where the epoxy in the USB drives comes in.

Warning people

Brown: Pretty much. On the other hand, you don’t want to stop people using USB, because it’s about their productivity, etc. So, you might want to warn people: if you're putting some financial data onto a thumb drive, we're going to encrypt that so nothing can happen to it, but do you really want to do this? Is this approach appropriate? People get a feeling that they're being monitored and that the way they're acting may not be according to company policy. So, they'll back out of it.
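The tiered controls Brown describes -- block outright, or allow but encrypt and warn -- can be sketched as a simple policy decision. This is a hypothetical illustration of the control model, not Digital Guardian's actual API; the classification and destination names are made up:

```python
from dataclasses import dataclass

@dataclass
class FileEvent:
    classification: str   # e.g. "design_secret", "financial", "public" (illustrative labels)
    destination: str      # e.g. "dropbox", "usb", "local"

def decide(event: FileEvent) -> str:
    """Return the control action for a data-movement event."""
    # Draconian rule: highly sensitive designs never go to cloud shares.
    if event.classification == "design_secret" and event.destination == "dropbox":
        return "block"
    # Financial data on removable media: allow, but encrypt it and prompt the user.
    if event.classification == "financial" and event.destination == "usb":
        return "encrypt_and_warn"
    return "allow"

print(decide(FileEvent("financial", "usb")))          # encrypt_and_warn
print(decide(FileEvent("design_secret", "dropbox")))  # block
```

The point of the "encrypt and warn" tier is exactly what Brown notes: the prompt itself changes behavior, because users realize the action is visible.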

In a nutshell, you look at the status quo, you put some controls in place, and after those controls are in place, within the space of a week, you suddenly see the risk posture changing, getting better, and the incidence of these dangerous actions dropping dramatically.

Very quickly, you can measure the security return on investment (ROI) in terms of people’s behavior and what’s happening. Our customers use that a lot internally to justify what they're doing.

Generally, you can get rid of a very large amount of the risk, say 90 percent, with an initial pass or two of rules to say, we don’t want this, we don’t want that. Then, you're monitoring the status, and suddenly, new things will happen. People discover new ways of doing things, and then you’ve got to put some controls in place, but you're pretty quickly up into the 90 percent, and then you're fine-tuning to get those last little bits of risk out.

Gardner: Because organizations are becoming increasingly data-driven, they're getting information and insight across their systems and their applications. Now, you're providing them with another data set that they could use. Is there some way that organizations are beginning to assimilate and analyze multiple data sets including what Digital Guardian’s agents are providing them in order to have even better analytics on what’s going on or how to prevent unpleasant activities?

Brown: In this security world, you have the security operations center (SOC), which is kind of the nerve center where everything to do with security comes into play. The main piece of technology in that area is the security information and event management (SIEM) technology. The market leader is HPE’s ArcSight, and that’s really where all of the many tools that security organizations use come together in one console, where all of that information can be looked at in a central place and can also be correlated.

We provide a lot of really interesting information for the SIEM for the SOC. I already mentioned we're on the endpoint and the network, particularly on the endpoint. That’s a bit of a blind spot for a lot of security organizations. They're traditionally looking at firewalls, other network devices, and this kind of stuff.

We provide rich information about the user, about the data, what’s going on with the data, and what’s going on with the system on the endpoint. That’s key for detecting malware, etc. We have all this rich visibility on the endpoint and also from the network. We actually pre-correlate that. We have our own correlation rules. On the endpoint computer in real time, we're correlating stuff. All of that gets populated into ArcSight.
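ArcSight and most SIEMs ingest events in the Common Event Format (CEF), a pipe-delimited header followed by key=value extensions. As a rough sketch of what feeding pre-correlated endpoint events into a SIEM looks like -- with an entirely hypothetical vendor, product, and event, not Digital Guardian's real integration -- a CEF line can be built like this:

```python
def to_cef(vendor: str, product: str, version: str,
           sig_id: str, name: str, severity: str, **ext) -> str:
    """Format an event as a CEF:0 message (header fields pipe-delimited,
    extensions as space-separated key=value pairs)."""
    extension = " ".join(f"{k}={v}" for k, v in ext.items())
    return f"CEF:0|{vendor}|{product}|{version}|{sig_id}|{name}|{severity}|{extension}"

# Hypothetical endpoint event: a user copies a file to removable media.
msg = to_cef("DemoVendor", "EndpointAgent", "1.0", "100",
             "File copied to removable media", "5",
             suser="jdoe", fname="plans.pdf", act="copy")
print(msg)
```

In practice such messages are shipped to the SIEM over syslog; `suser`, `fname`, and `act` are standard CEF extension keys for user, file name, and action.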

At the recent HPE Protect show in National Harbor in September, we showed the latest generation of our integration, which we're very excited about. We have a lot of ArcSight content, which helps people in the SOC leverage our data, and we gave a couple of presentations at the show on that.

Gardner: And is there a way to make this even more protected? I believe encryption could be brought to bear and it plays a role in how the SIEM can react and behave.

Seamless experience

Brown: We actually have a new partnership, related to HPE's acquisition of Voltage, which is a real leader in the e-mail security space. It’s all about applying encryption to messages and managing the keys and making that user experience very seamless and easy to use.

Adding to that, we're bundling up some of the classification functionality that we have in our network sensors. What we have is a combination of Digital Guardian Network DLP and the HPE Data Security encryption solution, where an enterprise can define a whole bunch of rules based on templates.

We can say, "I need to comply with HIPAA," "I need to comply with PCI," or whatever standard it is. Digital Guardian on the network will automatically scan all the e-mail going out and automatically classify according to our rules which e-mails are sensitive and which attachments are sensitive. It then goes on to the HPE Data Security Solution where it gets encrypted automatically and then sent out.

It’s basically allowing corporations to apply a standard set of policies -- not relying on the user to say they need to encrypt this, not leaving it to the user’s judgment, but actually applying standard policies across the enterprise for all e-mail, making sure it gets encrypted. We are very excited about it.
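The template-driven flow Brown describes -- scan outbound mail, classify what's sensitive under a standard like PCI, and route it to encryption automatically -- can be illustrated with a minimal classifier. This is a hypothetical sketch, not the product's actual rules: it flags text containing a card-number-shaped string that passes a Luhn check, the standard checksum used to validate payment card numbers:

```python
import re

# Loose pattern for 13-16 digit card-like sequences, allowing spaces/dashes.
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_ok(number: str) -> bool:
    """Luhn checksum: double every second digit from the right, sum digit sums."""
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = sum(digits[0::2]) + sum(sum(divmod(d * 2, 10)) for d in digits[1::2])
    return total % 10 == 0

def classify(text: str) -> str:
    """Route PCI-sensitive mail to encryption; pass everything else through."""
    for match in CARD_RE.finditer(text):
        if luhn_ok(match.group()):
            return "encrypt"     # hand off to the encryption gateway
    return "send_plain"
```

A real template set would cover many more patterns (HIPAA identifiers, account numbers, classified-document markers), but the shape is the same: classification happens on the network, and the encryption decision never depends on the sender remembering to ask for it.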
Gardner: That sounds key -- using encryption to the best of its potential, being smart about it, not just across the waterfront, and then not depending on a voluntary encryption, but doing it based on need and intelligence.
 
Brown: Exactly.

Gardner: For those organizations that are increasingly trying to be data-driven, intelligent, taking advantage of the technologies and doing analysis in new interesting ways, what advice might you offer in the realm of security? Clearly, we’ve heard at various conferences and other places that security is, in a sense, the killer application of big-data analytics. If you're an organization seeking to be more data-driven, how can you best use that to improve your security posture?

Brown: The key, as far as we’re concerned, is that you have to watch your data, you have to understand your data, you need to collect information, and you need visibility of your data.

The other key point is that the security market has been shifting pretty dramatically from more of a network view much more toward the endpoint. I mentioned earlier that antivirus and some of these standard technologies on the endpoint aren't really cutting it anymore. So, it’s very important that you get visibility down at the endpoint and you need to see what users are doing, you need to understand what your systems are running, and you need to understand where your data is.

So collect that, get that visibility, and then leverage that visibility with analytics and tools so that you can profit from an automated kind of intelligence.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.

You may also be interested in: