Wednesday, June 18, 2014

Big data meets the supply chain — SAP’s Supplier InfoNet and Ariba Network combine to predict supplier risk

The next BriefingsDirect case study interview explores how improved visibility, analytics, and predictive responses are improving supply-chain management. We’ll now learn how SAP’s Supplier InfoNet, coupled with the Ariba Network, allows for new levels of transparency and predictive analytics that reduce risk in supplier relationships.

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: Ariba, an SAP company.

BriefingsDirect had an opportunity to uncover more about how the intelligent supply chain is evolving at the recent 2014 Ariba LIVE Conference in Las Vegas when we spoke to David Charpie, Vice President of Supplier InfoNet at SAP, and Sundar Kamakshisundaram, Senior Director of Solutions Marketing at Ariba, an SAP company. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: We’ve brought two things together here, SAP’s Supplier InfoNet and Ariba Network. What is it about these two that gives us the ability to analyze or predict, and therefore reduce, risk?

Charpie: To be able to predict and understand risk, you have to have two major components together. One of them is actually understanding this multi-tiered supply chain. Who is doing business with whom, all the way down the line, from the customer to the raw material in a manufacturing sense? To do that you need to be able to bring together a very large graph, if you will, of how all these companies are inter-linked.

And that is ultimately what the Ariba Network brings to bear. With over 1.5 million companies that are inter-linked and transacting with each other, we can really see what those supply chains look like.

The second piece of it is to bring together, as Sundar talked about, lots of information of all kinds to be able to understand what’s happening at any point within that map. The kinds of information you need to understand are sometimes as simple as who is the company, what do they make, where are they located, what kind of political, geopolitical issues are they dealing with?

The more complex issues are things around precisely what exact product are they making with what kind of requirements, in terms of performance, and how they’re actually doing that on a customer-by-customer basis. What we find is that suppliers don’t behave the same for everybody.

So InfoNet and the network have come together to bring those two perspectives, all the data about how companies perform and what they are about with this interconnectedness of how companies work with each other. That really brings us to the full breadth of being able to address this issue about risk.

Gardner: Sundar, we have a depth of transactional history. We have data, we have relationships, and now we’re applying that to how supply chains actually behave and operate. How does this translate into actual information? How does the data go from your systems to someone who is trying to manage their business process?

Kamakshisundaram: A very good question. If you take a step back and understand the different data points you need to analyze to predict risk, they fall into two different buckets. The first bucket is around the financial metrics that you typically get from any of the big content providers you have in place. We can understand how the supplier is performing, based on current data, and exactly what they’re doing financially, if they’re a public company.

The second aspect, through the help of Ariba Network or Supplier InfoNet, is the ability to understand the operational and the transactional relationship a supplier has in place to predict how the supplier is going to behave six-to-eight months from now.

For example, you may be a large retailer or a consumer packaged goods (CPG) organization, maybe working with a very large trucking company. This particular trucking company may be doing really well, and they may have great historical financial information, which basically puts them in very good shape.

Financial viability

But if only one-third of the business is from retail and CPG, and the remaining two-thirds comes from some of the more challenged industries, all of a sudden the operational and financial viability of that transportation supply chain may not look good. Though the carrier's historical financials may be in good shape, you can’t really be sure the supplier will have the working capital -- the cash available -- to run the business and maintain the operation in a sustainable manner.
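
One way to make that reasoning concrete is a revenue-weighted exposure calculation. The sketch below uses invented revenue shares and a made-up 0-to-1 industry health scale; it illustrates the arithmetic of the argument, not Ariba's actual model.

```python
# Minimal sketch: weighting a supplier's outlook by its customer-industry
# mix. All numbers and the scoring scale are illustrative assumptions.

# Share of the trucking company's revenue by customer industry, paired
# with an assumed health score (1.0 = healthy, 0.0 = severely challenged).
revenue_mix = {
    "retail_cpg": (0.33, 0.9),   # one-third of revenue, healthy industries
    "challenged": (0.67, 0.3),   # two-thirds of revenue, struggling industries
}

# Exposure-weighted health: sum of (revenue share * industry health).
weighted_health = sum(share * health for share, health in revenue_mix.values())
print(f"Exposure-weighted health: {weighted_health:.2f}")  # ~0.50

# Strong historical financials can still mask a poor forward outlook once
# customer exposure is taken into account.
if weighted_health < 0.6:
    print("Flag supplier for working-capital and viability review")
```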

How do Ariba, Ariba Network, and InfoNet help? By taking all the information across this multitude of variables, not only the financial metrics but also the operational metrics, and modeling the supply chain.

You don’t limit yourself to the first tier or second tier, but go all the way down the multi-tier supply chain, and you also look at the interactions that some of these suppliers may have with their customers. That will help you understand whether this particular supplier will be able to supply the right product and get it to your docks at the right time.

Without having this inter-correlation of network data well laid out in a multi-tier supply chain, it would have been almost impossible to predict what is going to happen in this particular supply-chain example.

Gardner: What sort of trends or competitive pressures are making companies seek better ways to identify, acquire, or manage information and data to have a better handle on their supply chains?

Kamakshisundaram: The pressures are multifaceted. To start with, many organizations are faced with globalization pressure. Finding the right suppliers who can actually supply both the product and service at the right time is a second challenge. And the third challenge many companies grapple with right now is the ability to balance savings and cost reductions with risk mitigation.

These two opposing variables have to be kept in check in order to drive sustainable savings to the bottom line. These challenges, coupled with supply-chain disruptions, are making it difficult not only to find suppliers, but also to get the right product at the right time.

Gardner: When we talk about risk in a supply-chain environment, what are we really talking about? Risk can be a number of things in a number of different directions.

Many variables

Kamakshisundaram: Risk, at a very high level, is composed of many different variables. Many of us understand that risk is a function of, number one, the supply. If you don’t have the right supplier, if you don’t have the right product at the right time, you have risk.

And, there is the complexity involved in finding the suppliers to address needs in different parts of the world. You may have a supplier in North America, but if you really want to expand your market share in the Far East, especially in China, you need to have the right supply chain to do that.

Companies traditionally have looked at historical information to predict risk. This is no longer enough, because supply chains are becoming more and more complex. Supply chains are affected by a number of global variables, including the ability to have suppliers in different parts of the world, and other challenges that will make risk more difficult to predict in the long run.

Gardner: Where do you see the pressures to change or improve how supply-chain issues are dealt with, and how do you also define the risks that are something to avoid in supply-chain management?

Charpie: When we think about risk we’re really thinking about it from two dimensions. One of them is environmental risk. That is, what are all the factors outside of the company that are impacting performance?

That can be as varied as wars, on one hand, right down to natural disasters and other political types of events that can disrupt companies in terms of managing their supply base and keeping the kind of cost structure they are looking for.

The other kind are more inherent operational types of risks. These are the things like on-time performance risk, as Sundar was referring to. What do we have in terms of quality? What do we have in terms of product and deliverables, and do they meet the needs of the customer?

As we look at these two kinds of risks, we’ve seen increasing amounts of disruption, because we’re in a time when supply chains are getting much longer, leaner, and more complex to manage. As a result, you’re seeing that over 40 percent of disruptions right now are caused by interruptions further down the supply chain -- tier two, tier three, all the way to tier N.

So now we need a different way of managing suppliers than we had in the past. Just working with them and talking to them about how they do things and what they do isn’t enough. We need to understand how they’re actually managing their suppliers, and so on, down the line.

Predicting risk

Gardner: So, David, it sounds as if something algorithmic, or a scorecard, is there to generate this analysis. Is that the right way to look at this, or is it just making the data available for other people to reach conclusions that then allow them to reduce their risk?

Charpie: There absolutely is an algorithmic component to this. In fact, what we do in Supplier InfoNet and with the Ariba Network is to run machine-learning models. These are models that behave more like the human brain than like some of the statistical math we learned when we were back in high school and college.

What it looks for is patterns of behavior, and as Sundar said, we’re looking at how a company has performed in the past with all of their customers. How is that changing? What other variables are changing at the same time or what kinds of events are going on that may be influencing them?

We talked about environmental risk a bit ago. We capture information from about 160,000 newswire sources on a daily basis and are able to automatically extract what each article is about, who it’s about, and what the impact on a supply chain could be.

By integrating that with the transactional history of the Ariba Network and by integrating that with all the linkage on who does business with whom, we can start to see a pattern of behavior. That pattern of behavior can then help us understand what’s likely to happen moving forward.
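
As a rough illustration of that extraction step, here is a toy triage function. Real systems use trained NLP models rather than keyword lists, and every supplier name and keyword below is a hypothetical stand-in, not SAP's actual pipeline.

```python
# Toy newswire triage -- not SAP's actual extraction pipeline.

SUPPLY_CHAIN_EVENTS = {          # keyword -> supply-chain impact label
    "bankruptcy": "financial distress",
    "recall": "quality",
    "earthquake": "natural disaster",
    "strike": "labor disruption",
}

KNOWN_SUPPLIERS = {"Acme Castings", "Globex Logistics"}  # hypothetical names

def triage_article(headline):
    """Extract who the article is about and what the impact could be."""
    who = next((s for s in KNOWN_SUPPLIERS if s in headline), None)
    impact = next((label for keyword, label in SUPPLY_CHAIN_EVENTS.items()
                   if keyword in headline.lower()), None)
    if who and impact:
        return {"supplier": who, "impact": impact, "headline": headline}
    return None  # not relevant to the monitored supply base

print(triage_article("Acme Castings announces recall of brake components"))
# -> {'supplier': 'Acme Castings', 'impact': 'quality', 'headline': '...'}
```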

To make it a little more concrete, let’s take Sundar’s example of a company having financial trouble. If I take a company, for example, under $100 million, what we have found is that if we see a company that begins to deliver late, within three months of that begins to have quality problems, and within two months or less begins to have cash-flow problems and can’t pay their bills on time, we may be seeing the beginning of a company that’s about to have a financial disaster.

Interestingly, what we find is that the pattern that really means something comes after those three events. If they begin paying their bills on time all of a sudden, that’s the worst indicator there possibly could be. It’s very counterintuitive, but the models tell us that when that happens, we’re on the verge of someone who will go bankrupt within two to three months of that time frame.
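
Rendered as a hand-coded rule, the sequence Charpie describes might look like the sketch below. The event encoding, dates, and data structures are invented for illustration; InfoNet's actual models are learned from data, not hard-coded.

```python
# Rule-style rendering of the distress pattern for a supplier under $100M:
# late delivery -> quality problems (within 3 months) -> cash-flow trouble
# (within 2 months) -> sudden on-time payment as the final warning sign.

from datetime import date

# Hypothetical event log: (event_type, date), oldest first.
events = [
    ("late_delivery",      date(2014, 1, 10)),
    ("quality_problem",    date(2014, 3, 20)),  # within 3 months
    ("late_payment",       date(2014, 5, 5)),   # cash-flow trouble, within 2 months
    ("sudden_on_time_pay", date(2014, 6, 1)),   # the counterintuitive final signal
]

def months_between(d1, d2):
    return (d2.year - d1.year) * 12 + (d2.month - d1.month)

def distress_warning(events):
    """True if the full late -> quality -> cash-flow -> on-time sequence appears."""
    seq = dict(events)
    required = ("late_delivery", "quality_problem",
                "late_payment", "sudden_on_time_pay")
    if not all(e in seq for e in required):
        return False
    return (months_between(seq["late_delivery"], seq["quality_problem"]) <= 3
            and months_between(seq["quality_problem"], seq["late_payment"]) <= 2)

if distress_warning(events):
    print("Warning: pattern suggests possible bankruptcy within 2-3 months")
```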

Delivery model

Gardner: Now I can see why this wasn’t something readily available until fairly recently. We needed to have a cloud infrastructure delivery model. We needed to have the data available and accessible. And then we needed to have a big data capability to drive real-time analysis across multiple tiers on a global scale.

So here we are at Ariba LIVE 2014. What are we going to hear, and when can people start to actually use this? Where are we on the timeline for delivering this really compelling value?

Kamakshisundaram: Both Supplier InfoNet and Ariba Network are available today for customers, so they can continue to leverage these solutions. With the help of SAP’s innovation team, we’re planning to bring in additional solutions that not only help customers look at real-time risk modeling, but also show more predictive analytical capability.

Charpie: In terms of the business benefits in what we are offering, the features that really bring to life this notion of integrating the Ariba Network with InfoNet are, first and foremost, an ability to push alerts to our customers on a proactive basis to let them know when something is happening within their supply chain that could be impacting them in any way whatsoever.

That is, they can set their own levels. They can set what interests them. They can identify the suppliers they want to track to as many as the entire supply base. We will track those on an automated basis and give them updates to keep them abreast of what’s happening.

Second, we’re also going to give them the ability to monitor the entire supply base, from a heat-map perspective, to strategically see the hot pockets -- by industry, by spend, or by geography -- that they need to pay particular attention to.

Third, we’re also going to bring to them this automated capability to look at these 160,000 newswire sources and tell them the newswires that they need to pay attention to, so they can determine what kind of actions they can take from those, based on the activity that they see.

We’re also going to bring those predictions to them. We have the ability now to look at and predict performance and disruption and deliver those also as alerts, as well as deeper analytics. By leveraging the power of HANA, we’re able to bring real-time analysis to the customer.
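
A hypothetical alert subscription might be configured along the following lines. The field names and thresholds are invented for illustration; they are not the actual Supplier InfoNet API.

```python
# Invented sketch of a proactive alert subscription and its evaluation.

alert_subscription = {
    "scope": "entire_supply_base",       # or an explicit supplier list
    "thresholds": {
        "on_time_delivery_pct": 90,      # alert if the rolling rate falls below
        "financial_risk_score": 0.7,     # alert if predicted risk rises above
    },
    "heat_map_dimensions": ["industry", "spend", "geography"],
    "newswire_alerts": True,
    "delivery": {"channel": "email", "frequency": "realtime"},
}

def should_alert(metrics, subscription):
    """Compare a supplier's current metrics against the subscribed thresholds."""
    t = subscription["thresholds"]
    return (metrics["on_time_delivery_pct"] < t["on_time_delivery_pct"]
            or metrics["financial_risk_score"] > t["financial_risk_score"])

current = {"on_time_delivery_pct": 84, "financial_risk_score": 0.5}
print(should_alert(current, alert_subscription))  # True -- late deliveries trip it
```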

They have those tools today, and so it creates a totally personalized experience, where they can look at big data, look at it the way they want to, look at it the way that they believe risk should be measured and monitored, and be able to use that information right then and there for themselves.

Sharing environment

Last, they also have the ability to do this in an environment where they can share with each other, with their suppliers, and with others in the network, if they choose. What I mean by that is the model that we have used within Supplier InfoNet is very much like you see in Facebook.

When you have a supplier and you would like to see more of their supply base, you request access, much like friending someone on Facebook. They will open up the portion -- some, little, or none -- of their supply base that they would like you to have access to. Once you have that, you can get alerts on them, you can manage them, and you can get input on them as well.
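
In code, that request-and-grant visibility model could be sketched as below. The class, buyer, and supplier names are all invented, and the in-memory dictionary stands in for whatever the product actually uses.

```python
# Minimal sketch of Facebook-style supply-base sharing: a buyer requests
# visibility, and the supplier grants all, some, or none of its sub-tier.

class SupplyBaseSharing:
    def __init__(self):
        # (buyer, supplier) -> set of sub-suppliers the buyer may see
        self.grants = {}

    def request_access(self, buyer, supplier):
        print(f"{buyer} requests visibility into {supplier}'s supply base")

    def grant(self, buyer, supplier, visible_subsuppliers):
        # The supplier decides the scope: everything, a subset, or nothing.
        self.grants[(buyer, supplier)] = set(visible_subsuppliers)

    def visible_to(self, buyer, supplier):
        return self.grants.get((buyer, supplier), set())

net = SupplyBaseSharing()
net.request_access("BuyerCo", "TruckingCo")
net.grant("BuyerCo", "TruckingCo", {"Tire Plant A", "Fuel Broker B"})
print(net.visible_to("BuyerCo", "TruckingCo"))  # alerts can now cover these
```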

So there’s an ability for the community to work together, and that’s really the key piece that we see in the future, and it’s going to continue to expand and grow as we take InfoNet and the Network out to the market.

Kamakshisundaram: If you take a step back, you can see why companies haven’t been able to do something like this in the past. There were analytical models available. There were tools and technologies available. But in order to build a model that will help customers identify multi-tier supply chain risk, you need a community of suppliers who are able to participate and provide information that will continue to help you understand where the risk points are.

As David mentioned, where is your heat map? What does it say? It also points to how you not only collect the information, but what kind of processes you have to put in place to mitigate those risks.

In certain industries, we see certain trends, whether it’s automotive or aerospace. A lot of the suppliers that are critical in these industries are cross-industry. Focusing on a certain industry and having the suppliers only in that particular industry will give you only a portion of that information to understand and predict risk.

And this is where a community where participants actively share information and insights for the greater good helps. And this is exactly what we’re trying to do with the Ariba Network and Supplier InfoNet.

Gardner: I’m trying to help our listeners solidify their thinking of how this would work in a practical sense in the real world. David, do you have any use-case scenarios that come to mind that would demonstrate the impact and the importance and reinforce this notion that you can’t do this without the community involvement?

Case study

Charpie: Let’s start with a case study. I’m going to talk about one of our customers that is a relatively small electronics distributor.

They signed on to use InfoNet and the Ariba Network to better understand what was happening down the multiple tiers of their supply chain. They wanted to make sure that they could deliver to their ultimate customers, a set of aerospace and defense contractors. They knew what they needed, when they needed it, and the quality that was required.

To manage that and find out what was going to happen, they loaded up Supplier InfoNet, began to get the alerts, and began to react to them. They found very quickly that they were able to find savings in three different areas that ultimately they could pass on to their customers through lower prices.

One of them was that they were able to reduce the amount of time their folks would spend just firefighting the risks that would come up when they didn’t have information ahead of time. That saved about 20 percent on an annual basis.

Second, they also found that they were able to reduce the amount of inventory obsolescence by almost 15 percent on an annual basis as a result of that.

And third, they found that they were avoiding shortages that historically cut their revenues by about 5 percent, because previously they couldn’t deliver on product that was often demanded on short notice. With InfoNet, all of these benefits were realized and became practical to achieve.

Their own perspective on this, relative to the second part of your question, was that they couldn’t do this on their own, and that no one else could either. As they like to say, "I certainly wouldn’t share my supply base with my competitor." The idea is that we can take those in aggregate, anonymize them, and make sure the information is cleansed in such a way that no one can know who the contributing parties are.

The fact that they ultimately have control of what people see and what they don’t allows them to have an environment where they feel like they can trust it and act on it, and ultimately, they can. As a result, they’re able to take advantage of that in a way that no one could on their own.

We’ve even had a few of the aerospace and defense folks who tried to build this on their own. All of them ultimately came back because they said they couldn’t get the benchmark data and the aggregate community data. They needed an independent third party doing it, and SAP and Ariba are a trusted source for doing that.

Gardner: For those folks here at Ariba LIVE who are familiar with one or other of these services and programs or maybe not using either one, how do they start? They’re saying, “This is a very compelling value in the supply chain, taking advantage of these big-data capabilities, recognizing that third party role that we can’t do on our own.” How do they get going on this?

Two paths

Charpie: There are two paths you can take. One of them is that you can certainly call us. We would be more than happy to sit down and go through this and look at what your opportunities are by examining your supply base with you.

Second is to look at this a bit on your own and be reflective. We often take customers through a process where we sit down and look at the supply risk and disruptions they’ve had in the past and, based on that, categorize them into the types of disruptions they’ve seen. What is based on quality? What is based on sub-tier issues? What is based on environmental things like natural disasters? Then, we group them.

Then we say, let’s reflect: if you had known these problems were going to happen -- as Sundar said, three, six, or eight months ahead -- could you have done something that would have impacted the business, saved money, driven more revenue, whatever the outcome may be?

If the answer to those questions is yes, then we’ll take those particular cases where the impact is understood and where an early warning system would have made a difference financially. We’ll analyze what that really looks like and what the data tells us. And if we can find a pattern within that data, then we know going in that you're going to be successful with the Network and with InfoNet before you ever start.

Gardner: This also strikes me as something that doesn’t fall necessarily into a traditional bucket, as to who would go after these services and gain value from them. That is to say, this goes beyond procurement and just operations, and it enters well into governance, risk, and compliance (GRC).

Who should be looking at this in a large organization or how many different types of groups or constituencies in a large organization should be thinking about this unique service?

Kamakshisundaram: We have found that it depends on the vertical and the industry. Typically, it all starts with procurement, trying to make sure they can assure supply and that they can get the right suppliers.

Very quickly, procurement also continues to work with supply chain. So you have procurement, supply chain, and depending on how the organization is set up, you also have finance involved, because you need all these three areas to come together.

This is one of the projects where you need complete collaboration and trust within the internal procurement organization, supply chain/operations organization, and finance organization.

As David mentioned, when we talk to aerospace, as well as automotive or even heavy industrial or machinery companies, some of these organizations already are working together. If you really think about how product development is done, procurement participates at the start of the black-box process, where they actually are part and parcel of the process. You also have finance involved.

Assurance of supply

To really understand and manage risk in your supply chain, especially for components that go into your end-level product, which makes up significant revenue for your organization, Supplier Management continues all the way through, even after you actually have assurance of supply.

The second type of customers we have worked with are in the business services/financial/insurance companies, where the whole notion around compliance and risk falls under a chief risk officer or under the risk management umbrella within the financial organization.

Again, here in this particular case, it's not just the finance organization that's responsible for predicting, monitoring, and managing risk. In fact, finance organizations work collaboratively with the procurement organization to understand who their key suppliers are, collect all the information required to accurately model and predict risk, so that they can execute and mitigate risk.

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: Ariba, an SAP company.

You may also be interested in:

Tuesday, June 17, 2014

Latest ServiceNow update makes turning any awkward process into a managed service available to more workers

IT service management (ITSM) has long been a huge benefit to complex and exception-rich IT operations by helping to standardize, automate and apply a common system-of-record approach to tasks, incidents, assets, and workflows.

ServiceNow has been growing rapidly as a software-as-a-service (SaaS) provider of ITSM, but clearly sees a larger opportunity — making service creation, use, and management a benefit to nearly all workers for any number of business processes.

It’s one of those rare instances where IT has been more mature and methodological in solving complexity than many other traditional business functions. Indeed, siloed and disjointed "productivity applications" that require lots of manual effort have been a driver to bring service orientation to the average business process.

Just as in IT operations and performance monitoring, traditional applications in any business setting can quickly reach a point of inflexibility where they break down, and therefore don’t scale. Despite human productivity efforts -- via shuffling emails, spreadsheets, phone calls, sticky notes, and text messages -- processes bog down. Exceptions are boondoggles. Tasks go wanting. Customers can sense it all through lackluster overall performance.

So ServiceNow this week launched its Eureka version of its online service management suite with new features aimed at letting non-technical folks build custom applications and process flows, just like the technical folks in IT have been doing for years. Think of it as loosely coupled interactions that span many apps and processes for the rest of us.

Available globally

Now available globally, the fifth major release of ServiceNow includes more than 100 changes and new modules, and has a new user interface (UI) that allows more visualization and drag-and-drop authoring and is more "mobile friendly," says Dave Wright, Chief Strategy Officer at ServiceNow, based in Santa Clara, CA.

“Enterprise users just can’t process work fast enough,” says Wright. “So our Service Creator uses a catalog and a new UI to allow workers to design services without IT.”

IT does, however, get the opportunity to vet and manage these services, and can decide what gets into the service catalog or not. Those of us who have been banging the SOA drum for years well predicted this level of user-driven services and self-service business process management.

I, for one, am very keen to see how well enterprises pick up on this, especially as the cloud-deployed nature of ServiceNow can allow for extended enterprise process enablement and even a federated approach to service catalogs. Not only are internal processes hard to scale, but those workflows and processes that include multiple companies and providers are also a huge sticking point.

Systems integrators and consultancies may not like it as much, but the time has come for an organic means of automating tasks and complexity that most power users can leverage and innovate on.

With this new release, it’s clear that ServiceNow has a dual strategy. One, it’s expanding its offerings to core IT operators, along the traditional lines of application lifecycle management, IT operations management, IT service management, project management, and change management. And there are many features in the new release to target this core IT user.

Additionally, ServiceNow has its sights on a potentially much larger market, the Enterprise Service Management (ESM) space. This is where today’s release is more wholly focused, with things like visualization, task boards, a more social way of working, and use of HTML5 for the services interface, giving the cloud-delivered features native support and adaptability across devices. There is also a full iOS client on the App Store.

Indeed, this shift to ESM is driving the ServiceNow roadmap. I attended last month’s Knowledge 14 conference in Las Vegas, and came away thinking that this level of services management could be a sticky on-ramp to a cloud relationship for enterprises. Other cloud on-ramps include public cloud infrastructure as a service (IaaS), hybrid cloud platforms and management, business SaaS apps like Salesforce and Workday, and data lifecycle and analytics services. [Disclosure: ServiceNow paid my travel expenses to the user conference.]

Common data model

But as a cloud service, ServiceNow, if it attracts a large clientele outside of IT, could prove sticky too. That’s because all the mappings and interactions for more business processes would be within its suite — with the common data model shared by the entire ServiceNow application portfolio.

The underlying portfolio of third-party business apps and data are still important, of course, but the ways that enterprises operate at the process level — the very rules of work across apps, data and organizations — could be a productivity enhancement offer too good to refuse if they solve some major complexity problems.

Strategically, the cloud provider that owns the processes solution also owns the relationship with the manager corps at companies. And if the same cloud owns the relationship with IT processes — via the same common data model — well, then, that’s where a deep, abiding, and lasting cloud business could long dwell. Oh, and it’s all paid for on an as-needed, per-user, OpEx basis.

Specifically, the new ServiceNow capabilities include:
  • Service Creator -- a new feature that allows non-technical business users to create service-oriented applications faster than ever before
  • Form Designer -- a new feature that enables rapid creation and modification of forms with visual drag-and-drop controls
  • Facilities Service Automation -- a new application that routes requests to the appropriate facilities specialists and displays incidents on floor plan visualizations
  • Visual Task Boards -- a new feature to organize services and other tasks using kanban-inspired boards that foster collaboration and increase productivity
  • Demand Management -- a new application that consolidates strategic requests from the business to IT and automates the steps in the investment decision process
  • CIO Roadmap -- a new timeline visualization feature that displays prioritized investment decisions across business functions
  • Event Management -- a new application that collects and transforms infrastructure events from third-party monitoring tools into meaningful alerts that trigger service workflows
  • Configuration Automation -- an application that controls and governs infrastructure configuration changes, enhanced to work in environments managed with Chef data center automation.
For more, see a blog post on today's news from Wright.

You may also be interested in:

Wednesday, June 11, 2014

Big data should eclipse cloud as priority for enterprises

Big data is big -- but just how big may surprise you. 

According to a new QuinStreet survey, 77 percent of respondents consider big data analytics a priority. Another 72 percent cite enhancing the speed and accuracy of business decisions as a top benefit of big-data analytics. And 71 percent of mid-sized and large firms are planning for -- if they are not already active in -- big-data initiatives.

And based on what I'm hearing this week at the HP Discover conference, much of the zeitgeist has shifted from an emphasis on cloud benefits to the more meaningful and long-term implications of big data improvements. 

I recently discussed in a BriefingsDirect podcast how big data’s big payoff has arrived as customer experience insights drive new business advantages. But there are also some interesting case studies worth pointing out as we look at the big momentum behind big data. Despite the hype, big data may deliver its productivity goods and benefits bigger and earlier than cloud for enterprises and small and medium-sized businesses (SMBs) alike.

Auto racing powerhouse NASCAR, for example, has engineered a way to learn more about its many fans -- and their likes and dislikes -- using big data analysis. The result is that they can rapidly adjust services and responses to keep connected best to those fans across all media and social networks.

BriefingsDirect had an opportunity to learn first-hand how NASCAR engages with its audiences using big data and the latest analysis platforms when we interviewed Steve Worling, Senior Director of IT at NASCAR, based in Daytona Beach, Fla., at the recent HP Discover 2013 Conference in Barcelona. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Listen to what Worling said: “As we launch a new car this year, our Gen-6 Car, what is the engagement or sentiment from our fans? We’ve been able to do some deep analytic research on what that is and get valuable information to be able to hand to GM, who launched this car with us this year, and say, ‘These are the results of the news’ instantly -- a lot of big data.”

Nimble Storage

Meanwhile, Nimble Storage is leveraging big data and the cloud to produce data performance optimization on the fly. It turns out that high-performing, cost-effective big-data processing helps to make the best use of dynamic storage resources by taking in all the relevant storage activities data, analyzing it and then making the best real-time choices for dynamic hybrid storage optimization.

BriefingsDirect recently sat down with optimized hybrid storage provider Nimble Storage to hear their story on the use of HP Vertica as their data analysis platform of choice. Yes, it’s the same Nimble that this year had a highly successful IPO. The expert is Larry Lancaster, Chief Data Scientist at Nimble Storage Inc. in San Jose, California. The discussion is, again, moderated by me.

Listen to how Nimble gets the analysis at the speed, scale, and cost it requires. Lancaster explains how he uses HP Vertica to drive results:

“When you start thinking about collecting as many different data points as we like to collect, you have to recognize that you’re going to end up with a couple choices on a row store. Either you’re going to have very narrow tables and a lot of them, or else you’re going to be wasting a lot of I/O overhead, retrieving entire rows where you just need a couple fields.” [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

That was what piqued his interest at first. But as he began to use it more and more at Glassbeam, where he was previously CTO, he realized that the performance benefits you could gain by using HP Vertica properly were another order of magnitude beyond what you would expect just with the column-store efficiency.

“That’s because of certain features that Vertica allows, such as something called pre-join projections. We can drill into that sort of stuff more if you like, but, at a high-level, it lets you maintain the normalized logical integrity of your schema, while having under the hood, an optimized denormalized query performance physically on disk.”
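
For readers who want to see the idea in code, here is a hedged sketch using the open-source vertica-python driver with Vertica-style SQL. The schema, table names, and connection details are all invented, and the projection DDL is illustrative of Vertica's documented CREATE PROJECTION feature rather than Nimble's actual schema.

```python
# Sketch: a normalized logical schema plus a pre-join projection that
# materializes the join on disk, so analytic queries skip the join at
# query time. All names are invented for illustration.

import vertica_python  # pip install vertica-python

conn_info = {"host": "vertica.example.com", "port": 5433,
             "user": "dbadmin", "password": "...", "database": "telemetry"}

statements = [
    # Fact table of storage I/O samples and a dimension table of arrays.
    """CREATE TABLE io_samples (
           array_id INT, ts TIMESTAMP, read_iops INT, write_iops INT)""",
    """CREATE TABLE arrays (
           array_id INT PRIMARY KEY, model VARCHAR(64), customer VARCHAR(64))""",
    # The pre-join projection keeps the logical schema normalized while
    # storing an optimized, denormalized layout physically on disk.
    """CREATE PROJECTION io_by_model AS
           SELECT s.ts, s.read_iops, s.write_iops, a.model, a.customer
           FROM io_samples s JOIN arrays a ON s.array_id = a.array_id
           ORDER BY a.model, s.ts""",
]

conn = vertica_python.connect(**conn_info)
try:
    cur = conn.cursor()
    for stmt in statements:
        cur.execute(stmt)
    conn.commit()
finally:
    conn.close()
```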

Healthcare industry

The healthcare industry is also turning to big-data analytics platforms to gain insight and awareness for improved patient outcomes. Indeed, analytics platforms and new healthcare-specific solutions together are offering far greater insight and intelligence into how healthcare providers are managing patient care, cost, and outcomes.

To learn how, BriefingsDirect sat down with Patrick Kelly, Senior Practice Manager at the Avnet Services Healthcare Practice, and Paul Muller, Chief Software Evangelist at HP, to examine the impact that big-data technologies and solutions are having on the highly dynamic healthcare industry. I moderated the discussion.

Muller said dealing with large volumes of sensitive personally identifiable information (PII) is not just a governance issue; it’s a question of morals and of making sure that we are doing the right thing by the people who are entrusting us not just with their physical care, but with how they present in society.

“Medical information can be sensitive when available not just to criminals but even to prospective employers, members of the family, and others,” he said. “The other thing we need to be mindful of is we’ve got to not just collect the big data, but we’ve got to secure it. We’ve got to be really mindful of who’s accessing what, when they are accessing it, are they appropriately accessing it, and have they done something like taking a copy or moved it elsewhere that could indicate that they have malicious intent. It’s also critical we think about big data in the context of health from a 360-degree perspective.”

So with all this in mind, how big will big data get? It’s not clear. The challenges are as big as big data itself, but the QuinStreet survey suggests respondents are pressing forward, with 45 percent expecting data volumes to grow in the next two years alone.

You may also be interested in:

Thursday, June 5, 2014

Perfecto Mobile goes to cloud-based testing so developers can build the best apps faster

We have surely entered a golden age of mobile apps development, not just for app store wares, but across all kinds of enterprise and productivity applications. The notion of mobile-first has altered the development landscape so much that the very notion of software development writ large will never be the same.

With the shift comes a need for speed, but not so much so that security and performance requirements suffer. How to maintain the balance between rapid delivery and quality assurance falls to the testing teams. Into the fray comes cloud-based testing efficiencies.

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy.

Our next innovation case study interview therefore highlights how Perfecto Mobile is using a variety of cloud-based testing tools to help its developers rapidly create the best mobile apps for both enterprises and commercial deployment.

BriefingsDirect had an opportunity to learn first-hand how rapid cloud testing begets better mobile development when we interviewed Yoram Mizrachi, CTO and Founder of Perfecto Mobile, based in Woburn, Mass. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Tell us about the state of the mobile development market. How fast is it growing, and who are building mobile apps these days?

Mizrachi: Everyone is building mobile applications today. We have not gone into a single company that doesn’t have anything on mobile. It’s like what happened on the web 15 years ago. Mobile is moving fast. Even today, we have customers with more transactions on mobile than any other channel that they’re offering, including web or making calls. Mobile is here.

Gardner: So that’s a big challenge for companies that perhaps are used to a development cycle that took a lot longer, where they had more time to do testing and quality assurance. Mobile development seems to be speeding up. Is there a time crunch that they’re concerned about?

Mizrachi: Absolutely. In mobile there are two factors that come into play. The first one is that everyone today is expecting things to happen much faster. So everyone is talking about agile and DevOps, and crunching the time for a version from a few months, maybe even a year, into a few weeks.

Bigger problem

With mobile, there’s a bigger problem. The market itself is moving faster. Looking at the mobile market, you see hundreds of mobile models being launched every year. Apple is releasing many models. Android vendors are releasing a tremendous number of new models every year. The challenge for enterprises is how to release faster on one side, but still maintain decent quality across the wide range of devices available.

Gardner: So that’s a big challenge in terms of coming up with a test environment for each of those iterations.

Of course, we’re also seeing mobile first, where companies build for mobile before anything else, and it's changing the whole nature of development. It's a very dynamic and busy time for developers and enterprises. Tell us about Perfecto Mobile and how you’re helping them to manage these difficult times.

Mizrachi: Yes, it is mobile first. Many of our existing customers, as I mentioned, have more transactions on mobile than anything else. Today, they’re building an interface for their customers starting from mobile. This means there are tremendous issues that they need to handle, starting with automation. If automation was nice to have on traditional web -- with mobile it’s no longer a question. Building a robust and continuous automated testing environment is a must in mobile.

Gardner: Now, we’re talking about not only different targets for mobile, but we’re talking about different types of applications. There’s Android, Apple, native, HTML 5, Web, hybrid. How wide a landscape of types of apps are you supporting with your testing capabilities?

Mizrachi: When you look at the market today, mobile is moving very fast, and you’re right, there are lots of solutions available in the market. One of the things that Perfecto Mobile is bringing to the market is the fact that we support them all. We support native, hybrid applications, Web services, iOS, Android, and any other platform. All of this is provided as a cloud service. We enable our customers to worry a little bit less about the environment and a little bit more about the actual testing.

Gardner: Tell us how you’re doing this? I know that you are a software-as-a-service (SaaS) provider and that the testing that you provide is through a cloud-based model. A lot of organizations have traditionally done their own testing or used some tools that may have been SaaS-provided. How are companies viewing going purely to a SaaS model for their testing with their mobile apps?

Mizrachi: The nice thing about what we do with cloud is that it solves a huge logistical problem for the enterprises. We’re providing a managed solution for those physical devices. So it’s many things.

One of them is just physically managing those devices and enabling access to them from anywhere in the world. For example, if I’m a U.S.-based company, I can have my workforce and my testing located anywhere in the world without the need to worry about the logistics of managing devices, offshoring, or anything like that. Our customers are utilizing this cloud model so they don’t have to change their existing processes when moving into mobile.
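
As a rough sketch of what such remote-device access looks like from a test script, here is a hedged example using the open-source Appium Python client. The cloud endpoint URL and the credential capabilities are placeholders, since each vendor cloud defines its own; this is not Perfecto's actual API.

```python
# Driving a cloud-hosted physical device with the Appium Python client.
# Endpoint and auth capabilities below are invented placeholders.

from appium import webdriver  # pip install Appium-Python-Client

capabilities = {
    "platformName": "Android",
    "deviceName": "Samsung Galaxy S4",     # a real device racked in the cloud
    "app": "https://builds.example.com/myapp.apk",
    "user": "tester@example.com",          # vendor-specific auth fields vary
    "password": "...",
}

# Point the client at the cloud's remote endpoint instead of a local
# Appium server; the test logic itself does not change.
driver = webdriver.Remote("https://mobilecloud.example.com/wd/hub",
                          capabilities)
try:
    driver.find_element_by_id("com.example.myapp:id/login").click()
finally:
    driver.quit()
```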

ALM integration

Gardner: And in order to be able to use cloud amid a larger application lifecycle, you must also offer application lifecycle management (ALM) or at least integrate with ALM, source code management, and other aspects of development. How does that work?

Mizrachi: Our approach was to not reinvent the wheel. Looking at the large enterprises, we figured out that the existing ALM solutions in the market, led by HP, are already there, and the right approach is to integrate with or extend them into mobile, not to replace them.

What we have is an extension to the ALM products in such a way that you, as a customer, don’t have to change your existing processes and practices in order to move to mobile. You’ll have a lot of issues when moving into mobile, and we don’t believe that changing the processes should be one of them.

Gardner: Of course, with HP having some 65 percent of the market for ALM and a major market presence for a lot of other testing and business service management capabilities, it was a no-brainer for you to integrate with HP. But you’ve gone beyond that. You’re using HP yourself for your own testing. Tell us how you came to do that.

Mizrachi: HP has the largest market share in ALM, and looking at our customers among Fortune 500 companies, it was really obvious that we needed to utilize, integrate, or extend HP ALM tools in order to provide the market with the best solution.

Internally, of course, we’re using the HP suites, including Unified Functional Testing (UFT), Performance Center, and LoadRunner, in order to manage our own development.

One of the things I’m quite proud of is that we, as a company, have proof of success in the market, with hundreds of customers already using us and tens of thousands of hours of automation being utilized every month.

We have customers with thousands of automated scripts running continuously in order to validate their applications. It's a competitive environment, obviously, but with Perfecto Mobile, the value that we’re bringing to the table is that we have a proven solution today used by the largest Fortune 500 companies in finance, retail, travel, and utilities, and they have been using us not for months, but for years.

Gardner: Where do you see this going next? Is there a platform-as-a-service (PaaS) opportunity where we’re going to do not just testing but development and deployment ultimately? If you are in the cloud for more and more of what you do in development and deployment, it makes sense to try to solidify and unify across a cloud from start to finish.

Mizrachi: I’m obviously a little bit biased, but, yes, my belief is that the software development life cycle (SDLC) is moving to the cloud. If you want to go ahead, you don’t really have a choice. One of the major failures in SDLC is setup of the environment. If you don’t have the right environment, just in time, you will fail to deliver regardless of the tool that you have.

Just in time

Moving to the cloud means that you have everything that you need just in time. It's available for you. Someone has to make sure this solution is available with a given service-level agreement (SLA) and all of that. This is what Perfecto Mobile is doing of course, but I believe the entire market is going into that. Software development is moving to the cloud. This is quite obvious.

For our customers -- top insurance companies, top financial banks, healthcare organizations, all of them -- security is extremely important, and of course it is for us. Our hosting solution is a SOC 2-certified solution. We have dedicated personnel for security, and we make sure that our customers enjoy the highest level of privacy and, of course, security -- physical security, network security, and all the tools and processes in place.

Gardner: And, as we know, HP has been doing testing in the cloud successfully for more than 10 years and moving aggressively in that space early on.

Mizrachi: We’re enjoying the fact that our research and development center and HP's research and development center are close by, so the development of the two products is closely aligned. We have weekly or biweekly meetings between the product and R&D teams in order to make sure that those two tools are moving together.

SDLC, as you mentioned, is a lifecycle. It's not only about one-time testing; it's ongoing. And post-deployment, when moving into production, you need to see that what you’re offering to the market on the real device is actually what you expect. That’s extremely important.

As the mobile market matures, organizations are relying more on mobile to assure and increase their revenue. So making sure the mobile offering is up and running and meets the right key performance indicators (KPIs) on an ongoing basis is extremely important. The integration that we’ve made with BSM utilizes an existing, extremely mature product on the monitoring side and extends it with cloud-based real mobile devices for application monitoring.

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: HP.

You may also be interested in:

Tuesday, June 3, 2014

SAP’s Ariba teams with eBay to improve rogue B2B procurement for buyers, sellers and enterprises

It remains one of the last bastions of enterprise spend over which companies have little or no control. Yet companies have been loath to tamper with how their employees and managers buy ad-hoc goods — known as indirect spend, shadow purchasing, or "spot buying."

Now, SAP’s Ariba cloud is bringing the best of flexible, innovative spot-buying practices into a more controlled and sanctioned process by teaming with eBay and its B2B marketplace for an integrated, yet dynamic, approach to those indirect purchases not covered by contracts and formal invoicing.

Such scattered, and often unmonitored, spot buying amounts to 15 to 20 percent of a typical enterprise’s total purchasing. And so it provides a huge opportunity for improvement, the type that cloud, big-data analytics, and a marketplace of marketplaces approach can best solve. [Disclosure: Ariba is a sponsor of BriefingsDirect podcasts.]

The Ariba Network Spot Buy service was announced today in Orlando at Sapphire by SAP CEO Bill McDermott. “The most intractable CEO issue of our time is complexity,” McDermott said in a keynote address Tuesday. “It’s getting worse and worse. We see a dream for a simpler SAP, and a simpler customer experience.”

Long before the Web, the Thomas Register or vertical industry buyers’ catalogs were the mainstays for how many business goods were discovered and procured. These were often done with no contracts, no bids, and no invoices. A material or product was needed, and so it was bought and paid for — fast.

The Web — and especially Internet search — only increased the ability for those workers in need to find and buy whatever they had to in order to get their jobs done. Because these buys were deemed “emergency” purchases, or amounted to smaller total amounts, the rogue process essentially flew under the corporate radar.

Under the new Ariba Network Spot Buy service, major and public online marketplaces are brought into the procurement process inside of SAP and Ariba applications and services. eBay is the first, but Ariba expects to extend the process efficiency to other online B2B markets, said Joe Fox, Vice President of Business Network Strategy at SAP.

Pilot program

The new indirect procurement approach, which will operate as a pilot program between August and December this year, before general availability, will allow those buying through the integrated Ariba Network Spot Buy services to use eBay’s PayPal service to transfer and manage funding, said Fox.

Consistently updated content about B2B goods (services support will come later) will be available to users inside of their existing procurement applications, including Ariba, SAP and later third-party enterprise resource planning (ERP) applications, explained Fox. The users can search inside their Ariba apps, including soon-to-be-delivered mobile versions, alongside of their traditional purchasing app services, he said.

“It’s consumerizing business,” said Fox, adding that users gain convenience and access inside of procurement apps and processes while enjoying ad hoc flexibility and one-click, no-invoice payments from converged markets, catalogs and sanctioned search. Enterprises, on the other hand, gain a new ability to monitor spot buying, analyze it, and provide guidance and curation of what goods should be available to buy — and on what general terms. “It’s the best of Web-based buying but with some corporate control,” said Fox.

Eventually, as multiple marketplaces become seamlessly available to procurement apps users, deeper analysis — via SAP’s HANA big-data infrastructure on which all Ariba apps and cloud services are being deployed — will allow business to determine if redundancy or waste indicates that the sourcing should be done differently.

The net net, said Fox, is that more unmonitored spending can fall under spot buying, even as some spot buying can move to more formal procurement where bids, negotiation and payment efficiencies such as dynamic discounting can play a role. What’s more, analytics can be applied to a whole new area of spend, amounting to higher productivity over many billions of dollars of B2B spending per year worldwide.

“We are not going to build any marketplaces,” said Fox. “We are facilitating access — with controls and filters — to all the public and third-party content from various markets. It’s basically unlimited appropriate content for buyers and seekers.”

These marketplaces will also allow those selling goods and products to gain improved access into the B2B environments (such as the SAP installed base globally) as a new way to go seller-direct with information and content about their wares. New business models and relationships are no doubt bound to develop around that.

Fox said no other business app or procurement services providers have anything like the new offering, one that targets rogue and unmonitored buying by workers using open and aggregated mainstream markets for B2B goods.

You may also be interested in:

Wednesday, May 21, 2014

Big data’s big payoff arrives as customer experience insights drive new business advantages

The power of big data technology is being successfully applied to understanding such complex unknowns as consumer sentiment and even intent. And that understanding then vastly improves how retailers and myriad service providers manage their users' experiences -- increasingly in real time.

Fortunately, today's consumers are quite willing to share their intents and sentiments via social media, if you can gather and process the information. Hence the rapidly developing field of social customer relationship management, or Social CRM.

Listen to the podcast. Read a full transcript or download a copy. Sponsor: HP.

Part of the equation for making Social CRM effective comes from properly capturing the natural-language knowledge delivered through the many social channels available to users. But even that is just a first step toward gaining ever-deeper analysis, and rapidly and securely making those insights available where they pay off best.
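
To make that capture step concrete, here is a deliberately simplified, lexicon-based sentiment scorer. Production engines perform full natural-language parsing, so treat this bag-of-words sketch, with its tiny invented word lists, as a stand-in for the idea rather than the technology.

```python
# Toy lexicon-based sentiment scoring of social posts.

POSITIVE = {"love", "great", "fast", "helpful", "awesome"}
NEGATIVE = {"hate", "broken", "slow", "rude", "refund"}

def sentiment(post):
    """Score a social post from -1.0 (negative) to +1.0 (positive)."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    hits = [1 if w in POSITIVE else -1
            for w in words if w in POSITIVE or w in NEGATIVE]
    return sum(hits) / len(hits) if hits else 0.0

print(sentiment("Love the new app, support was fast and helpful!"))  # 1.0
print(sentiment("Checkout is broken and the agent was rude."))       # -1.0
```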

And so the next BriefingsDirect thought leader discussion brings together customer analytics services provider Attensity, with its natural-language processing (NLP) technology, and HP Vertica, with big data analytics capabilities, to explain how to effectively listen to the social web and rapidly gain valuable insights and actionable intelligence.

Our guests are Howard Lau, Chairman and CEO of Attensity, and Chris Selland, Vice President of Marketing and Business Development at HP Vertica. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Sellers and marketers worldwide have always wanted to know what their customers are anticipating or what they want next. I guess we could go back hundreds of years with these questions.

But as someone said recently, it seems that the ability to know what customers want and how to respond to them rapidly has changed more in the last 5 years than in the past 500. Do you agree with that? And why is that the case? What’s so new and different?

Lau: What has happened and emerged in the past 10 years or so, especially in the world of Twitter -- Twitter has been around since 2006 -- is that consumers are finding a voice to express their opinions about companies, products and brands. They can express their voice immediately through social channels.

That’s one of the new emerging things where, not only are they finding their voice online, but they’re also realizing that they’re able to amplify that voice by connecting with their friends and their followers.
JetBlue Case Study

NY-based JetBlue Airways created a new airline market category based on value, service, and style

Goals:
  • Provide a unique flying experience that truly satisfies each individual customer and improves service quality
  • Better understand and meet customer needs, as amenities such as its individual TVs and spacious leather seats are no longer enough to set it apart from the competition
Solution:
  • Attensity Analyze, powered by HP HAVEn with the HP Vertica Analytics Engine
Results:
  • Instituted a Customer Bill of Rights
  • More clearly understands what customers need, and is able to make improvements and be proactive
  • Tracks complaints by a plane’s tail number, allowing the customer service organization to see which planes have the most and fewest issues
See more at:
http://www.attensity.com/2014/04/02/jetblue-airways/

Gardner: Why is that making such a big difference in how we know what customers want? I understand that the social part is new and innovative, but how is this changing marketing?

Lau: The way things happened before is that companies controlled the conversation as they engaged with consumers. Whether you fill in an online form or call an 800 number for customer service or a purchase, you're greeted initially with an automated prompt, and the whole prompt system navigates your engagement.

Lau
What makes Social CRM so unique and empowering for consumers is that, for the first time, it’s transferring the control and ownership of the conversation to the consumer, the customer. What that means is that the customer now controls what they want to talk about, where they want to talk about it, and what channel they want to use to communicate their needs or issues.

They don’t want to do it in a predefined form, where you check off boxes or answer specific prompts. They want to express their interests more organically and use the company’s branded channels on Facebook and Twitter and non-branded channels on industry forums and communities. That’s what’s key about Social CRM and that’s what’s so unique about this new generation of products to analyze the social web.

Gardner: Let's go to Chris Selland. Chris, HP Vertica is dealing with a lot of organizations that are trying to do new and innovative things with marketing. Do you also agree that marketing, and what we can do, has shifted dramatically in the last five years? Has it really changed the game?

Selland: There’s been a very dramatic shift in the last five years in marketing. That’s driven, not exclusively, but certainly heavily, by what’s been going on in the social-media world -- Twitter and other channels, Facebook, LinkedIn, and so forth.

Selland
It has had two impacts. First, it has amplified the voice of the customer. I always remember that old commercial: "I'll tell two friends, and she'll tell two friends," and so on. Customer voice has always had an impact, but these days that impact is dramatically amplified by social media.

The other thing that's really changed the game entirely is that organizations seeking to understand their customers can no longer rely exclusively on internal data -- and by internal data I mean things like customer relationship management (CRM) systems.

In the past, when I, as a marketer -- or any customer-facing executive running support or something else -- wanted to understand my customer relationships, I could, for as long as we have had computers and applications, look at something like my CRM system to see when my customer called the call center or when they bought something. Or I could view my transaction logs with them.

But what I haven't been able to look at and analyze is what they're doing when they're not interacting with me -- when they're interacting with the world, when my customer is tweeting or on Facebook. There is a lot of noise to sift through, but if you do it right, there is a very rich vein of data there to help enhance relationships.

As I said, companies can choose to ignore that, but generally that would be strategically disadvantageous. Most companies recognize that there's a tremendous amount of data out there that doesn't belong to me and isn't necessarily all about me, but I can certainly use it to understand my present and future customers better.

Ask a typical consumer when they're more truthful: when they're interacting directly with the company, or when they're tweeting, making recommendations to friends, or liking something on Facebook? A lot of the real information is outside the walls of traditional IT. That's what has really changed things dramatically as well.

Quite a challenge

Gardner: Of course, that also presents quite a challenge, because the information comes in the form of sentiment or intent expressed through social interactions. It's more difficult to capture and assess.

Let's go back to Howard. What are some of the challenges when it comes to capturing that information through NLP and extending it into this analysis capability?

Lau: When people go online in a social realm, they don't think about their intent. They just express themselves. So the challenge is letting people communicate the way they choose to communicate, and then trying to figure out and infer their intent and their sentiment.

Trying to determine that is what we do using NLP in an effort to understand what the chatter is about and what the sentiment is about that chatter.
When you get down to what people are talking about, you have to understand from which domain they’re talking.

Gardner: In doing so, have you run into limits in terms of what you can do with the technology? It seems like this is a fairly vast amount of information.

Lau: It's vast, and it's also very domain-specific. There's different terminology depending on the domain. For example, in the hospitality and travel industry, when you use the word “service,” it means the service you're getting from the hotel or from the airline.

But when you use the word “service” in the telecommunications space, it means something totally different: your service plan, how many minutes you have, whether you have text, and so forth.

So when you get down to what people are talking about, you have to understand from which domain they’re talking, infer their meaning and understand their sentiments.
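
To make that concrete, here is a minimal sketch -- purely illustrative, in Python -- of how a domain lexicon can resolve an ambiguous term like “service” before scoring sentiment. The lexicons, cue words, and score_mention function are invented for this example; they are not Attensity's actual NLP.

```python
# Hypothetical sketch of domain-dependent interpretation -- not
# Attensity's implementation. The same word ("service") maps to a
# different concept depending on the industry domain of the text.

DOMAIN_LEXICONS = {
    "hospitality": {"service": "staff_service_quality"},
    "telecom":     {"service": "service_plan"},
}

POSITIVE_CUES = {"great", "friendly", "fast", "generous"}
NEGATIVE_CUES = {"terrible", "awful", "slow", "rude", "dropped"}

def score_mention(text: str, domain: str) -> dict:
    """Resolve ambiguous terms against a domain lexicon, then attach
    a crude keyword-based sentiment score."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    lexicon = DOMAIN_LEXICONS.get(domain, {})
    concepts = [lexicon[w] for w in words if w in lexicon]
    score = (sum(w in POSITIVE_CUES for w in words)
             - sum(w in NEGATIVE_CUES for w in words))
    return {"domain": domain, "concepts": concepts, "sentiment": score}

# "service" resolves to staff quality for a hotel guest ...
print(score_mention("great service at the hotel", "hospitality"))
# ... but to the rate plan for a mobile subscriber.
print(score_mention("terrible service, dropped calls", "telecom"))
```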

Gardner: So there are difficult language issues, and there are also technology issues around scale and depth, but let's stick with NLP. What does Attensity do to solve that problem?

Ingesting data

Lau: The first thing is that we ingest a tremendous amount of data. Most of it is social, but we also ingest companies' internal emails, customer notes, employee notes, and online surveys.

Then, we analyze it and annotate it. Part of the annotation is trying to explain the meaning of a sentence or a sentence fragment. The way we do annotations is driven by our proprietary NLP technology.

One of the first things we do is figure out who this person is and what they're talking about. We’re trying to find the right industry domain that they are talking about and then distill that into the actual meaning -- the intent, as well as the sentiment.
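
The output of a pipeline like that is typically a structured annotation attached to each document. A minimal sketch of one plausible record shape follows; the field names and values are illustrative assumptions, not Attensity's actual schema.

```python
# Hypothetical annotation record -- the fields are illustrative
# assumptions, not Attensity's schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Annotation:
    source: str                # e.g. "twitter", "email", "survey"
    author: str                # who is speaking
    domain: str                # resolved industry domain
    intent: str                # e.g. "complaint", "purchase_interest"
    sentiment: float           # -1.0 (negative) to +1.0 (positive)
    entities: List[str] = field(default_factory=list)

note = Annotation(
    source="twitter",
    author="@frequent_flyer",
    domain="travel",
    intent="complaint",
    sentiment=-0.8,
    entities=["flight delay"],
)
print(note)
```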

Gardner: Howard, tell me a little bit more about how your relationship with HP has evolved. You have been working with Vertica for a while. Tell us a little bit about why Vertica was of interest to you as you’re trying to accomplish your goals with NLP.

Lau: With the annotations, we generate a lot of intelligence, a lot of metadata. Prior to our relationship with HP, we basically serviced the online surveys and certain internal notes and customer notes for corporations. As we embraced social, we had an explosion of content and annotations.
We’re trying to find the right industry domain that they are talking about and then distill that into the actual meaning -- the intent, as well as the sentiment.

For us, our relationship with HP was indispensable. HAVEn is not just a product; it's a platform. And it's a platform that scales well, not just in ingesting large amounts of data, but also in creating stores -- a large store for us, as well as customer stores for each of our clients.

There’s absolutely no way we could have scaled our solution to address the continuing growth of the social realm without this relationship and partnership we have with HP and on the HAVEn platform.

Gardner: Just to be clear, HAVEn, of course, includes quite a few things. Maybe you could just help us understand which elements of HAVEn you’re using and which ones are the most beneficial to you?

Lau: First, it's Vertica. Vertica sits behind the analytical tools for every customer we have. Then, for managing the whole ingestion and storage of the documents we get from the social space, we use Hadoop and HBase on Hadoop. That's how we've embraced the HAVEn platform.
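
A rough sketch of that division of labor might look like the following, assuming the open-source happybase (HBase) and vertica_python client libraries; the hosts, table names, and schema here are hypothetical.

```python
# Hypothetical sketch of the raw-store / analytics-store split Lau
# describes: full social documents land in HBase, while the compact
# extracted annotations go to Vertica for fast SQL analytics.
import happybase            # HBase client
import vertica_python       # Vertica client

def store_document(doc_id: str, raw_json: str, annotation: dict):
    # 1) Keep the full raw document in HBase, keyed by document ID.
    hbase = happybase.Connection("hbase-host")   # hypothetical host
    hbase.table("social_docs").put(
        doc_id.encode(), {b"raw:json": raw_json.encode()})
    hbase.close()

    # 2) Write only the annotation metadata into Vertica, where
    #    columnar storage makes aggregate queries fast.
    conn = vertica_python.connect(host="vertica-host", port=5433,
                                  user="dbadmin", password="...",
                                  database="analytics")
    cur = conn.cursor()
    cur.execute(
        "INSERT INTO annotations (doc_id, domain, intent, sentiment) "
        "VALUES (%s, %s, %s, %s)",
        (doc_id, annotation["domain"], annotation["intent"],
         annotation["sentiment"]))
    conn.commit()
    conn.close()
```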

Gardner: Chris Selland, what is it about the Attensity use case that you think demonstrates some unique characteristics of Vertica and perhaps even more elements of HAVEn?

Complementary nature

Selland: First of all, it demonstrates the complementary nature of Vertica and Hadoop. The Vertica platform has been built to do very high-performance analytics on very large volumes of data. That’s really what we’re all about.

Obviously, Hadoop is also built to scale for very large volumes of data, so we have bidirectional integration -- in fact, deep integration and increasing convergence -- with Hadoop. Attensity is doing a great job of showing that.

Then, as we were talking about, there are the massive volumes of data they're managing. When you're in the realm of the social world, again, it's not just the volume. I always say that big data is not just big; it's the velocity and the variety -- the ability to ingest very fast, and to interpret, analyze, and produce results very fast. That's really what the Vertica engine is all about, and it does that with very high performance.

It's a very important market segment for us, and it's great to have partners. Vertica is a platform, and we rely on our partners to provide the solutions that run on it. Social CRM and social analytics are exactly the kinds of solutions we're looking to highlight. We love it when great partners like Attensity bring those to market, are successful, and make our joint customers successful.
The Vertica platform has been built to do very high-performance analytics on very large volumes of data. That’s really what we’re all about.

Gardner: Of course, Howard, your customers are probably not so much concerned about what’s going on underneath the hood, whether it's Vertica, HAVEn, or Hadoop. They’re interested in getting results. I’d like to go back to that Social CRM aspect of our discussion and help people understand why that can be so beneficial, which then of course makes it clear why the technology that supports it is so important.

Can you give us any examples, Howard, of where people have used Social CRM, where they have leveraged NLP and Attensity and what that’s done for them in real business terms?

Lau: Absolutely. Some of the industries we serve include telecommunications, hospitality, travel, consumer electronics, financial services, and eCommerce. We provide the services and the tools, and our customers implement them for very different use cases based on their priorities.

One of the leading prepaid mobile phone providers uses Attensity's deep semantic approach to analyze sentiment about its service and alert its brand-management teams to its unique voice of the customer (VoC).

Attensity effectively measures the overall experience for each brand, taking into account its different products and services, to accurately determine the wants and needs of the customer. The whole return-on-investment (ROI) story is how they can use what's going on in the social realm to manage their installed base and minimize customer churn.

Focusing on that, they were able to achieve a 25 percent reduction in customer churn. Now, in the mobile telco space, that directly translates into a 25 percent increase in revenue. Keep in mind that this company is somewhere between half a billion and one billion dollars in revenue. That's a very sizable return on investment.
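
Taking Lau's churn-to-revenue equivalence at face value, the implied dollar impact is easy to bound with the revenue range he states; this is back-of-the-envelope arithmetic, not a figure from the interview.

```python
# Rough bound on the claimed impact: a 25 percent churn reduction,
# treated (per Lau's equivalence) as a 25 percent revenue gain,
# across the stated $500M-$1B revenue range.
for revenue in (500e6, 1000e6):
    print(f"${revenue / 1e6:,.0f}M base -> ${0.25 * revenue / 1e6:,.0f}M impact")
# $500M base -> $125M impact
# $1,000M base -> $250M impact
```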

We also have a case where an insurance company in the financial services space focuses on fraud detection. They use our technology not only in the social space, but also in reviewing claims. They were able to reduce workers' compensation costs dramatically -- to the tune of over $25 million annually -- just by using our technology and our NLP to analyze the data and figure out which claims to pursue to manage their fraud costs.

Looking toward the future

Gardner: Where do we go next with this, Howard? We have the capability to deal with large data and a variety of data. We certainly have a great treasure trove of information available from social media and the social web. Combining that with the traditional datasets in CRM, where do you go next? Are you looking for even more datasets, and what do you have your eye on?

Lau: Getting more datasets is always helpful. The more you get, the more complete your analysis is. But the opportunity right now goes beyond just analyzing big data in the aggregate. We're finding that, within that big data, there are tremendous numbers of individual voices. So the goal is to figure out where these individual voices are and how to build relationships with the ones that are important to you.

I'm going to go back to a book Malcolm Gladwell wrote a while back called The Tipping Point. He talks about mavens and the influence of mavens. In the social chatter, there are all these people who have outsized influence on other people. The next step in applying our NLP technology in the social realm is uncovering those mavens, so that companies can build relationships with those influencers. That's one of the next things we're really excited about.

Gardner: Tell us also where you're going in terms of services for business. Obviously we've talked about marketing, but are there other aspects -- maybe product development? How deeply does this extend into how it can influence a business, not just in selling and marketing, but perhaps even in knowing where the business should be going, at a strategy level?
Having an analytical store where you can do what-if scenarios after the fact is incredibly useful for them.

Lau: When people hear about social, the first thing they do is listen, but there is a whole model for how people adopt business solutions in the social realm. We have a model we call LARA, and it stands for Listen, Analyze, Relate, and Act.

The first thing that a lot of companies do is become aware that they need to pay attention to what’s being discussed socially. So they put out these listening posts and they use us to ingest all this information and analyze it for them. The benefit of that is sentiment analysis on companies, on brands, and products. They want this type of sentiment in real time, and we’re able to deliver it in real time.

The next thing companies want to do is analyze the data they've accumulated, and that's for a variety of different use cases. I mentioned fraud detection and customer churn. They also want to surface emerging trends. Having an analytical store where you can do what-if scenarios after the fact is incredibly useful for them.

Once they have the store of customer data and they’ve analyzed and segmented their customers, they want to define how they want to relate to the customers, in aggregate or in smaller segments.

The last and final thing they want to do as part of the whole consumer experience is figure out how to engage with the ones that are important to them.

As an example, if someone tweets that they like this phone, that's great sentiment. But if somebody else tweets that they don't like the service they're getting from their mobile phone provider, and that provider is an Attensity customer, we actually take that tweet, route it into their customer-care organization to the proper person, and respond to that person in the social realm.

This ability to close that loop -- from a person just tweeting generally to their friends about an experience, to the company actually hearing and responding to them -- is incredibly powerful for organizations.
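
A skeletal sketch of that closed loop, with the four LARA stages as simple stubs, might look like this; the keyword analysis, segment labels, and routing targets are invented stand-ins for the real NLP and workflow.

```python
# Hypothetical LARA loop (Listen, Analyze, Relate, Act) applied to
# the tweet-routing example above. All helpers are toy stubs.

def listen(stream):
    """Listen: ingest raw posts from a social stream."""
    yield from stream

def analyze(post):
    """Analyze: real NLP goes here; stubbed as keyword sentiment."""
    negative = any(w in post["text"].lower()
                   for w in ("hate", "terrible", "worst"))
    return {**post, "sentiment": -1.0 if negative else 0.5}

def relate(post):
    """Relate: segment the author (customers vs. the general public)."""
    return {**post, "segment": "customer" if post["is_customer"] else "public"}

def act(post):
    """Act: route unhappy customers to the care team for a reply."""
    if post["segment"] == "customer" and post["sentiment"] < 0:
        print(f"route to customer-care queue: {post['text']!r}")
    else:
        print(f"log for trend analysis: {post['text']!r}")

stream = [
    {"text": "I love this phone", "is_customer": False},
    {"text": "Worst service ever, switching carriers", "is_customer": True},
]
for post in listen(stream):
    act(relate(analyze(post)))
```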

Following the path

Gardner: For companies that see the value here readily, what steps should they take to be in a position to follow that LARA path? Do they need to gather this data themselves? Should they try to ramp up how social-media interactions focus on their products or services? Are there any steps companies should take to better leverage something like Attensity, built on something like Vertica, to get these really powerful insights? Howard?

Lau: That's part of the value that we bring. All the customer needs to do is recognize that social is important to them. We're not just talking about corporations in the B2C space, but also in B2B. Once they have that recognition, we handle the rest for them.

Part of our products-and-services offering is that we ingest all this data for them, whether from the social sphere or from the company's emails or customer-service notes. We ingest all that information, and it's all defined by one common trait: it's unstructured data. We apply our NLP technology to make sense of that big stream of data, and then we create the analytical store for them.

All companies need to do is recognize the importance of wanting to hear their customers, listen to the customers, and ultimately, engage with them socially. They just have to have that motivation, and we will work with them as a partner to realize that solution for them.
Part of our products-and-services offering is that we ingest all this data for them, whether from the social sphere or from the company's emails or customer-service notes.

Gardner: Chris Selland, I'm thinking that organizations that are sophisticated about this will go to a company like Attensity and get some great value, but eventually they're going to want that holistic view of analysis. That means that not only would they leverage the services and insights Attensity can provide, but they're going to want to share, correlate, and integrate that with what they have going on internally and across many other systems.

Is there something about HAVEn that we should bring out for them in terms of open standards and integration capabilities that allows, over time, for more and more of these different data activities to relate to one another, so that we do get a whole greater than the sum of the parts?

Selland: HAVEn certainly provides a very broad platform, of which, as we mentioned, Vertica is obviously a key part -- the V in the middle. Yes is the short answer. The solutions ultimately need to be part of a much broader data architecture and strategy around how to leverage all sorts of different types of data -- not even necessarily customer data.

Just to give you an example and to make that tangible, there was an airline that I was engaged with not too long ago, probably about a year-and-a-half ago at this point. I can’t name them, but it's a well-known airline, and it was one that didn’t have a particularly good reputation for customer service.

They were working on their social-media strategy and trying to figure out how to get the customers who were tweeting unhappily that they hated the airline to say nicer things -- how to analyze and respond more quickly.

What they quickly discovered was that the reason so many of these customers were angry and saying they hated the airline was that their flights weren't on time. What they also realized was that they had an awful lot of data from their maintenance operation, sensor data from the planes, and so on across their fleet.

Predictive maintenance

They saw that by doing a better job of predictive maintenance -- keeping their flights on time and their fleet better maintained -- they would have much more impact on customer satisfaction than by responding to the tweet from the stranded customer. Which makes sense, if you think about it.

I bring that example up because it involves data that has nothing to do with the customer. It might be a sensor on an engine, or it might be performance data of some sort, but it's obviously related to customer satisfaction.
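
As a toy illustration of that predictive-maintenance idea, the sketch below flags an engine-sensor reading that drifts sharply from its recent baseline; the sensor, window size, and threshold are invented for the example.

```python
# Toy predictive-maintenance check: flag a sensor reading that
# deviates more than z_limit standard deviations from the rolling
# baseline of the previous `window` readings.
from statistics import mean, stdev

def flag_for_inspection(readings, window=20, z_limit=3.0):
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma and abs(readings[i] - mu) > z_limit * sigma:
            flagged.append(i)
    return flagged

# Simulated exhaust-gas temperatures: stable cycle, then a spike the
# maintenance team would want to catch before a delay or failure.
temps = [600 + (i % 5) for i in range(40)] + [660]
print(flag_for_inspection(temps))   # -> [40]
```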

So ultimately, yes, there needs to be a data infrastructure and a data strategy that spans the different solutions. That's not to say you don't still need Social CRM solutions, predictive-maintenance solutions, and operational and financial analytics solutions, but ultimately the data infrastructure needs to be unified.

That's really where this is going next. In many leading organizations, that's where it's going already: these solutions absolutely play a key role, but they can't stand alone. There needs to be an infrastructure and a strategy behind them that is very, very holistic.
What he's driving towards is a world of the Internet of Things, where everything is wired to the Internet and broadcasts or communicates messages related to its purpose and focus.

We're talking about the competitive bar moving here, and that’s the direction that the competitive bar is going to continue to move in.

Gardner: Howard, do you have any reaction to what Chris has said in terms of seeing a value of a holistic data architecture, not only from what Attensity can do, but extending it across many aspects of business?

Lau: I totally agree with what Chris just said. What he's driving towards is a world of the Internet of Things, where everything is wired to the Internet and broadcasts or communicates messages related to its purpose and focus.

Where we provide our value is that, before we get to the world of the Internet of Things, there is the Internet of People. People need to express themselves the way they normally do. Where we add value is in understanding and distilling the customer's voice -- the individual person's voice -- and having that complement the future Internet of Things.

I totally agree that having an integrated architecture -- an integrated approach to data management, to big data management -- is crucial going forward.

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: HP.
