Wednesday, June 18, 2014

Big data meets the supply chain — SAP’s Supplier InfoNet and Ariba Network combine to predict supplier risk

The next BriefingsDirect case study interview explores how improved visibility, analytics, and predictive responses are improving supply-chain management. We’ll now learn how SAP’s Supplier InfoNet, coupled with the Ariba Network, allows for new levels of transparency and predictive analytics that reduce risk in supplier relationships.

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: Ariba, an SAP company.

BriefingsDirect had an opportunity to uncover more about how the intelligent supply chain is evolving at the recent 2014 Ariba LIVE Conference in Las Vegas when we spoke to David Charpie, Vice President of Supplier InfoNet at SAP, and Sundar Kamakshisundaram, Senior Director of Solutions Marketing at Ariba, an SAP company. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: We’ve brought two things together here, SAP’s Supplier InfoNet and Ariba Network. What is it about these two that gives us the ability to analyze or predict, and therefore reduce, risk?

Charpie: To be able to predict and understand risk, you have to have two major components together. One of them is actually understanding this multi-tiered supply chain. Who is doing business with whom, all the way down the line, from the customer to the raw material in a manufacturing sense? To do that you need to be able to bring together a very large graph, if you will, of how all these companies are inter-linked.

And that is ultimately what the Ariba Network brings to bear. With over 1.5 million companies that are inter-linked and transacting with each other, we can really see what those supply chains look like.

The second piece of it is to bring together, as Sundar talked about, lots of information of all kinds to be able to understand what’s happening at any point within that map. The kinds of information you need to understand are sometimes as simple as who is the company, what do they make, where are they located, what kind of political, geopolitical issues are they dealing with?

The more complex issues are things around precisely what exact product are they making with what kind of requirements, in terms of performance, and how they’re actually doing that on a customer-by-customer basis. What we find is that suppliers don’t behave the same for everybody.

So InfoNet and the network have come together to bring those two perspectives, all the data about how companies perform and what they are about with this interconnectedness of how companies work with each other. That really brings us to the full breadth of being able to address this issue about risk.

Gardner: Sundar, we have a depth of transactional history. We have data, we have relationships, and now we’re applying that to how supply chains actually behave and operate. How does this translate into actual information? How does the data go from your systems to someone who is trying to manage their business process?

Kamakshisundaram: A very good question. If you take a step back and understand the different data points you need to analyze to predict risk, they fall into two different buckets. The first bucket is around the financial metrics that you typically get from any of the big content providers you have in place. We can understand how the supplier is performing, based on current data, and exactly what they’re doing financially, if they’re a public company.

The second aspect, through the help of Ariba Network or Supplier InfoNet, is the ability to understand the operational and the transactional relationship a supplier has in place to predict how the supplier is going to behave six-to-eight months from now.

For example, you may be a large retailer or a consumer packaged goods (CPG) organization working with a very large trucking company. This particular trucking company may be doing really well, and they may have great historical financial information, which basically puts them in very good shape.

Financial viability

But if only one-third of the business is from retail and CPG and the remaining two-thirds comes from some of the challenging industries, all of a sudden the operational and financial viability of the transportation supply chain may not look good. Though the carrier’s historical financials may be in good shape, you can’t really predict whether the supplier will have the working capital, in terms of cash available, to run the business and maintain operations in a sustainable manner.

How do Ariba, Ariba Network, and InfoNet help? By taking all the information across this multitude of variables, not only the financial metrics but also the operational metrics, and modeling the supply chain.

You don’t limit yourself to the first tier or second tier, but go all the way through the multi-tier supply chain, and also to the interactions that some of these suppliers may have with their customers. It will help you understand whether this particular supplier will be able to supply the right product and get it to your docks at the right time.

Without having this inter-correlation of network data well laid out in a multi-tier supply chain, it would have been almost impossible to predict what is going to happen in this particular supply-chain example.

Gardner: What sort of trends or competitive pressures are making companies seek better ways to identify, acquire, or manage information and data to have a better handle on their supply chains?

Kamakshisundaram: The pressures are multifaceted. To start with, many organizations are faced with globalization pressure. Finding the right suppliers who can actually supply both the product and service at the right time is a second challenge. And the third challenge many companies grapple with right now is the ability to balance savings and cost reductions with risk mitigation.

These two opposing variables have to be kept in check in order to drive sustainable savings to the bottom line. These challenges, coupled with supply-chain disruptions, are making it difficult not only to find suppliers, but also to get the right product at the right time.

Gardner: When we talk about risk in a supply-chain environment, what are we really talking about? Risk can be a number of things in a number of different directions.

Many variables

Kamakshisundaram: Risk, at a very high level, is composed of many different variables. Many of us understand that risk is a function of, number one, the supply. If you don’t have the right supplier, if you don’t have the right product at the right time, you have risk.

And, there is the complexity involved in finding the suppliers to address needs in different parts of the world. You may have a supplier in North America, but if you really want to expand your market share in the Far East, especially in China, you need to have the right supply chain to do that.

Companies traditionally have looked at historical information to predict risk. This is no longer enough, because supply chains are becoming more and more complex. Supply chains are affected by a number of globalized variables, including the ability to have suppliers in different parts of the world, and also other challenges that will make risk more difficult to predict in the long run.

Gardner: Where do you see the pressures to change or improve how supply-chain issues are dealt with, and how do you also define the risks that are something to avoid in supply-chain management?

Charpie: When we think about risk we’re really thinking about it from two dimensions. One of them is environmental risk. That is, what are all the factors outside of the company that are impacting performance?

That can be as varied as wars, on one hand, right down to natural disasters and other political types of events that can also cause them to be disrupted in terms of managing their supply base and keeping the kind of cost structure they are looking for.

The other kind comprises more inherent operational risks. These are things like on-time performance risk, as Sundar was referring to. What do we have in terms of quality? What do we have in terms of product and deliverables, and do they meet the needs of the customer?

As we look at these two kinds of risks, we’ve seen increasing amounts of disruption, because we’re in a time when supply chains are getting much longer, leaner, and more complex to manage. As a result, over 40 percent of disruptions right now are caused by interruptions downstream in the supply chain -- tier two, tier three, all the way to tier N.

So now we need a different way of managing suppliers than we had in the past. Just working with them and talking to them about how they do things and what they do isn’t enough. We need to understand how they’re actually managing their suppliers, and so on, down the line.

Predicting risk

Gardner: So, David, it sounds to me as if there’s an algorithm or a scorecard generating this analysis. Is that the right way to look at this, or is it just making the data available for other people to reach conclusions that then allow them to reduce their risk?

Charpie: There absolutely is an algorithmic component to this. In fact, what we do in Supplier InfoNet and with the Ariba Network is to run machine-learning models. These are models that behave more like the human brain than like some of the statistical math we learned when we were back in high school and college.

What it looks for is patterns of behavior, and as Sundar said, we’re looking at how a company has performed in the past with all of their customers. How is that changing? What other variables are changing at the same time or what kinds of events are going on that may be influencing them?

We talked about environmental risk a bit ago. We capture information from about 160,000 newswire sources on a daily basis and are able, on an automated basis, to extract what each article is about, who it’s about, and what the impact on the supply chain could be.

By integrating that with the transactional history of the Ariba Network and by integrating that with all the linkage on who does business with whom, we can start to see a pattern of behavior. That pattern of behavior can then help us understand what’s likely to happen moving forward.
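As a rough illustration of the automated newswire extraction Charpie describes, here is a minimal sketch in Python, assuming spaCy’s pretrained English model; the risk-keyword list and the triage logic are invented for illustration and are not SAP’s actual pipeline.

```python
import spacy

# Toy newswire triage: identify which companies and places an article
# mentions and flag supply-chain-relevant terms. Keywords are illustrative.
RISK_TERMS = {"bankruptcy", "recall", "strike", "earthquake", "shortage"}

nlp = spacy.load("en_core_web_sm")  # pretrained English NLP pipeline

def triage(article_text):
    doc = nlp(article_text)
    companies = {ent.text for ent in doc.ents if ent.label_ == "ORG"}
    places = {ent.text for ent in doc.ents if ent.label_ == "GPE"}
    flags = {tok.lemma_.lower() for tok in doc} & RISK_TERMS
    return {"companies": companies, "locations": places, "risk_flags": flags}

print(triage("Acme Metals filed for bankruptcy in Ohio after a parts shortage."))
```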

To make it a little more concrete, let’s take Sundar’s example of a company having financial trouble. If I take a company, for example, under $100 million, what we have found is that if we see a company that begins to deliver late, within three months of that begins to have quality problems, and within two months or less begins to have cash-flow problems and can’t pay their bills on time, we may be seeing the beginning of a company that’s about to have a financial disaster.

Interestingly, what we find is that the pattern that really means something comes after those three events. If they begin paying their bills on time all of a sudden, that’s the worst indicator there possibly could be. It’s very counterintuitive, but the models tell us that when that happens, we’re on the verge of someone who will go bankrupt within two to three months of that time frame.
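To make that pattern concrete, here is a minimal sketch in Python of the kind of sequential rule such models might learn; the event names, the 90- and 60-day windows, and the final signal simply restate Charpie’s example and are not SAP’s actual model.

```python
from datetime import date, timedelta

def bankruptcy_warning(events):
    """events: list of (date, kind) tuples sorted by date.
    Restates the illustrative pattern: late delivery, quality problems
    within ~3 months, cash-flow problems within ~2 months, then a sudden
    return to on-time payment -- the counterintuitive final red flag."""
    late = quality = cash = None
    for when, kind in events:
        if kind == "late_delivery" and late is None:
            late = when
        elif kind == "quality_problem" and late and when - late <= timedelta(days=90):
            quality = when
        elif kind == "cash_flow_problem" and quality and when - quality <= timedelta(days=60):
            cash = when
        elif kind == "on_time_payments_resume" and cash:
            return True  # predicted bankruptcy within two to three months
    return False

events = [(date(2014, 1, 5), "late_delivery"),
          (date(2014, 3, 1), "quality_problem"),
          (date(2014, 4, 10), "cash_flow_problem"),
          (date(2014, 5, 2), "on_time_payments_resume")]
print(bankruptcy_warning(events))  # True
```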

Delivery model

Gardner: Now I can see why this wasn’t something readily available until fairly recently. We needed to have a cloud infrastructure delivery model. We needed to have the data available and accessible. And then we needed to have a big data capability to drive real-time analysis across multiple tiers on a global scale.

So here we are at Ariba LIVE 2014. What are we going to hear, and when can people start to actually use this? Where are we on the timeline for delivering this really compelling value?

Kamakshisundaram: Both Supplier InfoNet and Ariba Network are available today for customers, so that they can continue to leverage these solutions. With the help of SAP’s innovation team, we’re planning to bring in additional solutions that not only help customers with real-time risk modeling, but also add more predictive analytical capability.

Charpie: In terms of the business benefits of what we are offering, the features that really bring to life this notion of integrating the Ariba Network with InfoNet are, first and foremost, an ability to push alerts to our customers on a proactive basis to let them know when something is happening within their supply chain that could be impacting them in any way whatsoever.

That is, they can set their own levels. They can set what interests them. They can identify the suppliers they want to track -- from a select few up to the entire supply base. We will track those on an automated basis and give them updates to keep them abreast of what’s happening.

Second, we’re also going to give them the ability to monitor the entire supply base, from a heat-map perspective, to strategically see the hot pockets -- by industry, by spend, or by geography -- that they need to pay particular attention to.

Third, we’re also going to bring them this automated capability to look at these 160,000 newswire sources and tell them which newswires they need to pay attention to, so they can determine what kind of actions they can take from those, based on the activity that they see.

We’re also going to bring those predictions to them. We have the ability now to look at and predict performance and disruption and deliver those also as alerts, as well as deeper analytics. By leveraging the power of HANA, we’re able to bring real-time analysis to the customer.

They have those tools today, creating a totally personalized experience, where they can look at big data, look at it the way they want to, look at it the way that they believe risk should be measured and monitored, and be able to use that information right there and then for themselves.
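The heat-map monitoring described above amounts to rolling supplier risk up by industry, spend, and geography; here is a minimal sketch, assuming pandas and made-up supplier scores, of one way such a rollup could look.

```python
import pandas as pd

# Toy supplier data; in practice the risk scores would come from the
# alerting and prediction models described above.
suppliers = pd.DataFrame({
    "industry": ["electronics", "electronics", "logistics", "chemicals"],
    "region":   ["APAC", "NA", "NA", "EMEA"],
    "spend_m":  [4.0, 1.5, 2.2, 0.8],      # annual spend, $M
    "risk":     [0.82, 0.35, 0.61, 0.90],  # model-assigned score, 0-1
})

# Spend-weighted risk by industry and region: the "hot pockets" view.
suppliers["weighted"] = suppliers["spend_m"] * suppliers["risk"]
heat = suppliers.pivot_table(index="industry", columns="region",
                             values="weighted", aggfunc="sum", fill_value=0)
print(heat)
```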

Sharing environment

Last, they also have the ability to do this in an environment where they can share with each other, with their suppliers, and with others in the network, if they choose. What I mean by that is the model that we have used within Supplier InfoNet is very much like you see in Facebook.

When you have a supplier and you would like to see more of their supply base, you request access, much like friending someone on Facebook. They will open up the portion -- some, a little, or none -- of their supply base that they would like you to have access to. Once you have that, you can get alerts on them, you can manage them, and you can get input on them as well.

So there’s an ability for the community to work together, and that’s really the key piece that we see in the future, and it’s going to continue to expand and grow as we take InfoNet and the Network out to the market.
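Charpie’s Facebook analogy boils down to a per-buyer visibility grant on each supplier’s own supply base; here is a minimal sketch, with invented names, of how such grants could be represented.

```python
# Illustrative visibility model: a supplier opens up some, a little, or
# none of its sub-suppliers to a requesting buyer, as in "friending."
class Supplier:
    def __init__(self, name, sub_suppliers=None):
        self.name = name
        self.sub_suppliers = sub_suppliers or []
        self.grants = {}  # buyer name -> set of visible sub-supplier names

    def grant(self, buyer, visible_names):
        self.grants[buyer] = set(visible_names)

    def visible_to(self, buyer):
        allowed = self.grants.get(buyer, set())  # default: share none
        return [s for s in self.sub_suppliers if s.name in allowed]

acme = Supplier("Acme", [Supplier("TinCo"), Supplier("BoltCo")])
acme.grant("BuyerCorp", {"TinCo"})  # share some, but not all
print([s.name for s in acme.visible_to("BuyerCorp")])  # ['TinCo']
```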

Kamakshisundaram: If you take a step back, you can see why companies haven’t been able to do something like this in the past. There were analytical models available. There were tools and technologies available. But in order to build a model that will help customers identify multi-tier supply-chain risk, you need a community of suppliers who are able to participate and provide information that will continue to help you understand where the risk points are.

As David mentioned, where is your heat map? What does it say? And it also points to how you not only collect the information, but what kind of processes you have to put in place to mitigate those risks.

In certain industries, we see certain trends, whether it’s automotive or aerospace. A lot of the suppliers that are critical in these industries are cross-industry. Focusing on a certain industry and having the suppliers only in that particular industry will give you only a portion of that information to understand and predict risk.

And this is where a community where participants actively share information and insights for the greater good helps. And this is exactly what we’re trying to do with the Ariba Network and Supplier InfoNet.

Gardner: I’m trying to help our listeners solidify their thinking of how this would work in a practical sense in the real world. David, do you have any use-case scenarios that come to mind that would demonstrate the impact and the importance and reinforce this notion that you can’t do this without the community involvement?

Case study

Charpie: Let’s start with a case study. I’m going to talk about one of our customers that is a relatively small electronics distributor.

They signed on to use InfoNet and the Ariba Network to better understand what was happening down the multiple tiers of their supply chain. They wanted to make sure that they could deliver to their ultimate customers, a set of aerospace and defense contractors who knew what they needed, when they needed it, and the quality that was required.

To manage that and find out what was going to happen, they loaded up Supplier InfoNet, began to get the alerts, and began to react to them. They found very quickly that they were able to find savings in three different areas that ultimately they could pass on to their customers through lower prices.

One of them was that they were able to reduce the amount of time their folks would spend just firefighting the risks that would come up when they didn’t have information ahead of time. That saved about 20 percent on an annual basis.

Second, they also found that they were able to reduce the amount of inventory obsolescence by almost 15 percent on an annual basis as a result of that.

And third, they found that they were avoiding shortages that historically cut their revenues by about 5 percent, because previously they couldn’t deliver product that was demanded, often on short notice. With InfoNet, all of these benefits were realized and became practical to achieve.

Their own perspective on this, relative to the second part of your question, was that they couldn’t do this on their own and that no one else could. As they like to say, “I certainly wouldn’t share my supply base with my competitor.” The idea is that we can take those in aggregate, anonymize them, and make sure the information is cleansed in such a way that no one can know who the contributing folks are.

The fact that they ultimately have control of what people see and what they don’t allows them to have an environment where they feel like they can trust it and act on it, and ultimately, they can. As a result, they’re able to take advantage of that in a way that no one could on their own.

We’ve even had a few of the aerospace and defense folks who tried to build this on their own. All of them ultimately came back because they said they couldn’t get the benchmark data and the aggregate community data. They needed an independent third party doing it, and SAP and Ariba are a trusted source for doing that.

Gardner: For those folks here at Ariba LIVE who are familiar with one or other of these services and programs or maybe not using either one, how do they start? They’re saying, “This is a very compelling value in the supply chain, taking advantage of these big-data capabilities, recognizing that third party role that we can’t do on our own.” How do they get going on this?

Two paths

Charpie: There are two paths you can take. One of them is that you can certainly call us. We would be more than happy to sit down and go through this and look at what your opportunities are by examining your supply base with you.

Second is to look at this a bit on your own and be reflective. We often take customers through a process where we sit down and look at the supply risks and disruptions they’ve had in the past and, based on that, categorize those into the types of disruptions they’ve seen. What is based on quality? What is based on sub-tier issues? What is based on environmental things like natural disasters? Then, we group them.

Then we say, let’s reflect: if you had known these problems were going to happen -- as Sundar said, three, six, eight months ahead -- could you have done something that would have impacted the business, saved money, driven more revenue, whatever the outcome may be?

If the answer to those questions is yes, then we’ll take those particular cases where the impact is understood and where an early warning system would have made a difference financially. We’ll analyze what that really looks like and what the data tells us. And if we can find a pattern within that data, then we know going in that you're going to be successful with the Network and with InfoNet before you ever start.

Gardner: This also strikes me as something that doesn’t fall necessarily into a traditional bucket, as to who would go after these services and gain value from them. That is to say, this goes beyond procurement and just operations, and it enters well into governance, risk, and compliance (GRC).

Who should be looking at this in a large organization or how many different types of groups or constituencies in a large organization should be thinking about this unique service?

Kamakshisundaram: We have found that it depends on the vertical and the industry. Typically, it all starts with procurement, trying to make sure they can assure supply and get the right suppliers.

Very quickly, procurement also continues to work with supply chain. So you have procurement, supply chain, and depending on how the organization is set up, you also have finance involved, because you need all these three areas to come together.

This is one of the projects where you need complete collaboration and trust within the internal procurement organization, supply chain/operations organization, and finance organization.

As David mentioned, when we talk to aerospace, as well as automotive or even heavy industrial or machinery companies, some of these organizations already are working together. If you really think about how product development is done, procurement participates at the start of the black-box process, where they actually are part and parcel of the process. You also have finance involved.

Assurance of supply

To really understand and manage risk in your supply chain, especially for components that go into your end-level product, which makes up significant revenue for your organization, Supplier Management continues all the way through, even after you actually have assurance of supply.

The second type of customers we have worked with are in the business services/financial/insurance companies, where the whole notion around compliance and risk falls under a chief risk officer or under the risk management umbrella within the financial organization.

Again, here in this particular case, it's not just the finance organization that's responsible for predicting, monitoring, and managing risk. In fact, finance organizations work collaboratively with the procurement organization to understand who their key suppliers are, collect all the information required to accurately model and predict risk, so that they can execute and mitigate risk.

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: Ariba, an SAP company.


Tuesday, June 17, 2014

Latest ServiceNow update makes turning any awkward process into a managed service available to more workers

IT service management (ITSM) has long been a huge benefit to complex and exception-rich IT operations by helping to standardize, automate and apply a common system-of-record approach to tasks, incidents, assets, and workflows.

ServiceNow has been growing rapidly as a software-as-a-service (SaaS) provider of ITSM, but clearly sees a larger opportunity — making service creation, use, and management a benefit to nearly all workers for any number of business processes.

It’s one of those rare instances where IT has been more mature and methodological in solving complexity than many other traditional business functions. Indeed, siloed and disjointed "productivity applications" that require lots of manual effort have been a driver to bring service orientation to the average business process.

Just as in IT operations and performance monitoring, traditional applications in any business setting can soon reach their point of inflexibility and break down, and therefore don’t scale. Despite human productivity efforts -- via shuffling emails, spreadsheets, phone calls, sticky notes, and text messages -- processes bog down. Exceptions are boondoggles. Tasks go wanting. Customers can sense it all through lackluster overall performance.

So ServiceNow this week launched its Eureka version of its online service management suite with new features aimed at letting non-technical folks build custom applications and process flows, just like the technical folks in IT have been doing for years. Think of it as loosely coupled interactions that span many apps and processes for the rest of us.

Available globally

Now available globally, the fifth major release of ServiceNow includes more than 100 changes and new modules, plus a new user interface (UI) that allows more visualizations and drag-and-drop authoring and is more "mobile friendly," says Dave Wright, Chief Strategy Officer at ServiceNow, based in Santa Clara, CA.

“Enterprise users just can’t process work fast enough,” says Wright. “So our Service Creator uses a catalog and a new UI to allow workers to design services without IT.”

IT does, however, get the opportunity to vet and manage these services, and can decide what gets into the service catalog or not. Those of us who have been banging the SOA drum for years well predicted this level of user-driven services and self-service business process management.

I, for one, am very keen to see how well enterprises pick up on this, especially as the cloud-deployed nature of ServiceNow can allow for extended enterprise process enablement and even a federated approach to service catalogs. Not only are internal processes hard to scale, but those work flows and processes that include multiple companies and providers are also a huge sticking point.

Systems integrators and consultancies may not like it as much, but the time has come for an organic means of automating tasks and complexity that most power users can leverage and innovate on.

With this new release, it’s clear that ServiceNow has a dual strategy. One, it’s expanding its offerings to core IT operators, along the traditional capabilities of application lifecycle management, IT operations management, IT service management, project management, and change management. And there are many features in the new release to target this core IT user.

Additionally, ServiceNow has its sights on a potentially much larger market, the Enterprise Service Management (ESM) space. This is where today’s release is more wholly focused, with things like visualization, task boards, a more social way of working, and use of HTML 5 for the services interface, giving the cloud-delivered features native support and adaptability across devices. There is also a full iOS client on the App Store.

Indeed, this shift to ESM is driving the ServiceNow roadmap. I attended last month’s Knowledge 14 conference in Las Vegas, and came away thinking that this level of services management could be a sticky on-ramp to a cloud relationship for enterprises. Other cloud on-ramps include public cloud infrastructure as a service (IaaS), hybrid cloud platforms and management, business SaaS apps like Salesforce and Workday, and data lifecycle and analytics services. [Disclosure: ServiceNow paid my travel expenses to the user conference.]

Common data model

But as a cloud service, ServiceNow, if it attracts a large clientele outside of IT, could prove sticky too. That’s because all the mappings and interactions for more business processes would be within its suite — with the common data model shared by the entire ServiceNow application portfolio.

The underlying portfolio of third-party business apps and data are still important, of course, but the ways that enterprises operate at the process level — the very rules of work across apps, data and organizations — could be a productivity enhancement offer too good to refuse if they solve some major complexity problems.

Strategically, the cloud provider that owns the process solution also owns the relationship with the manager corps at companies. And if the same cloud owns the relationship with IT processes -- via the same common data model -- well, then, that’s where a deep, abiding, and lasting cloud business could long dwell. Oh, and it’s all paid for on an as-needed, per-user, OpEx basis.

Specifically, the new ServiceNow capabilities include:
  • Service Creator -- a new feature that allows non-technical business users to create service-oriented applications faster than ever before
  • Form Designer -- a new feature that enables rapid creation and modification of forms with visual drag-and-drop controls
  • Facilities Service Automation -- a new application that routes requests to the appropriate facilities specialists and displays incidents on floor plan visualizations
  • Visual Task Boards -- a new feature to organize services and other tasks using kanban-inspired boards that foster collaboration and increase productivity
  • Demand Management -- a new application that consolidates strategic requests from the business to IT and automates the steps in the investment decision process
  • CIO Roadmap -- a new timeline visualization feature that displays prioritized investment decisions across business functions
  • Event Management -- a new application that collects and transforms infrastructure events from third-party monitoring tools into meaningful alerts that trigger service workflows
  • Configuration Automation -- an application that controls and governs infrastructure configuration changes, enhanced to work in environments managed with Chef data center automation.
For more, see Wright’s blog post on today’s news.


Wednesday, June 11, 2014

Big data should eclipse cloud as priority for enterprises

Big data is big -- but just how big may surprise you. 

According to a new QuinStreet survey, 77 percent of respondents consider big data analytics a priority. Another 72 percent cite enhancing the speed and accuracy of business decisions as a top benefit of big-data analytics. And 71 percent of mid-sized and large firms are planning for, if they are not already active in, big-data initiatives.

And based on what I'm hearing this week at the HP Discover conference, much of the zeitgeist has shifted from an emphasis on cloud benefits to the more meaningful and long-term implications of big data improvements. 

I recently discussed in a BriefingsDirect podcast how big data’s big payoff has arrived as customer experience insights drive new business advantages. But there are also some interesting case studies worth pointing out as we look at the big momentum behind big data. Despite the hype, big data may deliver the productivity goods, with benefits bigger and earlier than cloud, for enterprises and small and medium-sized businesses (SMBs) alike.

Auto racing powerhouse NASCAR, for example, has engineered a way to learn more about its many fans -- and their likes and dislikes -- using big data analysis. The result is that they can rapidly adjust services and responses to keep connected best to those fans across all media and social networks.

BriefingsDirect had an opportunity to learn first-hand how NASCAR engages with its audiences using big data and the latest analysis platforms when we interviewed Steve Worling, Senior Director of IT at NASCAR, based in Daytona Beach, Fla., at the recent HP Discover 2013 Conference in Barcelona. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Listen to what Worling said: “As we launch a new car this year, our Gen-6 Car, what is the engagement or sentiment from our fans? We’ve been able to do some deep analytic research on what that is and get valuable information to be able to hand GM, who launched this car with us this year and say, ‘This is the results of the news’ instantly -- a lot of big data.”

Nimble Storage

Meanwhile, Nimble Storage is leveraging big data and the cloud to produce data performance optimization on the fly. It turns out that high-performing, cost-effective big-data processing helps to make the best use of dynamic storage resources by taking in all the relevant storage activities data, analyzing it and then making the best real-time choices for dynamic hybrid storage optimization.

BriefingsDirect recently sat down with optimized hybrid storage provider Nimble Storage to hear their story on the use of HP Vertica as their data analysis platform of choice. Yes, it’s the same Nimble that this year had a highly successful IPO. The expert is Larry Lancaster, Chief Data Scientist at Nimble Storage Inc. in San Jose, California. The discussion is, again, moderated by me.

Listen to how Nimble gets the analysis at the speed, scale, and cost it requires. Lancaster explains how he uses HP Vertica to drive results:

“When you start thinking about collecting as many different data points as we like to collect, you have to recognize that you’re going to end up with a couple choices on a row store. Either you’re going to have very narrow tables and a lot of them or else you’re going to be wasting a lot of I/O overhead, retrieving entire rows where you just need a couple fields. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

That was what piqued his interest at first. But as he began to use it more and more at Glassbeam, where he was previously CTO, he realized that the performance benefits you could gain by using HP Vertica properly were another order of magnitude beyond what you would expect just with the column-store efficiency.

“That’s because of certain features that Vertica allows, such as something called pre-join projections. We can drill into that sort of stuff more if you like, but, at a high-level, it lets you maintain the normalized logical integrity of your schema, while having under the hood, an optimized denormalized query performance physically on disk.”
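To put rough numbers on the row-store overhead Lancaster describes, here is a back-of-the-envelope sketch in Python; the column counts assume roughly equal-width columns, and the pre-join projection shown as an embedded SQL string uses hypothetical table names, not Nimble’s schema.

```python
# A query that needs 2 fields out of a 100-column row store still drags
# the whole row off disk (assuming roughly equal column widths).
columns_per_row = 100
needed_columns = 2
print(f"Row-store read amplification: {columns_per_row / needed_columns:.0f}x")  # 50x

# A column store reads only the two needed columns. Vertica's pre-join
# projections go further: the logical schema stays normalized, while a
# denormalized join result is materialized on disk -- illustrative SQL:
PRE_JOIN_PROJECTION = """
CREATE PROJECTION sensor_fact_pj AS
SELECT f.reading, f.ts, d.array_model   -- hypothetical fact/dim tables
FROM sensor_fact f JOIN array_dim d ON f.array_id = d.array_id
ORDER BY f.ts;
"""
```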

Healthcare industry

The healthcare industry is also turning to big-data analytics platforms to gain insight and awareness for improved patient outcomes. Indeed, analytics platforms and new healthcare-specific solutions together are offering far greater insight and intelligence into how healthcare providers are managing patient care, cost, and outcomes.

To learn how, BriefingsDirect sat down with Patrick Kelly, Senior Practice Manager at the Avnet Services Healthcare Practice, and Paul Muller, Chief Software Evangelist at HP, to examine the impact that big-data technologies and solutions are having on the highly dynamic healthcare industry. I moderated the discussion.

Muller said dealing with large volumes of sensitive personally identifiable information (PII) is not just a governance issue, but it’s a question of morals and making sure that we are doing the right thing by the people who are trusting themselves not just with their physical care, but with how they present in society. 

“Medical information can be sensitive when available not just to criminals but even to prospective employers, members of the family, and others,” he said. “The other thing we need to be mindful of is we’ve got to not just collect the big data, but we’ve got to secure it. We’ve got to be really mindful of who’s accessing what, when they’re accessing it, whether they’re appropriately accessing it, and whether they’ve done something like taking a copy or moving it elsewhere that could indicate malicious intent. It’s also critical we think about big data in the context of health from a 360-degree perspective.”

So with all this in mind, how big will big data get? It’s not clear. The challenges are as big as big data itself, but the QuinStreet survey suggests respondents are pressing forward, with 45 percent expecting data volumes to grow 45 percent in the next two years alone.


Thursday, June 5, 2014

Perfecto Mobile goes to cloud-based testing so developers can build the best apps faster

We have surely entered a golden age of mobile apps development, not just for app store wares, but across all kinds of enterprise and productivity applications. The notion of mobile-first has altered the development landscape so much that the very notion of software development writ large will never be the same.

With the shift comes a need for speed, but not so much so that security and performance requirements suffer. How to maintain the balance between rapid delivery and quality assurance falls to the testing teams. Into the fray comes cloud-based testing efficiencies.

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy.

Our next innovation case study interview therefore highlights how Perfecto Mobile is using a variety of cloud-based testing tools to help its developers rapidly create the best mobile apps for both enterprises and commercial deployment.

BriefingsDirect had an opportunity to learn first-hand how rapid cloud testing begets better mobile development when we interviewed Yoram Mizrachi, CTO and Founder of Perfecto Mobile, based in Woburn, Mass. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Tell us about the state of the mobile development market. How fast is it growing, and who are building mobile apps these days?

Mizrachi: Everyone is building mobile applications today. We have not gone into a single company that doesn’t have anything on mobile. It’s like what happened on the web 15 years ago. Mobile is moving fast. Even today, we have customers with more transactions on mobile than any other channel that they’re offering, including web or making calls. Mobile is here.

Gardner: So that’s a big challenge for companies that perhaps are used to a development cycle that took a lot longer, where they had more time to do testing and quality assurance. Mobile development seems to be speeding up. Is there a time crunch that they’re concerned about?

Mizrachi: Absolutely. In mobile, there are two factors that come into play. The first one is that everyone today is expecting things to happen much faster. So everyone is talking about agile and DevOps, and crunching the time for a version from a few months, maybe even a year, into a few weeks.

Bigger problem

With mobile, there’s a bigger problem. The market itself is moving faster. Looking at the mobile market, you see hundreds of mobile models being launched every year. Apple is releasing many models. Android is releasing a tremendous number of new models every year. The challenge for enterprises is how to release faster on one side, but still maintain decent quality across the wide range of devices available.

Gardner: So that’s a big challenge in terms of coming up with a test environment for each of those iterations.

Of course, we’re also seeing mobile first, where they’re going to build mobile, and it's changing the whole nature of development. It's a very dynamic and busy time for developers and enterprises. Tell us about Perfecto Mobile and how you’re helping them to manage these difficult times.

Mizrachi: Yes, it is mobile first. Many of our existing customers, as I mentioned, have more transactions on mobile than anything else. Today, they’re building an interface for their customers starting from mobile. This means there are tremendous issues that they need to handle, starting with automation. If automation was nice to have on traditional web -- with mobile it’s no longer a question. Building a robust and continuous automated testing environment is a must in mobile.

Gardner: Now, we’re talking about not only different targets for mobile, but we’re talking about different types of applications. There’s Android, Apple, native, HTML 5, Web, hybrid. How wide a landscape of types of apps are you supporting with your testing capabilities?

Mizrachi: When you look at the market today, mobile is moving very fast, and you’re right, there are lots of solutions available in the market. One of the things that Perfecto Mobile is bringing to the market is the fact that we support them all. We support native, hybrid applications, Web services, iOS, Android, and any other platform. All of this is provided as a cloud service. We enable our customers to worry a little bit less about the environment and a little bit more about the actual testing.

Gardner: Tell us how you’re doing this? I know that you are a software-as-a-service (SaaS) provider and that the testing that you provide is through a cloud-based model. A lot of organizations have traditionally done their own testing or used some tools that may have been SaaS-provided. How are companies viewing going purely to a SaaS model for their testing with their mobile apps?

Mizrachi: The nice thing about what we do with cloud is that it solves a huge logistical problem for the enterprises. We’re providing a managed solution for those physical devices. So it’s many things.

One of them is just physically managing those devices and enabling access to them from anywhere in the world. For example, if I’m a U.S.-based company, I can have my workforce and my testing located anywhere in the world without the need to worry about the logistics of managing devices, offshoring, or anything like that. Our customers are utilizing this cloud model so as not to change their existing processes when moving into mobile.
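As a sketch of what driving a cloud-hosted device from a test script can look like, here is a minimal Appium-style example in Python; the endpoint URL, device, and capability values are hypothetical and are not Perfecto Mobile’s documented API.

```python
from appium import webdriver  # Appium Python client

# Hypothetical cloud hub and capabilities -- illustrative only.
CLOUD_HUB = "https://example-mobile-cloud.test/wd/hub"
caps = {
    "platformName": "Android",
    "deviceName": "Galaxy S5",  # a real device racked in the cloud
    "app": "https://example.test/builds/myapp.apk",
}

driver = webdriver.Remote(command_executor=CLOUD_HUB,
                          desired_capabilities=caps)
try:
    # Interact with the app as if the device were on your desk.
    driver.find_element_by_accessibility_id("login").click()
finally:
    driver.quit()  # release the shared device back to the pool
```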

ALM integration

Gardner: And in order to be able to use cloud amid a larger application lifecycle, you must also offer application lifecycle management (ALM) or at least integrate with ALM, source code management, and other aspects of development. How does that work?

Mizrachi: Our approach was to not reinvent the wheel. Looking at the large enterprises, we figured out that the existing ALM solutions in the market, led by HP, are there, and the right approach is to integrate with or extend them into mobile, not to replace them.

What we have is an extension to the ALM products in such a way that you, as a customer, don’t have to change your existing processes and practices in order to move to mobile. You’ll have a lot of issues when moving into mobile, and we don’t believe that changing the processes should be one of them.

Gardner: Of course with HP having some 65 percent of the market for ALM and a major market presence for a lot of other testing and business service management capabilities, it was a no-brainer for you to have to integrate to HP. But you’ve gone beyond that. You’re using HP yourself for your own testing. Tell us how you came to do that.

Mizrachi: HP has the largest market in ALM, and looking at our customers in Fortune 500 companies, it was really obvious that we needed to utilize, integrate, or extend HP ALM tools in order to provide a market with the best solution.

Internally, of course, we’re using the HP suites, including Unified Functional Testing (UFT), Performance Center, and LoadRunner, in order to manage our own development.

One of the things I’m quite proud of is that we, as a company, have proof of success in the market, with hundreds of customers already using us and tens of thousands of hours of automation every month being utilized.

We have customers with thousands of automated scripts running continuously in order to validate their applications. It’s a competitive environment, obviously, but with Perfecto Mobile, the value that we’re bringing to the table is that we have a proven solution today used by the largest Fortune 500 companies in finance, retail, travel, and utilities, and they have been using us not for months, but for years.

Gardner: Where do you see this going next? Is there a platform-as-a-service (PaaS) opportunity where we’re going to do not just testing but development and deployment ultimately? If you are in the cloud for more and more of what you do in development and deployment, it makes sense to try to solidify and unify across a cloud from start to finish.

Mizrachi: I’m obviously a little bit biased, but, yes, my belief is that the software development life cycle (SDLC) is moving to the cloud. If you want to go ahead, you don’t really have a choice. One of the major failures in SDLC is setup of the environment. If you don’t have the right environment, just in time, you will fail to deliver regardless of the tool that you have.

Just in time

Moving to the cloud means that you have everything that you need just in time. It's available for you. Someone has to make sure this solution is available with a given service-level agreement (SLA) and all of that. This is what Perfecto Mobile is doing of course, but I believe the entire market is going into that. Software development is moving to the cloud. This is quite obvious.

For our customers, the top insurance and top financial banks customers, healthcare organizations, all of them, security is extremely important, and of course it is for us. Our hosting solution is a SOC 2-certified solution. We have dedicated personnel for security and we make sure that our customers enjoy the highest level of privacy and, of course, security -- physical security, network security, and all the tools and processes in place.

Gardner: And, as we know, HP has been doing testing in the cloud successfully for more than 10 years and moving aggressively in that space early on.

Mizrachi: We’re enjoying the fact that our research and development center and HP’s research and development center are close by. So the development of the two products is very close. We have weekly or biweekly meetings between the product and R&D teams in order to make sure that those two tools are moving together.

SDLC, as you mentioned, is a lifecycle. It’s not only about one-time testing; it’s ongoing. And post-deployment, when moving into production, you need to see that what you’re offering to the market on the real device is actually what you expect. That’s extremely important.

As the mobile market matures, organizations are relying more on mobile to assure and increase their revenue. So making sure the mobile offering is up and running and meets the right key performance indicators (KPIs) on an ongoing basis is extremely important. The integration that we’ve made with BSM utilizes an existing, extremely mature product on the monitoring side and extends it with cloud-based real mobile devices for application monitoring.

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: HP.


Tuesday, June 3, 2014

SAP’s Ariba teams with eBay to improve rogue B2B procurement for buyers, sellers and enterprises

It remains one of the last bastions of enterprise spend over which companies have little or no control. Yet companies have been loath to tamper with how their employees and managers buy ad-hoc goods — known as indirect spend, shadow purchasing, or "spot buying."

Now, SAP’s Ariba cloud is bringing the best of flexible, innovative spot-buying practices into a more controlled and sanctioned process by teaming with eBay and its B2B marketplace for an integrated, yet dynamic, approach to those indirect purchases not covered by contracts and formal invoicing.

Such scattered, and often unmonitored, spot buying amounts to 15 to 20 percent of a typical enterprise’s total purchasing. And so it provides a huge opportunity for improvement, the type that cloud, big-data analytics, and a marketplace of marketplaces approach can best solve. [Disclosure: Ariba is a sponsor of BriefingsDirect podcasts.]

The Ariba Network Spot Buy service was announced today in Orlando at Sapphire by SAP CEO Bill McDermott. “The most intractable CEO issue of our time is complexity,” McDermott said in a keynote address Tuesday. “It’s getting worse and worse. We see a dream for a simpler SAP, and a simpler customer experience.”

Long before the Web, the Thomas Register or vertical industry buyers’ catalogs were the mainstays for how many business goods were discovered and procured. These purchases were often done with no contracts, no bids, and no invoices. A material or product was needed, and so it was bought and paid for — fast.

The Web — and especially Internet search — only increased the ability of those workers in need to find and buy whatever they needed to get their jobs done. Because these buys were deemed “emergency” purchases, or amounted to smaller total amounts, the rogue process essentially flew under the corporate radar.

Under the new Ariba Network Spot Buy service, major and public online marketplaces are brought into the procurement process inside of SAP and Ariba applications and services. eBay is the first, but Ariba expects to extend the process efficiency to other online B2B markets, said Joe Fox, Vice President of Business Network Strategy at SAP.

Pilot program

The new indirect procurement approach, which will operate as a pilot program between August and December this year, before general availability, will allow those buying through the integrated Ariba Network Spot Buy services to use eBay’s PayPal service to transfer and manage funding, said Fox.

Consistently updated content about B2B goods (services support will come later) will be available to users inside of their existing procurement applications, including Ariba, SAP and later third-party enterprise resource planning (ERP) applications, explained Fox. The users can search inside their Ariba apps, including soon-to-be-delivered mobile versions, alongside of their traditional purchasing app services, he said.

“It’s consumerizing business,” said Fox, adding that users gain convenience and access inside of procurement apps and processes while enjoying ad hoc flexibility and one-click, no-invoice payments from converged markets, catalogs and sanctioned search. Enterprises, on the other hand, gain a new ability to monitor spot buying, analyze it, and provide guidance and curation of what goods should be available to buy — and on what general terms. “It’s the best of Web-based buying but with some corporate control,” said Fox.

Eventually, as multiple marketplaces become seamlessly available to procurement apps users, deeper analysis — via SAP’s HANA big-data infrastructure on which all Ariba apps and cloud services are being deployed — will allow business to determine if redundancy or waste indicates that the sourcing should be done differently.

The net net, said Fox, is that more unmonitored spending can fall under spot buying, even as some spot buying can move to more formal procurement where bids, negotiation and payment efficiencies such as dynamic discounting can play a role. What’s more, analytics can be applied to a whole new area of spend, amounting to higher productivity over many billions of dollars of B2B spending per year worldwide.

“We are not going to build any marketplaces," said Fox. “We are facilitating access — with controls and filters — to all the public and third-party content from various markets. It’s basically unlimited appropriate content for buyers and seekers.”

These marketplaces will also allow those selling goods and products to gain improved access into the B2B environments (such as the SAP installed base globally) as a new way to go seller-direct with information and content about their wares. New business models and relationships are no doubt bound to develop around that.

Fox said no other business app or procurement services providers have anything like the new offering, one that targets rogue and unmonitored buying by workers using open and aggregated mainstream markets for B2B goods.
