Monday, January 28, 2013

Ford scours for more big data to bolster quality, improve manufacturing, streamline processes

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: The Open Group.

Ford has exploited the strengths of big data analytics by directing them internally to improve business results. In doing so, its analysts scour metrics from the company's best processes across myriad manufacturing efforts, and detailed outputs from in-use automobiles -- all to improve and help transform the business.

So explains Michael Cavaretta, PhD, Technical Leader of Predictive Analytics for Ford Research and Advanced Engineering in Dearborn, Michigan. Cavaretta is one of a group of experts assembled this week for The Open Group Conference in Newport Beach, California.

Cavaretta has led multiple data-analytic projects at Ford to break down silos inside the company and identify Ford's most fruitful data sets. Ford has successfully aggregated customer feedback and mined its internal data to predict how new features and technologies will best improve its cars.

As a contributor to The Open Group conference and its focus on "Big Data -- The Transformation We Need to Embrace Today," Cavaretta explains how big data is fostering business transformation by allowing deeper insights into more types of data efficiently, and thereby improving processes, quality control, and customer satisfaction.

The interview was moderated by Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: What's different now in being able to get at this data and do this type of analysis from five years ago?

Cavaretta: The biggest difference has to do with the cheap availability of storage and processing power, where a few years ago people were very much concentrated on filtering down the datasets that were being stored for long-term analysis. There has been a big sea change with the idea that we should just store as much as we can and take advantage of that storage to improve business processes.

Gardner: How did we get here? What's the process behind the benefits?

Sea change in attitude

Cavaretta: The process behind the benefits has to do with a sea change in the attitude of organizations, particularly IT within large enterprises. The idea is that you spend less time figuring out what data you want to store and worrying about the cost associated with it, and think more about data as an asset. There is value in being able to store it and to go back and extract different insights from it. This comes from really cheap storage, access to parallel processing machines, and great software.

I like to talk to people about the possibilities that big data provides, and I always tell them that I have yet to encounter a circumstance where somebody has given me too much data. You can pull in all this information and answer a variety of questions, because you don't have to worry that something has been thrown out. You have everything.

You may have 100 questions, and each one of the questions uses a very small portion of the data. Those questions may use different portions of the data, a very small piece, but they're all different. If you go in thinking, "We’re going to answer the top 20 questions and we’re just going to hold data for that," that leaves so much on the table, and you don't get any value out of it.

We're big believers in mash-ups, and we really believe there is a lot of value in taking even datasets that are not big-data sized yet and, rather than going deep for more detailed information, expanding the breadth. So it's being able to augment them with other internal datasets, bridging across different business areas, as well as augmenting them with external datasets.
A lot of times you can take something that is maybe a few hundred thousand records, or a few million, and by the time you're joining it and appending different pieces of information onto it, you get to big-data sizes.
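To make that breadth-over-depth idea concrete, here's a minimal sketch in Python using pandas. The datasets, column names, and values are hypothetical, invented purely for illustration; the point is that joining a modest internal table with an external one widens each record rather than deepening it:

```python
import pandas as pd

# Hypothetical internal dataset: warranty claims by assembly plant
warranty = pd.DataFrame({
    "vin": ["1FA001", "1FA002"],
    "plant": ["Dearborn", "Kansas City"],
    "claim_cost": [410.0, 95.5],
})

# Hypothetical external dataset: regional climate by plant location
weather = pd.DataFrame({
    "plant": ["Dearborn", "Kansas City"],
    "avg_humidity": [71.2, 64.8],
})

# Broaden rather than deepen: append external columns via a join
augmented = warranty.merge(weather, on="plant", how="left")
print(augmented)
```

Repeating joins like this across several internal and external sources is how a few hundred thousand records can grow into a big-data-sized analysis set.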

Gardner: You’re really looking primarily at internal data, while also availing yourself of what external data might be appropriate. Maybe you could describe a little bit about your organization, what you do, and why this internal focus is so important for you.

Internal consultants

Cavaretta: I'm part of a larger department that is housed over in the research and advanced-engineering area at Ford Motor Company, and we're about 30 people. We work as internal consultants, kind of like Capgemini or Ernst & Young, but only within Ford Motor Company. We're responsible for going out and looking for different opportunities, from the business perspective, to bring in advanced technologies. We've been focused on the area of statistical modeling and machine learning for about 15 years.

And in this time, we've had a number of engagements where we've talked with different business customers, and people have said, "We'd really like to do this." Then we'd look at the datasets they had and say, "Wouldn't it be great if we had this data? Now we have to wait six months or a year."

These new technologies are really changing the game from that perspective. We can turn on the complete fire-hose and say we don't have to worry about that anymore. Everything is coming in. We can record it all. We don't have to worry about whether the data supports a given analysis, because it's all there. That's really a big benefit of big-data technologies.

The real value proposition definitely is changing as things are being pushed down in the company to lower-level analysts who are really interested in looking at things from a data-driven perspective. From when I first came in to now, the biggest change has been when Alan Mulally came into the company, and really pushed the idea of data-driven decisions.

Before, we were getting a lot of interest from people who were very focused on the data they had internally. After that, they got a lot of questions from their management and from upper-level directors and vice presidents saying, "We've got all these data assets. We should be getting more out of them." This strategic perspective has really changed a lot of what we've done in the last few years.

Gardner: Are we getting to the point where this sort of Holy Grail notion of a total feedback loop across the lifecycle of a major product like an automobile is really within our grasp? Are we getting there, or is this still kind of theoretical? Can we pull it all together and make it a science?

Cavaretta: The theory is there. The question has more to do with the actual implementation and the practicality of it. We're still talking about a lot of data, where even with new advanced technologies and techniques, that's a lot of data to store, a lot of data to analyze, and a lot of data to make sure we can mash up appropriately.

While I think the potential is there and the theory is there, there's also work in being able to get the data from multiple sources. Everything you can get back from the vehicle -- fantastic. Now, if you marry that up with internal data -- is it survey data, is it manufacturing data, is it quality data? -- what do you want to go after first? We can't do everything all at the same time.

Highest value

Our perspective has been: let's make sure we identify the highest-value, greatest-ROI areas, and then begin to take some of the major datasets we have, push them, and get more detail. Mash them up appropriately and really prove out the value of the technologies.

Gardner: Clearly, there's a lot more to come in terms of where we can take this, but I suppose it's useful to have a historic perspective and context as well. I was thinking about some of the early quality gurus like Deming and some of the movement towards quality like Six Sigma. Does this fall within that same lineage? Are we talking about a continuum here over the last 50 or 60 years, or is this something different?

Cavaretta: That’s a really interesting question. From the perspective of analyzing data, using data appropriately, I think there is a really good long history, and Ford has been a big follower of Deming and Six Sigma for a number of years now.

The difference, though, is this idea that you don't have to worry so much upfront about getting the data. If you're doing this right, you have the data right there, and that has some great advantages -- you don't have to wait until you've built up enough history before you can look for patterns. Then again, it also has a disadvantage: you've got so much data that it's easy to find spurious correlations, or models that don't make any sense.
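That disadvantage is easy to demonstrate with a toy experiment (illustrative only, not a Ford analysis): correlate a purely random target against 1,000 purely random columns, and with that many comparisons some correlations will look meaningful by chance alone.

```python
import numpy as np

rng = np.random.default_rng(0)
n_rows, n_cols = 100, 1000

# 1,000 random "sensor" columns and a random target: no real relationship
X = rng.normal(size=(n_rows, n_cols))
y = rng.normal(size=n_rows)

# Correlate every column with the target
corrs = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_cols)])

# Several columns look impressively correlated purely by chance
print("strongest spurious correlation:", corrs[np.abs(corrs).argmax()])
print("columns with |r| > 0.2:", int((np.abs(corrs) > 0.2).sum()))
```

With 100 rows, a handful of the 1,000 columns will clear |r| > 0.2 by luck alone, which is exactly why the domain-knowledge check described next matters.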

The piece that is required is good domain knowledge, in particular when you are talking about making changes in the manufacturing plant. It's very appropriate to look at things and be able to talk with people who have 20 years of experience to say, "This is what we found in the data. Does this match what your intuition is?" Then, take that extra step.

Gardner: How is the notion of the Internet of Things being brought to bear on your gathering of big data and applying it to the analytics in your organization?

Cavaretta: It is a huge area, and not only from the internal process perspective -- RFID tags within the manufacturing plants and out on the plant floor -- but also all of the information that's being generated by the vehicle itself.

The Ford Energi generates about 25 gigabytes of data per hour. So you can imagine selling a couple of million vehicles in the near future, with that amount of data being generated. There are huge opportunities within that, and there are also some interesting opportunities having to do with opening up some of these systems for third-party developers. OpenXC is an initiative that we have going on over at Research and Advanced Engineering.

Huge number of sensors

We have a lot of data coming from the vehicle. There's a huge number of sensors and processors being added to vehicles. There's data being generated there, as well as communication between the vehicle and your cell phone, and communication between vehicles.

There's a group over in Ann Arbor, Michigan -- the University of Michigan Transportation Research Institute (UMTRI) -- that's investigating that, as well as communication between the vehicle and, let's say, a home system. It lets the home know that you're on your way, and that it's time to raise the temperature if it's winter, or cool the house if it's summertime.

The amount of data being generated there is invaluable and could be used for a lot of benefits, both from the corporate perspective and for the environment itself.

Gardner: Just to put a stake in the ground on this, how much data do cars typically generate? Do you have a sense of what's the case now, on average?

Cavaretta: The Energi, according to the latest information that I have, generates about 25 gigabytes per hour. Different vehicles are going to generate different amounts, depending on the number of sensors and processors on the vehicle. But the biggest key has to do with not necessarily where we are right now but where we will be in the near future.

With the amount of information being generated by the vehicles, a lot of it is just internal stuff. The question is how much information should be sent back for analysis and to find different patterns. That becomes really interesting as you look at external sensors -- temperature, humidity. You can know when the windshield wipers go on, and then take that information and mash it up with other external data sources too. It's a very interesting domain.
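As a rough illustration of that "how much should be sent back" question, here's a sketch that aggregates a stream of OpenXC-style JSON messages on the vehicle side, so only a compact summary need be uploaded. The name/value message shape follows OpenXC's published convention, but the signal names, sample values, and aggregation policy here are assumptions for the example:

```python
import json

# A few OpenXC-style JSON messages (format assumed; see openxcplatform.com)
raw = [
    '{"name": "vehicle_speed", "value": 72.4}',
    '{"name": "windshield_wiper_status", "value": true}',
    '{"name": "vehicle_speed", "value": 73.1}',
]

# Keep a running summary on the vehicle; send back only the aggregate
summary = {}
for line in raw:
    msg = json.loads(line)
    name, value = msg["name"], msg["value"]
    if isinstance(value, bool):
        summary[name] = summary.get(name, 0) + int(value)  # count "on" events
    else:
        count, total = summary.get(name, (0, 0.0))
        summary[name] = (count + 1, total + value)          # running mean inputs

print(summary)  # {'vehicle_speed': (2, 145.5), 'windshield_wiper_status': 1}
```

Filtering and summarizing at the edge like this is one plausible way to keep 25 gigabytes per hour per vehicle from all flowing back for central analysis.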

Gardner: What skills do you target for your group, and what ways do you think that you can improve on that?

Cavaretta: The skills that we have in our department, in particular on our team, are in the area of computer science, statistics, and some good old-fashioned engineering domain knowledge. We’ve really gone about this from a training perspective. Aside from a few key hires, it's really been an internally developed group.

Targeted training

The biggest advantage that we have is that we can go out and be very targeted with the training we do. There are such great tools out there, especially in the open-source realm, that we can spin things up at relatively low cost and low risk, and do a number of experiments in the area. That's really the way we push the technologies forward.

Talking with The Open Group really gives me an opportunity to bring people on board with a difference in mindset. It's not, "Here's how the data is being generated; try to conceive of some questions we can ask, and we'll store just that." It's, "Let's take everything, worry about it later, and then find the value."

It's important to think about data as an asset, rather than as a cost. You may even have to spend some money, and it may feel a little unsafe without a really solid ROI at the beginning. Then, move toward pulling that information in and storing it in a way that allows not just the high-level data scientists to get access and provide value, but anyone who is interested in the data overall. Those are very important pieces.

The last one is how you take a big-data project -- where you're not storing data in the traditional business intelligence (BI) framework that an enterprise develops -- and connect it to the BI systems, so those mash-ups provide value there too. Those are really important areas that still need some work.

There are many companies, especially large enterprises, that are looking at their data assets and wondering what they can do to monetize them -- not only to pay for efficiency improvements, but as a new revenue stream.

Gardner: For those organizations that want to get started on this, how do you get started?

Cavaretta: We're definitely huge believers in pilot projects and proofs of concept, and we like to develop roadmaps by doing. So get out there. Understand that it's going to be messy. Understand that it may be a little bit more costly and the ROI isn't going to be there at the beginning.

But get your feet wet. Start doing some experiments, and then, as those experiments turn from just experimentation into really providing real business value, that’s the time to start looking at a more formal aspect and more formal IT processes. But you've just got to get going at this point.
Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: The Open Group.


Sunday, January 27, 2013

Improving signal-to-noise in risk management

This guest post comes courtesy of Jack Jones, principal of CXOWARE. He is also the author and creator of the Factor Analysis of Information Risk (FAIR) Framework. Jones will be a speaker at The Open Group Conference in Newport Beach, California this week.

By Jack Jones, CXOWARE

One of the most important responsibilities of the information security professional (or any IT professional, for that matter) is to help management make well-informed decisions. Unfortunately, this has been an elusive objective when it comes to risk. Although we're great at identifying control deficiencies, and we can talk all day long about the various threats we face, we have historically had a poor track record when it comes to measuring and communicating risk itself. There are a number of reasons for this, but in this article I'll focus on just one -- definition.

You’ve probably heard the old adage, “You can’t manage what you can’t measure.”  Well, I’d add to that by saying, “You can’t measure what you haven’t defined.” The unfortunate fact is that the information security profession has been inconsistent in how it defines and uses the term “risk.” Ask a number of professionals to define the term, and you will get a variety of definitions. 

Besides inconsistency, another problem regarding the term “risk” is that many of the common definitions don’t fit the information security problem space or simply aren’t practical. For example, the ISO27000 standard defines risk as, “the effect of uncertainty on objectives.” What does that mean? Fortunately (or perhaps unfortunately), I must not be the only one with that reaction because the ISO standard goes on to define “effect,” “uncertainty,” and “objectives,” as follows:
  • Effect: A deviation from the expected -- positive and/or negative
  • Uncertainty: The state, even partial, of deficiency of information related to, understanding or knowledge of, an event, its consequence or likelihood
  • Objectives: Can have different aspects (such as financial, health and safety, information security, and environmental goals) and can apply at different levels (such as strategic, organization-wide, project, product and process)
NOTE: Their definition for "objectives" doesn't appear to be a definition at all, but rather an example.

Practical Concern

Although I understand, conceptually, the point this definition is getting at, my first concern is practical in nature. As a Chief Information Security Officer (CISO), I invariably have more to do than I have resources to apply. Therefore, I must prioritize and prioritization requires comparison and comparison requires measurement. It isn’t clear to me how “uncertainty regarding deviation from the expected (positive and/or negative) that might affect my organization’s objectives” can be applied to measure, and thus compare and prioritize, the issues I’m responsible for dealing with. 

This is just an example though, and I don’t mean to pick on ISO because much of their work is stellar. I could have chosen any of several definitions in our industry and expressed varied concerns.

In my experience, information security is about managing how often loss takes place, and how much loss will be realized when/if it occurs. That is our profession’s value proposition, and it’s what management cares about. Consequently, whatever definition we use needs to align with this purpose. 

The Open Group’s Risk Taxonomy (shown below), based on Factor Analysis of Information Risk (FAIR), helps to solve this problem by providing a clear and practical definition for risk. In this taxonomy, Risk is defined as, “the probable frequency and probable magnitude of future loss.” 

The elements below risk in the taxonomy form a Bayesian network that models risk factors and acts as a framework for critically evaluating risk. This framework has been evolving for more than a decade now and is helping information security professionals across many industries understand, measure, communicate and manage risk more effectively.
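As a rough illustration of how that definition can be made computable, here is a minimal Monte Carlo sketch in the spirit of FAIR's frequency-times-magnitude framing. The distributions and every parameter below are invented for illustration; they are not values from FAIR or The Open Group taxonomy:

```python
import random

random.seed(42)

# Risk as "the probable frequency and probable magnitude of future loss":
# simulate many years, each with a random number of loss events and a
# random cost per event. All parameters here are illustrative assumptions.
TRIALS = 100_000
annual_losses = []
for _ in range(TRIALS):
    events = random.randint(0, 4)  # loss event frequency for the year
    # Loss magnitude per event, skewed toward small losses
    year_total = sum(random.lognormvariate(10, 1) for _ in range(events))
    annual_losses.append(year_total)

annual_losses.sort()
mean_loss = sum(annual_losses) / TRIALS
p95 = annual_losses[int(TRIALS * 0.95)]
print(f"expected annual loss: ${mean_loss:,.0f}")
print(f"95th percentile year: ${p95:,.0f}")
```

A distribution like this is the kind of measurement that supports the comparison and prioritization the CISO role demands, in a way that "uncertainty on objectives" does not.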

In the communications context, you have to have a very clear understanding of what constitutes signal before you can effectively and reliably filter it out from noise. The Open Group’s Risk Taxonomy gives us an important foundation for achieving a much clearer signal.

I will be discussing this topic in more detail next week at The Open Group Conference in Newport Beach. For more information on my session or the conference, visit: http://www.opengroup.org/newportbeach2013.

This guest post comes courtesy of Jack Jones, principal of CXOWARE. He is also the author and creator of the Factor Analysis of Information Risk (FAIR) Framework. Jones will be a speaker at The Open Group Conference in Newport Beach, California this week.

Copyright The Open Group, 2013. All rights reserved.


Tuesday, January 22, 2013

Dell survey highlights importance of putting users before devices when developing BYOD strategies

The Bring Your Own Device (BYOD) trend has rapidly morphed from a disruptive worry into a strategic, positive initiative for many organizations across the globe, according to results from a new global survey released today.

In fact, significant business benefits often follow when BYOD-adopter companies empower their employees with their preferred devices and work habits, as well as more say in the applications and data access they need to get their jobs done … anytime, from anywhere, the survey shows.
 
What's more, the rapidly improved productivity and enhanced collaboration among employees -- as well as greater communication and better service with customers, suppliers and partners -- are proving to be competitive advantages for companies that do BYOD best.

The survey was recently conducted by Dell Software Group and market researcher Vanson Bourne, polling nearly 1,500 IT executives around the world. It clearly demonstrates that, while BYOD shows promise, many organizations are struggling with making the most of it. [Disclosure: Quest Software, part of Dell's Software Group, is a sponsor of BriefingsDirect podcasts.]

The majority of survey respondents agree that BYOD strategies deliver benefits -- more than 70 percent said they believe that BYOD can boost employee productivity and customer response time, and 59 percent say they feel their company could be at a competitive disadvantage if they didn't implement BYOD.


And attitudes toward the innovation impact that BYOD has are largely the same among the midsized (between 1,000 and 3,000 employees) and large (more than 3,000 employees) companies surveyed. Roughly half of respondents in both groups say that BYOD has significantly changed the business culture at their organizations.

Yet less than half of the IT leadership respondents -- 44 percent -- say they understand the importance of a user-centric approach to BYOD. And this is where many organizations are missing the boat, according to Dell Software.

"It's important that organizations define a program to help manage and protect their corporate information, but also to empower employees to do their jobs better and faster, or to improve customer satisfaction, or whatever the goal is," says Roger Bjork, Director of Enterprise Mobility Solutions at Dell Software Group. "Companies that do BYOD for a purpose, to help with a broader goal, are seeing better results versus those who simply let BYOD happen. A big part of that is focusing on the user aspect, and not limiting it to a device discussion."

To me, these results should remove doubt that embracing desktop virtualization (VDI) and mobile device management are priorities for IT. I also think that BYOD is a catalyst to more general IT transformation and a more business-centric emphasis for IT innovation. You could say that BYOD forms a capstone on the rising archway of VDI, web apps, thin clients, terminal services, and other application delivery advancements over the past 15 years.

The idea, of course, going back to the very first PCs, is to let the user decide how to work best.

Benefits of putting users first

Indeed, those survey respondents who say they believe the user-centric approach to BYOD is the right one also reported they are able to drive business benefits, satisfy users and gain a competitive edge to a higher degree than survey respondents who don't see the benefit of a user-centric approach. Some 64 percent of respondents say BYOD works well when the specific user needs are understood by IT.

And respondents at companies further along in their BYOD strategy implementations are more likely to agree that the most benefit is derived from programs that put the user first. Understanding individual user needs and the resulting improvements in employee productivity and satisfaction are much more important pillars of a BYOD strategy than simply allowing employees to use the device in their pocket or purse for work, says Bjork.


"You do a disservice if you make this about what shiny object do employees want to use to connect to email. It's much broader than that," he says. "I think the survey data definitely shows companies that have well thought-out BYOD strategies have fewer issues and are garnering better results."

A one-size-fits-all approach to BYOD does not work, says Bjork; IT should manage the spread of BYOD. "You can't just let BYOD happen," he added.

A user-centric approach to BYOD fits with the recent trend for IT to transform into a service-delivery organization that understands not only the goals of the business, but how to support employees to help them achieve these goals, adds Bjork. What's more, as the Millennial Generation enters the workforce with certain digital expectations, user-centric BYOD policies can be leveraged as recruiting tools.

"Let's face it, this is not just a mobile thing. It has a much broader impact," says Bjork. "BYOD is a method, not a result. … You should start by asking 'What is the user trying to accomplish?'"

The bottom line is that IT now needs to support multiple ways of working for a variety of styles and devices that appeal and adjust based on user preferences and innovations.


Additional points

Other noteworthy points that arose from the survey:

  • Survey participants in the U.S. are least likely to stress users over devices when crafting a BYOD strategy, while respondents in Singapore were most likely to do so. Other geographies covered by the survey include the U.K., France, Germany, Italy, Australia, India and the Beijing region of China;
  • Respondents listed more flexible working hours, the ability to foster creativity, speed innovation, boost morale and facilitate teamwork and collaboration as personal gains for employees working with BYOD strategies;
  • Organizations that consider applications as part of their BYOD strategy are more likely to link and manage devices per user; clearly define roles for their user community in one central database; track and support each user's level of mobility; and deliver applications to users based on their role within the company. Of those with no formal BYOD policy, 27 percent say they can't provide any of these functions;
  • The top five BYOD challenges listed by survey respondents are abuse of policies; theft/loss of mobile devices; lack of control over applications and data on devices; employees leaving the company with insider knowledge; and unauthorized data distribution.
Dell's position on the importance of a user-centric BYOD strategy comes from experience. Carol Fawcett was the CIO of Quest Software -- recently acquired by Dell -- a company with nearly 4,000 employees in 23 countries. The company focused on giving employees access to the applications and data they needed to get their jobs done, regardless of what device was being used.


"We found this approach helped us quickly move out of device firefighting mode to be much more strategic, which also enabled us to resolve our biggest BYOD problems, such as security, access
rights and data leakage," says Fawcett. "The results of this latest BYOD survey reinforce the importance of putting users first in order to develop the most effective policies and turn BYOD into a long-term, sustainable business benefit."

As Quest found with its own workers, the better planned the BYOD acceptance, the fewer negative issues and the better the productivity result. I recently spoke with Fawcett at length about the experience.

I think we should also expect that BYOD planning will be done in association with -- and as an accelerant to -- other larger initiatives such as data center consolidation, IT transformation, and applications modernization.

Bjork further suggested that companies that take a policy-based approach to security and access control, and that adopt service-oriented architectures and data lifecycle management, will be in a better position to avail themselves of BYOD faster and at lower risk. VDI and thin-client initiatives also pave the way to BYOD.

(BriefingsDirect contributor Cara Garretson provided editorial assistance and research on this post. She can be reached on LinkedIn.)



Thursday, January 17, 2013

Latest Jitterbit release further eases application and data integration from among modern sources

Data and apps integration provider Jitterbit this week released a new version of its solution, Jitterbit 5, designed to be the glue between on-premise, cloud, social, and mobile data.

Jitterbit focuses on simple yet powerful integration technologies that can be quickly and easily deployed to create integrated processes and data views. We've seen a lot of interest in lightweight, low-code integration capabilities as more SaaS and cloud services need to be coordinated. This is becoming even more pertinent as data must be brought together from a variety of sources.

Jitterbit 5 aims to raise the level of simplicity even higher with new features that streamline process integration, said the Oakland, CA company. A wizard-based approach allows non-technical users to design integration projects through a graphical, point-and-click interface. Enabling more people to tailor and specify integrations can significantly boost innovation and productivity.

In enterprise computing today, there are three main sources of data that must come together to help drive the business forward, according to Jitterbit's thinking. First there's corporate data -- which for years has been the cornerstone of technology strategies -- that sits in databases, data warehouses, enterprise applications, etc. and is typically kept safe and sound on-premise, behind the firewall.

Over the past few years, two other sources of data have emerged as critical for businesses that want to optimize their operations and better serve their customers: data stored in cloud services, and data from a pair of new platforms -- social and mobile. And we'll no doubt see ever larger and more specific data emerge from business and consumer activities in these domains.

These newer sources of data can be located anywhere, and the information they provide comes in a wide variety of formats, making it harder than ever to integrate with structured corporate information using traditional integration technologies.

Three pillars

Jitterbit's focus therefore is to help enterprises better achieve integration of data from all these three pillars of modern computing. And the means to do it must appeal to the business analysts who understand best the need to have many types of different data readily available and associated with business processes in near real time.

"Vendors have been trying to solve the issue of integration of technology for over 20 years.
The majority of companies come at it with a technical perspective -- they try to solve the problem for the professional developer," says Andrew Leigh, vice president of products with Jitterbit.

"But the problem of integration isn't just a technical issue; it's a business issue. The people who are best at building, managing, and changing integration are the ones that understand it's really a process. We're putting integration back in the hands of the business analysts who really understand the data and processes to make that integration effective."

While Jitterbit features wizards and other simple tools to let non-technical users quickly build the data connections that the business requires, it's important that they work in partnership with IT to ensure the process is governed correctly, says Leigh, who recently joined Jitterbit from Salesforce.com.

"We've built all the knowledge and best practices that the industry has been building up over the last two decades into our solution; now we're focused on the user experience and hiding complexity," says Leigh.

This latest release also features enhanced connectivity to Salesforce, Microsoft Dynamics, and SAP, as well as Twitter and Chatter. The new Instant View and Process Monitor tools provide visibility into the status and results of more complex business process integrations. And Version 5.0 supports large-volume cloud APIs to allow organizations to rapidly synchronize large volumes of data at higher levels of performance.

Jitterbit's approach also fits into the vision of "integration as a service," which seems a natural development of cloud models. I'd like to see more cloud services providers embed such integration services into their offerings. This is especially important for PaaS to go mainstream.

A video describing the new features in Jitterbit 5.0, available now, can be found here. A free 30-day trial of the product is available here.

(BriefingsDirect contributor Cara Garretson provided editorial assistance and research on this post. She can be reached on LinkedIn.)


Wednesday, January 16, 2013

The other shoe drops: SAP puts ERP on HANA

This guest post comes courtesy of Tony Baer's OnStrategies blog. Tony is senior analyst at Ovum.

By Tony Baer

It was never a question of whether SAP would bring its flagship product, Business Suite, to HANA, but when. And when I saw this while parking the car at my physical therapist's over the holidays, I should've suspected that something was up: SAP at long last was about to announce … this.

From the start, SAP has made clear that its vision for HANA was not a technical curiosity, positioned as some high-end niche product or sideshow. In the long run, SAP was going to take HANA to Broadway.
SAP product rollouts on HANA have proceeded in logical, deliberate fashion. Start with the lowest hanging fruit, analytics, because that is the sweet spot of the embryonic market for in-memory data platforms. Then work up the food chain, with the CRM introduction in the middle of last year – there’s an implicit value proposition for having a customer database on a real-time system, especially while your call-center reps are on the phone and would like to either soothe, cross-sell, or upsell the prospect. Get some initial customer references with a special purpose transactional product in preparation for taking it to the big time.

There’s no question that in-memory can have real impact, from simplifying deployment to speeding up processes and enabling more real-time agility. Your data integration architecture is much simpler and the amount of data you physically must store is smaller. SAP provides a cute video that shows how HANA cuts through the clutter.

For starters, when data is in memory, you don't have to denormalize or resort to tricks like sharding or striping of data to enhance access to "hot" data. You also don't have to create staging servers to perform ETL if you want to load transaction data into a data warehouse. Instead, you submit commands or routines that, thanks to processing speeds SAP claims are up to 1,000x faster than disk, convert the data almost instantly into the form in which you need to consume it. And when you have data in memory, you can perform more ad hoc analyses. In the case of production and inventory planning (a.k.a. the MRP portion of ERP), you could run simulations when weighing the impact of changing or submitting new customer orders, or judging the impact of changing sourcing strategies when commodity prices fluctuate. Beta customer John Deere achieved positive ROI based solely on the benefits of implementing it for pricing optimization (SAP has roughly a dozen customers in ramp-up for Business Suite on HANA).
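The underlying principle -- query the transactional rows directly instead of maintaining a denormalized or staged copy -- can be illustrated with any in-memory store. The sketch below uses SQLite's in-memory mode in Python purely as a stand-in; it is not HANA, and the table and figures are invented:

```python
import sqlite3

# An in-memory table of raw transactional rows (illustrative data)
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (region TEXT, product TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("EMEA", "tractor", 90000.0), ("EMEA", "loader", 40000.0),
     ("AMER", "tractor", 120000.0)],
)

# Ad hoc aggregation on demand: no ETL step, no denormalized copy to maintain
for row in con.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
):
    print(row)  # ('AMER', 120000.0), ('EMEA', 130000.0)
```

The in-memory claim is that aggregations like this stay fast enough at enterprise scale that the staging and pre-aggregation layers become optional.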

Supply chain lag time

It’s not a question of whether you can run ERP in real time. No matter how fast you construct or deconstruct your business planning, there is still a supply chain that introduces its own lag time. Instead, the focus is how to make enterprise planning more flexible, enhanced with built-in analytics.
But how hungry are enterprises for such improvements? To date, SAP has roughly 500 HANA installs, primarily for Business Warehouse (BW), where the in-memory data store was a logical upgrade for analytics and where demand for in-memory is more established. But on the transactional side, it's more of an uphill battle, as enterprises are not clamoring to conduct forklift replacements of their ERP systems, not to mention their databases. Changing both is no trivial matter; in fact, changing databases is even rarer because of the specialized knowledge required. Swap out your database, and you might as well swap out your DBAs.
There’s no question that in-memory can have real impact, from simplifying deployment to speeding up processes and enabling more real-time agility.

The best precedent is Oracle, which introduced Fusion Applications two years ago. Oracle didn't necessarily see Fusion as a replacement for E-Business Suite, JD Edwards, or PeopleSoft. Instead, it viewed Fusion Apps as a gap filler for new opportunities among its installed base, or for the rare greenfield enterprise install. We'd expect no less from SAP.

Yet in the exuberance of rollout day, SAP was speaking of the transformative nature of HANA, claiming it “Reinvents the Real-Time Enterprise.” It’s not the first time that SAP has positioned HANA in such terms.

Yes, HANA is transformative when it comes to how you manage data and run applications, but let’s not get caught down another path to enterprise transformation. We’ve seen that movie before, and few of us want to sit through it again.

This guest post comes courtesy of Tony Baer's OnStrategies blog. Tony is senior analyst at Ovum.


Tuesday, January 15, 2013

The Networked Economy forces new directions for collaboration in business and commerce, says author Zach Tumin

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: Ariba.

New levels of collaboration have emerged from an increasingly networked world, and business leaders and academic researchers alike are now sorting out what the new capabilities mean for both commerce and society at large.

To learn more about how rapid trends in collaboration and business networking are driving new innovation and social interactions, BriefingsDirect invited a Harvard Kennedy School researcher and a chief strategist at global business network provider Ariba to a panel discussion.

We joined up Zach Tumin, Senior Researcher at the Science, Technology, and Public Policy Program at the Harvard Kennedy School, and Tim Minahan, Senior Vice-President of Global Network Strategy and Chief Marketing Officer at Ariba, an SAP company.

Tumin is co-author with William Bratton of 2012’s Collaborate or Perish: Reaching Across Boundaries in a Networked World, published by Random House. Minahan, at Ariba, is exploring how digital communities are redefining and extending new types of business and collaboration for advanced commerce. 

The chat is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: Ariba is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: Zach, in your book "Collaborate or Perish," you're exploring collaboration and you show what it can do when it's fully leveraged. And, Tim, at Ariba you've been showing how a more networked economy is producing efficiencies for business and even extending the balance of what we would consider commerce to be. I’d like to start with looking at how these come together.

Tumin: The opportunities for collaboration are expanding even as we speak. The networks around the world are volatile. They're moving fast. The speed of change is coming at managers and executives at a terrific pace. There is an incredible variety of choice, and people are empowered with these great digital devices that we all have in our pockets.

That creates a new world, where the possibilities are tremendous for joining forces, whether politically, economically, or socially. Yet it's also a difficult world, where we don't have authority, if we have to go outside of our organizations -- but where we don't have all the power that we need, if we stay within the boundaries of our charters.

So, we're always reaching across boundaries to find people who we can partner with. The key is how we do that. How do we move people to act with us, where we don't have the authority over them? How do we make it pay for people to collaborate?

A lot of change

Minahan: Collaboration certainly is the new business imperative. Companies have leaned out their operations over the past couple of years, after spending the previous 30 years focusing on their internal operations and efficiencies, driving greater performance, and getting greater insights.

When they look outside their enterprise today, it's still a mess. Most of the transactions still occur offline or through semi-automated processes. They lack transparency into those processes and efficiency in executing them. That means lots of paper, lots of people, and lots of missed opportunities, whether in capitalizing on getting a new product to market or achieving new sales with new potential customers.

What business networks and this new level of collaboration bring are four things. First, they bring the transparency that's currently lacking into the process, so you know where your opportunities are, where your orders are, where your invoices are, and what your exposure to payables is.

Second, they bring new levels of efficiency in executing against those processes, much faster than you ever could before, through mostly automated processes. Third, they bring new types of collaboration, which I'm sure we'll get into later in this segment.

The last part, which I think is most intriguing, is that they bring new levels of insight. We're no longer making decisions blindly. We no longer need to double-order because we don't know whether a shipment is coming in and we need to stockpile, because we can't let the refinery go down. They bring new levels of insight for making more informed decisions in real time.

Gardner: Zach, in your book you're basically describing a new workforce, and some companies and organizations are recognizing that and embracing it. What’s driving this? What has happened that is basically redefining the workforce?

It's the demographics

Tumin: It's in the demographics, Dana. Young people are accustomed to doing things today that were not possible 10 years ago -- the digital power in everyone's pocket or pocketbook, the digital wallet, and markets that are ready, willing, and able to deal with them and to welcome them. That means there's pressure on organizations to integrate and take advantage of the power that individuals have in the marketplace and bring with them into the workforce.

Everyone can see what's going on around the world. We're moving to a situation where young people are feeling pretty powerful. They're able to search, find, discover, and become experts all on their own through the use of technologies that 10 years ago weren’t available.

So a lot of the traditional ways of thinking about power, status, and prestige in the workforce are changing as a result, and the organizations that can adapt and adopt these kinds of technologies and turn them to their advantage are the ones that are going to prevail.

Gardner: Tim, with that said, there's this demographic shift, the shift in the mentality of self-started discovery of recognizing that the information you want is out there, and it’s simply a matter of applying your need to the right data and then executing on some action as a result. Your network seems ready-made for that.


Minahan: The reality of the community is that it is organic. It takes time to grow. At Ariba we have more than 15 years of transactional history, relationship history, and community-generated content that we've amassed. In fact, over the past 12 months, nearly a million connected companies have executed more than $400 billion in purchase, sales, invoice, and payment transactions over the Ariba Network.

Aggregate that over 15 years, and you have some great insights beyond just trading efficiencies for those companies participating there. You can deliver insights to them so that they can make more informed decisions, whether that’s in selecting a new trading partner or determining when or how to pay.

Should I take an early-payment discount in order to accelerate or reduce my cost basis? From a sales standpoint, or seller’s standpoint, should I offer an early payment discount in order to accelerate my cash flow? There are actually a host of examples where companies are taking advantage of this today and it’s not just for the large companies. Let me give you two examples.

From the buyer side, there was a company called Plaid Enterprises. Plaid is a company that, if you have daughters like I do who are interested in hobbies and creating crafts, you are very familiar with. They're one of the leading providers of the do-it-yourself crafts that you would get at your craft store.

Like many other manufacturers, they were a midsized company, but they decided a couple of years ago to offshore their supply. So they went to the low-cost region of China. A few years into it, they realized that labor wages were rising, their quality was declining and, worse than that, it was sometimes taking them five months to get their shipments.

New sources of supply

So they went to the Ariba Network to find new sources of supply. Like many other manufacturers, they thought, "Let’s look in other low cost regions like Vietnam." They certainly found suppliers there, but what they also found were suppliers here in North America.

They went through a bidding process with the suppliers they found there, with the qualifying information on who was doing business with whom and how they performed in the past, and they wound up selecting a supplier that was 30 miles down the road. They wound up getting a 40 percent cost reduction from what they had previously paid in China and their lead times were cut from more than 120 days down to 30.

That's from the buy side. From the sell side, the inverse is true. I'll use the example of a company called Mediafly. It's a fast-growing company that provides mobile marketing services to some of the largest companies in the world -- large entertainment companies, large consumer products companies.

They were asked to join the Ariba Network to automate their invoicing, and they have gotten some great efficiencies from that. They've gotten the transparency to know when their invoices are paid. But one other thing was really interesting.

Once they were in the networked environment and once they had automated those processes, they were now able to do what we call dynamic discounting. That meant when they want their cash, they can make offers to their customers that they're connected to on the Ariba Network and be able to accelerate their cash.

So they were able not only to shrink their quote-to-settle cycle by 84 percent, but also to gain access to new financing and capital through the Ariba Network. They could go out and hire that new developer to take on a new project, and they were even able to defer their next round of funding, because they had greater control over their cash flow.
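To see why dynamic discounting moves cash, consider the implied annualized rate of a classic early-payment discount. The terms below are the textbook "2/10 net 30" example, not Mediafly's or Ariba's actual figures:

```python
# Back-of-envelope for the early-payment decision discussed above.
discount = 0.02          # 2% off the invoice for paying early
days_early = 20          # pay on day 10 instead of day 30

# The buyer gives up the cash 20 days sooner and "earns" the discount;
# expressed as a return on the cash actually paid, then annualized.
period_return = discount / (1 - discount)
annualized = period_return * (365 / days_early)

print(f"period return: {period_return:.2%}")       # ~2.04%
print(f"simple annualized rate: {annualized:.1%}") # ~37.2%
```

A simple annualized rate in the high double digits is why buyers with idle cash like offering early payment, and why sellers who need working capital will take the haircut.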

Gardner: Zach, in listening to Tim, particularly that discovery process, we're really going back to some principles that define being human -- collaboration, word of mouth, sharing information about what you know. It just seems that we have a much greater scale that we can deploy this. How is that fundamentally changing how people are relating in business and society?

Tumin: The scaling means that things can get big in a hurry and they can get fast in a hurry. So you get a lot of volume, things go viral, and you have a velocity of change here. New technologies are introducing themselves to the market. You have extraordinary volatility on your network and that can rumble all the way through, so that you feel it seconds after something halfway around the world has put a glitch in your supply chain. You have enormous variability. You're dealing with many different languages, both computer languages and human languages.

That means that the potential for collaboration really requires coming together in ways that help people see very quickly why they should work together, rather than go it alone. They may not have a choice, but people are still status-quo animals. We're comfortable with the way we've always done business, and it takes a lot to move us out of it.

It comes down to people

When crisis hits, it's not exactly a great time to build those relationships. Speaker of the House Tip O'Neill here in the United States once said, "Make friends before you need them." That's good advice. We have great technology and we have great networks, but at the end of the day, it's people that make them work.

People rely on trust, and trust relies on relationships. Technology here is a great enabler but it’s no super bullet. It takes leadership to get people together across these networks and to then be able to scale and take advantage of what all these networks have to offer.

Gardner: Tim, another big trend today of course, is the ability to use all of this data that Zach has been describing, and you are alluding to, about what’s going on within these networks. Now, of course, with this explosive scale, the amount of that data has likewise exploded.

Minahan: We've only begun to scratch surface on this. When you look at the data that goes on in a business commerce network, it’s really three levels. One is the transactional data, the actual transactions that are going on, knowing what commodities are being purchased and so on. Then, there's relationship data, knowing the relationship between a given buyer and seller.

Finally, there's what I would call community data, or community-generated data, and that can take the form of performance ratings -- buyers rating suppliers and suppliers rating buyers. Others in the community can use that to help determine whom to do business with, or to help detect risk in their supply chain.

There's also community-generated content, like request for proposal (RFP) templates. A lot of our community members use a "give a template, take a template" approach, offering RFP templates that have worked well for them to other members of the community. These can be templates on how to source temp labor or how to source corrugated packaging.

We have dozens and dozens of those. When you aggregate all of this, the last part of the community data is the benchmarking data. It's understanding not just process benchmarking but also spend benchmarking.

One of the reasons we're so excited about getting access to SAP HANA is the ability to offer this information up in real time, at the point of either purchase or sale decision, so that folks can make more informed decisions about who to engage with or what terms to take or how to approach a particular category. That is particularly powerful and something you can’t get in a non-networked model.

Gardner: One of the things I sense, as people grapple with these issues, is a difficulty in deciding where to let creative chaos reign and where to exercise control -- where to lock down and apply traditional IT imperatives around governance, command and control, and systems of record.

Zach, in your book with William Bratton, are there any examples that you can point to that show how some organizations have allowed that creativity of people to extend their habits and behaviors in new ways unfettered and then at the same time retain that all-important IT control?

Tumin: It's a critical question that you've raised. We have young people coming into the workforce who are newly empowered. They understand how to do all the things they need to do without waiting in line and without waiting for authority. Yet they're coming into organizations that have strong cultures and strong command-and-control hierarchies.

There's a clash that’s happening here, and the strong companies are the ones that find the path to embracing the creativity of networked folks within the organization and across their boundaries, while maintaining focus on set of core deliverables that everyone needs to do.

Wells Fargo

There are plenty of terrific examples. I will give you one. At Wells Fargo, Steve Ellis was the Executive Vice President responsible for developing the online capability for the wholesale shop. He had to take his group offline to develop the capability, but he had two responsibilities. One was to the bank, which had a history of security and trust. That was its brand and its reputation. The other was to the online world -- to variability, to choice, and to developing exactly the things that customers want.

Steve Ellis found a way of working with his core group of developers to engage customers in the co-design of Wells Fargo's online presence for the wholesale side. As a result, they were able to develop systems that were so integrated with the customers over time that they could move very, very quickly and adapt as new developments required, yet give free rein to the creativity of the designers, as well as to the customers, in coming to these new ways of doing business.

So here's an example of a pretty staid organization -- 150 years old, with a reputation for trust and security -- making its way into the roiling waters of the networked world, and finding a path through engagement that helped it prevail in the marketplace over a decade.

Minahan: I'd also like to talk about the dynamics going on that are fueling more B2B collaboration. There is certainly the need for more productivity. So that's a constant in business, particularly as we're in tight environments. Many times companies are finding they are tapped out within the enterprise.

Becoming more dependent

Companies are becoming more and more dependent on getting insights and collaborating with folks outside their enterprise.

So policies do need to be put down. Just as many businesses put down policies on social media, there need to be policies on how we share information and with whom. The great thing about technology is that it can enforce those controls. It can help put in checks and balances and give you full transparency and an audit trail, so you know that these policies are being enforced and that there are certain parameters around the security of data.

You don't have those controls in the offline world. When paper leaves the building, you don't know. But when a transaction is shared or when information is shared over a network, you, as a company, have greater control. You have greater insight, and the ability to track and trace.

So there is this balancing act going on between "opening the kimono," as we talked about in the '80s -- being able to share more information with your trading partners -- and being able to do it in a controlled environment that is digitized and process-oriented. You have the controls you need to ensure you're protecting your business, while also growing your business.


Gardner: Tim, for the benefit of our audience, help us better understand how Ariba is helping to fuel this issue of safely allowing creativity and new types of collaboration, but at the same time maintaining that the important principles of good business.

Minahan: The problem we solve at Ariba is quite basic, yet one of the biggest impediments to business productivity and performance that still exists. That's around inter-enterprise collaboration or collaboration between businesses.

We talked about the deficits there earlier. Through our cloud-based applications and business network, we eliminate all of the hassles, the paper, the phone calls, and other manual or disjointed activities that companies go through each day to do things like find new suppliers, find new business opportunities as a seller, place or manage orders, collaborate with customers, suppliers, and other partners, or just get paid.

Nearly a million businesses today are digitally connected through the Ariba Network. They're empowered to discover one another in new ways, getting qualifying information from the community, so that they know who the other party is even if they haven't met before. It's similar to what you see on eBay: when you want to sell your golf clubs, you know that the buyer has a performance history of doing business with others.

They can connect with known trading partners much more efficiently and then automate the processes and the information flows between each other. Then, they can collaborate in new ways, not only to find one another, but also to get access to preferred financing or new insights into market trends that are going on around particular commodities.

That's the power of bringing a business network to bear in today's world. It's this convergence of cloud applications -- the ability to access and automate a process, with those that share the process sharing the underlying infrastructure -- and a digitally connected community of relevant parties, whether that's customers, suppliers, potential trading partners, banking partners, or other participants involved in the commerce process.

Sharing data

Gardner: When it comes to exposing the data from these processes, assuming we can do it safely, what can we do now that really wasn’t possible five years ago?

Tumin: One of the things that we're seeing around the world is that innovation is taking place at the level of individual apps and individual developers. There's a great example in London. London Transport had a data set and a website that people would use to find out where their trains were, what the schedule was, and what was happening on a day-to-day basis.

As we all know, passengers on mass transit like to know what's happening on a minute-to-minute basis. London Transport decided they would open up their data, and the open-data movement is very, very important in that respect. They opened the data and let developers build apps on it. A number of app developers did, and put them out on the system. The demand was so high that, initially, it crashed London Transport's site.

London Transport then put the data into the cloud, where they could handle the scale much more effectively. Within a few days, they had gone from a thousand hits per day on the website to 2.3 million in the cloud.

The ability to scale is terribly important. The ability to innovate -- to turn these open data sets over to communities of developers and make the data available to people the way they want to use it -- is terribly important. And the kinds of industry-government relations that make this possible are critical as well.

So across all those dimensions -- technology, people, politics, and the platform -- the data has to line up. You need governance and support, people to make it work, and people who trust each other and share information. These are the keys to collaboration today.

Gardner: Zach, last word to you. What do we get? What's the payoff, if we can balance this correctly? If we can allow these new wheels of innovation to spin, to scale up, but also apply the right balance, as Tim was describing, for audit trails and access and privilege controls? If we do this right, what's in the offing?

Tumin: I think you can expect four things, Dana. First is that you can expect innovations faster with ideas that work right away for partners. The partners who collaborate deeply and right from the start get their products right without too much error built-in and they can get them to market faster.

Second is that you're going to rinse out the cost of rework, whether it's from carrying needless inventory or from handling paper that you shouldn't have to touch, wherever there is cost involved. You're going to be able to rinse that out.

Third is that you're going to be able to build revenues by dealing with risk. You're going to take advantage of customer insight. You're going to make life better and that's going to be good news for you and the marketplace.

Constant learning

The fourth is that you have an opportunity for constant learning, so that insight moves to practice faster. That's really important because the world is changing so fast. You have volatility, velocity, volume, and variability; being able to learn and adapt is critical. That means embracing change, setting out the values that you want to lead by, and helping people understand them.

Great leaders are great teachers. The opportunity of the networked world is to share that insight and loop it across the network, so that people understand how to improve every day and every way the core business processes that they're responsible for.

Gardner: Allow me to extend a big thanks to our guests, Zach Tumin, Senior Researcher at the Science, Technology, and Public Policy Program at Harvard Kennedy School, and the co-author with William Bratton of Collaborate or Perish: Reaching Across Boundaries in a Networked World, and Tim Minahan, Senior Vice-President of Global Network Strategy and Chief Marketing Officer at Ariba.
Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: Ariba.
