Monday, January 18, 2016

The Open Group president, Steve Nunn, on the inaugural TOGAF User Group and new role of EA in business transformation

The next BriefingsDirect thought leadership interview explores a new user group being formed around TOGAF, The Open Group standard, and how this group will further foster the practical use of TOGAF for effective business transformation.

The discussion, which comes in conjunction with The Open Group San Francisco 2016 event on January 25, sets the stage for the next chapter in enterprise architecture (EA) for digital business success.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about the grassroots ecosystem building around transformational EA, we're joined by the President and CEO of The Open Group, Steve Nunn. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Before we get to the TOGAF User Group news, let’s relate what’s changed in the business world and why EA and frameworks and standards like TOGAF are more practical and more powerful than ever.

Nunn: One of the keys, Dana, is that we're seeing EA increasingly used as a tool in business transformation. In the past, in the early adoptions and implementations of TOGAF, it was more about redesigning EA and redesigning systems inside an organization more generally. Nowadays, with the need to transform businesses for the digital world, EA has another more immediate and more obvious appeal.

It's really an enablement tool for companies and organizations to transform their businesses for the digital world, specifically the worlds of the Internet of Things (IoT), big data, social, and mobile -- all of those things which we at The Open Group lump into something we call Open Platform 3.0. But it really is affecting business at large and the markets that our member organizations are part of. [More from Steve on the new user group.]

Gardner: TOGAF has been around for quite a while. How old is TOGAF now?

Nunn: The first version of TOGAF was published in 1993, so it's been quite some time. For a little while, we published a version every year. Once we got to Version 7.0, the refreshes and the new versions came a bit slower after that.

We're now at Version 9.1, and there is a new version being worked on. The key for TOGAF is that we introduced a certification program around it, both for tools that help people implement TOGAF and for the practitioners, the individuals who are actually using it. We did that with Version 8.0, and then we moved to what we consider, and the marketplace certainly considers, to be an improved version with TOGAF 9.0, making it an exam-based certification. It has proved to be very popular indeed, with more than 50,000 certified individuals under that program to date.

Gardner: Now the IT world, the business world, many things about these worlds have changed since 1993. Something that comes to mind, of course, is the need to not just think about architecture within your organization, but how that relates across boundaries of many organizations.

I sometimes tease friends who are Star Trek fans that we have gone from regular chess to 3-D chess, and that’s a leap in complexity. How does this need to better manage Boundaryless Information Flow make EA and standards like TOGAF so important now?

Common vocabulary

Nunn: With the type of change that you talked about and the level of complexity, what standards like TOGAF and others bring is commonality and the ability to make architecting organizations a little bit easier; to give it all a bit more structure. One of the things that we hear is most valuable about TOGAF, in particular, is the common vocabulary that it gives to those involved in a business transformation, which obviously involves multiple parts of an organization and multiple partners in a group of organizations, for example.

So, it’s not just for enterprise architects. We're hearing increasingly about a level of training and introductory use of TOGAF at all levels of an organization as a means of communicating and having a common set of terminology. So everyone has the same expectation about what particular terms mean. With added complexity, we need things to help us work through that and divide up the complexity into different layers that we can tackle. EA and TOGAF, in particular, are proving very popular for tackling those levels of complexity.

Gardner: So in the next chapter, these things continue to evolve, react to the market, and adjust. We're hearing that there is news at the event, the January 25 event in San Francisco, around this new user group. Tell me why we're instituting a user group associated with TOGAF at this point.

Nunn: It's going to be the first meeting of a TOGAF User Group, and it's something we have been thinking about for some time, but the time seems to be now. I've alluded to the level of popularity of TOGAF, but it really is becoming very widely used. What users of TOGAF are looking for is how to better use it in their day jobs. How can they make it effective? How can they learn from what others have done, both good and bad -- the things to try and the things not to try, or rather the things that worked and the things that didn't? That isn't something that we've necessarily offered, apart from a few conference sessions at previous events.

So this really is about building a broader community around TOGAF, not just the members of the Architecture Forum, which is our particular forum that advances the TOGAF standard. It's really to engage the wider community, both those who are certified and those who aren't, as a way of learning how to make better and more effective use of TOGAF. There are a lot of possibilities for what we might do at the meeting, and a lot of it will depend on what those who attend would like to cover. [More from Steve on the new user group.]

Gardner: Now, to be clear, any standard has a fairly rigorous process by which the standard is amended, changed, or evolves over time. But we're talking about something separate from that. We're talking about perhaps more organic information flow, sharing, bringing points into that standard’s process. Maybe you could clarify the separation, the difference, the relationship between a standard’s adoption and a user group's input.

Nunn: That’s the key point, Dana. The standard will get evolved by the members of The Open Group, specifically the members of The Open Group Architecture Forum. They are the ones who have evolved it this far and are very actively working on a future version. So they will be the ones who will ultimately get to propose what goes in and ultimately vote on what goes in.

Where the role of the user community, both members and non-members -- but specifically the opportunity for non-members -- comes in is being able to give their input and put forward ideas about areas where maybe TOGAF might be strengthened or improved in some way. Nobody pretends it's perfect as you use it. It has evolved over time and it will evolve in the future. But by hearing from those who actually use TOGAF day to day, we might get, certainly from The Open Group's point of view, some new perspectives, and those perspectives will then get passed on through us to the members of the Architecture Forum.

Many of those we expect to attend the event anyway. They might hear it for the first time, but certainly we would spend part of the meeting looking at what that input might be, so that we have something to pass on to them for consideration in the standard.

This is the first time we've offered nonmembers a real opportunity, not necessarily to decide what goes into the standard, but certainly a greater degree of influence.

It's somewhat of a throwback to the days where user groups were very powerful in what came out of vendor organizations. I do hope that this will be something that will enable everyone to get the benefit of a better overall standard.

Past user groups

Gardner: I certainly remember, Steve, the days when vendors would quake in their boots when user meetings and groups came up, because they had such influence and impact. They both benefited each other. The vendors really benefited by hearing from the user groups and the user groups benefited by the standards that could come forth and vendor cooperation that they basically demanded.

I recall, at the last Open Group event, the synergy discussions around Zachman and other EA frameworks. Do you expect that some of these user group activities that you're putting forth will allow some of that cross-pollination, if you will -- people who might be using other EA tools and want to bring more cooperation and collaboration across them?

Nunn: I would certainly expect that to happen. Our position at The Open Group, and we've said it consistently over the years, is that it's not "TOGAF or," it's "TOGAF and." The reality is that most organizations, the vast majority, are not just going to take TOGAF and let it be everything they use in implementing their EAs.

So the other frameworks are certainly relevant. I expect there to be some interest in tools, as well as frameworks. We hear that quite a lot, suggestions of what good tools are for people at different stages of maturity and their implementation of the EA. So, I expect a lot of discussion about the other thoughts or the other tools in the toolbox of an EA to come up here.

Gardner: So user groups serve to bring more of an ecosystem approach; voices from disparate parties coming together sounds very powerful. Now this is happening on January 25. This is a free first meeting. Is that correct? And being in San Francisco, of course, it's within a couple of hours' drive of a lot of influential users, start-ups, the VC community, vendors, and service providers. Tell us a little bit about why people who have quick access to the Bay Area might consider coming to this on January 25.

Nunn: That's another reason -- the location of our next event. We thought this was the right time to do a first TOGAF User Group because there are a lot of users of TOGAF in the area, or within a few hours of it. What people would get out of it is the chance to hear a bit more about how TOGAF is used by others, case studies, what's worked, what hasn't worked, and the opportunity to talk directly with people, whether it's through networking or actually in the sessions in the user group meeting.

We're trying not to put too much rigid structure around those particular sessions, because otherwise we wouldn't be able to get the most benefit out of them. So it's really what attendees want to get out of it that will shape what we cover. From The Open Group's point of view, it's about getting that broader perspective for the attendees, learning useful tips and tricks, learning from the experience of others, and learning a bit more about The Open Group and how TOGAF has evolved.

This is a key point. TOGAF is so widely used now, and globally. We have quite a few members in The Open Group -- more than 350 organizations participating in some way in the Architecture Forum, and more in The Open Group as a whole.

But there's obviously a much wider community of those who are using it. Hearing more about how it has developed, what the processes are inside The Open Group, might make them feel good about the future of something that they clearly have some investment in. Hopefully, it might even persuade a few of those organizations to join and influence from the inside.

Gardner: Now, there's more information about the user group at www.opengroup.org. You're meeting on January 25 at 9:30 a.m. Pacific Time at the Marriott Union Square right in the heart of San Francisco. But this is happening in association with a larger event. So tell us about the total event that's happening between January 25 and 28.

Quarterly events

Nunn: This is part of one of our quarterly events that we've been running for a lot of years now. They generally take the form of plenary sessions that are open to anyone, plus member meetings, where the members of the various Open Group forums get together to progress the work that they do virtually. It's a chance to really knuckle down and progress some of it face-to-face, which, as we all know, is generally a very productive way of working.

Apart from the TOGAF User Group, we have on the agenda sessions on the Digital Business Strategy and Customer Experience, which is an activity that's being driven inside our Open Platform 3.0 Forum, as a membership activity, but this is really to open that up to a wide audience at the conference. So, we'll have people talking about that.

Open Platform 3.0 is where the convergence of technologies like cloud, social computing, mobile computing, big data, and IoT all come together. As we see it, our goal is for our members to create an Open Platform 3.0 Standard, which is basically a standard for a digital platform, so that enterprises can more easily use these technologies and get the benefit of the technologies that are now out there. There will be quite a bit of focus on Open Platform 3.0.

The other big thing that is proving very popular for us, and which will be featured at the conference, is The Open Group IT4IT Reference Architecture. There is a membership activity, the IT4IT Forum, working on the standard. We published the first version of that reference architecture at our last quarterly conference, which was in Edinburgh in October last year.

There has been a lot of interest in it, and it's really a standard for running the business of IT. Oftentimes, IT is just seen as doing its own thing and not really part of the business. But the reality nowadays is that whoever is running the IT, be it the CIO or whatever other individual, to be successful they have to not just run IT as a business, with the usual business principles of return on investment, etc., but they have to be seen to be doing so. This is a reference architecture that's not specific to any industry and that provides a guide for how to go about doing that.

We're quite excited about it. There has been a lot of interest in it so far, and we are working on a certification program for IT4IT that we will be launching later this year, hopefully at our next quarterly event in London in April.

Gardner: I'll just remind our listeners and readers that we're going to be doing some separate discussions and sharing with them on the IT4IT Reference Architecture. So please look for that coming up.

Getting back to the event, Steve, I've attended many of these over the years and I find a lot of the discussions around security, around specific markets like healthcare and government really powerful and interesting. Is there anything in particular about this conference that you're particularly interested in or looking forward to?

Nunn: The ones I've already spoken about are the ones that I'm personally most looking forward to. We'll be having sessions on healthcare and security, as you say.

In the security area it’s worth calling out that one of the suggestions that we've had about TOGAF -- I won’t call it criticism, but one of the suggestions for future versions -- is that TOGAF is a bit light on security. It could do with beefing up that particular area.

The approach that we've taken this time, which people attending the conference will hear about, is that we have actually got the security experts to say what we need to cover in the next version of TOGAF from a security point of view. Rather than having the architects include what they know about security, we have some heavyweight security folks in there, working with the Architecture Forum, to really beef up the security aspect. We'll hear a bit more about that.

Customer experience

Gardner: I also see that customer experience, which is closely aligned with user experience, is a big part of the event this year. That’s such a key topic these days for me, because it sort of forms a culmination of Platform 3.0. When you can pull together big data, hybrid cloud architectures, mobile enablement and reach, you can start to really do some fantastic new things that just really couldn’t have been done before when it comes to that user experience, real-time adaptation to user behaviors, bringing that inference back into a cloud or a back-end architecture, and then bringing back some sort of predictive or actionable result.

Please flesh out a bit more for us about how this user experience and customer experience is such a key part of the output, the benefit, the value, and the business transformation that we get from all these technical issues that we've discussed; this is sort of a business issue.

Nunn: You're absolutely right. It’s when we start providing a better experience for the customers overall and they can get more out of what the organizations are offering that everybody wins.

The group that we have working on this inside The Open Group is coming at it from the point of view that some of these new technologies are actually very scary for organizations, because they are forced to transform. The expectations of customers now are completely different. They expect to be able to get things on their cellphones or their tablets, or whatever device they might be using. That's quite a big shift for a lot of organizations, and that's not even getting into some of the areas of IoT, which promises to be huge.

What we're trying to do from the organizational side is focus on what is it that you can do to look at it from the customers’ point of view, meet their expectations, and start to evolve from there.

To me, it’s interesting from the point of view that it’s pretty business-driven. The technologies are there to be taken advantage of or to actually be very disruptive. So the business needs to know at a fairly early stage what those customer expectations are and take advantage of the new technologies that are there. That’s the angle that we are coming from inside The Open Group on that.

Some of the main participants in that group are actually coming from the telco world, where things have obviously changed enormously over the last few years. So that one is going to move quite quickly.

Gardner: It certainly seems that the ability to have boundaryless architecture is essential on that customer experience benefit. You certainly seem to be in the right place at the right time for that.

But the event in San Francisco also forms a milestone for you, Steve. You're now in your first full event as President and CEO of The Open Group, having taken over from Allen Brown last fall. Tell us a little bit about your earlier roles within the standards organization, and a bit more about yourself, perhaps, for those folks who are not yet familiar with you.

Quite different

Nunn: Yes, it will be quite different this time around. I've been with The Open Group for 22 years now. I was originally hired as General Counsel, and then fairly quickly moved on to Vice President of Corporate, Legal and Chief Operating Officer under Allen Brown as CEO. Allen was CEO for 17 years, and I was with him all of that time. It's going to be quite different to have somebody else running the events, but I'm very much looking forward to it.

From my point of view, it’s a great honor to be leading The Open Group and its members into our next phase of evolution. The events that we hold are one small part of it, but they're a very important part, particularly these quarterly ones. It’s where a lot of our customers and members come together in one place, and as we have heard, there will be some folks who may not have been involved with one of our events before through the user group, so it’s pretty exciting.

I'm looking forward to building on the very solid foundation that we have and some of the great work activities that we mainly have ongoing inside The Open Group.

Don't expect great change from The Open Group -- just really more of the same good stuff that we've been working on before, while recognizing that things are changing very rapidly around us and we need to be able to provide value in that fast-changing world, which we are very confident we can.

Gardner: As an observer of the market, but also of The Open Group, I'm glad to hear that you're continuing on your course, because the world owes you in many ways. Things you were talking about 5 or 10 years ago have become very essential. You were spot on in how you saw the world changing in IT and its influence on business, and vice versa.

More than ever, it seems that IT and EA are destiny for businesses. So I'm glad to hear that you're taking the long view, and the future seems very bright for your organization as the tools, approaches, mentality, and philosophy that you have been espousing become essential to do some of these things we have been discussing, like Platform 3.0, customer experience, and IoT.

In closing, let’s remind our audience that you can register for the event at The Open Group website, www.opengroup.org. The first day, January 25, includes that free user group, the inaugural user group for TOGAF, and it all happens at the Marriott Union Square, San Francisco, along with the General Conference, which also runs from January 25 to 28.

Any last thoughts, Steve, as we close out, in terms of where people should expect The Open Group to go, or how they can become involved in ways that they hadn't considered before? [More from Steve on the new user group.]

Good introduction

Nunn: Attending one of our events is a really good introduction to what goes on in The Open Group. For those who haven’t attended one previously, you might be pleasantly surprised.

If I had to pick one thing, I would say it's the breadth of activities there are at these events. It's very easy for an organization like The Open Group to be known for one thing or a very small number of things, whether it's UNIX originally or EA more recently, but there really is a lot going on beyond those.

Getting exposure to that at an event such as this, particularly in a location as important to the industry and as beautiful as San Francisco, is a great opportunity. So if anyone is on the fence about going, jump over the fence and try us out.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: The Open Group.


Thursday, January 14, 2016

Learn how SKYPAD and HPE Vertica enable luxury brands to gain rapid insight into consumer trends

The next BriefingsDirect big-data use case leadership discussion explores how retail luxury goods market analysis provider Sky I.T. Group has upped its game to provide more buyer behavior analysis faster -- and with more user depth.

Learn how Sky I.T. changed its data analysis platform infrastructure to Hewlett Packard Enterprise (HPE) Vertica -- and why that has helped solve its challenges around data variety, velocity, and volume and make better insights available across the luxury retail marketplace.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To share how retail intelligence just got a whole lot smarter, we welcome Jay Hakami, President; Dane Adcock, Vice President of Business Development; and Stephen Czetty, Vice President and Chief Technology Officer, all at Sky I.T. Group in New York. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: What's driving the need for greater and better big-data analysis for luxury retailers? Why do they need to know more, better, faster?

Adcock: Well, customers have more choices. As a result, businesses need to be more agile and responsive and fill the customer's needs more completely or lose the business. That's driving the entire industry into practices that mean shorter times from design to shelf in order to be more responsive.

It has created a great deal of gross margin pressure, because there's simply more competition and more selections that a consumer can make with their dollar today.

Gardner: Is there anything specific to the retail process around luxury goods that is even more pressing when it comes to this additional speed?
Adcock: Yes. The downside to making mistakes in terms of designing a product and allocating it in the right amounts to locations at the store level carries a much greater penalty, because it has to be liquidated. There's not a chance to simply cut back on the supply chain side, and so margins are more at risk in terms of making the mistake.
Ten years ago, from a fashion perspective, it was about optimizing the return and focusing on winners. Today, you also have to plan to manage and optimize the margins on your losers as well. So, it's a total package.

Gardner: So, clearly, the more you know about what those users are doing or what they have done is going to be essential. It seems to me, though, that we're talking about a market-wide look rather than just one store, one retailer, or one brand.

How does that work, Jay? How do we get to the point where we've been able to gather information at a fairly comprehensive level, rather than cherry-picking or maybe getting a non-representative look based on only one organization’s view into the market?

Hakami: With SKYPAD, what we're doing is collecting data from the supplier, from the wholesaler, as well as from their retail stores, their wholesale business, and their dot-com -- meaning the whole omnichannel. When we collect that data, we cleanse it to make sure it's meaningful to the user.

Now, we're dealing with a connected world where the retailer, wholesalers, and suppliers have to talk to one another and plan together for the buying season. So the partnerships and the insight that they get into the product performance are extremely important, as Dane mentioned, in terms of the gross margin and in terms of the sell-through information. SKYPAD basically provides that intelligence, that insight, into this retail/wholesale world.

Gardner: Isn’t this also a case where people are opening up their information and making it available for the benefit of a community or recognizing that the more data and the more analysis that’s available, the better it is for all the participants, even if there's an element of competition at some point?

Hakami: That's correct. The retail business likes to share the information with their suppliers, but they're not sharing it across all the suppliers. They're sharing it with each individual supplier. Then, you have the market research companies who come in and give you aggregation of trends and so on. But the retailers are interested in sell-through. They're interested in telling X supplier, "This is how your products are performing in my stores."

If they're not performing, then there's going to be a mark down. There's going to be less of a margin for you and for us. So, there's a very strong interest between the retailer and a specific supplier to improve the performance of the product and the sell-through of those products on the floor.

Gardner: Before we learn more about the data science and dealing with the technology and business case issues, tell us a little bit more about Sky I.T. Group, how you came about, and what you're doing with SKYPAD to solve some of these issues across this entire supply chain and retail market space.

Complex history

Hakami: I'll take the beginning. I'll give you a little bit of the history, Dana, and then maybe Dane and Stephen can jump in and tell you what we are doing today, which is extremely complex and interesting at the same time.

We started with SKYPAD about eight years ago. We found a pain point within our customers where they were dealing with so many retailers, as well as their own retail stores, and not getting the information that they needed to make sound business decisions on a timely basis.

We started with one customer, which was Theory. We came to them and we said, "We can give you a solution where we're going to take some data from your retailers, from your retail stores, from your dot-com, and bring it all into one dashboard, so you can actually see what’s selling and what’s not selling."

Fast forward, we've been able to take not only EDI transactions, but also retail portals. We're taking information from any format you can imagine -- from Excel, PDF, merchant spreadsheets -- bringing that wealth of data into our data warehouse, cleansing it, and then populating the dashboard.

So today, SKYPAD is giving a wealth of information to the users by the sheer fact that they don’t have to go out by retailer and get the information. That’s what we do, and we give them, on a Monday morning, the information they need to make decisions.

Dane, can you elaborate more on this as well?

Adcock: This process has evolved from a time when EDI was easy, because it was structured, but it was also limited in the number of metrics that were provided by the mainstream. As these business intelligence (BI) tools have become more popular, the distribution of data coming from the retailers has gotten more ubiquitous and broader in terms of the metrics.

But the challenge has moved from reporting to identification of all these data sources and communication methodologies and different formats. These can change from week to week, because they're being launched by individuals, rather than systems, in terms of Excel spreadsheets and PDF files. Sometimes, they come from multiple sources from the same retailer.

One of our accounts would like to see all of their data together, so they can see trends across categories and different geographies and markets. The challenge is to bring all those data sources together and align them to their own item master file, rather than the retailer’s item master file, and then be able to understand trends, which accounts are generating the most profits, and what strategies are the most profitable.
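To make that alignment step concrete, here is a minimal, hypothetical Python sketch of mapping retailer-reported rows onto a supplier's own item master; the UPCs, fields, and retailer names are invented for illustration and do not reflect SKYPAD's actual implementation.

```python
# Hypothetical illustration: aligning retailer-reported rows to the supplier's
# own item master so sell-through can be compared across retailers. All UPCs,
# styles, and field names are invented; SKYPAD's real logic is proprietary.
item_master = {
    "0711234567890": {"style": "BLZ-01", "category": "Blazers"},
    "0719876543210": {"style": "DRS-14", "category": "Dresses"},
}

retailer_rows = [
    {"retailer": "Nordstrom", "upc": "0711234567890", "store": "SF-01", "units_sold": 42},
    {"retailer": "Saks",      "upc": "0719876543210", "store": "NY-05", "units_sold": 17},
    {"retailer": "Saks",      "upc": "0000000000000", "store": "NY-05", "units_sold": 3},
]

aligned, unmatched = [], []
for row in retailer_rows:
    master = item_master.get(row["upc"])
    if master is None:
        unmatched.append(row)              # unknown UPC: flag for data-quality review
    else:
        aligned.append({**row, **master})  # now keyed to the client's own style

print(f"{len(aligned)} rows aligned, {len(unmatched)} need review")
```

Rows that fail to match are exactly the kind of completeness questions described below, which the team has to answer on the fly.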
It's been a shifting model from the challenge of reporting all this data together, to data collection. And there's a lot more of it today, because more retailers report at the UPC level, size level, and the store level. They're broadcasting some of this data by day. The data pours in, and the quicker they can make a decision, the more money they can make. So, there's a lot of pressure to turn it around.

Gardner: When you're putting out those reports on Monday morning, do you get queries back? Is this a sort of a conversation, if you will, where not only are you presenting your findings, but people have specific questions about specific things? Do you allow for them to do that, and is the data therefore something that’s subject to query?

Subject to queries

Adcock: It’s subject to queries in the sense that they're able to do their own discovery within the data. In other words, we put it in a BI tool, it’s on the web, and they're doing their own analysis. They're probing to see what their best styles are. They're trying to understand how colors are moving, and they're looking to see where they're low on stock, where they may be able to backfill in the marketplace, and trying to understand what attributes are really driving sales.

But of course, they always have questions about completeness of the data. When things don't look correct, they have questions about it. That drives us to be able to do analysis on the fly, on demand, and deliver responses such as, "All your stores are there, all of your locations, everything looks normal." Or perhaps there seem to be some flaws or things in the data that don't actually look correct.

Not only do we need to organize it and provide it to them so that they can do their own broad, flexible analysis, but they're coming back to us with questions about how their data was audited. And they're looking for us to do the analysis on the spot and provide them with satisfactory answers.

Gardner: Stephen Czetty, we've heard about the use case, the business case, and how this data challenge has grown in terms of variety as well as volume. What do you need to bring to the table from the data architecture to sustain this growth and provide for the agility that these market decision-makers are demanding?

Czetty: We started out with an abacus, in a sense, but today we collect information from thousands of sources literally every single week. Close to 9,000 files will come across to us, and we'll process them correctly and sort them out -- what client they belong to and so forth -- but the challenge is forever growing.

We needed to go from older technology to newer technology, because our volumes of data are increasing while the amount of time we have to get the data in is static.

So we're quite aware that we have a time limit. We found HPE Vertica to be a platform that lets us collect the data into a coherent structure very rapidly, as opposed to our legacy systems.

It allows us to treat the data in a truly vertical way, although that has nothing to do with the application or the database itself. In the past we had to deal with each client separately. Now we can deal with each retailer separately and just collect their data for every single client that we have. That makes our processes much more pipelined and far faster in performance.

The secret sauce behind that is the ability in our Vertica environment to rapidly sort out the data -- where it belongs, who it belongs to -- calculate it out correctly, put it into the database tables that we need to, and then serve it back to the front end that we're using to represent it.

That's why we've shifted from a traditional database model to a Vertica-type model. It's 100 percent SQL for us, so it looks the same for everybody who is querying it, but under the covers we get tremendous performance and compression and lots of cost savings.

Gardner: For some organizations that are dealing with the different sources and different types of data, cleansing is one problem. Then, the ability to warehouse that and make it available for queries is a separate problem. You've been able to tackle those both at the same time with the same platform. Is that right?

Proprietary parsers

Czetty: That's correct. We get the data, and we have proprietary parsers for every single data type that we get. There are a couple of hundred of them at this point. But all of that data, after parsing, goes into Vertica. From there, we can very rapidly figure out what is going where and what is not going anywhere, because it’s incomplete or it’s not ours, which happens, or it’s not relevant to our processes, which happens.

We can sort out what we've collected very rapidly and then integrate it with the information we already have, or insert new information if it's brand-new. Prior to this, we'd been doing this largely by hand, and that's not effective any longer with our number of clients growing.
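As a rough illustration of that parse-classify-load flow, here is a minimal Python sketch. The file layout, field names, table name, and connection details are all invented, and the couple of hundred proprietary parsers are reduced to a single CSV example. It assumes the open-source vertica_python client; a production load would typically use Vertica's bulk COPY rather than row-by-row inserts.

```python
# Hypothetical sketch of a weekly parse-classify-load pass into Vertica.
# File paths, field names, and the target table are invented for illustration.
import csv
import glob

import vertica_python  # open-source Vertica client, assumed available

def parse_portal_csv(path):
    """One of many format-specific parsers; each normalizes to common fields."""
    with open(path, newline="") as f:
        for raw in csv.DictReader(f):
            yield {
                "retailer": raw.get("Retailer"),
                "client": raw.get("Vendor"),
                "upc": raw.get("UPC"),
                "store": raw.get("Store"),
                "units": int(raw.get("Units Sold") or 0),
            }

conn_info = {"host": "vertica-node1", "port": 5433,
             "user": "etl", "password": "***", "database": "skypad"}

conn = vertica_python.connect(**conn_info)
cur = conn.cursor()
try:
    for path in glob.glob("/incoming/*.csv"):
        for row in parse_portal_csv(path):
            # Classify before loading: drop rows that are incomplete or not ours.
            if not (row["retailer"] and row["client"] and row["upc"]):
                continue
            cur.execute(
                "INSERT INTO weekly_sell_through "
                "(retailer, client, upc, store, units) VALUES (%s, %s, %s, %s, %s)",
                [row["retailer"], row["client"], row["upc"], row["store"], row["units"]],
            )
    conn.commit()
finally:
    conn.close()
```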

Gardner: I'd like to hear more about what your actual deployment is, but before we do that, let’s go back to the business case. Dane and Jay, when HPE Vertica came online, when Steve was able to give you some of these more pronounced capabilities, how did that translate into a benefit for your business? How did you bring that out to the market, and what's been the response?

Hakami: I think the first response was "wow." And I think the second response was, "Wow, how can we do this fast and move quickly to this platform?"

Let me give you some examples. When Steve did the proof of concept (POC) with the folks from HPE, we were very impressed with the statistics we had seen. In other words, going from a processing time of eight or nine hours to minutes was a huge advantage that we saw from the business side, showing our customers that we can load data much faster.

The ability to use less hardware and infrastructure as a result of the architecture of Vertica has allowed us to reduce, and to continue to reduce, the cost of infrastructure. Those two are the major benefits that I've seen in moving from our legacy platform to Vertica.

From the business perspective, if we're able to deliver faster and more reliably to the customer, we accomplished one of the major goals that we set for ourselves with SKYPAD.

Adcock: Let me add something there. Jay is exactly right. The real impact, as it translates into the business, is that we have to stop collecting data at a certain point in the morning and start processing it in order for us to make our service-level agreements (SLAs) on reporting for our clients, because they start their analysis. The retail data comes in staggered over the morning, and it may not all be in by the time we need to cut that collection off.

One of the things that moving to Vertica has allowed us to do is to cut that time off later, and when we cut it off later, we have more data, as a rule, for a customer earlier in the morning to do their analysis. They don’t have to wait until the afternoon. That’s a big benefit. They get a much better view of their business.

Driving more metrics

The other thing that it has enabled us to do is drive more metrics into the database and do some processing in the database, rather than in the user tool, which makes the user tool faster and provides more value.

For example, for a metric like age on the floor, we can do the calculation in the background, in the database, and it doesn't impede the response in the front-end engine. We get more metrics calculated in the database rather than in our user tool, and it becomes more flexible and more valuable.
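A hypothetical example of what pushing such a metric into the database might look like: an age-on-the-floor calculation expressed in SQL (using Vertica's DATEDIFF), with invented table and column names, run through the same kind of Python client sketched above so the BI front end only reads a ready-made column.

```python
# Hypothetical in-database metric: "age on the floor" in days, computed in
# Vertica so the BI front end reads a ready-made column. Table and column
# names are invented for illustration.
AGE_ON_FLOOR_SQL = """
SELECT
    s.upc,
    s.store,
    SUM(s.units) AS units_sold,
    DATEDIFF('day', MIN(r.first_receipt_date), CURRENT_DATE) AS age_on_floor_days
FROM weekly_sell_through s
JOIN store_receipts r
  ON r.upc = s.upc AND r.store = s.store
GROUP BY s.upc, s.store
"""

# cur.execute(AGE_ON_FLOOR_SQL) would run on the same vertica_python cursor
# shown earlier; the result feeds the reporting layer directly.
```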
Gardner: So not only are you doing what you used to do faster, better, cheaper, but you're able to now do things you couldn't have done before in terms of your quality of data and analysis. Is there anything else that is of a business nature that you're able to do vis-à-vis analytics that just wasn't possible before, and might, in fact, be equivalent of a new product line or a new service for you?

Czetty: In the old model, when we got a new client we had to essentially recreate the processes that we'd built for other clients to match that new client, because they're collecting that data just for that client just at that moment.

So 99 percent of it is the same as any other client, but one percent is always different, and it had to be built out. On-boarding a client, as we call it, took us a considerable amount of time -- we are talking weeks.

In the current model, where we're centered on retailers, the only thing that will take us a long time to do in this particular situation is if there's a new retailer that we've never collected data from. We have to understand their methodology of delivery, how it comes, how complex it is and so forth, and then create the logic to load that into the database correctly to match up with what we are collecting for others.

In this scenario, since we've got so many clients, very few new stores or new retailers show up; typically it's just our clients' own retail chains, and therefore our on-boarding is simplified, because if we're getting Nordstrom's data for client A, we're getting the same exact data for clients B, C, D, E, and F.

Now, it comes through a single funnel and it's the Nordstrom funnel. It’s just a lot easier to deal with, and on-boarding comes naturally.

Hakami: In addition to that, since we're adding more significant clients, the ability to increase variety, velocity, and volume is very important to us. We couldn't scale without having Vertica as a foundation for us. We'd be standing still, rather than moving forward and being innovative, if we stayed where we were. So this is a monumental change and a very instrumental change for us going forward.

Gardner: Steve, tell us about your actual deployment. Is this a single tenant environment? Are you on a single database? What’s your server or data center environment? What's been the impact of that on your storage and compression and costs associated with some of the ancillary issues?

Multi-tenant environment

Czetty: To begin with, we're coming from a multi-tenant environment. Every client had its own private database in the past, because in IBM DB2, we couldn't add all these clients into one database and get the job done. There was not enough horsepower to do the queries and the loads.

We ran a number of databases on a farm of servers, on Rackspace as our hosting system. When we brought in Vertica, we put up a minimal configuration with three nodes, and we're still living with that minimal configuration with three nodes.

We haven't exhausted our capacity on the license by any means whatsoever in loading up this data. The compression is obscenely high for us, because at the end of the day, our data absolutely lends itself to being compressed.

Everything repeats over and over again every single week. In the world of Vertica, that means it only appears once in wherever it lives in the database, and the rest of it is magic. Not to get into the technology underneath it at this point, from our perspective, it's just very effective in that scenario.
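As a toy illustration of why that repetition compresses so well -- this is not Vertica's actual storage internals, just the general idea behind run-length style encoding in a column store -- each repeated value in a sorted column can be kept once along with a count:

```python
# Toy illustration (not Vertica's actual storage format): when a sorted
# column repeats the same value over and over, a run-length style encoding
# keeps each value only once, along with how many times it repeats.
from itertools import groupby

retailer_column = ["Nordstrom"] * 50000 + ["Saks"] * 30000 + ["Bloomingdale's"] * 20000

encoded = [(value, sum(1 for _ in run)) for value, run in groupby(retailer_column)]

print(encoded)  # [('Nordstrom', 50000), ('Saks', 30000), ("Bloomingdale's", 20000)]
print(len(retailer_column), "column values stored as", len(encoded), "runs")
```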

Also, in our IBM DB2 world, we were using quite costly, large SAN configurations with lots of spindles, so that we could have the data distributed across all the spindles for performance on DB2, and that does improve the performance of that product.

However, in HPE Vertica, we have 600 GB drives and we can just pop more in if we need to expand our capacity. With the three nodes, we've had zero problems with performance. It hasn't been an issue at all. We're just looking back and saying that we wish we had this a little sooner.

Vertica came in and did the install for us initially. Then, we ended up taking those servers down and reinstalling it ourselves. With a little information from the guide, we were able to do it. We wanted to learn it for ourselves. That took us probably a day and a half to two days, as opposed to Vertica doing it in two hours. But other than that, everything is just fine. We’ve had a little training, we’ve gone to the Vertica event to learn how other people are dealing with things, and it's been quite a bit of fun.

Now there is a lot of work we have to do at the back end to transform our processes to this new methodology. There are some restrictions on how we can do things, updates and so forth. So, we had to reengineer that into this new technology, but other than that, no changes. The biggest change is that we went vertical on the retail silos. That's just a big win for us.

Gardner: As you know, HPE Vertica is cloud-ready. Is there any benefit to that further down the road where maybe it’s around issues of a spike demand in holiday season, for example, or for backup recovery or business continuity? Any thoughts about where you might leverage that cloud readiness in the future?

Dedicated servers

Czetty: We're already sort of in the cloud with the use of dedicated servers, but in our business, the volume increase in the stores around the holidays doesn't double the volume. It adds 10 percent, 15 percent, maybe 20 percent of volume for the holiday season. It hasn't been that big a problem in DB2, so it's certainly not going to be a problem in Vertica.

We've looked at virtualization in the cloud, but with the size of the hardware that we actually want to run, we want to take advantage of the speed and the memory and everything else. We put up pretty robust servers ourselves, and it turns out that in secure cloud environments like we're using right now at Rackspace, it's simply less expensive to do it as dedicated equipment. To spin up a machine, like another node for us at Rackspace, would take about the same time it would take to set up and configure a virtual system -- a day or so. They can give us another node just like this on our rack.

We looked at the cloud financially every single time that somebody came around and said there was a better cloud deal, but so far, owning it seems to be a better financial approach.

Gardner: Before we close out, looking to the future, I suppose the retailers are only going to face more competition. They're going to be getting more demand from their end users and customers for a better user experience and for information.

We're going to see more mobile devices that will be used in a dot-com world or even a retail world. We are going to start to see geolocation data brought to bear. We're going to expect the Internet of Things (IoT) to kick in at some point where there might be more sensors involved either in a retail environment or across the supply chain.

Clearly, there's going to be more demand for more data doing more things faster. Do you feel like you're in a good position to do that? Where do you see your next challenges from the data-architecture perspective?

Czetty: Not to disparage too much the industry of luxury, but at this point, they're not the bleeding edge on the data collection and analysis side, though they are on the bleeding edge on social media and so forth. We've anticipated that. We've got some clients who are collecting information about their web activities, and we have done analysis for identifying customers who present different personas through the different methods they use to contact the company.

We're dabbling in that area, and that's going to grow as the interfaces become more tablet- and phone-oriented. A lot of sales are potentially going to go through social media and not just the official websites in the future.

We'll be capturing that information as well. We’ve got some experience with that kind of data that we’ve done in the past. So, this is something I'm looking forward to getting more of, but as of today, we’re only doing it for a few clients.

Well positioned

Hakami: In terms of planning, we're very well-positioned as a hub between the wholesaler and the retailer, the wholesaler and their own retail stores, as well as the wholesaler and their dot-coms. One of the things that we are looking into, and this is going to probably get more oxygen next year, is also taking a look at the relationships and the data between the retailer and the consumer.

As you mentioned, this is a growing area, and the retailers are looking to capture more of the consumer information so they can target-market to them, not based on segment but based on individual preferences. This is again a huge amount of data that needs to be cleansed, populated, and then presented to the CMOs of companies to be able to sell more, market more, and be in front of their customers much more than ever before.
Gardner: That's a big trend that we're seeing in many different sectors of the economy -- that drive for personalization -- and it's really these data technologies that allow that to happen.
Any other thoughts about where the intersection of computer science capabilities and market intelligence demands are coming together in new and interesting ways?

Adcock: I'm excited about the whole approach to leveraging some predictive capabilities alongside the great inventory of data that we've put together for our clients. It's not just about creating better forecasts of demand, but optimizing different metrics, using this data to understand when product should be marked down, what types of attributes of products seem to be favored by different locations of stores that are obviously alike in terms of their shopper profiles, and bringing together better allocations and quantities in breadth and depth of products to individual locations to drive better, higher percentage of full-price selling and fewer markdowns for our clients.

So it’s a predictive side, rather than discovery using a BI tool.

Czetty: Just to add to that, there's the margin. When we talked to CEOs and CFOs five or six years ago and told them we could improve business by two, three, or four percent, they were laughing at us, saying it was meaningless to them. Now, three, four, or five percent, even in the luxury market, is a huge improvement to business. The companies like Michael Kors, Tory Burch, Marc Jacobs, Giorgio Armani, and Prada are all looking for those margins.

So, how do we become more efficient with product assortment, how do we become more efficient with distribution of all of these products to different sales channels, and then how do we increase our margins? How do we not over-manufacture, and not create those blue shirts for Florida, where they are not selling, instead of for Detroit, where they're selling like hotcakes?

These are the things that customers are looking at and they must have that tool or tools in place to be able to manage their merchandising and by doing so become a lot more agile and a lot more profitable.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.


Monday, January 11, 2016

Is 2016 the year that accounts payable becomes strategic?

The next BriefingsDirect business innovation thought leadership discussion focuses on the changing role and impact of accounts payable (AP) as a strategic business force.

We’ll explore how intelligent AP is rapidly transforming by better managing exceptions, adopting fuller automation, and implementing end-to-end processes that leverage connected business networks.

As the so-called digital enterprise adapts to a world of increased collaboration, digital transactions, and e-payables management -- AP is needing to adapt in 2016.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about the future of AP as a focal point of automated business services we are joined by Andrew Bartolini, Chief Research Officer at Ardent Partners in Boston, and Drew Hofler, Senior Director of Marketing at SAP Ariba. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Drew, let’s look at the arrival of 2016. We have more things going on digitally, we have a need to improve efficiency, and AP has been shifting -- but how will 2016 make a difference? What should we expect in terms of AP elevating its role in the enterprise?

Hofler: AP is one of those areas that everybody looks at, first and foremost, as a cost center. So when AP looks at what they can do better, they've typically thought about efficiency and cost savings first. That’s the plus side of the cost center as saving money by spending less.

But what we've been seeing happen over the last year or so, and what will accelerate in 2016, is that AP is moving from just a cost-saving and efficiency focus to value creation. This is because they sit at the hub of the three critical elements of working capital -- inventory, receivables, and payables -- and AP sits squarely on that last one.

And they have influence over something that affects the company's working capital. AP has become very important for companies: by creating efficiencies in the invoice process, it opens up opportunities, and AP is going to be able to affect a company's working capital for the positive going forward. That's going to grow as they move beyond just the automation that is the foundation, to seeing the opportunities that come out of it.

Gardner: Andrew, do you see AP also as a digital hub, growing in its role and influence and being able to increase its value beyond cost efficiency into these other higher innovation levels or strategic levels of benefit?

Tracking trends

Bartolini: Yes, absolutely. I've been researching and working in this space for 17 years, doing significant market research over the last 11 years. So I've been tracking the trends and the ebbs and flows of relative interest and investment in AP.

What we've seen in 2015 in some of our most recent research is that there has been a broader focus or a shift away from viewing the AP opportunity as an efficiency one or solely an efficiency one. Let’s automate. Let’s reduce our costs in processing invoices. Let’s reduce our costs in payments.

But what we saw this year for the first time in our research was that the top area of focus, the top business pressure that’s driving investments in AP transformation was the need to get better visibility into all the valuable information that comes across the AP departments or through the AP operation, both on the invoice and the payment side.

That begins to change the conversation. We talked about the evolution of AP moving from a strictly back-office siloed department to an increasing point of collaboration with procurement at the purchase-to-pay (P2P) process, with treasury, from a cash-management perspective. Now, we see it starting to move and becoming a true intelligence hub, and that’s where we've seen some momentum. There’s a lot of wind in the sails for AP, really pushing that forward in 2016 and beyond.

Gardner: Andrew, what’s driving this? Is this the technology that's now allowing that data?

Bartolini: There are a couple of factors underlying this movement. The first is taking the broader perspective within business as a whole. Businesses can no longer allow distinct business functions to operate within silos. They need everybody on the same team, rowing in the same direction. That has forced greater collaboration.

That's something that we've seen more broadly between procurement and finance over the past couple of years, specifically with the roles of the CPO and the CFO. A majority of organizations see a very strong level of collaboration between those two job roles and between their departments as a whole.

That has opened up larger opportunities for AP, which is a more tactical function as it relates to procurement, but by bringing the two groups together, you now have shared resources and shared focus on improving the entire source-to-settle process.

That relationship has driven greater interest, because the opportunities are fantastic for procurement to leverage the value of a more efficient AP process and to be able to see the information that’s there.

As Drew mentioned, by becoming more efficient on the front end of the AP process, organizations are doing a better job in reducing the amount of paper that’s coming in through the front door. They're processing their invoices faster. That's opening up opportunities on the back-end, on the payment side.

So, you have a confluence of those factors and you see newer solutions in the marketplace as well that are really changing the view that AP departments have of what defines a transformation. They're thinking more holistically across the entirety of the AP process, from invoice receipt, all the way through payment and settlement.

Allowing for variables

Gardner: Drew, it seems that over history, once a contract is closed the terms remain fairly rigid, and then there is a simple fulfillment aspect to it. But it sounds like -- as we get more visibility, as we get digitized, and we can automate -- we can handle exceptions better and allow for more variables.

I've heard instances where the terms can be adjusted, where market forces can provide ways in which a deal gets amended on an ongoing basis, whether it's in the terms of payment or perhaps other ancillary issues. Is that what we're seeing, that the digital transformation is giving us more opportunity to be flexible, and is that then elevating the role of the AP organization?

Hofler: You make a couple of good points there, and it really springs from what Andrew just said about not staying in that siloed place where AP and procurement are separate or the processes are separate. What companies have realized, particularly as the digital age has made it possible, is that the procure-to-pay process, the source-to-settle process, is a fundamentally connected one.

Over the years they've operated very disconnectedly, with hand-offs, where procurement does its thing, writes a contract and then hands it off once the purchase order (PO) goes out the door, and then AP takes up the process from there. But in that, there are a lot of disconnects.
What companies have realized, particularly as the digital age has made it possible, is that the procure-to-pay process, the source-to-settle process, is a fundamentally connected one.

When you're able to bring networked systems together to bring visibility across that entire process, now you have the AP group acting in a more strategic manner to deliver value by acting as the value-capture group.

For example, prior to the age that we live in now, a contract would be written, and it would have specific terms for specific items and specific prices for specific SKUs, and maybe some volume discounts. AP had no idea about that, because those contracts would get signed and put in a file cabinet or stuck in a PDF file somewhere. So AP went off the invoice that came in.

This is how an entire industry around post-audit recovery came about, to go in after the fact and try to claw back overpayments, because AP had no visibility into what procurement did.

By bringing these together in a system, on a network, you're able to automatically capture those savings, because AP now has visibility into what's happening inside of that contract and can ensure, on an automated basis, that they're paying the right amount. So it becomes not just a buy-right thing from the procurement side, but a pay-right thing as well, buy-right and pay-right tied together.

But that's your point about terms. Yes, you have certain terms tied into that contract, but again, that's set at the beginning of a relationship with a supplier. There are lots of opportunities that come up when everybody has visibility into what's going on, into an early-approved invoice for example.

Opportunities for collaboration

There are lots of opportunities that arise for collaboration where maybe the situation has changed a little bit. Maybe a supplier, instead of being paid in 45 days, now would very much like to be paid in five days, because they have payroll coming up or they have an equipment purchase to make, and they want to accelerate their cash flow.

In a disconnected world, you can't account for that. But in a networked world, there is visibility. I like to say it's the confluence of visibility, opportunity, and capability: all parties have visibility into the opportunity created by the efficiencies of that earlier-approved invoice. Then, there's the capability inside the system to simply click a button and accelerate that cash flow or modify the payment terms on that contract.

So P2P is a linked value chain, and the digital technology of today can bring those pieces together so that there are no barriers to that information flow, and that creates all sorts of opportunities for all parties involved.

Gardner: Andrew, the common denominator here is visibility; visibility is what allows a lot of these efficiencies and innovation values to occur. Where does that visibility come from? Where does the data get generated, how is it shared, and how do we further reduce the silos through the free flow of data, analysis, and information?

Bartolini: Visibility at the core starts with automation tools that automate processes. If we're looking at the P2P process, you're looking at an eProcurement system. You can go back to where it starts, from sourcing and contracting. If you have contract visibility or at least visibility into your header-level information, you begin to have an understanding of what, in fact, the relationship is and what relationships you have as an organization, who are your preferred suppliers, who are your strategic suppliers.
Visibility at the core starts with automation tools that automate processes.

As you start to drill down, if you have the capability to capture things like payment terms and service-level agreements (SLAs), that information begins to provide a more robust view of the relationship, one that can then be managed more strategically from a procurement perspective, and that really sets up the operational procurement side.

If you have an eProcurement system, you're able to generate purchase orders against those contracts and you're ensuring that before the purchase order is even sent to the supplier, the pricing and the terms are correct.

That cascades over onto the AP automation side. We use the term "ePayables" very broadly to describe AP automation solutions. When you have an eProcurement and an ePayables solution connecting, you begin to have greater visibility within the enterprise for the entirety of the relationship and the entirety of the transaction.

On the flip side, we haven't even gotten to the value proposition for suppliers, who really view their customer relationship as a single one. What often happens is that they have multiple relationships within that customer that really aren't needed. They negotiate a contract, they have their internal customer, they're dealing with maybe a procurement department, and then they're trying to figure out who they're dealing with on the AP side.

When you’ve got visibility that can be shared with trading partners, you get extraordinarily greater value out of the entire thing, and you streamline relationships and you're able to focus on the more important aspects of those relationships. But to the original question, visibility starts and ends with technology.

Centralizing procurement

Gardner: We're also seeing the trend of larger organizations centralizing procurement, sometimes placing it, if it's a global organization, in another country instead of having it in multiple countries or multiple markets. It becomes consolidated and automated. How does that fit in, Drew?

Hofler: We see definitely a move toward a shared service or a global process ownership type of thing, where they want to take the variability out of the different geographies or different business units doing what is essentially a standardized process, or they want to make that standardized.

We definitely see movement in that direction, and it's both a business desire and goal to remove the variability, but it's also something that's enabled by the technology that we have today in business networks, in centralized systems that can tie all of this together. Now you have business units operating across the world but tapping into all of that information, getting all the invoices to come into one place through a network. Those business units can see that, and they have controlled access to the information they need inside of those systems as well.
On the procurement side, if you're sourcing globally, you can have different centers of excellence.

Business networks really help drive the ability to connect the data to everybody, to turn that data not just into information but into intelligence, and to get it in front of the right people at the right time and in the right process. Having that centralized network hub, where everybody can connect at the point of the process that they need, really helps drive or enable the movement toward shared services and centralized AP and procurement.

Bartolini: Anyone would be hard-pressed to make a case that you should have a decentralized AP operation. That doesn't mean that you can't have staff that are geographically dispersed, but there's no reason why that should exist.

On the procurement side, if you're sourcing globally, you can have different centers of excellence. Again, you want to have a more centralized view into visibility and to be utilizing the same systems and processes. On the AP side, centralization also helps from the standpoint that you begin to get a better sense of what resources are being applied in the AP process today. It also becomes easier to centralize or to gain budgets for investment in tools that can drive efficiency, visibility, and all the things we've just been talking about.

Gardner: Another thread that I'm hearing in our conversation is that technology needs to be exploited, visibility gained, and automation made possible. Then, centralization can become a huge benefit from all of that. But none of this is possible if we don't go all digital, if we don't get off of manual processes and off of paper. What do you think is going to be the ratio, if you will, of the paper approach that's left? Are we finally going to pull the last paper invoice out, or the last payment that's manual? Where are we, Drew, when it comes to making that full transition to digital? It seems to me an overwhelmingly beneficial direction.

Still using paper

Hofler: I've been in the payment space for about 20 years and the payables space for the last 10, and in payments, there have been predictions that we would get rid of the paper check completely. Gosh, for the last 20 years everybody has been saying it's going to happen, but it hasn't. It's still about 50 percent paper checks.

So I'm not going to make a prediction that paper is going to go away, but most definitely, companies need to deal with and move toward electronic data. Even if it's paper based, a lot of companies are moving toward getting the data in electronically, but a lot of them say, "Well, I get my paper scanned, I've sent it to a scanning service or whatever, and I get it in PDF or electronic data form."

That's fine, and that's one step along the process, but companies are realizing that there's a limitation in that. When you do that, you're simply getting the data that was on that paper source document faster. If that paper source document's data is garbage, and that's what creates exceptions, then you're just getting the exceptions quicker. That doesn't really help the process, and it doesn't solve the true issue of making sure you're not only getting the data faster, but getting it in clean and getting it in better.
This is where companies need to move toward full electronic invoicing, where it starts its life as an electronic invoice.

This is where companies need to move toward full electronic invoicing, where the invoice starts its life as an electronic invoice, so that a supplier can submit it and have it run through business rules electronically before it even gets to AP. They can identify the exceptions, turn it around to the supplier, and have them correct it, all in a very quick and automated fashion, so that by the time AP gets it, it's 98 percent exception-free or straight-through processing.
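To make that idea concrete, here is a minimal sketch of the kind of front-end business rules Hofler is describing, applied to an invoice before it ever reaches AP. It is an illustration only: the field names, the purchase-order lookup, and the two-percent tolerance are assumptions for the example, not any particular network's actual rule set.

from dataclasses import dataclass

@dataclass
class Invoice:
    supplier: str
    po_number: str
    currency: str
    lines: list      # list of (sku, qty, unit_price) tuples
    total: float

def validate(invoice: Invoice, open_pos: dict) -> list:
    """Return a list of exceptions; an empty list means straight-through processing."""
    exceptions = []
    po = open_pos.get(invoice.po_number)
    if po is None:
        exceptions.append("No matching purchase order on file")
        return exceptions
    if invoice.currency != po["currency"]:
        exceptions.append("Invoice currency differs from PO currency")
    computed = sum(qty * price for _sku, qty, price in invoice.lines)
    if abs(computed - invoice.total) > 0.01:
        exceptions.append("Line items do not add up to the invoice total")
    if computed > po["amount"] * 1.02:   # 2 percent tolerance, an assumed policy
        exceptions.append("Invoice exceeds PO amount beyond tolerance")
    return exceptions

# One clean invoice and one that gets kicked back to the supplier to correct.
open_pos = {"PO-1001": {"currency": "USD", "amount": 5000.00}}
good = Invoice("Acme Services", "PO-1001", "USD", [("SKU-1", 10, 500.0)], 5000.0)
bad = Invoice("Acme Services", "PO-9999", "USD", [("SKU-1", 1, 100.0)], 100.0)
print(validate(good, open_pos))   # [] -> ready for approval
print(validate(bad, open_pos))    # ['No matching purchase order on file']

The point of the sketch is simply that exceptions are caught and returned to the supplier automatically, so what lands in AP is already clean.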

Companies are going to realize that just transforming a paper source document into an electronic form has had value in the past, but its value is quickly running out, and they need to move toward true electronic.

How far are we going to get along that path? Well, that's a big prediction to make, but I think we'll move a long way down it. Companies definitely need to recognize, and are starting to recognize, that they need to deal with native electronic data in order to truly gain value, efficiency, and intelligence and be able to leverage that into other opportunities.

Gardner: We mentioned exception handling, exception management, making that easier, better, faster. It strikes me that exception management is really a means to a greater end, and the greater end is general flexibility -- even looking at things as markets, as auctions, where there's variability and a fit-for-purpose kind of mentality can come in.

So am I off in some pie-in-the-sky direction, Andrew? Or when we think about the ability to do exception management, are we really opening up the opportunity to do even more interesting, innovative things with business transactions?

Reduction of exceptions

Bartolini: No, I don't think it's pie in the sky. In one of our recent surveys of about 200 AP, finance, and P2P professionals, we asked what the number one game changer would be to get their AP operation to the next level of performance. The answer that came back loud and clear was the reduction of exceptions and the ability to perform root-cause analysis in a much more significant way.

So it's a fundamental problem, and the opportunity is there for the majority of organizations. About two-thirds of organizations feel that if they could handle this issue better, if they could reduce that number, they would be operating at a significantly higher level.

We haven't really talked too much about the suppliers in this equation, but a lot of the business focus and a lot of the themes in our research this year and into 2016 have been on agility and the need for organizations to become more adept and responsive to market shifts and changes.

Part of that is getting better alignment with the strategic suppliers that are going to drive more value and that are having a greater impact on the company's own products and services and ultimately their results.
When that noise in the relationship is reduced it allows organizations to focus on goals and objectives and to invest more in the strategic elements of the relationship.

So you look at something like exceptions, which are problematic for both sides of the trading-partner equation. When you start to reduce those, you start to eliminate a lot of the friction that is built in, certainly around the manual P2P process, but that can exist even in an automated environment. When that noise in the relationship is reduced, it allows organizations to focus on goals and objectives and to invest more in the strategic elements of the relationship.

Gardner: Drew, anything to add to that, particularly when you consider that the pool of suppliers is, in a sense, growing when we look at contingent workers and at different types of suppliers, such as smaller firms, perhaps located at a much greater geographic distance than in the past? We have more open markets as a result of connected business networks. How do you see that panning out in 2016?

Hofler: Yeah, there's definitely growth in that. There's a pretty good stat showing that a much larger portion of a company's workforce is not bound to that company; it's temporary, it's a contingent workforce, it's services from contractors that aren't necessarily tied to them.

The need to handle that, particularly the churn that happens with that, the broader number of contractors that you might have with that, the variability in the services that are asked for, that are needed, all of this adds layers of complexity, potentially, to AP, and to procurement as well. We're focused more on AP here, but it adds layers of complexity in managing that and approving that, and as a result, can add a significant number of exceptions.

So, while you're operating your business in a way that is more fitting to today's world, you're also adding a lot of complexity and exceptions to the process, unless you've got a way to automatically build in the ability to define the invoice and to identify the exceptions, so that these various suppliers, who are much smaller and geographically dispersed, can submit online or submit electronically, and can do so in a way that's standardized, even across this large group.

Catching exceptions

The exceptions can be caught right away. Take field services, for example. If there's a service sheet that was put out by procurement to hire somebody to go fix an oil well, and they get out to the oil well and there's more to be done than what was on that sheet, they have to get approval for it. Having the ability to get that approval online, automatically, through a mobile device, have it tied directly into the invoice, and have the invoice close, eliminates all those potential breakpoints of finally getting that invoice in and getting the exceptions dealt with and approved.

Exceptions to me aren't just a matter of, "Gosh, they're hard to do." They're something we want to get rid of. But exceptions are simply the barrier to the opportunity that comes when you can get that invoice moved through and approved right away, not necessarily a matter of paying the invoice faster from the payer’s perspective, but the ability to have it approved and ready to go right away, so that you have options, and so that the supplier has options potentially for cash flow and things like that.

Exceptions become something that we have to eliminate in order to get to that opportunity, but without the platform to do that, then to your point, the dispersed workforce and the increasing number of contractors can make it even harder than it is or than it has been.

Gardner: When we look at the payoffs from doing things better using AP intelligence and technology, we are not just looking at efficiency for its own sake. I think you're opening up more opportunity, as you put it, to the larger business.
If procurement and accounts payable can adjust and react rapidly to complexity, to exceptions, to new ways of doing business -- this is a powerful tool to the business at large.

If procurement and accounts payable can adjust and react rapidly to complexity, to exceptions, to new ways of doing business -- this is a powerful tool to the business at large. They can go at markets differently. They can acquire goods and services across a wider portfolio of choices, a wider marketplace, and therefore be able to perhaps get things easier, faster, cheaper.

Let's look at this idea of non-tangible payoffs that elevate the value of AP to that of a sophisticated, intelligent operation. Let's start with Andrew. What are some of the intangibles -- if we do everything we've mentioned well -- how does this empower the organization in ways that we haven't seen before?

Bartolini: That's a great question, and it gets back to the point I was just making about agility. We're operating in an age of innovation, where globalization, the level of competition, and the speed of business in general have really compressed the time frames in which organizations must react. I think this is all happening at a much faster pace.

You can see that in areas like the consumer electronics market, and in all industries, product lifecycles are shortening, and so the windows of opportunity to maximize sales and revenues in the marketplace are much shorter as well.

Things are happening at a much faster clip and in tighter time frames. This has created a much greater reliance upon your suppliers and upon your supply chain. And so having visibility across the P2P process, across the source-to-settle process, and having much tighter relationships with your strategic suppliers ultimately positions the organization to become much more agile and much more competitive. And that's the value dividend that's created from a more streamlined P2P process.

It’s being able to more fully optimize the relationships that you have with your suppliers, and it's being able to make decisions and shifts in a much faster way than in the past, and that's not just from the sourcing side, that carries all the way through to the payment side as well.

Business agility

Gardner: Drew, when we think about the strategic role of AP -- of providing business agility -- you can’t get more strategic than that.

Hofler: No, that's right. AP particularly can become the source of much of that strategic intelligence that companies need. They can't just see themselves as processing paper or as a back-office cost center, but as the ones that can, through their use of and investment in systems and networks, capture the data in invoices, for example, and feed that data into the sourcing cycle at the beginning, so it becomes a virtuous circle.

They can create the opportunity for the company to meet some of its very strategic goals around working capital. By tying into what procurement has done before them, automating the process, and getting things done very nimbly and ready to go, AP is creating opportunity for treasury as well, so now you have a third party in there.

The treasurer is very concerned about what the payment liabilities out there are. Does he know? Often, in today's world, treasurers can't see their payables liabilities until they run through their payment cycle and the payments are ready to go out the next day. So they have to move cash around to make sure that they have enough cash to cover those liabilities going out.

With visibility into what's going to be paid out 30 days from now, having that information 30 days in advance offers the treasurer all sorts of options on how to manage cash among various bank accounts.
It gives the treasurer the opportunity to pay that supplier early, using excess cash that’s sitting in a bank account.

Plus, it gives them options around their days payable outstanding (DPO): bringing third parties into a business network, bringing in third-party supply chain finance that allows a supplier who might need early-payment liquidity and early cash flow to access that from a third party, while the buying organization is able to hold on to its cash, extend its DPO, and improve its working-capital management.
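For readers who want the mechanics behind "extending DPO," the standard days-payable-outstanding calculation is shown below with invented dollar figures; nothing here comes from the discussion itself.

# Assumed-figures illustration of days payable outstanding (DPO):
# DPO = accounts payable / cost of goods sold * days in the period.

def dpo(accounts_payable: float, cogs: float, days: int = 365) -> float:
    return accounts_payable / cogs * days

# If a buyer carries $12M in payables against $100M of annual COGS, DPO is about 44 days.
print(round(dpo(12_000_000, 100_000_000)))   # 44
# Settling roughly 30 days later on the buyer's side (via third-party financing,
# while the supplier is still paid early by the funder) raises the payables
# balance the buyer carries, and DPO rises with it.
print(round(dpo(20_000_000, 100_000_000)))   # 73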

Or it gives the treasurer the opportunity to pay that supplier early, using excess cash that’s sitting in a bank account. Even though the Fed just raised rates in the last day or two, they only raised it a quarter of a percent. So it’s still not earning very much. But now, a treasurer can take that and pay a supplier early in exchange for a discount that earns them something along the lines of 8-12 percent annually.
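The 8-12 percent figure comes from annualizing a small discount over the number of days the payment is accelerated. Here is a simple, illustrative calculation; the discount percentages and day counts are assumed for the example, not quoted from the conversation.

# Illustrative arithmetic only: annualizing an early-payment discount.

def annualized_return(discount_pct: float, days_accelerated: int) -> float:
    """Simple (non-compounded) annualized yield of taking a discount
    in exchange for paying days_accelerated days earlier."""
    return discount_pct * 365 / days_accelerated

# A 1% discount for paying a 45-day invoice on day 5 (40 days early):
print(round(annualized_return(1.0, 40), 1))   # ~9.1 percent per year
# A 0.5% discount for paying 20 days early lands in the same range:
print(round(annualized_return(0.5, 20), 1))   # ~9.1 percent per year

Either way, the cash sitting in the buyer's account earns far more through the discount than it would at the quarter-percent rates mentioned above.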

It opens up options, but right at the nexus of all of that opportunity, information, and intelligence sits AP. That’s a very strategic place for AP to be if they can get their hands around that data, create those opportunities, and make it visible to the rest of the business.

Gardner: One last area to get into for 2016 … One of the top concerns in addition to business agility for companies and organizations is risk, security, and dealing with compliance issues, with regulatory issues. Is there something that AP brings to the table when it has elevated itself to the strategic level, with that visibility with that data, with the ability to act quickly and be able to take on exceptions and work through them?

Andrew, we've heard about how, on the procurement side, examining the supply chain, knowing that supply chain, being able to head off interruptions or other issues, and having a business-continuity mindset are important. Does that translate over to AP, and why and how does AP have a larger role in issues around continuity?

Risk mitigation

Bartolini: From a risk-mitigation standpoint, when you have greater assurance that invoices are matched to the PO, to the orders that have been generated, and to what has actually been delivered, and when you have a clear view into how that payment is made into the supplier's account, you're reducing the opportunities for fraud, which can exist in any type of environment, manual or fully automated. One of the largest risk-mitigation opportunities for AP is really at the transaction level.
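What Bartolini describes at the transaction level is essentially the classic three-way match of invoice, purchase order, and goods receipt before payment is released. A minimal sketch, with all data structures and tolerances assumed for illustration, might look like this:

def three_way_match(invoice, po, receipt, qty_tol=0, price_tol=0.01):
    """Return (ok, reasons). ok is True only when the billed quantity and price
    agree with both the PO and what was actually received."""
    reasons = []
    if invoice["qty"] > receipt["qty"] + qty_tol:
        reasons.append("Billed quantity exceeds quantity received")
    if invoice["qty"] > po["qty"] + qty_tol:
        reasons.append("Billed quantity exceeds quantity ordered")
    if abs(invoice["unit_price"] - po["unit_price"]) > price_tol:
        reasons.append("Invoice price differs from contracted PO price")
    return (not reasons, reasons)

po      = {"qty": 100, "unit_price": 12.50}
receipt = {"qty": 100}
invoice = {"qty": 110, "unit_price": 12.50}
print(three_way_match(invoice, po, receipt))
# (False, ['Billed quantity exceeds quantity received', 'Billed quantity exceeds quantity ordered'])

An invoice that fails the match is held as an exception rather than paid, which is where the fraud-reduction benefit comes from.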

When you start to cascade the visibility that AP generates out into the larger organization, you can start to do some predictive analysis from the procurement side to better understand potential issues that suppliers may be facing.

Also from a treasury standpoint, when you have visibility into the huge amount of money that is being paid out by AP, you have a better sense of your company’s liquidity, your cash positions, and what you need to do to ensure that you maintain that liquidity.

Looking on the supplier’s side, when you're processing invoices more quickly and you have the opportunities to make payments early, there are those opportunities for the larger companies to step in and help out some of their struggling suppliers, whether that’s paying their invoices early or some other mechanism. It starts with visibility, and from that visibility you start to have a better ability to make smarter decisions and to anticipate potential issues.
They may have had an otherwise healthy business, but not sufficient cash flow to maintain operations, and that hurt buying organizations who depend on them.

Gardner: Last word to you, Drew. On this issue of risk reduction, continuity, and using intelligence to head off disruption or fraud, how do you see that panning out in 2016?

Hofler: I think AP does play a large role in that. Andrew touched on some of that.

One of the key areas, if you think about supply chain from the procurement side, is that the financial supply chain is pretty much just as important as the physical supply chain when it comes to risk. People have had that lesson deep in their bones since 2008 and 2009, when liquidity became a very big issue. There was liquidity risk in supply chains from suppliers who couldn't access cash flow or didn't have sufficient cash flow. They may have had an otherwise healthy business, but not sufficient cash flow to maintain operations, and that hurt the buying organizations that depend on them.

By being able to approve invoices very quickly and to offer your suppliers, through a single portal, a single network, access to cash, either from the buying company using its own funds or by bringing in third-party financing, you're essentially able to eliminate or greatly mitigate liquidity risk in your supply chain.

But there are other areas of risk, too. Anytime you're talking about AP, Andrew said it the right way, where he talked about the massive amounts of money that AP is paying out. That’s their job.

In order to do that, they have to actually capture, manage, and maintain bank account information from their suppliers in order to pay electronically. We're always trying to get away from paper checks, because paper checks, we know, are rife with fraud, horribly opaque, and very slow, but electronic payments require AP to capture bank account information. And that's not a core competency of most AP departments.

Network power

But AP departments can tap into the power of network ecosystems that bring in third parties whose core competency that very much is, eliminating their need to ever even see a supplier's bank account information.

Some forward-looking AP departments are looking at how they can divest themselves of that which is not their core competency, and in some ways around risk mitigation and payment, one of them is getting rid of having to touch bank account information.

Beyond that, when we talk about compliance and that type of thing, AP sits right in the middle of it, whether that's VAT compliance in Europe, archival compliance, or SOX compliance here in the US. Having all of the data electronic, having an auditable trail, and being able to know exactly where every piece of data and every dollar or euro spent has been and where it went along the way, with a trail of that automatically captured and archived, goes a long way toward compliance.

AP is the one that sits right there to be able to capture that and provide that.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: SAP Ariba.

You may also be interested in: