Tuesday, August 4, 2015

HP hyper-converged appliance delivers speedy VDI and apps deployment and a direct onramp to hybrid cloud

HP today announced the new HP ConvergedSystem 250-HC StoreVirtual (CS 250), a hyper-converged infrastructure appliance (HCIA) based on HP's new ProLiant Apollo 2000 server and HP StoreVirtual software-defined storage (SDS) technology.

Built on up-to-date HP, Intel, and VMware technologies, the CS 250 combines a virtual server and storage infrastructure that HP says is configurable in minutes for nearly half the price of competitive systems. It is designed for virtual desktops and remote office productivity, as well as to provide a flexible path to hybrid cloud. [Disclosure: HP is a sponsor of BriefingsDirect.]

Designed to attract customers on a tight budget, the HP CS 250 includes a new three-node configuration that is up to 49 percent more cost effective than comparable configurations from Nutanix, SimpliVity and other competitors, says HP. Because HP's StoreVirtual runs in VMware, Microsoft Hyper-V and KVM virtual environments, the appliance may soon come to support all those hypervisors.

HP recently discontinued the EVO:RAIL version of its HCIA, which was based on the EVO:RAIL software from OEM partner VMware.

Increasingly, even small IT shops want to modernize and simplify how they support existing applications. They want virtualization benefits to extend to storage, backup and recovery, and be ready to implement and consume some cloud services. They want the benefits of software-defined data centers (SDDC), but they don’t want to invest huge amounts of time, money, and risk in a horizontal, pan-IT modernization approach.

That's why, according to IDC, businesses are looking for flexible infrastructure solutions that will allow them to quickly deploy and run new applications. This trend has resulted in a 116 percent year-over-year increase in hyper-converged systems sales and 60 percent compound annual growth rate (CAGR) anticipated through 2019.

The building-block approach to IT infrastructure is growing rapidly. IDC estimates that in 2015, $10.2 billion will be spent on converged systems, representing 11.4 percent of total IT infrastructure spending. This number will grow to $14.3 billion by 2018, representing 14.9 percent of total IT infrastructure spending, says IDC. Similarly, Technology Business Research, Inc. in Hampton, NH, estimates a $10.6 billion U.S. addressable market over the next 12 months, through mid-2016.

With HCIAs specifically, enterprises can begin making what amounts to mini-clouds based on their required workloads and use cases. IT can quickly deliver the benefits of modern IT architectures without biting off the whole cloud model. Virtual desktops are a great place to begin, especially as Windows 10 is emerging on the scene.

Indeed, VDI deployments that support as many as 250 desktops on a single appliance at a remote office or agency, for example, allow for ease in administration and deployment on a small footprint while keeping costs clear and predictable. And, if the enterprise wants to scale up and out to hybrid cloud, it can do so with ease and low risk.

Multi-site continuity

The inclusion of three 4TB StoreVirtual Virtual Storage Appliance (VSA) licenses also allows the new HP CS 250 system to replicate data to any other HP StoreVirtual-based solution. This means that customers can leverage their existing infrastructure as a replication target at no additional cost, says HP. The CS 250 also allows customers to tailor the system with a choice of up to 96 processing cores, a mix of SSD and SAS disk drives, and up to 2TB of memory per 4-node appliance -- double that of previous generations.

The CS 250 arrives pre-configured for VMware's vSphere 5.5 or 6.0 and HP OneView InstantOn to enable customers to be production-ready with only 5 minutes of keyboard time and a total of 15 minutes deployment time, with daily management from VMware vCenter via the HP OneView for VMware vCenter plug-in, says HP.

HP sees the CS 250 as a path to bigger things. For midsize and enterprise customers seeking an efficient and cost-effective cloud entry point, for example, the new HP Helion CloudSystem 9.0 built on the CS 250 provides a direct path to the hybrid cloud. This hyper-converged cloud solution leverages the clustered compute and storage resources of the CS 250 for on-premise workloads but adds self-service portal provisioning and public cloud bursting features for those moving beyond server virtualization.

HP announced that it is enhancing its “Nitro” partner program and opening it up to distributors worldwide, starting with Arrow Electronics in the US.

HP is also introducing new Software-Defined Storage Design and Integration services to help customers deploy highly scalable, elastic cloud storage services, the company announced today. The integration service provides customers with detailed configuration and implementation guidance tailored to their specific needs to accelerate time to value, said HP.

The 4-node CS 250-HC StoreVirtual will be available August 17 with a list price of $112,000. The 3-node CS 250-HC StoreVirtual will be available September 28 with a list price of $88,000.


Thursday, July 30, 2015

Full 360 takes big data analysis cloud services to new business heights

The latest BriefingsDirect cloud innovation case study interview highlights how Full 360 uses big data and analytics to improve its application support services for the financial industry -- and beyond.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Read a full transcript or download a copy.

To learn how Full 360 uses HP Vertica in the Amazon cloud to provide data warehouse and BI applications and services to its customers from Wall Street to the local airport, BriefingsDirect sat down with Eric Valenzuela, Director of Business Development at Full 360, based in New York. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Tell us about Full 360.

Valenzuela: Full 360 is a consulting and services firm, and we purely focus on data warehousing, business intelligence (BI), and hosted solutions. We build and consult and then we do managed services for hosting those complex, sophisticated solutions in the cloud, in the Amazon cloud specifically.
Become a member of myVertica today
Register now
Gain access to the free HP Vertica Community Edition
Gardner: And why is cloud a big differentiator for this type of service in the financial sector?

Valenzuela: It’s not necessarily just for finance. It seems to be beneficial for any company that has a large initiative around data warehouse and BI. For us, specifically, the cloud is a platform that we can develop our scripts and processes around. That way, we can guarantee 100 percent that we're providing the same exact service to all of our customers.

We have quite a bit of intellectual property (IP) that’s wrapped up inside our scripts and processes. The cloud platform itself is a good starting point for a lot of people, but it also has elasticity for those companies that continue to grow and add to their data warehousing and BI solutions. [Register for the upcoming HP Big Data Conference in Boston on Aug. 10-13.]

Gardner: Eric, it sounds as if you've built your own platform as a service (PaaS) for your specific activities and development and analytics on top of a public cloud infrastructure. Is that fair to say?

Valenzuela: That’s a fair assumption.

Primary requirements

Gardner: So as you are doing this cloud-based analytic service, what is it that your customers are demanding of you? What are the primary requirements you fulfill for them with this technology and approach?

Valenzuela: With data warehousing being rather new, Vertica specifically, there is a lack of knowledge out there in terms of how to manage it, keep it up and running, tune it, analyze queries and make sure that they're returning information efficiently, that kind of thing. What we try to do is to supplement that lack of expertise.

Gardner: Leave the driving to us, more or less. You're the plumbers and you let them deal with the proper running water and other application-level intelligence?

Valenzuela: We're like an insurance policy. We do all the heavy lifting, the maintenance, and the management. We ensure that your solution is going to run the way that you expect it to run. We take the mundane out, and then give the companies the time to focus on building intelligent applications, as opposed to worrying about how to keep the thing up and running, tuned, and efficient.

Gardner: Given that Wall Street has been crunching numbers for an awfully long time, and I know that they have, in many ways, almost unlimited resources to go at things like BI -- what’s different now than say 5 or 10 years ago? Is there more of a benefit to speed and agility versus just raw power? How has the economics or dynamics of Wall Street analytics changed over the past few years?

Valenzuela: First, it’s definitely the level of data. Just 5 or 10 years ago, either you had disparate pieces of data or you didn’t have a whole lot of data. Now it seems like we are just managing massive amounts of data from different feeds, different sources. As that grows, there has to be a vehicle to carry all of that, where it’s limitless in a sense.

Early on, it was really just a lack of the volume that we have today. In addition to that, 8 or 10 years ago BI was still rather new in what it could actually do for a company in terms of making agile decisions and informed decisions, decisions with intent.

So fast forward, and it’s widely accepted and adopted now. It’s like the cloud. When cloud first came out, everybody was concerned about security. How are we going to get the data in there? How are we going to stand this thing up? How are we going to manage it? Those questions come up a lot less now than they did even two years ago. [Register for the upcoming HP Big Data Conference in Boston on Aug. 10-13.]

Gardner: While you may have cut your teeth on Wall Street, you seem to be branching out into other verticals -- gaming, travel, logistics. What are some of the other areas now to which you're taking your services, your data warehouse, and your BI tools?

Following the trends

Valenzuela: It seems like we're following the trends. Recently it's been gaming. We have quite a few gaming customers that are just producing massive amounts of data.

There's also the airline industry. The customers that we have in airlines, now that they have a way to -- I hate this term -- slice and dice their data, are building really informed, intelligent applications to service their customers, customer appreciation. It’s built for that kind of thing. Airlines are now starting to see what their competition is doing. So they're getting on board and starting to build similar applications so they are not left behind.

Banking was pretty much the first to go full force and adopt BI as a basis for their practice. Finance has always been there. They've been doing it for quite a long time.

Gardner: So as the director of business development, I imagine you're out there saying, "We can do things that couldn’t have been done before at prices that weren’t available before." That must give you almost an unlimited addressable market. How do you know where to go next to sell this?

Valenzuela: It’s kind of an open field. From my perspective, I look at the different companies out there that come to me. At first, we were doing a lot of education. Now, it’s just, "Yes, we can do this," because these things are proven. We're not proving any concepts anymore. Everything has already been done, and we know that we can do it.

It is an open field, but we focus purely on the cloud. We expect all of our customers will be in the Amazon cloud. It seems that now I am teaching people a little bit more -- just because it’s cloud, it’s not magic. You still have to do a lot of work. It’s still an infrastructure.
But we come from that approach and we make sure that the customer is properly aligned with the vision that this is not just a one- or two-month type commitment. We're not just going to build a solution, put it in our pocket, and walk away. We want to know that they're fully committed for 6-12 months.

Otherwise, you're not going to get the benefits of it. You're just going to spend the money and the effort, and you're not really going to get any benefits out of it if you're not going to be committed for the longer period of time. There still are some challenges with the sales and business development.

Gardner: Given this emphasis on selling the cloud model as much as the BI value, you needed to choose an analytics platform that was cloud-friendly and that was also Amazon AWS cloud-friendly. Tell me how Vertica and Amazon -- and your requirements -- came together.

Good timing

Valenzuela: I think it was purely a timing thing. Our CTO, Rohit Amarnath, attended a session at MIT, where Vertica was first announced. So he developed a relationship there.

This was right around the time when Amazon announced its public cloud platform, EC2. So it made a lot of sense to look at the cloud as being a vision, looking at the cloud as a platform, looking at column databases as a future way of managing BI and analytics, and then putting the two together.

It was more or less a timing thing. Amazon was there. It was new technology, and we saw the future in that. Analytics was newly adopted. So now you have the column database that we can leverage as well. So blend the two together and start building some platform that hadn’t been done yet.

Gardner: What about lessons learned along the way? Are there some areas to avoid or places that you think are more valuable that people might appreciate? If someone were to begin a journey toward a combination of BI, cloud, and vertical industry tool function, what might you tell them to be wary of, or to double-down on?

Valenzuela: We forged our own way. We couldn’t learn from our competitors’ mistakes because we were the ones that were creating the mistakes. We had to clear those up and learn from our own mistakes as we moved forward.

Gardner: So perhaps a lesson is to be bold and not to be confined by the old models of IT?

Valenzuela: Definitely that. Definitely thinking outside the box and seeing what the cloud can do, focus on forgetting about old IT and then looking at cloud as a new form of IT. Understanding what it cannot do as a basis, but really open up your mind and think about it as to what it can actually do, from an elasticity perspective.

There are a lot of Vertica customers out there that are going to reach a limitation. That may require procuring more hardware, more IT staff. The cloud aspect removes all of that.

Gardner: I suppose it allows you as a director of business development to go downstream. You can find smaller companies, medium-sized enterprises, and say, "Listen, you don’t have to build a data warehouse at your own expense. You can start doing BI based on a warehouse-as-a-service model, pay as you go, grow as you learn, and so forth."

Money concept

Valenzuela: Exactly. Small or large, those IT departments are spending that money anyway. They're spending it on servers. If they are on-premises, the cost of that server in the cloud should be equal or less. That’s the concept. [Register for the upcoming HP Big Data Conference in Boston on Aug. 10-13.]

If you're already spending the money, why not just migrate it and then partner with a firm like us that knows how to operate it? Then, we become your augmented experts, or that insurance policy, to make sure that those things are going to be running the way you want them to, as if it were your own IT department.

Gardner: What are the types of applications that people have been building and that you've been helping them with at Full 360? We're talking about not just financial, but enterprise performance management. What are the other kinds of BI apps? What are some of the killer apps that people have been using your services to do?

Valenzuela: Specifically, with one of our large airlines, it's customer appreciation. The level of detail on their customers that they're able to bring to the plane, to the flight attendants, in a handheld device is powerful. It’s powerful to the point where you remember that treatment that you got on the plane. So that’s one thing.

That’s something that you don’t get if you fly a lot, if you fly other airlines. That’s just kind of some detail and some treatment that you just don’t get. I don’t know how that could be driven if it weren’t for analytics and if it weren’t for technology like Vertica to be able to provide that information.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Read a full transcript or download a copy. Sponsor: HP.


Tuesday, July 28, 2015

How big data technologies Hadoop and Vertica drive business results at Snagajob

The next BriefingsDirect analytics innovation case study interview explores how Snagajob in Richmond, Virginia – one of the largest hourly employment networks for job seekers and employers – uses big data to finally understand their systems' performance in action. The result is vast improvement in how they provide rapid and richer services to their customers.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Read a full transcript or download a copy.

Snagajob recently delivered 4 million new job applications in a single month through their systems. To learn how they're managing such impressive scale, BriefingsDirect sat down with Robert Fehrmann, Data Architect at Snagajob in Richmond, Virginia. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Tell us about your jobs matching organization. You’ve been doing this successfully since 2000. Let's understand the role you play in the employment market.

Fehrmann: Snagajob, as you mentioned, is America's largest hourly network for employees and employers. The hourly market means we have, relatively speaking, high turnover.
Another aspect, in comparison to some of our competitors, is that we provide an inexpensive service. So our subscriptions are on the low end, compared to our competitors.

Gardner: Tell us how you use big data to improve your operations. I believe that among the first ways that you’ve done that is to try to better analyze your performance metrics. What were you facing as a problem when it came to performance? [Register for the upcoming HP Big Data Conference in Boston on Aug. 10-13.]

Signs of stress

Fehrmann: A couple of years ago, we started looking at our environment, and it became obvious that our traditional technology was showing some signs of stress. As you mentioned, we really have data at scale here. We have 20,000 to 25,000 postings per day, and we have about 700,000 unique visitors on a daily basis. So data is coming in very, very quickly.

We also realized that we're sitting on a gold mine and we were able to ingest data pretty well. But we had problems getting information and innovation out of our big data lake.

Gardner: And of course, near real time is important. You want to catch degradation in any fashion from your systems right away. How do you then go about getting this in real time? How do you do the analysis?

Fehrmann: We started using Hadoop. I'll use a lot of technical terms here. From our website, we're getting events. Events are routed via Flume directly into Hadoop. We're collecting about 600 million key-value pairs on a daily basis. It's a massive amount of data, 25 gigabytes on a daily basis.
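As a rough illustration of the kind of key-value event stream described here, a few lines of Python can parse and tally clickstream events. The "&"-delimited key=value format, the field names, and the sample events are all hypothetical stand-ins; the real pipeline routes events through Flume into Hadoop at far larger scale.

```python
from collections import Counter

def parse_event(line):
    """Split one event line of the form 'ts=...&user=...&action=...'
    into a dict of key-value pairs. The wire format is an assumption
    for illustration, not Snagajob's actual schema."""
    pairs = {}
    for field in line.strip().split("&"):
        if "=" in field:
            key, _, value = field.partition("=")
            pairs[key] = value
    return pairs

def count_actions(lines):
    """Aggregate event lines into per-action counts, the kind of
    daily rollup a Flume-to-Hadoop pipeline would feed downstream."""
    counts = Counter()
    for line in lines:
        event = parse_event(line)
        if "action" in event:
            counts[event["action"]] += 1
    return counts

events = [
    "ts=1001&user=u1&action=search&kw=cashier",
    "ts=1002&user=u1&action=click&job=42",
    "ts=1003&user=u2&action=search&kw=barista",
    "ts=1004&user=u1&action=apply&job=42",
]
print(count_actions(events))  # Counter({'search': 2, 'click': 1, 'apply': 1})
```

At 600 million pairs a day, the same per-record logic would of course run as distributed jobs rather than a single loop, but the record shape is the same.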

The second piece in this journey to big data was analyzing these events, and that’s where we're using HP Vertica. Our original use case was to analyze a funnel. A funnel is where people come to our site. They're searching for jobs, maybe by keyword, maybe by zip code. A subset of them show interest in a job and click on a posting. A subset of those apply for the job via an application. A subset show interest in an employer, and so on. We had never been able to analyze this funnel.

The dataset is about 300 to 400 million rows, and 30 to 40 gigabytes. We wanted to make this data available, not just to our internal users, but all external users. Therefore, we set ourselves a goal of a five-second response time. No query on this dataset should run for more than five seconds -- and Vertica and Hadoop gave us a solution for this.
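The funnel analysis described above, counting how many distinct users make it from search to click to application, can be sketched in miniature. This is a toy in-memory version with assumed stage names and sample users; the production queries run against Vertica over hundreds of millions of rows.

```python
# Hypothetical event stream of (user_id, stage) tuples; stage names
# are illustrative, following the search -> click -> apply narrative.
events = [
    ("u1", "search"), ("u1", "click"), ("u1", "apply"),
    ("u2", "search"), ("u2", "click"),
    ("u3", "search"),
]

STAGES = ["search", "click", "apply"]

def funnel(events, stages):
    """Count distinct users reaching each successive stage. A user
    counts for a stage only if they also reached every earlier stage,
    mirroring the narrowing subsets described in the interview."""
    users_by_stage = {s: set() for s in stages}
    for user, stage in events:
        if stage in users_by_stage:
            users_by_stage[stage].add(user)
    counts = []
    reached = None
    for s in stages:
        reached = users_by_stage[s] if reached is None else reached & users_by_stage[s]
        counts.append((s, len(reached)))
    return counts

print(funnel(events, STAGES))  # [('search', 3), ('click', 2), ('apply', 1)]
```

Each stage's count divided by the previous stage's count gives the conversion rate at that step, which is the number the five-second interactive queries are meant to surface.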

Gardner: How have you been able to increase your performance and meet your key performance indicators (KPIs) and service-level agreements (SLAs)? How has this benefited you?

Fehrmann: Another application that we were able to implement is a recommendation engine. A recommendation engine addresses the case where jobseekers who apply for a specific job may not know about all the other jobs that are very similar to it, or that other people have also applied to.
We started analyzing the search results that we were getting and implemented a recommendation engine. Sometimes it’s very difficult to make a real comparison between before and after, but here we could see the effect directly: by implementing this recommendation engine, we saw an immediate 11 percent increase in application flow, one of our key metrics. Application flow is how many applications a customer is getting from us.
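One simple way to build such a recommendation engine is item-based co-occurrence: jobs frequently applied to by the same people are recommended to each other's applicants. The sketch below uses hypothetical job IDs and a tiny in-memory dataset; the interview does not describe Snagajob's actual algorithm, so treat this purely as an illustration of the idea.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical application history: the set of job IDs each seeker
# applied to. Real inputs would come from the search and apply logs.
applications = {
    "u1": {"cashier-42", "barista-7", "stocker-3"},
    "u2": {"cashier-42", "barista-7"},
    "u3": {"cashier-42", "stocker-3"},
}

def build_cooccurrence(applications):
    """Count how often each pair of jobs was applied to by the same
    person; a basic item-based collaborative-filtering signal."""
    co = defaultdict(lambda: defaultdict(int))
    for jobs in applications.values():
        for a, b in combinations(sorted(jobs), 2):
            co[a][b] += 1
            co[b][a] += 1
    return co

def recommend(job, co, n=2):
    """Return the jobs most often co-applied with `job`, best first;
    ties break alphabetically for a deterministic ordering."""
    ranked = sorted(co[job].items(), key=lambda kv: (-kv[1], kv[0]))
    return [j for j, _ in ranked[:n]]

co = build_cooccurrence(applications)
print(recommend("cashier-42", co))  # ['barista-7', 'stocker-3']
```

The measurable lift Fehrmann cites (an 11 percent increase in application flow) is exactly the kind of before-and-after metric you would track when rolling out a recommender like this.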

Gardner: So you took the success from your big-data implementation and analysis capabilities from this performance task to some other areas. Are there other business areas, search yield, for example, where you can apply this to get other benefits?

Brand-new applications

Fehrmann: When we started, we had the idea that we were looking for a solution for migrating our existing environment, to a better-performing new environment. But what we've seen is that most of the applications we've developed so far are brand-new applications that we hadn't been able to do before.

You mentioned search yield. Search yield is a very interesting aspect. It’s a massive dataset: about 2.5 billion rows and about 100 gigabytes of data as of right now, and it's continuously increasing. So for all of the applications, as well as all of the search requests that we have collected since we started this environment, we're able to analyze the search yield.

For example, that's how many applications we get for a specific search keyword in real time. By real time, I mean that somebody can run a query against this massive dataset and get results in a couple of seconds. We can analyze specific jobs in specific areas, specific keywords that are searched in a specific time period or in a specific location of the country.

Gardner: And once again, now that you've been able to do something you couldn't do before, what have been the results? How has that changed your business? [Register for the upcoming HP Big Data Conference in Boston on Aug. 10-13.]

Fehrmann: It really allows our salespeople to provide great information during the prospecting phase. If we're prospecting with a new client, we can tell them very specifically that if they're in this industry, in this area, they can expect an application flow of, let’s say, a hundred applications per day, depending on how big the company is.

Gardner: How has this been a benefit to your end users, those people seeking jobs and those people seeking to fill jobs?

Fehrmann: There are certainly some jobs that people are more interested in than others. On the flip side, if a particular job gets 100 or 500 applications, it's just a fact that only a small number of applicants are going to get that particular job. Now if you apply for a job that isn't as interesting, you have a much, much higher probability of getting the job.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Read a full transcript or download a copy. Sponsor: HP.


Monday, July 27, 2015

Beyond look and feel: The new role that total user experience plays in business apps

The next BriefingsDirect business innovation thought leadership discussion focuses on the heightened role and impact of total user experience improvements for both online and mobile applications and services.

We'll explore how user expectations and rethinking of business productivity are having a profound impact on how business applications are used, designed, and leveraged to help buyers, sellers, and employees do their jobs better.

We’ll learn about the advantages of new advances in bringing instant collaboration, actionable analytics, and contextual support capabilities into the application interface to create a total user experience.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Read a full transcript or download a copy. See a demo.

To examine why applications must have more than a pretty face to improve the modern user experience, we're joined by Chris Haydon, Senior Vice President of Solutions Management for Procurement, Finance and Network at Ariba, an SAP company. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Chris, what sort of confluence of factors has come together to make this concept of user experience so important, so powerful? What has changed, and why must we think more about experience than interface?

Haydon: Dana, it’s a great question. There is the movement of hyper-collaboration, and things are moving faster and faster than ever before.

We're seeing major shifts in trends on how users view themselves and how they want to interact with their application, their business applications in particular, and more and more, drawing parallels from their consumer and how they bring that simple consumer-based experience into their daily work. Those are some of the mega trends.

Then, as we step down a little bit, within that is obviously this collaboration aspect and how people prefer to collaborate online at work more than they did in traditional mechanisms, certainly via phone or fax.

Then, there's mobility. If someone doesn’t really have a smartphone in this day and age, certainly they're behind the eight ball.

Last but not least, there's the changing demographic of our workforce. In 2015, there are some stats out there that showed that millennials will become the single largest percentage of the workforce.

All of these macro trends and figures are going into how we need to think about our total user experience in our applications.

Something more?

Gardner: For those of us who have been using applications for years and years and have sort of bopped around -- whether we're on a mobile device or a PC -- from application to application, are we just integrating apps so that we don't have to change apps, or is it something more? Is this a whole greater than the existing sum of the parts?

Haydon: It’s certainly something more. It’s more the one plus one equals three concept here. The intersection of the connectivity powered by business networks, as well as the utility of mobile devices not tied to your desktop can fundamentally change the way people think about their interactions and think about the business processes and how they think about the work that needs to be done throughout the course of their work environment. That really is the difference.

This is not just about touching up a user interface. This is really about rethinking persona-based interactions in the context of mobility and network-oriented, network-centric collaboration.

Gardner: When we think about collaboration, traditionally that’s been among people, but it seems to me that this heightened total user experience means we're actually collaborating increasingly with services. They could be services that recognize that we're at a particular juncture in a business process. They could be services that recognize that we might need help or support in a situation where we've run out of runway and don't know what to do, or even instances where intelligence and analytics are being brought to us as we need it, rather than our calling out to it.

Tell me about this expanding definition of collaboration. Am I right that we're collaborating with more than just other people here?

Haydon: That’s right. It’s putting information in the context of the business process right at the point of demand. Whether that’s predictive, intelligence, third party, the best user interfaces and the best total user experiences are bringing that context to the end user managing that complexity -- but contextualizing it to bring it to their attention as they work through it.

So whether that’s a budget check and whether there is some gaming on budget, it's saying, "You're under budget; that’s great." That’s an internal metric. Maybe in the future, you start thinking about how others are performing in other segments of the business. If you want to take it even further, how are other potential suppliers doing on their response rates to their customers?

There is a whole new dimension on giving people contextualized information at the point where they need to make a decision, or even recommending the type of decisions they need to make. It could be from third-party sources that can come from a business network outside your firewall, or from smarter analysis and predictive analysis, from the transactions that are happening within the four walls of your firewall, or in your current application or your other business applications.

Gardner: It seems pretty clear that this is the way things are going. The logic behind why the user experience has expanded in its power and its utility makes perfect sense. I'm really enthused about this notion of contextual intelligence being brought to a business process, but it's more than just a vision here.

Pulling this off must be quite difficult. I know that many people have been thinking about doing this, but there just isn't that much of it actually going on yet. So we're at the vanguard.

What are the problems? What are the challenges in pulling this off and making it happen? It seems to me there are a lot of back-end services, and while we focus on the user experience and user interface, we're really talking about sophisticated technology in the data center providing these services.

Cloud as enabler

Haydon: There are a couple of enablers to this. I think the number one enabler here is cloud versus on-premise. When you can see the behavior in real time in a community aspect, you can actually build infrastructure services around that. In traditional on-premise models, when that’s locked in, all that burden is actually being pushed back to the corporate IT to be able to do that.

The second point is when you're in the cloud and you think about applications that are network-aware, you're able to bring in third-party, validated, trusted information to help make that difference. So there are those challenges.

I also think that it comes down to technology, but technology is moving to the focus of building applications actually for the end user. When you start thinking about the interactions with the end user and the focus on them, it really drives you to think about how you give that different contextualized information.

That level of granularity means saying, "I'm logging on as an invoice-processing assistant," or "I'm logging on as just a casual ad-hoc requisitioner." When the system knows you have done that, it's able to be smart, pick up on it, and contextualize accordingly. That's where we really see the future and the vision of how this is all coming together.
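
The role-aware behavior Haydon describes can be sketched as a simple lookup from persona to UI context. The personas, defaults, and function names below are invented for illustration; they are not Ariba's actual implementation.

```python
# Toy sketch of role-aware contextualization: the same application
# surfaces different defaults depending on the persona that logged in.
# Personas and their defaults here are hypothetical.
PERSONA_CONTEXT = {
    "invoice-processing-assistant": {"landing": "invoice-queue",
                                     "shortcuts": ["match", "approve"]},
    "casual-requisitioner": {"landing": "guided-buying",
                             "shortcuts": ["reorder-last"]},
}

def context_for(persona: str) -> dict:
    # Fall back to a generic home page for unknown roles.
    return PERSONA_CONTEXT.get(persona, {"landing": "home", "shortcuts": []})

ctx = context_for("casual-requisitioner")
```

The point of the sketch is only that the system branches on who you say you are at login, rather than showing everyone the same screen.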

Gardner: When we spoke a while back, we noted that people traditionally got productivity by switching manually from application to application, whether that's an on-premise application or a software-as-a-service (SaaS) application. If they lose the benefit of a common back-end intelligence capability, network services that are aware, or coordinated identity management, we still don't get that total user experience, even though the cloud is an important factor here and SaaS is a big part of it.

What brings together the best of cloud, but also the best of that coordinated, integrated total experience when you know the user and all of their policies and information can be brought to bear on this experience and productivity demand?

Haydon: There are a couple of ways of doing that. You could talk here about the concept of hybrid cloud. The reality is that in most companies for the foreseeable future there will be some degree of on-premise applications that continue to drive businesses, and then there will be side-by-side cloud businesses.

So it’s the job of leading practice technology providers, SaaS and on-premise providers, to enable that to happen. There definitely is this notion of having a very robust platform that underpins the cloud product and can be seamlessly integrated to the on-premise product.

Again, from a technology and a trend perspective, that’s where it’s going. So if the provider doesn’t have a solid platform approach to be able to link the disparate cloud services to disparate on-premise solutions, then you can’t give that full context to the end user.

One more thing is the user interface itself. The user interface manages that complexity for the end user. The end user really shouldn't need to know the mode of deployment, nor where the service lives. That's what the new leading user interfaces, and the total experience, are about: guiding you through your workflow, or the work that needs to be done, irrespective of where that service is deployed.

Ariba roadmap

Gardner: Chris, we spoke last at Ariba Live, the user conference back in the springtime, and you were describing the roadmap for Ariba and other applications coming through 2015 into 2016.

What’s been happening recently? This week, I think, you've gone general availability (July 2015) with some of these services. Maybe you could quickly describe that. Are we talking about the on-premise apps, the SaaS apps, the mobile apps, all the above? What’s happening?

Haydon: We're really excited about that. With our current releases that came out this week (see a demo), we launched our total user experience approach: working anywhere, embracing the most modern user design interactions in our user interface and in mobility, and, within that, enabling our end users to learn the processes in context. All of this has been launched in Ariba within the last 14 days.

Specifically, it’s about embracing modern user design principles. We have a great design principle here within SAP called Fiori. We've taken that design principle and brought it into the world of procurement, layering this new, updated user experience design on top of our leading-practice capabilities.

But we haven’t stopped there. We're embracing, as you mentioned, this mobility aspect and how we can design new interactions between our common user interface on mobile and our common user interface on our cloud deployment as one. That’s a given, but what we're doing differently here is embracing the power and the capability of mobile devices with cloud and the work that needs to be done.

One example is our process-continuity feature. You can look at your mobile application, see some activities that you might want to track later on, and click or pin that activity on your mobile device. When you come to your desktop to do some work, that pinned activity is visible for you to continue tracking and get your job done.
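
The pin-and-resume flow might be modeled, in very rough terms, as a device-agnostic store keyed by user, so a pin made on one device is visible on any other. The class and method names here are hypothetical, not Ariba's API.

```python
# Hypothetical sketch of "pin on mobile, resume on desktop": pins live
# in a shared store keyed by user, independent of the device that
# created them.
from dataclasses import dataclass, field

@dataclass
class PinStore:
    # user_id -> list of pinned activity descriptors
    _pins: dict = field(default_factory=dict)

    def pin(self, user_id: str, activity: str, device: str) -> None:
        """Record an activity the user wants to track later."""
        self._pins.setdefault(user_id, []).append(
            {"activity": activity, "pinned_from": device}
        )

    def pinned_for(self, user_id: str) -> list:
        """Any device can read the same list back."""
        return self._pins.get(user_id, [])

store = PinStore()
store.pin("u42", "PO-1881 approval", device="mobile")
# Later, on the desktop, the same pin is waiting:
desktop_view = store.pinned_for("u42")
```

The design choice worth noticing is that the pin is keyed to the user's identity, not to a device, which is what makes the continuity "seamless".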

Similarly, if you're on the go to a meeting, you're able to push some reports down to your tablet or smartphone, so you can look at and review that work on the go.

We're really looking at that full, total user experience, whether you're on the desktop or whether you are on the go on your mobile device, all underpinned by a common user design imperative based upon Fiori.

Gardner: Just to be clear, we're talking about not only this capability across those network services for on-prem, cloud, and mobile, but we're taking this across more than a handful of apps. Tell us a bit about how these Ariba applications and the Ariba Network also involve other travel and expense capabilities. What other apps are involved in terms of line-of-business platform that SAP is providing?

Leading practice

Haydon: From a procurement perspective, obviously we have Ariba’s leading practice procurement. As context, we have another fantastic solution for contingent labor, statement-of-work labor and other services, and that’s called Fieldglass. We've been working closely with the Fieldglass team to ensure that our user interface that we are rolling out on our Ariba procurement applications is consistent with Fieldglass, and it’s based again on the Fiori style of design construct.

We're moving toward a point where an end user, whether they want to do detailed time sheets or service entry, or requisition materials and inventory on the Ariba side, finds a seamless experience.

We're progressively moving toward that same style of construct for the Concur applications for travel and expense, and even the larger SAP cloud and S/4HANA approaches as well.

Gardner: You mentioned SAP HANA. Tell us how we're not only dealing with this user experience across devices, work modes, and across application types, but now we have a core platform approach that allows for those analytics to leverage and exploit the data that's available, depending on the type of applications any specific organization is using.

It strikes me that we have a possibility of a virtuous adoption cycle; that is to say, the more data used in conjunction with more apps begets more data, begets more insights, begets more productivity. How is HANA and analytics coming to bear on this?

Haydon: We've had HANA running our analytics on the Ariba side for more than 12 months now. The most important thing that we see with HANA is that it's not about HANA in itself. It's a wonderful technology, but what we're really seeing is that customer interactions change, because customers are able to do different and faster types of iterations.

To us, that's the real power of what HANA gives us from a technology and platform aspect to build on. When you can have real-time analytics across massive amounts of information, put into the context of what an end user does, that to us is where the true business, customer, and end-user benefit comes from leveraging the HANA technology.

So we have it running in our analytics stack, progressively moving that through the rest of our analytics on the Ariba platform. Quite honestly, the sky's the limit as it relates to what that technology can enable us to do. The main focus though is how we give different business interactions, and HANA is just a great engine that enables us to do that.

Gardner: It's a fascinating time if you're a developer, because previously you had to go through a requirements process with the users. Using these analytics, you can measure and see what those users are actually doing as they progress and modernize their processes, and then take that insight back into the next iteration of the application.

So it's interesting that we're talking about total user experience. We could be talking about total developer experience, or even total IT operator experience when it comes to delivering security and compliance. Expand a little bit about how what you are doing on that user side actually benefits the entire life cycle of these applications.

Thinking company

Haydon: It's really exciting. There are other great companies that do this, and SAP is really investing in this as well as Ariba, making sure we're really a data-driven, real-time, thinking company.

And you're right. We're rolling out our total user experience in the simplest model: we're providing a toggle, enabling our end users to road-test the new user experience and then switch back. We don't think anyone will want to switch back, but the option is there.

That's the same type of experience you have in your personal life. When someone is trialing a new feature on an e-marketplace or in a consumer store, you're able to try the experience and come back. What's great about that is we're getting real-time insight. We know which customers are doing this, we know which personas are doing this, and we know how long their sessions last.
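
A toggle of this kind, instrumented with the persona and session telemetry Haydon mentions, could look roughly like the following sketch. All class, field, and persona names are invented for illustration.

```python
# Illustrative sketch of a UI toggle with usage telemetry: each user can
# opt in to the new experience and switch back, and every switch is
# recorded with persona and session length so product teams can see who
# tried it and for how long.
from collections import defaultdict

class ExperienceToggle:
    def __init__(self):
        self.active = {}                 # user_id -> "new" or "classic"
        self.events = defaultdict(list)  # user_id -> telemetry records

    def switch(self, user_id, persona, experience, session_minutes=0):
        self.active[user_id] = experience
        self.events[user_id].append(
            {"persona": persona, "experience": experience,
             "session_minutes": session_minutes}
        )

toggle = ExperienceToggle()
toggle.switch("u1", "invoice-processor", "new", session_minutes=35)
# The user tries the new experience, then switches back:
toggle.switch("u1", "invoice-processor", "classic", session_minutes=2)
switch_backs = [e for e in toggle.events["u1"] if e["experience"] == "classic"]
```

Aggregating `switch_backs` by persona is the kind of signal that could feed the proactive monitoring described later in the interview.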

We're able to bring that back to our developers, to our product managers, to our user design experts, and just as importantly, back to our customers and our partners, to be able to say, "Here is some insight: users doing these types of things aren't on this page; they've been looking for this type of information when they run a query or request."

These types of information we're feeding into our roadmap, but we are also feeding back into our customers so they understand how their employees are working with our applications. As we step forward, we're exposing this in the right way to our partners to help them potentially build applications on top of what we already have on the Ariba platform.

Gardner: So obviously you can look at this at the general level of productivity, but now we can get specific with partners into verticals, geographies, and all the details that come along with business applications, company to company, region to region.

Let’s think about how this comes to market. You've announced the general availability in July 2015 on Ariba, and because this is SaaS, there are no forklifts, there are no downloads, no install, and no worries about configuration data. Tell us how this rolls out and how people can experience it if they've become intrigued about this concept of total user experience. How easy is it for them to then now start taking part in it?

Haydon: First and foremost, and it’s important, our customers entrust their business processes to us, and so it's about zero business disruption, and no downtime is our number one goal.

When we rolled out our global network release to 1.8 million suppliers two weeks ago (see a demo), we had zero downtime on the world’s largest business network. Similarly, as we rolled out our total user experience, zero downtime as well. So that’s the first thing. The number one thing is business continuity.

The second thing really is a concept that we think about. It’s called agile adoption. This is again how we let end users and companies of end users adopt our solutions.

Educating customers

We have done an awful lot of work before go-live on educating our customers: providing frequently asked questions and, where required, training materials and updates, all those types of support. But we really believe our work starts on day plus one, not day minus one.

After this is turned on, we work with our customers by monitoring, so we know exactly what they're doing, and we give them proactive support and communications when we need to: when we see them switching back, or when we see an unusual distribution within a specific customer group or end-user group in their company. We'll be actively monitoring them and pushing that forward.

That’s what we really think it’s about. We're taking this end user customer-centric view to roll out our applications, but letting our own customers find their own pathways.

Organic path

Gardner: If users want to go more mobile in how they do their business processes, want to get those reports and analytics delivered to them in the context of their activity, is there an organic path for them or do they have to wait for their IT department?

What do you recommend for people that maybe don’t even have Ariba in their organization? What are some steps they can take to either learn more or from a grassroots perspective encourage adoption of this business revolution really around total user experience emphasis?

Haydon: We have plenty of material from an Ariba perspective, not just about our solutions, but exactly what you're mentioning, Dana, about what is going on there. My first recommendation to everyone would be to educate yourselves and have a look at your business -- how many millennials are in your business, what are the new working paradigms that need to happen from a mobile approach -- and go and embrace it.

The second lesson is that if businesses think that this is not already happening outside of the control of their IT departments, they're probably mistaken. These things are already going on. So I think those are the kind of macro things to go and have a look at.

But, of course, we have a lot more information about Ariba’s total user experience thinking on thought leadership and then how we go about and implement that in our solutions for our customers, and I would just encourage anyone to go and have a look at ariba.com. You'll be able to see more about our total user experience, and like I said, some of the leading practice thoughts that we have about implementations (see a demo).

Gardner: I'd also encourage people to listen to or read the conversation you and I had just a month or two ago about the roadmap. There's an awful lot that you're working on that people will be able to exploit further for total user experience.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Read a full transcript or download a copy. See a demo. Sponsor: Ariba, an SAP company.

You may also be interested in:

Wednesday, July 22, 2015

Zynga builds big data innovation culture by making analytics open to all developers

The next BriefingsDirect analytics innovation case study interview explores how Zynga in San Francisco exploits big-data analytics to improve its business via a culture of pervasive, rapid analytics and experimentation.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Read a full transcript or download a copy.

To learn more about how big data impacts Zynga in the fast-changing and highly competitive mobile gaming industry, BriefingsDirect sat down with Joanne Ho, Senior Engineering Manager at Zynga, and Yuko Yamazaki, Head of Analytics at Zynga. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: How important is big data analytics to you as an organization?

Ho: To Zynga, big data is very important. It's a main piece of the company, and as part of the analytics department, big data serves the entire company as a source for understanding our users' behavior: our players, what they like, and what they don't like about our games. We use this data to analyze user behavior, and we also personalize a lot of different game models to fit each user's play pattern.

Gardner: What’s interesting to me about games is that people will not only download them, but the games are upgradable and changeable, and people can easily move. So the feedback loop between the inferences, information, and analysis you gain from your users' actions is rather compressed compared to many other industries.

What is it that you're able to do in this rapid-fire development-and-release process? How is that responsiveness important to you?

Real-time analysis

Ho: Real-time analysis, of course, is critical, and we have a streaming system that can do it. We have a monitoring and alerting system that alerts us whenever we see drops in user install rates or daily active users (DAU). The game studio is alerted and takes appropriate action.

Gardner: Yuko, what sort of datasets are we talking about? If we're going to the social realm, we can get some very large datasets. What's the volume and scale we're talking about here?

Yamazaki: We get data on everything that happens in our games. Almost every single play gets tracked into our system. We're talking about 40 billion to 60 billion rows a day, and our game product managers and development engineers decide what they want to analyze later. So the data is already structured and compressed as it comes into our database.

Gardner: That’s very impressive scale. It’s one thing to have a lot of data, but it’s another to be able to make that actionable. What do you do once that data is assembled?

Yamazaki: The biggest success story I normally tell about Zynga is that we make data available to all employees. From day one, as soon as you join Zynga, you get to see all the data through our visualization tools. Even if you're a FarmVille product manager, you get to see what Poker is doing, making it more transparent. There's an account report where you can just click and see how many people have done a particular game action, for example. That's how we were able to create this data-driven culture at Zynga.

Gardner: And Zynga is not all that old. Is this data capability something that you’ve had right from the start, or did you come into it over time? 

Yamazaki: Since we began Poker and Words With Friends, our cluster scaled 70 times.

Ho: It started off with three nodes, and we've grown to a 230-node cluster.

Gardner: So you're performing the gathering of the data and analysis in your own data centers?

Yamazaki: Yes.

Gardner: When you realized the scale and the nature of your task, what were some of the top requirements you had for your cluster, your database, and your analytics engine? How did you make some technology choices?

Biggest points

Yamazaki: When Zynga was growing, our main focus was to build something that was going to be able to scale and provide the data as fast as possible. Those were the two biggest points that we had in mind when we decided to create our analytics infrastructure.

Gardner: And any other more detailed requirements in terms of the type of database or the type of analytics engine?

Yamazaki: Those are the two big ones. As I mentioned, we also wanted everyone to be able to access the data, so SQL was a great technology to have: it's far easier to train PMs on SQL than on engineering-side tools such as MapReduce on Hadoop. Those were the three key points as we selected our database.
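
The appeal of SQL for non-engineers can be illustrated with a toy query: a product manager can answer "how many distinct players performed this action per game?" in one statement, with no MapReduce job. This uses Python's built-in sqlite3 as a stand-in for a columnar warehouse such as Vertica; the table and column names are made up.

```python
# Toy example: a PM-friendly SQL query against a tiny in-memory events
# table, counting distinct players per game for selected actions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (game TEXT, player_id TEXT, action TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    ("Poker", "p1", "fold"), ("Poker", "p2", "fold"),
    ("FarmVille", "p3", "harvest"),
])
counts = conn.execute(
    "SELECT game, COUNT(DISTINCT player_id) AS players "
    "FROM events WHERE action IN ('fold', 'harvest') "
    "GROUP BY game ORDER BY game"
).fetchall()
```

The same query shape scales to a real warehouse; only the connection and the table sizes change.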

Gardner: What are the future directions and requirements that you have? Are there things that you’d like to see from HP, for example, in order to continue to be able do what you do at increasing scale?

Ho: We're interested in real-time analytics. There's an aggregate projections feature that we're interested in trying. Flex Tables [in HP Vertica] also sounds like a very interesting feature that we will attempt to try. And cloud analytics is the third one we're interested in. We hope HP will mature it so that we can test it out in the future.

Gardner: So your analytics has been with you right from the start; were you early in using Vertica?

Ho: Yes.

Gardner: Now that we've determined how important it is, do you have any metrics of what this does for you? Other organizations might say they don't have as much of a data-driven culture as Zynga but would like to, and they realize that the technology can now ramp up to such incredible volume and velocity. What do you get back? How do you measure success when you do big-data analytics correctly?

Yamazaki: Internally, we look at adoption of systems. We have 2,000 employees, and at least 1,000 are using our visualization tool on a daily basis. That's how we measure adoption of our systems internally.

Externally, the biggest metric is retention. Are players coming back and, if so, was that through the data that we collect? Were we able to do personalization so that they're coming back because of the experience they've had?
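
The retention metric Yamazaki describes reduces to a simple set calculation: of the players seen on one day, what fraction came back the next? This is an illustrative sketch, not Zynga's actual pipeline; the function name and player IDs are invented.

```python
# Next-day retention as a set intersection: fraction of day-0 players
# who also show up on day 1.
def day1_retention(day0_players: set, day1_players: set) -> float:
    """Fraction of day-0 players who came back the next day."""
    if not day0_players:
        return 0.0
    return len(day0_players & day1_players) / len(day0_players)

# Four players on day 0; two of them ("p2", "p4") return on day 1.
rate = day1_retention({"p1", "p2", "p3", "p4"}, {"p2", "p4", "p9"})  # 0.5
```

Note that "p9" does not count toward retention: new players on day 1 are a separate acquisition metric.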

Gardner: These are obviously very important to your business, and the buy-in is the curious part. As the saying goes, you can lead a horse to water, but you can't make him drink. You can provide data analysis and visualization to employees, but if they don't find it useful and impactful, they won't use it. So it's interesting that adoption is a key performance indicator for you.

Any words of advice for other organizations who are trying to become more data-driven, to use analytics more strategically? Is this about people, process, culture, technology, all the above? What advice might you have for those seeking to better avail themselves of big data analytics?

Visualization

Yamazaki: A couple of things. One is to provide end-to-end: not just data storage, but also visualization. We also have an experimentation system, where I think we have about 400 to 600 experiments running as we speak. We have a report that shows, for the experiment you ran, all the metrics that moved because of it, and whether A is better than B.

We run another experiment, and there's a visualization you can use to see that data. So providing that end-to-end data and analytics to all employees is one of the biggest pieces of advice I would give to any company.
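
A report like that, comparing each metric in variant B against control A, can be sketched in a few lines. The metric names and numbers below are invented for illustration.

```python
# Minimal sketch of an experiment report: relative change (lift) of
# each metric in the variant compared with the control.
def lift_report(control: dict, variant: dict) -> dict:
    """Relative change of each metric in the variant vs. the control."""
    return {
        metric: (variant[metric] - base) / base
        for metric, base in control.items()
    }

report = lift_report(
    control={"day1_retention": 0.40, "installs": 1000},
    variant={"day1_retention": 0.44, "installs": 950},
)
# day1_retention lift is about +10%; installs moved about -5%.
```

A real system would add significance testing before declaring "A is better than B", but the lift table is the part a product manager reads first.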

One more thing is to try to get one good win. If you focus too much on technology or scalability, you might be building a battleship when you don't actually need one yet. Incremental improvement is probably going to take you to the place you need to get to. Just try to get a good big win, such as increasing installs or active users in one particular game or product, and see where it goes.

Gardner: And just to revisit the idea that you've got so many employees and so many innovations going on, how do you encourage your employees to interact with the data? Do you give them total flexibility in terms of experiments? How do they start the process of some of those proof-of-concept type of activities?

Yamazaki: It's all freestyle. They can log whatever they want, they can see whatever they want except revenue-type data, and they can create any experiments they want. Joanne's team owns that part, but we also make the data available. Some of the games can use it in real time; we can do real-time personalization using the data you logged. It's almost a 360-degree view of data availability for our product teams.

Gardner: It’s really impressive that there's so much of this data mentality ingrained in the company, from the start and also across all the employees, so that’s very interesting. How do you see that in terms of your competitive edge? Do you think the other gaming companies are doing the same thing? Do you have an advantage that you've created a data culture?

Yamazaki: Definitely; in online gaming you have to have big data to succeed. A lot of companies, though, just grab whatever data they can, then structure it and make it analyzable afterward. One of the things we've done well was to impose structure to start with, so the data is already structured.

Product managers are already thinking about what they want to analyze beforehand. It's not that they just pull everything in and then see what happens. They think right away about, "Is this analyzable? Is this something we want to store?" We're a lot smarter about what we store, and cost-wise, it's a lot more optimized.
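
The structure-first approach can be sketched as schema validation at ingest time: events that do not match the agreed analytics schema never enter the warehouse. The field names and helper function are hypothetical.

```python
# Sketch of "structured at ingest": rather than dumping raw blobs and
# parsing later, each game event is checked against a fixed schema
# before it is accepted.
REQUIRED_FIELDS = {"ts", "game", "player_id", "action"}

def ingest(event: dict, table: list) -> bool:
    """Accept only events that already match the analytics schema."""
    if not REQUIRED_FIELDS.issubset(event):
        return False  # rejected: the product team chose not to track this shape
    table.append(event)
    return True

rows = []
ingest({"ts": 1, "game": "FarmVille", "player_id": "p9", "action": "harvest"}, rows)
ingest({"ts": 2, "blob": "unstructured junk"}, rows)  # rejected
```

Rejecting at the door is what keeps storage costs down and queries simple, at the price of deciding up front what is worth tracking.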

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS or Android. Read a full transcript or download a copy. Sponsor: HP.

You may also be interested in: