Thursday, December 17, 2015

DevOps by design -- A practical guide to effectively ushering DevOps into any organization

The next BriefingsDirect DevOps innovation case study highlights how Cognizant Infrastructure Services has worked with a large telecommunications and Internet services company to make DevOps benefits a practical reality.

We'll learn important ways to successfully usher DevOps into any large, complex enterprise IT environment, and we'll hear best practices on making DevOps a multi-generational accelerant to broader business goals -- such as adapting to the Internet of Things (IoT) requirements, advancing mobile development, and allowing for successful cloud computing adoption.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To provide a practical guide to effectively ushering DevOps into any organization, we're joined by Sachin Ohal, Manager Consulting at Cognizant Infrastructure Services in Philadelphia, and Todd DeCapua, Chief Technology Evangelist at HPE Software. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: When we talk about DevOps in a large environment, what are the barriers that we're facing these days? It's a complex undertaking, but what are the things we need to be thinking about in terms of making DevOps a beneficial reality?

Ohal: Fundamentally, organizations operate in many different models, and these often amount to a sending-and-receiving mode rather than a communicating mode.

So either one team is sending to another team, or one organization is sending to another. When a company adopts a model like DevOps, the IT team often starts without selecting where DevOps needs to begin, or which team should take the lead in starting DevOps across the organization.

Companies are trying to enhance their IT infrastructure and they want to enforce DevOps, but when all of these teams start communicating, they get lost. This has become a fundamental problem in implementing DevOps.

Gardner: You've been working with a number of companies in bringing DevOps best practices into play. What are some of the bedrock foundation steps companies should take? Is there a common theme, or does it vary from company to company?

Ohal: DevOps is the kind of domain that varies inside a company. We can't compare company to company; it varies company to company, domain to domain, organization to organization, because here we're talking about developing a culture. When we talk about developing a culture and a thought process, understanding those thought processes plays a key role.

And if we fundamentally talk about an application development organization, a testing organization, or the IT ops organization, each has its own key performance indicators (KPIs), its own thought process, and its own goals defined.

Many times, we observe that within the IT organization, development, testing, and operations have different goals, objectives, and KPIs. They never define business needs cross-functionally; they mostly define technology in organization-specific terms. As an example, a functional tester doesn't know how developers communicate with each other, or with the security team about security-related issues. An operations engineer has an uptime KPI, but he doesn't really know the various application modules he's supporting.

Suddenly, by enforcing DevOps, we're telling all of these organizations to begin communicating, start intersecting, and start cross-communicating. This has become a key problem in 21st-century infrastructure, application, testing, or overall DevOps framework implementation. Communication and understanding have become key challenges for organizations.

Gardner: Before we get into the specific use case scenario and case study, what is the relationship between Cognizant and HPE? You're a services provider; they're a technology provider. How does it work?

Strong partner

Ohal: We're a strong partner with HPE. Cognizant is a consulting company, a talent company. On the other hand, HPE is an enterprise-scale product delivery company. There is a very nice synergy between Cognizant and HPE.

When we go to market, we assess the situation, we ask HPE to come on-premises, work with us, have a handshake, form a high-performance team, and deliver an enterprise solution to Cognizant's and HPE's customers.

Gardner: Todd, given the challenges of bringing DevOps to bear in many organizations, the fact that it varies from company to company really sounds like a team sport, not something one can do completely alone. It's an ecosystem play. Is that right?

DeCapua: It absolutely is. When I think about this ecosystem, there are three players. You have your customer first, but then you have an organization like HPE that provides enterprise products and capabilities, and then other partners like Cognizant that can bring in the talent to be able to put it all together.

As we think about this transition and about the challenges that our number one player, our customers, have, there are foundational pieces to think about -- things like time-to-market being a challenge, brand value being a challenge, and, of course, revenue being another challenge.

As we were saying early on, what are those fundamental challenges that our customers, again as a team sport, are facing? We see that this is different for every one of our customers, so we start with some of these fundamentals: what are those challenges?

Understanding that helps with, "We need to make a change. We need to influence the culture. We need to do all these pieces." Before we jump right into that technical solution, let’s sit down as the teams together, with a customer, with someone like HPE, with someone like Cognizant, and really understand what our challenges are.

Gardner: Let's drill down a bit into a specific scenario. Sachin, regarding the large telecommunications, media, and Internet services company, tell us what their goals were and why they were pursuing DevOps and practical improvement in how they achieve test/deploy synergy.

Ohal: When we talk about telco, pharma, or retail customers, they fundamentally have many upstream and downstream platforms -- revenue-oriented, customer-service, and workbench platforms -- and it's very hard to establish synergy among all those platforms and to make the teams understand what their end goal is.

Obviously the end goal is customer service, but to achieve that goal you have to go through so many processes, so many handshakes on a business level, on a technology level, on a customer-service level, and even internal customer service level.

Always supporting

In today's world, we are IT for IT. No organization inside a company works as an independent IT group. They work IT for IT; they are always supporting either the business or an internal IT group.

To establish this synergy and this core value, we come across many people who don't understand the communication, and the right tools are not in place. Once we overcome the tools and the communication process, the major question is how to put that process in place end-to-end in the IT organization.

That, again, becomes a key challenge, because it's not easy to get something new adopted. As Todd said, we're talking about Agile development and mobile. Your IT organization becomes your business, and you're being asked to inject something new with no proven result. It's like injecting a test subject with some new drug. That's exactly the feeling any IT executive has: "Why am I supposed to be injecting this thing?"

Do I get value out of it or don't I? There is no benchmark available in the industry showing that people succeed in a certain domain or a certain area; there are always bits and pieces. This is a key challenge that we observe across the industry -- a lack of adaptiveness to a new technology or a new process. We're still seeing that.
There is no benchmark available in the industry that people succeed in a certain domain or a certain area. There are always bits and pieces.

I have a couple of customers who say, "Oh, I run Windows 2000 server. I run Windows 98. I have no idea how many Java libraries my application is using." They are also unable to explain why they still have so many.

It's similar on the testing side. Somebody says, "I use Load Testing Solution 9," a version that even HPE itself retired three or four years back.

Then, if you come to the operations organization, people say, "I use a very old server." What does it mean? It means that business is just getting IT services. They have to understand that this service needs to be enhanced so that the business will be enhanced.

Technology enhancement doesn't mean that my data center is flooded with some new technology. Technology enhancement means that my entire end-to-end landscape is upgraded with a new technology that will support the next generation, but I'm still struggling with legacy. These are the key challenges we observe in the market.

Gardner: But specifically with this use case, how did you overcome them? Did you enter into the test environment and explain to them how they should do things differently, leverage their data in different ways, or did you go to the developers first? Is there a pattern to how you begin the process of providing a better DevOps outcome?

End-to-end process

Ohal: First of all, we had to define an end-to-end delivery process and then we had to identify end-to-end business value out of that delivery process.

Once we identified the business value, we drew a line between the various organizations so they could understand that they were not cutting across each other, but running in parallel. It is a thin line, but it works, and it will definitely vary domain to domain.

In a multi-generational business plan, when we talk about drawing this thin line, there is no template that tells us exactly how to draw it -- in the IT organization, in a business organization, or inside IT, between a testing organization and a development organization.

DevOps can be started in any landscape. We may start with a testing organization and then we decide to pull it into the development and IT organization.

In some cases we may start with a development organization, and then testing and operational organizations come into place. Some businesses start DevOps, and they say that they want to do things the way they want.

If you really ask me about a specific case study, rather than giving a very narrow answer, I want to tell you that the answer covers a wide area. I don't want to take our audience in the wrong direction: "Somebody else started in testing, so we'll just start in testing. Somebody else started in development, so let's start in development."

You can start anywhere, but before starting, just stay back, decide where you want to start, why you want to start, how you want to start, and get the right folks and the right tools in the picture.

Gardner: Given that there is a general pattern, but there are also deep specifics, could you walk us through the general methodology that you have been talking about and that you are describing?

Ohal: At one point in time, most of the companies that the listeners of this podcast work for were startups. They started their company around a product or a service, and they struggled with the market.

Then, they shifted into being a product company. When I say product, it doesn't have to mean a physical product; it might be a service as a product. Then they pursued mergers and acquisitions and enhanced their portfolio in the market. They've done the exercises that industry fundamentally does.

Service companies

Now, more big companies are transforming themselves into service companies. They want to make sure that their existing customers and their new customers are getting the same value, because the challenges remain while adding new customers. Are my existing customers still with me? Are they happy and satisfied, and are they willing to continue doing business with my company?

Are they getting equivalent service to what we have committed to them? Are they getting my new technology and business value out of those services?

This creates a lot of pressure on IT and business executives. In mobile computing and cloud computing, suddenly some companies are trying to transform themselves into cloud companies from service companies. There is a lot of pressure on their IT organization to go toward cloud.

They're starting with cloud web services and cloud authentication at an IT level. We're not talking about a larger landscape yet, but they're trying that. Basically, this is the transformation from startup to product, product to services, and then services to cloud. That is your multi-generational vision with your multi-generational business plan, because your people change, your IT changes, your technology changes, your business models keep changing, your customers change, your revenue changes, and the mode of revenue changes.
That's where your IT plays a key role. Information technology becomes a key strategic business unit in your organization that is driving this whole task force.

Consider the examples of eBay and Google. At one point in time, they didn't exist. We never imagined that these companies would be leading on Wall Street, giving us so much employment, or building such a large consumer base.

Being a consulting company, Cognizant observes those trends in the market very quickly. We see those changes in the market, we assist with them, and we bring our own internal teams that understand all of this -- yet the customer's multi-generational vision remains the same.

To run this vision, I have a strategic business objective and a strategic business unit. How will this unit serve the strategic business objective? That's where your IT plays a key role. Information technology becomes a key strategic business unit in your organization that drives this whole task force.

While driving this task force, if you don't define your DevOps within a multi-generational business plan, your focus becomes IT-centric. The moment technology changes, you're in trouble. The moment the process changes -- and the moment you think about crossing domains in your company -- you're in trouble.

As an example, a telco crosses domains with a retailer, and then pharma crosses domains with the telco. Do you want to spend double on your IT or your business, or do you want to shut down the existing project and fund a new one?

There are so many questions that come into the picture when we talk about an IT-centric DevOps organization. But when we have a business-centric DevOps initiative, we accommodate all the views, and accordingly, IT takes ownership and helps you run your business.

Gardner: So business agility is really the payoff, Todd?

Looking at disruptions

DeCapua: Yes. Dana and Sachin, as we look at this challenge and wrap it around the use cases that Cognizant has -- not only the one customer that we're talking about, but really all of them -- and think through this multi-generational business plan using DevOps, there are some real fundamentals to consider. But there are disruptions in the world today, and maybe starting there helps to illustrate a little better why this concept of a multi-generational business plan is so important.

Consider Uber, Yelp, or Netflix. Each one of them is in a different stage of a multi-generational business plan. But as for the foundational element that Sachin has been explaining -- where some organizations today are stuck in a legacy technology or IT organization -- it really starts at that fundamental level of understanding: What are our strategic business objectives?

Then look at whether there's a strategic business unit and where that's focused. Then build up from there to where you have technology that lives on top of that.

What’s fun for me is when I look at Uber, Yelp, or Netflix, knowing they are all different, but some of them do have a product and some of them don’t. Some of them are an IT organization that has a services layer that connects all of these pieces together.
Look at this from whether there's a strategic business unit and where that's focused. Then, build up from there to say you have technology that lives on the top of that.

So whether it's a large telecom or an Internet provider, there are products, but there has really been a focus on services.

What can help is that this organizational, multi-generational vision is going to live through the iterations that every organization goes through. I hate to keep pounding on these three examples, but I think they're great in ways that help illustrate this.

We all remember when something like Uber came in as a startup and was not really well understood. Then you look again, and it has become productized. It's probably safe to assume that we've reached a certain level where it's available in most cities that I travel to.

Then, you move into something more like a product, looking at Yelp. That is definitely a product that's mainstream and definitely has a lot of users today. Then you move into the service area; as something matures into a service, it has been adopted by the majority of its target users.

The fourth one I would like to call out is cloud. As you move to something like cloud, that's where Netflix becomes a perfect example. It's all cloud-based. I'm a subscriber, and I know that I can have streaming video on any device, anywhere in the world, at any time, on Netflix, delivered from the cloud.

So these are the four generational business plan stages we're talking about -- startup, product, service, and cloud -- again, carrying that underlying vision, all supported by information technology and a defined strategic business objective, focused on a strategic business unit.

That's really important to understand as I look at somebody like Cognizant as a partner and the approach they have used with several of their customers.

Gardner: For organizations reading this or listening in that are interested in getting to that multi-generational benefit -- where their investments in IT pay off dividends for quite some time, particularly in their ability to adapt to change rapidly -- any good starting points? Are there proof of concept (POC) places where you start? I know it’s boiling the ocean in some ways, but there must be some good principles to get going with.

Sensing inside

Ohal: Definitely there are. Toward this 21st-century IT business goal, first you have to sense everything inside your business, rather than sensing the outside market. Sense all of your business thoroughly, in real time. What is it doing?

You have to analyze your business model. Does my business model fit these four fundamental stages? Where am I right now -- on the startup side, product side, service side, or cloud -- and where do I want to go? You have to define that, and then, based on that, you have to adopt DevOps. You have to be clear about where you are adopting your DevOps.

I was on product and I'm going to services, so I need DevOps to fit here. Or I'm right now in a well-matured product and I want to go to the cloud. Where am I going? Or I'm right now on a cloud and I want to have more and more refined services for my customers.

Find that scale and define that scale, rather than getting many IT groups together and just doing a brainstorming session about where you're supposed to stand. What is your business vision? What is your customer value? Those values really drive your business, and to drive that business, use DevOps.
You have to make sure where you are adopting your DevOps.

It's not just about getting continuous delivery or continuous integration in place. Two IT executives are talking: "You're in my organization, doing a great handshake," and the business says, "I don't want that handshake. I want that uptime."

There are so many different aspects and views. Todd mentioned that he has all these examples, but if you check other examples as well, they're very focused on their multi-generational business plans, and if you want to succeed, you have to be focused on those aspects as well.

Gardner: Anything else to add, Todd?

DeCapua: As far as getting started and what works and where you go, there are a number of different ways that we've worked with our customers to get started.

One approach that I have seen proven is to go after something that has been neglected. For example, there's a maintenance backlog -- items that over six months, a year, or sometimes even two years have just been neglected. If you really want to find some quick value, maybe it's pulling out that maintenance backlog, prioritizing it with your customer, understanding what's still important and what's no longer important, and shortening it down to a target list.
The second piece that comes in is this analysis capability. How are you tracking the results?

Then, if we focus a few resources on a few of these high-priority items that would otherwise continue to be neglected, we can start to adopt some of these practices and capabilities and immediately show value to that business owner, because we have applied a few resources and a little bit of time and gone after the highest-priority items that otherwise would have been neglected.

The second piece that comes in is this analysis capability. How are you tracking the results? What are the metrics you're using to show the business -- which has its multi-generational plan and strategy laid out -- how it is incrementally realizing value as you deliver over and over again?

But start small. Maybe go after that neglected maintenance backlog being a really easy target, and then showing the incremental value over time, again, through the sensing that Sachin has mentioned. Also be able to analyze and predict those results and then be able to adapt over time with speed and accuracy.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.


Tuesday, December 8, 2015

Need for fast analytics in healthcare spurs Sogeti converged BI solutions partnership model

The next BriefingsDirect big-data solution discussion explores how a triumvirate of big-data players is delivering a rapid and efficient analysis capability across disparate data types for the healthcare industry.

We'll learn how the drive for better patient outcomes amid economic efficiency imperatives has created demand for a new type of big-data implementation model. This solutions approach -- with support from Hewlett Packard Enterprise, Microsoft, and Sogeti -- leverages a nimble big-data platform, converged solutions, hybrid cloud, and deep vertical industry expertise.

The result is innovative and game-changing insights across healthcare ecosystems of providers, patients, and payers. The ramp-up to these novel and accessible insights is rapid, and the cost-per-analysis value is very impressive.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

Here to share the story on how the Data-Driven Decisions for Healthcare initiative arose and why it portends more similar vertical industry focused solutions, we're joined by Bob LeRoy, Vice President in the Global Microsoft Practice and Manager of the HPE Alliance at Sogeti USA. He's based in Cincinnati. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Why the drive for a new model for big data analytics in healthcare? What are some of the drivers, some of the trends, that have made this necessary now?

LeRoy: Everybody is probably very familiar with the Affordable Care Act (ACA), also known as ObamaCare. They've put a lot of changes in place for the healthcare industry, and primarily it's around cost containment. Beyond that, the industry itself understands that they need to improve the quality of care that they're delivering to patients. That's around outcomes, how can we affect the care and the wellness of individuals.

So it's around cost and the quality of the care, but it's also about how the industry itself is changing, both in how providers are now doing more with payments and how classic payers are doing more to actually provide care themselves. The line between payer and provider is blurring.

Some of these people are actually becoming what we call accountable care organizations (ACOs). We see a new one of these ACOs come up each week, where they are both payer and provider.

Gardner: Not only do we have a dynamic economic landscape, but the ability to identify what works and what doesn't work can really be important, especially when dealing with multiple players and multiple data types. This is really not just knowing your own data; this is knowing data across organizational boundaries.

LeRoy: Exactly. And there are a lot of different data models that exist. When you look at things like big data and the volume of data that exists out in the field, you can put that data to use to understand who your critical patients are and how that affects your operations.

Gardner:  Why do we look to a triangulated solution between players like Hewlett Packard Enterprise, Microsoft, and Sogeti? What is it about the problem that you're trying to solve that has led it to a partnership type of solution?

Long-term partner

LeRoy: Sogeti, a wholly-owned subsidiary of the Capgemini Group, has been a long-term partner with Microsoft. The tools that Microsoft provides are one of the strengths of Sogeti. We've been working with HPE now for almost two years, and it's a great triangulation between the three companies. Microsoft provides the software, HPE provides the hardware, and Sogeti provides the services to deliver innovative solutions to customers and do it in a rapid way. What you're getting is best in class in all three of those categories -- the software, the hardware, and the services.

Gardner: There's another angle to this, too, and it’s about the cloud delivery model. How does that factor into this? When we talked about hardware, it sounds like there's an on-premises aspect to it, but how does the cloud play a role?

LeRoy: Everybody wants to hear about the cloud, and certainly it's important in this space, too, because of the type of data that we're collecting. You could consider social data or data from third-party software-as-a-service (SaaS) applications, and that data can exist everywhere.

You have your on-premise data and you have your off-premise data. The tools that we're using, in this case from HPE and Microsoft, really lend themselves well to developing a converged environment that delivers best in class across those different environments. They're secure, delivered quickly, and they provide the information and the insights that hospitals and insurance companies really need.

Gardner: So we have a converged solution set from HPE. We have various clouds that we can leverage. We have great software from Microsoft. Tell us a little about Sogeti and what you're bringing to the table. What is it that you've been doing in healthcare that helps solidify this solution and the rapid analysis requirements?
Sogeti’s strength is that we're really focused on the technology and the implementations of technology.

LeRoy: This is one of the things that Sogeti brings to the table. Sogeti is part of the Capgemini Group, a global organization with 150,000 employees, and Sogeti is one of the five strategic business units of the group. Sogeti's strength is that we're really focused on the technology and the implementation of technology, and we are focused on several different verticals, healthcare being one of them.

We have experts on the technology stacks, but we also have experts in healthcare itself. We have people whom we've pulled from the healthcare industry. We taught them what we do in the IT world, so they can help us apply best practices and technologies to solve real healthcare organizational problems, so that we can get toward the quality of care and the cost reduction that the ACA is really looking for. That's a real strength that's going to add significant value to healthcare organizations.

Gardner: It's very important to see that one size does not fit all when it comes to these systems. Industry verticalization is required, and you're embarking on a retail equivalent of this model, and manufacturing and other sectors might come along as well.

Let's look at why this approach to this problem is so innovative. What have been some of the problems that have held back the ability of large and even mid-sized organizations in the healthcare vertical industry from getting these insights? What are some of the hurdles that they've had to overcome and that perhaps beg for a new and different model and a new approach?

Complexity of data

LeRoy: There are a couple of factors. For sure, it's the complexity of the data itself. The data is distributed over a wide variety of systems, so it's hard to get a full picture of a patient or a certain care program, because the systems are spread out all over the place. When the data reaches you from so many different systems in so many different ways, you get part of the data; you don't get the full picture. We call that poor data quality, and that makes it hard for somebody who's doing analysis to really understand and gain insight from the data.

Of course, there's also the existing structure that’s in place within organizations. They've been around for a long time. People are sometimes resistant to change. Take all of those things together and you end up with a slow response time to delivering the data that they're looking for.

Access to the data becomes very complex or difficult for an end-user or a business analyst. The cost of changing those structures can be pretty expensive. If you look at all those things together, it really slows down an organization’s ability to understand the data that they've got to gain insights about their business.

Gardner: Just a few years ago, when we used to refer to data warehouses, it was a fairly large undertaking. It took months to put these into place, required a data center or some sort of leasing arrangement, and of course carried a significant amount of upfront cost. How has this new model approached those cost and ramp-up time issues?
HPE is providing a box that’s going to allow me to put both into a single environment. So that’s going to reduce my cost a lot.

LeRoy: Microsoft’s model that they have put in place to support their Analytics Platform System (APS) allows them to license their tools at a lower price. The other thing that's really made a difference is the way HPE has put together their ConvergedSystem that allows us to tie these hybrid environments together to aggregate the data in a very simple solution that provides a lot of performance.

If I have to look at unstructured data and structured data, I often need two different systems. HPE is providing a box that’s going to allow me to put both into a single environment. So that’s going to reduce my cost a lot.

They have also delivered it as an appliance, so I don't need to spend a lot of time buying, provisioning, or configuring servers, setting up software, and all those things. I can just order this ConvergedSystem from HPE, put it in my data center, and I'm almost ready to go. That's the second thing that really helps save a lot of time.

The third one is that at Sogeti Services, we have some intellectual property (IP) to help with the data integration from these different systems and the aggregation of the data. We've put together some software and some accelerators to help make that integration go faster.

The last piece of that is a data model that structures all this data into a single view that makes it easier for the business people to analyze and understand what they have. Usually, it would take you literally years to come up with these data models. Sogeti has put all the time into it, created these models, and made it something that we can deliver to a customer much faster, because we've already done it. All we have to do is install it in your environment.

It's those three things together -- the software pricing from Microsoft, the appliance model from HPE, and the IP and the accelerators that Sogeti has.

Consumer's view

Gardner: Bob, let’s look at this now through the lens of that consumer, the user. It wasn’t that long ago where most of the people doing analytics were perhaps wearing white lab coats, very accomplished in their particular query languages and technologies. But part of the thinking now for big data is to get it into the hands of more people.

What is it that your model, this triumvirate of organizations coming together for a solution approach, does in terms of making this data more available? What are the outputs, who can query it, and how has that had an impact in the marketplace?

LeRoy: We've been trying to get this to the end users for 30 years. I've been trying to get reports into the hands of users and let them do their own analysis, and every time I get to a point where I think, "This is the answer -- the users are going to be able to do their own reports, and that frees up guys in the IT world like me to go off and do other things," it doesn't always work.

This time, though, it's really interesting. I think we have got it. We allow the users access directly to the data, using the tools that they already know. So I'm not going to create and introduce a new tool to them. We're using tools that are very similar to Excel, pointing to a data source that's already well organized for them, with data they're already familiar with.
This is something that we couldn't do before, and it’s very exciting to see that we're able to gain such insights and be able to take action against those insights.

So if they're using Microsoft Excel-like tools, they can do Power Pivots and pivot tables the way they've already been doing, but until now only in an offline manner. Now, I can give them direct access to real-time data.

Instead of waiting until noon to get reports out, they can go and look online and get the data much sooner, so we can accelerate their access time to it, but deliver it in a format that they're comfortable with. That makes it easier for them to do the analysis and gain their insights without the IT people having to hold their hands.

Gardner: Perhaps we have some examples that we can look to that would illustrate some of this. You mentioned social media, the cloud-based content or data. How has that come to bear on some of these ways that your users are delivering value in terms of better healthcare analytics?

LeRoy: The best example I have is the ability to bring in data that's not in a structured format. We often think of external data, but sometimes it's internal data, too -- maybe x-rays or people doing queries on the Internet. I can take all of that unstructured data and correlate it to my internal electronic medical records or the health information systems that I have on-premise.

If I'm looking at Google searches, and people are looking for keywords such as "stress," "heart attacks," "cardiac care," or something like that, I can map the times that people are running those kinds of queries by region. I can tie that back to my systems and ask what the behavior or traffic patterns look like within my facility at those same times. I can target certain areas to maybe change my staffing model, or, if there is a big jump in searches, run a campaign asking people to come in for a screening, or encourage people to get to their primary-care physicians.

There are a lot of things we can do with the data by looking just at the patterns. It will help us narrow down the areas of our coverage that we need to work with, what geographic areas I need to work on, and how I manage the operations of the organization, just by looking at the different types of data that we have and tying them together. This is something that we couldn't do before, and it’s very exciting to see that we're able to gain such insights and be able to take action against those insights.
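To make that pattern-matching idea a bit more concrete, here is a minimal sketch of the kind of correlation being described: weekly, regional search-interest counts joined against facility visit counts. The file names, column names, keyword list, and spike threshold are illustrative assumptions, not part of the actual Sogeti solution.

import pandas as pd

# Illustrative extracts (hypothetical file and column names):
#   search_trends.csv:   week, region, keyword, search_count
#   facility_visits.csv: week, region, visit_count
searches = pd.read_csv("search_trends.csv", parse_dates=["week"])
visits = pd.read_csv("facility_visits.csv", parse_dates=["week"])

# Total the cardiac-related search interest per region per week
cardiac = searches[searches["keyword"].isin(["stress", "heart attack", "cardiac care"])]
cardiac_weekly = cardiac.groupby(["region", "week"], as_index=False)["search_count"].sum()

# Join search interest with facility traffic for the same region and week
joined = cardiac_weekly.merge(visits, on=["region", "week"])

# Does a jump in searches track a jump in visits in each region?
for region, grp in joined.groupby("region"):
    corr = grp["search_count"].corr(grp["visit_count"])
    print(f"{region}: search/visit correlation = {corr:.2f}")

# Region-weeks with unusually high search interest are candidates for a screening campaign
threshold = cardiac_weekly["search_count"].quantile(0.90)
spikes = cardiac_weekly[cardiac_weekly["search_count"] > threshold]
print(spikes.sort_values("search_count", ascending=False).head())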

Applying data science

Gardner: I can see now why you're calling it Data-Driven Decisions for Healthcare, because you're really applying data science to areas that would probably never have been considered for it before. People might use intuition or anecdote, or deliver evidence that was perhaps not all that accurate. Maybe you could illustrate a few more ways in which you're using data science and very powerful systems to gather insights into areas that we just never thought to apply such high-powered tools to before.

LeRoy: Let's go back to the beginning, when we talked about how we change the quality of care that we are providing. Today, doctors collect diagnosis codes for just about every procedure that is done. We don't really look and see how many times those same procedures are repeated or which doctors are performing which procedures. Let's look at the patients, too, and which patients are getting those procedures. So we can tie those diagnosis codes together in a lot of different ways.

The one that I think I probably would like the best is that I want to know which doctors perform those procedures only once per patient and have the best results come from the treatments that that doctor performs. Now, if I'm from a hospital, I know which doctors perform which procedures the best and I can direct the patients that need those procedures to those doctors that provide the best care.
My quality of care goes up, the patient has a better experience, and we're going to do it at a lower cost because we're only doing it once.

And the reverse of that might be that if a doctor doesn't perform that procedure well, let's avoid sending him those kinds of patients. Now, my quality of care goes up, the patient has a better experience, and we're going to do it at a lower cost because we're only doing it once.
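As a rough illustration of the kind of analysis LeRoy describes, the sketch below groups procedure records by physician and diagnosis code, then ranks physicians by how often the procedure was performed only once per patient and by outcome rate. The file name, columns, and the example diagnosis code are assumptions for illustration only, not the actual data model.

import pandas as pd

# Hypothetical extract: patient_id, physician_id, diagnosis_code, procedure_date, good_outcome (0/1)
procs = pd.read_csv("procedures.csv", parse_dates=["procedure_date"])

# How many times was the same procedure repeated for the same patient by the same physician?
per_patient = (procs.groupby(["physician_id", "diagnosis_code", "patient_id"])
                    .agg(times_performed=("procedure_date", "count"),
                         good_outcome=("good_outcome", "max"))
                    .reset_index())

# Per physician and procedure: share of patients treated only once, and share with a good outcome
summary = (per_patient.groupby(["physician_id", "diagnosis_code"])
                      .agg(patients=("patient_id", "nunique"),
                           single_procedure_rate=("times_performed", lambda s: (s == 1).mean()),
                           good_outcome_rate=("good_outcome", "mean"))
                      .reset_index())

# Rank physicians for one (illustrative) diagnosis code: done once per patient, with the best results
best = summary[summary["diagnosis_code"] == "I21.9"].sort_values(
    ["single_procedure_rate", "good_outcome_rate"], ascending=False)
print(best.head())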

Gardner: Let's dive into this solution a bit, because I'm intrigued that this model -- bringing together a converged-infrastructure provider, a software provider, and field expertise that crosses the chasm between a technology capability and a vertical-industry knowledge base -- works. So let's dig in a little bit. The Microsoft APS: tell us a little bit about what it includes and why it's powerful and applicable in this situation.

LeRoy: The APS is a solution that combines unstructured data and structured data into a single environment and it allows the IT guys to run classic SQL queries against both.

On one side, we have what used to be called Parallel Data Warehouse. It's a really fast version of SQL Server: it uses massively parallel processing and it can run queries super fast. That's the important part -- I have structured data that I can get to very quickly.

The other half of it is HDInsight, which is Microsoft's implementation of open-source Hadoop, and Hadoop handles the unstructured data. In between these two things there is PolyBase, so I can query the two together and join structured and unstructured data.

Then, since Microsoft created this APS specification, HPE implemented it in a box that they call the ConvergedSystem 300. Sogeti has built our IP on top of that. We can consume data from all these different areas, put it into the APS, and deliver that data to an end user through a simple interface like Excel, Power BI, or some other visualization tool.
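The Sogeti models themselves are not shown here, but a minimal sketch of what querying the appliance might look like follows: a single T-SQL statement, issued from Python via pyodbc, that joins a structured table with an external table that PolyBase exposes over Hadoop data. The DSN, schema, table, and column names are all assumptions for illustration.

import pyodbc

# Hypothetical ODBC DSN pointing at the APS appliance
conn = pyodbc.connect("DSN=APS;UID=analyst;PWD=***")
cursor = conn.cursor()

# dbo.Claims is a structured table; ext.SearchLogs is assumed to be a PolyBase
# external table over HDInsight/Hadoop data, queryable like any other table.
cursor.execute("""
    SELECT c.region,
           COUNT(DISTINCT c.patient_id) AS cardiac_patients,
           COUNT(s.query_text)          AS cardiac_searches
    FROM   dbo.Claims AS c
    JOIN   ext.SearchLogs AS s
           ON c.region = s.region
    WHERE  c.diagnosis_code LIKE 'I2%'
      AND  s.query_text LIKE '%cardiac%'
    GROUP BY c.region
    ORDER BY cardiac_searches DESC
""")
for row in cursor.fetchall():
    print(row.region, row.cardiac_patients, row.cardiac_searches)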

Significant scale

Gardner: Just to be clear for our audience, sometimes people hear "appliance" and they don't necessarily think big scale, but the HPE ConvergedSystem 300 for the Microsoft APS is quite significant, with server, storage, and networking technologies and large amounts of data -- up to 6 petabytes. So we're talking about fairly significant amounts of data here, not small fry.

LeRoy: And they put everything into that one rack. We think of appliance as something like a toaster that we plug in. That’s pretty close to where they are, not exactly, but you drop this big rack into your data center, give it an IP address, give it some power, and now you can start to take existing data and put it in there. It runs extremely well because they've incorporated the networking and the computing platforms and the storage all within a single environment, which is really effective.

Gardner: Of course, one of the big initiatives at Microsoft has been cloud with Azure. Is there a way in which the HPE Converged Infrastructure in a data center can be used in conjunction with a cloud service like Azure -- or another public cloud, an infrastructure-as-a-service (IaaS) cloud, or even a data-warehousing cloud service -- that accelerates the ability to deliver this fast and/or brings in more types of data from more places? How does the public cloud fit into this?
One of the great things about the solution that Microsoft and HPE put together is it’s very much a converged system that allows us to bridge on-prem and the cloud together.

LeRoy: You can distribute the solution across that space. In fact, we take advantage of the cloud delivery as a model. We use a tool called Power BI from Microsoft that allows you to do visualizations.

The system from HPE is a hybrid solution. So we can distribute it. Some of it can be in the cloud and some of it can be on-prem. It really depends on what your needs are and how your different systems are already configured. It’s entirely flexible. We can put all of it on-prem, in a single rack or a single appliance or we can distribute it out to the cloud.

One of the great things about the solution that Microsoft and HPE put together is it’s very much a converged system that allows us to bridge on-prem and the cloud together.

Gardner: And of course, Bob, those end users that are doing those queries, that are getting insights, they probably don’t care where it's coming from as long as they can access it, it works quickly, and the costs are manageable.

LeRoy: Exactly.

Gardner: Tell me a little bit about where we take this model next -- clearly healthcare, big demand, huge opportunity to improve productivity through insights, improve outcomes, while also cutting costs.

You also have a retail solution approach in that market, in that vertical. How does that work? Is that already available? Tell us a little bit about why retail was the next one you went to and where it might go next in terms of industries.

Four major verticals

LeRoy: Sogeti is focused on four major verticals: healthcare, retail, manufacturing, and life sciences. So we are going across the areas where we have expertise.

The healthcare one has been out now for nine months or so. Retail is in a different place. There are point solutions where people have solved part of this equation, but they haven't really dug deep into understanding how to get it from end to end, which is something that Sogeti has done now. From the point a person walks into a store, we would be alerted through all of these analytics that we have. We would be alerted that the person arrived and take action on that.

We do what we can to increase our traffic and our sales with individuals and then aggregate all of that data. You're looking at things like customers, inventory, or sales across an organization. That end-to-end piece is something that I think is very unique within the retail space.

After that, we're going to go to manufacturing. Everybody likes to talk about the Internet of Things (IoT) today. We're looking at some very specific use cases on how we can impact manufacturing so IoT can help us predict failures right on a manufacturing line. Or if we have maybe heavy equipment out on a job site, in a mine, or something like that, we could better predict when equipment needs to be serviced, so we can maximize the manufacturing process time.
We're looking at some very specific use cases on how we can impact manufacturing so IoT can help us predict failures right on a manufacturing line.

Gardner: Any last thoughts in terms of how people who are interested in this can acquire it? Is this something that is being sold jointly through these three organizations, through Sogeti directly? How is this going to market in terms of how healthcare organizations can either both learn more and/or even experiment with it?

LeRoy: The best way to do it is search for us online. It's mostly being driven by Sogeti and HPE. Most of the healthcare providers that are also heavy HPE users could be aware of it already, and talking to an HPE rep or to a Sogeti rep is certainly the easiest path to move forward on.

We have a number of videos that are out on YouTube. If you search for Sogeti Labs and Data Driven Decisions, you will certainly find my name and a short video that shows it. And of course sales reps and customers are welcome to contact me or anybody from Sogeti or HP.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.


HPE's composable infrastructure sets stage for hybrid market brokering role

Making a global splash at its first major event since becoming its own company, Hewlett Packard Enterprise (HPE) last week positioned itself as a new kind of market maker in enterprise infrastructure, cloud, and business transformation technology.

By emphasizing choice and adaptation in hybrid and composable IT infrastructure, HPE is betting that global businesses will be seeking, over the long term, a balanced and trusted partner -- rather than a single destination or fleeting prescribed cloud model.

HPE is also betting that a competitive and still-undefined smorgasbord of cloud, mobile, data, and API service providers will vie to gain the attention of enterprises across both vertical industries and global regions. HPE can exploit these dynamic markets -- rather than be restrained by them -- by becoming a powerful advocate for enterprises sorting out the complexity of transformation across hybrid, mobile, security, and data analysis shifts.

"The most powerful weapons of competition are now software, data, and algorithms," said Peter Ryan, HPE Senior Vice President and Managing Director for EMEA. "Time to value is your biggest enemy and your biggest opportunity."

HPE led off its announcements at HPE Discover in London with a new product designed to run both traditional and cloud-native applications for organizations seeking the benefits of running a "composable" hybrid infrastructure. [Disclosure: HPE is a sponsor of BriefingsDirect podcasts.]
Time to value is your biggest enemy and your biggest opportunity.

Based on a new architecture, HPE Synergy leverages fluid resource pools, software-defined intelligence, and a unified API to provide the foundation for organizations to continually optimize the right mix of traditional IT and private cloud resources. HPE also announced new partnerships with Microsoft around cloud computing and with Zerto for disaster recovery.

HPE Synergy leverages a new architectural approach called Composable Infrastructure, hailed as HPE's biggest debut in a decade. In addition to nourishing dynamic IT service markets and fostering choice, HPE is emphasizing the need to move beyond manual processes for making disparate hybrid services operate well together.

The next step for businesses is to "automate and orchestrate across all of enterprise IT," said Antonio Neri, HPE Executive Vice President and General Manager of the company's Enterprise Group, to the 17,000 attendees.

"Market data clearly shows that a hybrid combination of traditional IT and private clouds will dominate the market over the next five years," said Neri. "With HPE Synergy, IT can deliver infrastructure as code and give businesses a cloud experience in their data center."

Composable choice for all apps

Composable Infrastructure via unified APIs allows IT to converge and virtualize assets while leveraging hybrid models, he said. Both developers and IT operators need to access all their resources rapidly and quickly automate their use.

HPE is striving to strike the right balance between the ability to use hybrid models and access legacy resources, while recognizing that the market will continue to rapidly advance and differ widely from region to region. It's a wise brokering role to assume, given the level of confusion and concern among IT leaders.

"What's the right formula for services at the right price with the right SLAs? It's still a work in progress," I told Trevor Jones at SearchCloudComputing at TechTarget just after the conference.
Cloud brokers can pick and choose the right requirements at the right price for their customers, so there will be a market for those services.

Indeed, HPE will offer a cloud brokerage service in early 2016 for hybrid IT management. HPE Helion Managed Cloud Broker leverages existing HPE orchestration, automation, and operations software, and adds a self-service portal, monitoring dashboards, and reports to better support on-premises offerings from VMware, public clouds, and PaaS from Microsoft, Amazon, and others. The service will be available sometime in 2016.

"Cloud brokers can pick and choose the right requirements at the right price for their customers, so there will be a market for those services," I told TechTarget. "I look at it like the systems integrator of cloud computing."

And brokers factor into cloud choice and hybrid choice decisions such variables as jurisdiction, industry verticals, types of workloads and mobile devices. Rather than dictate to enterprise architects what "parts" or services to use, HPE is focusing on the management and repeatability of the services that specific application sets require -- even as that changes over time.

For example, as the interest in software containers grows, HPE will automate their use. New HPE ContainerOS solves two major problems with containers -- security and manageability, said HPE CTO Martin Fink. "Ops can now fall in love with containers just as much as developers," he told the conference audience, adding that virtual machines alone are "highly inefficient."

IoT gets a new edge

In yet another IT area that enterprises need to quickly adjust to, the Internet of Things (IoT), HPE has developed a flexible solution approach. HPE Edgeline servers, part of an Intel partnership, sit at the edge of networks.

"What will make IoT work for business is not devices. It's infrastructure you build to support it," said Robert Youngjohns, Executive Vice President and General Manager, HPE Enterprise Group.


Microsoft partnership

HPE and Microsoft announced new innovation in hybrid cloud computing through Microsoft Azure, HPE infrastructure and services, and new program offerings. The extended partnership appoints Microsoft Azure as a preferred public cloud partner for HPE customers while HPE will serve as a preferred partner in providing infrastructure and services for Microsoft's hybrid-cloud offerings.

The partnering companies will collaborate across engineering and services to integrate innovative compute platforms that help customers optimize their IT environment, leverage new consumption models, and accelerate their business.
As part of the expanded partnership, HPE will enable Azure consumption and services on every HPE server, which allows customers to rapidly realize the benefits of hybrid cloud.

To simplify the delivery of infrastructure to developers, HPE Synergy, for example, has a powerful unified API and a growing ecosystem of partners like Arista, Capgemini, Chef, Docker, Microsoft, NVIDIA, and VMware. The unified API provides a single interface to discover, search, provision, update, and diagnose the Composable Infrastructure required to test, develop, and run code. With a single line of code, HPE's innovative Composable API can fully describe and provision the infrastructure that is required for applications, eliminating weeks of time-consuming scripting.
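The article does not reproduce the API itself, but as a rough, hypothetical sketch of the "infrastructure as code" idea, a composable provisioning request might look something like the following. The endpoint paths, header names, payload fields, and URIs here are assumptions in the style of a OneView/Synergy-like REST interface, not HPE's documented calls.

import requests

BASE = "https://composer.example.com"  # hypothetical composer address

# Authenticate and obtain a session token (assumed endpoint and response shape)
auth = requests.post(f"{BASE}/rest/login-sessions",
                     json={"userName": "admin", "password": "secret"},
                     verify=False)
token = auth.json()["sessionID"]
headers = {"Auth": token, "X-API-Version": "300"}

# Describe the desired server profile and let the composable API assemble
# compute, storage, and fabric behind it (all names below are illustrative)
profile = {
    "name": "web-tier-node-01",
    "serverProfileTemplateUri": "/rest/server-profile-templates/web-tier",
    "serverHardwareUri": "/rest/server-hardware/bay-3",
}
resp = requests.post(f"{BASE}/rest/server-profiles", json=profile,
                     headers=headers, verify=False)
print(resp.status_code, resp.json().get("uri"))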

HPE and Microsoft are also introducing the first hyper-converged system with true hybrid-cloud capabilities, the HPE Hyper-Converged 250 for Microsoft Cloud Platform System Standard. Bringing together industry leading HPE ProLiant technology and Microsoft Azure innovation, the jointly engineered solution brings Azure services to customers' data centers, empowering users to choose where and how they want to leverage the cloud. An Azure management portal enables business users to self-deploy Windows and Linux workloads, while ensuring IT has central oversight.

Building on the success of HPE Quality Center and HPE LoadRunner on the Azure Marketplace, HPE and Microsoft will work together to make select HPE industry-leading application lifecycle management, big-data, and security software products available on the Azure Public Cloud.

HPE also plans to certify an additional 5,000 Azure Cloud Architects through its Global Services Practice. This will extend its Enterprise Services offerings to bring customers an open, agile, more secure hybrid cloud that integrates with Azure.

Disaster recovery with Zerto

Zerto, a disaster recovery provider for virtualized and cloud environments, has achieved gold partnership status with HPE.

The first deliverable from the partnership is the Zerto Automated Failover Testing Pack. This is the first of several packs that will simplify business continuity and disaster recovery (BC/DR) automation using HPE Operations Orchestration (HPE OO) as the master orchestrator. The new automated failover testing capabilities for HPE OO increase IT data-center time savings while improving overall disaster recovery testing compliance.
Failover tests can now run nightly versus annually, providing compliance coverage for customers operating in highly regulated industries such as financial services and healthcare.

While the Zerto Automated Failover Testing Pack automatically runs failover tests in full virtual-machine environments, other automated processes eliminate the need to cross-check multi-department failover success, thereby increasing efficiency and productivity for IT teams.

With the Zerto Automated Failover Testing Pack, users now simply schedule the failover test in HPE OO. The test runs autonomously and sends a report showing that it was a successful test. Failover tests can now run nightly versus annually, providing compliance coverage for customers operating in highly regulated industries such as financial services and healthcare.

With HPE recognizing that global businesses are seeking a long-term, balanced, and trusted partner -- rather than a single destination or fleeting prescribed cloud model -- the 75-year-old company has elevated itself above the cloud fray.

"Real transformation is hard, but it can have amazing benefits," HPE CEO Meg Whitman told the conference.


Tuesday, December 1, 2015

Nottingham Trent University elevates big data’s role to improving student retention

The next BriefingsDirect big-data case-study interview examines how Nottingham Trent University in England has devised and implemented an information-driven way to encourage higher student retention.

By gathering diverse data and information and performing rapid analysis, Nottingham Trent is able to quickly identify those students having difficulties. It can thereby achieve significant reductions in dropout rates while learning more about what works best to usher students into successful academic careers.

What’s more, the analysis of student metrics is also setting up the ability to measure more aspects of university life and quality of teaching, and to make valuable evidence-based correlations that may well describe what the next decades of successful higher education will look like.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To learn more about taking a new course in the use of data science in education, we're pleased to welcome Mike Day, Director of Information Systems at Nottingham Trent University in Nottingham, UK. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Tell us about Nottingham Trent University. It's a unique institution, with a very large student body, many of whom are attending university for the first time in their families.

Day: That’s right. We've had around 28,000 students over the last few years, and that’s probably going to increase this year to around 30,000 students. We have, as you say, many, many students who come from poor backgrounds -- what we call "widening participation" students. Many of them are first generation in their family to go to university.

Sometimes, those students are a little bit under-confident about going to university. We've come to call them "doubter students," and those doubters are the kinds of people who, when they struggle, believe it's their fault, and so they typically don't ask for help.

Gardner: So it's incumbent upon you to help them know better where to look for help and not internalize that. What do you use to identify students who are struggling?

Low dropout rate

Day: We've always done very well at Nottingham Trent. We had a relatively low dropout rate, about seven percent or so, which is better than the sector average. Nevertheless, it was really hard for us to keep students on track throughout their studies, especially those who were struggling early in their university career. We tended to find that we had to put a lot of effort into supporting students after they had failed exams, which, for us, was too late.

Day
We needed to try to find a way where we could support our students as early as possible. To do that, we had to identify those students who were finding it a bit harder than the average student and were finding it quite difficult to put their hand up and say so.

So we started to look at the data footprint that a student left across the university, whether that was a smart card swipe to get them in and out of buildings or to use printers, or their use of the library, in particular taking library books out, or accessing learning materials through our learning management system. We wanted to see whether those things would give us some indication as to how well students were engaged in their studies and therefore, whether they're struggling or not.
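To make that data footprint idea concrete, the following is a minimal sketch of how those separate event streams might be pulled into one weekly engagement count per student. The file names and columns are invented for illustration; they are not the university's actual schema.

```python
# Hypothetical sketch: combine card swipes, library loans, and LMS logins
# into a simple weekly engagement count per student. File and column names
# are invented for illustration only.
import pandas as pd

def weekly_counts(path: str, event_name: str) -> pd.DataFrame:
    """Count events per student per week from one source system's export."""
    df = pd.read_csv(path, parse_dates=["timestamp"])
    df["week"] = df["timestamp"].dt.to_period("W")
    return df.groupby(["student_id", "week"]).size().rename(event_name).reset_index()

swipes = weekly_counts("card_swipes.csv", "swipes")
loans = weekly_counts("library_loans.csv", "loans")
logins = weekly_counts("lms_logins.csv", "lms_logins")

# Outer-join the sources so a quiet week in one system still shows up.
engagement = swipes.merge(loans, on=["student_id", "week"], how="outer") \
                   .merge(logins, on=["student_id", "week"], how="outer") \
                   .fillna(0)
engagement["total_events"] = engagement[["swipes", "loans", "lms_logins"]].sum(axis=1)
```

A simple combined count like this is only a starting point; the harder work, as Day describes next, is deciding which of those signals actually matter.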

Gardner: So this is not really structured information, not something you would go to a relational database for as part of a structured packaged application, for example. It's information that we might think of as breadcrumbs around the organization that you need to gather. So what was the challenge of dealing with such a wide diversity of information types?
Day: We had a very wide variety of information types. Some of it was structured, and we had put a lot of effort into getting good-quality data over the years, but some of it was unstructured. Trying to bring those different and disparate datasets together was proving very difficult to do in traditional business intelligence (BI) ways.

We needed to know, within about 600 terabytes of data, what really mattered -- which factors, in combination, told us something about how successful students behave -- and therefore how to compare students who were not having such an easy time at the university with those who were succeeding.

Gardner: It sounds as if the challenges were not only in gathering good information but in using it effectively to draw correlations that would rapidly point out where students were struggling. Tell us about both the technology side and the methodologies you used to create those correlations.

Day: You're absolutely right. It was very difficult to find out what matters and to get the right data for that. We needed ultimately to get to a position where we could create great relationships between people, particularly between tutors or academic counselors and individual students.

On the technology side, we engaged with a partner, a company called DTP Solutionpath, who brought with them the HPE IDOL engine. That allowed us to submit about five years' worth of back data into the IDOL engine to try to create a model of engagement -- in other words, to pick out which factors within that data, in combination, gave us high confidence around student engagement.

Our partners did that. They worked very closely with us in a very collaborative way, with our academic staff, with our students, importantly -- because we have to be really clear and transparent about what we are doing in all of this, from an ethical point of view -- and with my IT technical team. And that collaboration really helped us to boil down what sorts of things really mattered.
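For a general feel of what building an engagement model from back data involves, here is a small, generic sketch using scikit-learn. It is emphatically not HPE IDOL or the university's actual model; the files, feature names, and outcome label are assumptions made purely for illustration.

```python
# Generic illustration only -- not HPE IDOL and not Nottingham Trent's model.
# Fit a simple classifier on historical (hypothetical) engagement features,
# then flag current students whose predicted risk of disengagement is highest.
import pandas as pd
from sklearn.linear_model import LogisticRegression

history = pd.read_csv("five_years_back_data.csv")        # hypothetical file
features = ["swipes", "loans", "lms_logins"]              # hypothetical columns
X, y = history[features], history["progressed_to_next_year"]

model = LogisticRegression(max_iter=1000).fit(X, y)

current = pd.read_csv("current_term.csv")                 # hypothetical file
current["risk"] = 1 - model.predict_proba(current[features])[:, 1]
flagged = current.sort_values("risk", ascending=False).head(20)
print(flagged[["student_id", "risk"]])
```

The point is not the particular algorithm; it is that historical outcomes let you learn which combinations of everyday signals correlate with engagement, which is what the IDOL-based model did at far greater scale.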

Anonymizing details

Gardner: When you look at this ethically, you have to anonymize a great deal of this data in order to address privacy and other compliance issues. Is that the case?

Day: Actually, we needed to be able to identify individual students, and so there were very real privacy issues in all of this. We had to check our legal position quite carefully to make sure that we complied with the UK Data Protection Act, but that's only part of it.

What’s acceptable to the organization and ultimately to individual students is perhaps even more important than the strict legal position in all of this. We worked very hard to explain to students and staff what we were trying to do and to get them on board early, at the beginning of this project, before we had gone too far down the track, to understand what would be acceptable and what wouldn’t.

Gardner: I suppose it’s important to come off as a big brother and not the Big Brother in this?

Day: Absolutely. Friendly big brother is exactly what we needed to be. In fact, we found that how we engaged with our student body was really important in all of this. If we tried to explain this in a technical way, then it was very much Big Brother. But when we started to say, "We're trying to give you the very best possible support, such that you are most likely to succeed in your time in higher education and reap the rewards of your investment in it," then it became a very different story.

Particularly when we were able to demonstrate the kinds of visualizations of engagement to students, that shifted completely, and we've had very few, if any, problems with ethical concerns among students.

Gardner: It also seems to me that the stakes here are rather high. It's hard to put a number on it, but for a student who might struggle and drop out in their first months at university, it perhaps means diminished potential over their lifetime -- in career, income, and contribution to society, and so forth.

So for thousands of students, this could impact them over the course of a generation. This could be a substantial return on investment (ROI), to be a bit crass and commercial about it.

Day: If you take all of this from the student’s perspective, clearly students are investing significant amounts of money in their education.

In the UK, that's £9,000 (about USD $13,760) a year at the moment, plus accommodation costs, the cost of not getting a job early, and all of those sorts of things that students invest in their early university career. To lose that means they come away with a much less positive university experience than they could have had, and with much, much lower earning potential over their lifetime.
That also has an impact on UK PLC, in that the country isn't generating as many skilled individuals as it might, which has implications for tax revenues. There are implications from a university point of view, too. Clearly, if our students drop out, they aren't paying their fees, and those places sit empty. In terms of university efficiency, that was also a problem. So everybody wins if we can keep students on course.

On the journey

Gardner: Certainly a worthy goal. Tell us a little bit about where you are now. I think we have the vision, we understand the stakes, and we understand some of the technologies you've employed. Where are you on this journey? Then we can talk about what some of the results have been so far.

Day: It was very quick to get to a point where the technology was giving us the right kinds of answers. In about two to three months, we got to a position where the technology was pretty much done, but that was only really part of the story. We then needed to look at how it impacted our practice in the university.

So we started to run a series of pilots across a set of courses. We did that over the course of a year, starting about 18 months ago, and we looked at every aspect of academic support for students and how this might change it. If we see that a student is disengaging from their studies -- and we can now see that about a month or two earlier than we otherwise would have been able to -- we can have a very early conversation about what the problem might be.

In more than 90 percent of the cases that we have seen so far, those early conversations result in an immediate upturn in student engagement. We’ve seen some very real tangible results and we saw those very early on.

We expected that it would take us a considerable amount of time to demonstrate that the system would give us value at an institutional level, but actually it didn't. It took only about six months of the pilot period -- for which we had set aside a year -- to get to a position where we were convinced, as an institution, that we should roll out across the whole university. We did that at the beginning of this academic year, about six months earlier than we thought. So we're further ahead than we expected to be.

We've now had another year to think about what good practice looks like, and we're seeing academic tutors starting to share good practice among themselves. So there is a good conversation going on there. A much, much more positive relationship between those academic tutors and the students is being reported by both the students and the tutors, and we see that as very positive.

Importantly, there is also a dialogue going on between students themselves. We've started to see students competing with each other to be the best engaged in their course. That’s got to be a good thing.

Gardner: And how would they measure that? Is there some sort of a dashboard or visualization that you can provide to the students, as well as perhaps other vested interests in the ecosystem, so that they can better know where they are, where they stand?

Day: There absolutely is. The system provides a dashboard that gives a very simple visualization. It's two lines on a chart. One of those lines is the average engagement of the cohort on a course-by-course basis. The other line is the individual student's engagement compared with that average -- in other words, comparing them with their peers on the course.

We worked very hard to make that visualization simple, because we wanted that to be consistent. It needed to be something that prompted a conversation between tutors and students, and tutors sharing best practice with other tutors. It's a very simple visualization.
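For readers who want to picture that chart, here is a minimal matplotlib sketch of the two-line visualization described above. The data values are invented purely to show the shape of the plot, not real engagement scores.

```python
# Illustrative only: the two-line engagement chart described above,
# with made-up weekly values for one student and the course cohort.
import matplotlib.pyplot as plt

weeks = list(range(1, 13))
cohort_avg = [40, 42, 45, 44, 46, 48, 47, 49, 50, 51, 50, 52]   # invented values
student    = [38, 40, 41, 35, 30, 28, 33, 40, 44, 47, 49, 51]   # invented values

plt.plot(weeks, cohort_avg, label="Course average engagement")
plt.plot(weeks, student, label="Individual student engagement")
plt.xlabel("Week of term")
plt.ylabel("Engagement score")
plt.title("Student vs. cohort engagement")
plt.legend()
plt.show()
```

The dip and recovery in the student line is the kind of pattern that prompts the early tutor conversation Day describes.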

Sharing the vision

Gardner: Mike, it strikes me that other institutions of higher learning might want to take a page from what you've done. Is there some way of you sharing this or packaging it in some way, maybe even putting your stamp and name and brand on it? Have you been in discussions with other universities or higher education organizations that might want to replicate what you’ve done?

Day: Yes, we have. We're working with our supplier SolutionPath, who have now created a model that can be replicated in other universities. It starts with a readiness exercise, because this is mostly not about technology; it's about how ready you are, as an organization, to address things like privacy and ethics. We've worked very closely with them on that.

We've already spoken to two dozen universities about how they might adopt something similar, not necessarily exactly the same solution. We've also done some work across the sector in the UK with the Joint Information Systems Committee, which looks at technology across all 150 universities in the UK.
Gardner: Before we close out, I'm curious. When you've got the apparatus and the culture in the organization to look more discretely at data and draw correlations about things like student attainment and activities, it seems to me that we're only in the opening stages of what could be a much more data-driven approach to higher education. Where might this go next?

Day: There's no doubt at all that this solution has worked in its own right, but what it has actually formed is a kind of bridgehead, which will allow us to take the principles and the approach we took with this specific solution and apply them to other aspects of the university's business.

For example, we might be able to start to look at which students might succeed on different courses across the university, perhaps regardless of traditional ways of recruiting students through their secondary school qualifications. It's about looking at what other information might be a good indicator of success in a course.

We could start looking at the other end of the spectrum. How do students make their way into the world of work? What kinds of jobs do they get? Is there something about linking, right at the beginning of a student's university career -- perhaps even at the application stage -- to the kinds of careers they might succeed in, and trying to advise early on the sorts of things a student might want to get involved and engaged with? There's a whole raft of things that we can start to think about.

Research is another area where we might be able to think about how data helps us, what kind of research might we best be able to engage in, and so on and so forth.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.
 

You may also be interested in: