Tuesday, November 1, 2011

Cloud procurement services give OSU Medical Center wins on buyer savings and process productivity

Listen to the podcast. Find it on iTunes/iPod. Read a full transcript or download a copy. Sponsor: Ariba.

The latest BriefingsDirect case study interview explores how large businesses are using cloud commerce and automated sourcing and procurement to dramatically improve how they modernize and manage their buying processes.

In doing so, we'll see how they improve cash management and gain an analytical edge to constantly improving these vital processes. The Ohio State University Medical Center (OSUMC), for example, has moved its procurement activities to the cloud and adopted strategic sourcing, dramatically increasing efficiency across the purchases managed.

Through our discussion with Karen Sherrill, Senior Commodity Manager at The Ohio State University Medical Center in Columbus, learn more about how a large medical enterprise is conducting its business better through collaborative cloud commerce. The moderator is Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: Ariba is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: What did you need to change in how you sourced and procured that led you to adopt more automation and more of these cloud-based services?

Sherrill: The Medical Center is a government agency. So as you can imagine, it's tied down with a lot of bureaucracy and paper. But as we moved into 2010, we were under a lot of pressure to do things faster and more efficiently. The only way to do that was with some type of technology that would allow our current staffing levels to support the needed growth and serve our customers faster.

We were processing about 422 bids a year, which equated to about the same number of contracts. We had only 26 buyers to support a business that was projected to grow significantly with our medical center expansion and the growth of our ambulatory site.

One of the limitations we were running up against was that our resources were spending a lot of time providing technical support for the old legacy system. In addition, the legacy system supported only static documents. So we were posting PDFs, and suppliers were mailing in bids, which was not an efficient way to analyze them on the back end.

We were also manually tracking supplier contract terms, conditions, and modifications, and that added a significant amount of time to executing the final agreement.

We had no repository for contracts. So when people were seeking agreements, we were looking on shared drives and people's desks and having to go to external storage. Contracts were expiring without our knowledge, and we were not renewing them in a timely fashion.

No framework

In addition, we had counterparts with the university and we weren't able to collaborate with them and have visibility into what bids they were doing and what bids we were doing. So there was no framework to support that type of collaboration that was essential for us to be successful going forward.

Those were some of the limitations and concerns we were trying to address, and we implemented the technology to help relieve them.

Gardner: What were the requirements that you had in mind when you were seeking to improve on this situation?

Sherrill: One of the requirements was that it had to be an automated technology and it had to be easy to use. The provider of the technology would need to supply the technical support, primarily not for the buyers but for the suppliers, who would be interacting with the system and would need training and guidance to navigate it in order to submit an effective bid response.

Gardner: Tell me a bit about The Ohio State University Medical Center.

Sherrill: The Ohio State University Medical Center is located in Columbus, Ohio. We consist of five hospitals. The main hospitals are the James Cancer Hospital and the Ohio State University Hospital on the main campus, and then we have Ohio State University Medical Center Hospital East.

We have about 58,000 inpatient admissions and about one million outpatient visits, and we do about 15,000 inpatient surgeries and about 19,000 outpatient surgeries, with about 120,137 visits to our emergency department.

We have about 450 beds, and that does not include our one-million-square-foot expansion project, which is currently under construction and expected to be completed in fiscal year 2014.

Divided into specialization

We're divided into two sections. I head up indirect products and services, which means anything that's non-clinical or not related to patient care. So that would be marketing, advertising, linen, landscaping, anything that's not directly related to patient care.

I have a counterpart who handles all the clinical types of purchases, which would be mammography equipment, needles, drug-eluting stents, all the medical-related supplies and services.

Gardner: How did the journey begin, and where have you come in just a fairly short amount of time?

Sherrill: Initially they did an RFP for the e-sourcing technology. Ariba was selected. That was done prior to my coming on board, but when I came on board they said, "This is the technology we have selected and you need to implement this."

Being a new person in the organization, I didn't really know how the organization operated, because I came from the private sector and this was a public-sector entity. I was new, and no one knew who I was.

The first thing for me was to build relationships internally, and I did that by just taking the time to stop and listen to the way they currently did business, and why they did it that way. Sometimes they were doing things for a good reason, or maybe there was a legal reason. Other things they were doing because they had done them that way for 50 years, and maybe we didn't need to do it that way any longer. So that was the first step.

The second step was that leadership had agreed to make a significant investment in this particular technology, and my only assignment was to implement it. So we created this analogy that the locomotive was out of the gate. The financial investment had been made. Then you have Karen Sherrill, who is trying to prove something and whose only assignment is to get this implemented.

The rule was that you have three choices: you can get on the locomotive and enjoy the ride, you can step aside and let the train pass you by, or you can try to get in front of the train and stop the progress.

That was the analogy or the vision that we put in place, and they could decide where they wanted to fall. We had individuals who chose three different areas.

Enjoy the ride

Most of them decided to get on the train and enjoy the ride, because they were really itching for change and becoming more efficient. Others were nonbelievers, saying that this is never going to stick, so I am going to let the train pass me by. And there were a few who were afraid of technology changing, afraid of the processes changing, afraid of the shift in power as a result of the processes changing, and they feared visibility. So they tried to stop the train.

Because we had leadership support, whenever we ran across those individuals, we could run it up the chain. We had an endearing term for it: they would get the smackdown, or be smacked back in line, so the train could continue on.

We had to do that several times, but in the end we knew what the destination was and the locomotive was going to get there. We were not going to be one of those organizations that bought the technology, and when the subscription expired, we had not had one bid go through the tool.

One of my personal objectives was that a year was not going to go by without one bid going through the tool. One year sounds like a long time, but they didn’t have any processes documented. So we had to step back a few months to document the processes in order to effectively communicate how we wanted this technology to be configured.

But it was not going to take more than a year to get one bid through the tool. Then, when we had the one success, the goal was to build upon that success and continue to get more and more momentum. That’s how we were able to drive this locomotive to conclusion very quickly.

One of the key reasons we selected the cloud, on-demand version versus something hosted behind our firewall -- or even integrating it with our systems behind our firewall -- is that those options would have required internal IT support and approval, and that would have delayed the implementation significantly.

We consciously made the decision not to integrate. There was really no need to do it at this point. And they wanted to see if we would use the technology. Why invest in integrating something that’s never going to be adopted?

So one of the benefits of using the on-demand solution in the cloud is that you don’t have to build that interface behind a firewall and you do not have to use internal resources to implement or execute the system. So that was one benefit to using the cloud solution.

Assisting suppliers

One of the problems we were having was that on the old system the buyers were providing the suppliers' technical support: suppliers couldn't remember their email addresses, they were registered under the wrong category, they didn't receive the bid, and the buyers were constantly having to interface with them.

When you go to a more advanced technology approach, suppliers are going to need even more assistance navigating the site or getting their passwords reset. With the on-demand technology we are able to utilize the Ariba help desk. Most of our tickets are supplier-based, and within 12 months of implementing, over 700 tickets went to the help desk, all of them transparent to me and the buyers.

That's a benefit that is definitely required when you go to a more advanced technology for processing bids. The suppliers are going to need more support, but you can't have resources that are supposed to be focused on strategic sourcing spending all their time helping suppliers submit a response to a bid.

Gardner: What are the higher-level benefits around strategic sourcing?

Sherrill: There was an additional benefit from using Ariba Discovery. In our legacy system, there were only about 5,000 suppliers. As a public entity we're required to do public notifications of our bid opportunities, and with only 5,000 suppliers in our legacy base, the competition could be somewhat limited, because only the suppliers who knew about our site and had registered there received our bid notifications.

When we transitioned to Discovery, there were about 350,000 suppliers on Discovery. It's grown to over a half-million at this point. So we've substantially increased the number of suppliers that are aware of our bid opportunities.

When you increase the number of suppliers aware of your bid opportunities, the number of suppliers that participate increases. When you have more suppliers participating, you have increased competition, which then lowers pricing. And we found that to be the case on two high-profile projects.

One project was related to linen. There was a supplier that we were aware of but we couldn’t find their contact information. We put the public bid notification on Discovery, and the supplier popped up. They participated in our bid.

They didn’t win, but they were strong competition, and the incumbent felt the need to reduce their pricing by about 16 percent, which resulted in a half-million dollar savings in year one and year two of the agreement. So if we didn’t have that strong competition, that saving probably would not have been generated.
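[Editor's note: As a rough illustration of the arithmetic here, the interview gives only the 16 percent reduction and the roughly half-million dollars saved across the first two years; the annual contract spend below is back-calculated from those two figures, not a reported number.]

```python
# Hypothetical back-calculation -- only the 16% cut and the ~$500K
# two-year savings come from the interview.
reduction = 0.16
two_year_savings = 500_000
annual_spend = two_year_savings / (2 * reduction)  # roughly $1.56M per year
print(f"Implied annual linen spend: ${annual_spend:,.0f}")
print(f"Savings per year at a 16% reduction: ${annual_spend * reduction:,.0f}")
```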

Driving down prices

The dynamic at work there was that you had the incumbent, who would have a lot of explaining to do if they lost the business, competing against a new company that was being told, "Do whatever you have to do to get the business." That dynamic drove the pricing down. So that was one benefit.

In several of our categories, we're getting a lot more suppliers participating and finding suppliers that can source things the buyers may not be subject matter experts in. Because we're joined with the university, they're utilizing the system too, and sometimes they're being asked to source things like hay or dental strips. They're not subject matter experts in that. Where are they going to find suppliers?

Just putting a bid notification out on Discovery gives buyers, who are more generalists, the tools necessary to find suppliers that can compete on categories they're not subject matter experts in, generating the competition that reduces price and gets additional value from the supplier base.

Gardner: You mentioned earlier that one of the requirements you had was to automate and create more of a solid data repository. How has being able to satisfy that requirement benefited you as a business?

Sherrill: It's allowed us to standardize on the Seven-Step Strategic Sourcing Process, and within that process there are certain documents that have to be completed. We get audited on that documentation being completed. It's basically around: Did we follow the public bidding guidelines? Did we do a public bid? Was the lowest and most responsive bidder selected? And what was the documentation, the score sheet, that supports that contract award?

Now, when an auditor comes in and they select bids, we can run a report of all the bids that were run during a particular period of time. They can just select the bids that they want, and the buyers can just do a search and pull the documents that were requested.

Before, they had to go to some file cabinet and find the bid number. What if it was misfiled? Now we can obtain audit documents much more quickly, and it also standardized the process to make sure the documents are completed in order to close the project out. So from a visibility standpoint, that has benefited us.

From the reporting aspect, we also can see how many bids are being run by which people and how many contracts are being executed, so we get visibility into workload. Someone may be doing x number of bids, but not much contract work; someone else may not be doing many bids, but is doing a lot of contracts; and someone else may not be doing any bids or any contracts -- and that's an issue. So we can run a report and keep track of productivity, where we may have to shift projects or resources in order to better support our customers. And we can do that by just running a quick report.
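[Editor's note: A minimal sketch of the kind of per-buyer workload report described above, assuming sourcing events and contracts have been exported to a CSV file. The file name and column names are hypothetical, and this is not Ariba's reporting interface.]

```python
import csv
from collections import Counter

# Count bids and contracts per buyer from an exported event list.
# "sourcing_events.csv", "buyer", and "event_type" are assumed names.
bids, contracts = Counter(), Counter()
with open("sourcing_events.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["event_type"] == "bid":
            bids[row["buyer"]] += 1
        elif row["event_type"] == "contract":
            contracts[row["buyer"]] += 1

for buyer in sorted(set(bids) | set(contracts)):
    print(f"{buyer}: {bids[buyer]} bids, {contracts[buyer]} contracts")
```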

Savings is another big initiative. Everybody wants to know how much savings you generated over what period of time. We used to have an Excel spreadsheet where people were loading in their projects. A high-level executive would say, "I want to know what the savings were," and there would be this big initiative of going into the spreadsheet to update your savings. Of course, only one person could be in the spreadsheet at a time, and the accuracy of it was always questionable.

People feel more comfortable

Now we have the technology where people load up their projects and load up their savings. We run a monthly report on a schedule, and people are required to review their savings numbers on a monthly basis and make any tweaks or adjustments. When we get those requests for savings numbers, the accuracy is a lot higher, people feel more comfortable with it, and people can update their savings at the same time.

So if we have to do it on a quick turnaround, like in two or three hours, you can say, "Go and update your projects." The sourcing analysts can just pull down that report, and we've got our numbers. That is a beautiful thing as far as visibility into savings, tracking the savings, and building more confidence in the accuracy of the numbers we report.
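[Editor's note: Savings roll-ups like the one described are commonly computed as baseline price minus awarded price, multiplied by volume. The sketch below uses that common formula with made-up projects and numbers; it is not the Medical Center's actual methodology or data.]

```python
# Made-up projects illustrating a baseline-vs-awarded savings roll-up.
projects = [
    {"name": "Linen services", "baseline": 10.00, "awarded": 8.40, "annual_volume": 300_000},
    {"name": "Landscaping",    "baseline": 95.00, "awarded": 90.00, "annual_volume": 1_200},
]

def annual_savings(p):
    # Savings = (baseline unit price - awarded unit price) * annual volume
    return (p["baseline"] - p["awarded"]) * p["annual_volume"]

for p in projects:
    print(f"{p['name']}: ${annual_savings(p):,.0f}")
print(f"Total reported savings: ${sum(annual_savings(p) for p in projects):,.0f}")
```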

... Before we implemented Ariba, the construction category had a lot of bids, and that particular buyer was getting a lot of complaints that he wasn't turning the bids around as fast as the end users would like and that the analysis was taking too long. They wanted to see the bid results. They wanted to see them analyzed quickly, and they were escalating these complaints.

We implemented Ariba. So now he's able to execute his bids faster. Once he loaded one bid up, if another one was similar, he could copy and just tweak, which increased the efficiency on his end.

And because the suppliers were inputting their responses online, it was easier for him to export and do the analysis much quicker, but that particular user base also embraced the technology. We've got them to the point that when the bids come in, they can go in and pull them down themselves and do the analysis and decide who they want to shortlist.

That particular group is much happier. We're not getting any complaints, and the projects are moving off of his plate much faster. In fact, we used to have a lot of projects that were past due. In the last report we ran, everything was due out in the future. So we're starting to get ahead of our strategic sourcing pipeline.

Gardner: What would you have in terms of suggestions or recommendations for folks who are examining moving to a cloud-based procurement and strategic sourcing service or activity? Do you have any 20/20 hindsight, something that you could offer in terms of what you learned along the way?

Sherrill: The first thing I would recommend, and most companies may have it, is that you need to have your processes documented. When you buy one of these subscriptions, the clock starts ticking. If you have to stop and document your processes, then you're not using the tool, but you are paying for it.

Document your processes

Have your processes documented, so as soon as your subscription starts, you can have the tool configured and you don't have to use eight months, like we had to do, to figure out how we wanted the tool to be configured.

They also need to keep in mind that the technology is just a tool and it requires people, processes, and the technology to be successful. I like to say that you have to have brains behind the tool or it's not going to work for you.

Some of the people here had the skill set to embrace the tool and its utilization, but we have other people who don't have the correct skill set to utilize the tool effectively. That's kind of a constraint if you don't have the authority to transform your organization and right-size it with people who have the correct skill sets.

Another important part of implementation is that you've got to have support from the top, people who will back you up when someone is trying to put up a wall or be an obstacle to the implementation.

The way you get that leadership support is by having an individual who has built credibility that the transformation will actually work. The person leading the process needs to have the right skill set, and from what we found, that was a person with excellent project management skills.

Also key is that they need to be able to do strategic sourcing. You have more credibility if you're actually using the tool and actually using the processes that you are implementing.

Number three, the person has to have excellent communication and interpersonal skills to deal with those people who don't want to go along with the process, as well as team building skills. If that particular leader has all those skills, it allows the opportunity for them to do sourcing events and lead by example.

One of the things that made Ohio State University’s implementation successful is that my projects were the ones that went through the tool first. I was able to identify any problems and reconcile them immediately, so that the buyers that were putting in their bids behind mine would never run across those problems.

If there were problems I encountered that I wasn't able to fix, and a buyer identified them, I could say, "I'm aware of that problem, and here is your workaround," so that there was no "aha" or "gotcha" moment they could use to try to stop the implementation.

The second component to that is that you need to have early adopters. These were people who wanted change, who wanted their sourcing projects to be the first ones to go through the tool. That was valuable because, while I as the project leader will say this is great -- of course I'm going to say it's great, because I am responsible for rolling it out -- I had early adopters who also agreed, who were fans, who were vocal about it, and who also had successes. That was very instrumental to rolling the project out successfully.

Adding credibility

Then, once you've got several people who are actually sourcing professionals using the tool, it adds credibility, and everyone else has no choice but to follow along.

The other important thing is that you have to shut down the old way of processing bids. We had a go-live date, and we had individuals who decided to use the old way all the way up until the last day, but when we came to the go-live date, there were no more bids going out the old way. That has led to us having 100 percent compliance since we went live in May of 2010.
Listen to the podcast. Find it on iTunes/iPod. Read a full transcript or download a copy. Sponsor: Ariba.


Monday, October 31, 2011

Virtualized desktops spur use of 'bring your own device' in schools, allowing always-on access to educational resources

Listen to the podcast. Find it on iTunes/iPod. Read a full transcript or download a copy. Sponsor: VMware.

Educators are using desktop virtualization in innovative ways to enable "bring your own device" (BYOD) benefits for faculty and students. This latest BriefingsDirect interview explores how one IT organization has made the leap to allowing young users to choose their own client devices to gain access to all the work or learning applications and data they need -- safely, securely, and with high performance.

The nice thing about BYOD is that you can essentially extend what you do on-premises or on a local area network (LAN) -- like a school campus -- to anywhere: to your home, to your travels, 24×7.

The Avon Community School Corp. in Avon, Indiana has been experimenting with BYOD and desktop virtualization, and has recently embarked on a wider deployment of both for this school year.

To get their story, Dana Gardner, Principal Analyst at Interarbor Solutions, interviewed Jason Brames, Assistant Director of Technology, and Jason Lantz, Network Services Team Leader, both at Avon Community School. [Disclosure: VMware is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: You've been successful with server virtualization, but what made it important for you now to extend virtualization to the desktop?

Brames: One of the things important to our district is something we noticed when doing an assessment of our infrastructure: We have aging endpoints. We had a need to extend the refresh cycle of our desktop computers from what is typical -- for a lot of school districts that's about a five-year refresh -- to getting anywhere from 7 to 10, maybe even 12, years out of a desktop computer.

By going to a thin client model and connecting those machines to a virtual desktop, we're able to achieve high quality results for our end users, while still giving them computing power that they need and allowing us to have the cost savings by negating the need to purchase new equipment every five years.

By going with a virtual environment, the problem that we were looking to solve was really just that -- how do we provide an extended refresh cycle for all of our devices?
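[Editor's note: A back-of-the-envelope look at why the longer refresh cycle matters. Only the roughly 5,500 computers (mentioned below) and the five-year versus 7-to-12-year cycles come from the interview; the per-device prices are placeholders, and the comparison ignores the cost of the VDI servers, storage, and licensing.]

```python
# Placeholder prices; only the device count and cycle lengths come from the interview.
devices = 5_500
fat_client_cost, fat_cycle_years = 700, 5      # assumed PC price, typical refresh
thin_client_cost, thin_cycle_years = 350, 10   # assumed thin-client price, extended refresh

fat_annualized = devices * fat_client_cost / fat_cycle_years
thin_annualized = devices * thin_client_cost / thin_cycle_years
print(f"Annualized endpoint cost, 5-year PC refresh: ${fat_annualized:,.0f}")
print(f"Annualized endpoint cost, 10-year thin-client refresh: ${thin_annualized:,.0f}")
# Note: excludes back-end VDI costs (servers, storage, View licensing).
```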

Supporting 5,500 computers

We're located about 12 miles west of Indianapolis, Indiana, and we have 13 instructional buildings. We're a pre-K-to-12 institution and we have approximately 8,700 students, nearing 10,000 end-users in total. We’re currently supporting about 5,500 computers in our district.

... We currently have 400 View desktop licenses. We're seeing utilization of that license pool of 20-25 percent right now, and the primary reason is that we're really just beginning this phase, with this being our first year of virtual desktop rollout. We're really in the second year overall, but the first year of more widespread use.

We're training teachers on how to adequately and effectively use this technology in their classrooms with kids. It's been very well received and is being adopted very well in our classrooms, because people are seeing that we were able to improve the computing experience for them.

Lantz: With that many devices, getting out there and installing software, even if it's a local push or what have you, creates a big management overhead. By using VMware View and having that in our data center, where we can control it, the ability to have a golden image that you can then push out to a number of devices has made it a lot easier to transition to this type of model.

We’re finding that we can get applications out quicker with more quality control, as far as knowing exactly what’s going to happen inside of the virtual machine (VM) when you run that application. So that’s been a big help.

A lot of our applications are Web-based, such as Education City, which is heavy on graphics and video. And we found that we're still able to run those in our View environment and not have issues.

Gardner: What are you running in terms of servers? What is your desktop virtualization platform, and what is it that allows you to move on this so far?

Lantz: On the server side, we're running VMware vSphere 4.1. On the desktop side, we're running View 4.6. Currently, in what we call our server production environment, we have three servers, and we're adding a fourth shortly. On the View side of things, we currently have two servers, and we're getting two more in the next month or so. So we'll have a total of four.

Access from anywhere

Gardner: Now one of the nice things about the desktop virtualization and this BYOD is it allows people to access these activities more freely anywhere. How do you manage to take what was once confined to the school network and allow the students and other folks in your community to do what they need to do, regardless of where they are, regardless of the device?

Brames: We’re a fairly affluent community. We have kids who were requesting to bring in their own devices. We felt as though encouraging that model in our district was something that would help students continue to use computers that were familiar to them and help us realize some cost savings long term.

So by connecting to virtual desktops in our environment, they get a familiar resource while they're within our walls in the school district: access to all of their shared drives, network drives, network applications, all of the typical resources they would expect when sitting down in front of a school-owned piece of equipment. And they're seeing all of those things available on their own devices.

... A typical classroom for us contains four student computing stations, as well as, depending upon the building size, three to five labs available. We’re not focusing our desktop virtualization on those labs. We’re focusing on the classroom computing stations right now. Potentially, we'll also be in labs, as we go into the future.

Then, in addition to those student computing stations, our administrative team -- principals and district-level administrators -- are beginning to use virtual desktops while they're outside of the district and growing familiar with that, so that whenever we enter the phase where we allow our students to access from outside of our network, we have that support structure in place.

... We’re also seeing an influx of more mobile-type devices such as tablets and even smartphones and things like that. The percentage of our users that are using tablets and smartphones right now for powerful computing or their primary devices is fairly low. However, we anticipate over time that the variety of devices we’ll have connecting to our network because of virtual desktops is going to increase.

Gardner: How is that hand-off happening? Are you able to provide a unified experience yet?

Lantz: That’s part of phase two of our approach that we’re implementing right now. We’ve gotten it out into the classrooms to get the students familiar with it, so that they understand how to use it. The next step in that process is to allow them to use this at home.

We currently have administrators who are using it in this fashion. They have tablets and, using the View client, they connect in and get the same experience whether they're in school or out of school.

So we’re to that point. Now that our administrators understand the benefits, now that our teachers have seen it in the classrooms, it’s a matter of getting it out there to the community.

One of the other ways we're making it available is at our public library, where we have a set of machines that students can access as well, because, as you know, not every student has access to high-speed Internet. They are able to go to the library, check out these machines, and get into the network that way. Those are some of the ways that we're trying to bridge that gap.

Huge win-win

Technology Integration Group has resources that allow us to see what other school districts are doing and some of the things they've run into. Then they bring that back here, and we can discuss how we want to roll it out in our environment. They've been very good at giving us ideas about what has worked with other organizations and what hasn't. That's where they've come in. They've really helped us understand how we can best use this in our environment.

Gardner: I often hear from organizations, when they move to desktop virtualization, that there are some impacts on things like network or storage that they didn’t fully anticipate. How has that worked for you? How has this roll out movement towards increased desktop virtualization impacted you in terms of what you needed to do with your overall infrastructure?

Lantz: Luckily for us, we've had a lot of growth in the last two to three years, which has allowed us to get some newer equipment. So our network infrastructure is very sound. We didn't run into a lot of the issues that you commonly would with network bandwidth and things like that.

On the storage side, we did increase our storage. We went with an EqualLogic box for that, but with View, it doesn't take up a ton of storage space, with linked clones and things like that. So we haven't seen a huge impact there. As we get further into this, storage requirements will get greater, but currently that hasn't been a big issue for us.
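[Editor's note: A rough illustration of why linked clones keep storage growth modest. VMware View linked clones share a replica of the golden image and store only per-desktop delta disks; all of the sizes below are assumptions, since the interview doesn't give storage figures.]

```python
desktops = 400            # matches the district's View license count
full_clone_gb = 30        # assumed size of a full desktop image
replica_gb = 30           # one shared replica of the golden image (assumed)
delta_gb_per_desktop = 2  # assumed per-desktop delta disk growth

full_clones_total = desktops * full_clone_gb
linked_clones_total = replica_gb + desktops * delta_gb_per_desktop
print(f"Full clones:   ~{full_clones_total:,} GB")
print(f"Linked clones: ~{linked_clones_total:,} GB")
```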

Gardner: On the flip side of that, a lot of organizations I talk to who have moved to desktop virtualization gained some benefits on things like backup, disaster recovery, security, and control over data and assets, and even compliance and regulatory issues. Has there been an upside that you could point to in terms of having more centralized control of desktop content and assets?

Difficult to monitor

Lantz: When you start talking about students bringing in their own devices, it's difficult to monitor what's on that personally owned device.

We found that by giving them a View desktop, we know what's in our environment and we know what that virtual machine has. That allows us to have more secure access for those students without compromising what's on that student’s machine, or what you may not know about what's on that student’s machine. That’s been a big benefit for us allowing students to bring in their own devices.

Gardner: Do we have any metrics of success, either in business or, in this case, learning terms, and/or IT cost savings? What has this done for you? I know it's a little early, but what are the early results?

Brames: You did mention that it is a little bit early, but we believe that as we begin using virtual desktops more so in our environment, one of the major cost savings that we’re going to see as a result is licensing cost for unique learning applications.

Typically in our district, we would have purchased x number of licenses for each one of our instructional buildings, because they needed to use the application with students in the classroom. They may have a certain number of students who need access to that application, for example, but they're not all accessing it at the same time of day, or it's on a fat client, a physical machine somewhere in the building, and it's difficult for students to get access to it.

By creating these pools of machines that have specialty software on them we’re able to significantly reduce the number of titles we need to license for certain learning applications or certain applications that improve efficiencies for teachers and for students.
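[Editor's note: A simple illustration of the pooled-licensing point, with hypothetical counts; the interview does not give license numbers. The idea is that a shared pool only needs to be sized for peak concurrent use rather than for every building.]

```python
buildings = 13               # the district's instructional buildings
per_building_licenses = 25   # assumed licenses bought per building under the old model
peak_concurrent_users = 60   # assumed district-wide peak concurrency in a shared View pool

print(f"Per-building licensing: {buildings * per_building_licenses} licenses")
print(f"Pooled licensing:       {peak_concurrent_users} licenses")
```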

So that's one area in which we know we're going to see a significant return on our investment. We already talked about extending the life of the endpoints, and with energy savings, I think we can prove some results there as well. Anything to add, Jason?

Lantz: One of the ones that's hard to calculate is, as you mentioned, the maintenance and management piece of this technology. As we all know, you're doing more with less, and this really gives you the ability to do that. How you measure that is sometimes difficult, but there are definitely cost savings there as well.

Gardner: I know budgets are really important in just about any school environment. Do you have any sense of the delta there between what it would be if you stuck to traditional cost structures, traditional licensing, fat client, to get to that one to one ratio, compared to what you’re going to be able to do over time with this virtualized approach?

Brames: Our Advanced Learning Center is the school building that has primarily senior students and advanced placement students. There are about 600 students that attend there.

Last year, 75 percent of those students were using school-owned equipment and 25 percent of them were bringing their own laptops to school. This year, what we have seen is that 43 percent of our students are beginning to bring their own devices to connect to our network and have access to network resources.

If that trend continues, which we think it will, we’ll be looking at certainly over 50 percent next year, hopefully approaching 60-65 percent of our students bringing their own devices. When you consider that that is approximately 400 devices that the school district did not need to invest in, that’s a significant saving for us.
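[Editor's note: The "approximately 400 devices" figure appears to be a projection from the roughly 600 students at the Advanced Learning Center and the anticipated 60-65 percent BYOD rate; the short calculation below simply makes that arithmetic explicit.]

```python
students = 600  # approximate enrollment at the Advanced Learning Center
for share in (0.60, 0.65):
    print(f"{share:.0%} BYOD -> ~{round(students * share)} student-owned devices")
```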

Gardner: If you could do this over again, a little bit of 20/20 hindsight, what might you want to tell others in terms of being prepared?

Lantz: One thing that's important is that when you explain this to users, the words "virtual desktop" can be a little confusing to teachers and end users. What I've done is take the approach that it's no different than having a regular machine, and you can set it up so that it looks exactly the same.

No real difference

When you start talking with end users about virtual, it gets into, "Okay, so it's running back here, but what problems am I going to encounter?" and those sorts of things. Getting end users to realize that there really isn't a difference between a virtual desktop and a real desktop has been important for getting them on board and making them understand that it's not going to be a huge change for them.
Listen to the podcast. Find it on iTunes/iPod. Read a full transcript or download a copy. Sponsor: VMware.
