Wednesday, April 10, 2013

Data complexity forces need for agnostic tool chain approach for information management, says Dell Software executive

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: Dell Software.

A data dichotomy has changed the face of information management, bringing with it huge new data challenges for businesses to solve.

The dichotomy means that organizations, both large and small, must not only manage all of their internal data to provide intelligence about their businesses; they must also manage the growing reams of external big data that enable them to discover new customers and drive new revenue.

The latest BriefingsDirect software how-to discussion focuses on bringing far higher levels of automation and precision to the task of solving such varied data complexity. By embracing an agnostic, end-to-end tool chain approach to overall data and information management, businesses are both taming complexity and managing data better as a lifecycle.

To gain more insights on where the information management market has been and where it's going, we are joined by Matt Wolken, Executive Director and General Manager for Information Management at Dell Software. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: Dell Software is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: What are the biggest challenges that businesses need to solve now when it comes to data and information management?

Wolken: About 10 or 15 years ago, the problem was that data was sitting in individual databases around the company, either in a database on the backside of an application -- the customer relationship management (CRM) application, the enterprise resource planning (ERP) application -- or in data marts around the company. The challenge was: how do you bring all of this together to create a single, cohesive view of the company?

That was yesterday's problem, and the answer was technology. The technology was a single, large data warehouse. All of the data was moved to it, and you then queried that larger data warehouse where all of the data was for a complete answer about your company.

What we're seeing now is that there are many complexities that have been added to that situation over time. We have different vendor silos with different technologies in them. We have different data types, as the technology industry overall has learned to capture new and different types of data -- textual data, semi-structured data, and unstructured data -- all in addition to the already existing relational data. Now, you have this proliferation of other data types and therefore other databases.

The other thing that we notice is that a lot of data isn't on premise any more. It's not even owned by the company. It's at your software-as-a-service (SaaS) provider for CRM, your SaaS provider for ERP, or your travel or human resources (HR) provider. So data again becomes siloed, not only by vendor and data type, but also by location. This is the complexity of today, as we notice it.

Cohesive view

All of this data is spread about, and the challenge becomes how do you understand and otherwise consume that data or create a cohesive view of your company? Then there is still the additional social data in the form of Twitter or Facebook information that you wouldn't have had in prior years. And it's that environment, and the complexity that comes with it, that we really would like to help customers solve.

Gardner: When it comes to this so-called data dichotomy, is it oversimplified to say it's internal and external, or is there perhaps a better way to categorize these larger sets that organizations need to deal with?

Wolken: There's been a critical change in the way companies go about using data. Some people want to use data for an outcome-based result. This is generally what I would call the line-of-business concern, where the challenge with the data is, how do I derive more revenue from the data source I'm looking at?

What's the business benefit for me examining this data? Is there a new segment I can codify and therefore market to? Is there a campaign that's currently running that is not getting a good response rate, and if so, do I want to switch to another campaign or otherwise improve it midstream to drive more real value in terms of revenue to the company?

That's the more modern aspect of it. All of the prior activities inside business intelligence (BI) -- let's flip those words around and say intelligence about the business -- were really internally focused. How do I get sanctioned data off of approved systems to understand the official company point of view in terms of operations?

That second goal is not a bad goal. That's still a goal that's needed, and IT is still required to create that sanctioned data, that master data, and the approved, official sources of data. But there is this other piece of data, this other outcome that's being warranted by the line of business, which is, how do I go out and use data to derive a better outcome for my business? That's more operationally revenue-oriented, whereas the internal operations are around cost orientation and operations.

So where internal consumption gets executive dashboards off of BI -- intelligence about the business -- the business units themselves are focused on visualization, exploration, and driving new insights.

It's a change in both focus and direction. It sometimes ends up in a conflict between the groups, but it doesn't really have to be that way. At least, we don't think it does. That's something that we try to help people through: How do you get the sanctioned data you need, but also bring in this third-party data and unstructured data to add nuance to what you're seeing about your company?

Gardner: Do traditional technology offerings allow this dichotomy to be joined, or do we need a different way to create these insights across both internal and external information?

Wolken: There are certainly ways to get to anything. But if you keep amending program after program or technology after technology, you end up with something less than the best path, when there might be new and better ways of doing things.

Agnostic tool chain

There are lots of ways to take a data warehouse forward in today's environment: manipulate other forms of data so they can enter a relational data warehouse, or go the other way and put everything into an unstructured environment. But there's also another way to approach things, and that's with an agnostic tool chain.

Tools have existed in the traditional sense for a long time. Generally, a tool is utilized to hide complexity and all of the issues underneath the tool itself. The tool has intelligence to comprehend all of the challenges below it, but it really abstracts that from the user.

We think that instead of buying three or four database types -- a structured database, something that can handle text, a solution that handles semi-structured or unstructured data, or even a high-performance analytical engine -- what if the tool chain abstracts much of that complexity? This means the tools you use every day can comprehend any database type, any data-structure type, and any vendor changes or nuances between platforms.

That's the strategy we’re pursuing at Dell. We’re defining a set of tools -- not the underlying technologies or proliferation of technologies -- but the tools themselves, so that the day-to-day operations are hidden from the complexity of those underlying sources of vendor, data type, and location.

That's how we really came at it -- from a tool-chain perspective, as opposed to deploying additional technologies. We’re looking to enable customers to leverage those technologies for a smoother, more efficient, and more effective operation.

Let's just take data integration as an example. Today I can buy siloed data integration products: one that goes after cloud resources, one that only goes after relational data, another to extract from or load into Hive or Hadoop. But what if one product could do all of that? Rather than buying separate products for separate use cases, what if you just had one?
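The "one tool for all silos" idea Wolken describes can be sketched as a simple connector abstraction -- a hypothetical Python illustration, with stand-in connectors rather than real JDBC or Hive sessions:

```python
from abc import ABC, abstractmethod

class Connector(ABC):
    """One interface, many backends: relational, cloud, Hadoop/Hive."""
    @abstractmethod
    def extract(self, query: str) -> list[dict]:
        ...

class RelationalConnector(Connector):
    def __init__(self, rows):
        self.rows = rows  # stand-in for a real relational session
    def extract(self, query):
        return list(self.rows)

class HiveConnector(Connector):
    def __init__(self, rows):
        self.rows = rows  # stand-in for a HiveQL session
    def extract(self, query):
        return list(self.rows)

def integrate(connectors, query):
    """The 'one tool' view: fan the same request out to every silo."""
    results = []
    for c in connectors:
        results.extend(c.extract(query))
    return results

if __name__ == "__main__":
    crm = RelationalConnector([{"customer": "Acme", "source": "crm"}])
    hive = HiveConnector([{"customer": "Acme", "source": "weblogs"}])
    print(integrate([crm, hive], "SELECT *"))
```

The point of the sketch is only that the caller never sees which silo a row came from; adding a new data type means adding a connector, not retooling the user.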
Gardner: What are the stakes here? What do you get if you do this right?

Institutional knowledge

Wolken: There are a couple of ways we think about it, one of which is institutional knowledge. Previously, if you brought a new tool into your environment to examine a new database type, you would probably hire a person from the outside, because you needed to find that skill set already in the market in order to be productive on day one.

Instead of applying somebody who knows the organization, the data, the functions of the business, you would probably hire the new person from the outside. That's generally retooling your organization.

Or, if you switch vendors, that causes a shift as well. One primary vendor stack is probably a knowledge and domain of one of your employees, and if you switch to another vendor stack or require another vendor stack in your environment, you're probably going to have to retool yet again and find new resources. So that's one aspect of human knowledge and intelligence about the business.

There is a value to sharing. It's a lot harder to share across vendor environments and data environments if the tools can't bridge them. In that case, you have to have third-party ways to bridge those gaps between the tools. If you have sharing that occurs natively in the tool, then you don't have to cross that bridge, you don't have the delay, and you don't have the complexity to get there.

So there is a methodology within the way you run the environment and the way employees collaborate that is also accelerated. We also think that training is something that can benefit from this agnostic approach.

But also, more generally, if you're using the same tools, then challenges like master data management (MDM) become easier to address comprehensively, because the tool chain understands where that master data is coming from, and so on.

You also codify how and where resources are shared. So if you have a person who has to provision data for an analyst, and they are using one tool to reach to relational data, another to reach into another type of data, or a third-party tool to reach into properties and SaaS environments, then you have an ineffective process.

You're reaching across domains and you're not as effective as you would be if you could do that all with one tool chain.

So those are some of the high-level ideas. That's why we think there's value there. If you go back to what existed 10 or 15 years ago, you had one set of staff who used one set of tools against all relational data. It was a construct that worked well then. We just think it needs to be updated to account for the nuances that have come to the fore as technology has progressed and brought about new types of technologies and databases.

Gardner: What are typically some of the business paybacks, and do they outweigh the cost?

Investment cycles

Wolken: It all depends on how you go about it. There are lots of stories about people who go on these long investment cycles into some massive information management strategy change without feeling like they got anything out of it, or at least were productive or paid back the fee.

There's a different strategy that we think can be more effective for organizations, which is to pursue smaller, bite-size chunks of objective action that you know will deliver some concrete benefit to the company. Rather than doing large schemes, start with smaller projects and pursue them incrementally, one at a time -- projects that last a week, so that over a year you have 52 projects, each known to deliver a certain value in a given time period.

Other things we encourage organizations to do deal directly with how you can use data to increase competitiveness. For starters, can you see nuances in the data? Is there a tool that gives you the capability to see something you couldn't see before? So that's more of an analytical or discovery capability.

There's also the capability to simply manage a given data type. If I can see the data and operate on it, I can take advantage of it.

Another thing to think about is what I would call a feedback mechanism, or the time or duration of observation to action. In this case, I'll talk about social sentiment for a moment. If you can create systems that can listen to how your brand is being talked about, how your product is being talked about in the environment of social commentary, then the feedback that you're getting can occur in real time, as the comments are being posted.

Now, you might think you'd get that feedback anyway -- a letter from a customer arriving in the post two weeks later would provide the same information. That's true, but sometimes those two weeks can be a real benefit.

Imagine a marketing campaign that's currently running in the East, with a companion program in the West that's slightly different. Let's say it's a two-week program. It would be nice if, during the first week, you could be listening to social media and find out that the campaign in the West is not performing as well as the one in the East, and then change your investment thesis around the program -- cancel the one that's not performing well and double down on the one that's performing well.

There's a feedback-mechanism gain that also benefits from handling data in a modern way, or using more modern resources to get that feedback. When I say modern resources, generally that's pointing towards unstructured or textual data types. Again, if you can comprehend and understand those within your overall information management strategy, you now have a feedback mechanism that should increase your responsiveness and therefore make your business more competitive as well.
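As a rough illustration of that feedback loop, a few lines of Python can tally campaign sentiment as mentions stream in. The campaigns, keywords, and scoring here are invented for the example -- real social-listening systems are far more sophisticated:

```python
def campaign_pulse(mentions):
    """Tally positive vs. negative mentions per campaign as they stream in."""
    POSITIVE = {"love", "great", "awesome"}
    NEGATIVE = {"hate", "awful", "meh"}
    pulse = {}
    for campaign, text in mentions:
        score = pulse.setdefault(campaign, 0)
        words = set(text.lower().split())
        pulse[campaign] = score + len(words & POSITIVE) - len(words & NEGATIVE)
    return pulse

# A week of (hypothetical) mentions for the East and West campaigns.
feed = [
    ("east", "love the new promo"),
    ("west", "this promo is awful"),
    ("east", "great deal"),
]
print(campaign_pulse(feed))  # → {'east': 2, 'west': -1}
```

With a running tally like this, the decision Wolken describes -- double down on the East campaign, cancel the West one -- can be made mid-program instead of after the mail arrives.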

Gardner: Given that these payoffs could be so substantial, what stands between companies and these feedback benefits?

It's the complexity

Wolken: I think it's complexity of the environment. If you only had relational systems inside your company previously, now you have to go out and understand all of the various systems you can buy, qualify those systems, get pure feedback, have some proofs of concept (POCs) in development, come in and set all these systems up, and that just takes a little bit of time. So the more complexity you invite into your environment, the more challenges you have to deal with.

After that, you have to operate and run it every day. That's the part where we think the tool chain can help. But as far as understanding the environment, having someone who can help you walk through the choices and solutions and come up with one that is best suited to your needs, that’s where we think we can come in as a vendor and add lots of value.

When we go in as a vendor, we look at the customer environment as it was, compare that to what it is today, and work to figure out where the best areas of collaboration can be, where tools can add the most value, and then figure out how and where can we add the most benefit to the user.

What systems are effective? What systems collaborate well? That's something that we have tried to emulate, at least in the tool space. How do you get to an answer? How do you drive there? Those are the questions we're focused on helping customers answer.

For example, if you've never had a data warehouse before, and you are in that stage, then creating your first one is kind of daunting, both from a price perspective, as well as complexity perspective or know-how. The same thing can occur on really any aspect -- textual data, unstructured data, or social sentiment.

Each one of those can appear daunting if you don't have a skill set, or don't have somebody walking you through that process who has done it before. Otherwise, it's trying to put your hands on every bit of data and consume what you can and learning through that process.

Those are some of the things that are really challenging, especially if you're a smaller firm with limited staff and a new demand from the line of business, which wants to go off in a different direction and gain understanding it couldn't get out of existing systems.

How do you go out and attain that knowledge without duplicating the team, finding new vendor tools, adding complexity to your environment, and maybe even adding additional data sources, and therefore more data-storage requirements? Those are some of the major challenges -- complexity, cost, knowledge, and know-how.

Gardner: Why are mid-market organizations now more able to avail themselves of some of these values and benefits than in the past?

Mid-market skills

Wolken: As the products are well-known, there is more trained staff that understands the more common technologies. There are more codified ways of doing things that a business can take advantage of, because there's a large skill set, and most of the employees may already have that skill set as you bring them into the company.

There are also some advantages in the way technologies have advanced over the years. Storage used to be very expensive, and then it got cheaper. Then solid-state drives (SSDs) came along, and those have gotten cheaper as well. There are price-point advantages in the coming years, too.

Dell overall has maintained the approach we started with when Michael Dell began building PCs from standard components in his dorm room to bring the price down. That model of making technology attainable to larger numbers of people has continued throughout Dell's history, and we're continuing it now with our information management software business.

We're constantly thinking about how we can reduce cost and complexity for our customers. One example is what we call Quickstart Data Warehouse. It was designed to democratize the data warehouse by bringing the price and complexity down to a point where many more people can afford their first one.

We worked with our partner Microsoft, as well as Dell's own engineering team, and we qualified the box, the hardware, and the systems to work at peak performance. Then we scripted an upfront install mechanism that gets the system up and running in 45 minutes with little more than entering a couple of IP addresses. You plug the box in, and it comes up in 45 minutes, without your having to know how to stand up, integrate, and qualify hardware and software together for an outcome we call a data warehouse.

Another thing we did was include Boomi, a connector that automatically goes out and connects to the data sources you have. It's the mechanism by which you bring data into the warehouse. And lastly, we included services, in case you had any other questions or problems setting it up.

If you have a limited staff, and you have to go out and qualify new resources and things you don't understand, and then set them up and actually run them, that's a major challenge. We're trying to hit all of those steps, and the associated costs -- time and/or personnel costs -- and remove them as much as we can.

It's one way vendors like Dell are moving to democratize business intelligence further, bringing it to a lower price point than customers are accustomed to and making it more available to firms that either didn't have the luxury of that expertise sitting around the office, or found that the price point was a little too high.

Gardner: You mentioned this concept of the tool chain several times -- being agnostic to the data type, holistic management, complete view, and then of course integrate it. What is it about the tool chain that accomplishes both a comprehensive value, but also allows it to be adopted on a fairly manageable path, rather than all at once?

Wolken: One of the things we find advantageous about entering the market at this point in time is that we're able to look at history, observe how other people have done things over time, and then invest in the market with the realization that maybe something has changed here and maybe a new approach is needed.

Different point of view

Whereas the industry has typically gone down the path where each new technology or advancement requires a new tool, a new product, or a new technology solution, we've been able to stand back and see the need for a different approach. We just have a different point of view, which is that an agnostic tool chain can enable organizations to do more.

So when we look at database tools, as an example, we would want a tool that works against all database types, as opposed to one that works against only a single vendor or type of data.

The other thing that we look at is that if you walk into an average company today, there are already a lot of things lying around the business. A lot of investment has already been made.

We wanted to be able to snap in and work with all of the existing tools. So each of the tools that we've acquired, or created inside the company, was made to step into an existing environment, recognize that other products were already there, and recognize that they probably came from a different vendor or work on a different data type.

That’s core to our strategy. We recognize that people were already facing complexity before we even came into the picture, so we’re focused on figuring out how we snap into what they already have in place, as opposed to a rip-and-replace strategy or a platform strategy that requires all of the components to be replaced or removed in order for the new platform to take its place.

What that means is tools should be agnostic, and they should be able to snap into an environment and work with other tools. Each one of the products in the tool chain we’ve assembled was designed from that point of view.

But beyond that, we’ve also assembled a tool chain in which the entirety of the chain delivers value as a whole. We think that every point where you have agnosticism or every point where you have a tool that can abstract that lower amount of complexity, you have savings.

You have a benefit, whether it’s cost savings, employee productivity, or efficiency, or the ability to keep sanctioned data and a set of tools and systems that comprehend it. The idea being that the entirety of the tool chain provides you with advantages above and beyond what the individual components bring.

Now, we're perfectly happy to help a customer at any point where they have difficulty and at any point where our tools can help them, whether it's at the hardware layer, in the traditional Dell way; at the application layer, considering a data warehouse or otherwise; or at the tool layer. But we feel that as more and more of the portfolio -- the tool chain -- is consumed, more and more efficiency is enabled.

Gardner: It also sounds as if this sets you up for a data and information lifecycle benefits, not just on the business and BI benefits, but also on the IT benefits.

Wolken: One of the problems that you uncover is that there's a lot of data being replicated in a lot of places. One of the advantages that we've put together in the tool chain was to use virtualization as a capability, because you know where data came from and you know that it was sanctioned data. There's no reason to replicate that to disk in another location in the company, if you can just reach into that data source and pull that forward for a data analyst to utilize.

You can virtually represent that data to the user, without creating a new repository for that person, so you're saving on storage and replication costs. If you're looking for efficiency in the lifecycle of data and ways to cut some of those costs, that's something that jumps right out.

Doing that, you also solve the problem of how to make sure that the data that was provisioned was sanctioned. By doing all of these things, by creating a virtual view, then providing that view back to the analyst, you're really solving multiple pieces of the puzzle at the same time. It really enables you to look at it from an information-management point of view.
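The virtualization idea can be sketched, in outline only, as a view object that projects sanctioned columns from a live source at read time instead of copying the data. The class, field names, and sample records here are hypothetical, not Dell's implementation:

```python
class VirtualView:
    """Expose sanctioned source data to an analyst without copying it."""
    def __init__(self, source, columns):
        self.source = source    # live reference to the system of record, not a replica
        self.columns = columns  # only the sanctioned columns are exposed
    def rows(self):
        # Project the sanctioned columns at read time.
        for rec in self.source:
            yield {c: rec[c] for c in self.columns}

# A toy "warehouse" with a sensitive column the analyst should not see.
warehouse = [{"customer": "Acme", "revenue": 1200, "ssn": "xxx"}]
view = VirtualView(warehouse, columns=["customer", "revenue"])
print(list(view.rows()))  # → [{'customer': 'Acme', 'revenue': 1200}]

# The analyst sees updates immediately -- nothing was replicated to disk.
warehouse.append({"customer": "Globex", "revenue": 800, "ssn": "yyy"})
print(len(list(view.rows())))  # → 2
```

Because the view reads through to the source, there is no second copy to store, refresh, or govern -- which is the storage and sanctioning point Wolken makes above.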

Getting started

Gardner: How should enterprises and mid-market firms get started?

Wolken: Most companies aren’t just out there asking how they can get a new tool chain. That's not really the strategy most people are thinking about. What they are asking is how do I get to the next stage of being an intelligent company? How do I improve my maturity in business intelligence? How would I get from Excel spreadsheets without a data warehouse to a data warehouse and centralized intelligence or sanctioned data?

Each one of these challenges comes from a point of view of, how do I improve my environment based on the goals and needs I'm facing? How do I grow up as a company and become more of a data-based company?

Somebody else might be faced with more specific challenges, such as a line of business now asking for Twitter data when there are no systems that comprehend it. That's really the point where you ask, what's going to be my strategy as I grow and otherwise improve my business intelligence environment -- which is morphing every year for most customers?

That's the way most people start: with an existing problem and an objective or a goal inside the company. Generically, over time, the approach to answering it has been to buy a new technology from a new vendor, adding a new silo, and to create a new data mart or data warehouse. But this perpetuates the idea that technology will solve the problem. You end up with more technologies, more vendor tools, more staff, and more replicated data. We think this approach has become dated and inefficient.

But if, as an organization, you can comprehend that maybe there is some complexity that can be removed, while you're making an investment, then you free yourself to start thinking about how you can build a new architecture along the way. It's about incremental improvement as well as tangible improvement for each and every step of the information management process.

So rather than asking somebody to re-architect and rip and replace their tool chain or the way they manage the information lifecycle, I would say you sort of lean into it in a way.

If you're really after a performance metric and you feel like there is a performance issue in an environment, at Dell we have a number of resources that actually benchmark and understand the performance and where bottlenecks are in systems.

Sometimes there’s an issue occurring inside the database environment. Sometimes it's at the integration layer, because integration isn’t happening as well as you think. Sometimes it's at the data warehouse layer, because of the way the data model was set up. Whatever the case, we think there is value in understanding the earlier parts of the chain, because if they’re not performing well, the latter parts of the chain can’t perform either.

And so at each step, we've looked at how you ensure the performance of the data. How do you ensure the performance of the integration environment? How do you ensure the performance of the data warehouse as well? We think that if each component of the tool chain is working as well as it should be, that's when you enable the entirety of your solution implementation to truly deliver value.


Monday, April 8, 2013

nCrypted Cloud adds security and privacy to cloud-based storage services for consumers and enterprises

Boston-based startup nCrypted Cloud recently launched software of the same name designed to address the security and privacy concerns that have emerged with the use of popular cloud-based storage services.

Available in consumer basic, consumer pro, and enterprise editions, nCrypted Cloud encrypts information stored on popular cloud services such as Dropbox, Google Drive and Microsoft’s SkyDrive. The software is as simple to use as the services it works with, says Nick Stamos, the CEO and Co-Founder of nCrypted Cloud, while offering the robustness and controls that enterprise IT departments need.

Stamos says nCrypted Cloud's security and privacy protections fill a glaring gap in cloud storage services today.

“The promise of the cloud is 'put everything in the cloud and it will be available' – but that’s the problem as well as the promise,” says Stamos, who is also principal and founder of The Stamos Group.

Popular cloud-based storage services lack the security and privacy that enterprises require, yet employees are using them anyway -- with the rise of BYOD and mobility, users want access to files anytime, from anywhere. This leaves enterprise IT departments searching for a way to protect corporate information stored in the cloud.

In a related development, last month we reported on OwnCloud, Inc. and its release of the latest version of the ownCloud Community Edition with a number of usability, performance, and integration enhancements. The ownCloud file sync and share software, deployed on-premise, not only offers users greater control, but allows organizations to integrate existing security, storage, monitoring and reporting tools.

Mobile data management solutions have proven too restrictive and inflexible, said Stamos, and corporate policies that prohibit employees from storing and accessing personal and corporate data on the same mobile device are unreasonable to enforce. Enterprise IT needs a solution that users will embrace rather than work around, he says.

“We allow users to apply privacy controls to personal data, as well as corporate data, so that if an employee parts ways with a company he can revoke access to that personal data from a corporate device, and vice versa,” explains Stamos. “That makes it a value proposition that users feel comfortable with.” Meanwhile, enterprise security policies can be used to govern work files and allow for revocation of access if needed.

Enhance, not replace

One key distinction about nCrypted Cloud is that it works with existing cloud-storage services, instead of replacing them.

“We provide the same sort of native user experience … so it’s not disruptive to end users. The last thing the world needed was a new storage provider,” says Stamos. “What people need is to be able to use the Dropbox they love…in the context of it being more secure by just being able to make folders private or share them securely. They can continue to have their data where it is and how it’s organized without being disruptive in any way, shape or form.”

nCrypted Cloud’s persistent client-side encryption ensures that data isn’t exposed, and the software offers comprehensive key-management features to facilitate administration. When a user accesses corporate files from any device, her predefined access policies and sharing status are verified, and keys for her user ID are sent to the device.
Users can easily access and share files in different cloud-based storage services and have a single-pane view of cloud and corporate file repositories.

The client caches keys for offline access to files, and keys can be removed if the access policies change. Users can easily access and share files in different cloud-based storage services and have a single-pane view of cloud and corporate file repositories.
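The flow described above — verify policy, issue a key to the device, cache it for offline use, and purge it on revocation — can be sketched in simplified form. This is an illustrative model only, not nCrypted Cloud's actual implementation; all class and method names here are hypothetical, and real keys would of course be wrapped and transported cryptographically:

```python
import secrets

class KeyServer:
    """Toy policy-checked key distribution service (hypothetical API)."""
    def __init__(self):
        self._keys = {}      # (user_id, folder) -> key material
        self._policies = {}  # (user_id, folder) -> access allowed?

    def grant(self, user_id, folder):
        self._policies[(user_id, folder)] = True
        self._keys[(user_id, folder)] = secrets.token_hex(16)

    def revoke(self, user_id, folder):
        # Revocation flips the policy; the key is never released again.
        self._policies[(user_id, folder)] = False

    def fetch_key(self, user_id, folder):
        # Verify the user's predefined access policy before releasing a key.
        if not self._policies.get((user_id, folder)):
            return None
        return self._keys[(user_id, folder)]

class DeviceClient:
    """Device-side cache: keys kept for offline use, dropped on policy change."""
    def __init__(self, server, user_id):
        self.server = server
        self.user_id = user_id
        self.cache = {}

    def open_folder(self, folder):
        if folder not in self.cache:
            key = self.server.fetch_key(self.user_id, folder)
            if key is None:
                raise PermissionError(f"access to {folder!r} not granted or revoked")
            self.cache[folder] = key
        return self.cache[folder]

    def sync_policies(self, folder):
        # On a policy change, the cached key is purged from the device.
        if self.server.fetch_key(self.user_id, folder) is None:
            self.cache.pop(folder, None)
```

The key design point is that the cached key enables offline access, but any policy sync after revocation removes it, so the employee-parting scenario Stamos describes can be handled without wiping the whole device.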

The basic consumer version of nCrypted Cloud is available for free. The consumer pro version costs $5 per month and includes managed secure sharing, some file auditing, and the capability to manage files stored in different cloud services. The enterprise edition – which enters beta testing next week – will be priced at $10 per month. It includes all of the capabilities of the consumer pro version, as well as enhancements such as multiple identities, centralized provisioning and policy control, and a full audit trail with 30-day archives.

Downloads and more information are available at www.ncryptedcloud.com.

(BriefingsDirect contributor Cara Garretson provided editorial assistance and research on this post. She can be reached on LinkedIn.)


Wednesday, April 3, 2013

On the road to Sydney: The Open Group gets to bottom of the latest in business architecture and enterprise transformation

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: The Open Group.

Later this month, The Open Group’s first conference in Australia will focus on "How Does Enterprise Architecture Transform an Enterprise?"

With special attention devoted to enterprise transformation, speakers and a variety of sessions will place the transformation in the context of such vertical industries as finance, defense, exploration, mining, and minerals.

As a prelude to the event, BriefingsDirect recently interviewed two of the main speakers at the conference -- Hugh Evans, the Chief Executive Officer of Enterprise Architects, a specialist enterprise architecture (EA) firm based in Melbourne, Australia, and Craig Martin, Chief Operations Officer and Chief Architect at Enterprise Architects.

As some background, Hugh is both the founder and CEO at Enterprise Architects. His professional experience blends design and business, having started out in traditional architecture, computer games design, and digital media, before moving into enterprise IT and business transformation.

In 1999, Hugh founded the IT Strategy Architecture Forum, which included chief architects from most of the top 20 companies in Australia. He has also helped found the Australian Architecture Body of Knowledge and the London Architecture Leadership Forum in the UK.

Since starting Enterprise Architects in 2002, Hugh has grown the team to more than 100 people, with offices in Australia, the UK, and the U.S.

With a career spanning more than 20 years, Craig has held executive positions in the communications, high-tech, media, entertainment, and government markets and has long operated as an Enterprise Architect and Chief Consulting Architect.

In 2012, Craig became COO of Enterprise Architects to improve the global scalability of the organization, but he is also a key thought leader for strategy and architecture practices for all their clients and also across the EA field.

Craig has been a strong advocate of finding differentiation in businesses through identifying new mixes of business capabilities in those organizations. He advises that companies that do not optimize how they reassemble their capabilities will struggle, and he also believes that business decision making should be driven by economic lifecycles.

The interview was conducted by Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: What are some of the big problems that businesses are facing that architecture-level solutions can benefit?

Evans: I'll start with the trend in the industry around fast-paced change and disruptive innovation. You'll find that many organizations, many industries, at the moment in the U.S., Australia, and around the world are struggling with the challenges of how to reinvent themselves with an increasing number of interesting and innovative business models coming through.

Evans
For many organizations, this means that they need to wrap their arms around an understanding of their current business activities and what options they've got to leverage their strategic advantages.

We're seeing business architecture as a tool for business model innovation, and on the other side, we're also seeing business architecture as a tool that's being used to better manage risk, compliance, security, and new technology trends around things like cloud, big data, and so on.

Martin: Yes, there is a strong drive within the industry to try and reduce complexity. As organizations are growing, the business stakeholders are confronted with a large amount of information, especially within the architecture space. We're seeing that they're struggling with this complexity and have to make accurate and efficient business decisions on all this information.

Martin
What we are seeing, and based upon what Hugh has already discussed, is that some of those industry drivers are around disruptive business models. For example, we're seeing it with the likes of higher education, the utility space, and financial services space, which are the dominant three.

There is a lot of change occurring in those spaces, and businesses are looking for ways to make them more agile to adapt to that change, and looking towards disciplined architecture and the business-architecture discipline to try and help them in that process.

Gardner: Is there anything about the past 10 or 15 years in business practices that has led to this need for a greater emphasis on that strategic, architectural level of thinking?

Martin: A lot of it has to do with building blocks. We've seen a journey traveled within the architecture disciplines specifically. We call it the commodification of the business, and we've seen that maturity in the IT space. A lot of processes that used to be innovative in our businesses are now becoming utilities, core to the business.

In any Tier 1 organization, a lot of the processes that used to differentiate them are now freely available in a number of vendor platforms, and any of their competitors can acquire those.

Looking for differentiation

So they are looking for that differentiation, the ability to be able to differentiate themselves from their competitors, and away from that sort of utility space. That’s a shift that’s beginning to occur. Because a lot of those IT aspects have become industrialized, that’s also moving up into the business space.

In other words, how can we now take complex mysteries in the business space and codify them? How can we create building blocks for them, so that organizations can effectively work with those building blocks and string them together in different ways to solve more complex business problems?

Evans: EA is now around 30 years old, but the rise in EA has really come from the need for IT systems to interoperate and to create common standards and common understanding within an organization for how an IT estate is going to come together and deliver the right type of business value.

Through the '90s we saw the proliferation of technologies as a result of the extension of distributed computing models and the emergence of the Internet. We've seen now the ubiquity of the Internet and technology across business. The same sort of concepts that ring true in technology architecture extend out into the business, around how the business interoperates with its components.
This type of thinking enables organizations to change more rapidly.

The need to change very fast for business, which is occurring now in the current economy, with the entrepreneurship and the innovation going on, is seeing this type of thinking come to the fore. This type of thinking enables organizations to change more rapidly. The architecture itself won't make the organization change rapidly, but it will provide the appropriate references and enable people to have the right conversations to make that happen.

Business architecture, as well as strategic architecture, is still quite a nascent capability for organizations, and many organizations are really still trying to get a grip on this. The general rule is that organizations don't manage this so well at the moment, but they are looking to improve in this area, because of the obvious, even heuristic, payoffs that come from being better organized.
You end up spending less money, because you're a more efficient organization, and you end up delivering better value to customers, because you're a more effective organization. This efficiency and effectiveness need within organizations is worth the price of investment in this area.

The actual tangible benefits that we're seeing across our customers include a reduced cost of their IT estate.

Meeting profiles

You have improved security and improved compliance, because organizations can see where their capabilities are meeting the various risk and compliance profiles, and you are also seeing organizations bring products to market quicker.

The ability to move through the product management process, bring products to market more rapidly, and respond to customer need more rapidly puts organizations in front and makes them more competitive.

The sorts of industries we're seeing acting in this area include the postal industry, where they are moving from traditional mail to parcels as a result of the move toward online retailing. You're also seeing it in the telco sector and in the banking and finance sector.

In the banking and finance sector, we've also seen a lot of this investment driven by the merger and acquisition (M&A) activity that’s come out of the financial crisis in various countries where we operate. These organizations are getting real value from understanding where the enterprise boundaries are, how they bring the business together, how they better integrate the organizations and acquisitions, and how they better divest.
We're seeing, especially at the strategic level, that the architecture discipline is able to give business decision makers a view into different strategic scenarios.

Martin: We're seeing, especially at the strategic level, that the architecture discipline is able to give business decision makers a view into different strategic scenarios.

For example, where a number of environmental factors and market pressures would have been inputs into a discussion around how to change a business, we're also seeing business decision makers getting a lot of value from running those scenarios through an actual hypothesis of the business model.

For example, they could be considering four or five different strategic scenarios, and what we are seeing is that, using the architecture discipline, it's showing them effectively what those scenarios look like as they cascade through the business. It's showing the impact on capabilities, on people and the approaches and technologies, and the impact on capital expenditures (CAPEX) and operational expenditures (OPEX).

Those views of each of those strategic scenarios allow them to basically pull the trigger on the better strategic scenario to pursue, before they've invested all of their effort and all that analysis only to find, possibly, that it wasn't the right decision in the first place. So that might be referred to as the strategic-enablement piece.

We're also seeing a lot of value for organizations within the portfolio space. We traditionally get questions like, "I have 180 projects out there. Am I doing the right things? Are those the right 180 projects, and are they going to help me achieve the types of CAPEX and OPEX reductions that I am looking for?"

With the architecture discipline, you don't just take a portfolio lens to what's occurring within the business. You take an architectural lens, and you're able to give executives an overview of exactly where the spend is occurring, where the duplication is occurring, and where the loss of cohesion is occurring.

Common problems

A common problem we find, when we go in to do these types of gigs, is the amount of duplication occurring across a number of projects. In a worst-case scenario, 75 percent of the projects are all trying to do the same thing, on the same capability, with the same processes.

So there's a reduction of complexity, and a reduction of the duplicated effort occurring across these organizations, as they try to bring the work together into more synergistic groupings.

We're also seeing a lot of value occurring in the customer-experience space. That means taking a strong look at the customer-experience view, which is less about all of the underlying building blocks and capabilities of an organization and more about what sort of experiences we want to give our customers, what type of product offerings we must assemble, and what underlying building blocks of the organization must be assembled to enable those offerings and those value propositions.

That sort of traceability through the cycle gives you a view of what levers you must pull to optimize your customer experience. Organizations are seeing a lot of value there and that’s basically increasing their effectiveness in the market and having a direct impact on their market share.
What type of product offerings must we assemble, and what underlying building blocks of the organization must be assembled to enable those offerings and those value propositions?

And that's something we see time and time again, regardless of the driver behind the investment in the architecture project: seeing the team interact and build a coalition for action and for change. That's the most impressive thing that we get to see.

Gardner: Let’s drill down a little bit into some of what you'll be discussing at the conference in Sydney in April. One of the things that’s puzzling to me, when I go to these Open Group Conferences, is to better understand the relationship between business architecture and IT architecture and where they converge and where they differ. Perhaps you could offer some insights and maybe tease out what some discussion points for that would be at the conference.

Martin: That’s actually quite a hot topic. In general, the architecture discipline has grown from the IT space, and that’s a good progression for it to take, because we're seeing the fruits of that discipline in how they industrialize IT components.

We're seeing the fruits of that in complex enterprise resource planning (ERP) systems, the modularization of those ERP systems, their ability to be customized, and adapt to businesses. It’s a fairly mature space, and the natural progression of that is to apply those same thinking patterns back up into the business space.

In order for this to work effectively, when somebody asks a question like that, we normally respond with an "it depends" statement. We have in this organization a thing called the mandate curve, and it relates to what the mandate is within the business. What is the organization looking to solve?

Are they looking to build an HR management system? Are they looking to gain efficiencies from an enterprise-wide ERP solution? Are they looking to reduce the value-chain losses that they're having on a monthly basis? Are they looking to improve customer experience across a group of companies? Or are they looking to improve shareholder value across the organization for an M&A, or maybe to reduce cost-to-income?

Problem spaces

Those are some of the problem spaces, and we often get into that mind space to ask, "Those are the problems that you are solving, but what mandate is given to architecture to solve them?" We often find that the mandate for the IT architecture space is sitting beneath the CIO, and the CIO tends to use business architecture as a communication tool with business. In other words, to understand business better, to begin to apply architecture rigor to the business process.

Evans: It’s interesting, Dana. I spent a lot of time last year in the UK, working with the team across a number of business-architecture requirements. We were building business-architecture teams. We were also delivering some projects, where the initial investigation was a business-architecture piece, and we also ran some executive roundtables in the UK.

One thing that struck me in that investigation was the separation that existed in the business-architecture community from the traditional enterprise and technology architecture or IT architecture communities in those organizations that we were dealing with.

One insurance company, in particular, that was building a business-architecture team was looking for people that didn’t necessarily have an architecture background, but possibly could apply that insight. They were looking for deep business domain knowledge inside the various aspects of the insurance organization that they were looking to cover.

So to your question about the relationship between business architecture and IT architecture, where they converge and how they differ, it’s our view that business architecture is a subset of the broader EA picture and that these are actually integrated and unified disciplines.
We're going to see more convergence between these two groups, and that’s certainly something that we are looking to foster in EA.

However, in practice you'll find that there is often quite a separation between these two groups. I think that the major reason for that is that the drivers that are actually creating the investment for business architecture are now coming from outside of IT, and to some extent, IT is replicating that investment to build the capability to engage with the business, so that they can have a more strategic discussion, rather than just take orders from the business.

I think that over this year, we're going to see more convergence between these two groups, and that’s certainly something that we are looking to foster in EA.


Gardner: I just came back from The Open Group Conference in California a few weeks ago, where the topic was focused largely on big data, but analysis was certainly a big part of that. Now, business analysis and business analysts, I suppose, are also part of this ecosystem. Are they subsets of the business architect? How do you see the role of business analysts now fitting into this, given the importance of data and the ability for organizations to manage data with new efficiency and scale?

Martin: Once again, that's also a hot topic. There is a convergence occurring, and we see that across the landscape, when it comes to the number of frameworks and standards that people certify on. Ultimately, it comes to this knife-edge point, in which we need to interact with the business stakeholder and we need to elicit requirements from that stakeholder and be able to model them successfully.

The business-analysis community is slightly more mature in this particular space. They have, for example, the Business Analysis Body of Knowledge (BABOK). Within that space, they leverage a competency model, which in effect goes through a cycle, from an entry-level BA right up to what they refer to as the generalist BA, which is where they see the start of the business-architecture role.

Career path

There's a career path from the traditional business analyst role, which is around requirements elicitation and requirements management and tends to be quite project focused: dropping into project environments, understanding stakeholder needs and requirements, modeling and documenting those, and helping the IT teams model the data flows and data structures, with a specific link into the business space.

As you move up that curve, you get into the business-architecture space, which is a broader structural view of how all the building blocks fit together. In other words, it's a far broader view than the traditional business analyst role would take, and it looks at a number of different domains. The business architect tends to focus a lot on, as you mentioned, the information space, and we see a difference between the information space and the data space.

So the business architect is looking at performance, market-related aspects, customer and information concerns, as well as the business processes and functional aspects of an organization.

You can see that the business analysts could almost be seen as the soldiers of these types of functions. In other words, they're the guys that are in the trenches seeing what's working on a day-to-day basis. They've got a number of tools that they're equipped with, which for example the BABOK has given them.

And there are all different ways and techniques that they are using to elicit those requirements from various business stakeholders, until they move out that curve up into the business architecture and strategic architecture space.
There is certainly a pattern emerging, and there are great opportunities for business analysts to come across into the architecture sphere.

Gardner: Craig, in your presentation at The Open Group Conference in Sydney, what do you hope to accomplish?

Martin: How do I build cohesion in an organization? How do I look at different types of scenarios that I can execute against? What are the better ways to assemble all the efforts in my organization to achieve those outcomes? That’s taking us through a variety of examples that will be quite visual. 

We'll also be addressing the specific role of where we see the career path and the complementary nature of the business analyst and business architect, as they travel through the cycle of trying to operate at a strategic level and as a strategic enabler within the organization.

Gardner: How do you often find that enterprises get beyond the inertia and into this discussion about architecture and about the strategic benefits of it?

Martin: We often have two market segments. A Tier 1 company typically wants to build the capability itself, so there's a journey we need to take them on: how to build a business-architecture capability while delivering the actual outcomes.

Tier 2 and Tier 3 clients often don’t necessarily want to build that type of capability, so we would focus directly on the outcomes. And those outcomes start with two views. Traditionally, we're seeing the view driven almost on a bottom-up view, as the sponsors of these types of exercises try to get credibility within the organization.
It's not just a bunch of building blocks, but the actual outcome of each of those building blocks and how it maps to something like a business-motivation model.

That relates to helping the clients build what we refer to as the utility of the business-architecture space. Our teams go in and, in effect, build a bunch of what we refer to as anchor models to try and get a consistent representation of the business and a consistent language occurring across the entire enterprise, not just within a specific project.

And that gives them a common language to talk about, for example, common capabilities and common outcomes they're looking to achieve. In other words, it's not just a bunch of building blocks, but the actual outcome of each of those building blocks and how it maps to something like a business-motivation model.

They also look within each of those building blocks to see what resources create each of them -- things like people, process, and tools. How do we mix those resources in the right way to achieve the outcomes the business is looking for? Normally, the first pass we go through is to try to get that consistent language established within the organization.

As an organization matures, that artifact starts to lose its value, and we then find that, because it has created a consistent language in the organization, you can now overlay a variety of different types of views to give business people insights. Ultimately, they don’t necessarily want all these models, but they actually want insight into their organizations to enable them to make decisions.

We can overlay objectives, current project spend, CAPEX, and OPEX. We can overlay where duplication is occurring, where overspend is occurring, and where conflict is occurring at a global scale around duplicated effort. All of those types of questions -- about the impact on costs, reductions, and efficiencies -- can be answered by merely overlaying a variety of views on this common language.
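One way to picture the overlay idea Martin describes -- a shared capability model (the "anchor model") with independent views such as project spend layered on top -- is as a simple join over a common capability vocabulary. All names and figures below are invented purely for illustration:

```python
# Shared anchor model: the common language of business capabilities.
capabilities = ["Customer Onboarding", "Claims Processing", "Billing"]

# An independent view keyed to that same vocabulary: (project, capability, spend $M).
project_spend = [
    ("P-101", "Customer Onboarding", 1.2),
    ("P-102", "Customer Onboarding", 0.9),
    ("P-103", "Billing", 2.1),
]

def overlay(capabilities, projects):
    """Overlay project spend on the capability model to surface duplication."""
    view = {cap: {"projects": [], "spend": 0.0} for cap in capabilities}
    for project_id, cap, spend in projects:
        view[cap]["projects"].append(project_id)
        view[cap]["spend"] += spend
    return view

view = overlay(capabilities, project_spend)

# Multiple projects landing on the same capability is a duplication signal.
duplicated = [cap for cap, v in view.items() if len(v["projects"]) > 1]
```

Because every view keys off the same capability names, adding an overlay for objectives, CAPEX/OPEX, or risk profiles is just another join against the anchor model, which is what makes the consistent language so valuable.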

Elevating the value

That starts to elevate the value of these types of artifacts, and we start to see our business sponsors walking into meetings with all of these overlays on them, and having conversations between them and their colleagues, specifically around the insights that are drawn from these artifacts. We want the architecture to tell the story, not necessarily lengthy PowerPoint presentations, but as people are looking at these types of artifacts, they are actually seeing all the insights that come specifically from it.

The third and final part is often around the business getting to a level of maturity, in that they're starting to use these types of artifacts and then are looking for different ways that they can now mix and assemble. That’s normally a sign of a mature organization and the business-architecture practice.

They have the building blocks. They've seen the value of the types of insights those can provide. Are there different ways that I can string together my capabilities to achieve different outcomes? Maybe I have different critical success factors that I'm looking to achieve. Maybe there are new shifts or new pressures coming in from the environment.

How can I assemble the underlying structures of my organization to better cope with it? That’s the third phase that we take customers through, once they get to that level of maturity.

Evans: I agree with Craig on that point. If you show the business what can actually be delivered -- such as views on a page that elicit the right types of discussions and demonstrate the issues -- then, when they see what they're going to get, typically their eyes light up and they say, "I want one of those things."
It's not just enough to know the answer. You have to know how to engage somebody with the material.

The thing with architecture that I have noticed over the years is that architecture is done by a lot of very intelligent people, who have great insights and great understanding, but it's not just enough to know the answer. You have to know how to engage somebody with the material. So when the architecture content that’s coming through is engaging, clear, understandable, and can be consumed by a variety of stakeholders, they go, "That’s what I want. I want one of those."

So my advice to somebody who is going down this path is that if they want to get support and sponsorship for this sort of thing, make sure they get some good examples of what gets delivered when it's done well, as that’s a great way to actually get people behind it.
Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: The Open Group.



Monday, March 25, 2013

Indiana health care provider goes fully virtualized, gains head start on BYOD and DR benefits

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: VMware.

This BriefingsDirect IT leadership interview focuses on how Associated Surgeons and Physicians, LLC in Indiana went from zero to 100 percent virtualized infrastructure, and how many compliance and efficiency goals have been met and exceeded as a result.

In part one of a two-part sponsored interview series, we discuss how a mid-market health services provider rapidly adopted server and client virtualization, and how that quickly led to the ability to move to mobile, bring your own device (BYOD), and ultimately advanced disaster recovery (DR) benefits.

Associated Surgeons and Physicians found the right prescription for allowing users to designate and benefit from their own device choices, while also gaining an ability to better manage sensitive data and to create a data-protection lifecycle approach.

Here to share his story on how they did it, we welcome, Ray Todich, Systems Administrator at Associated Surgeons and Physicians. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions. [Disclosure: VMware is a sponsor of BriefingsDirect podcasts.]

Here are some excerpts:
Gardner: When I go to the physician’s office, I see how they've gotten so efficient at moving patients in and out, the scheduling is amazing. Every minute is accounted for. Downtime is just very detrimental and backs up everything. This critical notion of time management is so paramount.
The ability that virtualization gives us is the core or heart of the entire infrastructure of the business.

Todich: Oh, it’s absolutely massive. If we have a snag somewhere, or even if our systems are running slow, then everything else runs slow. The ability that virtualization gives us is the core or heart of the entire infrastructure of the business. Without an efficient heart, blood doesn’t move, and we have a bigger problem on our hands.

Gardner: So over the past 10 or 15 years, as you pointed out, technology has just become so much more important to how a health provider operates, how they communicate to the rest of the world in terms of supplies, as well as insurance companies and payers, and so forth. Tell me a little bit about Associated Surgeons and Physicians. How big is the organization, what do you do, how have they been growing?

Todich: Pretty rapidly. Associated Surgeons and Physicians is a group of multi-specialty physicians and practices in Northeast Indiana and Northwest Ohio.

It began at the practice level, and then it really expanded. We're up to, I think, 14 additional locations and/or practices that have joined. We're also using an electronic medical record (EMR) application from Greenway, and that's a big one.

We're growing exponentially. It went from one or two satellite practices that needed to piggyback on Greenway to probably 13 or 14 of them, and this is only the beginning. With that type of growth rate, you have to concern yourself with how much it costs to serve everybody. If you have one physical server that goes out, you affect hundreds of users and thousands of patients, doctors, and so on. It's a big problem, and that's where virtualization came in strong.

Gardner: How about this in terms of the size of the organization? How many seats are you accommodating in terms of client, and then what is it about an IT approach to an organization such as yours that also makes virtualization a good fit?

Todich: Right now, we have somewhere around 300 employees. As far as how many clients this overall organization has, it’s thousands. We have lots of people who utilize the organization. The reality is that the IT staff here is used in a minimalist approach, which is one thing that I saw as well when I was coming into this.

One or even two people to manage that many servers can be a nightmare, and on top of that, you try to do your best to help all the users. When you have 300-plus people and their desktops, printers, and so forth, the overall infrastructure can be pretty intimidating when you don't have a lot of people managing it.

Going virtual was a lifesaver. Everything is virtualized. You have a handful of physical ESX hosts managing everything, and everything is stored on centralized storage. It makes it considerably more efficient, as an IT administrator, to utilize virtualization.

The right answer

That's actually how we came to adopt VMware View, because we have 300-plus users and 300-plus desktops. At that point, it can be very hairy. At times, you have to try to divine what the right answer is. You have one important scenario going on, then another, and another, and you have to manage them all. It becomes easier when you virtualize everything, because you can get to everything very easily and cover everyone's desktops.

Gardner: What attracted you, at the beginning, to go to much higher total levels of server -- and then client -- virtualization?

Todich: When I first started here, the company was entirely physical. As background, I came from a couple of companies that utilized virtualization at very high levels. So I'm very aware of the benefits, as far as administration and overall redundancy -- the software and hardware used to allow high performance, high availability, and access to people's data, while still allowing security to be put in place.

Todich
When I came in, it looked like something you might have seen maybe 15 years ago. There were a lot of older technologies in place. The company had a lot of external drives hanging off the servers for backups, and so on.

My first thing to implement was server virtualization, which at the time was the vSphere 4.1 package. I explained to them what it meant to have centralized storage, what it meant to have ESX hosts, and how creating virtual machines (VMs) would benefit them considerably over having physical servers in the infrastructure.

I gave them an idea of how important it is to have redundancy configured correctly. When hardware drops out, a RAID configuration goes south, or an entire server goes out, you've just lost an entire application -- or applications -- which in turn means downtime.

I helped them to see the benefits of going virtualized, and at that time, it was solely for the servers.

Gardner: How long did it take you to go from being 100 percent physical to where you are now, basically 100 percent virtual?

Todich: We've been going at it for about a year-and-a-half. We had to build the infrastructure itself, and then we had to migrate all our applications from physical to virtual (P2V). VMware does a wonderful job with its options for P2V. It's a time saver as well. For anybody who has to build the house themselves, it can really be a help.

VMware, in itself, has the ability to reach out as far and wide as you want it to. It’s really up to the people who are building it. It was very rapid, and it’s so much quicker to build servers or desktops, once you get your infrastructure in place.

In the previous process of buying a server, you have to get it quoted out and make sure everything is good, do all the front-end sales stuff, and then wait for the hardware to get here. Once it's here, you have to make sure it's all there, and then put it all together and configure everything, and so forth. Any administrator out there who's done this understands exactly what that's all about.

Then you have to configure and get it going, versus, "Oh, you need another server, here, right click, deploy from template," and within 10 minutes you have a new server. That, all by itself, is priceless.
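That "right click, deploy from template" step can be reduced to a tiny piece of logic: clone a golden template, then apply per-VM tweaks. The sketch below is a toy model in plain Python -- the field names, template contents, and function are illustrative stand-ins, not the real vSphere API, which performs this as a clone operation in vCenter.

```python
import copy

# Hypothetical golden template; the values here are made up for illustration.
TEMPLATE = {
    "guest_os": "windows-server",
    "cpus": 2,
    "memory_gb": 8,
    "apps": ["emr-agent", "av-offload"],
}

def deploy_from_template(name, template=TEMPLATE, **overrides):
    """Clone the template and apply per-VM overrides (e.g. more RAM)."""
    vm = copy.deepcopy(template)  # deep copy so the template stays pristine
    vm["name"] = name
    vm.update(overrides)
    return vm

# "You need another database server" becomes one call instead of a purchase order.
db_vm = deploy_from_template("practice-db-01", memory_gb=16)
```

The point of the model is the contrast Todich draws: the template captures all the one-time configuration work, so each new server is a copy plus a few overrides rather than a weeks-long procurement cycle.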

Technology more important

Gardner: And you have a double whammy here, because you're a mid-market size company and don’t have a large, diversified IT staff to draw on. At the same time, you have branch offices and satellites, so you're distributed. To have people physically go to these places is just not practical. What is it about the distributed nature of your company that also makes virtualization and View 5.1 a good approach for a lean IT organization?

Todich: It helped us quite a bit, first and foremost, with the ability to give somebody a desktop, even if they were not physically connected to our network. That takes place a lot here. We have a lot of physicians who may be working inside of another hospital at the time.

Instead of them creating a VPN connection back into our organization, VMware View gave them the ability to have a client on their desktop, whether it be a PC, a MacBook, an iPod, an iPad, or whatever they have, even a phone, if they really want to go that route. They can connect anywhere, at anytime, as long as they have an Internet connection and they have the View client. So that was huge, absolutely huge.

They also have the ability to use PC-over-IP (PCoIP), versus RDP. That's very big for us as well. It keeps the efficiency and the speed of the machines up. If you're in somebody else's hospital, you're bound to whatever network you're attached to there, so it really helps, and it doesn't burden their systems as much. All you're doing is borrowing their Internet and not anything else.

Gardner: Tell me a bit more about your footprint. We've spoken about vSphere 4.1 and adopting along the path to 5.1. You even mentioned View. What else are you running there to support this impressive set of capabilities?

Todich: We moved from vSphere 4.1 to 5.1, and we're going to VMware View; we use 5.1 there as well. We decided to utilize the vCloud Networking and Security package, which at the time was called vShield. When we bought it, everything changed nomenclature-wise, and some of the products were dispersed, which actually was more to our benefit. We're very excited about that.

As far as our VDI deployment, that gave us the ability to use vShield Endpoint, which takes your antivirus and offloads it somewhere else on the network, so that your hosts are not burdened with virus scans and updates. That's huge.

The word huge doesn’t even represent how everybody feels about that going away. It's not going away physically, just going away to another workhorse on the network so that the physicians, medical assistants (MAs), and everybody else isn’t burdened with, "Oh, look, it's updating," or "Look, it's scanning something." It's very efficient.

Network and security

Gardner: You mentioned the networking part of this, which is crucial when you're going across boundaries and looking for those efficiencies. Tell me a bit more about how vCloud Networking and Security has made an impact.

Todich: That was another big one for us. Along with the networking and security package comes a component called vShield Edge, which will ultimately give us the ability to create our own DMZ the way that we want to create it, something that we don't have at this time. This is very important to us.

Utilizing the vShield Edge package was fantastic, and yet another layer of security as well. Not only do we have our physical hardware, our guardians at the gate, but we also have another layer, and the way that it works, wrapping itself around each individual ESX host, is absolutely beautiful. You manage it just like you manage firewalls. So it’s very, very important.

Plus, some of the tools that we were going to utilize, such as the security servers for the VDI package, we felt most comfortable having sit in a DMZ. So, all around, it really gave us quite a bit to work with, which we're very thankful for.
I'm a firm believer that centralized storage, and even more the virtualized centralized storage, is the answer to many, many, many issues.

Gardner: One of the things, of course, that's key in your field is compliance, and there's a lot going on with things like HIPAA, documents, and making sure the electronic capabilities are there for payers and providers. Tell me a bit about compliance and what you've been able to achieve with these advancements in IT?

Todich: With compliance, we've really been able to up our security, which channels straight into HIPAA. Obviously, HIPAA is very concerned with people’s data and keeping it private. So it’s a lot easier to manage all our security in one location.

With VDI, it's been able to do the same. If we need to make any adjustments security wise, it’s simply changing a golden image for our virtual desktop and then resetting everybody's desktops. It’s absolutely beautiful, and the physicians are very excited about it. They seem to really get ahold of what we have done with the ability that we have now, versus the ability we had two years ago. It does wonders.

Upgrading to a virtual infrastructure has helped us considerably in meeting and exceeding meaningful-use expectations, with the redundancy that being virtual gives, along with the fact that servers seem to run a lot more efficiently as VMs. We have better ways to collect data, a lot more uptime, and a lot more efficiency, so we can collect more data from our customers.

Exceeding expectations

The more people come through, the more data is collected, the more uptime there is, and the fewer problems there are, which in turn has helped us considerably in meeting and exceeding meaningful-use expectations, which was a big deal.

Gardner: I've heard that term "meaningful use" elsewhere. What does that really mean? Is that just the designation that some regulatory organization has, or is that more of a stock-in-trade description?

Todich: My understanding of it, as an IT administrator, is that it's basically the proper collection of people's data and keeping it safe. I know that it has a lot to do with our EMR application and what is collected when our customers interact with us.

Gardner: Are there any milestones or achievements you've been able to make in terms of this adoption, such as behaviors and then the protection of the documents and privacy data that has perhaps moved you into a different category and allows you to move forward on some of these regulatory designations?

Todich: It's given us the ability to centralize all our data. You have one location, when it comes to backing up and restoring, versus a bunch of individual physical servers. So data retention and protection has really increased quite a bit as far as that goes.

Gardner: How about DR?

Disaster recovery

Todich: With DR, I think there are a lot of businesses out there that hear that and don’t necessarily take it that seriously, until disaster hits. It’s probably the same thing with people and tornadoes. When they're not really around, you don’t really care. When all of a sudden, a tornado is on top of your house, I bet you care then.

VMware gives you the ability to do DR on a variety of different levels, whether it’s snapshotting, or using Site Recovery Manager, if you have a second data center location. It’s just endless.

One of the most important topics that can be covered in an IT solution is our data. What happens if it stops, or what happens if we lose it? What can we do to get it back, and how fast? Because once data stops flowing, money stops flowing as well, and nobody wants that.

It’s important, especially if you're recording people’s private health information. If you lose certain data that’s very important, it’s very damaging across the board. So to be able to retain our data safely is of the highest concern, and VMware allows us to do that.

Also, it's nice to have the ability to do snapshotting as well. Speaking of servers, I have to dwell on that one, because in IT, everybody knows that software upgrades come. Sometimes, software upgrades don't go the way that they're supposed to, whether it's an EMR application, a time-saving application, or ultrasounds.

If you take a snapshot before the upgrade and run your upgrade on that snapshot, and everything goes great and everybody is satisfied, you can just merge the snapshot with the primary image and you're good to go.

If it doesn’t work out in your favor, you have the ability to delete that snapshot and you're back to where you started from before the migration, which was hopefully a functioning state.
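The take/commit/revert semantics Todich describes can be modeled in a few lines. This is a toy sketch in plain Python, not the real vCenter snapshot API -- the class and the `emr_version` field are made up purely to make the workflow concrete.

```python
class VirtualMachine:
    """Toy stand-in for a VM's disk state plus one snapshot slot."""

    def __init__(self, disk_state):
        self.disk = dict(disk_state)
        self._snapshot = None

    def take_snapshot(self):
        # Freeze the pre-upgrade state before touching anything.
        self._snapshot = dict(self.disk)

    def commit_snapshot(self):
        # Upgrade went well: keep the changes, discard the rollback point.
        self._snapshot = None

    def revert_to_snapshot(self):
        # Upgrade went badly: return to the known-good state.
        self.disk = dict(self._snapshot)
        self._snapshot = None

vm = VirtualMachine({"emr_version": "4.0"})
vm.take_snapshot()
vm.disk["emr_version"] = "5.0"   # the upgrade runs here
vm.revert_to_snapshot()          # suppose it did not go as planned
```

Either branch ends with no snapshot outstanding, which mirrors the advice in the transcript: merge on success, delete and roll back on failure, but don't leave snapshots lingering.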

Gardner: Let's look to the future a bit. It sounds as if, with these capabilities and the way that you've been describing DR benefits, you can start to pick and choose data-center locations, maybe even thinking about software-defined networking and data centers. That then allows you to pick and choose a cloud provider or a hosting model. So are you thinking about being able to pick up and virtually move your entire infrastructure, based on what makes sense to your company over the next, say, 5 or 10 years?

Todich: That's exactly right, and the way this is growing, something that's been surfacing a lot in our neck of the woods is the ability to do hosting and provide cloud-based solutions, and VMware is our primary platform for that as well.

But, if need be, if we had to migrate our data center from one state to another, we'll have the option to do that, which is very important, and it helps with uptime as well. Stuff happens. I mean, you can be at a data center physically and something happens to a generator that has all the power. All of a sudden, everybody is feeling the pain.

So the ability to have Site Recovery is priceless, because everything just goes to location B and everybody is still up. You may see a blip or you may not, and nothing is lost. That leaves everybody free to deal with the data-center issue while everything is still up and going, which is very nice.
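The reason users "may see a blip or may not" comes down to replication: site B already holds a current copy of the data, so failing over only changes which site serves requests. Site Recovery Manager does far more than this, but the essential idea can be sketched as a toy model in plain Python (the class, site names, and sample record are all invented for illustration):

```python
class ReplicatedService:
    """Two sites; every write lands on both, reads come from the active one."""

    def __init__(self):
        self.sites = {"A": {}, "B": {}}
        self.active = "A"

    def write(self, key, value):
        for store in self.sites.values():   # replicate to both locations
            store[key] = value

    def read(self, key):
        return self.sites[self.active][key]

    def fail_over(self):
        self.active = "B"                   # site A just lost its generator

svc = ReplicatedService()
svc.write("patient-1234", "chart data")
svc.fail_over()                             # reads now served from site B
```

Because the write already landed on site B before the failure, the read path keeps working after failover with nothing lost.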

Creating redundancy

Gardner: I imagine too, Ray, that it works both ways. On one hand, you have a burgeoning ecosystem of cloud and hosting providers and options that you can pursue, do your cost-benefit analysis on, think about the right path, and create redundancy.

At the same time, you probably have physicians or individual, smaller physician practices, that might look to you and say, "Those guys are doing their IT really well. Why don’t we just subscribe to their services or piggyback on their infrastructure?" Do you have any thoughts about becoming, in a sense, an IT services provider within the healthcare field? It expands your role and even increases your efficiency and revenues.

Todich: Yes, our sights are there. As a matter of fact, our heads are being turned in that direction without even trying, because a lot of people are doing that. For smaller practices, instead of buying all the infrastructure, putting it all in place to get everything up, and then maintaining it, it's a lot easier for us to say: we will house it for you. We'll do that.
Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: VMware.

You may also be interested in:

Friday, March 22, 2013

ownCloud debuts cloud tool to give organizations more control over file sync and software

OwnCloud, Inc. recently released the latest version of the ownCloud Community Edition with a number of usability, performance, and integration enhancements.

Based on an open-source project of the same name, the ownCloud file sync and share software, deployed on-premise, not only offers users greater control, but allows organizations to integrate existing security, storage, monitoring and reporting tools, while still taking advantage of the software’s simplicity and flexibility.

File sync and share services like Dropbox, Google Docs, and Box have revolutionized the way users share information. These cloud-based services make it easy to share files with clean interfaces and seemingly endless amounts of storage. However, not everyone wants to turn over their information to a service provider. For those who prefer to control how and where their data is stored, there's ownCloud.

OwnCloud comes in a free community edition, and the company will launch a commercially supported enterprise edition of the software in the second quarter. That version will target enterprise IT departments in need of on-premise file sync and share for sensitive corporate data. The company estimates it has more than 750,000 users worldwide today.

In the latest offering, the user interface has been streamlined, so that the main web navigation panel is now clearly differentiated from in-app navigation, says Markus Rex, CEO of ownCloud. And the way in which the software's settings are laid out has been revamped, making it easier to distinguish personal settings from app-specific settings, he says.

“We’ve completely revamped the design with a much simplified interface so you can differentiate the navigation elements and focus on what you want to work with, instead of distracting from that,” says Rex.

New features

This version of ownCloud also features a Deleted Files app that lets users restore accidentally deleted files and folders, and improved app management, so that third-party apps can be easily installed from the central apps repository and automatically removed from the server if disabled. Also included is a new search engine that lets users find stored files both by name and by content, thanks to the Lucene-based full-text search engine app, and a new antivirus feature, courtesy of ClamAV, that scans uploaded files for malware. This release also includes improved contacts, calendar, and bookmarks, says Rex.

Performance benefits in this release come from improved file cache and faster syncing of the desktop client, according to company officials. Externally mounted file systems such as Google Drive, Dropbox, FTP and others can be scanned on-demand and in the background to increase performance. And hybrid clouds can be created by mixing and matching storage, thanks to file system abstraction that offers more flexibility and greater performance.

“You can get to the data in all of your data silos from one spot on a mobile client or desktop client, so you can get to files you might not be able to access otherwise from those devices,” says Rex.

This release features improved integration with LDAP and Active Directory and an enhanced external storage app to boost performance of integrated secondary storage including Dropbox, Swift, FTP, Google Docs, Amazon S3, WebDAV and external ownCloud servers.
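Because ownCloud exposes files over standard WebDAV (its WebDAV root lives at `/remote.php/webdav/`), scripts can reach stored files without any vendor SDK. As a minimal sketch using only the Python standard library -- the server URL, username, password, and file path below are invented for illustration -- a file upload is just an authenticated HTTP PUT:

```python
import base64
import urllib.request

def webdav_put_request(base_url, user, password, remote_path, data):
    """Build a WebDAV PUT request for an ownCloud server (basic auth)."""
    url = f"{base_url.rstrip('/')}/remote.php/webdav/{remote_path.lstrip('/')}"
    req = urllib.request.Request(url, data=data, method="PUT")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req  # pass to urllib.request.urlopen() against a live server

# Hypothetical server and credentials, purely to show the URL shape.
req = webdav_put_request("https://cloud.example.com", "alice", "secret",
                         "Documents/notes.txt", b"hello")
```

The same endpoint works with any WebDAV client, which is why files placed in ownCloud remain reachable from desktops and mobile devices that speak the protocol natively.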

(BriefingsDirect contributor Cara Garretson provided editorial assistance and research on this post. She can be reached on LinkedIn.)
