Monday, October 5, 2015

How fast analytics changes the game and expands the market for big data value

The next BriefingsDirect big-data thought leadership discussion highlights how fast analytics -- getting to big-data analysis value in far less time than before -- expands the market for advanced data infrastructure to gain business insights.

We'll learn how bringing analytics to a cloud services model also allows smaller and less data-architecture-experienced firms to benefit from the latest in big-data capabilities. And we'll explore how Dasher Technologies is helping to usher in this democratization of big data value to more players in less time.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To share how a fast ramp-up for big data as a service has evolved, we're joined by Justin Harrigan, Data Architecture Strategist at Dasher Technologies, as well as Chris Saso, Senior Vice President of Technology at Dasher Technologies in Campbell, California. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Justin, how have big-data practices changed over the past five years to set the stage for rapid leveraging of big-data capabilities?

Harrigan: Back in 2010, we saw big data become mainstream. Hadoop became a household name in the IT industry, doing scale-out architectures. Linux databases were becoming common practice. Moving away from traditional legacy, smaller, slower databases allowed this whole new world of analytics to open up to previously untapped resources within companies. So data that people had just been sitting on could now be used for actionable insights.

Fast forward to 2015, and we've seen big data become more approachable. Five years ago, only the largest organizations or companies that were specifically designed to leverage big-data architectures could do so. The smaller guys had maybe a couple of hundred or even tens of terabytes, and it required too much expertise or too much time and investment to get a big-data infrastructure up and running.

Today, we have approachable analytics, analytics as a service, hardened architectures that are almost turnkey with back-end hardware, database support, and applications -- all integrating seamlessly. As a result, the user on the front end, who is actually interacting with the data and making insights, is able to do so with very little overhead, very little upkeep, and is able to turn that data into business-impact data, where they can make decisions for the company.

Gardner: Justin, how big of an impact has this had? How many more types of companies or verticals have been enabled to start exploring advanced, cutting-edge, big-data capabilities? Is this a 20 percent increase? Perhaps almost any organization that wants to can start doing this.

Tipping point

Harrigan: The tipping point is when you outgrow your current solutions for data analytics. Data analytics is nothing new. We've been doing it for more than 50 years with databases. It’s just a matter of how big you can get, how much data you can put in one spot, and then run some sort of query against it and get a timely report that doesn’t take a week to come back or that doesn't time out on a traditional database.

Almost every company nowadays is growing so rapidly with the type of data they have. It doesn't matter if you're an architecture firm, a marketing company, or a large enterprise getting information from all your smaller remote sites; everyone is compiling data to create better business decisions or to create a system that makes their products run faster.

For people dipping their toes in the water for their first larger dataset analytics, there's a whole host of avenues available to them. They can go to some online providers, scale up a database in a couple of minutes, and be running.

They can download free trials. HP Vertica has a community edition, for example, and they can load it on a single server, up to terabytes, and start running there. And it's significantly faster than a traditional SQL database.
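
For a sense of what that first experiment looks like in practice, here is a minimal sketch, assuming the open-source vertica-python client and a single-node Community Edition install; the connection details, table, and query are hypothetical rather than anything from the discussion.

# Hypothetical first test against a single-node Vertica Community Edition install.
# Assumes the open-source client: pip install vertica-python
import vertica_python

conn_info = {
    'host': '127.0.0.1',      # the single server the Community Edition was loaded on
    'port': 5433,             # Vertica's default port
    'user': 'dbadmin',
    'password': '',
    'database': 'analytics',  # placeholder database name
}

conn = vertica_python.connect(**conn_info)
cur = conn.cursor()

# Create a columnar table for a slice of the data you have been sitting on.
cur.execute("""
    CREATE TABLE web_events (
        event_time  TIMESTAMP,
        customer_id INTEGER,
        page        VARCHAR(256),
        revenue     NUMERIC(10, 2)
    )
""")

# A first analytic query: revenue by day, the kind of report that used to
# time out on a legacy row-store database.
cur.execute("""
    SELECT DATE_TRUNC('day', event_time) AS day,
           COUNT(*) AS events,
           SUM(revenue) AS revenue
    FROM web_events
    GROUP BY 1
    ORDER BY 1
""")
for row in cur.fetchall():
    print(row)

conn.close()

The same schema and queries should run unchanged against a larger multi-node cluster later, which is part of what makes a community edition a low-risk way to dip a toe in the water.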

It’s much more approachable. There are many different flavors and formats to start with, and people are realizing that. I wouldn’t even use the term big data anymore; big data is almost the norm.

Gardner: I suppose maybe the better term is any data, anytime.

Harrigan: Any data, anytime, anywhere, for anybody.

Gardner: I suppose another change over the past several years has been an emphasis away from batch processing, where you might do things on an infrequent or occasional basis, to this concept that's more applicable to a cloud or an as-a-service model, where it's streaming, continuous, and then you start reducing the latency down to getting close to real time.

Are we starting to see more and more companies being able to compress their feedback, and start to use data more rapidly as a result of this shift over the past five years or so?

Harrigan: It's important to address the term big data. It's almost like an umbrella, almost like the way people use cloud. With big data, you think large datasets, but you mentioned speed and agility. Real-time analytics is becoming more prevalent: not just running a batch process for 18 hours on petabytes of data, but having a chart, a graph, or some sort of report in real time. Interacting with it and making decisions on the spot is becoming mainstream.

We did a blog post on this not long ago, talking about how instead of big data, we should talk about the data pipe. That’s data ingest or fast data, typically OLTP data, that needs to run in memory or on hardware that's extremely fast to create a data stream that can ingest all the different points, sensors, or machine data that’s coming in.

Smarter analysis

Then there's smarter analytic data, which requires some sort of number-crunching on data that's still relevant: not real-time, but fairly new, call it from seven days old up to a year. And then there's the data lake, which is essentially your data repository for historical data crunching.

Those are three areas you need to address when you talk about big data. The ability to consume that data as a service is now being made available by a whole host of companies in very different niches.

It doesn’t matter if it’s log data or sensor data, there's probably a service you can enable to start having data come in, ingest it, and make real-time decisions without having to stand up your own infrastructure.
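
As a rough illustration of those three tiers, here is a minimal Python sketch; the tier backends, retention windows, and event fields are hypothetical placeholders rather than any particular product.

# Illustrative sketch only: routing incoming events across the three tiers
# described above (fast/ingest tier, analytic tier, data lake).
import time
from dataclasses import dataclass, field

@dataclass
class DataPipeline:
    hot_store: list = field(default_factory=list)       # in-memory "fast data" tier
    analytic_store: list = field(default_factory=list)  # roughly 7 days to 1 year of data
    data_lake: list = field(default_factory=list)        # historical repository

    HOT_WINDOW = 7 * 24 * 3600           # keep about 7 days in the fast tier
    ANALYTIC_WINDOW = 365 * 24 * 3600    # keep up to a year in the analytic tier

    def ingest(self, event: dict) -> None:
        """Every point, sensor, or machine event lands in the fast tier first."""
        event.setdefault('ts', time.time())
        self.hot_store.append(event)

    def age_out(self) -> None:
        """Periodically demote data: fast tier, then analytic tier, then data lake."""
        now = time.time()
        still_hot = []
        for e in self.hot_store:
            (still_hot if now - e['ts'] < self.HOT_WINDOW else self.analytic_store).append(e)
        self.hot_store = still_hot

        still_recent = []
        for e in self.analytic_store:
            (still_recent if now - e['ts'] < self.ANALYTIC_WINDOW else self.data_lake).append(e)
        self.analytic_store = still_recent

pipeline = DataPipeline()
pipeline.ingest({'sensor': 'store-42', 'reading': 3.1})
pipeline.age_out()

In a real deployment each tier would be a different system (an in-memory or OLTP store, an analytic database such as Vertica, and a Hadoop-style repository); the routing and aging logic is the part the sketch is meant to show.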

Gardner: Of course, when organizations try to do more of these advanced things that can be so beneficial to their business, they have to take into consideration the technology, their skills, their culture -- people, process and technology, right?

Chris, tell us a bit about Dasher Technologies and how you're helping organizations do more with big-data capabilities, how you address this holistically, and this whole approach of people, process and technology.

Saso: Dasher was founded in 1999 by Laurie Dasher. To give you an idea of who we are, we're a little over 65 employees now, and the size of our business is somewhere around $100 million.

We started by specializing in solving major data-center infrastructure challenges that folks had by actually applying the people, process and technology mantra. We started in the data center, addressing people’s scale out, server, storage, and networking types of problems. Over the past five or six years, we've been spending our energy, strategy, and time on the big areas around mobility, security, and of course, big data.

As a matter of fact, Justin and I were recently working on a project with a client around combining both mobility information and big data. It's a retail client. They want to be able to send information to a customer who might be walking through a store, maybe send a coupon or things like that. So, as Justin was just talking about, you need fast information, and you need to make actionable things happen with that data quickly. You're combining something around mobility with big data.

Dasher has built up our team to be able to have a set of solutions that can help people solve these kinds of problems.

Gardner: Justin, let’s flesh that out a little bit around mobility. When people are using a mobile device, they're creating data that, through apps, can be shared back to a carrier, as well as application hosts and the application writers. So we have streams of data now about user experience and activities.

We also can deliver data and insights out to people in the other direction in that real-time fashion, a closed loop, regardless of where they are. They don't have to be at their desk, and they don't have to be looking at a specific business-intelligence (BI) application, for example. So how has mobility changed the game in the past five years?

Capturing data

Harrigan: Dana, it’s funny you brought up the two different ways to capture data. Devices can be both used as a sensor point or as a way to interact with data. I remember seeing a podcast you did with HP Vertica and GUESS regarding how they interacted with their database on iPads.

In regard to interacting with data, it has become useful not only to data analysts or data scientists; we can push it down in a format that less technical folks can use as well. With a fancy application in front of them, they can use the data to make decisions that actually benefit the company.

You give that data to someone in a store, at GUESS for example, who can benefit by understanding where in the store to put jeans to impact sales. That’s huge. Rather than giving them a quarterly report and stuff that's outdated for the season, they can do it that same day and see what other sites are doing.

On the flip side, mobile devices are now sensors. A mobile device is constantly pinging access points over wi-fi. We can capture that data and, with the MAC address as a unique identifier, follow someone as they move through a store or throughout a city. Then, when they return, that person's data is captured into a database and becomes historical. They can be tracked through their device.

It opens a whole new world of opportunities in terms of where retailers place merchandise, how they staff stores to make sure they have the proper number of people at a given time, and what impact weather has on the store.

Lastly, as Chris mentioned, how do we interact with people on devices by pushing them data that's relevant as they move throughout their day?

The next generation of big data is not just capturing data and using it in reports, but taking that data in real time and possibly pushing it back out to the person who needs it most. In the retail scenario, that's the end user, possibly giving them a coupon for something relevant that they will actually use as they're standing in front of it on the shelf.
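
To make the mechanics concrete, here is a small illustrative sketch of how raw probe records keyed on a MAC address might be grouped into store visits; the field names, sample data, and the 30-minute visit gap are assumptions for illustration, not details from the discussion.

from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical probe feed: (MAC address, access point, timestamp).
probes = [
    ("aa:bb:cc:dd:ee:01", "entrance-ap", datetime(2015, 10, 5, 10, 0)),
    ("aa:bb:cc:dd:ee:01", "denim-ap",    datetime(2015, 10, 5, 10, 12)),
    ("aa:bb:cc:dd:ee:01", "entrance-ap", datetime(2015, 10, 12, 15, 30)),
]

def build_visits(probe_records, gap=timedelta(minutes=30)):
    """Group probe records by device, splitting into separate visits whenever
    a device is not seen for longer than `gap` (i.e., the shopper left)."""
    by_device = defaultdict(list)
    for mac, ap, ts in sorted(probe_records, key=lambda r: (r[0], r[2])):
        by_device[mac].append((ts, ap))

    visits = defaultdict(list)
    for mac, sightings in by_device.items():
        current = [sightings[0]]
        for prev, cur in zip(sightings, sightings[1:]):
            if cur[0] - prev[0] > gap:
                visits[mac].append(current)   # close out the previous visit
                current = []
            current.append(cur)
        visits[mac].append(current)
    return visits

for mac, trips in build_visits(probes).items():
    print(mac, "made", len(trips), "visits; path of first visit:",
          [ap for _, ap in trips[0]])

From visit histories like these, the merchandising, staffing, and weather correlations described next become straightforward aggregation queries.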

Gardner: So we're not just talking about democratization of analytics in terms of the types of organizations, but now we're even talking about the types of individuals within those organizations.

Do you have any examples of some of Dasher's clients that have been able to exploit these advances with mobile and cloud working in tandem, and how that's produced some sort of a business benefit?

Business impact

Harrigan: A good example of a client who leveraged a large dataset is One Kings Lane. They were having difficulty updating the website their users were interacting with, because it's a flash-sale shopping site where the information changes daily, and you have to be able to update it very quickly. Traditional technologies were causing a business impact and slowing things down.

They were able to leverage a really fast columnar database to make these changes and actually grow the inventory, grow the site, and have updates happen in almost real time, so that there was no impact or downtime when they needed to make these changes. That's a real-world example of when big data had a direct impact on the business.

Gardner: Chris, tell us a little bit about how Dasher works with Hewlett Packard Enterprise technologies, and perhaps even some other HP partners like GoodData, when it comes to providing analytics as a service?

Saso: HP has been a longtime partner from the very beginning, actually when we started the company. We were a partner of Vertica before HP purchased them back in 2011.

We started working with Vertica around big data, and Justin was one of our leads in that area at the time. We've grown that business and worked with other business units within HP to combine solutions: Vertica, big data, and hardware, as Justin was just talking about. You brought up the applications that are analyzing this big data. So we're partners in the ecosystem that help people analyze the data.

Once HP Vertica, or what have you, has done the analysis, you have to report on that and make it in a nice human-readable form or human-consumable form. We’ve built out our ecosystem at Dasher to have not only the analytics piece, but also the reporting piece.

Gardner: And on the as a service side, do you work with GoodData at all or are you familiar with them?

Saso: Justin, maybe you can talk a little bit about that. You've worked with them more I think on their projects.

Optimizing the environment

Harrigan: GoodData is a large consumer of Vertica and they actually leverage it for their back-end analytics platform for the service that they offer. Dasher has been working with GoodData over the past year to optimize the environment that they run on.

Vertica has different deployment scenarios; you can deploy it in a virtual-machine (VM) environment or on bare metal. And we did an analysis to see if there was a return on investment (ROI) in moving from a virtualized environment running on OpenStack to a bare-metal environment. Through a six-month proof of concept (POC), we leveraged HP Labs in Houston. We had a four-node system set up with multiple terabytes of data.

We saw a 4:1 increase in performance in moving from a VM with the same resources to a bare-metal machine. That's going to have a significant impact on the way they move data in their environment in the future and how they adjust to customers with larger datasets.

Gardner: When we think about optimizing the architecture and environment for big data, are there any other surprises or perhaps counter-intuitive things that have come up, maybe even converged infrastructure for smaller organizations that want to get in fast and don’t want to be too concerned with the architecture underlying the analytics applications?

Harrigan: There's a tendency now with so many free solutions out there to pick a free solution, something that gets the job done now, something that grows the business rapidly, but to forget about what businesses will need three years down the road, if it's going to grow, if it’s going to survive.

There are a lot of startups out there that are able to build a big-data infrastructure, scale it to 5,000 nodes, and then they reach a limit. There are network limits on how fast the switch can move data between nodes, constantly pushing the limits of 10GbE, 40GbE, and soon 100GbE networks to keep those infrastructures up.

Depending on what architecture you choose, you may be limited in the number of nodes you can go to. So there are solutions out there that can process a million transactions per second with 100 nodes, and then there are solutions that can process a million transactions per second with 20 nodes, but may cost slightly more.
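
One way to frame that trade-off is to run the numbers over the whole planning horizon rather than just the initial purchase. The sketch below uses purely hypothetical prices; only the node counts echo the example above.

# Back-of-the-envelope sketch (all prices hypothetical): two architectures that
# both reach one million transactions per second, one on 100 commodity nodes and
# one on 20 denser nodes that cost more up front.
def total_cost(nodes, cost_per_node, yearly_opex_per_node, years=3):
    return nodes * cost_per_node + nodes * yearly_opex_per_node * years

commodity = total_cost(nodes=100, cost_per_node=8_000, yearly_opex_per_node=3_000)
dense     = total_cost(nodes=20,  cost_per_node=45_000, yearly_opex_per_node=3_000)

print(f"100-node architecture over 3 years: ${commodity:,}")  # $1,700,000
print(f"20-node architecture over 3 years:  ${dense:,}")      # $1,080,000

With these made-up numbers, the 20-node option costs slightly more up front but less over three years, which is exactly the kind of horizon the advice below argues you should plan for.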

If you think long-term, if you start in the cloud, you want to be able to move out of the cloud. If you start with an open ecosystem, you want to make sure that your hardware refresh is not going to cost so much that the company can't afford it three years down the road. One of the areas we consult on, when picking different architectures, is thinking long-term. Don't think six weeks down the road, how are we going to get our service up and running? Think, okay, we have a significant client install base; how are we going to grow the business from three to five years and from five to 10 years?

Gardner: Given that you have quite a few different types of clients, and the idea of optimizing architecture for the long-term seems to be important, I know with smaller companies there’s that temptation to just run with whatever you get going quickly.

What other lessons can we learn from that long-term view when it comes to skills, security, something more than the speeds and feeds aspects of thinking long term about big data?

Numerous regulations

Harrigan: Think about where your data is going to reside and the requirements and regulations that you may run into. There are a million different regulations to deal with now, such as HIPAA, ITAR, and rules around money transactions in a company. So if you ever foresee that need, make sure you're in an ecosystem that supports it. The temptation for smaller companies is just to go cloud, but who owns that data if you go under, or who owns that data when you get audited?

Another problem is encryption. As you start gaining larger customers once you have a proven technology or a proven service, they're going to want to make sure that you're compliant with all their regulations, not just the ones your own company enforces.

There's logging that they're required to have, and there is going to be encryption and protocols and the ability to do audits on anyone who is accessing the data.

Gardner: On this topic of optimizing, when you do it right, when you think about the long term, how do you know you have that right? Are there some metrics of success? Are there some key performance indicators (KPIs) or ROIs that one should look to so they know that they're not erring on the side of going too commercial or too open source or thinking short term only? Maybe some examples of what one should be looking for and how to measure that.

Harrigan: That’s going to be largely subjective to each business. Obviously if you're just going to use a rule of thumb, it shouldn't cost you more money than it makes you. If you implement a system and it costs you $10 million to run and your ROI is $5 million, you've made a bad decision.

The first factor is the value to the business. If you're a large enterprise and you implement big data, and it gives you the ability to make decisions and quantify those decisions, then you can put a number to that and see how much value that big-data system is creating. For example, a new marketing campaign or something you're doing with your remote sites or your retail branches that's quantifiable and having an impact on the business.

The other way to judge it is the impact on the business. For ad-serving companies, the way they make money is ad impressions, and the more ad impressions they can serve for the least cost in their environment, the higher the return they're going to make. The delta is between the infrastructure costs and the top line that they get to report to all their investors.

If they can do 56 billion ad impressions in a day, and you can double that by switching architectures, that’s probably a good investment. But if you can only improve it by 10 percent by switching architectures, it’s probably too much work for what it’s worth.
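
As a worked example of that delta, the sketch below starts from the 56 billion impressions per day mentioned here; the revenue rate and infrastructure costs are purely hypothetical.

# The 56 billion impressions/day figure comes from the discussion; everything
# else here is a made-up number, just to show how the delta gets compared.
impressions_per_day = 56e9
revenue_per_1k = 0.10          # hypothetical revenue per 1,000 impressions
daily_infra_cost = 100_000     # hypothetical daily infrastructure cost

def daily_delta(impressions, rev_per_1k, infra_cost):
    return impressions / 1_000 * rev_per_1k - infra_cost

baseline = daily_delta(impressions_per_day, revenue_per_1k, daily_infra_cost)
doubled  = daily_delta(impressions_per_day * 2.0, revenue_per_1k, daily_infra_cost * 1.5)
ten_pct  = daily_delta(impressions_per_day * 1.1, revenue_per_1k, daily_infra_cost * 1.2)

print(f"baseline delta per day:         ${baseline:,.0f}")   # $5,500,000
print(f"after doubling throughput:      ${doubled:,.0f}")    # $11,050,000
print(f"after a 10 percent improvement: ${ten_pct:,.0f}")    # $6,040,000

Whether the doubling or the 10 percent gain justifies the switch is then a straightforward comparison against the cost of the migration itself.
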
Gardner: One last area on this optimization idea. We've seen, of course, organizations subjectively make decisions about whether to do this on-premises, maybe either virtualized or on bare metal. They will do their cost-benefit analysis. Others are looking at cloud and as a service model.

Over time, we expect to have a hybrid capability. As you mentioned, if you think ahead, you might start in the cloud and move private, or start private and want to be able to move to the cloud; we're seeing more likelihood of being able to move back and forth.

Thinking about that, do you expect that companies will be able to do that? Where does that make the most sense when it comes to data? Is there a type of analysis that you might want to do in a cloud environment primarily, but other types of things you might do privately? How do we start to think about where on the spectrum of hybrid-cloud options one should be for different types of big-data activity?

Either-or decision

Harrigan: In the large data analytics world, it’s almost an either-or decision at this time. I don’t know what it will look like in the future.

Workloads that lend themselves extremely well to the cloud are inconsistent or seasonal ones, where, say, 90 percent of your business happens in December.

Or your business is just starting out, and you don't know whether you're going to need a full 400-node cluster to run whatever analytics platform you choose, so the hardware would sit idle 50 percent of the time, or you wouldn't get full utilization. Those companies need a cloud architecture, because they can scale up and scale down based on need.

Companies that benefit from on-premises deployment are ones that can see significant savings by not using the cloud and paying someone else to run their environment. Those companies typically pin the CPU usage meter at 100 percent, as much as they can, and then add nodes to add more capacity.
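
A quick utilization-based comparison makes the same point; all prices in the sketch below are hypothetical.

# Back-of-the-envelope sketch with purely hypothetical prices: a mostly idle
# cluster favors paying per hour in the cloud, while a cluster pinned at
# 100 percent utilization usually favors owning the hardware.
CLOUD_RATE_PER_NODE_HOUR = 1.50      # hypothetical on-demand price
ONPREM_COST_PER_NODE_YEAR = 6_000    # hypothetical amortized hardware, power, staff

def yearly_cloud_cost(nodes, utilization):
    hours_used = 8_760 * utilization          # pay only for the hours actually run
    return nodes * hours_used * CLOUD_RATE_PER_NODE_HOUR

def yearly_onprem_cost(nodes):
    return nodes * ONPREM_COST_PER_NODE_YEAR  # paid whether busy or idle

for utilization in (0.10, 0.50, 1.00):
    cloud = yearly_cloud_cost(40, utilization)
    onprem = yearly_onprem_cost(40)
    better = "cloud" if cloud < onprem else "on-premises"
    print(f"{utilization:>4.0%} busy: cloud ${cloud:>9,.0f} vs on-prem ${onprem:>9,.0f} -> {better}")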

The best advice I can give is, whether you start in the cloud or on bare metal, make sure you have agility and you're able to move workloads around. If you choose an architecture that only works in the cloud, and you're scaling up and have to do a rip-and-replace just to get out of the cloud and move on-premises, that's going to have a significant business impact.

One of the reasons I like HP Vertica is that it has a cloud instance that can run on a public cloud. That same instance, that same architecture runs just as well on bare metal, only faster.

Gardner: Chris, last word to you. For those organizations out there struggling with big data, trying to figure out the best path, trying to think long term, and from an architectural and strategic point of view, what should they consider when coming to an organization like Dasher? Where is your sweet spot in terms of working with these organizations? How should they best consider how to take advantage of what you have to offer?

Saso: Every organization is different, and this is one area where that's true. When people are just looking for servers, they're pretty much all the same. But when you're actually trying to figure out your strategy for how you are going to use big-data analytics, every company, big or small, probably does have a slightly different thing they are trying to solve.

That's where we would sit down with that client and really listen and understand: are they trying to solve a speed issue with their data, or are they trying to deal with massive amounts of data and find the needle in the haystack, the golden nugget in there? Each of those approaches certainly has a different answer.

So come with your business problem and also what you would like to see as a result, such as an x-percent increase in our customer-satisfaction number or an x-percent increase in revenue. That helps us define the metric that we can then help design toward.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.


Tuesday, September 15, 2015

How content in context within apps and process strengthens marketing muscle

The next BriefingsDirect discussion explores the changing role and impact of content marketing, using the IT industry as a prime example. Just as companies now communicate with their consumers and prospects in much different ways -- with higher emphasis on social interactions, user feedback, big data analysis, and even more content to drive conversations -- so too the IT industry has abruptly changed.

There's more movement to cloud models, to mobile applications, to leveraging data at every chance -- and they are also facing lower-margin subscription business models. The margin for error is shrinking in the IT industry. If any industry is the poster child for how to deal with rapid change on all fronts, including marketing, it's surely the global information technology market.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS. Read a full transcript or download a copy.

To examine how the IT industry is adjusting to the dynamic nature of marketing, we're joined by Lora Kratchounova, the Founder and Principal at Scratch Marketing and Media in Cambridge, Mass. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Lora, you and I have been talking about marketing for years now. We're in an interesting field, and it's been such a dynamic time. I have some interesting ideas about where technology is going, where marketing is intersecting with it, and how they're both changing.

So, let’s start at a high level. Content marketing has proven to be very successful, and you and I have had a hand in this. Creating compelling stories, narratives about what’s going on, and how people can learn from peers as they go through problems and solve them, has become a mainstay in marketing. From your perspective, why is content marketing so important? Why has it been so successful?

Kratchounova: There are a couple of reasons for that. The pace of change is tremendous now. People are trying to get their bearings on what's going on in their markets, and a lot of times, they need to get educated. What has changed with social media is that information is a lot more immediate and transparent, and you can get it from many more sources than just the online presence of a company, for example.

The top-down model in marketing is changing. We used to rely on companies to tell us how to think about the world, and now we can form our own opinions. As we realize that the customer is in the driver's seat, educating themselves and making their own decisions about how to go about change, companies are realizing that they need to feed into that flow and be part of that discussion. Content marketing has been so successful because you become an educator, not just someone selling to people, and especially in IT.

Gardner: And I think people have become much more accustomed to conversations, rather than just a one-direction information flow. "We're the seller and we're going to tell you what it is." Now, people want to relate. They want to hear what others have to think. It’s much more of an actual conversation.

Ongoing conversation

Kratchounova: Exactly. Look at any IT domain. It's interesting when we look at who is influential and who the main voices are, the voices that people consider experts. You pretty much consistently see reporters, journalists, and analysts, folks like you, but then we see that a lot of C-level executives from IT companies are becoming that kind of voice as well.

That just points to the need for that ongoing conversation, the need for sharing at all levels of the buyer funnel. Once people have bought into a selection, they need to make sure of adoption and that they're maximizing the investment.

So the conversation is very important, and the immediacy of having access to folks and having the ability to exchange a few thoughts on Twitter or LinkedIn has changed the dynamic completely. So it’s absolutely about conversations and storytelling, but it's still mapped to the buyer’s funnel.

People are still educating themselves and still looking at options for a change or a replacement, one or the other, until they select the people they want to work with. And it's usually the people behind brands. It's not just that they want to work with this company; it's the people behind it. We're moving more to a people economy.

Gardner: As you point out, you can get to the real source of the knowledge nowadays. Publishing is available to anybody whether they're tweeting, blogging, posting on Facebook, or putting something up on their company website. Anybody who has something to say can say it. It can get indexed and it can be made available to anybody who wants to hear about that particular topic.

Most people now don’t just sit back and wait for information to reach them. They're proactive. They go out, they start to search, they do hashtag searches on Twitter, and they can do Google or Bing on web.

It’s much more of, "I know something; I'm putting it out there." And there's another case of someone saying, "I need to know something; I am seeking it." They come together on their own. The content makes that possible. The better the content, the better the likelihood that those in a need to know and those in a need to tell come together.

Kratchounova: Exactly, but I think you hit on something very important. Everybody can publish, and a lot of people are publishing. Yet what we're really interested in is people falling in love with your people and what they have to say.

The ability to publish is great, and it democratizes the means of how we communicate with and educate each other, but you still have to earn it. This is very important. People who really are influential are usually domain experts, and they're there to help other people. That's the other aspect that companies, their marketing teams, and their executives need to think about. You have to actively participate and show your expertise; it doesn't come automatically.

Importance of curation

Gardner: And there's another aspect to greasing the skids between the knowledge and the acquirer of the knowledge, and that is content curation. There are people who point at things, give it credence, and say that it's a good thing, you should read it; or that’s a bad thing, don’t waste your time -- and that helps refine this.

Kratchounova: It’s pretty exciting.

Gardner: There are machines doing the same thing. There are algorithms, there's indexing; there are both human and machine aspects of winnowing down the good stuff and providing it to people who need to know, and that's when this is going to get more powerful.

Kratchounova: Great. I'm sure you know about Narrative Science. I've had a professional crush on this company for a few years now. They take data and turn it into storytelling, and I think this is phenomenal. Obviously, that's not going to replace some of the human storytelling that needs to happen, but some of the data storytelling will come from technology. This is one particular application where marketing and technology come together to bring something completely new to life.

Gardner: So we can get knowledge through expertise or we can get knowledge through experience, someone who has gone through it already and is willing to share that with you. If you're acquiring IT, it’s super important to avail yourself of everything, because it changes so rapidly and the costs are high.

If you make a big mistake in how you're designing a data center, you're out millions of dollars, your products don't work, and your front office is going to come screaming down on you. You have to make the big decisions and you have to make them correctly in IT. It's not just a service to the business; it is the business.

So, let’s think about the IT industry in particular, and then think about how content marketing as we’ve discussed is powerful. How do IT people acquire content marketing? Do they get it through websites, emails, or tweets? Is it delivered to them at a webinar that they opt into? How does content marketing reach somebody who's an IT buyer?

Kratchounova: It depends on the IT buyer, because we can't necessarily lump them together and ask how the IT buyer goes about it. There are people with different needs, and it depends on their role. If you're a CIO or CTO, there's a different mix of channels and sources you use. If you're on the dev or the ops side and looking for specific solutions, you're going to completely different channels.

For example, if you're a DevOps professional, you're maybe on Stack Overflow and you might be seeking advice from other folks. You might be on GitHub and sharing open-source code and getting feedback on that.

If you're a CIO or CTO, what we have found working with a number of different companies, be they global companies or companies that are growing, is that they do seek out their peers to validate what they're going through. One of the best things that companies can do, when they try to talk to the C-level, is expose some of those connections that they already have with their customers. Make sure that the customers are part of the discussion and can chime in.

Another important source of information for the C level in IT would be folks like you, analysts, and strategic system integrators like Accenture and Deloitte, because these folks are exposed to the kinds of challenges that a CIO or CTO would go through. So they have a lot to bring to the table in terms of risk mitigation, optimal deployment, and maximization of the investment in IT. Making those connections and sharing those experiences we have seen work really, really well.

Let me just throw this in as well. The other thing we have seen is that the C level is still going on Google. They're still doing the searches. We have compelling data, across the board, that in any B2B complex enterprise environment folks are self-educating as well. So it’s not a question of either/or; it’s what’s the right mix for each company depending on channels, depending on where people sit.

Spectrum of content

Gardner: So there's a spectrum of content, some highly technical and detailed, on places like GitHub, that is germane to a technologist. Then there's a level up from there toward peer review of products and solutions. Then there are more business topics: what is strategic, what's the forward direction, how do I understand decision processes at an architectural level, and where can I go to find out what's coming down the pike and then put it in place.

Kratchounova: Think about Spiceworks. They're probably at five million IT professionals at this point, and the community is there for a reason. So again, there isn't one size fits all. One thing that we always recommend to folks is that if you're looking to develop an influencer strategy and approach in IT, it really depends on what domains you span.

You find that even within mobile application development, the folks who are really influential and set the standards in that space are somewhat different from the folks who are concerned with security in mobile app development. So there isn't necessarily one pool of influencers that you need to go to, develop a relationship with, and understand what's on their minds. It really depends on your domain.

Gardner: So if you're a marketer and you recognize that quality content is super important, you need to have a spectrum of content. Some of it needs to be germane to a technologist: highly detailed, how-to material. You need peer reviews and stories, case studies, and testimonial-type content where the customer tells what they've done, why it benefited them, and what you can learn from that.

You also need to have higher-level discussions with experts to help people chart the next course, the strategic level. So content needs to come across a spectrum, and we recognize that the way in which people get that content might be through search. It might be through web, e-mail, webinars, webcasts, reading certain online sites, listening to certain Twitter feeds or groups, or having a select group of people that you follow. All of that happens.

But what’s interesting to me, Lora, is that all has to do with the web. But what we're seeing in IT is a rapid movement toward mobile apps, rather than just the web. And in many cases, they're starting to overtake the web as to where people spend their time. I'm sure you're using a smartphone and you have mobile apps. You're not going on the web to find a cab; you’re going to the Uber app to find a cab.

If you're looking for a restaurant review, you’re not necessarily going on the web and doing a search. You’re going into a specific app on Yelp, OpenTable, or somewhere else to find out where your restaurants are and you’re going into Google Maps to find out how to get there.

So more and more, we're seeing, on the consumer side, people using mobile apps for more of their processes, for their inquiry, for their actual productivity. Then, on the enterprise side, the business-to-employee (B2E) side, we're seeing people using cloud services.

We're moving more toward mobile applications, cloud services, an API-driven world that leverages big data and analytics in order to put context into process. It's all about user experiences, and mobile delivers the best. How then does content continue to reach people? Do we lose the ability to deliver content when they are in apps?

Different perspective

Kratchounova: I have a different perspective on what you're describing. I don’t know that we are moving to a mobile app experience necessarily. When we think about the apps and the examples you gave -- Yelp or Uber -- yes, they're best-of-breed applications that we use because these are the most frequently used applications.

But what you're seeing is actually a digital transformation. Digital no longer means the web, as we know it, going online through your computer. You're actually navigating on a mobile device. So it’s this digital transformation that’s happening, and the trend that we're seeing is aggregation.

It’s not about one individual app, but it’s more about what is the Flipboard within the enterprise. You're seeing that sort of aggregation bubbling up to the top because information overload is a huge problem. People can’t prioritize anymore. They can’t toggle among those different applications and companies.

For example, one of our clients, not to necessarily add a plug for them, actually is very germane to the discussion. Harmon.ie does exactly that.

In those kinds of environments, what we're finding, and where I totally agree with you, is that the ability to read and understand context lets you support the user, be that an employee in an internal work experience or an external customer, and help them get the job done.


The role of content is actually merging with big data, because big data is helping us understand context and ask, "What do we serve this person here?" On the marketing side, in the lingo, it's more about ongoing customer journeys. Think about the same thing on the employee side, ongoing employee journeys, or partner journeys.

Once you understand that, then you understand what a partner is trying to do. Why are they here, what's the context, what's the most logical or optimal next step? Now, content becomes not only a way for people to find something, but also a way for marketers and product-development folks, and I think those functions are merging as well, to deliver the right content in the right format so that the user can get the job done. That's my perspective on that.

Gardner: There's no disagreement from me on this issue of context to process, context to location, context to need for knowledge all being much more granular and powerful going forward. What I am concerned about is that, when I talk to developers, the vast majority of them are much more interested in a mobile-first, cloud-first world.

They're not much interested in building what we used to think of as big honking applications in the enterprise. They're much more interested in how to bring services -- and microservices -- together in context to provide a better productive outcome and how to leverage low-cost services in APIs and from any cloud.

Discovering inference

So, to me, it becomes, on one hand, all the more important to have the ability to deliver content contextually into these processes, but at the same time these processes are becoming fragmented. They're going across hybrid-cloud environments, they include both what we call cloud and SaaS, and I'm not sure where the marketer now can get enough inference to support the injection of content appropriately.

The ways that it’s been done now is usually through the web where we have links, and we have code, and we can do cookies. It’s sort of like, it’s Web 1.0 mechanisms by which marketers are injecting content, but we are moving not only pass Web 2.0, we're into Web 3.0  cloud platform. To me this is a big question mark.

Kratchounova: It is a question mark. I don’t know that there is going to be one mode of delivering what we're talking about or one approach or one framework. I'll give you one example. Look at how web content management has changed. It used to be about managing pages and updating content. Now, web content management is becoming the Marketing Command Center, if you look at a web content management system like Sitefinity, for example.

Now, marketers can reach the customer on their mobile device and on the web, so they can inject the content that needs to happen there. The reason they can do this now is the analytics that come from all of these customer interactions, actually creating cohorts of people as they go through your web or online experience. You know why they're there and what the optimal path is for them to get where they need to be.

So, you're seeing this ability to distribute and push content to people, but in a much more contextual way. There's going to be pull and push, but the push is getting a lot smarter and very contextual.

Gardner: So it’s incumbent upon us who are examining this marketing evolution in the context of the IT industry to create that spectrum of content to make it valuable, to make it appropriate and not too commercial or crass, but useful. And at the same time now, think about how to get this in front of right people at the right time.

It seems to me that if I'm an IT company delivering more and more of my services, whether B2B, B2C, B2E, or all of the above, I need to be thinking about ways to communicate with my existing universe or market and move them toward new products and services as they need them, in the context of their process.

Think about this in a B2C environment in retail, where I am walking through Wal-Mart. I have my smartphone and, as I turn the corner, they know that now I am interested in home goods, and they are going to start to incentivize me to buy something. That’s kind of an understood mechanism by which my location and the fact that I turned a corner and made a decision provides an inference that then they can react to with content or information.

But take that now to the B2B environment where I'm in a business setting. I'm in procurement, I'm in product development, or I'm looking for a supply chain efficiency. I want to move into a new geographic location and I need to find the means to do that. All of those things are also like turning a corner in a Wal-Mart, except you're in a business application using cloud services, using a mobile device and apps.

If I'm an IT vendor, I'm going to want to have content or information that I can bring to that situation, perhaps even through an example of what other people have done when they face that same process crossroads. So the content can be more important and more germane. These are multi-million-dollar decisions in some cases.

Don’t you think that big companies should be starting to make content with the idea that it’s going to become part of their application services, part of their cloud delivery services, and that they need to use big data and analytics to know when to inject it?

Understanding context

Kratchounova: I absolutely agree. I think that difference between the example you just gave for Wal-Mart and a B2B environment is that, in Wal-Mart, you don’t need to understand so much about who the person is, what their role is, whether they work at an accounting firm or whether they are a physician, for example.

In a B2B environment you do need to understand context, and context is the location or the point where they are in their journey, whatever that journey maybe, and their role as well, because different people do have different decisions to make.

It’s a little bit more complex to bring context in a B2B environment, but it’s absolutely essential. You used the word inference. We always get enamored by the concept of the big data and guess what, once the machines are there, they're going to analyze everything and it's going to be this perfect world of marketing where everyone is aligned. 

Just look at the history of marketing. We don't know ourselves as people. We individually don't know ourselves that well, let alone someone else getting to know us that well. Inference is very important, but it's going to be a balance between inferring what the person needs and allowing the person to customize the experience as well. So it's going to come both ways.

Some people go to one extreme or the other. Some still believe that it's a relationship-based world and, therefore, there's no need for a digital experience for their customers or potential buyers, which is actually never the case. Other people believe that it's all digital and therefore they don't need to touch them in any other way, which is rarely the case, especially in IT.

Gardner: I also suggest to you that the data is more readily available, because I, as an employer, as a corporation, control what’s going on. I know what that employee is doing. I know what apps they're using. I know what data they're seeking. 

They're going to provide a feed of data back to you about what’s going on, on those apps from your very own employees.

What I'm suggesting then, as we begin to close out this fascinating conversation, is that you need to have content, stories, and customers lined up, so that you can uncover their path to truth, their path to value, and have that content context-ready. Not only are you going to be using it in webinars, webcasts, podcasts, and blogs, but pretty soon, if my hypothesis is correct, you're going to be using that content in the context of process and inside of applications in cloud services and on mobile devices.

Way of the future

Kratchounova: Maybe this is an opportunity, because it is the way of the future, and some people are more mature and others less so, but maybe we can bring other people into the discussion and see what other folks in the field think about where content is going, how to contextualize it, and how to deliver it. One of the biggest questions is how we scale this. You can still create a meaningful experience one-on-one, but it's hard to recreate that even if your customers number 200, 500, or even 5,000 within the IT space.

Gardner: You also have to remember that people's connections to apps, cloud services and context-aware processes are only going to increase. The Internet of Things and new classes of devices like the Apple Watch are expanding the end points and ways to connect to them. One of the things that’s important with the Apple Watch functionally is that it’s very good at alerts and notifications. It can also detect a lot of context of what you're doing physically and your location, and it can relate, because it integrates to your phone, with what you're doing with applications and cloud services.

Wouldn’t it be interesting if you're wearing an Apple Watch or equivalent, you're in a business setting, and you come up against a problem that you might not even know yet, but all of these services working together are going to say, "That person is going to be facing a problem; they are going to need to make a decision. Let’s put some information, content, and use cases together for them that will help them as they face that situation to make a better decision." That’s the kind of role I think we're heading toward. 

Before we sign off, Lora, tell me more about Scratch Marketing and Media, what you do and why that’s related to this discussion we have had today.

Kratchounova: Scratch Marketing and Media is an integrated marketing agency. We help B2B technology companies with market growth. Sometimes that means helping the sales folks within IT companies, and sometimes it means working with the marketing folks on things like content marketing programs, PR, analyst relations, and influencer relations in social media.

Gardner: And how can they find out more information about Scratch Marketing and Media?

Kratchounova: You can go online at www.scratchmm.com.

Listen to the podcast. Find it on iTunes. Get the mobile app for iOS. Read a full transcript or download a copy.


Wednesday, September 9, 2015

How HTC centralizes storage management to gain visibility and IT disaster avoidance

The next BriefingsDirect storage management innovation case study discussion highlights how communications cooperative HTC centralizes storage management to gain powerful visibility, reduce costs, and implement IT disaster avoidance capabilities.

We'll learn more about how HTC lowers total storage utilization cost while bringing in a common management view to improve problem resolution, automate resource allocation, and more fully achieve compliance -- as well as set the stage for broader virtualization and business continuity benefits.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy.

To explore how HTC attains total storage management, we sat down with Philip Sellers, Senior System Administrator at HTC in Myrtle Beach, South Carolina. The discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Tell us about HTC.

Sellers: HTC is the largest telephone cooperative in the nation. We serve the Myrtle Beach and surrounding South Carolina area. We started out as a telephone company, but at this point, we're a full-line telecommunications company, doing cable TV, internet, security, home automation, and, through our partnership with AT&T, we also do wireless service.

Gardner: Now, you are not HTC, the handset maker from Asia; you are an entirely different company.

Sellers: A completely different company, although we do sell a few of those handsets with our wireless division.

Gardner: You told me when we talked earlier that you are a reluctant storage administrator. You started out as a VMware virtualization admin. How did you get from one to the other, and why is it important for your organization?

Common story

Sellers: It's probably a common story in a lot of shops. As VMware became more prolific in our environment, the lines started to blur between networking and VMware, and between storage and VMware. So I was pulled in those directions as the primary VMware admin for our company. That gave me the opportunity to dig in and start to learn an area of IT that was new to me.

Gardner: Philip, tell us a little bit about the scale: how many virtual machines (VMs), how many employees, what sort of a size organization are you?

Sellers: We have 700 or so employees at this point, and almost that many VMs that we're managing. We have a couple of different storage platforms today, with HP EVA and HP 3PAR StoreServ in-house.

We also use lots of other things. We have HP StoreOnce for backup and HP StoreVirtual for some of our smaller needs, such as remote offices. 

Gardner: What kind of storage workloads are we dealing with here? Is this all of the apps across the company? What set of IT workloads are you addressing? 

Sellers: The group that I'm a part of is actually the internal IT group. So we're running line-of-business applications, not the things that our customers are delivered service across, but the things that run our business to take orders, support financial operations, and those sorts of things.

And we're running a mixture of test and dev and production. One of the great benefits we've realized with VMware is the ability to have a good test and development platform to mirror what we have in production. So it runs the gamut for internal IT.

Gardner: When you start to think about progressing to better utilization and the rationalization of storage, rather than having overlapping or disjointed storage capabilities, what sort of philosophy do you have about storage? How do you think you can make the whole greater than the sum of the parts and get those utilization benefits over time?

Deeper insight

Sellers: It's something that I learned back in my virtualization days. For me, it's huge to have visibility into what's going on in your storage. One of the benefits of our transition to HP 3PAR storage is that we've been able to realize much deeper levels of insight into what's going on inside the arrays.

You know, as we were making that switch, we evaluated other third parties, ultimately deciding on the mid-range 3PAR 7000 series for our environment and for our needs. That visibility has been key for us.

But it’s also come with a set of challenges, because we now have multiple storage consoles that we need to manage from. We have different places that we need to check. One of the keys for us is having somewhere where we can see it all, or get a better idea of the entire environment from an end-to-end perspective.

That’s one of the things we learned from our VMware days. We were flying blind early on, and that caused us problems and potential problems, because we didn’t know something was going on. One of our main goals is establishing good visibility into our storage environment.

Gardner: So, it’s not just enough to modernize your storage and improve your storage capabilities, but at the same time you really need to address the management issues and consolidate management. In doing so, what have been some of the payoffs that you can recall? How has this helped your organization better provide IT services internally?

Sellers: From a performance standpoint, our former primary storage platform was not great at telling us how close we were to the edge of our performance capabilities. We never knew exactly what was going to cause a problem, especially given the unpredictability of virtualized workloads. We never knew where we were going to have issues.

Being able to see into that has allowed us to prevent help-desk costs for slow services -- for problems that maybe we didn't even know were occurring initially. One of the other huge benefits that we've realized is new levels of disaster avoidance.

Gardner: And what do you mean by that, rather than disaster recovery (DR), which is taking care of business after we have had some terrible thing happen? How do you head that off?

Disaster avoidance

Sellers: I know that's not an industry term, but that's what I like to call it, because in our environment, we have two data centers that are fairly close together. What we've implemented is the HP 3PAR StoreServ metro storage clustering feature, which HP calls Peer Persistence and which works with VMware's metro storage cluster. We've done the same with Windows clustering.

We have two sets of 3PARs in different data centers, and they act as one. So, they replicate synchronously between the two locations and they fail over "automagically." I don't know how else to say it. It just seamlessly fails over between the two sites.

For our environment, we were in a particularly vulnerable state if we lost a data array, because so many things were pointing at it. Now, if we lose a single data array, it's not a big deal. It fails over and continues running.
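To make the failover behavior Sellers describes concrete, here is a minimal Python sketch of the two-site idea: every write is committed at both arrays, and the surviving array takes over when its peer goes dark. This is an illustration only -- not HP 3PAR Peer Persistence or VMware code -- and the Site and MetroCluster names are invented for the example.

```python
# Conceptual sketch only. It models the two ideas described above: writes are
# acknowledged by both sites (synchronous replication), and the surviving site
# takes over automatically when its peer stops responding.

class Site:
    def __init__(self, name):
        self.name = name
        self.online = True
        self.blocks = {}          # block address -> data

    def write(self, addr, data):
        if not self.online:
            raise IOError(f"{self.name} is down")
        self.blocks[addr] = data


class MetroCluster:
    def __init__(self, primary, secondary):
        self.primary, self.secondary = primary, secondary

    def write(self, addr, data):
        # Synchronous replication: only acknowledge once both arrays commit.
        try:
            self.primary.write(addr, data)
            self.secondary.write(addr, data)
        except IOError:
            self.failover()
            # Retry on whichever array is still serving I/O (degraded mode).
            self.primary.write(addr, data)

    def failover(self):
        # "Automagic" failover: promote the surviving array. A real metro
        # cluster also consults a quorum witness to avoid split-brain.
        if not self.primary.online:
            self.primary, self.secondary = self.secondary, self.primary
            print(f"Failed over to {self.primary.name}")


site_a, site_b = Site("DC-A"), Site("DC-B")
cluster = MetroCluster(site_a, site_b)
cluster.write(0, b"payroll batch")
site_a.online = False             # simulate losing the primary data center
cluster.write(1, b"order entry")  # keeps running from DC-B
```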

Gardner: And when you say vulnerable, I think you're talking about hurricanes?

Sellers: A lot of times we plan for those large natural disasters, but sometimes it’s the small ones that get us like UPS maintenance or something as simple as a power outage. Maybe your generator doesn’t kick in in time. Sometimes, that can be a disaster of almost the same scale as a hurricane to your business operations -- just from something simple.
Gardner: So the storage management capability has provided this disaster avoidance "automagically," as you say. That's a pretty important benefit. Do you have any idea of the value of that to your business -- can you maybe put it in dollar terms? It seems a pretty profound difference.

Sellers: I can't necessarily put it into dollar terms. That's not the world that I work in, but I know that anytime there is downtime for our customer relationship advisers and the people in the field, that's bad for business.

So we're avoiding those kinds of situations as best we can. We could lose an entire data center site and, with the technology built into the VMware layer and into the HP 3PAR layer, it will come back up. It may be a reboot of a server, but we try to do everything we can to avoid disaster situations today, rather than just plan for failing a data center over to "site B" and going through all of that testing.

Gardner: Let's get down to brass tacks on actual storage utilization benefits. Any thoughts or recollections about what this means in terms of utilization -- no more worries about running out of storage space or capacity?

Seeing benefits

Sellers: Yeah, the HP 3PAR platform has been really great in our environment, because we're actually realizing the marketing claim of "two-to-one thin provisioning." We're seeing that benefit.

When I looked at the console before I came here, we were seeing around a 2.3-to-1 compaction ratio, and that's without deduplication and some of the other newer technologies available in the 3PAR platform. We may be able to do better than that in the future.
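For readers who want to see how a figure like that is derived, here is a tiny Python sketch of the compaction arithmetic. The capacity numbers are invented for illustration; a real array reports them from its console.

```python
# Compaction is logical capacity written by hosts divided by the physical
# capacity actually consumed on the array after thin provisioning (and, on
# newer arrays, deduplication). Numbers below are made up.

logical_written_tb = 92.0   # capacity the hosts believe they have written
physical_used_tb = 40.0     # capacity the array actually consumed

compaction = logical_written_tb / physical_used_tb
print(f"Compaction ratio: {compaction:.1f}:1")   # -> 2.3:1
```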

Gardner: We've talked about disaster avoidance. We've recognized some significant savings in the provisioning and utilization. Let’s go back to management. What sort of benefits are you getting now with a more holistic approach and how does that help, perhaps on a data lifecycle basis?

Sellers: One of the ways we're approaching that set of problems is with storage resource management software. We've traditionally used a piece of software called Storage Essentials, which HP makes. It's heterogeneous storage-management software, so it can look at all of our different arrays -- primary storage as well as our backup environment -- and pull all that information together.

We've been able to leverage that from a reporting standpoint to view and pinpoint growth and see how things are running from a dashboard view. Over the last six months or so, I've been working in an early-release program for a product called HP Storage Operations Manager.

This software is the next iteration of Storage Essentials. It's got a much more approachable and modern user interface, which aggregates our total environment so that we can get a full picture of what's going on. Then we can drill down to specific levels and see how things are performing, what our utilization trend is, or how much time we have until a device or a storage pool is full.

Those are the things that keep us out of really dangerous situations -- getting into a mission-critical season, maybe the holidays or a heavy sales period, running out of disk space, and not being able to get through your procurement cycle quickly enough to buy more storage.

Those things are just as dangerous as the hurricane that we were talking about earlier from a business operations perspective. Tools like this help us to manage and see what’s going on in the environment and help us plan and act proactively.
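The "time until a pool is full" view Sellers mentions boils down to trend projection. Here is a hedged Python sketch of that calculation over a handful of invented weekly capacity samples; a storage resource management tool would pull the real samples from the arrays it monitors.

```python
# Fit a simple linear trend to recent capacity samples and project when the
# pool reaches its size. Pool size and samples are made up for illustration.

from datetime import date

pool_size_tb = 100.0
samples = [                       # (collection date, used TB)
    (date(2015, 8, 3), 62.0),
    (date(2015, 8, 10), 63.1),
    (date(2015, 8, 17), 64.4),
    (date(2015, 8, 24), 65.2),
    (date(2015, 8, 31), 66.6),
]

# Least-squares slope: TB consumed per day.
xs = [(d - samples[0][0]).days for d, _ in samples]
ys = [used for _, used in samples]
n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)

headroom_tb = pool_size_tb - ys[-1]
days_to_full = headroom_tb / slope if slope > 0 else float("inf")
print(f"Growing {slope:.2f} TB/day; roughly {days_to_full:.0f} days until the pool is full")
```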

Gardner: I could really see why your philosophy is visibility and management oversight. It comes back again and again as a huge force multiplier benefit. 

Room to grow

Sellers: Absolutely. There's a saying that ignorance is bliss. When you're flying blind, that's true -- until it catches up with you and eventually overtakes you. We have lots and lots of room to grow with the capabilities we have today. This new version of the storage resource management product has lots of great potential, too.

It’s an initial release. So, it’s got somewhat limited support for different storage families and that kind of thing, but they're working to bring in additional support and make it all that the previous product was, and much more -- and that’s visible from the initial release.

So we're excited about seeing where that can help us, particularly because one of the changes in this new product is that it's not just a collection and analytics reporting system. It's a dashboard system that takes those analytics and brings them back to a dashboard where you can drill down into them and see things clearly in near-real-time. I won't say real-time, but within whatever collection interval you configure.

Gardner: How about your future business activities? How well can you support them? I know that media is a fast-changing business. Do you feel confident now that when your superiors in your organization come to you and say, "We need this," you're in a better position to hop to it quickly? Is there a sense of confidence that you can take on market change better?

Sellers: I certainly believe so. We've been able to adapt and change more quickly because of the changes we've made with VMware and with HP 3PAR. We feel confident that we have room to grow and that we can do so on shorter timelines. We've been able to look at new things like VDI deployments to help us with compliance issues, where we're under regulations and have to patch and ensure that our systems are secure.

So we're now looking at things like that -- workloads we were afraid to put onto primary storage in the past. We think we have a good mix today for the future.

Gardner: What advice might you provide to others who are approaching a disparate storage environment? And maybe share your philosophy about visibility and anticipation being better than reaction. Maybe they're also seeking disaster avoidance, rather than disaster recovery. For those folks who are not quite as far along in this journey as you are, what might you suggest they think about -- or what do you wish you had known earlier?

Sellers: There is definitely some low-hanging fruit, and that's what visibility brings you -- the ability to handle it. If you have a situation where your storage team is siloed away from your server team, bringing in something that can see both of those sides and map the whole environment together is a really easy way to identify inefficiency.
Those might be LUNs that are provisioned but not in use -- there is no I/O on them. That's a dollar amount immediately reclaimed. The same goes for finding idle VMs: these tools can look into the VMware environment and show that you have lots and lots of VMs that are shut down.

There are easy things you can do to start that process, no matter what your storage platform is. I think that's universal. If you have something that gives you visibility into the environment, there are some easy wins you can bring back.
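As a rough illustration of those easy wins, here is a short Python sketch that scans hypothetical CSV exports for LUNs with zero I/O and for powered-off VMs still holding storage. The file names and column names are assumptions for the example, not output from any particular tool.

```python
# Flag LUNs that are provisioned but show no I/O, and VMs that are powered
# off but still consuming datastore space, using made-up CSV exports.

import csv

def idle_luns(path):
    """LUNs with capacity allocated but zero I/O over the sample window."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):          # assumed columns:
            iops = float(row["avg_iops"])      #   lun,avg_iops,provisioned_gb
            if iops == 0:
                yield row["lun"], float(row["provisioned_gb"])

def powered_off_vms(path):
    """VMs that are shut down but still holding storage."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):          # assumed columns:
            if row["power_state"] == "poweredOff":   # vm,power_state,used_gb
                yield row["vm"], float(row["used_gb"])

if __name__ == "__main__":
    reclaim = sum(gb for _, gb in idle_luns("lun_report.csv"))
    stale = sum(gb for _, gb in powered_off_vms("vm_report.csv"))
    print(f"Unused provisioned LUN capacity: {reclaim:.0f} GB")
    print(f"Storage held by powered-off VMs: {stale:.0f} GB")
```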

Further improvements

Gardner: And those of course provide grist for the mill of further improvements and further budget to accomplish even more.

Sellers: Absolutely. If you want to make a storage platform switch or if you want to do other improvements and gain more efficiency, this gives you a little bit of extra room, some wiggle room, to make those things reality. We spent an awful lot of our budget just in keeping the lights on, keeping things up and running. Anytime you can gain some wiggle room from that budget, it certainly allows you the ability to look at innovation.

Listen to the podcast. Find it on iTunes. Get the mobile app. Read a full transcript or download a copy. Sponsor: HP.

You may also be interested in: