Wednesday, February 20, 2019

How the data science profession is growing in value and impact across the business world


The next BriefingsDirect business trends panel discussion explores how the role of the data scientist in the enterprise is expanding in both importance and influence.

Data scientists are now among the most highly sought-after professionals, and they are being called on to work more closely than ever with enterprise strategists to predict emerging trends, optimize outcomes, and create entirely new kinds of business value.

 
Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy.

To learn more about modern data scientists, how they operate, and why a new level of business analysis professional certification has been created by The Open Group, we are joined by Martin Fleming, Vice President, Chief Analytics Officer, and Chief Economist at IBM; Maureen Norton, IBM Global Data Scientist Professional Lead, Distinguished Market Intelligence Professional, and author of Analytics Across the Enterprise; and George Stark, Distinguished Engineer for IT Operations Analytics at IBM. The panel is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: We are now characterizing the data scientist as a profession. Why have we elevated the role to this level, Martin? 
 
Fleming: The benefits we have from the technology that’s now available allow us to bring together the more traditional skills in the space of mathematics and statistics with computer science and data engineering. The technology wasn't as useful just 18 months ago. It’s all about the very rapid pace of change in technology.

Gardner: Data scientists used to be behind-the-scenes people; sneakers, beards, white lab coats, if you will. What's changed to now make them more prominent?

Norton: Today’s data scientists are consulting with the major leaders in each corporation and enterprise. They are not in the back room, mulling over the data anymore. They take the insights they are able to glean and support with facts, and use them to provide recommendations and insights into the business.

Gardner: Most companies now recognize that being data-driven is an imperative. They can’t succeed in today's world without being data-driven. But many have a hard time getting there. It's easier said than done. How can the data scientist as a professional close that gap?

Stark: The biggest drawback in integrating data sources is that the systems are disparate. The financial system is always separate from the operational system, which is separate from the human resources (HR) system. You need to combine those and make sure they're all in the same units and the same timeframe, brought together in a way that can answer two questions. You have to answer, “So what?” And you have to answer, “What if?” And that’s really the challenge of data science.
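
To make that concrete, here is a minimal, hypothetical sketch of the harmonization step Stark describes: three extracts -- finance, operations, and HR -- are put on the same monthly timeframe and joined into a single table, so that "so what?" and "what if?" questions can be asked of one integrated view. The file and column names are assumptions for illustration, not details from the discussion.

```python
# Minimal, hypothetical sketch: aligning disparate finance, operations, and HR
# extracts onto one monthly timeframe before analysis. File and column names
# are assumptions for illustration only.
import pandas as pd

finance = pd.read_csv("finance.csv", parse_dates=["posting_date"])      # revenue in USD
operations = pd.read_csv("operations.csv", parse_dates=["event_time"])  # units produced
hr = pd.read_csv("hr.csv", parse_dates=["snapshot_date"])               # headcount snapshots

# Put every source on the same monthly grain ("MS" = month start).
fin_m = finance.set_index("posting_date").resample("MS")["revenue_usd"].sum()
ops_m = operations.set_index("event_time").resample("MS")["units_produced"].sum()
hr_m = hr.set_index("snapshot_date").resample("MS")["headcount"].last()

# One integrated view: "so what?" and "what if?" questions (revenue per
# employee, units per head) can now be asked of a single table.
combined = pd.concat([fin_m, ops_m, hr_m], axis=1).dropna()
combined["revenue_per_employee"] = combined["revenue_usd"] / combined["headcount"]
print(combined.head())
```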

Gardner: An awful lot still has to go on behind the scenes before you get to the point where the “a-ha” moments and the strategic inputs take place.

Martin, how will the nature of work change now that the data scientist as a profession is arriving – and probably just at the right time?

Fleming: The insights that data scientists provide allow organizations to understand where the opportunities are to improve productivity, how they can make workers more effective and productive, and how to create more value. This enhances the role of individual employees. And it’s that value creation, the integration of the data that George talked about, and the use of analytic tools that are driving fundamental changes across many organizations.

Captain of the data team

Gardner: Is there any standardization as to how the data scientist is being organized within companies? Do they typically report to a certain C-suite executive or another? Has that settled out yet? Or are we still in a period of churn as to where the data scientist, as a professional, fits in?

Norton: We're still seeing a fair amount of churn. Different organizing approaches have been tried. For example, the centralized center of excellence that supports other business units across a company has a lot of believers and followers.

The economies of scale in that approach help. It’s difficult to find one person with all of the skills you might need. I’m describing the role of consultant to the presidents of companies. Sometimes you can’t find all of that in one individual -- but you can build teams that have complementary skills. We like to say that data science is a team sport.

Gardner: George, are we focusing the new data scientist certification on the group or the individual? Have we progressed from the individual to the group yet?

Stark: I don’t believe we are there yet. We’re still certifying at the individual level. But as Maureen said, and as Martin alluded to, the group approach has a large effect on how you get certified and what kinds of solutions you come up with.

Gardner: Does the certification also define the managerial side, so that a certified data scientist can organize that group or office in a methodical, proven way?
Fleming: The certification we are announcing focuses not only on the technical skills of a data scientist, but also on project management and project leadership. So as data scientists progress through their careers, the more senior folks are certainly in a position to take on significant leadership and management roles.

And we are seeing over time, as George referenced, a structure beginning to appear. First in the technology industry, and over time, we’ll see it in other industries. But the technology firms whose names we are all familiar with are the ones who have really taken the lead in putting the structure together.

Gardner: How has the “day in the life” of the typical data scientist changed in the last 10 years?

Stark: It’s scary to say, but I have been a data scientist for 30 years. I began by writing my own Fortran 77 code to integrate datasets, compute eigenvalues and eigenvectors, and build models that would discriminate among key objects and allow us to predict what something was.

The difference today is that I can do that in an afternoon. We have the tools, datasets, and all the capabilities with visualization tools, SPSS, IBM Watson, and Tableau. Things that used to take me months now take a day and a half. It’s incredible, the change.

Gardner: Do you as a modern data scientist find yourself interpreting what the data science can do for the business people? Or are you interpreting what the business people need, and bringing that back to the data scientists? Or perhaps both?

Collaboration is key

Stark: It’s absolutely both. I was recently with a client, and we told them, “Here are some things we can do today.” And they said, “Well, what I really need is something that does this.” And I said, “Oh, well, we can do that. Here’s how we would do it.” And we showed them the roadmap. So it’s both. I will take that information back to my team and say, “Hey, we now need to build this.”

Gardner: Is there still a language, culture, or organizational divide? It seems to me that you’re talking apples and oranges when it comes to business requirements and what the data and technology can produce. How can we create a Rosetta Stone effect here?

Norton: In the certification, we emphasize that data scientists have to understand the business problem. Everything begins from that. Knowing how to ask the right questions, to scope the problem, and then to translate it is essential. You have to look at the available data, and infer some of it, to come up with insights and a solution.

It's increasingly important that you begin with the problem. You don't begin with your solution and say, “I have this many things I can work with.” It's more like, “How are we going to solve this and draw on the innovation and creativity of the team?”

Gardner: I have been around long enough to remember when the notion of a chief information officer (CIO) was new and fresh. There are some similarities to what I remember from those conversations in what I’m hearing now. Should we think about the data scientist as a “chief” something, at the same level as a chief technology officer (CTO) or a CIO?

Chief Data Officer defined 

Fleming: There are certainly a number of organizations that have roles such as mine, where we've combined economics and analytics. Amazon has done it on a larger scale, given the nature of their business, with supply chains, pricing, and recommendation engines. But other firms in the technology industry have as well.

We have found that there are still three separate needs, if you will. There is an infrastructure need that CIO teams are focused on. There are significant data governance and management needs that typically chief data officers (CDOs) are focused on. And there are substantial analytics capabilities that typically chief analytics officers (CAOs) are focused on.

It's certainly possible in many organizations to combine those roles. But in an organization the size of IBM, and other large entities, it's very difficult because of the complexity and requirements across those three different functional areas to have that all embodied in a single individual.

Gardner: In that spectrum you just laid out -- analytics, data, and systems -- where does The Open Group process for a certified data scientist fit in?


Fleming: It's really on the analytics side. A lot of what CDOs do is data engineering, creating data platforms. At IBM, we use the term Watson Data Platform because it's built on a certain technology that's in the public cloud. But that's an entirely separate challenge from being able to create the analytics tools and deliver the business insights and business value that Maureen and George referred to.

Gardner: I should think this is also going to be of pertinent interest to government agencies, to nonprofits, to quasi-public-private organizations, alliances, and so forth.

Given that this has societal-level impacts, what should we think about in improving the data scientists’ career path? Do we have the means of delivering the individuals needed from our current educational tracks? How do education and certification relate to each other?

Academic avenues to certification

Fleming: A number of universities have over the past three or four years launched programs for a master’s degree in data science. We are now seeing the first graduates of those programs, and we are recruiting and hiring.

I think this will be the first year that we bring in folks who have completed a master’s in data science program. As we all know, universities change very slowly. It's the early days, but demand will continue to grow. We have barely scratched the surface in terms of the kinds of positions and roles across different industries.

That growth in demand will cause many university programs to grow and expand to feed that career track. It takes 15 years to create a profession, so we are in the early days of this.


Norton: With the new certification, we are doing outreach to universities because several of them have master’s in data analytics programs. They do significant capstone-type projects, with real clients and real data, to solve real problems.

We want to provide a path for them into certification so that students can earn, for example, their first project profile, or experience profile, while they are still in school.

Gardner: George, on the organic side -- inside of companies where people find a variety of tracks to data scientist -- where do the prospects come from? How does organic development of a data scientist professional happen inside of companies?

Stark: At IBM, in our group, Global Services in particular, we've developed a training program with a set of badges. People are rewarded for achieving various levels of education. But to reach certification, you still need to have completed projects that apply the techniques you've learned through that education.

Having education is not enough. You have to apply it to get certified.

Gardner: This is a great career path, and there is tremendous demand in the market. It also strikes me as a very fulfilling and rewarding career path. What sorts of impacts can these individuals have?
Fleming: Businesses have traditionally been managed through a profit-and-loss statement, an income statement, for the most part. There are, of course, other data sources -- but they’re largely independent of each other. These include sales opportunity information in a CRM system, supply chain information in ERP systems, and financial information portrayed in an income statement. These get the most rigorous attention, shall we say.

We're now in a position to create much richer views of the activity businesses are engaged in. We can integrate across more datasets now, including human resources data. In addition, machine learning (ML) and artificial intelligence (AI) are predictive by nature. We are in a position not only to bring the data together and provide a richer view of what's transpiring at any point in time, but also to generate a better view of where businesses are moving.

It may be about defining a sought-after destination, or there may be a need to close gaps. But understanding where the business is headed in the next 3, 6, 9, and 12 months is a significant value-creation opportunity.

Gardner: Are we then thinking about a data scientist as someone who can help define what the new, best business initiatives should be? Rather than finding those through intuition, or gut instinct, or the highest paid person's opinion, can we use the systems to tell us where our next product should come from?

Pioneers of insight

Norton: That's certainly the direction we are headed. We will have systems that augment that kind of decision-making. I view data scientists as pioneers. They're able to go into big data, dark data, and a lot of different places and push the boundaries to come out with insights that can inform in ways that were not possible before.

It’s a very rewarding career path because there is so much value and promise that a data scientist can bring. They will solve problems that hadn't been addressed before.

It's a very exciting career path. We’re excited to be launching the certification program to help data scientists gain a clear path and to make sure they can demonstrate the right skills.

Gardner: George, is this one of the better ways to change the world in the next 30 years?

Stark: I think so. If we can get more people to do data science and understand its value, I'd be really happy. It's been fun for 30 years for me. I have had a great time.

Gardner: What comes next on the technology side that will empower the data scientists of tomorrow? We hear about things like quantum computing, distributed ledgers, and other new capabilities on the horizon.

Future forecast: clouds

Fleming: In the immediate future, new benefits are largely coming because we have both public cloud and private cloud in a hybrid structure, which brings the data, the compute, and the APIs together in one place. And that allows for the kinds of tools and capabilities that are necessary to significantly improve the performance and productivity of organizations.

Blockchain is making enormous progress, and very quickly. It's essentially a data management and storage improvement, but that opens up the opportunity for further ML and AI applications to be built on top of it. That’s moving very quickly.

Quantum computing is further down the road. But it will change the nature of computing. It's going to take some time to get there, but it is nonetheless very important and is part of what we are looking at over the horizon.

Gardner: Maureen, what do you see on the technology side as most interesting in terms of where things could lead over the next few years for data science?

Norton: The continued evolution of AI is pushing boundaries. One of the really interesting areas is the emphasis on transparency and ethics, to make sure that the systems are not introducing or perpetuating a bias. There is some really exciting work going on in that area that will be fun to watch going forward.

Gardner: The data scientist needs to consider not just what can be done, but what should be done. Is that governance angle brought into the certification process now, or something that it will come later?

Stark: It's brought into the certification now when we ask how things were validated and how the models were implemented in the environment. That’s one of the things that data scientists need to answer as part of the certification. We also believe that in the future we are going to need some sort of code of ethics, some sort of methods for bias detection and analysis, and the measurement of those things -- which doesn't exist today but will have to.
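
As one illustration of the kind of bias measurement Stark anticipates, here is a minimal, hypothetical sketch of a disparate impact check, which compares favorable-outcome rates across groups. The data is invented, and the 0.8 rule of thumb is a common convention rather than anything defined by the certification.

```python
# Minimal, hypothetical sketch of one common bias check: the disparate impact
# ratio, i.e., the lowest group's favorable-outcome rate divided by the highest.
import pandas as pd

predictions = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   0,   1,   1,   1,   0,   0,   1],
})

rates = predictions.groupby("group")["approved"].mean()
disparate_impact = rates.min() / rates.max()

# A common rule of thumb flags ratios below 0.8 for further review; in practice
# you would also look at error rates, calibration, and the business context.
print(rates)
print(f"Disparate impact ratio: {disparate_impact:.2f}")
```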

Gardner: Do you have any examples of data scientists doing work that's new, novel, and exciting?

Rock star potential

Fleming: We have a team led by a very intelligent and aggressive young woman who has put together a significant product recommendation tool for IBM. Folks familiar with IBM know it has a large number of products and offerings. In any given client situation the seller wants to be able to recommend to the client the offering that's most useful to the client’s situation.

And our recommendation engines can now make those recommendations to the sellers. That capability really hasn't existed in the past, and it is now creating enormous value -- not only for the clients but for IBM as well.

Gardner: Maureen, do any examples jump to mind that illustrate the potential for the data scientist?

Norton: We wrote a book, Analytics Across the Enterprise, to explain examples across nine different business units. There have been some great examples in terms of finance, sales, marketing, and supply chain.
Gardner: Does any use-case scenario come to mind where the certification may have been useful?

Norton: Certification would have been useful to an individual in the past because it helps map out how to become the best practitioner you can be. We have three different levels of certification going up to the thought leader. It's designed to help that professional grow within it.

Stark: A young man who works for me in Brazil built a model for one of our manufacturing clients that identifies problematic infrastructure components and recommends actions to take on those components. And when the client implemented the model, they saw a 60 percent reduction in certain incidents and a 40,000-hour-a-month increase in availability for their supply chain. And we didn't have a certification for him then -- but we will have now.

Gardner: So, a really big improvement. It shows that being a data scientist means you're impactful, and it puts you in the limelight.

Stark: And it was pretty spectacular because the CIO for that company stood up in front of his whole company -- and in front of a group of analysts -- and called him out as the data scientist that solved this problem for their company. So, yeah, he was a rock star for a couple days.

Gardner: For those folks who might be more intrigued with a career path toward certification as a data scientist, where might they go for more information? What are the next steps when it comes to the process through The Open Group, with IBM, and the industry at large?

Where to begin

Norton: The Open Group officially launched this in January, so anyone can go to The Open Group website and check under certifications. They will be able to read the information about how to apply. Some companies are accredited, and others can get accredited for running a version of the certification themselves.

IBM recently went through the certification process. We have built an internal process that matches with The Open Group. People can apply either directly to The Open Group or, if they happen to be within IBM or one of the other companies who will certify, they can apply that way and get the equivalent of it being from The Open Group. 

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: The Open Group.


Wednesday, February 13, 2019

Why enterprises should approach procurement of hybrid IT in entirely new ways



The next BriefingsDirect hybrid IT management strategies interview explores new ways that businesses should procure and consume IT-as-a-service. We’ll now hear from an IT industry analyst on why changes in cloud deployment models are forcing a rethinking of IT economics -- and maybe even the very nature of acquiring and cost-optimizing digital business services.

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy.

Here to help us explore the everything-as-a-service business model is Rhett Dillingham, Vice President and Senior Analyst at Moor Insights and Strategy. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.
Here are some excerpts:

Gardner: What is driving change in the procurement of hybrid- and multi-cloud services?

Dillingham: What began as organic adoption -- from the developers and business units seeking agility and speed -- is now coming back around to the IT-focused topics of governance, orchestration across platforms, and modernization of private infrastructure.

There is also interest in hybrid cloud, as well as multi-cloud management and governance. Those amount to complexities that the public clouds are not set up for and are not able to address because they are focused on their own platforms.
Gardner: So the way you acquire IT these days isn’t apples or oranges, public or private, it’s more like … fruit salad. There are so many different ways to acquire IT services that it’s hard to measure and to optimize.

Dillingham: And there are trade-offs. Some organizations are focused on and adopt a single public cloud vendor. But others see that as a long-term risk in management, resourcing, and maintaining flexibility as a business. So they’re adopting multiple cloud vendors, which is becoming the more popular strategic orientation.

Gardner: For those organizations that don’t want mismanaged “fruit salad” -- that are trying to homogenize their acquisition of IT services even as they use hybrid cloud approaches -- does this require a reevaluation of how IT in total is financed?

Champion the cloud

Dillingham: Absolutely, and that’s something you can address, regardless of whether you’re adopting a single cloud or multiple clouds. The more you use multiple resources, the more you are going to consider tools that address multiple infrastructures -- and not base your capabilities on a single vendor’s toolset. You are going to go with a cloud management vendor that produces tools that comprehensively address security, compliance, cost management, and monitoring, et cetera.

Gardner: Does the function of IT acquisitions now move outside of IT? Should companies be thinking about a chief procurement officer (CPO) or chief financial officer (CFO) becoming a part of the IT purchasing equation?

Dillingham: By virtue of the way cloud has been adopted -- more by the business units -- they got ahead of IT in many cases. This is now being pushed back toward gaining a fuller financial view. That move doesn’t make the IT decision-maker into a CFO so much as it turns them into a champion of IT. And IT goes back to being the governance arm, where traditionally it has been managing cost, security, and compliance.

It’s natural for the business units and developers to now look to IT for the right tools and capabilities, not necessarily to shed accountability but because that is the traditional role of IT, to enable those capabilities. IT is therefore set up for procurement.

IT is best set up to look at the big picture across vendors and across infrastructures rather than the individual team-by-team or business unit-by-business unit decisions that have been made so far. They need to aggregate the cloud strategy at the highest organizational level.

Gardner: A central tenet of good procurement is to look for volume discounts and to buy in bulk. Perhaps having that holistic and strategic approach to acquiring cloud services lends itself to a better bargaining position?
Dillingham: That’s absolutely the pitch of a single-cloud-vendor approach, and there are trade-offs. You can certainly aggregate more spend on a single cloud vendor and potentially achieve bigger discounts through that aggregation.

The rebuttal is that on a long-term basis, your negotiating leverage in that relationship is constrained versus if you have adopted multiple cloud infrastructures and can dialogue across vendors on pricing and discounting.

Now, that may turn into more of an 80/20-, 90/10-split than a 50/50-split, but at least by having some cross-infrastructure capability -- by setting yourself up with orchestration, monitoring, and governance tools that run across multiple clouds -- you are at least in a strategic position from a competitive sourcing perspective.

The trade-off is the cost-aggregation and training necessary to understand how to use those different infrastructures -- because they do have different interfaces, APIs, and the automation is different.

Gardner: I think that’s why we’ve seen vendors like Hewlett Packard Enterprise (HPE) put an increased emphasis on multi-cloud economics, and not just the capability to compose cloud services. The issues we’re bringing up force IT to rethink the financial implications, too. Are the vendors on to something here when it comes to providing insight and experience in managing a multi-cloud market?

Follow the multi-cloud tour guide

Dillingham: Absolutely, and certainly from the perspective that when we talk multi-cloud, we are not just talking multiple public clouds. There is a reality of large existing investments in private infrastructure that continue for various purposes. That on-premises technology also needs cost optimization, security, compliance, auditability, and customization of infrastructure for certain workloads.

That means the ultimate toolset to be considered needs to work across both public and private infrastructures. A vendor that’s looking beyond just public cloud, like HPE, and delivers a multi-cloud and hybrid cloud management orientation is set up to be a potential tour guide and strategic consultative adviser.

And that consultative input is very valuable when you see how much pattern-matching there is across customers -- and not just within the same industry but across industries. The best insights will come from knowing what it looks like to triage application portfolios, what migrations you want across cloud infrastructures, and the proper setup of comprehensive governance, control processes, and education structures.

Gardner: Right. I’m sure there are systems integrators, in addition to some vendors, that are going to help make the transition from traditional IT procurement to everything-as-a service. Their lessons learned will be very valuable.

That’s more intelligent than trying to do this on your own or go down a dark alley and make mistakes, because as we know, the cloud providers are probably not going to stand up and wave a flag if you’re spending too much money with them.
Dillingham: Yes, and the patterns of progression in cloud orientation are clear for those consultative partners, based on dozens of implementations and executions. From that experience they are far more thoroughly aware of the patterns and how to avoid falling into the traps and pitfalls along the way, more so than a single organization could expect, internally, to be savvy about.

Gardner: It’s a fast-moving target. The cloud providers are bringing out new services all the time. There are literally thousands of different cloud service SKUs for infrastructure-as-a-service, for storage-as-a-service, and for other APIs and third-party services. It becomes very complex, very dynamic.

Do you have any advice for how companies should be better managing cloud adoption? It seems to me there should be collaboration at a higher level, or a different type of management, when it comes to optimizing for multi-cloud and hybrid-cloud economics.

Cloud collaboration strategy

Dillingham: That really comes back to the requirement that the IT organization partner with the business units. The more business units there are in the organization, the more IT is critical in driving collaboration at the highest organizational level and in being responsible for the overall cloud strategy.
The cloud strategy across the topics of platform selection, governance, process, and people skills -- that’s the type of collaboration needed. And it flows into these recommendations from the consultancies of how to avoid the traps and pitfalls. For example: Avoiding mismanagement of expectations and goals in order to drive clear outcomes on the execution of projects, making sure that security and compliance are considered and involved from a functional perspective all the way through, and on down the list.


The decision of what advice to bring in is really about the topic and the selection on the menu. Have you considered the overarching strategy and approach? How well have you triaged your application portfolio? How can you best match capabilities to apps across infrastructures and platforms?

Do you have migration planning? How about migration execution? Those can be similar or separate items. You also have development methodologies, and the software platform choices to best support all of that along with security and compliance expertise. These are all aspects certain consultancies will have expertise on more than others, and not many are going to be strong across all of them.

Gardner: It certainly sounds like a lot of planning and perhaps reevaluating the ways of the past. 

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy. Sponsor: Hewlett Packard Enterprise.


Friday, February 8, 2019

Manufacturer gains advantage by expanding IoT footprint from many machines to many insights


The next BriefingsDirect manufacturing modernization and optimization case study centers on how a Canadian maker of containers leverages the Internet of Things (IoT) to create a positive cycle of insights and applied learning.

We will now hear how CuBE Packaging Solutions, Inc. in Aurora, Ontario has deployed edge intelligence to make 21 formerly isolated machines act as a single, coordinated system as it churns out millions of reusable package units per month.

Listen to the podcast. Find it on iTunes. Read a full transcript or download a copy.

Stay with us as we explore how harnessing edge data with more centralized real-time analytics integration cooks up the winning formula for an ongoing journey of efficiency, quality control, and end-to-end factory improvement.


Here to describe the challenges and solutions for injecting IoT into a plastic container’s production advancement success journey is Len Chopin, President at CuBE Packaging Solutions. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: Len, what are the top trends and requirements driving the need for you to bring more insights into your production process?

Chopin: The very competitive nature of injection molding requires us to utilize the intelligent edge to stay ahead of the competition. By tapping into and optimizing the equipment, we gain downtime efficiencies, improved throughput, and all the things that help drive more to the bottom line.

Gardner: And this is a win-win because you’re not only improving quality but you're able to improve the volume of output. So it’s sort of the double-benefit of better and bigger.

Chopin: Correct. Driving volume is key. When we are running, we are making money, and we are profitable. By optimizing that production, we are that much more profitable. And by using analytics and protocols for preventive and predictive maintenance, the IoT solutions drive an increase in uptime on the equipment.

Gardner: Why are sensors in and of themselves not enough to attain intelligence at the edge?

Chopin: The sensors are reactive. They give you information, and it's good information. But leaving it up to people to interpret [the sensor data] takes time. By utilizing analytics -- pooling the data and looking for trends -- IoT pushes to us what we need to know, and when.

Otherwise we tend to look at a lot of information that’s not useful. Utilizing the intelligent edge means it's pushing to us the information we need, when we need it, so we can react appropriately with the right resources.
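
Here is a minimal, hypothetical sketch of that "push what we need, when we need it" idea: a rolling average of machine cycle times is tracked at the edge, and an alert is emitted only when the trend drifts past a limit, instead of forwarding every raw reading for someone to interpret. The window size, threshold, and names are illustrative assumptions, not details of CuBE's implementation.

```python
# Minimal, hypothetical sketch of "push only what matters" edge analytics:
# track a rolling average of cycle times and alert only when it drifts past
# a limit. The threshold, window, and field names are illustrative.
from collections import deque

WINDOW = 60              # evaluate the last 60 cycles
CYCLE_TIME_LIMIT_S = 9.5 # alert if the rolling average drifts above this

recent_cycles = deque(maxlen=WINDOW)

def on_cycle_time(machine_id: str, cycle_time_s: float) -> None:
    """Called for each cycle-time reading coming off a machine's sensor."""
    recent_cycles.append(cycle_time_s)
    rolling_avg = sum(recent_cycles) / len(recent_cycles)
    # Push an actionable alert instead of forwarding every raw reading.
    if len(recent_cycles) == WINDOW and rolling_avg > CYCLE_TIME_LIMIT_S:
        print(f"ALERT {machine_id}: average cycle time {rolling_avg:.2f}s "
              f"over the last {WINDOW} cycles exceeds {CYCLE_TIME_LIMIT_S}s")

# Example: feed a few simulated readings for one machine.
for reading in [8.8, 9.1, 9.0, 9.6, 9.7]:
    on_cycle_time("press-07", reading)
```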

Gardner: In order to understand the benefits of doing this well, let's talk about the state at CuBE Packaging before you had sensors, when you weren't processing the data and weren't creating a cycle of improvement.

Chopin: That was firefighting mode. You really have no idea of what's running, how it’s running, is it trending down, is it fast enough, and is it about to go down. It equates to flying blind, with blinders on. It’s really hard in a manufacturing environment to run a business that way. A lot of people do it, and it’s affordable -- but not very economical. It really doesn’t drive more value to the bottom line.

Gardner: What have been the biggest challenges in moving beyond that previous “firefighting” state to implementing a full IoT capability?

Chopin: The dynamic within our area in Canada is resources. There is lot of technology out there. We rise to the top by learning all about what we can do at the edge, how we can best apply it, and how we can implement that into a meaningful roadmap with the right resources and technical skills of our IT staff.

It’s a new venture for us, so it's definitely been a journey. It is challenging. Getting that roadmap and then sticking to the roadmap is challenging, but as we go through the journey we learn the more relevant things. It's been a dynamic roadmap, which it has to be as the technology evolves and we delve into the world of IoT, which is quite fascinating for us.

Gardner: What would you say has been the hardest nut to crack? Is it the people, the process, or the technology? Which has been the most challenging?

Trust the IoT process

Chopin: I think the process, the execution. But we found that once you deploy IoT, and you begin collecting data and embarking on analytics, then the creative juices become engaged with a lot of the people who previously were disinterested in the whole process.

But then they help steer the ship, and some will change the direction slightly or identify a need that we previously didn't know about -- a more valuable path than the one we were on. So people are definitely part of the solution, not part of the problem. For us, it’s about executing to their new expectations and applying the information and technology to find solutions to their specific problems.

We have had really good buy-in with the people, and it’s just become about layering on the technical resources to help them execute their vision.

Gardner: You have referred to becoming, “the Google of manufacturing.” What do you mean by that, and how has Hewlett Packard Enterprise (HPE) supported you in gaining that capability and intelligence?

Chopin: “The Google of manufacturing” was first coined by our owner, JR. It’s his vision so it’s my job to bring it to fruition. The concept is that there’s a lot of cool stuff out there, and we see that IoT is really fascinating.

My job is to take that technology and turn it into an investment with a return on investment (ROI) from execution. How is that all going to help the business? The “Google of manufacturing” is about growth for us -- by using any technology that we see fit and having the leadership to be open to those new ideas and concepts. Even without having a clear vision of the roadmap, it means focusing on the end results. It’s been a unique situation. So far it’s been very good for us.

Gardner: How has HPE helped in your ability to exploit technologies both at the edge and at the data center core?

Chopin: We just embarked on a large equipment expansion [with HPE], which is doubling our throughput. Our IT backbone, our core, was just like our previous equipment -- very old, antiquated, and not cutting edge at all. It was a burden as opposed to an asset.

Part of moving to IoT was putting in a solid platform, which HPE has provided. We work with our integrator and a project team that mapped out our core for the future. It’s not just built for today's needs -- it's built for expansion capabilities. It's built for year-two, year-three. Even if we’re not fully utilizing it today -- it has been built for the future.

HPE has more things coming down the pipeline that are built on and integrated to this core, so that there are no real limitations to the system. No longer will we have to scrap an old system and put a new one in. It’s now scalable, which we think of as the platform for becoming the “Google of manufacturing,” and which is going to be critical for us.

Gardner: Future-proofing infrastructure is always one of my top requirements. All right, please tell us about CuBE Packaging -- your organization's size, what you're doing, and what your end products are.

The CuBE takeaway

Chopin: We have a 170,000-square-foot facility, with about 120 employees producing injection-molded plastic containers for the food service industry, for home-meal replacement, and takeout markets, distributed to Canada as well as the US, which is obviously a huge and diverse market.

We also have a focus on sustainability. Our products are reusable and recyclable. They are a premier product that comes with a premium price. They are also customizable and brandable, which has been a key to CuBE’s success. We partner with restaurants and sophisticated customers who see value in specific branding and in having a robust packaging solution.

Gardner: Len, you mentioned that you're in a competitive industry and that margin is therefore under pressure. For you to improve your bottom line, how do you account for more productivity? How are you turning what we have described in terms of an IoT and data capability into that economic improvement to your business outcome?

Chopin: I refer to this as having a plant within a plant. There is always a lot more you can squeeze out of an operation by knowing what it’s up to, not day-by-day but minute-by-minute. Our process runs quite quickly, so slippage in machine cycle times can occur rapidly. We must catch the small slippages, predict failures, and know when something is out of technical specification from an injection-molding standpoint; otherwise we could be producing a poor-quality product.

Getting a handle on what the machines are doing, minute-by-minute, gives us the ability to utilize the assets and the people, to optimize uptime, and to improve our quality, getting more of the best product to the market. So it really does drive value right to the bottom line.

Gardner: A big buzzword in the industry now is artificial intelligence (AI). We are seeing lots of companies dabble in that. But you need to put in place certain things before you can take advantage of those capabilities that not only react but have the intelligence to prescribe new processes for doing things even more efficiently.

Are you working in conjunction with your integrator and HPE to allow you to exploit AI when it becomes mature enough for organizations like your own?

AI adds uptime 

Chopin: We are already embarking on using sensors for things that were seemingly unrelated. For example, we are picking up data points off of peripheral equipment that feed into the molding process. This provides us a better handle on those inputs to the process, inputs to the actual equipment, rather than focusing on the throughput and of how many parts we get in a given day.

For us, the AI is about equipment uptime and about preventing any of it from going down. By utilizing the inputs to the machines, it can notify us in advance, when we need to be notified.

Rather than monitoring equipment performance manually with a clipboard and a pen, we can check on run conditions or temperatures of some equipment up on the roof that's critical to the operation. The AI will hopefully alert us to things that we don't know about or don't see because it could be at the far end of the operations. Yet there is a codependency on a lot of that pre-upstream equipment that feeds to the downstream equipment.

So for us to gain transparency into that end-to-end process and having intelligence built-in enough to say, “Hey, you have a problem -- not yet, but you're going to have a problem,” allows us to react before the problem occurs and causes a production upset.

Gardner: You can attain a total data picture across your entire product lifecycle, and your entire production facility. Having that allows you to really leverage AI.

Sounds like that means a lot of data over a long period of time. Is there anything about what's happening at the data center core, around storage, that makes it more attractive to do this sooner rather than later?

Chopin: As I mentioned previously, there are a lot of data points coming off the machines. The bulk of it is useless, other than from an historical standpoint. So by utilizing that data -- not pushing forward what we don't need, but just taking the relevant points -- we piggyback on the programmable logic controllers to just gather the data that we need. Then we further streamline that data to give us what we're looking for within that process.


It's like pushing out only the needle from the haystack, as opposed to pushing the whole haystack forward. That’s the analogy we use.
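
As a minimal, hypothetical sketch of that needle-versus-haystack filtering, the snippet below keeps only the PLC tags the analytics actually consume and forwards just those; the tag names and sample values are illustrative, not CuBE's actual data points.

```python
# Minimal, hypothetical sketch of the "needle, not the haystack" idea:
# keep only the PLC tags the analytics actually use and forward those,
# rather than the full raw data stream. Tag names are illustrative.
RELEVANT_TAGS = {"mold_temp_c", "cycle_time_s", "injection_pressure_bar"}

def filter_plc_sample(sample: dict) -> dict:
    """Reduce a raw PLC sample to the tags the analytics consume."""
    return {tag: value for tag, value in sample.items() if tag in RELEVANT_TAGS}

raw_sample = {
    "mold_temp_c": 41.7, "cycle_time_s": 8.9, "injection_pressure_bar": 812,
    "heartbeat": 1, "firmware_rev": "2.4.1", "panel_backlight": True,
}
print(filter_plc_sample(raw_sample))  # only the needle moves forward
```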

Gardner: So being more intelligent about how you gather intelligence?

Chopin: Absolutely! Yes.