After more than 140 years of growth, Campbell Soup Company decided to implement an enterprise resource planning (ERP) system to replace a variety of legacy programs in use across the company’s operating regions and business units. As the new system was put into use, it didn’t take long for the finance division to realize that a new opportunity had emerged – that of leveraging the vast quantity of data now being collected.

This insight was pivotal: Access to the data collected by the ERP system allowed Campbell’s to transform its operations. Instead of planning at the aggregate level for its Pepperidge Farm business, for example, it could gain visibility into data down to the SKU level, company executives told SAPInsider. In stark contrast to the past, when the company’s capabilities were limited to analyzing a country’s overall performance, Campbell’s could now use the data collected by the ERP tool to understand which product lines were more profitable, determine which customers were driving profits and develop strategies based on that data.

As part of this process, Campbell’s realized something that is just becoming clear to organizations today: At its most basic level, an ERP system is a database. Granted, it’s a big, complex database, but a database nonetheless. All the front ends and fancy reporting tools layered on top of ERP software simply serve to make the data readable for input and analysis by people.

This is where many supply chain analysts and managers go astray – they equate the ERP system with the interface used to interact with it. The ERP system isn’t the order entry screen, nor is it the set of reports generated. The ERP system is the underlying set of fields and the data within those fields, and an incredible amount of interconnected information is being collected but left unused. According to the Harvard Business Review, companies that installed digital platforms like ERPs over the past 10 to 15 years haven’t yet cashed in on the information those platforms make available. InformationWeek estimates that companies fail to analyze 88 percent of the data they currently have. That data, used correctly, holds stories that could determine the future direction of your organization.
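
To make that concrete, here is a minimal sketch of what reading those fields directly can look like. The order-line extract below – its column names (sku, product_line, customer, revenue, cost) and its figures – is a hypothetical stand-in for a typical ERP table, not any particular vendor’s schema. The point is simply that once the underlying fields are treated as data rather than as a canned report, questions like “which product lines are more profitable?” and “which customers drive profit?” become one-line aggregations.

    # Minimal sketch: profitability by product line and by customer from an
    # ERP-style order-line extract. Table, columns and figures are hypothetical.
    import pandas as pd

    # Stand-in for a query against the ERP's sales order-line table.
    lines = pd.DataFrame({
        "sku":          ["A100", "A110", "B200", "B210", "C300"],
        "product_line": ["Line A", "Line A", "Line B", "Line B", "Line C"],
        "customer":     ["Retailer 1", "Retailer 2", "Retailer 1", "Retailer 3", "Retailer 2"],
        "revenue":      [12000.0, 8000.0, 15000.0, 9000.0, 4000.0],
        "cost":         [9000.0, 6500.0, 10000.0, 7500.0, 3600.0],
    })

    lines["margin"] = lines["revenue"] - lines["cost"]

    # Which product lines are more profitable?
    print(lines.groupby("product_line")["margin"].sum().sort_values(ascending=False))

    # Which customers are driving profits?
    print(lines.groupby("customer")["margin"].sum().sort_values(ascending=False))

The same pattern scales from a five-row example to the millions of order lines a production ERP accumulates; only the extract gets bigger.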

Change Your Mindset: Big Data Is an Approach, Not a Thing

Making fuller use of the data collected by your ERP requires a change in mindset. The key to maximizing the opportunity afforded by your ERP system lies in treating Big Data as an approach, not a thing.

Whether an organization is analyzing 10,000 lines of data or 10 million, the benefits of Big Data approaches and tools apply equally. Yet even at large companies with robust ERP systems, it’s surprisingly easy to find supply chain managers and analysts who don’t believe they have the right types of data sets to benefit from the use of Big Data tools — also known as business intelligence (BI) applications. It’s common to hear Big Data dismissed as an empty trend with a definition so nebulous that it essentially has no meaning.

Treating Big Data as an approach – one that affords or even demands nonstop analysis and experimentation – allows even the largest of companies to act small. In fact, the bigger the company, the more it needs to treat Big Data as a methodology that helps it act small. When smaller competitors get a wild idea, they turn on a dime and try it out. Their experiments run on a live firing range: small as it may be, they’re assembling a very real supply chain with items, suppliers and customers.

At larger organizations, budgets, approvals, and compliance issues mean that instant pivots and trials are rarely an option. Here’s where the large organization has a major advantage over its more nimble competitors, though. The larger organization has data, so much data, in fact, that it can set up virtual supply chains and use that data to predict outcomes. The experiment isn’t free, of course — there’s a time cost, as well as the systems and personnel expenses of IT resources and business analysts skilled in handling the Big Data from your ERP system — but the total outlay pales in comparison to the cost of running a real-world trial. In addition, virtual experimentation preserves the pre-launch secrecy so critical for many product launches.

Virtual Experimentation in Action

Take, for example, the real-world case of a company with a global supply chain. The location of the company’s supply base was driven by where its customers were: About 60 percent of its supply was in China, serving its customers in that country, and about 25 percent was in Vietnam, serving its customers located there. Another 15 percent was spread among smaller locales. Purchasing patterns, however, were changing, with sales in Vietnam trending up at the expense of those in China.

A shift in the supply base from 60 percent (China) – 25 percent (Vietnam) – 15 percent (other) to something more along the lines of 45 percent – 40 percent – 15 percent would surely have a financial impact, and the organization needed clarity on the magnitude of that change. To determine that figure, the company used Tableau — one of Big Data’s software darlings — to dive into the data collected by its ERP system, discovering correlations between factors like manufacturing site, distance to customer location, freight terms, commodity pricing, unit-sell pricing and unit-buy pricing. This experimentation phase constituted the heavy lifting of the project. A willingness to question, try, re-question, validate and either accept the results or abandon the thread is the cornerstone of the process. Some questions led nowhere; others led to breakthroughs. Once the correlations and trends in the ERP data were uncovered, using BI tools to act on them was relatively straightforward.
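
A rough sketch of how that kind of what-if can be framed is shown below. The landed-cost figures, the assumed annual volume and the simple weighted-average model are illustrative placeholders rather than the company’s actual data or model, and plain Python stands in for the Tableau workbooks described above. The structure, though, mirrors the question being asked: what does the blended cost per unit look like under the current 60-25-15 mix versus the proposed 45-40-15 mix?

    # Illustrative what-if: cost impact of rebalancing the supply base from
    # 60/25/15 (China/Vietnam/other) toward 45/40/15. All cost and volume
    # figures below are invented placeholders, not the company's data.

    current  = {"China": 0.60, "Vietnam": 0.25, "Other": 0.15}
    proposed = {"China": 0.45, "Vietnam": 0.40, "Other": 0.15}

    # Hypothetical landed cost per unit by source region, i.e. the kind of figure
    # the ERP correlations (buy price, freight terms, distance to customer) feed.
    landed_cost = {"China": 4.10, "Vietnam": 4.45, "Other": 4.80}

    annual_units = 10_000_000  # assumed volume, for illustration only

    def blended_cost(mix):
        """Weighted-average landed cost per unit for a given sourcing mix."""
        return sum(share * landed_cost[region] for region, share in mix.items())

    delta = blended_cost(proposed) - blended_cost(current)
    print(f"Blended cost, current mix:  ${blended_cost(current):.4f}/unit")
    print(f"Blended cost, proposed mix: ${blended_cost(proposed):.4f}/unit")
    print(f"Estimated impact: ${delta:.4f}/unit, ${delta * annual_units:,.0f}/year")

In practice, the landed-cost inputs are exactly where the correlations pulled from the ERP data – buy price, freight terms, distance to customer – do their work, and the scenario can be re-run whenever those inputs shift.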

The models then drove decision making in the supply base’s load-balancing process and even rippled into the company’s pricing strategy. The latter was unexpected but crucial, because the company needed to proactively move customers to new price points due to the upcoming cost changes associated with a higher percentage of the supply coming from Vietnam.

Where We’re Going

The mid-20th century was the era of the large corporation. The incredible technological progress of the late 20th century and early 21st gave rise to a powerful army of fast-moving small companies. But thanks to the impact of Big Data, we’re on the front edge of a boom of fleet-footed large companies. Treating Big Data as an approach, not a thing, will position your organization to experiment and change as quickly as your smaller competitors.



ABOUT THE AUTHOR

Clay Cavanaugh is a Propeller alumnus whose diverse background includes several years in the packaging industry, with roles in international supply chain management, strategic planning and business process analysis. His other experience includes founding a publishing company, where he oversaw marketing, project delivery and planning initiatives (not to mention writing two books). Clay has an MBA from Duke University and a bachelor’s degree in computer engineering from the University of Idaho.