Big Data’s benefits touted for logistics sector

Alan M. Field | Jun 13, 2014 3:39PM EDT

Like other fashionable buzzwords before it, the catchphrase “Big Data” has grabbed many headlines in recent months, even as it has left many potential users bewildered about its scope and benefits.

Although many supply chain companies seem to believe Big Data is too big for anyone but the Fortune 500 manufacturers, retailers and logistics service providers, the word “big” is a purely relative term in most cases, according to a recent report by Nathaniel Rowe, Aberdeen Group’s research analyst for enterprise data management. “ ‘Big’ is in the eye of the beholder, and simply refers to data demands that outstrip an organization’s ability to handle them in a reasonable, cost-effective manner,” Rowe said.

No sector of business activity is more ripe for using Big Data-based analytical tools than logistics and the supply chain, said Greg Kefer, vice president of corporate marketing at GT Nexus. Based in Oakland, California, GT Nexus operates a cloud-based supply chain platform that annually manages data covering more than $100 billion in goods from more than 25,000 active trading organizations across all major industries. 

“If there’s a place for Big Data, it’s the supply chain, because (more and more) companies have global networks comprised of thousands of different partners,” Kefer said. “Within each one of those commercial relationships, there could be hundreds of transactions per day or per hour, depending on the nature of the relationship. And every one of those transactions creates data. Probably Big Data was always around; it was just in spreadsheets and e-mails.”

The key question managers face is how Big Data can be leveraged to provide insights that add value, Kefer noted. “There is the data, which is ones and zeroes, and there is information,” he said. “Companies have embraced connecting with their partners, because obviously everyone needs visibility into what is going on there, across their value chain. So they hook everything up, and turn the pipes on, and you suddenly have a reservoir, but you really don’t know what to make of it. Now you get into the dynamics of ‘Which data do I need?’ ‘Which data is important?’ ”

Complicating this challenge is the constant chain of custody in the supply chain, he added. One purchase order, for example, commonly turns into multiple shipments, each involving multiple partners that require their own sets of documents, assets, information and events. “What was once one thing has turned into 17 different things, and if you then click to see what’s inside the container at the pallet or carton level — which is what you ultimately want — it turns into hundreds of different things. And that’s just one order. What happens when you send out 27,000 orders a month?” Kefer asked.
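To make that fan-out concrete, here is a minimal sketch, with purely hypothetical record counts and field names, of how a single purchase order can multiply into shipment-, partner-, document- and carton-level records:

    # Hypothetical sketch of how one purchase order fans out into many
    # supply chain records. The counts below are invented for illustration,
    # not GT Nexus figures.
    def records_for_order(num_shipments=3, partners_per_shipment=4,
                          docs_per_partner=5, cartons_per_shipment=40):
        """Count the records a single purchase order can generate."""
        shipment_records = num_shipments
        partner_records = num_shipments * partners_per_shipment
        document_records = partner_records * docs_per_partner
        carton_records = num_shipments * cartons_per_shipment
        return (shipment_records + partner_records +
                document_records + carton_records)

    per_order = records_for_order()      # already hundreds of records
    per_month = per_order * 27000        # scaled to 27,000 orders a month
    print(f"{per_order} records per order, roughly {per_month:,} per month")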

In the retail sector, one of the biggest challenges to applying the insights of Big Data is that there isn’t just one owner of the data that flows through the supply chain, said Andrew Roszko, senior vice president of sales at Descartes Systems. “The shippers own the purchase orders, the suppliers own their advance shipping notices and the carriers own their status information, so bringing this data into one place so retailers can do analytics to become more nimble is a very hot topic,” he said.
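One way to picture the consolidation Roszko describes is a simple join of the three separately owned record types on a shared purchase order number. This is a hypothetical sketch; the field names are assumptions, not Descartes’ schema:

    # Hypothetical sketch: merging records owned by three different parties
    # (shipper, supplier, carrier) into one analytics view keyed on the PO
    # number. All field names and values are invented for illustration.
    purchase_orders = {"PO123": {"sku": "A-100", "qty": 500, "dc": "Toronto"}}
    asns            = {"PO123": {"ship_date": "2014-05-01", "container": "MSCU1234567"}}
    carrier_status  = {"PO123": {"last_event": "Discharged", "port": "Vancouver"}}

    def consolidated_view(po_number):
        """Combine the three separately owned records into one row."""
        row = {"po": po_number}
        for source in (purchase_orders, asns, carrier_status):
            row.update(source.get(po_number, {}))
        return row

    print(consolidated_view("PO123"))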

Some retailers, for example, are learning how they can leverage that data to ship directly from a supplier in Asia to a customer in North America to gain greater efficiency, instead of going through their traditional distribution center, Roszko said.

Insight derived from Big Data analysis also is being used to manage inventory more efficiently. “A traditional retailer has a ladder plan with suppliers — and projected lead times that determine whether to carry a certain amount of safety stock in their DCs to make sure their stores are full,” Roszko said. “Really understanding the details of what’s going on in their supply chain — where the bottlenecks are and whether suppliers and carriers are doing what they are supposed to be doing — provides a real opportunity” to reduce costs.

Descartes worked with a Canadian retailer that took five days out of its average lead time in all of its transportation lanes, which equated to an inventory reduction worth $27 million, he said.

Previously, that retailer would bring goods in from Asia, but it was “completely blind” to what was really going on in its own supply chain. About half of its purchase orders arriving from Asia came in roughly 40 days ahead of plan, so the retailer was buying very conservatively and incurring significant costs by building up safety stock, Roszko explained.
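The arithmetic behind that kind of saving can be sketched with the standard safety stock formula; the numbers below are hypothetical and are not the Canadian retailer’s figures:

    import math

    # Textbook safety stock sketch: the buffer held against demand and
    # lead-time variability. All numbers are hypothetical.
    def safety_stock(z, avg_demand, demand_std, lead_time, lead_time_std):
        """Safety stock = z * sqrt(LT * sigma_d^2 + d^2 * sigma_LT^2)."""
        return z * math.sqrt(lead_time * demand_std ** 2 +
                             avg_demand ** 2 * lead_time_std ** 2)

    z = 1.65  # roughly a 95 percent service level
    before = safety_stock(z, avg_demand=100, demand_std=30,
                          lead_time=45, lead_time_std=10)
    after = safety_stock(z, avg_demand=100, demand_std=30,
                         lead_time=40, lead_time_std=4)
    print(f"safety stock before: {before:.0f} units, after: {after:.0f} units")

Shorter, more predictable lead times shrink the term under the square root, which is why visibility into actual transit performance translates directly into less inventory.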

“The solution was to understand what was going on, to collect all the relevant information about their supply chain, to connect with all of their suppliers and their carriers — there was a huge amount of data in every trade lane — and then connect with government agencies to see when the goods were being released.”

A key technical challenge, Kefer noted, is that “technology systems speak different languages. Ones and zeroes may be common, but you might have different systems at corporate headquarters that are built on an SAP platform, and half of your logistics providers have proprietary home-grown systems, like the carriers — for managing their EDI — and then the suppliers have their own systems. There are a lot of languages floating around. What’s missing is insight and information.”
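A common pattern for bridging those “different languages,” sketched here with purely hypothetical codes and system names, is to map every feed onto one canonical vocabulary before any analytics run:

    # Hypothetical normalization layer: translate status codes from an ERP
    # export, a carrier EDI feed and a supplier's home-grown system into one
    # canonical event vocabulary. The codes and meanings are invented.
    CANONICAL_STATUS = {
        ("erp",      "GI"):      "goods_issued",
        ("carrier",  "ARR-PRT"): "arrived_at_port",
        ("supplier", "SENT"):    "shipped",
    }

    def normalize(source_system, raw_code):
        """Return a canonical status, or flag the code for manual mapping."""
        return CANONICAL_STATUS.get((source_system, raw_code),
                                    "unmapped:" + raw_code)

    print(normalize("carrier", "ARR-PRT"))   # -> arrived_at_port
    print(normalize("erp", "ZZ"))            # -> unmapped:ZZ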

Meanwhile, partners around the world may be using a range of foreign languages, such as Spanish, German or Chinese, and viewing information through the prism of their own cultures and time zones.

Despite such challenges, the benefits of integrating supply chain data streams from multiple logistics providers — that is, leveraging Big Data across the supply chain — “could eliminate current market fragmentation, enabling powerful new collaboration and services,” a recent report by DHL Custom Solutions and Innovation states. “Many providers realize that Big Data is a game-changing trend for the logistics industry.”

The DHL report noted that “the first and most obvious” benefit of Big Data analytics is to improve “operational efficiency” — that is, to “optimize resource consumption, and to improve process quality and performance.” Although these goals have always been the key benefits of automated data processing, Big Data analytics provide “an enhanced set of capabilities.”

A second high-level benefit for providers of logistics services is to improve “customer experience,” the report says. Typically, that means increasing customer loyalty, performing more precise customer segmentation and optimizing customer service. “Including the vast data resources of the public Internet, Big Data propels CRM (Customer Relationship Management) techniques to the next evolutionary stage,” while enabling new business models to create additional revenue streams “from entirely new data products.”

More specifically, in the logistics industry “Big Data analytics can provide competitive advantage because of five distinct properties,” the report says:

  1. Optimization of service properties, such as delivery time, resource utilization and geographical coverage. “The more precise the information is, the better the optimization results will become.”
  2. The delivery of tangible goods. “Big Data concepts provide versatile analytic means in order to generate valuable insight on consumer sentiment and product quality.”
  3. Seamless integration of logistics solutions into production and distribution processes. As a result, “logistics providers feel the heartbeat of individual businesses, vertical markets or regions.”
  4. Using the transportation and delivery network as a high-resolution source of data, to “provide valuable insight on the global flow of goods.” This “moves the level of observation to a micro-economic viewpoint.”
  5. Leveraging data from local and decentralized operations. Processing information from a fleet of vehicles moving across the country “creates a valuable zoom display for demographic, environmental and traffic statistics.”

To address these complexities, GT Nexus connects shippers to its own network and provides its own software to run the analytics that suit each customer’s specific needs. Many companies, however, still prefer to buy the technology and build their own networks, using technology provided by SAP, Oracle and others.

One problem with building your own network, Kefer argued, is that it takes years to do. Another issue is that some multinational companies “have 30 versions of SAP around the world, and they can’t even talk with each other,” he said. Connecting with the GT Nexus network automatically links users with thousands of other companies, including their own suppliers, carriers and third-party logistics companies.

Scott Sangster, a Descartes vice president, said that if a member of its Global Logistics Network wants to take advantage of Big Data analytics, it can import the information it needs from the GLN and merge it with information from other Internet services and from various kinds of business analytics software.

“Some companies are starting to do some analytics themselves, internally, and some are outsourcing” such analytics to outside specialists. “A lot of the sources of this data come from the GLN,” Sangster explained. “Big Data involves gathering not only your own information, but real-time data from multiple sources.” 

Companies need to prepare for the likelihood that their various internal departments will have different perspectives about how and when to use Big Data. IT managers often focus on using Big Data to “help move more information faster and help managers derive insights within the decision window,” said Peter Krensky, Aberdeen Group’s senior research associate for analytics and business intelligence.

But the operational managers who actually use the data are 40 percent more likely “to be focused on the ability to incorporate data sources like social media and customer feedback into their Big Data initiatives,” he said. This kind of “unstructured” information is typically text-heavy but not organized in a pre-defined manner that would make it easier to process with business intelligence software.

Nevertheless, this kind of data “can offer perspectives on an organization’s products that traditional data sources cannot,” Krensky said. “Combining structured and unstructured data offers users optimal visibility to understand the business and take intelligent action.”
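What “combining structured and unstructured data” might look like in practice can be sketched with a hypothetical join of delivery records and free-text customer comments; a real system would use proper text analytics rather than the crude keyword tag used here:

    # Hypothetical sketch: pair structured delivery records with unstructured
    # customer feedback via a simple keyword-based sentiment tag. The orders,
    # comments and keyword list are invented for illustration.
    shipments = [
        {"order": "PO123", "days_late": 0},
        {"order": "PO124", "days_late": 6},
    ]
    feedback = {
        "PO123": "Arrived early, great packaging.",
        "PO124": "Box was damaged and the shipment was late again.",
    }
    NEGATIVE_WORDS = {"damaged", "late", "missing", "broken"}

    def tag_sentiment(text):
        """Label free text negative if it contains any flagged keyword."""
        words = {w.strip(".,!").lower() for w in text.split()}
        return "negative" if words & NEGATIVE_WORDS else "positive"

    for s in shipments:
        s["sentiment"] = tag_sentiment(feedback.get(s["order"], ""))
    print(shipments)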

Overall, determining which kinds of Big Data qualify for analysis can be “very subjective,” Sangster said. “What sources are relevant to you? And how do you weigh those different sources?”

For example, a company whose supply chains pass through regions with a high probability of hurricanes or tornadoes — or other major weather events — will want to put greater weight on data about such “acts of God.” And a company that depends on feedback from its customers to build its brand and generate repeat sales may decide that social media can play a significant role in its mix of Big Data sources — even in the hard-nosed world of transportation and logistics.  
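In practice, that weighting could be as simple as a scored checklist; the sources and weights below are hypothetical:

    # Hypothetical weighting of candidate Big Data sources. A company with
    # hurricane-exposed lanes might weight weather feeds heavily; a
    # brand-driven retailer might weight social media. Numbers are invented.
    source_weights = {
        "carrier_status_feeds": 0.35,
        "supplier_asns":        0.25,
        "weather_alerts":       0.25,
        "social_media":         0.15,
    }

    def relevance_score(signals):
        """Weighted sum of per-source signal strengths, each scaled 0 to 1."""
        return sum(source_weights[s] * strength for s, strength in signals.items())

    print(relevance_score({"carrier_status_feeds": 0.8, "supplier_asns": 0.5,
                           "weather_alerts": 0.9, "social_media": 0.2}))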

Contact Alan M. Field at alanmf0@gmail.com.