Caretakers’ Big Data Challenge

Managing big data is similar to managing the data “of old,” with one big difference: It’s exponentially faster and broader in scope.

In logistics, structured and unstructured data now can be managed simultaneously for developing predictive models for asset, inventory and labor management, and much more, said Sundar Swaminathan, Oracle’s senior director of transportation industry strategy and marketing.

Unstructured data is data from weblogs, e-mail, social media, sensors, photographs, videos and other sources.

Oracle last year rolled out Oracle Advanced Analytics, which includes deep data mining capabilities and an enhanced relational database management system.

For companies managing big data, there are three key areas: large-scale advanced analytics, scalable in-memory database management, and data processing.

Advanced analytics tools enable complex processes such as data mining, statistical analysis and numerical computations, as well as traditional online analytical processing. They can integrate analyses of unstructured and traditional enterprise data, and are highly scalable.

According to Oracle, the tools require little installation and can be configured by users.

Analysis of new and historical data enables new solutions to old problems. For example, combining data from a smart vending machine with the events schedule of the business that hosts the machine can produce an optimal product mix.
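The vending machine example can be sketched in a few lines. This is a minimal illustration, not any vendor's actual method: the sales figures, the event-day demand multipliers and the `restock_mix` function are all invented for the example.

```python
# Hypothetical sketch: weight a vending machine's restocking mix by
# combining its sales history with the host venue's event schedule.
# All numbers below are invented for illustration.
sales_history = {            # units sold per product on an ordinary day
    "water": 40, "soda": 25, "energy_bar": 10,
}
event_uplift = {             # assumed demand multipliers on event days
    "water": 2.0, "soda": 1.2, "energy_bar": 3.0,
}

def restock_mix(slots, event_today):
    """Allocate machine slots in proportion to expected demand."""
    demand = {
        product: units * (event_uplift.get(product, 1.0) if event_today else 1.0)
        for product, units in sales_history.items()
    }
    total = sum(demand.values())
    return {p: round(slots * d / total) for p, d in demand.items()}

print(restock_mix(slots=20, event_today=True))
print(restock_mix(slots=20, event_today=False))
```

On an event day the model shifts slots from soda toward water and energy bars; on an ordinary day the historical mix dominates.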

Hadoop, a recent technology breakthrough, has emerged as the primary system for organizing and analyzing large amounts of complex, unstructured and machine-generated data for elements of value or meaningful patterns.

Hadoop allows data to be organized and processed in its original storage cluster, saving time and money by avoiding massive data transfers.

Organizations that use Hadoop can scale “linearly,” so processing time grows commensurately with data volume, making it possible to analyze large amounts of unstructured data rapidly. “Conventional analysis methods just do not work as the processing times are too high,” Swaminathan said.
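The map/reduce pattern that Hadoop popularized can be illustrated in miniature. In this sketch, assuming an invented log format, each partition of records is summarized independently (the “map” step) near where it is stored, and only the small partial summaries are merged (the “reduce” step); because partitions are independent, adding nodes divides the per-node work, which is the linear scaling described above.

```python
from collections import Counter
from functools import reduce

# Conceptual map/reduce sketch over invented web-server log lines.
logs = [
    "GET /track 200", "GET /track 500", "POST /ship 200",
    "GET /track 200", "POST /ship 500", "GET /rates 200",
]

def partition(records, n):
    """Split records into n roughly equal chunks, one per worker node."""
    return [records[i::n] for i in range(n)]

def map_count(chunk):
    """Per-node work: tally status codes within one chunk of the data."""
    return Counter(line.rsplit(" ", 1)[1] for line in chunk)

def reduce_counts(partials):
    """Merge the small per-node summaries into a cluster-wide total."""
    return reduce(lambda a, b: a + b, partials, Counter())

totals = reduce_counts(map_count(c) for c in partition(logs, 3))
print(totals)
```

The totals are identical whatever the partition count; only the per-partition workload changes, which is why the approach scales by adding nodes rather than by moving data.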

In-memory data management is regarded as the future of big data management. It relies on clustered servers and high-speed networks to deliver virtually unlimited scalability while matching the high performance traditionally associated with data held in a single server’s memory.
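The clustering idea can be sketched simply. In this toy model, not tied to any particular product, each “node” is just an in-memory dictionary, and a hash of the key decides which node holds it, so total capacity grows by adding nodes rather than by enlarging one server’s memory.

```python
# Toy sketch of a clustered in-memory key-value store: keys are spread
# across server "nodes" (plain dicts here) by hashing, so capacity and
# throughput scale with the node count. The class name is invented.
class ClusteredStore:
    def __init__(self, nodes=4):
        self.nodes = [{} for _ in range(nodes)]

    def _node(self, key):
        # Hash routing: the same key always lands on the same node.
        return self.nodes[hash(key) % len(self.nodes)]

    def put(self, key, value):
        self._node(key)[key] = value

    def get(self, key):
        return self._node(key).get(key)

store = ClusteredStore(nodes=4)
store.put("order:1001", {"status": "shipped"})
print(store.get("order:1001"))
```

A real clustered store would also replicate each key to survive node failures and rebalance keys when nodes join or leave; this sketch shows only the routing idea.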

Big data management is gradually making its way into the marketplace. In retailing, where margins are tight, service can be the differentiator. The challenge is to identify profit and cost-savings opportunities at every supply chain touch point, said David Landau, vice president of product management for global supply chain technology provider Manhattan Associates. Doing so requires data capture, analysis and collaborative platforms, but not necessarily the latest big data technologies.

“It’s not about pushing boundaries. It’s about using big data for practical business functions,” he said. “We think about big data as having the ability to acquire and synchronize data in a standardized format. Customers want cost data captured and brought down to a granular level.” 

Contact David Biederman at inexdb@comcast.net.