Kelly J. Thomas

In-Memory Computing, SCM Software, and Moonwalking with Einstein

Updated: Dec 22, 2019


This is the first of a multi-part series on the importance of in-memory computing to supply chain management software. I’m starting with a provocative view on in-memory computing (IMC) vis-à-vis artificial intelligence, and in future articles will backtrack to discuss different approaches to using computer memory to build SCM software solutions. I also touched on this topic in an article titled “10 technologies that will reshape supply chain software,” published earlier this year in Supply Chain Quarterly.

One of Bill Gates’ favorite books of 2012 was “Moonwalking with Einstein,” a fascinating tale of how the human brain, while typically able to juggle only about 7-10 pieces of data in short-term memory, can be trained to remember extraordinary amounts of data. This training includes techniques that allow humans to recite pi out to 73,000 digits and to see and then recall the sequence of two randomly shuffled decks of cards in less than two minutes. These extraordinary memory feats are on display at competitions across the world. The competitions typically involve a timed period in which the competitor must absorb a set of information, followed by a timed period in which the competitor must recite the information absorbed, in the exact sequence in which it was displayed. Thus it is both the volume of the data and the speed with which it is stored and retrieved that matter to winning.

Similar concepts are pertinent to the application of in-memory computing to supply chain management problems. In-memory computing technologies are well-suited to solving supply chain problems for a number of reasons. But not all in-memory computing approaches are the same: there is in-process memory, caching, in-memory databases, in-memory data grids, and other approaches that form the starting point for the discussion. Above this level of discussion is the degree to which the memory can learn and exhibit intelligence à la “Moonwalking with Einstein,” and the degree to which such learning has been optimized for supply chain problems. We broach these subjects here and explore them in greater depth in future posts.

Why In-Memory for Supply Chain Problems?

Supply chains are complex, highly interconnected organisms. A wide variety of resources, operations, people skills, and materials are connected across physical space to deliver billions of different products to intermediate customers and end consumers. And, significantly, these resources are connected across a time horizon dictated by lead times and business planning. The time dimension makes supply chain management and all its supporting systems very complex. Software must connect resources, operations, materials, and people through time-phased bills-of-material and bills-of-distribution, such that a change to a resource in the short term ripples to other resources not just at a point in time, but potentially over the entire horizon under consideration. This leads to an explosion of permutations that must be processed.
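To make that ripple concrete, here is a minimal sketch, with entirely hypothetical part names, quantities, and lead times, of a single demand change propagating through a time-phased bill-of-material:

```python
# Minimal sketch: how one change ripples through a time-phased BOM.
# All part names, quantities, and lead times here are hypothetical.

# Each edge says: one unit of the parent consumes `qty` units of the
# child, with the child needed `lead_time` buckets earlier.
BOM = {
    "bike":  [("frame", 1, 2), ("wheel", 2, 1)],   # (child, qty, lead_time)
    "frame": [("steel", 3, 4)],
    "wheel": [("rim", 1, 2), ("spoke", 36, 2)],
}

def ripple(part, bucket, qty, requirements=None):
    """Explode a demand change for `part` in time `bucket` down the BOM,
    offsetting each level by its lead time."""
    if requirements is None:
        requirements = {}
    requirements[(part, bucket)] = requirements.get((part, bucket), 0) + qty
    for child, per_unit, lead_time in BOM.get(part, []):
        ripple(child, bucket - lead_time, qty * per_unit, requirements)
    return requirements

# A change of +10 bikes in bucket 20 touches every upstream part in
# earlier buckets -- the "ripple" described above.
for (part, bucket), qty in sorted(ripple("bike", 20, 10).items()):
    print(f"bucket {bucket:>2}: {part:<6} +{qty}")
```

Even in this toy network, one change in one bucket touches five other part/bucket combinations; at real-world scale the fan-out is enormous.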

Consider the following simple example, covering just inputs and outputs without details for transportation and warehousing. Say a company makes 10,000 unique products and delivers them to 1,000 unique customer locations through 200,000 orders per year. The products have inputs from 1,000 suppliers and 10 different inbound transportation companies, and there are 50 process steps from supplier delivery to customer delivery, along with 50 unique intermediate inventory holding positions for each of the 10,000 products and 10 different outbound transportation providers. The company must plan its operations over a 24-month horizon consisting of 18 month-sized buckets, 22 week-sized buckets, and 30 day-sized buckets, for a total of 70 time buckets. While the exact count would depend on the specific relationships between supply chain entities, such a supply chain could easily result in more than a trillion planning entities (naively multiplying the above numbers together yields roughly 35 sextillion, or 3.5 × 10^22).
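The arithmetic behind that parenthetical is easy to check. The sketch below simply multiplies the dimensions from the example under the crude assumption that they combine independently:

```python
# Back-of-the-envelope count of planning entities for the example above,
# under the (crude) assumption that every dimension multiplies independently.
products            = 10_000
customer_locations  = 1_000
orders_per_year     = 200_000
suppliers           = 1_000
inbound_carriers    = 10
process_steps       = 50
inventory_positions = 50
outbound_carriers   = 10
time_buckets        = 18 + 22 + 30   # months + weeks + days = 70

total = (products * customer_locations * orders_per_year * suppliers
         * inbound_carriers * process_steps * inventory_positions
         * outbound_carriers * time_buckets)
print(f"{total:.1e}")   # ~3.5e+22 combinations
```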

A simple diagram showing a microcosm of this supply chain is shown below.

In a supply chain planning application, a plan must be constructed by traversing a complex network that includes all of the elements described in the example above. Attempting to do this against a physical hard drive quickly results in bottlenecks. In a world in which corporations must respond rapidly to both demand and supply changes, disk-based applications became obsolete years ago. And, as discussed below, simply taking old approaches and putting them into an in-memory database is no longer sufficient either.

It’s little wonder that George Buckley, former CEO of 3M, once described the 3M supply chain as a “hairball.” The 3M supply chain, like many others, was more complex than that depicted in the diagram above. Manufacturing facilities would ship intermediate product out, sometimes across geographies, for further processing, and then ship it either back or somewhere else for finishing.

Now consider for a moment the following physical facts: raw access speeds for in-memory data are 10,000 to 100,000 times faster than access speeds for disk-based databases (operations that require processing may see lower multiples). For the supply chain above, creating a plan with disk-based approaches historically meant many hours of processing. With in-memory processing, that time can be brought down by orders of magnitude.
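A quick, admittedly unscientific way to feel this gap is to compare a disk-backed relational lookup with a plain in-process dictionary. The sketch below uses SQLite from Python’s standard library; absolute numbers will vary enormously with hardware, and OS page caching means the measured ratio understates raw disk latency:

```python
# Rough, unscientific sketch of memory vs. disk access cost. The OS will
# cache database pages, so the true raw-disk gap is far larger than the
# ratio printed here -- this only illustrates that a gap exists.
import os, sqlite3, tempfile, time

N = 100_000
rows = [(i, f"part-{i}") for i in range(N)]

# On-disk relational store.
path = os.path.join(tempfile.gettempdir(), "scm_demo.db")
db = sqlite3.connect(path)
db.execute("CREATE TABLE IF NOT EXISTS parts (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("DELETE FROM parts")
db.executemany("INSERT INTO parts VALUES (?, ?)", rows)
db.commit()

# Plain in-process memory.
in_memory = dict(rows)

t0 = time.perf_counter()
for i in range(0, N, 7):
    db.execute("SELECT name FROM parts WHERE id = ?", (i,)).fetchone()
disk_time = time.perf_counter() - t0

t0 = time.perf_counter()
for i in range(0, N, 7):
    in_memory[i]
mem_time = time.perf_counter() - t0

print(f"disk-backed: {disk_time:.4f}s, in-memory: {mem_time:.4f}s, "
      f"ratio ~{disk_time / mem_time:.0f}x")
```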

But just throwing a model into an in-memory database is only part of the answer, and a first-generation one at that (more on the evolution of in-memory techniques for supply chain problems in a later post). It’s the how that becomes really important and takes in-memory computing for supply chain problems to another level of performance. For example, a relational database resident in memory rather than on physical disk is much faster, but it is still constrained by the limitations inherent in a relational design. How do you take an in-memory capability and optimize it for supply chain planning and decision making? This brings us back to “Moonwalking with Einstein” concepts and the need to marry human-like intelligence approaches to in-memory design.
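To illustrate the distinction, here is a toy contrast, not any vendor’s actual design, between relational join-style lookups and a structure purpose-built for network traversal, where each node holds direct references to its children:

```python
# Sketch: why a purpose-built in-memory structure can beat an in-memory
# relational layout for network traversal. All names are hypothetical.

# Relational style: every hop is another key lookup against a table.
bom_table = [("bike", "frame"), ("bike", "wheel"), ("wheel", "rim")]

def children_relational(part):
    # Join-style access: re-scan (or re-index) the table per hop.
    return [child for parent, child in bom_table if parent == part]

# Network style: nodes hold direct references, so a hop is a single
# pointer dereference rather than a lookup.
class Node:
    def __init__(self, name):
        self.name, self.children = name, []

nodes = {name: Node(name) for name in ("bike", "frame", "wheel", "rim")}
for parent, child in bom_table:
    nodes[parent].children.append(nodes[child])

def explode(node, depth=0):
    print("  " * depth + node.name)
    for child in node.children:
        explode(child, depth + 1)

explode(nodes["bike"])
```

Both layouts can live entirely in memory; only the second shapes the memory around the traversal pattern the problem actually demands.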

One solution that mimics the human brain in its in-memory design is that provided by Kinaxis. In a case of scarcity driving elegant design, the Kinaxis in-memory compute engine was born of a time when memory was scarce and expensive. As is often the case in innovation (see the HBR article “The Silver Lining to Scarcity: It Drives Innovation”), scarcity can make necessity the mother of invention. A similar theme was communicated by Michael Eisner, former CEO of Disney, at the AMR supply chain conference a number of years ago. Eisner described “innovation in a box,” in which he would sometimes place a box around time and resources for creative talent; this constraint often led to unique solutions, resulting in famous movie scenes that would otherwise have cost hundreds of times more.

Although it is not called artificial intelligence, the Kinaxis in-memory design is one in which the more the system is used, the smarter it gets. Smartness here is measured by the speed and quality with which the system can answer questions. Not only are repeated questions answered faster (as one would expect), but new questions even remotely associated with previous questions are answered faster. The system creates a web of connections, much the way humans carry on conversations, segueing from one topic to the next. Furthermore, with the flip of a switch, an entirely new memory, or brain, can be created, providing the ability to go off and explore answers to new questions. These answers can then be mind-melded back into the original memory.
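Kinaxis does not publish its internals, so purely as an illustration, here is one plausible reading of the “new brain” idea: scenario branching with copy-on-write reads, where a branch shares all unchanged data with its parent, stores only its own deltas, and can merge its answers back:

```python
# Illustrative sketch only: scenario branching with copy-on-write reads,
# one plausible reading of the "new brain" idea above. This is NOT the
# actual Kinaxis design, whose internals are proprietary.
class Scenario:
    def __init__(self, parent=None):
        self.parent = parent
        self.overrides = {}          # only the deltas live here

    def get(self, key):
        # Walk up the chain: unchanged data is shared, never copied.
        if key in self.overrides:
            return self.overrides[key]
        if self.parent is not None:
            return self.parent.get(key)
        raise KeyError(key)

    def set(self, key, value):
        self.overrides[key] = value

    def merge_into(self, target):
        # "Mind-meld" the branch's answers back into the original memory.
        target.overrides.update(self.overrides)

base = Scenario()
base.set("demand/bike/week22", 500)

what_if = Scenario(parent=base)          # flip of a switch: a new "brain"
what_if.set("demand/bike/week22", 650)   # explore a question in isolation

print(base.get("demand/bike/week22"))     # 500 -- base plan untouched
print(what_if.get("demand/bike/week22"))  # 650 -- branch sees its override

what_if.merge_into(base)                  # meld the answer back
print(base.get("demand/bike/week22"))     # 650
```

Because a branch costs only its deltas, creating and discarding exploratory “brains” is cheap, which is exactly what fast what-if planning requires.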

While the world of AI is currently focused largely on using machine learning, neural networks, deep learning, and other techniques to train systems by processing large amounts of data, most enterprise software solutions have bypassed the important step of intelligent memory management. This basic intelligence does not need to start with large amounts of data; it can become increasingly smart with each incremental piece of data it is fed. This is very powerful, yet it is being overlooked by most solution providers as they focus on the technology instead of the problem that needs to be solved.

More to come on this topic in future posts.
