October 23rd, 2013 9:00 am Category: White Papers, by: Editor
Today, smart manufacturers view the supply chain as a critical element for gaining competitive advantage. Leading companies have long since globalized their manufacturing and distribution operations. They rely heavily on enterprise resource planning (ERP) platforms to track and record virtually every transaction that occurs in the supply chain – from raw materials sourcing to point-of-sale sell-through. Without doubt, the efficiencies that have accrued through ERP are significant. When one accounts for reduced inventory, carrying costs, labor costs, improvements to sales and customer service, and efficiencies in financial management, the tangible cost savings to enterprises have been estimated to range between 10 and 25% or more. Global and multinational concerns have reorganized themselves – through ERP standardization – to create a competitive advantage over regional manufacturers.
While this ERP standardization has created an advantage for larger concerns, leading supply chain managers are discovering new ways to improve beyond ERP’s limitations. In essence, these supply chain ‘disruptors’ are seeking new ways to separate themselves from the pack. The functional areas and tools used by these disruptors vary widely – from long-term global supply chain network design to near-term sales and operations planning (S&OP) and order fulfillment; and from relatively simple solver-based spreadsheets to powerful optimization software deeply integrated into the ERP data warehouse.
At Profit Point, we believe that continued pursuit of supply chain improvement is great. We believe that it is good for business, for consumers and for the efficient use (and reuse) of resources around the globe. In this survey, we set out to explore the methods, tools and processes that supply chain professionals utilize to improve upon their historical gains and to gain competitive advantage in the future. You can request a copy of the report here.
We welcome your feedback. Please feel free to contact us or leave a comment below.
July 30th, 2012 12:56 pm Category: Enterprise Resource Planning, Global Supply Chain, Network Design, Operations Research, Optimization, Profit Network, Profit Vehicle Planner, Profit Vehicle Router, Risk Management, Supply Chain Improvement, by: Jim Piermarini
There is nothing like a bit of vacation to help with perspective.
Recently, I read about the San Diego Big Boom fireworks fiasco — when an elaborate Fourth of July fireworks display was spectacularly ruined after all 7,000 fireworks went off at the same time. If you haven’t seen the video, here is a link.
And I was reading an article in the local newspaper on the recent Higgs news, “Getting from Cape Cod to Higgs boson”; you can read it here:
And I was thinking about how hard it is to know something, really know it. The data collected at CERN when they smash those particle streams together must look a lot like the first video. A ton of activity, all in a short time, and a bunch of noise in that Big Data. Imagine having to look at the fireworks video and then determine the list of all the individual types of fireworks that went up… I guess that is similar to what the folks at CERN have to do to find the single firecracker that is the Higgs boson.
Sometimes we are faced with seemingly overwhelming tasks of finding that needle in the haystack.
In our business, we help companies look among potentially many millions of choices to find the best way of operating their supply chains. Yeah, I know it is not the Higgs boson. But it could be a way to recover from a devastating earthquake and tsunami that disrupted operations literally overnight. It could be the way to restore profitability to an ailing business in a contracting economy. It could be a way to reduce the greenhouse footprint by eliminating unneeded transportation, or decrease water consumption in dry areas. It could be a way to expand in the best way to use assets and capital in the long term. It could be to reduce waste by stocking what the customers want.
These ways of running the business, of running the supply chain, that make a real difference, are made possible by the vast amounts of data being collected by ERP systems all over the world, every day. Big Data like the point-of-sale info on each unit that is sold from a retailer. Big Data like actual transportation costs to move a unit from LA to Boston, or from Shanghai to LA. Big Data like the price elasticity of a product, or the number of products that can be in a certain warehouse. These data and many, many other data points are being collected every day and can be utilized to improve the operation of the business in nearly real time. In our experience, much of the potential of this vast collection of data is going to waste. The vastness of the Big Data can itself appear to be overwhelming. Too many fireworks at once.
Having the data is only part of the solution. Businesses are adopting systems to organize that data and make it available to their business users in data warehouses and other data cubes. Business users are learning to devour that data with great visualization tools like Tableau and pivot tables. They are looking for the trends or anomalies that will allow them to learn something about their operations. And some businesses are adopting more specialized tools to leverage that data into an automated way of looking deeper into the data. Optimization tools like our Profit Network, Profit Planner, or Profit Scheduler can process vast quantities of data to find the best way of configuring or operating the supply chain.
So, while it is not the Higgs boson that we help people find, businesses do rely on us to make sense of a big bang of data and hopefully see some fireworks along the way.
I was sitting on the plane the other day and chatting with the guy in the next seat when I asked him why he happened to be traveling. He was returning home from an SAP ERP software implementation training course. When I followed up and asked him how it was going, I got the predictable eye roll and sigh before he said, “It was going OK.” There are two things that were sad here. First, the implementation was only “going OK” and second, that I had heard this same type of response from so many different people implementing big ERP that I was expecting his response before he made it.
So, why is it so predictable that the implementations of big ERP systems struggle? I propose that one of the main reasons is that the implementation doesn’t focus enough on the operational decision-making that drives the company’s performance.
A high-level project history that I’ve heard from too many clients looks something like this:
- Blueprinting with wide participation from across the enterprise
- Implementation delays
- Data integrity is found to be an issue – more resources are focused here
- Transaction flow is found to be more complex than originally thought – more resources are focused here
- Project management notices the burn rate from both internal and external resources assigned to the project
- De-scoping of the project from the original blueprinting
- Reports are delayed
- Operational functionality is delayed
- Testing of transactional flows
- Go-live involves operational people at all levels frustrated because they can’t do their jobs
Unfortunately, the de-scoping phase seems to hit some of the key decision-makers in the supply chain, like plant schedulers, supply and demand planners, warehouse managers, dispatchers, buyers, etc., particularly hard, and it manifests in the chaos after go-live. These are the people that make the daily bread-and-butter decisions that drive the company’s performance, but because of the de-scoping and the focus on transaction flow, they don’t have the information they need to make the decisions that they must make. (It’s ironic that the original sale of these big ERP systems is made at the executive level as a way to better monitor the enterprise’s performance and produce information that will enable better decision-making.)
What then, would be a better way to implement an ERP system? From my perspective, it’s all about decision-making. Thus, the entire implementation plan should be developed around the decisions that need to be made at each level in the enterprise. From blueprinting through the go-live testing plan, the question should be, “Does the user have the information in the form required and the tools (both from the new ERP system and external tools that will still work properly when the new ERP system goes live) to make the necessary decision in a timely manner?” Focusing on this question will drive user access, data accuracy, transaction flow, and all other elements of the configuration and implementation. Why? Because the ERP system is supposed to be an enabler and the only reasons to enter data into the system or to get data out is either to make a decision or as the result of a decision.
Perhaps with that sort of a focus there will be a time when I’ll hear an implementation team member rave about how much easier it will be for decision-makers throughout the enterprise once the new system goes live. I can only hope.
More than a decade has passed since businesses started using Enterprise Resource Planning (ERP) for managing data and transactions throughout the supply chain. Traditionally, ERP systems have provided transparency and insight into transaction-level data in the supply chain that support important business planning activities. Now, a new generation of applications is being developed to help fill the gaps between general business planning and business-specific, tactical and strategic decisions. These ERP-connected applications offer supply chain executives previously unavailable analysis and insights into the decisions that directly impact customer service, profitability and competitive advantage.
Supply Chain Differences
Supply chains are as different as the companies and people that run them. Some companies view their supply chain operations as a “utility” that is expected to function without any investment in intellectual capital. These organizations are content to rely on industry best practices in their supply chain operations and follow the leaders (or the features that are added by ERP software providers) in supply chain improvement. Other organizations see their supply chain operations as a strategic opportunity to develop a competitive advantage and increase market share. They know that with some small departures from the norm and a modest investment in intellectual capital, supply chains can provide enhanced performance to the business. These companies understand that there are opportunities for creative and unique ideas in the supply chain to improve company performance and achieve business strategy objectives.
Today, many C-level executives see their ERP systems as key enablers to company productivity, and for the most part, they are correct. Since ERP systems perform many valuable functions, there is a natural assumption that they can handle whatever business strategy the company adopts. However, new business ideas by definition run the risk of stressing the ERP system features beyond their ability to cope. Usually these failures are discovered only during the implementation of a new business strategy. So what happens when the ERP system fails to support the new business strategy in certain critical details? Those working in the trenches know this scenario all too well. But, what can be done to implement strategic supply chain initiatives when ERP is not equipped to handle business-specific initiatives?
Making the ERP Work
There are three possible approaches for implementing supply chain planning activities that offer a company a competitive advantage:
1. Figure out how to get the ERP system to do it. This approach works well if the company’s needs align well with current industry practices supported by ERP systems. Otherwise, companies may find themselves going down a path that consumes significant resources for a poor fit in the end. Companies that adhere to this path typically do so in part because there is a strong C-level edict in favor of simple, clean upgrades for the ERP system. Faced with this, the IT organization has enormous power to shape the nature of the supply chain operation to fit within the established ERP norms, and thus can act as a barrier for business innovation and supply chain improvement.
2. Modify the ERP system to provide new functionality. This is an approach often promoted by IT organizations committed to supporting the fewest number of tools. While this is an important cost management objective, it is important to understand the full cost to implement and support the system over the long term. What can be accomplished is often limited by the lack of flexibility in large ERP systems and IT organizations. Since ERP systems are mission-critical systems, the support and maintenance of the core functions are of paramount importance. This task, placed on a limited IT staff, leads to large backlogs of enhancement work and long queue times. And while IT departments are well-equipped to manage their primary assets, few if any IT departments have the requisite domain knowledge to cross over into supply chain optimization. Given long wait times, organizations will often choose the simplest approximation of the business change that can be ushered to the top of the queue. This approach can result in a quick-fix style of strategy implementation, rather than a priority-based feature development, and may leave the most important aspects of the initiative lingering in the queue.
3. Add an integrated solution to the ERP system that replaces one or more functions that are needed to achieve the business strategy. This could be from an out-of-the-box third-party provider, or for full competitive advantage, a targeted or custom supply chain application that integrates with the company’s ERP data. This approach has the benefit of including priority-based features that the current ERP system lacks, and the additional benefit of avoiding the ERP enhancement queue. The downside, however, is that it suffers from the stigma of being yet another application and not the ERP system itself. This usually presents a hurdle that requires a careful analysis to understand the total cost relative to the strategic benefit. While not all business changes will overcome this roadblock, there are good reasons to look at this approach. These include:
- Ensuring a tight fit between the business strategy and the tool execution
- Minimizing the cost, overhead, and extra setup and maintenance in un-needed functions from a shrink-wrapped general purpose tool
- Providing the marketplace with a specialized and unique operation of the supply chain for competitive advantage.
Example from the Field
A leading consumer electronics company with about $2bn in annual sales implemented an integrated solution to its ERP system to manage its order fulfillment process for competitive advantage. The company had recently modified its corporate strategy to increase retail sales through its “big box” customers (Walmart, Best Buy, Staples, etc.). However, key service level agreements were not being met for these customers due to lower than expected order fulfillment measures. A simple inventory analysis recommended large increases in the stock required at the warehouse, with some method of segregating inventory for each big-box customer so it could not be taken by orders from other customers.
In this case, one of the leading causes of low service for customers was that they ordered “just-in-time”. These JIT orders were not being given any priority over other customers’ orders with longer lead times. The company noted that these important customers may have provided accurate plan information, but that was not being used to assure them any better service. The analysis recommended that separate stocks of inventory be set up based on the big-box planning information, and that other customers not be allowed to take from those inventory locations. This would result in a large increase in overall stocks, but should achieve the desired increase in service levels.
One manager questioned this recommendation, wanting to know why the ERP system did not use the big-box planning information to appropriately manage the company’s service levels. She also questioned what could be done to avoid increasing her inventory risk and yet still achieve the business strategy. This is a question many managers face when their analysts say that to improve service you need to increase inventory levels. Often there are alternatives. This key manager’s insight set the path for her company to make a significant shift in their supply chain operations, with remarkable benefits. What follows will answer the question: Can I raise the service level of my key customers without increasing my inventory and capital risk? The short answer is, “yes”. Significant service benefits and risk reductions can be achieved, but only if you are willing to deviate from your ERP’s standard approach to implementing key supply chain initiatives.
The industry standard approach for assigning available inventory to open orders is to use a FIFO (first in, first out) approach. This approach prioritizes orders based on when the order was received and assigns on-hand inventory to those orders that were received and entered into the system first. While this approach has a degree of fairness to it, and is available in all ERP systems, it did not align well with the business objectives of this company. It actually penalized key customers who issued JIT purchase orders while giving ample planning information. These JIT orders would have to wait until all the older orders, from non-key customers, were allocated before they would be assigned any inventory.
The standard ERP process does not take into consideration the customer’s strategic importance or their planning information. Given this FIFO process, the internal recommendation makes sense: set up separate safety stocks for each big-box customer (based on their planning information), in separate inventory locations, and make a rule that directs big-box orders to their separate inventory.
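As a rough sketch, the FIFO assignment described above works like the following minimal Python illustration (the function name, field names, and quantities are hypothetical, not taken from any ERP's actual schema):

```python
from datetime import date

def fifo_allocate(orders, on_hand):
    """Assign on-hand inventory to open orders, oldest order first."""
    allocations = {}
    for order in sorted(orders, key=lambda o: o["received"]):
        qty = min(order["qty"], on_hand)  # fill as much as stock allows
        allocations[order["id"]] = qty
        on_hand -= qty
    return allocations

orders = [
    {"id": "A-100", "received": date(2013, 1, 5), "qty": 60},  # older order, non-key customer
    {"id": "B-200", "received": date(2013, 1, 8), "qty": 60},  # newer JIT order, key customer
]
print(fifo_allocate(orders, 100))  # the older order is filled first; the JIT order is shorted
```

With 100 units on hand, the older order from a non-key customer is filled completely, and the key customer's JIT order is shorted, which is exactly the misalignment the company was seeing.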
But having separate safety stocks violates the risk-pooling principle: customers served from a shared inventory pool require less total safety stock than each would require individually. Pooling the inventory helps to avoid unnecessary capital risk. The standard ERP FIFO inventory assignment process could be replaced with one that met customer needs more effectively.
The company embarked on a project to take into account several important factors when deciding how much inventory to assign to each order:
- The priority of the customer
- The amount of inventory actually in the sales channel of the customer, and
- The planning information that the customer shared with the company.
Customer priority is a key strategic factor in deciding which customers receive product when inventory availability is limited or delayed. This business need meant that strategic and high-volume customers should typically be serviced before others. However, this may not be the case if a strategic or high-volume customer happens to be sitting on a lot of inventory in their channel. In these cases, it may be preferable to share the wealth with smaller-volume resellers to maximize the sell-through to retail customers. Moreover, these rules may apply differently for each SKU in a manufacturer’s product line.
The business rules to implement these sorts of complex trade-offs can get complicated. If one wants to retain a certain amount of flexibility in these rules, then the ERP system is a poor place to make these decisions. However, since most, if not all, of the data resides in the ERP system, these decisions must be tightly integrated with the data and transaction handling within the ERP system. So an application was constructed to manage the inventory assignment process in this way to more closely match the business strategy. The new application is run several times a day, extracting needed info from the ERP system, making the assignment of inventory to all open orders, and sending back the info to the ERP system.
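A minimal sketch of how the three factors above might combine into an assignment rule (the scoring weights, field names, and all quantities here are illustrative assumptions, not the client's actual business rules):

```python
def priority_allocate(orders, on_hand):
    """Score each open order by customer priority and channel inventory
    coverage (channel stock relative to the customer's shared forecast),
    then allocate in score order rather than order-received order."""
    def score(o):
        # Higher-priority customers first; among equals, favor customers
        # holding less channel inventory relative to their forecast.
        channel_cover = o["channel_inventory"] / max(o["forecast"], 1)
        return (-o["priority"], channel_cover)

    allocations = {}
    for o in sorted(orders, key=score):
        qty = min(o["qty"], on_hand)
        allocations[o["id"]] = qty
        on_hand -= qty
    return allocations

orders = [
    {"id": "bigbox-1", "priority": 2, "channel_inventory": 500, "forecast": 400, "qty": 80},
    {"id": "bigbox-2", "priority": 2, "channel_inventory": 100, "forecast": 400, "qty": 80},
    {"id": "reseller", "priority": 1, "channel_inventory": 0, "forecast": 50, "qty": 80},
]
print(priority_allocate(orders, 150))
```

Here the big-box customer with a thin channel position is filled first, the well-stocked big-box customer gets the remainder, and the lower-priority reseller waits, regardless of when each order arrived.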
Using this integrated solution, overall service levels for these key customers were sharply increased, prompting several supply chain awards from these big-box customers. As a result of the increase in service level, Walmart (a strategic customer) was so pleased they chose to increase their orders of all this company’s products by 100 percent. The overall inventory did not increase.
The new method demonstrated that pooled inventory was an effective approach to containing inventory levels. In subsequent versions of this application, the integration of point of sale data has allowed even more control over the inventory in the various channels to market. As a result, this company has declared this application a business-critical application. It overcame the hurdle, and the application can defend its spot on the chart of critical business applications alongside the ERP system.
Integrated Solution Success
Using an integrated solution to the ERP system was a win-win approach that allowed the business the flexibility to manage order fulfillment for competitive advantage while maintaining the benefits of centralized data and the strong transactional handling capabilities delivered by ERP.
But order fulfillment is not the only area where there is opportunity to supplement the strengths of ERP with flexible and powerful business optimization processes and tools. Other areas where leading companies have decided to enhance their ERP capabilities include optimization-based infrastructure planning, sales and operations planning, distribution route and territory planning, transportation bid optimization, transportation fleet planning, and production scheduling.
These are just some examples of where complex and/or strategic business rules can provide competitive advantage through improved supply chain performance. While ERP systems remain the backbone of all successful large business operations today, they are not the only path available to companies who desire to apply innovative approaches to their business and supporting supply chain activities. Global enterprises that seek a competitive advantage now have the opportunity to leverage their ERP investments by integrating optimization-based solutions to key business strategies.
Here at Profit Point we regularly hear from clients with well established Enterprise Resource Planning (ERP) systems that they need something more. ERP systems are excellent for doing certain things including:
- Providing central repositories of data
- Enabling cross functional work processes within and across companies
- Costing of goods
- Planning resources and materials at a high level
However, the more complicated your business work processes and manufacturing production processes are, the less sufficient a standard ERP system will be at providing the best decision-support functionality. Some of the complications that require decision support systems (DSS), and which we have been helping clients deal with lately, include:
- Work processes to handle make to order versus make to stock material assignments
- Allocation of inventory to customer orders when in an oversold position
- Sequence dependent setups / cleanings of manufacturing equipment
- Scheduling of production sequenced through a “product wheel”
DSS are necessary because of the complexity of first finding a feasible solution and then having some means of sorting through the huge number of feasible options to find a “good” or “optimal” solution. DSS help in these kinds of situations to:
- Reduce costs
- Reduce manufacturing lead times
- Improve customer service
- Increase revenue
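To make the combinatorial difficulty concrete, consider just one of the complications listed above: sequencing production to minimize sequence-dependent changeover time is already a traveling-salesman-style search. A toy sketch, with an entirely made-up changeover matrix (exhaustive search is fine for three products but hopeless for thirty, which is why a DSS is needed):

```python
import itertools

# Illustrative changeover times (hours) between products; numbers are made up.
changeover = {
    ("light", "medium"): 1, ("medium", "light"): 3,
    ("light", "dark"): 2,   ("dark", "light"): 5,
    ("medium", "dark"): 1,  ("dark", "medium"): 4,
}

def total_changeover(sequence):
    """Sum changeover hours along a production sequence."""
    return sum(changeover[(a, b)] for a, b in zip(sequence, sequence[1:]))

products = ("light", "medium", "dark")
best = min(itertools.permutations(products), key=total_changeover)
print(best, total_changeover(best))  # light -> medium -> dark, 2 hours
```

The cheapest sequence here runs light-to-dark, which is the intuition behind a "product wheel": cleanings are cheap in one direction around the wheel and expensive in the other.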
ERP systems are a necessary part of being able to deliver a DSS, since they provide the data necessary for making the decisions in question, but they lack the following:
- Ability to be tailored to a specific work process or manufacturing environment
- Advanced analytical capability to sort through the complexity and volume of options to get to a “good” or “optimal” solution
- Graphical user interface tools to be able to allow a user to visualize the data in a way that gives them the insights needed to make decisions
At Profit Point we specialize in listening to our clients’ needs and then building DSS to unlock improvement opportunities that enable our clients to outdistance the competition.
Many of us have worked with companies that provide large ERP solutions. Some of these experiences have been successful and others somewhat less than ideal. If you work in manufacturing, supply chain, or logistics, then you realize the vast importance of having access to meaningful data, although implementing a large ERP system does not necessarily mean you can get to that data. It has been my experience that having the data available, being able to get to it, and using it to perform strategic or tactical analyses can all be challenging.
I recall a situation that happened to me many years ago. I was working for a large corporation that maintained a large database on its customers. All of this information was on a mainframe. To get access to the data so that we could perform analyses and generate reports required communication with the MIS department. We would schedule a meeting with one of the analysts to discuss what data we needed access to and what reports we required. The analyst would routinely tell us to fill out job request form number 777. Then this form needed to go through several levels of management approval. If the request made it through the approval process, it was put on the development schedule. Typically, from start to finish, the process would take several months. In today’s world that would not be acceptable.
Logitech is a world leader in personal peripherals, driving innovation in PC navigation, Internet communications, digital music, home-entertainment control, gaming and wireless devices. With a history of fast-growing distribution channels and a product line that is frequently being updated, Logitech’s key supply chain challenges are similar to those of many other consumer electronics heavyweights. Its product life cycles are relatively short and consumer demand can be fickle. But when Logitech gained global, mass market status with customers ranging from Walmart and Best Buy to direct online sales, its supply chain challenges were compounded.
With mounting distribution challenges, Logitech engaged Profit Point to bridge the gap between their ERP and their real world need to compete. Click the link below to access the case study:
Despite our egalitarian mindset in the U.S., when it comes to customers, let’s face it: They have never been ‘created equal.’ Certainly for decades, manufacturers and distributors have offered better pricing to some customers than others. We’re all familiar with quantity break pricing, column pricing with different discount levels for different categories of customers, and contract pricing. And who doesn’t visit the local supermarket today and notice the ‘buy 3 get 1 free’ offers to encourage us to increase our purchases?
Volume is valuable and warrants better pricing, we are in the habit of believing. And most often this is true. Not only does a high-volume customer drive our buying power with suppliers by helping us reach the next price break level on the purchasing side, but it can make each sale more profitable: The cost of servicing 10 orders that result in a sale of 100 units can be 10 times as great as the cost of servicing a single order for those 100 units.
This bias towards volume underlies traditional customer ranking methods. But many manufacturers today are taking a closer look at these policies and finding them lacking. Instead, they are engaging in a detailed cost analysis effort called ‘cost-to-serve.’ While cost-to-serve can be a very broad subject covering product costs, location costs, transportation costs and service costs, to name a few, this article will take a look primarily at customer costs.
It’s not that heretofore companies have ignored factors that shade the degree of profitability of a large client. Many firms, presented with the opportunity of doing business with, say, Wal-Mart or the federal government, may question whether it’s really worth doing. They’re thinking about the overhead of handling such a client and the cost of meeting client demands – with slim price margins.
What’s different today is that companies are trying to measure these costs precisely and to make informed, scientific decisions based upon them. Whether they engage consulting firms who have developed methods for tackling this measurement, purchase software to help them out, or devise their own internal approach, more and more manufacturers and wholesalers are gathering detailed costs and trying to apply them to decisions about their customers.
Consumer goods companies, for instance, are recording metrics such as the true cost of customer service. How much support time does this customer require of the customer service organization? How much sales time do we devote to them? Does the customer frequently return merchandise, and if so, what is the cost of processing those returns? In the case of consumer goods manufacturers, we might also look at custom-branded merchandise: What is the true cost of providing private labeling for a retailer? Are we really capturing in the product cost all of the special handling required by the purchasing and distribution organizations? All of these costs are very important in assessing a customer’s true profitability.
On the other side of the equation, there may be some sales and marketing benefits that a customer brings, and these, too, should be weighed. Does the name ‘Wal-Mart’ on our client list provide positive benefit to the organization? Is another client who doesn’t seem to purchase very much an outstanding reference for us who sends other potential customers to us? If a business can establish a process and gain agreement across the organization on measuring true costs and benefits, it can define policies to more precisely control bottom-line revenue.
Certainly, one of the first decisions that can be made, once true costs are measured and accepted by an organization, is to eliminate customers who are really unprofitable. But cost-to-serve can also come into play in other ways. We may want to devise strategic programs that nurture our best clients to safeguard their business. We may hold special events for them or assign dedicated reps, for instance.
One of the situations where cost-to-serve becomes a critical tool is in inventory allocation, particularly in an inventory shortage situation. When there is insufficient inventory to meet demand, most manufacturers will want to serve the most valuable customers first.
This frequently comes into play in segments of the technology industry, such as computer peripherals, typically with the launch of a popular new consumer product. An extreme example of this might be the launch of a new Wii game player at the start of the holiday season. Armed with true cost-to-serve data, manufacturers could make allocation decisions scientifically to spread the available inventory across the order pool while maximizing profit.
You might ask whether this process can be automated today. The answer is ‘partially.’ Allocation can certainly be automated, but collecting cost-to-serve data on customers usually involves some manual steps, because most companies don’t have all the systems in place to collect this data automatically (and even with sophisticated systems, the data may not be collected in exactly the way you wish). Some spreadsheet work may be required. Once the spreadsheet is in place, however, the process becomes straightforward.
Perhaps you want to rank customers sequentially from top to bottom, or group them into ‘profit’ segments. Once that is done, an algorithm can be designed to optimize the allocation of inventory according to the rules tied to those rankings or segments. The allocation algorithm might be designed to work directly from the spreadsheet, as well, automating even more of the process. In any case, executing the service decisions in accord with true costs ensures we are protecting our most valuable customers.
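The rank-and-allocate logic described above can be sketched as a simple greedy routine. This is only an illustration of the idea, not Profit Point's actual algorithm; the customer names, quantities and per-unit profit figures are entirely hypothetical.

```python
# Hypothetical sketch: allocate scarce inventory to customers ranked by
# profit (with cost-to-serve already netted out of the per-unit figure).

def allocate_by_profit(orders, available):
    """Greedily fill orders in descending order of per-unit profit."""
    allocation = {}
    # Sort the order pool so the most profitable customers are served first.
    for customer, qty, unit_profit in sorted(
            orders, key=lambda o: o[2], reverse=True):
        filled = min(qty, available)
        allocation[customer] = filled
        available -= filled
        if available == 0:
            break
    return allocation

orders = [
    ("RetailerA", 500, 4.00),   # (customer, units ordered, profit per unit)
    ("RetailerB", 300, 6.50),
    ("RetailerC", 400, 2.25),
]
print(allocate_by_profit(orders, 600))
# → {'RetailerB': 300, 'RetailerA': 300}
```

In practice the ranking could just as easily come from profit segments rather than a strict per-unit sort, with the allocation rules tied to each segment.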
The application of cost-to-serve to inventory allocation takes on an even more interesting aspect for consumer goods manufacturers who ship to retailers. As those of us familiar with this industry are aware, most large retailers have very specific guidelines defining how suppliers must do business with them. The retailers specify how an order must arrive – shipped complete, packed by store, etc.; when it must arrive – ‘arrive by’ date; and a variety of paperwork details including design, content and placement of shipping labels and bills of lading. Associated with each of these requirements is a dollar penalty the supplier will incur, taken as a deduction from the supplier’s invoice, for violation of the guideline.
For a consumer goods manufacturer, these penalties, or ‘chargebacks,’ can mean the difference between a profitable client and an unprofitable one. In this situation, the ability to allocate inventory defensively, to minimize chargebacks (or at least make an informed, scientific decision to incur them), is critical. A powerful allocation engine, in an inventory shortage situation, can maximize profit by factoring potential chargeback costs for late or partial shipment into the equation. In this case, the allocation engine ensures that the cost to serve the retailer is as low as possible.
In addition to retailer penalties, another aspect of ‘allocation-according-to-true-cost’ involves the choice of inventory fulfillment location. If a company operates a single distribution center in Los Angeles and imports all its product from Asia, there may be only a single fulfillment option. But for the vast majority of consumer goods manufacturers who import from Asia, serve clients nationwide, and operate either multiple distribution centers or a distribution center located in, for instance, the Midwest, there are several options and a variety of questions:
If inventory is constrained at the facility that would normally handle a particular customer’s order, should the order be fulfilled from an alternate facility? To make this decision, we need to factor in not only the additional shipping cost but also to weigh that cost against the value of the customer. There may be low profit customers, viewed from the perspective of cost-to-serve, for whom we do not want to make this investment. In the case of a retailer where a potential penalty is involved, the decision might be made dynamically based on a comparison of the chargeback incurred against the additional cost of shipping. If the chargeback fee would be higher than the additional shipping cost, it may be worthwhile to use the alternate distribution center.
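The chargeback-versus-freight comparison just described reduces to a simple rule. A minimal sketch, with purely hypothetical dollar figures, might look like:

```python
# Hypothetical sketch of the dynamic fulfillment decision: compare the
# retailer chargeback for a late or short shipment against the extra
# freight cost of shipping from an alternate distribution center.

def choose_fulfillment(chargeback_fee, extra_shipping_cost):
    """Return the cheaper option for an inventory-constrained order."""
    if extra_shipping_cost < chargeback_fee:
        return "ship from alternate DC"
    return "ship short and incur chargeback"

# A $750 chargeback vs. $420 of additional cross-country freight:
print(choose_fulfillment(chargeback_fee=750.0, extra_shipping_cost=420.0))
# → ship from alternate DC
```

A production rule would also weigh the customer's cost-to-serve ranking, as the article notes, rather than freight and penalty costs alone.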
This type of on-the-fly fulfillment decision is often called ‘dynamic allocation.’ Another example of dynamic allocation involves intercepting shipments in transit to, say, our hypothetical Midwest distribution center. Least-cost fulfillment might dictate filling west coast orders by pulling the inventory required for them at a deconsolidation facility near the port, before the shipment heads out to the distribution center in the Midwest. Under what conditions is this the least-cost choice? An inventory allocation algorithm based on cost-to-serve can make this decision mathematically, using rules the manufacturer defines.
It’s important to emphasize that the decisions on exactly how to apply cost-to-serve data to inventory allocation will depend on the philosophy of the individual company. For this reason, such allocation solutions are often unique and are adjuncts to the standard capabilities of order management systems. Leading-edge firms who are structuring allocation based on true costs typically do so via point solutions that supplement their central transactional systems.
Profit Point, as the name suggests, provides these point solutions and integrates them into SAP, Oracle, and other order management systems to help clients make the best, most profitable allocation and customer decisions. Our expertise in this area can help clients drive maximal profit to the bottom line.
This article was written by Cindy Engers, a Senior Account Manager at Profit Point.
What is a Monte Carlo model and what good is it? We’re not talking about the car produced by General Motors under the Chevy nameplate. “Monte Carlo” is the name of a type of mathematical computer model. A Monte Carlo model is simply a tool for figuring out how risky a particular situation is. It is a method to answer a question like: “What are the odds that such-and-such an event will happen?” Now, a good statistician can calculate an answer to this kind of question when the circumstances are simple or the system you’re dealing with doesn’t have a lot of forces that work together to give the final result. But when you’re faced with a complicated situation that has several processes that interact with each other, and where luck or chance determines the outcome of each, then calculating the odds for how the whole system behaves can be a very difficult task.
Let’s just get some jargon out of the way. To be a little more technical, any process which has a range of possible outcomes and where luck is what ultimately determines the actual result is called “stochastic”, “random” or “probabilistic”. Flipping a coin or rolling dice are simple examples. And a “stochastic system” would be two or more of these probabilistic events that interact.
Imagine that the system you’re interested in is a chemical or pharmaceutical plant where producing one batch of material requires a mixing and a drying step. Suppose there are 3 mixers and 5 dryers that function completely independently of one another; the department uses a ‘pool concept’ where any batch can use any available mixer and any available dryer. However, since there is not enough room in the area, if a batch completes mixing but no dryer is available, the material must sit in the mixer and wait. Thus the mixer can’t be used for any other production. Finally, there are 20 different materials produced in this department, and each of them can have a different average mixing and drying time.
Now assume that the graph of the process times for each of the 8 machines looks somewhat like what’s called a ‘bell-shaped curve.’ This graph, with its highest point (at the average) right in the middle and left and right sides that are mirror images of each other, is known as a Normal distribution. But because of the nature of the technology and the machines’ differing ages, the “bells” aren’t really centered; their average values are pulled to the left or right, so each bell is actually a little skewed to one side or the other. (Therefore, these process times are really not Normally distributed.)
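One common way to model such a skewed, non-Normal ‘bell’ is a lognormal distribution. The sketch below, with purely illustrative parameters, shows its characteristic right skew: the mean is pulled above the median, just as the article describes.

```python
import random
import statistics

random.seed(42)  # fixed seed so the illustration is reproducible

# A lognormal distribution is one common stand-in for a right-skewed
# "bell" of process times; mu and sigma here are purely illustrative.
samples = [random.lognormvariate(1.0, 0.35) for _ in range(10_000)]

mean = statistics.mean(samples)
median = statistics.median(samples)
# In a right-skewed distribution the mean sits above the median.
print(f"mean {mean:.2f} hrs > median {median:.2f} hrs: {mean > median}")
```

A real model would instead fit a distribution to observed process times for each machine, but the principle is the same.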
If you’re trying to analyze this department, the fact that the equipment is treated as a pooled resource means it’s not a straightforward calculation to determine the average length of time required to mix and dry one batch of a certain product. And complicating the effort would be the fact that the answer depends on how many other batches are then in the department and what products they are. If you’re trying to modify the configuration of the department, maybe make changes to the scheduling policies or procedures, or add/change the material handling equipment that moves supplies to and from this department, a Monte Carlo model would be the best approach to performing the analysis.
In a Monte Carlo simulation of this manufacturing operation, the model would have a clock and a ‘to-do’ list of the next events that would occur as batches are processed through the unit. The first events to go onto this list would be requests to start a batch, i.e. the paperwork that directs or initiates production. The order and timing for the appearance of these batches at the department’s front-door could either be random or might be a pre-defined production schedule that is an input to the model.
The model “knows” the rules of how material is processed from a command to produce through the various steps in manufacturing and it keeps track of the status (empty and available, busy mixing/drying, possibly blocked from emptying a finished batch, etc.) of all the equipment. And the program also follows the progress and location of each batch. The model has a simulated clock, which keeps moving ahead and as it does, batches move through the equipment according to the policies and logic that it’s been given. Each batch moves from the initial request stage to being mixed, dried and then out the back-door. At any given point in simulated time, if there is no equipment available for the next step, then the batch waits (and if it has just completed mixing it might prevent another batch from being started).
What sets a Monte Carlo model apart, however, is that when the program needs to make a decision or perform an action where the outcome is a matter of chance, it has the ability to essentially roll a pair of dice (or flip a coin, or draw straws) in order to determine the specific outcome. In fact, since rolling dice means that each number has an equal chance of coming up (which is rarely true of real-world outcomes), a Monte Carlo model actually contains equations known as “probability distributions,” which pick a result where certain outcomes have more or less likelihood of occurrence. It’s through the use of these distributions that we can accurately reflect those skewed, non-Normal process times of the equipment in the manufacturing department.
The really cool thing about these distributions is that if the Monte Carlo model uses the same distribution repeatedly, it may get a different result each time simply due to the random nature of the process. Suppose that a graph of the process times for material XYZ (one of the 20 products) in one of the mixers shows the middle of the ‘bell’ off-center to the right (i.e., skewed to the right).
So if the model makes several repeated calls to the probability distribution equation for this graph, sometimes the result will be 2.0-2.5 hrs, other times 3.5-4.0 hrs, and on some occasions >4 hrs. But in the long run, over many repetitions, the proportion of times in each of the time bands will approach the values from the graph (5%, 10%, 15%, 20%, etc.) that were used to define the equation.
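Sampling from a banded distribution like this can be sketched with a weighted random choice. The bands and percentages below are illustrative stand-ins for values read off such a graph, not actual data:

```python
import random
from collections import Counter

random.seed(7)  # fixed seed for a reproducible illustration

# Illustrative time bands and their probabilities (summing to 100%);
# real values would come from the observed process-time graph.
bands   = ["<2.0", "2.0-2.5", "2.5-3.0", "3.0-3.5", "3.5-4.0", ">4.0"]
weights = [5, 10, 20, 35, 20, 10]          # percent likelihood per band

draws = random.choices(bands, weights=weights, k=20_000)
observed = Counter(draws)

# Over many repetitions the observed proportions converge on the weights.
for band, w in zip(bands, weights):
    share = 100 * observed[band] / 20_000
    print(f"{band:>8}: expected {w:>2}%  observed {share:.1f}%")
```

This is exactly the “long run” behavior the article describes: any single draw is unpredictable, but the proportions settle toward the defining percentages.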
So, to come back to the manufacturing simulation: as the model moves batches through production, when it needs to determine how much time will be required for a particular mixer or dryer, it runs the appropriate probability equation and gets back a certain process time. In the computer’s memory, the batch will continue to occupy the machine (and the machine’s status will be busy) until the simulation clock reaches the time when the process duration has completed. Then the model will check the next step required for the batch and move it to the proper equipment (if one is available) or out of the department altogether.
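A minimal sketch of this kind of discrete-event loop, for the hypothetical 3-mixer/5-dryer pool, might look like the following. The process-time distributions and their parameters are invented for illustration; a real model would use fitted distributions per product and machine.

```python
import heapq
import random

random.seed(1)

MIXERS, DRYERS = 3, 5  # the pooled equipment described in the article

def simulate(n_batches):
    """Return the makespan (hrs) to push n_batches through mix + dry."""
    clock = 0.0
    events = []                        # min-heap of (finish_time, batch, stage)
    mixers_free, dryers_free = MIXERS, DRYERS
    waiting = list(range(n_batches))   # batches not yet started
    blocked = []                       # mixed batches waiting for a dryer

    def start_mixing():
        nonlocal mixers_free
        while waiting and mixers_free:
            batch = waiting.pop(0)
            mixers_free -= 1
            heapq.heappush(events, (clock + random.lognormvariate(1.0, 0.3),
                                    batch, "mixed"))

    def start_drying():
        nonlocal dryers_free, mixers_free
        while blocked and dryers_free:
            batch = blocked.pop(0)
            dryers_free -= 1
            mixers_free += 1  # the mixer is released only once a dryer opens
            heapq.heappush(events, (clock + random.lognormvariate(0.7, 0.3),
                                    batch, "dried"))

    start_mixing()
    while events:
        clock, batch, stage = heapq.heappop(events)  # advance simulated clock
        if stage == "mixed":
            blocked.append(batch)      # occupies its mixer until a dryer frees
        else:
            dryers_free += 1           # batch leaves the department
        start_drying()
        start_mixing()
    return clock

print(f"makespan for 20 batches: {simulate(20):.1f} hrs")
```

Note how a batch that finishes mixing with no dryer available stays in the `blocked` list and keeps its mixer occupied, mirroring the blocking behavior the article describes.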
In this way, the model would continue to process batches until it either ran out of batches in the input production schedule or the simulation clock reached some pre-set stopping point. During the course of one run, the computer would have been monitoring the process and recording in memory whatever statistics were relevant to the goal of the analysis. For example, the model might have kept track of the amount of time that certain equipment was blocked from emptying XYZ to the next step. Or, if the aim of the project was to calculate the average length of time to produce a batch, the model would have been following the overall duration of each batch from start to finish in the simulated department.
The results from just one run of the Monte Carlo model, however, are not sufficient as a basis for any decisions. The reason is that this is a stochastic system where chance determines the outcome. We can’t rely on just one set of results, because through the “luck of the draw” the process times picked by those probability distribution equations might have been generally on the high or low side. So the model is run for some pre-set number of repetitions, say 100 or 500, and the results of each run are saved.
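The repetition step can be sketched as follows, with a stand-in single-run function in place of the full department model (all parameters are illustrative):

```python
import random
import statistics

random.seed(2024)

def one_run():
    """Stand-in for a single Monte Carlo run: returns total process hrs."""
    # Illustrative: one mixing plus one drying time, both right-skewed.
    return random.lognormvariate(1.0, 0.3) + random.lognormvariate(0.7, 0.3)

N_RUNS = 500
results = [one_run() for _ in range(N_RUNS)]

avg = statistics.mean(results)
# Fraction of runs in which the total time reached 10 hrs or more.
tail = sum(1 for r in results if r >= 10) / N_RUNS
print(f"average {avg:.1f} hrs; >= 10 hrs in {100 * tail:.0f}% of runs")
```

Summaries like the average and the tail fraction computed here are exactly the kinds of conclusions described next.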
Once all of the Monte Carlo runs have been accumulated, it’s possible to draw certain conclusions. For example, it might turn out that the overall process time through the department was 10 hrs or more in 8% of the runs. Or that the average length of blocked time, when batches are prevented from moving to the next stage because no equipment was available, was 12 hrs; or that the amount of blocked time was 15 hrs or more in 15% of the runs.
With information like this, a decision maker would be able to weigh the advantages of adding or changing specific items of equipment, as well as modifications to the department’s policies, procedures, or even computer systems. In a larger, more complicated system, a Monte Carlo model such as the one outlined here could help to decrease the overall plant throughput time significantly. At some pharmaceutical plants, for instance, where raw materials can be extremely high-value, decreasing the overall throughput time by 30% to 40% would represent a large and very real savings in the value of the work-in-process inventory.
Hopefully, this discussion has helped to clarify just what a Monte Carlo model is and how it is built. This kind of model accounts for the fundamental variability that is present in almost all decision making. It does not eliminate risk or prevent a worst-case scenario from actually occurring, nor does it guarantee a best-case outcome. But it does give the business manager added insight into what can go wrong or right and the best ways to handle the inherent variability of a process.
Profit Point’s data integration and scheduling optimization services deliver reliable results with reduced operations costs.
North Brookfield, MA
Profit Point today announced that its Profit Data Interface™ software has been selected by Rohm and Haas Company (NYSE: ROH) to integrate its scheduling processes with the company’s ERP data warehouse. The company, which last reported nearly $9 billion in annual sales, produces innovative products for nine industries worldwide through a network of more than 100 manufacturing, technical research and customer service sites. Optimizing and supporting the production and distribution scheduling across this network is a complex and ever-changing process.
“Rohm and Haas has a history of improving our operations to enhance customer service levels and reduce cost,” said Dave Shaw, the company’s Business Process Manager for MFG and Supply Chain. “Production scheduling, which entails constant change to meet demand, is one of the toughest challenges in the supply chain. In the past, the lack of a reliable data interface has limited our ability to react quickly and with a high degree of confidence in our results. Profit Point’s Data Interface software has given us near real-time access to highly reliable data, so we can respond quickly and know that our plan is right.”
Profit Data Interface is a robust application that helps decision makers boost the effectiveness of their ERP data by extending its usefulness with optimization applications. By leveraging existing ERP systems, the software provides a robust and proven method that supply chain managers can rely upon to optimize their critical business processes and improve profitability.
“Rohm and Haas is a recognized leader in the chemicals industry with a reputation for supply chain excellence,” said Jim Piermarini, Profit Point’s CEO. “We have supported their scheduling processes for years. So, it was clear that the next evolution was to directly connect their optimization software to the data store using our Data Interface product.”
Profit Data Interface, which integrates with SAP® and Oracle® data stores, can be used to optimize the entire supply chain including network planning, production and inventory planning, distribution scheduling, sales planning and vehicle routing.
To learn more about Profit Point’s supply chain software and services, visit www.profitpt.com.
About Profit Point:
Profit Point Inc. was founded in 1995 and is now a global leader in supply chain optimization. The company’s team of supply chain consultants includes industry leaders in the fields of infrastructure planning, green operations, supply chain planning, distribution, scheduling, transportation, warehouse improvement and business optimization. Profit Point’s combined software and service solutions have been successfully applied across a breadth of industries and by a diverse set of companies, including General Electric, Dole Foods, Logitech and Toyota.
About Rohm and Haas Company:
Leading the way since 1909, Rohm and Haas is a global pioneer in the creation and development of innovative technologies and solutions for the specialty materials industry. The company’s technologies are found in a wide range of industries including: Building and Construction, Electronics and Electronic Devices, Household Goods and Personal Care, Packaging and Paper, Transportation, Pharmaceutical and Medical, Water, Food and Food Related, and Industrial Process. Innovative Rohm and Haas technologies and solutions help to improve life every day, around the world. Visit www.rohmhaas.com for more information.
Leading supply chain consulting firm’s entire line of optimization software is now capable of quickly and easily leveraging SAP’s robust ERP data warehouse.
North Brookfield, MA (PRWEB) August 25, 2008 — Profit Point, a leading supply chain optimization consulting firm, today announced the introduction of Profit Connect, an interface that bridges its line of optimization software applications with SAP’s enterprise resource planning (ERP) applications. With more than 46,000 customers worldwide, SAP is the ERP software of choice for thousands of medium and large businesses. By combining SAP’s central data store with Profit Point’s supply chain optimization software, business managers are now able to gain increased visibility to improve the quality of their critical business decisions.
“Historically, data availability and integrity have been the biggest challenges facing business managers who seek to improve their business operations,” stated Alan Kosansky, Profit Point’s President. “Compatibility with SAP’s real-time data enables our clients to use our industry-leading business optimization tools with easy access to the full universe of SAP data.”
Profit Point’s entire line of supply chain optimization software, which includes tools to improve network design, production and distribution planning, scheduling and vehicle routing, is designed to help manufacturing and distribution managers improve the decisions they make using advanced optimization algorithms and proven supply chain methodologies. By leveraging existing ERP systems, Profit Point’s software provides a robust and proven method that supply chain managers can rely upon to optimize their critical business processes and improve profitability.
“In recent years, we have seen our clients increase their use and reliance on SAP for data management,” said Jim Piermarini, Profit Point’s Chief Technology Officer. “We saw an opportunity to access this data store, so that our clients could easily and accurately aggregate data for their optimization projects and increase the frequency of these business improvement efforts.”
Profit Connect solves the data integration challenges by providing an easy, direct bridge to SAP’s data store. Using Profit Point’s SAP-compatible software, business managers can now avoid data duplication and distortion, improve efficiencies and customer service, cut operational costs and improve decision making through accurate analysis and proven optimization techniques.