Add Total Delivered Cost Variances to Manage Your Supply Chain
It is often said that you can only improve what you measure. To that end, there has been a lot of progress in performance tracking and activity-based costing over the past 10 years. With the advent of better activity-based costing, leading companies generate monthly manufacturing variance reports at a detailed and actionable level. However, this does not appear to be the case in the supply chains of many of those same companies. At the end of this post, I’ll recommend some specific supply chain metrics to guide your supply chain improvement.
We routinely find that many companies have a very limited understanding of their supply chain costs: what they are, where they come from and why they occur. In a typical engagement with a new client, one of the first things we do is develop a picture of the current state of their supply chain with respect to flows, cost and service. We work with the client to gather all of the available information, which is all too often a formidable task, until we can assign the cost of each operation that touches a product or intermediate, from the time it is a raw material until it is delivered as a final product to the customer.
When the project team first presents the results to management, we invariably hear, “We don’t do that,” or “Those costs must be wrong.” Unfortunately, we sometimes hear, “There is no way we’re losing that much money at that customer.”
Clearly, there are times when the team learns something new and we have to adjust the costs. However, in the majority of cases we walk through the elements of the costs with management and the realization sets in that the numbers are correct and the costs really are that high. Now that we have all seen the true picture of the supply chain, we can align on the effort required to improve it.
Supply chain managers, like their manufacturing counterparts, should demand ongoing metrics at the operational level that are actionable if they want to drive improvement in their supply chains. Reports that provide only the total freight spend, total warehouse spend or total person-hours worked in the supply chain vs. the plan don’t contain enough actionable information to drive performance.
I propose the following metrics as a starting point for managing the total delivered cost to the customer base and welcome your feedback on any metrics that I might have missed or that might replace one I’ve suggested.
Total Delivered Cost Supply Chain Metrics, a Start:
- Actual vs. target for shipping containers
- Actual loaded vs. the maximum allowable capacity for the commodity and shipping container combination
- Actual vs. planned cost-to-serve variance reports at the customer/product level of detail, with specific variances called out for:
- Cost of Goods Sold (COGS)
- Mode exception (shipped by a premium mode of transport vs. the planned mode)
- Sourcing exception (shipped from a different location than the planned source)
- Fill exception (the difference in cost if the shipping container were filled to the maximum allowable capacity)
- Volume variance (total volume shipped vs. the planned volume to allocate fixed costs)
- Mix variance (change in the mix of products shipped vs. the plan and its impact on cost)
- Price variance (change in the price charged by carriers and other logistics service providers vs. the planned price)
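As a sketch of how a few of these variances might be computed for a single shipment (all field names and cost figures below are hypothetical, not drawn from any particular system):

```python
# Illustrative cost-to-serve variance calculation at the shipment level.
# Field names and figures are invented for the example.

from dataclasses import dataclass

@dataclass
class Shipment:
    customer: str
    product: str
    planned_mode_cost: float   # cost had the planned transport mode been used
    actual_mode_cost: float    # cost of the mode actually used
    planned_source_cost: float # cost had the planned source shipped it
    actual_source_cost: float  # cost from the location that actually shipped it
    units_shipped: float
    max_units: float           # max allowable capacity for this container/commodity

def variances(s: Shipment) -> dict:
    """Break the delivered cost variance into named, actionable buckets."""
    mode_exception = s.actual_mode_cost - s.planned_mode_cost
    sourcing_exception = s.actual_source_cost - s.planned_source_cost
    # Fill exception: extra cost incurred by shipping a partly filled container
    actual_cost_per_unit = s.actual_mode_cost / s.units_shipped
    full_cost_per_unit = s.actual_mode_cost / s.max_units
    fill_exception = (actual_cost_per_unit - full_cost_per_unit) * s.units_shipped
    return {
        "mode_exception": mode_exception,
        "sourcing_exception": sourcing_exception,
        "fill_exception": fill_exception,
    }

s = Shipment("Acme", "Widget-A",
             planned_mode_cost=1000, actual_mode_cost=1500,
             planned_source_cost=400, actual_source_cost=400,
             units_shipped=800, max_units=1000)
v = variances(s)
print(v)  # premium mode added 500; 80% container fill added 300
```

Rolled up by customer and product each month, this kind of breakdown is what turns a single "freight was over budget" number into something a manager can act on.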
With this set of metrics a supply chain manager should be able to quickly understand the reason for any changes in the total delivered cost to each customer, and thus the gross margin. Now that we can measure it, we can manage it.
The Future of Supply Chain Network Design
Most leading companies perform supply chain network design (SCND) analysis to define the structure of their supply chain as well as the key operations that will be performed at each location in the resulting network. The analysis includes suppliers, manufacturing, warehousing, and distribution. In fact, a number of Fortune 100 companies require such analysis before approving capital to add manufacturing or warehousing infrastructure. Those on the cutting edge are also using SCND analysis on a continual basis to understand the true delivered cost to supply each customer as well as the price required from the customer to achieve profitability goals. Advances in network design modelling and optimization have also opened the door to a broader analysis that focuses on the collective supply chains of all competitors in a marketplace and how best to supply the market at the lowest cost and maximum profit.
Twenty-five years ago, the optimization tools to analyze and strategically set one’s supply chain infrastructure were new and untested. Those companies on the cutting edge had the vision to employ this technology to gain competitive advantage. They analyzed the trade-offs among raw material, production, transportation, distribution and inventory costs to understand the most cost-effective way to meet their customer demand at higher customer service levels. In today’s world, what was once the domain of a few has become a “best practice” in supply chain management. Most supply chain managers are on the bandwagon and perform some sort of optimization-based supply chain analysis when considering major capital investment, key strategic initiatives or when their network has grown too large and unwieldy through acquisition and growth. That does not mean that the world has caught up to the thought leaders; rather, the thought leaders continue to push the envelope and are using SCND to do more for their organizations than they did in the past.
In particular, there are two areas where the best supply chain managers are focusing their attention with regard to SCND: First, they are making SCND an evergreen business process that is fully integrated into all strategic and tactical decisions related to network infrastructure and product sourcing. Second, they are expanding the scope of their supply chain analysis to not only include their own supply chain network, but also to analyze the supply chain dynamics of their major competitors and how the entire market is being served.
Sustained Supply Chain Network Design
In too many cases, SCND analysis is a one-and-done process. The data required to perform the analysis is often difficult to assemble, so companies treat the effort as a single project, and that prevents them from assessing ongoing risks and opportunities. Through a carefully designed data management process integrated with the right set of tools, leading businesses are putting in place sustainable business processes to continually revisit their supply chain network structure and operations.
Good data is the driver of good supply chain analysis. Many companies struggle with understanding the true activity costs associated with one or more of the following: raw material supply, production, packaging, warehousing, distribution and inventory. When running a significant supply chain analysis and design project, it is often the case that the bulk of the time and effort is spent gathering, organizing and validating the input data that drives the analysis. Unfortunately, too often that effort is wasted, as the data is used once and then forgotten. However, this need not be the case.
Those implementing best practices have extended data warehouses to include the key activity-based operations and cost data used in their strategic and tactical network optimization. The data is kept evergreen through continual data management processes and is always available for the next what-if scenario. These what-if scenarios might include:
• Short term: How best to tactically respond to unexpected events like strikes, weather disruptions, major capacity losses, etc.?
• Mid-term: How do I evaluate new sales opportunities for large additional volumes and how will these impact my ability to deliver across the supply chain?
• Long Term: How do I evaluate new merger and acquisition opportunities? How do I plan for capacity expansions?
Companies that maintain the proper data and do not have to start from scratch on each new what-if analysis can use a tried-and-true process to respond more quickly and more accurately to the opportunities that continually present themselves.
Extending Supply Chain Network Design to Competitive Markets
If you have used optimization-based SCND to analyze a portion of your business in the past couple of years, then you are running with the herd. If you have implemented sustainable business processes to refresh and maintain the critical data and are able to run supply chain network what-if analysis at a moment’s notice, then you are running at the front of the herd. Those running way out in front are also using SCND analysis to understand the true delivered cost of supplying product to each customer and managing their business profitability accordingly.
Advances in network design modelling, optimization, and game theory have recently opened the door to a broader analysis that focuses on the collective supply chains of all competitors in a marketplace. These tools can be used to determine which customer/product combinations should be targeted, and at what price, to maximize profit. There are three key steps to accomplishing this.
1. Understand your own Total Delivered Cost to each customer.
Understanding your true total delivered cost to each of your customers enables you to analyze and determine the profit you are earning from each customer. It also partially informs your pricing decisions, especially in competitive situations or when demand is greater than your ability to supply. Not only does this analysis determine the profitability by customer, it also determines the impact of adding or dropping a customer, thus answering the question, “Even though it’s low margin business, can we afford to lose the volume?”
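To make the “can we afford to lose the volume?” question concrete, here is a toy sketch with made-up numbers, showing that a customer who looks unprofitable after fixed-cost allocation can still be worth keeping, because the fixed costs get re-spread over the remaining volume:

```python
# Hypothetical illustration: profitability by customer, and the effect of
# dropping a low-margin customer when fixed costs must be reallocated.

customers = {  # name: (revenue, variable delivered cost, volume in units)
    "A": (100_000, 60_000, 10_000),
    "B": (40_000, 32_000, 8_000),   # the "low margin" customer
    "C": (80_000, 50_000, 6_000),
}
FIXED = 30_000  # network fixed cost, allocated in proportion to volume

def profit_by_customer(custs):
    total_vol = sum(vol for _, _, vol in custs.values())
    out = {}
    for name, (rev, var_cost, vol) in custs.items():
        fixed_share = FIXED * vol / total_vol
        out[name] = rev - var_cost - fixed_share
    return out

before = profit_by_customer(customers)
after = profit_by_customer({k: v for k, v in customers.items() if k != "B"})
print(before)                 # customer B shows a loss after allocation
print(sum(before.values()))   # total profit with B
print(sum(after.values()))    # total profit drops without B's contribution
```

In this invented example, customer B shows a loss of 2,000 after fixed-cost allocation, yet dropping B reduces total profit from 48,000 to 40,000, because B was contributing 8,000 toward the fixed costs that do not go away.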
2. Estimate competitor costs for supplying to a shared set of customers
While pricing is largely influenced by your own internal costs for producing, delivering and selling to your customers, it is also heavily influenced by market conditions and the price at which your competitors are able and willing to sell their competing products to the same customers. In order to understand the market dynamics, you need to be able to reasonably estimate your competitors’ costs. For example, if you are in an industry where transportation delivery costs are significant, then regionally located manufacturing will have an impact on price and profitability. Understanding which customers are more profitable for you and which are more costly for your competitors to serve enables you to develop a winning strategy.
3. Use cutting edge optimization technology to model the competitive market
While most businesses are good at determining pricing and identifying profitable customers intuitively and on an ad hoc basis, few have put into place the rigorous business processes and analytics to be able to do it accurately on a routine basis. This requires a deep understanding of your total delivered cost, your supply chain sourcing options, and the flexibility you have on both the cost and revenue side of the equation. It also requires a better understanding of your competitors’ supply chains, and what they may or may not be able to do, based on their own costs.
Supply chain network design optimization tools have become well integrated into modern business decision-making processes at leading edge companies. They are used to rigorously analyze and make the best decision in response to both short-term events (such as weather disruptions, spot sales opportunities and capacity outages) and long-term strategy, such as capacity expansion or mergers and acquisitions. These analytical approaches and technologies have recently been extended to enable businesses to analyze not just their own operations, but the sum of multiple supply chains in the competitive marketplace. It is exciting work, and it is adding millions of dollars to bottom-line profit each year.
December 1st, 2015 5:11 pm Category: Distribution, Global Supply Chain, Green Network, Inventory Management, Network Design, Optimization, Optimization Software, Scheduling, Solver Optimization, Supply Chain Improvement, Supply Chain Optimization, Supply Chain Planning, Transportation, Vehicle Routing, Warehouse Optimization, by: Gene Ramsay
Profit Point has been helping companies apply mathematical techniques to improve their business decisions for 20 years now, and it is interesting to review some of the advances in technology that have occurred over this time that have most enabled us to help our clients, including:
• The ability for companies to capture, store and access increasingly larger amounts of transaction and anecdotal data that quantify the behavior and motivation of customers, manufacturers, suppliers and other entities
• The improvement in analytical capabilities that help make optimized choices, in such areas as solving mixed-integer optimization problems, and
• The improvement of computing technology, allowing us to perform calculations in a fraction of the time required just a few years ago
A recent post on the Data Science Central website highlights the use of advanced techniques based on these advances by on-line marketplace Amazon, which is generally acknowledged as one of the most tech-savvy companies on the planet. 21 techniques are listed that Amazon uses to improve both their day-to-day operations and planning processes, including supply chain network design, delivery scheduling, sales and inventory forecasting, advertising optimization, revenue / price optimization, fraud detection and many others. For a complete list see the link below:
Like our customer Amazon, Profit Point is committed to using these techniques for the benefit of our clients – we have been concentrating on implementing business improvement for our clients, including optimization in various forms, since our very beginning. Are you, like Amazon, using the best methods to seize the opportunities that are available today?
Over the past week I’ve had two experiences that made me think about what’s required for a successful organizational change. The first was our CSCMP Roundtable tour of a family-owned food distribution company that had built a large, successful regional business by leveraging their founder’s focus on customer satisfaction and valuing his employees as a cornerstone of the business. The company had recently been purchased by another family-owned company and was in the midst of a successful wholesale change in IT systems and work processes. Having seen many organizations struggle with such a large change, I asked our host about the secret of their organizational change. In a word, he said, “Culture.”
Immediately after the new owner had completed the purchase, they spent a lot of time with the employees reassuring them that the values of the company wouldn’t change even though the way that they did their jobs might change dramatically. In the end, the two companies’ cultures valued the same things: customer satisfaction and their employees. With that in mind the change management effort began as an inclusive effort with a clear set of goals for the new work processes. Not that there weren’t any bumps in the road, but the two once-separate organizations were able to push towards the new way of doing business as a common team.
So what does that have to do with a bike ride on a windy day? That’s where the second experience of the week comes in. Over the weekend, I completed a two-day, 176-mile ride around Galveston Bay. Just like a good organizational change-management effort, the first day was preceded by a lot of training and preparation and accompanied by excitement and adrenaline. We had some tough slogs, particularly one 21-mile stretch directly into a 15 mph headwind. It was grueling, but we knew it was coming and grunted our way through it. But then came the pay-off: the headwind became a tailwind as we sailed down the coast on our way to the finish line for Day 1. Again, like an organizational change, we had some tough slogs, but our preparation paid off and we were able to celebrate as we hit our first big milestone.
The second day of the ride promised to be a tough one. We had already ridden 97 miles on the first day, winds were blowing at almost 20 mph and were forecast to be mostly in our face all the way back to our starting point on Day 1. I knew it would be a challenging day, so I decided that speed was not very important; just finishing. In addition, I knew that I needed to find some like-minded riders so we could work together into the wind. Luckily fate smiled upon me and I found a couple of riders that were taking the same approach to the ride. We teamed up, taking turns pulling from the front so that the other two could draft and waiting for each other when we had flat tires. We also got to celebrate when we turned away from the wind and had it at our backs for short stretches before turning into it again. The parallels to a successful organizational change jumped out at me.
- We made a realistic assessment of the challenges ahead
- We set goals that were within our reach, given the challenges
- We found allies with the same mind-set and worked as a team towards a common goal
- We celebrated success when we had a downwind leg
- We finished as a team
I hope to see you out on the road, be it organizational change or riding your bike into the wind. Good luck, and let me know if you need someone to help pull on the next ride.
July 21st, 2015 2:58 pm Category: Global Supply Chain, Network Design, Operations Research, Optimization, Optimization Software, Profit Network, Publications, Supply Chain Agility, Supply Chain Optimization, Supply Chain Planning, Warehouse Optimization, by: Ted Schaefer
Profit Point’s recent article in Industry Today, “The Future of Supply Chain Network Design,” describes how to fully leverage the new advances in a traditional supply chain optimization process to include not only your internal supply chain, but the supply chains of your competitors, as well.
Supply chain network design optimization tools have become well integrated into modern business decision-making processes at leading edge companies. The tools are used to rigorously analyze and make the best decisions in response to both short-term events (such as weather disruptions, spot sales opportunities and utility outages) and longer-term strategy issues, such as capacity expansion or mergers and acquisitions. These analytical approaches and technologies can be game changers. The newest versions of SCND tools have been expanded: businesses can now analyze not just their own operations, but also the sum of multiple supply chains in the competitive marketplace, creating a new way to integrate competitive risk into the design of your supply chain.
Please contact us if you’d like to learn more about new ways to leverage traditional ideas.
We all know the saying, “An apple a day keeps the doctor away.” However, for many of our neighbors, it’s easier said than done. According to some recent surveys, most of you reading this eat about 40% more fresh produce than the segment of the population that is served by our nation’s food banks. In Texas, according to the recently released Feeding America Map the Meal Gap data, 17.6 percent of the overall state population struggled to avoid hunger in 2013, including nearly two million children. Surprisingly, in many cases the problem is not the availability of food; it is a supply chain problem.
Through a CSCMP friend at the Houston Food Bank, I recently started a project with Feeding Texas, a network of the 21 food banks serving the state, to increase the amount of fresh produce we can deliver per dollar spent. Just like in many private sector companies that have grown in size over time, each Feeding Texas member food bank operates independently. Across the food banks resources are tough to come by and tend to be used in the day-to-day operations to bring in food and get it out the door to clients. Thus, they have not yet adopted many of the current best practices in supply chain management.
Even though there is more fresh produce available than they can use at any given time, Feeding Texas members face key issues such as:
- Lack of transportation capacity to move produce when it is offered
- The high cost of transportation, which consumes limited budget funds and restricts the amount of produce that can be obtained
- Purchase of out-of-state produce when produce is available in Texas
These issues are typical of an organization that hasn’t implemented modern network design, supply/demand planning and transportation planning processes.
We are currently collecting data to complete a more extensive review of the total Feeding Texas supply chain to identify opportunities to move more fresh produce at a lower cost. We are also engaging with key industry contacts and donors to help us understand whether we can adopt some of their best practices in the movement of fruits and vegetables. I have to say that I have been very gratified at the number of times I’ve called a supply chain colleague to ask for their help on this project. In almost all cases, the response has been an unhesitating, “What can I do to help?”
I’ll keep posting our updates as we hit new milestones in our project. In the meantime, I would ask you to reach out to the food bank in your neighborhood and ask if your supply chain expertise can be put to good use.
April 22nd, 2015 12:30 pm Category: Global Supply Chain, Network Design, Operations Research, Optimization, Profit Network, Supply Chain Improvement, Supply Chain Optimization, Supply Chain Planning, Supply Chain Software, by: Gene Ramsay
At Profit Point, network design analysis, answering such questions as
• how many facilities a business needs,
• how large they should be and where they should be located, and
• how they should change over time, etc.
is one of our specialties. We have performed this type of analysis for a range of companies across multiple industry types and geographical regions, and we have developed our own network design-focused software package to help us do this type of study. (And we teach folks how to use the software as well, so they can answer their own network design questions, if they want to pursue that.)
Our modeling “toolbox”, our Profit Network software, is designed to be flexible and data-driven, so that the user can focus more attention on the key part of the supply chain where the questions must be answered, without having to define more detail than is really desired in other areas of the supply chain.
One of the key elements in many of the models we or our clients construct is the bill of materials. This data specifies the materials that are required to produce goods along the supply chain, be they intermediate materials or finished goods. For instance, if you are making a finished good such as a loaf of bread, the bill of materials would specify the quantities of flour, yeast, salt and other ingredients that would go into a batch.
To get trustworthy results from a model, it must require that the bill of materials (BOM) data be defined, and be used, in deriving the solution. (In some models we have encountered, the BOM is just a suggestion, or products can be created from thin air if the BOM data is not defined.)
The BOM logic must also be able to capture the reality of a situation. The BOM may need to vary from one machine to another within the same facility. Or it might need to vary over time – as an example, when agricultural or dairy products are ingredients to a manufacturing process, the ingredients might have different characteristics over the seasons of the year, thus requiring different input quantities over time to produce a consistent, standardized output.
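As a rough illustration of the BOM logic described above (the data and names here are invented for the example), a lookup keyed by product, machine and season lets input quantities vary by line and time of year, and fails loudly whenever no BOM is defined, so the model can never create product from thin air:

```python
# Sketch of a bill-of-materials table keyed by (product, machine, season).
# Quantities can differ by machine or season; all data is illustrative.

BOM = {
    # (product, machine, season): {ingredient: quantity per batch}
    ("bread", "line1", "summer"): {"flour": 100.0, "yeast": 1.0, "salt": 2.0},
    ("bread", "line1", "winter"): {"flour": 100.0, "yeast": 1.2, "salt": 2.0},
}

def ingredients_required(product, machine, season, batches):
    """Return the inputs needed for a number of batches.

    Raises KeyError if no BOM is defined for this combination, rather
    than silently assuming zero inputs.
    """
    key = (product, machine, season)
    if key not in BOM:
        raise KeyError(f"No BOM defined for {key}")
    return {ing: qty * batches for ing, qty in BOM[key].items()}

print(ingredients_required("bread", "line1", "winter", 10))
```

Note that the winter BOM calls for more yeast per batch than the summer one, the kind of seasonal variation in ingredient characteristics mentioned above.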
We work closely with our clients to ensure that our software is matched to their needs, and that it gives them the flexibility they need as their businesses change.
November 14th, 2014 9:45 am Category: Global Supply Chain, Green Network, Green Optimization, Network Design, Operations Research, Optimization, Optimization Software, Profit Network, Risk Management, Supply Chain Improvement, Supply Chain Optimization, Supply Chain Planning, Sustainability, by: Gene Ramsay
In developing a supply chain network design there are many criteria to consider – including such factors as the impact of the facility choices on
• Cost of running the system,
• current and future customer service,
• ability to respond to changes in the market, and
• risk of costly intangible events in the future
to name a few.
Frequently we use models to estimate revenues / costs for a given facility footprint, looking at costs of production, transportation, raw materials and other relevant components. We also sometimes constrain the models to ensure that other criteria are addressed – a constraint requiring that DCs be placed so that 80% of demand be within a day’s drive of a facility, for instance, might be a proxy for “good customer service”.
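A toy sketch of such a coverage proxy might check what fraction of demand sits within a day’s drive of an open DC (all cities, distances and demand figures below are invented for the example):

```python
# Toy check of a service-level proxy: what share of demand is within
# MAX_MILES (a stand-in for "a day's drive") of some open DC?

MAX_MILES = 500

demand = {"Atlanta": 120, "Dallas": 90, "Denver": 60, "Seattle": 30}
# Road miles from each candidate DC to each demand point (illustrative)
miles = {
    "Memphis":  {"Atlanta": 380, "Dallas": 450, "Denver": 1040, "Seattle": 2300},
    "SaltLake": {"Atlanta": 1900, "Dallas": 1200, "Denver": 480, "Seattle": 840},
}

def coverage(open_dcs):
    """Fraction of total demand within MAX_MILES of at least one open DC."""
    covered = sum(d for city, d in demand.items()
                  if any(miles[dc][city] <= MAX_MILES for dc in open_dcs))
    return covered / sum(demand.values())

print(coverage(["Memphis"]))              # only Atlanta and Dallas covered
print(coverage(["Memphis", "SaltLake"]))  # Denver now covered as well
```

In a full network design model, a check like this becomes a constraint (e.g., coverage must be at least 0.8), so the optimizer trades off the cost of opening DCs against the customer-service proxy.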
Some intangibles, such as the political risk associated with establishing or maintaining a facility in a particular location, are difficult to measure and include in a trade-off with model cost estimates. Another intangible of great interest for many companies, and one that has been difficult to make tangible, is water risk. Will water be available in the required quantities in the future, and if so, will the cost allow the company to remain competitive? For many industry groups water is the most basic of raw materials involved in production, and it is important to trade off water risk against other concerns.
As I wrote in a previous blog published in this forum,
There are several risks that all companies face, to varying degrees, as global water consumption increases, including
• Physical supply risk: will fresh water always be available in the required quantities for your operations?
• Corporate image risk: your corporate image will likely take a hit if you are called out as a “polluter” or “water waster”
• Governmental interference risk: governmental bodies are becoming increasingly interested in water consumption, and can impose regulations that can be difficult to deal with
• Profit risk: all of the above risks can translate to a deterioration of your bottom line.
The challenge has been: how to quantify such risks so that they can be used to compare network design options.
Recently a post entitled “How Much is Water Worth” on LinkedIn highlighted a website developed by Ecolab that offers users an approach to monetization of water risks. This website allows the user to enter information about their current or potential supply chain footprint – such as locations of facilities and current or planned water consumption – and the website combines this information with internal information about projected GDP growth for the country of interest, the political climate and other factors to calculate a projected risk-adjusted cost of water over the time horizon of interest.
This capability, in conjunction with traditional supply chain modeling methods, gives the planner a tool that can be used to develop a more robust set of information that can be used in decision-making.
For more details, visit waterriskmonetizer.com.
In their recent article in Industry Week, titled “Is Forecasting the Weakest Link in your Supply Chain?”, Profit Point explains how supply chain optimization is breaking new ground. By bringing state-of-the-art optimization techniques to customer order fulfillment and execution, leading companies are making significant inventory reductions and no longer relying on the old and expensive technique of building high levels of safety stock to ensure high customer satisfaction.
Visit Industry Week to read about these fresh supply chain ideas.
July 17th, 2014 5:04 pm Category: Global Supply Chain, Green Network, Network Design, Optimization Software, Supply Chain Agility, Supply Chain Improvement, Supply Chain Optimization, Sustainability, Transportation, by: Gene Ramsay
Many of our activities at Profit Point are focused on helping clients identify and implement changes that improve the efficiency of existing supply chain networks, ranging from planning to operations and scheduling. In the short term we are usually trying to find ways to use existing capabilities more effectively, but as you look out over longer time horizons, supply chains evolve to develop new links, and these must be considered as you plan.
One instance of this evolution was described by my colleague, John Hughes, who recently wrote about the rise of a “New Silk Road”– a rail network stretching through Western China, Kazakhstan, Russia and Belarus to Europe – used for transporting manufactured goods from Asia to meet demand in Europe.
But Asia has a complementary demand that must be met for their manufacturing systems to function, the demand for energy to power their factories and cities. The growing worldwide demand for energy, and for faster routes to market, is opening up another new link in the global trade routes – the Northern Sea Route, a maritime route connecting Pacific ports with Europe via the Arctic.
Lawson Brigham, professor of geography and Arctic policy at the University of Alaska Fairbanks, was recently quoted on the arcticgas.gov website as saying “What’s really driving the Northern Sea Route is global commodity prices and natural resource development, particularly in Russia.”
The northern reaches of the earth are currently hotbeds of energy development, and much of the activity is focused on adding Liquefied Natural Gas (LNG) production capacity. Projects are on-line or in progress stretching from the North Slope in Alaska to the Yamal Peninsula in Siberia to Hammerfest in Norway. The Northern Sea Route offers quicker shipments of many of these supplies to major Asian ports, shaving ten to twenty days off one-way transit times from Russia and Norway to ports in Korea and China, compared to routes through the Suez Canal.
Climate change has made these routes generally ice-free for several months of each year, and thus more cost effective, but ice-strengthened cargo ships, with icebreaker support, are still required to keep the route open in the colder months, thus driving up the costs.
Supply chain planning activities on a global scale will over time need to expand to consider the potential impact of these types of shipping options. Keep an eye out for this and other new links in the global chain as they become available – change is inevitable.
For more information on this route, see articles like these:
April 16th, 2014 10:21 am Category: Global Supply Chain, Network Design, Operations Research, Optimization, Optimization Software, Profit Network, Supply Chain Optimization, Supply Chain Planning, Supply Chain Software, by: Gene Ramsay
Recently I had the opportunity to speak to an operations management class for MBA students in the Goizueta Business School at Emory University. The class is intended to give the students an introduction to a variety of problems that they might encounter during their careers, and to management science techniques that might be applied to them, using Excel as a solution platform. The professor had asked me to address the course topic from the point of view of one who had used these methods in the real world, and I was glad to do so, recounting my work in supply chain network design, hydro power generation scheduling, routing of empty shipping containers, natural gas supply contract management and various other problems.
During Q&A one of the students asked how a company should determine the appropriate source of resources to use for solving these types of problems – should it be in-house expertise or an outside consultant?
As I told him, to me, this depends on a number of factors, and I gave an example, based on our experience: In our practice we perform supply chain network design studies, and we also license the network design software that we use to our clients, if they desire. A number of clients have engaged us to first do an analysis for them, and then they have licensed the software so that they can then perform future projects themselves, using our initial project as a base. Many of these clients have used the software very effectively.
Those that have been most successful at using the software in-house, and at performing management science projects in-house in general, have several common characteristics:
- They are committed to tactical and strategic planning as tools for meeting their business goals,
- They have enough work in this area, and related areas, to keep an analyst or group of analysts busy full time, due to such factors as
  - The scale and scope of their operations
  - The speed of innovation in their industry
  - The level of complexity of their supply chain and variety of products made, and
  - Their desire for a “continuous improvement” approach as opposed to a “one-time reorganization” approach
- They have a commitment to maintaining personnel who
  - have the proper skills and training to address these problems, and
  - are allowed the time to work on these problems, rather than being constantly pulled off for “firefighting” short-term or operational problems.
Most companies can make good use of management science solution methods, but, as you think about how to do this, try to make a realistic determination of your internal priorities, so you can decide between insourcing and outsourcing, or a mixture of the two.
March 6th, 2014 9:32 am Category: Operations Research, Optimization, Optimization Software, Profit Network, Profit Vehicle Planner, Profit Vehicle Router, Supply Chain Improvement, Supply Chain Optimization, Supply Chain Planning, by: Jim Piermarini
In recent weeks, I have been thinking about testing our applications, like our popular Profit Network or Profit Vehicle Planner. When we test, we run data sets that are designed to stress the system in different ways, to ensure that all the important paths through the code are working properly. When we test, our applications get better and better. There are many good reasons to test; the most important is knowing that an improvement in one part of the code does not break a feature in a different part of the code.
I have been thinking about how we could test our code a bit more, and the means by which we could do that. I have been reading about automated testing and its benefits. They are many, but the upshot is that if the testing is automated, you will likely test more often, and that is a good thing. Automating application testing requires the ability to churn out runs with nobody watching. And to do that, the application must be able to be kicked off and run without any buttons or dialog boxes that must be manually clicked to continue. There can be no settings that must be manually set, and no information to review to decide what to do next. In addition, the application must then save the results somewhere: in the instance of the application, to a log file, or to a database of some sort. Finally, to really be testing, the results must be compared to the expected results to determine the pass/fail state of the test. This requires having a set of expected results for every test data set.
In looking at this process above, I see numerous similarities to the process used to run a sensitivity analysis, in that many runs are typically made (so automation is a natural help) and the results need to be recorded. Sensitivity analysis is a typical process for users of our Profit Network tool, and our Profit Planner and Profit Scheduler tools. An additional step in sensitivity analysis, however, is that you may desire to change the input data in a systematic way (say Demand +5%, and Demand -5%), and to the extent that it is indeed systematic, this too could be folded into the automation. The results analysis is different too, in that here you would like to look across the final sets of results at the differences, while in testing you just compare one set of test results to its expected results. I can foresee difficulty in automating the data changes, since each type of data may need to be changed in a very specific way. Nevertheless, even if the data changes are manual, they could be prepared ahead of the run, and the runs themselves could be grouped in a batch run to generate the results needed for a sensitivity analysis.
Constructing a harness that lashes up to an application, where you can define the number of runs to be made, the settings for each run, the different data sets to be used, and the output location for results to be analyzed, would be useful not only for testing, but for the type of sensitivity analysis we do a lot of here at Profit Point.
I am going to encourage our developers to investigate this type of system harness, one able to talk to and control our applications, run them automatically, and store their results in a data store for either testing or sensitivity analysis.
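As a sketch of what such a harness might look like, the batch runner below covers both uses: it runs every data set under every scenario with nobody watching, saves each result to a data store (here, plain JSON files), and, in testing mode, compares each run to its expected results for a pass/fail verdict. The `run_model` function is a hypothetical stand-in for launching one of our applications headlessly; everything else is illustrative.

```python
import json
from pathlib import Path

def run_model(dataset, demand_scale=1.0):
    """Hypothetical stand-in for kicking off an application run headlessly.
    Here it just scales the demand figures and totals them."""
    return {"total_demand": sum(d * demand_scale for d in dataset["demand"])}

def run_batch(datasets, scenarios, results_dir):
    """Run every data set under every scenario, saving each result to disk."""
    results_dir = Path(results_dir)
    results_dir.mkdir(parents=True, exist_ok=True)
    results = {}
    for name, data in datasets.items():
        for label, scale in scenarios.items():
            key = f"{name}_{label}"
            results[key] = run_model(data, demand_scale=scale)
            # The data store: one JSON file per run, for later analysis.
            (results_dir / f"{key}.json").write_text(json.dumps(results[key]))
    return results

def check(results, expected):
    """Testing mode: compare each run to its expected results (pass/fail)."""
    return {key: results.get(key) == exp for key, exp in expected.items()}
```

For a sensitivity analysis you would scan across the saved results for differences between scenarios; for testing you would feed `check` the library of expected results, one per test data set.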
Jim Piermarini | CEO Profit Point Inc.
A few years ago, Thomas L. Friedman wrote a book called “The World is Flat,” which made the case that businesses and organizations are becoming more and more integrated across physical locations and national boundaries. His point was that in the future, the various functions and processes of any enterprise will increasingly be done in different parts of the world by the most efficient and effective individuals, regardless of where they happen to be or who their employer is. It will no longer be necessary to have all of the people working on a project co-located; they can be practically anywhere in the world and actually be members of multiple different departments or totally separate organizations.
Although Friedman did recognize the impact of companies’ supply chains becoming ever more efficient and expansive in this process (“Supply Chaining” was one of his 10 basic forces), he focused mainly on the ramifications of the increased power and availability of the internet as the driving force behind this increasing integration. For sure, it’s true that over time the ubiquitous nature of the web will continue to profoundly shape the way people in different parts of the world share information, increasing each other’s creativity and productivity as a result. But developments in supply chain management will also have a huge impact on the way people in different parts of the world think of each other, interact and inter-relate.
This was reinforced to me by the recent news that workers at the Volkswagen plant in Chattanooga voted against allowing the UAW union to represent them. This is the latest evidence that relatively un-skilled production workers in the US are finding themselves in direct competition with workers in India, China, Korea, and Europe. The workers at Volkswagen realized that they had to send a message to prospective employers worldwide that Tennessee is a “business-friendly” location. If somehow they were viewed negatively, then companies, with their super-efficient supply chains, would seek out lower-wage locations for their production.
When Volkswagen opened their Chattanooga plant in 2011, the starting wage for assembly-line workers was $14.50 per hour. This is roughly half of what traditional, unionized personnel at GM and Ford make. With benefits, this comes to about $27 per hour. The ‘funny-sad’ thing is that Volkswagen personnel in Germany make roughly $67 per hour. So in effect Volkswagen exported those German jobs from their high-wage home country to the low-wage United States. It was a similar story when General Electric started up a new line at their Louisville, Ky. factory in 2012. There, the starting rate for line workers was $13.50 per hour, which comes out to less than $30,000 per year.
Essentially, low- or un-skilled manufacturing workers in the U.S. are in a race to the bottom (though what the bottom looks like is a function of the industry). The only way for these workers to improve their living standard is through increased training, education and flexibility. I’m not saying that everybody should go to college… not at all. After all, I’m college educated and nobody in their right mind would want me as an industrial welder. It would be a scary thing to see me with a blow torch. And my wife has already prohibited me from going near the electrical system in our house. But learning specific skills and trades such as CNC machine programmer, skilled mechanic, CAD technician, electrician, etc. will distinguish non-college-educated workers from their competition. These are the skills that are valuable to employers and in short supply, and they will allow the U.S. blue-collar labor force to ensure that they are not competing with other low-skilled workers in far-off places in the world.
February 19th, 2014 3:51 pm Category: Supply Chain Optimization, by: Karen Bird
The economy has been slow to recover from the Great Recession of 2008; however, many economists believe that 2014 and 2015 will be strong years for the US and global economies. How accurate will your forecasting model be in projecting the supply needed from your business? Since forecasting models typically use two to three years of history (actual sales) to predict the future, and we are coming out of a down economy and heading toward positive growth, the standard forecasting models will not predict the future very well. This is where human intelligence and companies with a formal Sales and Operations Planning (S&OP) process have an advantage.
A formal S&OP process gives companies a monthly opportunity for their Sales and Operations teams to come together and review the data, latest intelligence from the field and make the best decisions possible for the company. In addition, a formal S&OP process gives the business a forum each month to challenge the current execution plan and either reconfirm or adjust the plan to meet the strategic goals of the company. A monthly review of key forecasting metrics can provide the Sales team with valuable feedback regarding the forecast.
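Two of the simplest forecasting metrics worth putting in front of the Sales team each month are forecast accuracy (here, mean absolute percentage error) and bias. A minimal sketch, assuming actuals and forecasts arrive as plain lists of monthly volumes (the function names and data shapes are my own, for illustration):

```python
def mape(actuals, forecasts):
    """Mean absolute percentage error: lower means a more accurate forecast."""
    errors = [abs(a - f) / a for a, f in zip(actuals, forecasts)]
    return 100.0 * sum(errors) / len(errors)

def bias(actuals, forecasts):
    """Average signed error: positive means over-forecasting on average."""
    return sum(f - a for a, f in zip(actuals, forecasts)) / len(actuals)
```

Tracked by product family month over month, these give the feedback loop described above; a persistent positive bias, for instance, is a concrete reason to challenge the current plan at the next S&OP meeting.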
Read the Profit Point S&OP Research Report
A study by the Aberdeen Group shows that greater than 60% of Best-In-Class companies view a formal S&OP process as a strategic priority for their organization and that Best-In-Class companies hold an 18 point advantage in forecast accuracy. According to an AMR Research study from 2008, companies that are best at demand forecasting average:
- 15% less inventory
- 17% higher perfect order fulfillment
- 35% shorter cash-to-cash cycle times
- 1/10 the stock-outs of their peers
How does a formal S&OP process help to deliver these benefits? It is a combination of getting the right people together to make the right decisions at the right time. A few years ago, Thomas Wallace, author of Sales and Operations Planning: The How-To Handbook, initiated a project to study the experiences of companies using Executive S&OP very well. The companies in the Best Practices Project cited hard benefits similar to those listed above, but they also said that “the soft benefits are equal in importance, or perhaps greater, than the hard benefits”. The soft benefits most often cited were:
- Enhanced Teamwork
- Embedded Communications
- Better Decisions
- Better Financial Plans
- More Focused Accountability
- Greater Control
- A Window Into The Future
A well run S&OP process will put a spotlight on problem areas or gaps in your business 18 – 24 months in the future. This allows the team to collectively see a potential problem or upside opportunity and produce scenarios to help the company to react to them in a timely and efficient manner.
November 25th, 2013 9:12 am Category: Supply Chain Optimization, by: Editor
We found a great article by Adrian Gonzalez entitled 5 Reasons Why Excel is Champ of Supply Chain Apps. Over the past 15 years, we’ve seen so many companies relying on Excel to make early stabs at supply chain optimization. And, since we recognize that you supply chain “disruptors” will do just about anything to change the status quo to move the needle on continuous improvement, we applaud you.
In fact, Excel – coupled with its relatively powerful optimization engine, Solver, from our partner Frontline – is a great starting place for identifying supply chain optimization opportunities. It’s extremely accessible, virtually free and easy to collaborate with, since everyone has it. So long as you’re not looking for near-real-time planning and decision support, the optimization features are quite useful.
Of course, when you’re ready to take it to the next level, there are some very good reasons to move beyond Excel. First and foremost, connecting your supply chain optimization software to your ERP data warehouse allows you to get a much more granular level of detail. And a direct connection means that you’re not wasting hours – or sometimes days – extracting and shaping the data into a consumable form. It’s just there and ready to go!
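To make the Solver comparison concrete, here is the kind of model Solver typically handles in a spreadsheet, expressed outside Excel: a minimal transportation problem minimizing freight cost, solved with SciPy’s `linprog`. All of the plant, warehouse, cost and demand figures below are made up for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 2 plants shipping to 3 warehouses.
cost = np.array([[4.0, 6.0, 9.0],          # per-unit freight from plant A
                 [5.0, 4.0, 7.0]])         # per-unit freight from plant B
supply = np.array([250.0, 300.0])          # plant capacities
demand = np.array([150.0, 200.0, 120.0])   # warehouse requirements

n_plants, n_whs = cost.shape
c = cost.flatten()  # decision variables x[i, j], flattened row-major

# Supply constraints: shipments out of each plant <= its capacity.
A_ub = np.zeros((n_plants, n_plants * n_whs))
for i in range(n_plants):
    A_ub[i, i * n_whs:(i + 1) * n_whs] = 1.0

# Demand constraints: shipments into each warehouse == its requirement.
A_eq = np.zeros((n_whs, n_plants * n_whs))
for j in range(n_whs):
    A_eq[j, j::n_whs] = 1.0

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand,
              bounds=(0, None))
print(round(res.fun, 1))  # minimized total freight cost
```

`res.x` holds the shipment plan itself. Scaling this idea up to real networks, with fixed costs, modes and multiple echelons, is exactly where dedicated network design tools take over from the spreadsheet.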
Below is Adrian’s top-5 list, as well as some of the limitations that he identified. There’s also a link to the complete article, which is well worth reading.
“I believe there are five main reasons why Excel remains the reigning champ of supply chain applications:
- It’s easy to learn and use.
- You can quickly and easily configure it to your specific needs and preferences.
- It’s highly portable: you can use it almost everywhere, and share it easily with others.
- It’s ubiquitous: Almost everybody has it and knows how to use it.
- It’s inexpensive.
Of course, Excel has some significant drawbacks that limit its usefulness and value as a supply chain application, such as…
- You’re working with static data
- A macro is not the same as an optimization engine
- It’s not integrated with execution tools
- You often end up with multiple versions of the truth”