In the wake of the recent election, there has been a lot of talk about the types of changes we’ll be facing over the next few years. The continuing analysis of the election and a recent plane ride have given me a good refresher course on some of the critical factors that enable a successful change in an organization or doom it to failure.
The day after the election, I was traveling home to Houston, and I took the advice I give my two college-age sons: if you really want to know what’s going on with a particular issue, get at least two different points of view; the truth is likely to fall somewhere in the middle. So I bought copies of The New York Times and The Wall Street Journal to read their analyses of the election results. Needless to say, the two papers had some fairly different interpretations of the same sets of facts. Sometimes they drew different conclusions from the same data; other times, the facts each paper emphasized, and the order in which it presented them, would lead a reader in two different directions depending on which paper I was reading.
The following week, I was traveling home after our project team delivered the final presentation to our executive sponsors. Our team had recommended a number of changes to the client’s supply chain, some fairly straightforward and others that would require a significant change in culture. As it happens, I sat next to a gentleman who helps companies change cultures. We had a good conversation, helped by our third row-mate, who bought drinks for the row, about a number of different things. However, one thing that stuck with me was his premise that an organization’s results are determined by its culture. In this organizational model, actions drive results, but beliefs drive actions. Thus, to change the results in a company, one must change the beliefs held by the people who impact the results.
Once again, I was reminded that the key to a successful change is the people who run the process. If they are not engaged and if they don’t believe that the change will be a good one, you’re in for a very rough ride. Further, when trying to understand the current beliefs that drive the actions that drive the success of your change, it’s best to seek out more than one source of information.
For years we’ve been hearing about Big Data. Now the call is to make the data visible and actionable. Easier said than done. Remember when we wanted our music, phone and camera on one device instead of having to carry multiple devices? Data Visualization is that desirable right now, and it is just as challenging to do well. Here’s why:
Challenge #1: Properly defining the question that you want the data to answer
In the world of supply chain, leaders typically want all of the data summed up into good news or bad news. For example, at the end of a monthly S&OP meeting, one of the key questions asked is, “Can Sales continue to promote product A?” For Operations to give a yes-or-no answer, a timeframe has to be defined. Once the timeframe is agreed, Operations can answer the question by building a heat map for every product or product family (if that makes the data more manageable). The heat map can then be given to Sales at the end of the monthly S&OP.
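The heat-map idea above can be sketched in a few lines of code: classify each product’s projected inventory cover into Green, Yellow or Red over the agreed timeframe. The product names, the thresholds and the weeks-of-cover figures below are all illustrative assumptions, not a prescription:

```python
# A minimal sketch of the S&OP "heat map": map each product's projected
# weeks of inventory cover to a Green / Yellow / Red status for the
# agreed timeframe. Thresholds and data are invented for illustration.

def classify_cover(weeks_of_cover, yellow_below=4.0, green_at=8.0):
    """Map projected weeks of inventory cover to a heat-map color."""
    if weeks_of_cover >= green_at:
        return "Green"   # Sales can keep promoting
    if weeks_of_cover >= yellow_below:
        return "Yellow"  # promote with caution
    return "Red"         # supply at risk; stop promoting

# Projected weeks of cover per product over the agreed horizon
projected_cover = {"Product A": 9.5, "Product B": 5.2, "Product C": 2.1}

heat_map = {p: classify_cover(w) for p, w in projected_cover.items()}
for product, color in sorted(heat_map.items()):
    print(f"{product}: {color}")
```

In practice the same classification would run per product (or product family) and per period, producing the grid of colors that gets handed to Sales.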
Challenge #2: Cleaning up dirty data
This is where most organizations get stuck. Cleaning up the data is tedious work, but it has to be done or the metric is useless. Take heart: sometimes identifying and fixing the issues with the data is meaningful on its own. Also, think about the decisions that dirty data is influencing on a daily basis, or the time it takes to explain variances every month.
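As a hypothetical sketch of what one clean-up pass might look like, the snippet below normalizes unit-of-measure strings, drops exact duplicates and flags records with missing quantities so they can be fixed at the source. The field names and alias table are invented for illustration:

```python
# A small, hypothetical data clean-up pass: normalize unit-of-measure
# strings, drop exact duplicates, and flag records with missing
# quantities so someone can fix them at the source system.

UOM_ALIASES = {"lbs": "lb", "pounds": "lb", "kgs": "kg", "kilograms": "kg"}

def clean_records(records):
    seen = set()
    clean, flagged = [], []
    for rec in records:
        uom = (rec.get("uom") or "").strip().lower()
        rec = {**rec, "uom": UOM_ALIASES.get(uom, uom)}
        if rec.get("qty") is None:
            flagged.append(rec)          # missing quantity: send back to owner
            continue
        key = (rec["sku"], rec["uom"], rec["qty"])
        if key in seen:
            continue                     # exact duplicate: drop it
        seen.add(key)
        clean.append(rec)
    return clean, flagged

raw = [
    {"sku": "A1", "uom": "LBS", "qty": 100},
    {"sku": "A1", "uom": "lb",  "qty": 100},   # duplicate after normalizing
    {"sku": "B2", "uom": "kg",  "qty": None},  # missing quantity
]
clean, flagged = clean_records(raw)
print(len(clean), len(flagged))
```

The flagged list is itself useful output: as noted above, identifying the issues is often meaningful on its own.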
Challenge #3: Developing graphics that tell the story at a glance with the push of a button
You have to work with your audience to determine what graphics work for them. I find it’s best to create something and then get feedback. This step can be a bit of trial and error, but once you have the design locked in, you need a skilled developer to automate the report-out. End users really appreciate being able to run the reports and generate the charts and graphs on demand with the push of a button. If it is complicated or requires many manual keystrokes to generate the charts and graphs, the report-out will not be sustainable.
Challenge #4: Making the data actionable
Congratulations on making it to this step. You have put so much effort into getting here, and now all you have to do is summarize thousands or even millions of data points across multiple parameters in a way that helps the receivers of the results take action. If you can monetize the results by showing costs or savings, the receivers of the output will have direction to either keep doing what they’ve been doing or an incentive to make a change. Or, if you can summarize the data into categories that are meaningful to the audience, they will know where to focus their time and energy to make improvements.
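One way to make the summary actionable is to collapse the detail into monetized categories, largest impact first. The categories and dollar figures below are invented for illustration:

```python
# Sketch: collapse many exception records into a monetized summary by
# category so the audience knows where to focus. Categories and dollar
# figures are made up for illustration.
from collections import defaultdict

exceptions = [
    {"category": "Premium freight", "cost_impact": 12500.0},
    {"category": "Premium freight", "cost_impact": 8300.0},
    {"category": "Underfilled containers", "cost_impact": 4100.0},
    {"category": "Off-plan sourcing", "cost_impact": 6700.0},
]

totals = defaultdict(float)
for e in exceptions:
    totals[e["category"]] += e["cost_impact"]

# Present the largest cost impacts first: that's where the action is.
for category, dollars in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{category}: ${dollars:,.0f}")
```

Ranked, monetized categories give the audience an immediate answer to “where do I spend my time?”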
Here is an example of a chart that answers the question: How good is my schedule? This chart, along with five other supporting charts, can be generated on demand in 30 seconds.
At Profit Point, we work with our clients to overcome the challenges with Data Visualization and develop Meaningful Supply Chain Metrics. Contact us at www.profitpt.com or at 610-645-5557 and we will be happy to assist you.
Here at Profit Point, we typically put in a fair amount of effort up front to scope out a project together with our client. This typically helps us and our client to set appropriate expectations and develop mutually agreeable deliverables. These are key to project success. But another key element to project success is getting good quality data that will allow our clients to make cost effective decisions from the analysis work we are doing or the software tool we are implementing.
Decision support models are notorious data hogs. Whether we are working on a strategic supply chain network design analysis, implementing a production scheduling tool, or building some other optimization model, they all need lots and lots of data.
The first thing we do (usually as part of our scoping effort) is identify each of the data types that will be required and the source of each. To do this, we start with what decisions need to be made and what data is required to make them successfully. From there we identify whether the data currently exists in some electronic form (such as an MRP system) or whether it will have to be collected and entered into some system (say, a spreadsheet or database program), and then figure out how the data will get into the tool we are developing.
Second, we try to get sample data from each data source as early as possible. This allows us to see if the assumptions made during the scoping effort were valid. There is nothing like getting your hands on some real data to see if what you and your team were assuming is really true! Often, looking at real data surfaces discoveries and revelations that require design decisions in order to meet the project deliverables.
Third, to help with data validation we find it extremely helpful to be able to visualize the data in an appropriate way. This could take the form of graphs, maps, Gantt charts, etc. depending on the type of data and model we are working on. On a recent scheduling project, we had the schedulers review cycle times in a spreadsheet but it wasn’t until they saw the data in Gantt chart form that they noticed problems with the data that needed correcting.
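The kind of problem a Gantt chart makes obvious, such as two runs booked on the same unit at overlapping times, can also be caught programmatically as part of data validation. A minimal sketch, with illustrative unit names and times in hours from the schedule start:

```python
# Sketch of the kind of data problem a Gantt chart makes obvious: two
# runs scheduled on the same unit at overlapping times. Unit names and
# times (hours from schedule start) are illustrative.

def find_overlaps(tasks):
    """tasks: list of (unit, start, end). Return overlapping pairs per unit."""
    overlaps = []
    by_unit = {}
    for unit, start, end in tasks:
        by_unit.setdefault(unit, []).append((start, end))
    for unit, runs in by_unit.items():
        runs.sort()
        for (s1, e1), (s2, e2) in zip(runs, runs[1:]):
            if s2 < e1:  # next run starts before the previous one ends
                overlaps.append((unit, (s1, e1), (s2, e2)))
    return overlaps

schedule = [("Reactor 1", 0, 8), ("Reactor 1", 6, 14), ("Reactor 2", 0, 10)]
print(find_overlaps(schedule))
```

Checks like this complement, rather than replace, the visual review: the schedulers in the example above only trusted the data once they could see it.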
Identifying data sources, getting data as early as possible and presenting the data in a visual form are absolutely required to make a project successful. Omitting any of these steps will at least add to the project’s cost and/or duration, and may doom the project to failure.
Several years ago I started collecting coins. I love the beauty of a nicely preserved coin; just looking at the year on the coin takes me back to that time and place in history.
In addition to the usual proof sets most numismatists collect, I also like to collect coins that reflect unique times in history such as US steel war pennies and Japanese occupation dollars.
A few years ago I started collecting hyperinflation currency – currency issued by a country during a time of hyperinflation. Hyperinflation is extremely rapid, out-of-control inflation – a situation where price increases are so extreme that the concept of inflation becomes essentially meaningless. Hyperinflation often occurs when there is a large increase in the money supply not supported by gross domestic product growth, resulting in an imbalance in the supply and demand for the money. This causes prices to increase as the currency loses its value. Soon even the largest denominations of the country’s currency have so little buying power that consumers need to bring a wheelbarrow of currency just to buy a loaf of bread. To respond to this, the government begins to issue larger and larger denomination bills. Finally, the denominations reach such ludicrous levels and have so little value that the currency totally collapses. One of the most famous examples of hyperinflation occurred in Germany between January 1922 and November 1923. By some estimates, the average price level increased by a factor of 20 billion over that period, with prices at the peak doubling every few days.
One of my favorite pieces of hyperinflation currency in my collection is a 100 trillion dollar bill from Zimbabwe, issued after the country’s inflation rate had reached 231,150,888.87% by 2008. Use of the Zimbabwean dollar as an official currency was effectively abandoned on April 12, 2009.
Venezuela is currently experiencing hyperinflation. According to estimates released by the International Monetary Fund, inflation in Venezuela is projected to increase 481% this year and 1,642% next year. To put that in perspective, in February of 2016 a McDonald’s Happy Meal in Caracas cost $146 at the official exchange rate of 6.3 bolivars per dollar.
So how does a country with more oil reserves than Saudi Arabia end up with armed guards on food delivery trucks, three-mile-long lines to buy basic food, people dying for lack of basic medical supplies and more poverty than Brazil?
1) Price Controls
As part of his Bolivarian Socialist Revolution, Chavez implemented price controls on goods. The government set the prices at which retailers could sell goods. If a business sold its goods for a higher price, the government would jail the owner and nationalize (seize) the business. As a result of these price controls, it cost farmers more to grow their products than they could sell them for, and it cost factories more to produce an item than they were allowed to sell it for. The logical conclusion followed: the farmers stopped growing crops and the manufacturing facilities stopped producing goods. The government’s response: jail the business owners and seize their factories and farms. The Venezuelan government was totally unqualified to run these factories and farms; as a result, they have all been shuttered. This led to a huge imbalance in trade, and Venezuela started to import almost everything, from basic foods to medical supplies. That works only as long as the government has the huge revenue required to support those kinds of subsidies.
2) Oil Prices have Fallen
For years, the country has been deeply dependent on its vast oil reserves, which account for 96 percent of export earnings and nearly half its federal budget. That was manageable when oil was selling at more than $100 a barrel. Venezuela now needs oil prices to reach $121 per barrel to balance its budget; however, oil is hovering around $50 per barrel. Add to that the fact that oil from Venezuela is very thick and difficult to refine, making it less desirable than light sweet crude such as Brent. This has forced Venezuela to import lighter oil to blend with its own to make it saleable in the current market.
3) Crippling Foreign Debt
Since 2005, Venezuela has borrowed $50 billion from China as part of an oil-for-loans agreement. Venezuela exports more than 600,000 barrels a day to China; however, nearly 50 percent of that goes to paying off its existing debt to China. The situation has gotten so bad that Venezuela is selling its gold reserves to pay foreign debt obligations.
4) Currency is in Freefall
Venezuela’s bolivar recently fell past 1,000 per U.S. dollar on the black market. That means the country’s largest denomination note, 100 bolivars, is now worth less than 10 U.S. cents. The currency has lost 81 percent of its value in the last 12 months. This makes a bad situation much worse for a country that imports almost every basic need. For comparison, to understand just how bad hyperinflation in Venezuela has become: a doctor working in the country’s national health care system makes $15 per month. As of this writing, a 1 kg bag of apples in Caracas costs $18, a liter of whole milk $5.14, and a 40” flat-screen TV $5,889 (U.S. dollars, assuming an exchange rate of 0.15748 USD per Venezuelan bolívar; the source is a crowd-sourced cost-of-living comparison site).
Sadly for the good people of Venezuela, it is almost inevitable that their currency, the bolivar, is destined for my hyperinflation currency collection. But what lesson can we as supply chain professionals take from this tragic situation? Perhaps that supply chains and markets must be free to find their own prices and values, and that a government that cannot even run a government properly certainly cannot run a factory.
Add Total Delivered Cost Variances to Manage Your Supply Chain
It is often said that you can only improve what you measure. To that end, there has been a lot of progress in performance tracking and activity-based costing over the past 10 years. With the advent of better activity-based costing, leading companies generate monthly manufacturing variance reports at a detailed and actionable level. However, this does not appear to be the case in the supply chains of many of those same companies. At the end of this post, I’ll recommend some specific supply chain metrics to guide your supply chain improvement.
We routinely find that many companies have a very limited understanding of their supply chain costs: what they are, where they come from, and why they’re happening. In a typical engagement with a new client, one of the first things we do is develop a picture of the supply chain’s current state with respect to flows, cost and service. We work with the client to gather all of the available information, which is all too often a formidable task, until we can assign the cost of each operation that touches a product or intermediate, from the time it is a raw material until it is delivered as a final product to the customer.
When the project team first presents the results to management, we invariably hear, “We don’t do that,” or “Those costs must be wrong.” Unfortunately, we sometimes hear, “There is no way we’re losing that much money at that customer.”
Clearly, there are times when the team learns something new and we have to adjust the costs. However, in the majority of cases we walk through the elements of the costs with management and the realization sets in that the numbers are correct and the costs really are that high. Now that we have all seen the true picture of the supply chain we can align on the effort required to improve it.
Supply chain managers, like their manufacturing counterparts, should demand ongoing metrics at the operational level that are actionable if they want to drive improvement in their supply chains. Reports that provide only the total freight spend, total warehouse spend or total person-hours worked in the supply chain vs. the plan don’t contain enough actionable information to drive performance.
I propose the following metrics as a starting point for managing the total delivered cost to the customer base and welcome your feedback on any metrics that I might have missed or that might replace one I’ve suggested.
Total Delivered Cost Supply Chain Metrics, a Start:
- Actual vs. target for shipping containers
- Actual loaded vs. the maximum allowable capacity for the commodity and shipping container combination
- Actual vs. planned cost-to-serve variance reports at the customer/product level of detail, with specific variances called out for:
  - Cost of Goods Sold (COGS)
  - Mode exception (shipped by a premium mode of transport vs. the planned mode)
  - Sourcing exception (shipped from a different location than the planned source)
  - Fill exception (the difference in cost if the shipping container had been filled to the maximum allowable capacity)
  - Volume variance (total volume shipped vs. the planned volume used to allocate fixed costs)
  - Mix variance (change in the mix of products shipped vs. the plan and its impact on cost)
  - Price variance (change in the price charged by carriers and other logistics service providers vs. the planned price)
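To make two of these variances concrete, here is a hedged sketch of a volume/price decomposition for a single freight lane. The loads and rates are illustrative; the useful property is that the two pieces reconcile exactly to the total cost variance:

```python
# A sketch of two of the variances above for one freight lane:
#   volume variance: extra/missing loads, costed at the planned rate
#   price variance:  rate change, applied to the actual loads
# Numbers are invented for illustration.

def freight_variances(plan_qty, plan_rate, actual_qty, actual_rate):
    volume_var = (actual_qty - plan_qty) * plan_rate    # cost of volume change
    price_var = (actual_rate - plan_rate) * actual_qty  # cost of rate change
    total_var = actual_qty * actual_rate - plan_qty * plan_rate
    # The two pieces reconcile exactly to the total cost variance.
    assert abs((volume_var + price_var) - total_var) < 1e-9
    return volume_var, price_var, total_var

# Lane planned at 100 loads @ $1,200; actually shipped 110 loads @ $1,250.
vol, price, total = freight_variances(100, 1200.0, 110, 1250.0)
print(vol, price, total)
```

The same decomposition pattern extends to mix variance once the data is broken out by product.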
With this set of metrics a supply chain manager should be able to quickly understand the reason for any changes in the total delivered cost to each customer, and thus the gross margin. Now that we can measure it, we can manage it.
The Future of Supply Chain Network Design
Most leading companies perform supply chain network design (SCND) analysis to define the structure of their supply chain as well as the key operations that will be performed at each location in the resulting network. The analysis includes suppliers, manufacturing, warehousing, and distribution. In fact, a number of Fortune 100 companies require such analysis before approving capital to add manufacturing or warehousing infrastructure. Those on the cutting edge are also using SCND analysis on a continual basis to understand the true delivered cost to supply each customer as well as the price required from the customer to achieve profitability goals. Advances in network design modelling and optimization have also opened the door to a broader analysis that focuses on the collective supply chains of all competitors in a marketplace and how best to supply the market at the lowest cost and maximum profit.
Twenty-five years ago, the optimization tools to analyze and strategically set one’s supply chain infrastructure were new and untested. Those companies on the cutting edge had the vision to employ this technology to gain competitive advantage. They analyzed the trade-offs among raw material, production, transportation, distribution and inventory costs to understand the most cost-effective way to meet their customer demand at higher customer service levels. In today’s world, what was once the domain of a few has become a “best practice” in supply chain management. Most supply chain managers are on the bandwagon and perform some sort of optimization-based supply chain analysis when considering major capital investment or key strategic initiatives, or when their network has grown too large and unwieldy through acquisition and growth. That does not mean that the world has caught up to the thought leaders; rather, the thought leaders continue to push the envelope and are using SCND to do more for their organizations than they did in the past.
In particular, there are two areas where the best supply chain managers are focusing their attention with regard to SCND. First, they are making SCND an evergreen business process that is fully integrated into all strategic and tactical decisions related to network infrastructure and product sourcing. Second, they are expanding the scope of their supply chain analysis not only to include their own supply chain network, but also to analyze the supply chain dynamics of their major competitors and how the entire market is being served.
Sustained Supply Chain Network Design
In too many cases, SCND analysis is a one-and-done process. The reality is that it’s often difficult to assemble the data required to perform the analysis, and this prevents companies from assessing ongoing risks and opportunities. Through a carefully designed data management process integrated with the right set of tools, leading businesses are putting in place sustainable business processes to continually revisit their supply chain network structure and operations.
Good data is the driver of good supply chain analysis. Many companies struggle to understand the true activity costs associated with one or more of the following: raw material supply, production, packaging, warehousing, distribution and inventory. When running a significant supply chain analysis and design project, the bulk of the time and effort is often spent gathering, organizing and validating the input data that drives the analysis. Unfortunately, too often that effort is wasted, as the data is used once and then forgotten. However, this need not be the case.
Those implementing best practices have extended their data warehouses to include the key activity-based operations and cost data used in strategic and tactical network optimization. The data is kept evergreen through continual data management processes and is always available for the next what-if scenario. These what-if scenarios might include:
• Short term: How best to tactically respond to unexpected events like strikes, weather disruptions and major capacity losses?
• Mid-term: How do I evaluate new sales opportunities for large additional volumes and how will these impact my ability to deliver across the supply chain?
• Long Term: How do I evaluate new merger and acquisition opportunities? How do I plan for capacity expansions?
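To illustrate the kind of trade-off such a what-if analysis weighs, here is a deliberately tiny sketch: choose which warehouses to open to serve three regions at minimum fixed-plus-delivery cost, by brute-force enumeration of every subset. Real SCND work uses mixed-integer optimization on far larger problems; every name and cost below is invented:

```python
# A toy network-design sketch: pick which warehouses to open to serve
# three demand regions at minimum fixed-plus-delivery cost. Brute-force
# enumeration is only feasible because the example is tiny; real SCND
# tools use mixed-integer optimization. All costs are invented.
from itertools import combinations

fixed_cost = {"WH-East": 500.0, "WH-Central": 650.0, "WH-West": 550.0}
# delivery cost per unit to each region from each candidate warehouse
delivery = {
    "WH-East":    {"East": 10.0,  "Central": 60.0, "West": 120.0},
    "WH-Central": {"East": 50.0,  "Central": 15.0, "West": 55.0},
    "WH-West":    {"East": 110.0, "Central": 60.0, "West": 12.0},
}
demand = {"East": 10, "Central": 8, "West": 6}

def network_cost(open_whs):
    """Fixed cost of the open sites plus cheapest-source delivery cost."""
    total = sum(fixed_cost[w] for w in open_whs)
    for region, qty in demand.items():
        total += qty * min(delivery[w][region] for w in open_whs)
    return total

best = None
for k in range(1, len(fixed_cost) + 1):
    for subset in combinations(fixed_cost, k):
        cost = network_cost(subset)
        if best is None or cost < best[1]:
            best = (subset, cost)
print(best)
```

With the data kept evergreen, re-running a model like this against a new scenario (a lost site, a new demand region) is a matter of minutes, not months.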
Companies that maintain the proper data and do not have to start from scratch on each new what-if analysis can use a tried-and-true process to respond more quickly and more accurately to the opportunities that continually present themselves.
Extending Supply Chain Network Design to Competitive Markets
If you have used optimization-based SCND to analyze a portion of your business in the past couple of years, then you are running with the herd. If you have implemented sustainable business processes to refresh and maintain the critical data and can run supply chain network what-if analyses at a moment’s notice, then you are running at the front of the herd. Those running way out in front are also using SCND analysis to understand the true delivered cost of supplying product to each customer and managing their business profitability accordingly.
Advances in network design modelling, optimization, and game theory have recently opened the door to a broader analysis that focuses on the collective supply chains of all competitors in a marketplace. These tools can be used to determine which customer/product combinations should be targeted, and at what price, to maximize profit. There are three key steps to accomplishing this.
1. Understand your own Total Delivered Cost to each customer.
Understanding your true total delivered cost to each of your customers enables you to determine the profit you are earning from each one. It also partially informs your pricing decisions, especially in competitive situations or when demand is greater than your ability to supply. Not only does this analysis determine profitability by customer, it also determines the impact of adding or dropping a customer, thus answering the question, “Even though it’s low-margin business, can we afford to lose the volume?”
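A minimal sketch of the calculation, with hypothetical customers, activity costs and prices: roll up every per-unit activity cost on the way to each customer and compare it to the price paid.

```python
# Sketch of total-delivered-cost-to-serve by customer: sum every
# activity cost a unit accrues from raw material to delivery, then
# compare to the selling price. All names and numbers are hypothetical.

cost_per_unit = {  # activity costs, $/unit
    "Customer X": {"raw material": 4.0, "production": 3.5,
                   "warehousing": 0.8, "freight": 2.2},
    "Customer Y": {"raw material": 4.0, "production": 3.5,
                   "warehousing": 1.1, "freight": 5.4},
}
price_per_unit = {"Customer X": 12.5, "Customer Y": 13.0}
volume = {"Customer X": 10000, "Customer Y": 6000}

for customer in cost_per_unit:
    tdc = sum(cost_per_unit[customer].values())   # total delivered cost/unit
    margin = (price_per_unit[customer] - tdc) * volume[customer]
    print(f"{customer}: TDC ${tdc:.2f}/unit, gross margin ${margin:,.0f}")
```

In this toy data, Customer Y’s freight cost pushes its total delivered cost above its price: exactly the “there is no way we’re losing that much money at that customer” situation the roll-up tends to expose.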
2. Estimate competitor costs for supplying to a shared set of customers
While your pricing is largely influenced by your own internal costs for producing, delivering and selling to your customers, it is also heavily influenced by market conditions and by the prices at which your competitors are able and willing to sell their competing products to the same customers. To understand the market dynamics, you need to be able to reasonably estimate your competitors’ costs. For example, if you are in an industry where transportation costs are significant, then regionally located manufacturing will have an impact on price and profitability. Understanding which customers are more profitable for you, and which are more costly for your competitors to serve, enables you to develop a winning strategy.
3. Use cutting edge optimization technology to model the competitive market
While most businesses are good at determining pricing and identifying profitable customers intuitively and on an ad hoc basis, few have put in place the rigorous business processes and analytics to do it accurately on a routine basis. This requires a deep understanding of your total delivered cost, your supply chain sourcing options, and the flexibility you have on both the cost and revenue sides of the equation. It also requires a good understanding of your competitors’ supply chains, and what they may or may not be able to do based on their own costs.
Supply chain network design optimization tools have become well integrated into the business decision-making processes at leading-edge companies. They are used to rigorously analyze and make the best decisions in response to both short-term events (such as weather disruptions, spot sales opportunities and capacity outages) and long-term strategy, such as capacity expansion or mergers and acquisitions. These analytical approaches and technologies have recently been extended to enable businesses to analyze not just their own operations, but the sum of multiple supply chains in the competitive marketplace. It is exciting work, and it is adding millions of dollars to bottom-line profit each year.
December 1st, 2015 5:11 pm Category: Distribution, Global Supply Chain, Green Network, Inventory Management, Network Design, Optimization, Optimization Software, Scheduling, Solver Optimization, Supply Chain Improvement, Supply Chain Optimization, Supply Chain Planning, Transportation, Vehicle Routing, Warehouse Optimization, by: Gene Ramsay
Profit Point has been helping companies apply mathematical techniques to improve their business decisions for 20 years now, and it is interesting to review some of the advances in technology over this time that have most enabled us to help our clients, including:
• The ability for companies to capture, store and access increasingly larger amounts of transaction and anecdotal data that quantify the behavior and motivation of customers, manufacturers, suppliers and other entities
• The improvement in analytical capabilities that help make optimized choices, in such areas as solving mixed-integer optimization problems, and
• The improvement of computing technology, allowing us to perform calculations in a fraction of the time required just a few years ago
A recent post on the Data Science Central website highlights the use of advanced techniques based on these advances by the on-line marketplace Amazon, which is generally acknowledged as one of the most tech-savvy companies on the planet. The post lists 21 techniques that Amazon uses to improve both its day-to-day operations and its planning processes, including supply chain network design, delivery scheduling, sales and inventory forecasting, advertising optimization, revenue/price optimization, fraud detection and many others. For a complete list see the link below:
Like Amazon, Profit Point is committed to using these techniques for the benefit of our clients – we have been concentrating on implementing business improvement for our clients, including optimization in various forms, since our very beginning. Are you, like Amazon, using the best methods to seize the opportunities that are available today?
Over the past week I’ve had two experiences that made me think about what’s required for a successful organizational change. The first was our CSCMP Roundtable tour of a family-owned food distribution company that had built a large, successful regional business by leveraging its founder’s focus on customer satisfaction and on valuing his employees as a cornerstone of the business. The company had recently been purchased by another family-owned company and was in the midst of a successful wholesale change in IT systems and work processes. Having seen many organizations struggle with such a large change, I asked our host about the secret of their organizational change. In a word, he said, “Culture.”
Immediately after the new owner had completed the purchase, they spent a lot of time reassuring employees that the values of the company wouldn’t change even though the way they did their jobs might change dramatically. In the end, the two companies’ cultures valued the same things: customer satisfaction and their employees. With that in mind, the change-management effort began as an inclusive effort with a clear set of goals for the new work processes. Not that there weren’t bumps in the road, but the two once-separate organizations were able to push towards the new way of doing business as a common team.
So what does that have to do with a bike ride on a windy day? That’s where the second experience of the week comes in. Over the weekend, I completed a two-day, 176-mile ride around Galveston Bay. Just like a good organizational change-management effort, the first day was preceded by a lot of training and preparation and accompanied by excitement and adrenaline. We had some tough slogs, particularly one 21-mile stretch directly into a 15 mph headwind. It was grueling, but we knew it was coming and grunted our way through it. But then came the pay-off: the headwind became a tailwind as we sailed down the coast on our way to the finish line for Day 1. Again, like an organizational change, we had some tough slogs, but our preparation paid off and we were able to celebrate as we hit our first big milestone.
The second day of the ride promised to be a tough one. We had already ridden 97 miles on the first day, winds were blowing at almost 20 mph, and they were forecast to be mostly in our faces all the way back to our Day 1 starting point. I knew it would be a challenging day, so I decided that speed was not important; just finishing was. In addition, I knew that I needed to find some like-minded riders so we could work together into the wind. Luckily, fate smiled upon me and I found a couple of riders taking the same approach to the ride. We teamed up, taking turns pulling from the front so the other two could draft, and waiting for each other when we had flat tires. We also got to celebrate when we turned away from the wind and had it at our backs for short stretches before turning into it again. The parallels to a successful organizational change jumped out at me.
- We made a realistic assessment of the challenges ahead
- We set goals that were within our reach, given the challenges
- We found allies with the same mind-set and worked as a team towards a common goal
- We celebrated success when we had a downwind leg
- We finished as a team
I hope to see you out on the road, be it organizational change or riding your bike into the wind. Good luck, and let me know if you need someone to help pull on the next ride.
Recent events during the summer of 2015 have exposed a major vulnerability in the supply chains of many U.S. manufacturers located in the industrial belt of the American Midwest. Iron ore, as well as many other bulk commodities such as grain and coal, is shipped from northern Minnesota and Michigan via vessels on Lake Superior, through the Soo Locks at Sault Ste. Marie, Michigan, and then south to the lower Great Lakes region. And for 20 days this past August, the vessels that normally transit the Soo choke point experienced long delays and backups because the two primary (and largest) locks were unusable or intermittently closed for maintenance.
The trouble started when the MacArthur Lock had to be shut down unexpectedly in early August because of a set of gates that did not close properly, diverting its normal traffic to the adjacent Poe Lock. This closure eventually lasted almost 20 days. Then, according to the newspaper USA Today, the Poe had to be briefly shut as well. (A third lock is still available, but it is functionally obsolete and rarely used these days.) With both locks out of commission or only sporadically open, 100 vessels were delayed at least 166 hours during the height of the summer shipping season: imagine the cost to shippers, as well as the disruptions on the receiving end of those shipments.
The Soo Locks are a critical link in the U.S. transportation network: according to the Detroit News, 3,985 ships hauling 77.5 million tons of iron ore, coal, grain and other cargoes transited the locks in 2014. A large part of the production in the Great Lakes region of the Midwest is directly or indirectly tied to the manufacture of steel and other basic commodities, which in turn rely on marine delivery of raw materials via the Great Lakes. The Soo Locks are so important that during World War II, troops were sent to guard them against sabotage.
Given the vulnerability of this critical asset, and the deteriorating state of the country’s infrastructure generally, you would think that Congress and the Army Corps of Engineers would be moving quickly to either build a new lock or modernize the existing ones. However, no significant institutional movement or progress is underway right now. Just how critical the Soo is becomes clear when you realize that only the 47-year-old Poe is big enough to handle the 1,000-foot vessels that today carry roughly 70% of the freight on the Upper Great Lakes. Any prolonged outage of this asset would have catastrophic consequences for many companies’ supply chains.
By now, we’ve all probably heard that there is a worldwide glut of crude oil.
This is due to many factors, of course, including increased production of oil and natural gas in North America (especially as a result of fracking), as well as the rising share of the overall energy marketplace that renewable sources have come to represent. And members of the OPEC cartel have made no secret that they are increasing, or at least maintaining, relatively high production levels so as to drive competitors out of the market and maintain their market share. The supply of petroleum on world markets is therefore high, driving down the price.
This oversupply of crude oil relative to demand has had a big impact on the supply chain for moving oil from supplier to customer. Over the past decade, there has been a huge increase in oil supply chain infrastructure. Trading companies have built vast storage facilities in order to insulate themselves from the high prices they experienced in the past. But with the current glut of petroleum, most of this on-shore storage capacity is full, and this has led to an interesting phenomenon. Some trading companies are being forced to use their marine transportation assets, i.e. oil tankers and supertankers, as floating tank farms. With the spot price of oil collapsed, it now makes economic sense to load the vessels without a definite destination or customer in mind and store the oil at sea.
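To make the economics concrete, here is a minimal, back-of-the-envelope sketch in Python of the breakeven test a trader might apply before parking crude on a tanker. All of the prices, charter rates and vessel sizes below are hypothetical, and a real trading desk would of course consider many more factors.

```python
# Illustrative breakeven check for floating oil storage (all figures hypothetical).
# Storing at sea makes sense when the futures-spot spread (contango) exceeds
# the cost of chartering the vessel plus financing the cargo while it floats.

def floating_storage_profitable(spot, future_price, months,
                                charter_cost_per_month, barrels,
                                annual_interest_rate):
    """Return True if storing `barrels` at sea for `months` beats selling now."""
    spread_per_barrel = future_price - spot
    charter_per_barrel = charter_cost_per_month * months / barrels
    financing_per_barrel = spot * annual_interest_rate * months / 12
    return spread_per_barrel > charter_per_barrel + financing_per_barrel

# Example: a 2-million-barrel supertanker, $40 spot, $45 six-month futures,
# a $1.2M/month charter, and 3% annual interest on the cargo value.
print(floating_storage_profitable(40.0, 45.0, 6, 1_200_000, 2_000_000, 0.03))  # → True
```

Note how sensitive the answer is: dropping the futures price by just one dollar (to $44) flips the decision, which is why these floating tank farms empty quickly once the spread narrows.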
Such a strategy, using transportation assets as de facto storage locations, is typical of any commodity market where the customer’s market power is much stronger than the producer’s. For example, this situation has long been typical of certain commodities delivered by rail, where customers simply leave product parked in cars on a rail siding until it’s needed.
Of course, over time, as normal market forces do their work, the relative bargaining positions of buyer and seller can shift. In the case of oil, various economic and political forces can quickly move the markets such that leaving tons of oil floating at sea in very expensive storage no longer makes economic sense. When that happens, those vessels will soon be put back to the purpose for which they were truly built.
One of our main activities at Profit Point is to help companies and organizations to plan better, to make informed decisions that lead to improvements such as more efficient use of resources, lower cost, higher profit and reduced risk. Frequently we use computer models to compare the projected results for multiple alternative futures, so that an organization can better understand the impacts and tradeoffs of different decisions. Companies can usually effectively carry out these types of processes and make decisions, since the CEO or Board of the entity is empowered to make these types of decisions, and then direct their implementation.
Infrastructure and resource allocation decisions must be made on a national and international basis as well, and are usually more difficult to achieve than within a company. An example of this today is the ongoing controversy in Southeast Asia regarding the use of water from the Mekong River in the countries through which it flows: China, Myanmar, Laos, Thailand, Cambodia and Vietnam.
For a map of the river and region, refer to the link below:
The Mekong River rises in the Himalayan Mountains and flows south into the South China Sea. For millennia the marine ecosystems downstream have developed based on an annual spring surge of water from snow melt upstream. The water flow volume during this annual surge period causes the Tonle Sap River, a Mekong tributary in Cambodia, to reverse flow and absorb some of the extra water, resulting in a large temporary lake. That lake is the spawning ground for much of the fish population in the entire Lower Mekong river basin, which is in turn the main protein source for much of the human population in those areas.
Now China has an ambitious dam construction program underway along the upper Mekong, and other countries (along with their development partners) are planning more dams downstream. Laos, for one, has proposed construction of eleven dams, with an eye towards becoming “The Battery of Asia”.
The challenge here is to find and implement a resource allocation tradeoff that meets multiple objectives, satisfying populations and companies that need clean water, countries that need electricity to promote economic development and fish that need their habitat and life cycle.
Multiple parties have developed measures and models that can help forecast the impact of different infrastructure choices and water release policies on the future Mekong basin. Let’s hope that the governments in Southeast Asia are able to agree on a reasonable path forward, and implement good choices for the future use of the river.
For more information here are a few examples of articles on the Mekong:
July 21st, 2015 2:58 pm Category: Global Supply Chain, Network Design, Operations Research, Optimization, Optimization Software, Profit Network, Publications, Supply Chain Agility, Supply Chain Optimization, Supply Chain Planning, Warehouse Optimization, by: Ted Schaefer
Profit Point’s recent article in Industry Today, “The Future of Supply Chain Network Design,” describes how to fully leverage the new advances in a traditional supply chain optimization process to include not only your internal supply chain, but the supply chains of your competitors, as well.
Supply chain network design optimization tools have become well integrated into modern business decision-making processes at leading edge companies. The tools are used to rigorously analyze and make the best decisions in response both to short-term events, such as weather disruptions, spot sales opportunities and utility outages, and to longer-term strategic issues, such as capacity expansion or mergers and acquisitions. These analytical approaches and technologies can be game changers. The newest versions of SCND tools have been expanded: businesses can now analyze not just their own operations, but also the sum of multiple supply chains in the competitive marketplace, creating a new way to integrate competitive risk into the design of your supply chain.
Please contact us if you’d like to learn more about new ways to leverage traditional ideas.
Are new products placing strain on your warehouse space and warehouse operations? Are increases in revenue from new products being offset by higher supply chain costs? Are you seeing increasing costs for the disposal of discontinued products? Are you experiencing significantly higher costs for specialty SKUs for specific channels or specific customers? If you answered yes to any of these questions, then it may be time to consider a process of optimizing your SKU portfolio.
SKU optimization is a critical process that companies need to develop and execute on an ongoing basis, at least annually. It combines analysis with the realities of the competitive marketplace to determine the merits of adding, retaining, or deleting items from a company’s product assortment.
It’s simply a systematic and consistent business process for analyzing, evaluating and deciding on how to manage your SKU portfolio in order to better align with your organization’s overall strategies, objectives and goals. An effective SKU optimization program lays the groundwork for important initiatives, such as capacity planning and price optimization. Benefits of this integrated, cross-functional business process include improved profitability, increased product availability (lower out of stocks) and increased labor and asset productivity.
Why is SKU optimization critically important? Research shows customers use only approximately 340 unique items per year in their households (down from 390 last year), from a pool of more than a million items sold (2014 ARM Research, Gartner). Many consumer product companies have seen an explosion in SKUs over the past decade but don’t have a process for evaluating the merits of individual SKUs. The vast majority of all SKUs become liabilities to an organization at some point in their individual lifecycles.
Smart, proactive companies establish a consistent and repeatable process to identify when that inflection point is reached and execute plans to capture as much profit as possible before discontinuing the item, subject to marketplace constraints and competitive factors. If you’re interested in learning more about SKU optimization, you may either download or view the following presentation on this topic.
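As a purely illustrative sketch of the analytical side of such a process, the short Python snippet below performs a simple Pareto (ABC) ranking of SKUs by profit contribution, flagging the long tail of items that contribute little. The products, profit figures and class thresholds are hypothetical; a real SKU optimization process would also weigh strategic, channel and lifecycle factors.

```python
# A minimal ABC (Pareto) classification of SKUs by profit contribution.
# "A" SKUs make up the first 80% of cumulative profit, "B" the next 15%,
# and "C" the remaining tail, which are candidates for deeper review.

def classify_skus(profit_by_sku, a_cut=0.80, b_cut=0.95):
    """Rank SKUs by profit and assign A/B/C classes by cumulative share."""
    total = sum(profit_by_sku.values())
    ranked = sorted(profit_by_sku.items(), key=lambda kv: kv[1], reverse=True)
    classes, cumulative = {}, 0.0
    for sku, profit in ranked:
        cumulative += profit
        share = cumulative / total
        classes[sku] = "A" if share <= a_cut else "B" if share <= b_cut else "C"
    return classes

# Hypothetical annual profit by SKU:
profits = {"bread": 500, "rolls": 300, "bagels": 120, "muffins": 50, "scones": 30}
print(classify_skus(profits))
```

Here the two top sellers land in class A, while the bottom two fall into class C, where the discontinue-or-retain question deserves the most scrutiny.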
April 22nd, 2015 12:30 pm Category: Global Supply Chain, Network Design, Operations Research, Optimization, Profit Network, Supply Chain Improvement, Supply Chain Optimization, Supply Chain Planning, Supply Chain Software, by: Gene Ramsay
At Profit Point, one of our specialties is network design analysis, answering such questions as:
• how many facilities a business needs,
• how large they should be and where they should be located, and
• how they should change over time.
We have performed this type of analysis for a range of companies across multiple industry types and geographical regions, and we have developed our own network design-focused software package to help us do this type of study. (And we teach folks how to use the software as well, so they can answer their own network design questions, if they want to pursue that.)
Our modeling “toolbox”, our Profit Network software, is designed to be flexible and data-driven, so that the user can focus attention on the key part of the supply chain where the questions must be answered, without having to define more detail than is really desired in other areas of the supply chain.
One of the key elements in many of the models we or our clients construct is the bill of materials. This data specifies the materials that are required to produce goods along the supply chain, be they intermediate materials or finished goods. For instance, if you are making a finished good such as a loaf of bread, the bill of materials would specify the quantities of flour, yeast, salt and other ingredients that would go into a batch.
To get trustworthy results from a model, it must require that the bill of materials (BOM) data be defined, and be used, in deriving the solution. (In some models we have encountered, the BOM is just a suggestion, or products can be created from thin air if the BOM data is not defined.)
The BOM logic must also be able to capture the reality of a situation. The BOM may need to vary from one machine to another within the same facility. Or it might need to vary over time – as an example, when agricultural or dairy products are ingredients to a manufacturing process, the ingredients might have different characteristics over the seasons of the year, thus requiring different input quantities over time to produce a consistent, standardized output.
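Continuing the bread example, here is a minimal sketch in Python of how BOM data might be exploded into material requirements, with a season-dependent recipe (as in the agricultural ingredients case above) and a hard failure when no BOM is defined, so material can never be created from thin air. All products and quantities are hypothetical.

```python
# A simplified BOM explosion with a time-varying recipe.
# Input quantities per unit of output are keyed by (product, season):
# the winter recipe needs more flour and yeast than the summer one.
bom = {
    ("bread", "summer"): {"flour_g": 500, "yeast_g": 7, "salt_g": 9},
    ("bread", "winter"): {"flour_g": 530, "yeast_g": 8, "salt_g": 9},
}

def material_requirements(product, season, units):
    """Explode the BOM for a production quantity; refuse to proceed if no
    BOM is defined, rather than silently producing material from nothing."""
    try:
        recipe = bom[(product, season)]
    except KeyError:
        raise ValueError(f"No BOM defined for {product} in {season}")
    return {ingredient: qty * units for ingredient, qty in recipe.items()}

print(material_requirements("bread", "winter", 100))
# {'flour_g': 53000, 'yeast_g': 800, 'salt_g': 900}
```

A full network model would apply this same explosion at every production step, so that the flow of intermediates and finished goods stays consistent across the whole supply chain.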
We work closely with our clients to ensure that our software is matched to their needs, and that it gives them the flexibility they need as their businesses change.
We just finished the fall soccer season in my home. I was thinking about watching my children play soccer when they were younger after a conversation with one of our consultants. He had just come back from visiting a prospective client where he was doing an assessment of their supply chain work processes and systems. Speaking frankly, this prospective client really did not have well defined work processes and certainly didn’t have systems implemented to enable good work processes. Mostly they seemed to run from one fire to the next, trying their best to tamp down the flames just enough to move on to the next crisis. Our consultant came back feeling dizzy from observing how they operated.
When my kids were younger and playing soccer, their style of play could be characterized as “kick and run”. They either didn’t understand the concept of trying to possess the ball or couldn’t execute this strategy. If you have the ball, you have the opportunity to score. If your opponent does not have the ball, they can’t score. It’s as simple as that. After watching my kids play on Saturday mornings with this “kick and run” style, I would really enjoy going to see a local college team play. They have won numerous national championships and play at a very high level. They understand and are able to execute the “possess the ball” style of play. It was always helpful to see how the game should be played and get my perspective straightened out.
Perhaps the “possessing the ball” analog in the operation of a supply chain is “possessing the key information.” In soccer, you have to get the ball to your attackers at the right time and in the right place in order to score. Likewise, in the supply chain, you have to get the right information to the right people at the right time to beat the competition. If you are feeling dizzy from fighting fire after fire (playing “kick and run”) in your supply chain operations and don’t seem to be making any progress on making things better and more stable, it would be our privilege to help assess where you are and work together to move your organization toward operating in championship form.
November 14th, 2014 9:45 am Category: Global Supply Chain, Green Network, Green Optimization, Network Design, Operations Research, Optimization, Optimization Software, Profit Network, Risk Management, Supply Chain Improvement, Supply Chain Optimization, Supply Chain Planning, Sustainability, by: Gene Ramsay
In developing a supply chain network design there are many criteria to consider – including such factors as the impact of the facility choices on
• cost of running the system,
• current and future customer service,
• ability to respond to changes in the market, and
• risk of costly intangible events in the future
to name a few.
Frequently we use models to estimate revenues / costs for a given facility footprint, looking at costs of production, transportation, raw materials and other relevant components. We also sometimes constrain the models to ensure that other criteria are addressed – a constraint requiring that DCs be placed so that 80% of demand be within a day’s drive of a facility, for instance, might be a proxy for “good customer service”.
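As a toy illustration of the customer service proxy described above, here is a sketch in Python that checks what fraction of demand lies within a day’s drive of at least one DC in a candidate footprint. The cities, demand volumes and drive times are invented for the example; a real model would embed this as a constraint in the optimization rather than check it after the fact.

```python
# Check the "80% of demand within a day's drive" service proxy for
# candidate DC footprints. All locations and numbers are hypothetical.

DAY_DRIVE_HOURS = 10.0

demand = {"Atlanta": 120, "Dallas": 90, "Denver": 60, "Seattle": 30}
drive_hours = {  # hours from each candidate DC to each demand point
    "Memphis":  {"Atlanta": 6,  "Dallas": 7,  "Denver": 16, "Seattle": 33},
    "SaltLake": {"Atlanta": 28, "Dallas": 18, "Denver": 8,  "Seattle": 9},
}

def coverage(dcs):
    """Fraction of total demand within a day's drive of at least one DC."""
    total = sum(demand.values())
    covered = sum(
        volume for city, volume in demand.items()
        if any(drive_hours[dc][city] <= DAY_DRIVE_HOURS for dc in dcs)
    )
    return covered / total

print(coverage(["Memphis"]))               # 0.7 -- fails the 80% target
print(coverage(["Memphis", "SaltLake"]))   # 1.0 -- meets it comfortably
```

A single Memphis DC covers only Atlanta and Dallas (70% of demand), so the 80% constraint would force the model to open a second, western facility even if cost alone did not.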
Some intangibles, such as political risk associated with establishing / maintaining a facility in a particular location, are difficult to measure and include in a trade off with model cost estimates. Another intangible of great interest for many companies, and that has been difficult to make tangible, is water risk. Will water be available in the required quantities in the future, and if so, will the cost allow the company to remain competitive? For many industry groups water is the most basic of raw materials involved in production, and it is important to trade off water risk against other concerns.
As I wrote in a previous blog published in this forum,
There are several risks that all companies face, to varying degrees, as global water consumption increases, including
• Physical supply risk: will fresh water always be available in the required quantities for your operations?
• Corporate image risk: your corporate image will likely take a hit if you are called out as a “polluter” or “water waster”
• Governmental interference risk: governmental bodies are becoming increasingly interested in water consumption, and can impose regulations that can be difficult to deal with
• Profit risk: all of the above risks can translate to a deterioration of your bottom line.
The challenge has been: how to quantify such risks so that they can be used to compare network design options.
Recently a post entitled “How Much is Water Worth” on LinkedIn highlighted a website developed by Ecolab that offers users an approach to monetization of water risks. This website allows the user to enter information about their current or potential supply chain footprint – such as locations of facilities and current or planned water consumption – and the website combines this information with internal information about projected GDP growth for the country of interest, the political climate and other factors to calculate a projected risk-adjusted cost of water over the time horizon of interest.
This capability, in conjunction with traditional supply chain modeling methods, gives the planner a tool for developing a more robust set of information for decision-making.
For more details visit the website waterriskmonetizer.com
October 17th, 2014 1:30 pm Category: Global Supply Chain, by: Karen Bird
In August 2013, I took a course at Stanford Graduate School of Business focused on Supply Chain Strategies and Leadership. Here are my takeaways of the Very Important Principles and People (VIPs) of Supply Chain Strategies and Leadership. We learned about The Triple A Supply Chain (Agility, Adaptability and Alignment), Vertical Integration, Postponement, Big Data, Value Chain Ethics and Sustainable Supply Chains. Each of these topics deserves a blog of its own. The bottom line is that there is no cookie-cutter strategy that fits every business. It is a combination of people and ideas, and a willingness to innovate based on what’s happening in your business at the time. However, understanding these concepts, and studying companies that have successfully leveraged them as well as those that failed, gives us an idea of what may or may not work for a business or industry.
My class was made up of 40 students from 17 different countries and the industries represented were fashion, technology, chemicals, pharmaceuticals, transportation, logging and wine to name a few. We were given 20 case studies to read prior to the class starting and questions to answer for each case study. On the first day of class, we were assigned study groups of five people that reflected the diversity of the class. I learned as much from this part of the class as I learned from the professors and we had fun in the process.
The professors were all outstanding because they have theoretical as well as practical experience. A few examples were Professors Hau Lee, Bill Barnett and Michael Marks. Professor Lee was the Director of the program and taught us about The Triple A Supply Chain as well as many of the other Supply Chain Strategies. See how Professor Lee kept us engaged and why we call him Professor Bullwhip: http://youtu.be/b0dExw3es40. Professor Barnett focused on Leadership and is a prolific blogger. Here is one of his blogs on Leading by Design: http://www.barnetttalks.com/2014/07/leading-by-design.html. Finally Michael Marks, a founding Partner in Riverwood Capital and the former CEO of Flextronics, took us through some Supply Chain Ethics case studies as well as case studies on companies that demonstrate Innovation.
It was one of the greatest educational experiences that I have had. I am still thinking about it more than a year later. In my blogs over the next few months I will share:
- The Top 3 Strategies that the Top Supply Chain Companies are Using
- A More In-depth Discussion of the Very Important Principles of Supply Chain Strategies
July 17th, 2014 5:04 pm Category: Global Supply Chain, Green Network, Network Design, Optimization Software, Supply Chain Agility, Supply Chain Improvement, Supply Chain Optimization, Sustainability, Transportation, by: Gene Ramsay
Many of our activities at Profit Point are focused on helping clients identify and implement changes that improve the efficiency of existing supply chain networks, ranging from planning to operations and scheduling. In the short term we are usually trying to find ways to use existing capabilities more effectively, but over longer time horizons supply chains evolve to develop new links, and these must be considered as you plan.
One instance of this evolution was described by my colleague, John Hughes, who recently wrote about the rise of a “New Silk Road”– a rail network stretching through Western China, Kazakhstan, Russia and Belarus to Europe – used for transporting manufactured goods from Asia to meet demand in Europe.
But Asia has a complementary demand that must be met for its manufacturing systems to function: the demand for energy to power its factories and cities. The growing worldwide demand for energy, and for faster routes to market, is opening up another new link in the global trade routes – the Northern Sea Route, a maritime route connecting Pacific ports with Europe via the Arctic.
Lawson Brigham, professor of geography and Arctic policy at the University of Alaska Fairbanks, was recently quoted on the arcticgas.gov website as saying “What’s really driving the Northern Sea Route is global commodity prices and natural resource development, particularly in Russia.”
The northern reaches of the earth are currently hotbeds of energy development, and much of the activity is focused on adding Liquefied Natural Gas (LNG) production capacity. Projects are on-line or in progress stretching from the North Slope in Alaska to the Yamal Peninsula in Siberia to Hammerfest in Norway. The Northern Sea Route offers quicker shipments of many of these supplies to major Asian ports, shaving ten to twenty days off one-way transit times from Russia and Norway to ports in Korea and China, compared to routes through the Suez Canal.
Climate change has made these routes generally ice-free for several months of each year, and thus more cost effective, but ice-strengthened cargo ships, with icebreaker support, are still required to keep the route open in the colder months, thus driving up the costs.
Supply chain planning activities on a global scale will over time need to expand to consider the potential impact of these types of shipping options. Keep an eye out for this and other new links in the global chain as they become available – change is inevitable.
For more information on this route, see articles like these:
April 16th, 2014 10:21 am Category: Global Supply Chain, Network Design, Operations Research, Optimization, Optimization Software, Profit Network, Supply Chain Optimization, Supply Chain Planning, Supply Chain Software, by: Gene Ramsay
Recently I had the opportunity to speak to an operations management class for MBA students in the Goizueta Business School at Emory University. The class is intended to give the students an introduction to a variety of problems that they might encounter during their careers, and to management science techniques that might be applied to them, using Excel as a solution platform. The professor had asked me to address the course topic from the point of view of one who had used these methods in the real world, and I was glad to do so, recounting my work in supply chain network design, hydro power generation scheduling, routing of empty shipping containers, natural gas supply contract management and various other problems.
During Q&A one of the students asked how a company should determine the appropriate source of resources to use for solving these types of problems – should it be in-house expertise or an outside consultant?
As I told him, this depends on a number of factors, and I gave an example based on our experience: In our practice we perform supply chain network design studies, and we also license the network design software that we use to our clients, if they desire. A number of clients have engaged us to first do an analysis for them, and then they have licensed the software so that they can perform future projects themselves, using our initial project as a base. Many of these clients have used the software very effectively.
Those that have been most successful at using the software in-house, and at performing management science projects in-house in general, have several common characteristics:
- They are committed to tactical and strategic planning as tools for meeting their business goals,
- They have enough work in this area, and related areas, to keep an analyst or group of analysts busy full time, due to such factors as
- The scale and scope of their operations
- The speed of innovation in their industry
- The level of complexity of their supply chain and variety of products made, and
- Their desire for a “continuous improvement” approach as opposed to a “one-time reorganization” approach
- They have a commitment to maintaining personnel who
- have the proper skills and training to address these problems, and
- are allowed the time to work on these problems, rather than being constantly pulled off for “firefighting” short term or operational problems.
Most companies can make good use of management science solution methods, but, as you think about how to do this, try to make a realistic determination of your internal priorities, so you can decide between insourcing and outsourcing, or a mixture of the two.
Additive manufacturing or 3D printing is a process of making three-dimensional solid objects from a digital model. It is achieved by laying down successive layers of material, as opposed to the traditional machining techniques of removing material by drilling and cutting. 3D printing is usually performed by a materials printer using digital technology.
Taking a digital image of a toy and printing out a near-perfect replica of it seems sci-fi and surreal, but rapid technological advances in 3D printing make this and more possible. Printing metal parts with increased strength makes the machines even more viable and cost-effective in manufacturing. Additionally, an entire part can be 3D printed in a single machine, eliminating multiple touch points in traditional manufacturing and reducing failures. The newest futuristic trend in 3D printing is to go huge: using robotics to deposit building materials in an orchestrated and precise way to build large structures made up of tons of interconnecting parts.
3D printing is a reality. A recent Forbes magazine article, “What Can 3D Printing Do? Here are 6 Creative Examples” lists several ways in which 3D printing has been used:
- In 2012, doctors from the University of Michigan developed a tracheal splint made from a polymer, created directly from a CT scan of a baby’s trachea and bronchus, using an image-based computer model and laser-based 3D printing to produce the splint.
- Both General Motors and Ford Motor Company have used 3D printing to make prototypes of vehicle parts used in testing and design.
- NASA has recently used 3D printing to make a rocket engine injector and used it in major hot-fire testing.
- Defense Distributed, a high tech gunsmith group, created the world’s first 3D printed gun called the “Liberator”.
- Prosthetics including a 3D printed bionic ear created by Princeton University scientists have been developed.
Although 3D printing has been around since the 1980s, a differentiating trend has emerged this year that could make 2014 pivotal: 3D printing machines are now being used to manufacture a large variety of consumer products, not just heavy machinery and structural components such as aircraft parts. The printers are expensive, and the 3D models required for printing are difficult for most people to create, but a mainstream breakthrough in 3D printing could come in the near future as printers become cheaper and easier to use.
What could the Supply Chain of tomorrow look like if and when 3D printing takes off? It has the potential to transform certain parts of manufacturing and supply chains over the long term. Traditional supply chains are often characterized by mass production of products driven by forecasts and pushed to customers through a warehouse distribution network, with long lead times, high transportation costs and large carbon footprints. A 3D supply chain would be distinguished by customized production, “pulled” by customer demand, locally printed and distributed, with short lead times, low transportation costs and a low carbon footprint. It would create demand for smaller factories that take offshore manufacturing and bring it close to the consumer. Goods would be cheaper to produce domestically than to manufacture offshore and ship from low-wage countries. Because new technologies currently being developed would result in a significant proportion of manufacturing becoming automated, large and costly workforces would be reduced. In addition to distribution cost reductions, storage requirements would also be reduced, since products could be made quickly in response to demand as opposed to meeting service levels via inventory and safety stocks.
Although it is a huge leap to go from printing a single object on a 3D printer to replacing an entire manufacturing enterprise and thus allowing any business or individual to become its own homegrown factory, Gartner Group calls it the “beginning of the Digital Industrial Revolution which threatens to reshape how we create physical goods”. If that “threat” becomes reality, then it promises to reshape how we consider and optimize our current Supply Chain.