Here at Profit Point, we typically put in a fair amount of effort up front to scope out a project together with our client. This helps both us and our client set appropriate expectations and develop mutually agreeable deliverables, which are key to project success. But another key element of project success is getting good quality data that will allow our clients to make cost-effective decisions from the analysis work we are doing or the software tool we are implementing.
Decision support models are notorious data hogs. Whether we are working on a strategic supply chain network design analysis, implementing a production scheduling tool or building some other optimization model, they all need lots and lots of data.
The first thing we do (usually as part of our scoping effort) is identify each of the data types that will be required and the source of each. To do this we start with the decisions that need to be made and the data required to make them successfully. From there we identify whether the data currently exists in some electronic form (such as an MRP system) or will have to be collected and entered into some system (say a spreadsheet or database program), and then figure out how the data will get into the tool we are developing.
Second, we try to get sample data from each data source as early as possible. This allows us to see if the assumptions that were made as part of the scoping effort were valid. There is nothing like getting your hands on some real data to see if what you and your team were assuming is really true! Often there are some discoveries and revelations that are made by looking at real data that require design decisions to be made to be able to meet the project deliverables.
Third, to help with data validation we find it extremely helpful to be able to visualize the data in an appropriate way. This could take the form of graphs, maps, Gantt charts, etc. depending on the type of data and model we are working on. On a recent scheduling project, we had the schedulers review cycle times in a spreadsheet but it wasn’t until they saw the data in Gantt chart form that they noticed problems with the data that needed correcting.
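As a side note, some of the data problems that jump out of a Gantt chart, such as runs that overlap on the same line, can also be flagged programmatically as part of data validation. Here is a minimal sketch; the task data and field names are invented for illustration:

```python
# Hypothetical sketch: flag overlapping runs on the same resource,
# the kind of data problem a Gantt chart makes visually obvious.

def find_overlaps(tasks):
    """Return pairs of task names that overlap in time on the same resource."""
    overlaps = []
    by_resource = {}
    for t in tasks:
        by_resource.setdefault(t["resource"], []).append(t)
    for runs in by_resource.values():
        runs.sort(key=lambda t: t["start"])
        for a, b in zip(runs, runs[1:]):
            if b["start"] < a["end"]:  # next run begins before previous ends
                overlaps.append((a["task"], b["task"]))
    return overlaps

tasks = [
    {"task": "Batch A", "resource": "Line 1", "start": 0, "end": 8},
    {"task": "Batch B", "resource": "Line 1", "start": 6, "end": 12},  # overlaps A
    {"task": "Batch C", "resource": "Line 2", "start": 0, "end": 5},
]
print(find_overlaps(tasks))  # -> [('Batch A', 'Batch B')]
```

A check like this complements, rather than replaces, the visual review: the chart is what lets schedulers spot problems nobody thought to write a rule for.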
Identifying data sources, getting data as early as possible and presenting the data in a visualized form are absolutely required to make a project successful. Omitting any of these steps will at least add to the project cost and / or duration or possibly doom the project to failure.
We just finished the fall soccer season in my home. I was thinking about watching my children play soccer when they were younger after a conversation with one of our consultants. He had just come back from visiting a prospective client where he was doing an assessment of their supply chain work processes and systems. Speaking frankly, this prospective client really did not have well defined work processes and certainly didn’t have systems implemented to enable good work processes. Mostly they seemed to run from one fire to the next and tried to do their best in tamping out the flames enough to be able to move onto the next crisis. Our consultant came back feeling dizzy from observing how they operated.
When my kids were younger and playing soccer, their style of play could be characterized as “kick and run”. They either didn’t really understand the concept of trying to possess the ball or couldn’t execute this strategy. If you have the ball, you have the opportunity to score. If your opponent does not have the ball, they can’t score. It’s as simple as that. After watching my kids play on Saturday mornings with this “kick and run” style, I would really enjoy going to see a local college team play. They have won numerous national championships and play at a very high level. They understand and are able to execute this “possess the ball” style of play. It was always helpful to see how the game should be played and get my perspective straightened out.
Perhaps the “possessing the ball” analog in the operation of a supply chain is “possessing the key information.” In soccer, you have to get the ball to your attackers at the right time and in the right place in order to score. Likewise, in the supply chain, you have to get the right information to the right people at the right time to beat the competition. If you are feeling dizzy from fighting fire after fire (playing “kick and run”) in your supply chain operations and don’t seem to be making any progress on making things better and more stable, it would be our privilege to help assess where you are at and work together to move your organization toward operating in championship form.
This month, Supply Chain Management Review is featuring a 3-part series by Dr. Alan Kosansky and Michael Taus of Profit Point entitled Managing for Catastrophes: Building a Resilient Supply Chain. In this article we discuss the five key elements to building a resilient supply chain and the steps you can take today to improve your preparedness for the next catastrophic disruption.
Once a futuristic ideal, the post-industrial, globally-interconnected economy has arrived. With it have come countless benefits, including unprecedentedly high international trade, lean supply chains that deliver low cost consumer goods and an improved standard of living in many developing countries. Along with these advances, this interdependent global economy has amplified collective exposure to catastrophic events. At the epicenter of the global economy is a series of interconnected supply chains whose core function is to continue to supply the world’s population with essential goods, whether or not a catastrophe strikes.
In the last several years, a number of man-made and natural events have led to significant disruption within supply chains. Hurricane Sandy closed shipping lanes in the northeastern U.S., triggering the worst fuel shortages since the 1970s and incurring associated costs exceeding $70 billion. The 2011 earthquake and tsunami that struck the coast of Japan, home to the world’s 3rd largest economy representing almost nine percent of global GDP, caused nearly $300 billion in damages. The catastrophic impact included significant impairment of country-wide infrastructure and had a ripple effect on global supply chains that were dependent on Japanese manufacturing and transportation lanes. Due to interconnected supply chains across a global economy, persistent disruption has become the new norm.
Are you ready to build a resilient supply chain?
Call us at (866) 347-1130 or contact us here.
March 6th, 2014 9:32 am Category: Operations Research, Optimization, Optimization Software, Profit Network, Profit Vehicle Planner, Profit Vehicle Router, Supply Chain Improvement, Supply Chain Optimization, Supply Chain Planning, by: Jim Piermarini
In recent weeks, I have been thinking about testing our applications, like our popular Profit Network or Profit Vehicle Planner. When we test, we run data sets that are designed to stress the system in different ways, to ensure that all the important paths through the code are working properly. When we test, our applications get better and better. There are many good reasons to test; most importantly, to know that an improvement in one part of the code does not break a feature in a different part of the code.
I have been thinking about how we could test our code a bit more, and the means by which we could do that. I have been reading about automated testing and its benefits. They are many, but the upshot is that if the testing is automated, you will likely test more often, and that is a good thing. Automating application testing requires the ability to churn out runs with nobody watching. And to do that, the application needs to be able to be kicked off and run with no buttons or dialog boxes that must be manually clicked to continue. There can be no settings that must be manually set, or information reviewed to decide what to do next. In addition, the application must then save the results somewhere: in the instance of the application, in a log file, or in a database of some sort. Then finally, to really be testing, the results must be compared to the expected results to determine the pass/fail state of the test. This requires having a set of expected results for every test data set.
Looking at the process above, I see numerous similarities to the process used to run a sensitivity analysis, in that many runs are typically made (so automation is a natural help) and the results need to be recorded. Sensitivity analysis is a typical process for users of our Profit Network tool and our Profit Planner and Profit Scheduler tools. An additional step in sensitivity analysis, however, is that you may want to change the input data in a systematic way (say, Demand +5% and Demand -5%), and to the extent that the change is indeed systematic, it too could be folded into the automation. The results analysis is different as well: here you would like to look across the final sets of results at the differences, while in testing you just compare one set of test results to its expected results. I can foresee difficulty in automating the data changes, since each type of data may need to be changed in a very specific way. Nevertheless, even if the data changes are manual, they could be prepared ahead of time, and the runs themselves could be grouped in a batch to generate the results needed for a sensitivity analysis.
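The batch-run idea above can be sketched in a few lines. Here `run_model` is a stand-in for a real application run, and the data and scenario multipliers are invented for illustration:

```python
def run_model(data):
    # stand-in for a real application run; trivial "model" for illustration
    return {"profit": data["demand"] * data["price"] - data["fixed_cost"]}

base = {"demand": 1000, "price": 2.0, "fixed_cost": 500}
scenarios = {"base": 1.00, "demand_up_5": 1.05, "demand_dn_5": 0.95}

results = {}
for name, factor in scenarios.items():
    data = dict(base, demand=base["demand"] * factor)  # systematic data change
    results[name] = run_model(data)                    # unattended run, logged

# testing mode: compare one run against its expected results
expected = {"profit": 1500.0}
assert results["base"] == expected, "regression test failed"

# sensitivity mode: look across the runs at the differences
print({name: r["profit"] for name, r in results.items()})
```

The same loop serves both purposes: in testing mode each run is checked against a stored expectation, while in sensitivity mode the results are collected side by side for comparison.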
Constructing a harness that lashes up to an application, where you can define the number of runs to be made, the settings for each run, the different data sets to be used, and the output location for the results to be analyzed, would be useful not only for testing but for the type of sensitivity analysis we do a lot of here at Profit Point.
I am going to encourage our developers to investigate this type of a system harness to be able to talk to and control our applications to be able to run them automatically, and have their results automatically stored in a data store for either test or sensitivity analysis.
Jim Piermarini | CEO Profit Point Inc.
February 19th, 2014 3:51 pm Category: Supply Chain Optimization, by: Karen Bird
The economy has been slow to recover after the Great Recession of 2008; however, many economists believe that 2014 and 2015 will be strong years for the US and global economies. How accurate will your forecasting model be in projecting the supply needed from your business? Since forecasting models typically use two to three years of history (actual sales) to predict the future, and we are coming out of a down economy and heading toward positive growth, the standard forecasting models will not predict the future very well. This is where human intelligence and companies with a formal Sales and Operations Planning (S&OP) process have an advantage.
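To see why, consider a toy example: a naive model that forecasts next year as the average of the last three years of actuals keeps under-calling demand for the entire recovery. The figures below are invented for illustration:

```python
# Illustrative only: a 3-year-average forecast built on recession-era
# sales keeps missing low once growth resumes -- the gap a formal
# S&OP review is meant to catch. All numbers are made up.

history = [100, 90, 85]      # three down years of actual sales
recovery = [95, 105, 115]    # actual sales as growth resumes
misses = []

for actual in recovery:
    forecast = sum(history[-3:]) / 3   # naive 3-year-average model
    misses.append(actual - forecast)
    print(f"forecast {forecast:.1f} vs actual {actual} (miss {actual - forecast:+.1f})")
    history.append(actual)

# the model falls further behind in every year of the recovery
print([round(m, 1) for m in misses])  # -> [3.3, 15.0, 20.0]
```

The miss grows each year because the history the model averages over is still dominated by the downturn; field intelligence brought into an S&OP meeting can correct for exactly this kind of structural shift.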
A formal S&OP process gives companies a monthly opportunity for their Sales and Operations teams to come together and review the data, latest intelligence from the field and make the best decisions possible for the company. In addition, a formal S&OP process gives the business a forum each month to challenge the current execution plan and either reconfirm or adjust the plan to meet the strategic goals of the company. A monthly review of key forecasting metrics can provide the Sales team with valuable feedback regarding the forecast.
Read the Profit Point S&OP Research Report
A study by the Aberdeen Group shows that greater than 60% of Best-In-Class companies view a formal S&OP process as a strategic priority for their organization and that Best-In-Class companies hold an 18 point advantage in forecast accuracy. According to an AMR Research study from 2008, companies that are best at demand forecasting average:
- 15% less inventory
- 17% higher perfect order fulfillment
- 35% shorter cash-to-cash cycle times
- 1/10 the stock-outs of their peers
How does a formal S&OP process help to deliver these benefits? It is a combination of getting the right people together to make the right decisions at the right time. A few years ago, Thomas Wallace, author of Sales and Operations Planning: The How-To Handbook, initiated a project to study the experiences of companies using Executive S&OP very well. The companies in the Best Practices Project cited hard benefits similar to those listed above, but they also said “the soft benefits are equal in importance, or perhaps greater, than the hard benefits”. The soft benefits most often cited were:
- Enhanced Teamwork
- Embedded Communications
- Better Decisions
- Better Financial Plans
- More Focused Accountability
- Greater Control
- A Window Into The Future
A well run S&OP process will put a spotlight on problem areas or gaps in your business 18 – 24 months in the future. This allows the team to collectively see a potential problem or upside opportunity and produce scenarios to help the company to react to them in a timely and efficient manner.
One of the pleasures of working at Profit Point is having the occasion to reflect on and write a blog about a subject that is of interest to me. Taking time to reflect on what is important, especially now around Thanksgiving and the holiday season, has rewards of its own; but the business we are in, helping people make their supply chains better, brings rewards unique to our profession. I am pleased and thankful to be a part of it.
At the grand scale, improving supply chains improves the very heart of most businesses. It reduces waste, reduces cost, increases efficiency and profit, and reduces the detrimental impact of commerce on our world, including reducing our carbon footprint and land fill waste as well as promoting the most efficient use of our natural resources and raw materials. Profit Point routinely has an enormous impact on the business of our customers, and I personally am pleased and thankful to be a part of it.
On a more human level, the people at our customers who interact with Profit Point personnel on our projects come away with a profound sense that they can make a difference in their company; the impact of our network studies, scheduling work and other projects demonstrates the ability of a better business process to be rolled out and actually change the business for the better. Once involved in a successful project, many of these people move up in their organizations to lead other successful business change projects. Nothing succeeds like success! I am pleased and thankful to be a part of it.
Reflecting on our part in the business world as small actors on a large stage, it is rewarding to be able to know that we have made some difference.
Wishing you the same peace of mind this holiday season, Jim Piermarini, CEO Profit Point Inc.
October 23rd, 2013 9:00 am Category: White Papers, by: Editor
Today, smart manufacturers view the supply chain as a critical element for gaining competitive advantage. Leading companies have long since globalized their manufacturing and distribution operations. They rely heavily on enterprise resource planning (ERP) platforms to track and record virtually every transaction that occurs in the supply chain – from raw materials sourcing to point-of-sale sell-through. Without doubt, the efficiencies that have accrued through ERP are significant. When one accounts for reduced inventory, carrying costs, labor costs, improvements to sales and customer service, and efficiencies in financial management, the tangible cost savings to enterprises have been estimated to range from 10 to 25% or more. Global and multinational concerns have reorganized themselves – through ERP standardization – to create a competitive advantage over regional manufacturers.
While this ERP standardization has created an advantage for larger concerns, leading supply chain managers are discovering new ways to improve beyond ERP’s limitations. In essence, these supply chain ‘disruptors’ are seeking new ways to separate themselves from the pack. The functional areas and tools used by these disruptors vary widely – from long-term global supply chain network design to near-term sales and operations planning (S&OP) and order fulfillment; and from relatively simple solver-based spreadsheets to powerful optimization software deeply integrated into the ERP data warehouse.
At Profit Point, we believe that continued pursuit of supply chain improvement is great. We believe that it is good for business, for consumers and for the efficient use (and reuse) of resources around the globe. In this survey, we set out to explore the methods, tools and processes that supply chain professionals utilize to improve upon their historical gains and to gain competitive advantage in the future. You can request a copy of the report here.
We welcome your feedback. Please feel free to contact us or leave a comment below.
This month’s IndustryWeek features an article by Alan Kosansky and Ted Schaefer entitled Margin-based Supply Chain Optimization.
“To effectively implement margin-based supply chain optimization, it is important to have three key components in place: data, optimization technology and alignment with strategic business objectives.
Margin-based supply chain optimization is a new business process based on two key business priorities: 1) the desire to deliver more high profit products to customers, and 2) the ability to stop serving customers and products with low profit yield. This supply chain decision support process quantitatively shows companies which customers to serve and what products to produce in order to maximize profit and margin. For companies with complex supply chain operations, this is often easier said than done. Recent advances in the availability of data and optimization modeling, however, enable a growing number of companies to implement more efficient and effective supply chain systems.
A company’s portfolio of customers and products typically changes more quickly than the assets used to meet the customer demand. These situations include changes in the macro-economic environment that precipitate significant increases or decreases in customer demand, shifts in a company’s product portfolio, development of new markets, or changes in the cost to produce and/or deliver products or services. In each scenario, margin-based supply chain optimization is a key tool to help companies manage supply to achieve maximum profitability.
To effectively implement margin-based supply chain optimization, it is important to have three key components in place. They are: data, optimization technology and most importantly, alignment with strategic business objectives.”
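As a deliberately simplified illustration of the idea in the article, the sketch below ranks customer/product combinations by margin per unit of scarce capacity and serves the best first. The data is invented, and a real implementation would use an optimization (LP/MIP) model rather than this greedy shortcut:

```python
# Toy margin-based selection: with limited capacity, serve the orders
# that earn the most margin per hour of the scarce resource.
# All data is hypothetical; real systems solve this with LP/MIP.

orders = [
    {"customer": "A", "product": "X", "margin": 120.0, "hours": 4},
    {"customer": "B", "product": "X", "margin": 60.0,  "hours": 3},
    {"customer": "C", "product": "Y", "margin": 90.0,  "hours": 2},
]
capacity_hours = 6

# rank by margin earned per hour of the scarce resource
orders.sort(key=lambda o: o["margin"] / o["hours"], reverse=True)

served, used = [], 0
for o in orders:
    if used + o["hours"] <= capacity_hours:
        served.append((o["customer"], o["product"]))
        used += o["hours"]

print(served)  # the customer/product mix chosen for maximum margin here
```

Note that the highest-margin order (customer A) is not the first one chosen: customer C earns more margin per hour of capacity, which is exactly the shift in perspective that margin-based optimization introduces.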
Supply Chain Survey 2013:
Gaining Competitive Advantage
If you’re reading our blog, you are probably someone who is deeply interested in supply chain improvement. So we’d like to invite you to participate in this brief survey. And in return, we will send you exclusive, early access to the results of the survey along with our analysis.
Your insights and experiences are very important to us. And we are hosting the survey on a trusted, 3rd-party site so your responses will remain completely confidential. The survey is relatively short and should take only 3-4 minutes to complete. Please take a few moments to complete the Supply Chain Competitive Advantage Survey.
Start the Supply Chain Survey:
Gone are the days that supply chain was merely an expense. These days, savvy decision makers are gaining advantages over the competition by leveraging the data and tools available to them. In this survey, we will be exploring the methods, tools and processes that supply chain professionals utilize to gain competitive advantage via their supply chain.
Building applications, especially custom ones, carries with it the burden of answering the question: Does this do what the customer wants?
With complicated systems that have many interacting features and business rules, answering this question can be daunting. In fact, evaluating the answer can be daunting too, from the perspective of the customer. Having the sales guy check some boxes in a questionnaire, or watching a demo, just doesn’t leave you with the assurance that the application will handle all the business requirements, from either perspective, the vendor’s or the customer’s. Everyone I have spoken to who has sold complex software, or who has participated in the purchasing process of software, has expressed the same doubt. They are just not sure that the tool will be a good fit. As we all know, that doubt does not always prevent the purchase of the software, as each organization has its own level of risk tolerance and trust in the vendor’s brand or reputation. Often these other considerations can outweigh the amorphous doubt that some folks might feel. How can one quantify that doubt? Frankly, it’s a quandary.
This thought got us at Profit Point thinking… Wouldn’t it be great if there were another way to evaluate the goodness of fit of an application, or the appropriateness of its parameter settings, to the business needs of an organization? Wouldn’t it be great if there were a way to eliminate (or greatly reduce) the doubt and replace it with facts? Either a business rule is obeyed or it is not. Either a decision is made according to the requirements, or it is not. Let’s eliminate the doubt, we thought, and the world would be a better place (well, a little bit anyway).
There are many processes for testing an application as it is being developed, with written test scripts and evaluation of the results. All of these are based on testing little pieces of code, to ensure that each function or subroutine does what it should for each case of input data. These processes work fine in our opinion, but only when the subroutine or function can be considered independently from the others. When the system has functions that interact heavily, this approach doesn’t reduce the doubt that the functions may conflict or compete in a way that makes the whole system suffer. How then to evaluate the whole system? Could we treat the entire application as one black box, exercise the important business cases, and evaluate the results? This is exactly what we have done, with the effect of reducing the doubt about the suitability of the application for a business to zero.
With several of our clients we have worked out what seems to be a great process of testing a complex software solution for suitability to the business requirement. In this case, the detailed level function testing methods were not open to us, since the solution relied on a Linear Programming technique.
This process is really just an amplification of the standard testing process.
- Define the test case, with the expected results
- Construct the test data
- Build or configure the application
- Run the Test using the Test Data and Evaluate the results – Pass or Fail
This is the standard process for testing small functions, where the expected results are clear and easy to imagine. However, in some systems where there are many interacting rules and conflicting priorities, it may not be simple to know what the expected results should be without the help of the tool’s structure to evaluate them. Such is the case with many of our applications, with layer upon layer of business rules and competing priorities… The very reason for using an LP-based approach makes testing more complex.
In the revised process, we have, for each new business requirement:
- Construct the test case with the test data
- Build or configure the application
- Set the expected results using the results of the first pass build
- Re-factor the code and test until all tests are passing
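The “set the expected results using the first pass” step amounts to baseline (sometimes called “golden file”) testing, which can be sketched in a few lines. The file names and result format here are hypothetical:

```python
# Baseline-testing sketch: the first accepted run becomes the stored
# expectation, and every later build of the application is compared to it.

import json
import os
import tempfile

def check_against_baseline(results, baseline_path):
    """First run writes the baseline; later runs must match it exactly."""
    if not os.path.exists(baseline_path):
        with open(baseline_path, "w") as f:
            json.dump(results, f, indent=2)   # expected results set here
        return "baseline created"
    with open(baseline_path) as f:
        expected = json.load(f)
    return "pass" if results == expected else "FAIL"

results = {"total_cost": 1234.5, "orders_late": 0}
baseline = os.path.join(tempfile.mkdtemp(), "expected_results.json")
print(check_against_baseline(results, baseline))   # first run: baseline created
print(check_against_baseline(results, baseline))   # second run: pass
```

Once a business expert has reviewed and accepted the first-pass output, any later refactoring that changes the answer fails loudly instead of slipping into production.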
In my next blog I will show you the simple excel based tools we use to facilitate the test evaluation.
In practice, the process works well, new versions of the application go into production without any surprises, and with full confidence of the application management team that all the business requirements are 100% met.
No doubt – no doubt a better process.
By Jim Piermarini
What kind of risks are you prepared for?
As a supply chain manager, you have profound control over the operations of your business. However, it is not without limits, and mother nature can quickly and capriciously halt even the smoothest operation. Or other man-made events can seemingly conspire to prevent goods from crossing borders, or navigating traffic, or being produced and delivered on time. How can you predict where and when your supply chain may fall prey to unforeseen black swan events?
Prediction is very difficult, especially about the future. (Niels Bohr, Danish physicist) But there are likely some future risks that your stockholders are thinking about and that you might be expected to have prepared for. The post-event second-guessing phrase “You should have known, or at least prepared for…” has been heard in many corporate supply chain offices after recent supply-chain-breaking cataclysmic events: tsunami, hurricane, earthquake, you name it.
- What will happen to your supply chain if oil reaches $300 / barrel? What lanes will no longer be affordable, or even available?
- What will happen if sea level rises, causing ports to close, highways to flood, and rail lines to disappear?
- What will happen if the cost of a ton of CO2 is set to $50?
- What will happen if another conflict arises in the oil countries?
- What will happen if China’s economy shrinks substantially?
- What will happen if China’s economy really takes off?
- What will happen if China’s economy really slows down?
- What will happen if the US faces a serious drought in the mid-west?
What will happen if… you name it, it is lurking out there to have a potentially dramatic effect on your supply chain.
As a supply chain manager, your shareholders expect you to look at the effect on supply, transportation, manufacturing, and demand. The effect may be felt in scarcity, cost, availability, capacity, government controls, taxes, customer preference, and other factors.
Do you have a model of your supply chain that would allow you to run the what-if scenario to see how your supply chain and your business would fare in the face of these black swan events?
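To make the what-if idea concrete, here is a deliberately tiny sweep over a single transportation lane. All of the cost multipliers below are assumptions for illustration; a real model would span the entire network:

```python
# Minimal what-if sweep: re-cost one lane under each "black swan"
# scenario to see which ones break the plan. Every number here is an
# invented assumption, not a real forecast.

base_cost_per_ton = 40.0   # lane cost per ton at today's oil price (assumed)
max_affordable = 90.0      # above this the lane is no longer viable (assumed)

scenarios = {               # cost multipliers: pure assumptions
    "today":        1.0,
    "oil_at_300":   2.6,
    "carbon_at_50": 1.3,
}

viable = {}
for name, mult in scenarios.items():
    cost = base_cost_per_ton * mult
    viable[name] = cost <= max_affordable
    print(f"{name}: ${cost:.2f}/ton -> {'viable' if viable[name] else 'NOT viable'}")
```

Even this toy version shows the value of the exercise: one scenario quietly prices the lane out of existence, and it is far better to learn that from a model than from the event itself.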
Driving toward a robust and fault-tolerant supply chain should be the goal of every supply chain manager. And a way to achieve that is to design it with disruption in mind. Understanding the role (and the cost) of dual sourcing critical components, diversified manufacturing and warehousing, risk-mitigating transportation contracting, on-shoring/off-shoring some manufacturing, environmental impacts, and customer preferences, just to begin the list, can be an overwhelming task. Yet there are tools and processes that can help with this, and if you want to be able to face the difficulties of the future with confidence, do not ignore them. The tools are about supply chain planning and modeling. The processes are about risk management and robust supply chain design. Profit Point helps companies all over the world address these and other issues to make some of the best running supply chains anywhere.
The future is coming, are you ready for it?
June 22nd, 2012 3:46 pm Category: Distribution, Enterprise Resource Planning, Global Supply Chain, Green Network, Green Optimization, Network Design, Optimization, Supply Chain Agility, Supply Chain Improvement, Supply Chain Planning, Transportation, Vehicle Routing, by: Editor
Supply Chain optimization is a topic of increasing interest today, whether the main intention is to maximize the efficiency of one’s global supply chain system or to pro-actively make it greener. There are many changes that can be made to improve the performance of a supply chain, ranging from where materials are purchased, the types of materials purchased, how those materials get to you, how your products are distributed, and many more. An additional question on the mind of some decision makers is: Can I minimize my environmental footprint and improve my profits at the same time?
Many changes you make to your supply chain could either intentionally – or unintentionally – make it greener, so effectively reducing the carbon footprint of the product or material at the point that it arrives at your receiving bay. Under the right circumstances, if the reduced carbon footprint results from a conscious decision you make and involves a change from ‘the way things were’, then there might be an opportunity to capture some financial value from that decision in the form of Greenhouse Gas (GHG) emission credits, even when these emission reductions occur at a facility other than yours (Scope 3 emissions under the Greenhouse Gas Protocol).
As an example, let’s consider the possible implications of changes in the transportation component of the footprint and decisions that might allow for the creation of additional value in the form of GHG emission credits. In simple terms, credits might be earned if overall fuel usage is reduced by making changes to the trucks or their operation, such as the type of lubricant, wheel width, idling elimination (where it is not mandated), minimizing empty trips, switching from trucks to rail or water transport, using only trucks with pre-defined retrofit packages, using only hybrid trucks for local transportation and insisting that ocean-going vessels have certain fuel economy improvement strategies installed. These are just some of the ways fuel can be saved. If, as a result of your decisions or choices, the total amount of fuel and emissions is reduced, then valuable emission credits could be earned. It is worth noting that capturing those credits depends on following mandated requirements and gaining approval for the project.
If your corporate environmental strategy requires that you retain ownership of these reductions, then you keep the credits created, and the value of those credits should be placed on the balance sheet as a capital asset. Alternatively, if you are able, the credits can be sold on the open market and the cash realized and placed on the balance sheet. Either way, shareholders will not only get the ‘feel good’ benefit of the environmental improvement, but also the financial benefit of an improved balance sheet. If preferred, the credits can be sold to directly offset the purchase price of the material involved, effectively reducing that price, increasing the margin on the sales price of the end-product and again improving the bottom line. If capital investment is required as part of the supply chain optimization, the credit value can also be a way to shorten the payback period and improve the ROI, or to allow an optimization to occur at all.
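As a back-of-envelope illustration of the potential value, consider the transport example above. Every figure below (fuel saved, emission factor, credit price) is an assumption for illustration only, not a mandated protocol value:

```python
# Hypothetical sizing of the credit value from reduced truck fuel usage.
# All inputs are illustrative assumptions.

fuel_saved_liters = 250_000     # annual diesel saved by the lane changes (assumed)
kg_co2_per_liter = 2.68         # approximate emission factor for diesel
credit_price_per_ton = 15.0     # assumed market price per ton of CO2e

tons_co2 = fuel_saved_liters * kg_co2_per_liter / 1000
credit_value = tons_co2 * credit_price_per_ton
print(f"{tons_co2:.0f} t CO2e -> ${credit_value:,.0f} in potential credits")
```

Even a rough estimate like this is often enough to tell whether pursuing project approval for the credits is worth the administrative effort.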
So, when you consider improving your environmental impact or optimizing your supply chain, consider the possibility that there might be additional value to unlock if you include both environmental and traditional business variables in your supply chain improvement efforts.
Written by: Peter Chant, President, The FReMCo Corporation Inc.
I was sitting on the plane the other day and chatting with the guy in the next seat when I asked him why he happened to be traveling. He was returning home from an SAP ERP software implementation training course. When I followed up and asked him how it was going, I got the predictable eye roll and sigh before he said, “It was going OK.” There are two things that were sad here. First, the implementation was only “going OK” and second, that I had heard this same type of response from so many different people implementing big ERP that I was expecting his response before he made it.
So, why is it so predictable that the implementations of big ERP systems struggle? I propose that one of the main reasons is that the implementation doesn’t focus enough on the operational decision-making that drives the company’s performance.
A high-level project history that I’ve heard from too many clients looks something like this:
- Blueprinting with wide participation from across the enterprise
- Implementation delays
- Data integrity is found to be an issue – more resources are focused here
- Transaction flow is found to be more complex than originally thought – more resources are focused here
- Project management notices the burn rate from both internal and external resources assigned to the project
- De-scoping of the project from the original blueprinting
- Reports are delayed
- Operational functionality is delayed
- Testing of transactional flows
- Go-live involves operational people at all levels frustrated because they can’t do their jobs
Unfortunately, the de-scoping phase seems to hit some of the key decision-makers in the supply chain (plant schedulers, supply and demand planners, warehouse managers, dispatchers, buyers, etc.) particularly hard, and it manifests in the chaos after go-live. These are the people who make the daily bread-and-butter decisions that drive the company’s performance, but because of the de-scoping and the focus on transaction flow, they don’t have the information they need to make the decisions they must make. (It’s ironic that the original sale of these big ERP systems is made at the executive level as a way to better monitor the enterprise’s performance and produce information that will enable better decision-making.)
What, then, would be a better way to implement an ERP system? From my perspective, it’s all about decision-making. Thus, the entire implementation plan should be developed around the decisions that need to be made at each level in the enterprise. From blueprinting through the go-live testing plan, the question should be, “Does the user have the information in the form required and the tools (both from the new ERP system and external tools that will still work properly when the new ERP system goes live) to make the necessary decision in a timely manner?” Focusing on this question will drive user access, data accuracy, transaction flow, and all other elements of the configuration and implementation. Why? Because the ERP system is supposed to be an enabler, and the only reason to enter data into the system, or to get data out, is either to make a decision or to record the result of one.
Perhaps with that sort of a focus there will be a time when I’ll hear an implementation team member rave about how much easier it will be for decision-makers throughout the enterprise once the new system goes live. I can only hope.
A husband, two kids and a golden retriever later… I am back to implementations in Supply Chain planning and scheduling. To my surprise, the same challenges I encountered 10 years ago remain in force today: data, defining business processes, data, implementing software, data, training people, data, supporting the change to a new system and data.
Data collection remains one of the cornerstones of success in a supply chain planning or scheduling implementation. Though mountains of data may exist in a company’s business systems, harnessing that data to feed a planning or scheduling model can be extremely complex and time consuming. Interestingly, the data collection process often drives an elucidation of manufacturing practices and process flows, and clients learn what they do and don’t know about their business. This may seem backwards and risky in terms of getting things out of order. In a perfect world, a thorough understanding of manufacturing and business processes would pave the way towards building an Advanced Planning and/or Scheduling System. In reality, the two often happen in tandem and are evolutionary in nature.
Deciding early in an implementation how data will be housed, derived and propagated will pay off in the long run. Establishing a systematic, automated way to update and propagate data is just as important as the decision of which software system to use. It is worth the investment to take the time to put this automation in place: as more and more products are added to the system, the data will remain manageable and scalable.
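As a minimal illustration of what one piece of such automation might look like (the file name, column names, and sanity-check thresholds below are invented for the example, not taken from any real implementation), a data-refresh step can validate incoming records before they ever reach the planning model:

```python
import csv

def load_and_validate(path):
    """Load product cycle-time data from a CSV export, separating rows that
    pass basic sanity checks from rows that need human review."""
    good, bad = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                cycle = float(row["cycle_time_hrs"])
                # Flag implausible cycle times instead of silently
                # propagating them into the planning model.
                if row["product"] and 0 < cycle < 1000:
                    good.append({"product": row["product"],
                                 "cycle_time_hrs": cycle})
                else:
                    bad.append(row)
            except (KeyError, ValueError):
                bad.append(row)
    return good, bad
```

Run on every refresh, the rejected rows become a review list for the schedulers rather than a surprise discovered after go-live.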
From PC to Cloud, emails to tweets, networking happy hours to LinkedIn, it is nice to know some things stay the same.
IDK (“I Don’t Know”)
After listening to a Freakonomics Radio podcast on NPR, the following question and blog comments emerged:
Why do people feel compelled to answer questions that they do not know the answer to?
What I’ve found in business is that we are all prone to hiding our ignorance when asked a question that we cannot answer. So even if someone absolutely has no idea what the answer is, if it’s within his or her realm of expertise, “faking” seems to be an essential part of the response.
My professor friend told me that she has learned the following from teaching MBA students: “One of the most important things you learn as an MBA student is how to pretend you know the answer to any question even though you have absolutely no idea what you’re talking about. It’s really one of the most destructive factors in business. Everyone masquerades like they know the answer and no one will ever admit they don’t know the answer, which makes it almost impossible to discover the correct answer”.
I ask: Does every question need to be answered?
Everyone expects answers to every question, especially if the question comes from someone higher up in the organization. However, not every open question is worth the time and resources to research. If it comes down to a choice between making up an answer and being saddled with a research project, many people will prefer to make up an answer. Perhaps in some situations, combined with ego and self-image issues, every question will be answered regardless of the person’s knowledge.
I ask: Should IDK be a legitimate response?
Perhaps, if the question has minimal economic impact on the business, and you know something related to the question, then maybe a guesstimate (an estimate made without using adequate or complete information) is fine.
But then, for questions with significant economic impact… maybe it’s better to say “IDK the answer to that question, but we are studying it”, and then do the study!
As an example, management asks: Will our delivered cost per SKU increase or decrease if we add more distribution centers to meet expected growth rates and satisfy customer service levels?
The first-reaction guesstimate might be “they will increase,” although this might not be true.
The smart analyst will say: “Hmmm, IDK! Give me a few hours (days) to do a quick analysis, and see what the true impact will be.”
A small spreadsheet study looking at the increase in production and distribution levels, combined with the increased fixed and variable costs associated with adding a few new distribution centers, may be surprising. It may indicate that the increased volume and revenue, along with lower transportation costs, will offset the increased DC costs.
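As a sketch of that kind of quick back-of-the-envelope analysis (every number below is an illustrative assumption, not real cost data), the delivered-cost comparison might look like this:

```python
def delivered_cost_per_unit(volume, fixed_dc_cost, variable_dc_cost_per_unit,
                            transport_cost_per_unit):
    """Total delivered cost per unit: allocated DC fixed cost, plus DC
    variable cost, plus transportation."""
    return (fixed_dc_cost / volume) + variable_dc_cost_per_unit + transport_cost_per_unit

# Baseline: one DC, longer average shipping lanes (illustrative numbers).
base = delivered_cost_per_unit(volume=100_000, fixed_dc_cost=500_000,
                               variable_dc_cost_per_unit=1.20,
                               transport_cost_per_unit=4.00)

# Scenario: three DCs carry more fixed cost, but shorter lanes cut
# transportation and the network serves more volume.
scenario = delivered_cost_per_unit(volume=200_000, fixed_dc_cost=1_200_000,
                                   variable_dc_cost_per_unit=1.20,
                                   transport_cost_per_unit=2.10)
```

Under these made-up numbers the scenario's delivered cost comes out lower than the baseline despite the extra DC fixed cost, which is exactly the kind of counter-intuitive result the quick study can surface before anyone commits to a guesstimate.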
This small study may also be the first in a stage gate approach to perform a forward looking comprehensive supply chain infrastructure study. A detailed strategic infrastructure study can capture the manufacturing and distribution details, including costs and constraints, generating results that will allow management to make a reliable strategic economic decision.
No field is exempt from its know-it-alls, even when the correct answer really is IDK.
I submit: if you are in an uncertain position, try the IDK approach and then offer the following response: “I can check into that and find an answer for you.” You may be surprised to learn that your credibility with management will improve.
“Every act of conscious learning requires the willingness to suffer an injury to one’s self-esteem. That is why young children, before they are aware of their own self-importance, learn so easily.” – Thomas Szasz
Sales and operations planning (S&OP) is an integrated business management process that enables a company to continually balance supply and demand across its supply chain to achieve its strategic and tactical business objectives. More and more business leaders are relying on S&OP to align and improve decision making across the disparate parts of their organization. And many companies are still adopting and improving the techniques and tools that they use to improve S&OP.
So this year, we conducted an S&OP survey of key decision makers to learn more about their challenges, concerns and expectations for 2012. Business leaders from a variety of companies and industries were polled. Here’s what we learned:
- Many companies lack the metrics needed to capture the benefits from S&OP
- Scenario and sensitivity analysis is the tool of choice for S&OP planners who understand that sales forecasts are imperfect
- More companies are beginning to collaborate with suppliers and customers to improve S&OP
- For many companies, point-of-sale (POS) data may be the key to effective sales and operations planning
To read the complete report, including our conclusions, click the link below:
November 21st, 2011 12:20 pm Category: Global Supply Chain, Network Design, Optimization, Risk Management, Scheduling, Supply Chain Agility, Supply Chain Improvement, Supply Chain Planning, Sustainability, by: Jim Piermarini
Change is hard.
In the businesses that I help, change comes for several reasons. It may be thrust upon the business from the outside, such as a change in the competitive landscape or a new regulation. It may come from some innovative source within the company: a search for cost savings to increase profitability, or a new process or product that boosts productivity. Change can come from the top down, or from the bottom up. Change can come in a directed way, as part of a larger program, or organically as part of a larger cultural shift. Change can make your work easier or harder, and may even eliminate a portion (or all) of the job you were doing. Change can come to increase the bottom line or the top line. But primarily, change comes to continue the adaptation of the company to the business environment. Change is the response to the Darwinian selector for businesses: adapt or decline. Change is necessary. It is clear to me from my experience that businesses need to change to stay relevant.
This may seem trite or trivial, but accepting that change is not only inevitable, but that it is good, is the shift in attitude that separates the best companies (and best employees) from the others.
So, you say, I see the need to change, it is not the change itself that is so difficult, but rather the way that it is inflicted upon us that makes it hard. So, why does it have to be so hard? Good question.
Effective managers know that change is necessary but hard. They are wary of making changes, and rightly so: most change projects fail. People generally just don’t like it. Netflix is a great example. Recently, Netflix separated their streaming movie service from their DVD rental business. After what I am sure must have been careful planning, they announced the change and formed Qwikster, the DVD rental site, and the response from the customer base was awful. As you likely know, Netflix, faced with the terrible reception from their customer base and stockholders, reversed their decision to separate streaming from DVDs. What was likely planned as a very important change failed. Dead, dead, dead. Change can be risky too.
If change is necessary, but hard and risky… how can you tame this unruly beast?
The secret of change is that it relies on three things: People, Process, and Technology. I name them in the order in which they are important.
People are the most important agents of change, since they are the ones who decide on the success or failure of the change. People decided that the Netflix change was dead. People decide all the time whether to adopt a change. And people can be capricious and fickle. People are sensitive to the delivery of the change. They peer into the future to try to understand the effect it will have on them, and if they do not like what they see… It is the real people in the organization who have to live with the change, who have to make it work, learn the new, and unlearn the old. It is likely the very same people who proudly constructed the current situation who will have to let go of their ‘old’ way of doing things to adapt to the new. Barriers to change exist in many directions in the minds of people. I know this to be true: in making change happen, if you are not sensitive to the people you are asking to change, and do not address their fears and concerns, the change will never be accepted. If you do not give them a clear sense of the future state, where they will be in it, and why it is a better place, they will resist the change and have a very high likelihood of stopping it, either openly or, more likely, passively and quietly, and you may never know why the fabulously planned change project failed.
Process is the next aspect of a change project that matters. A better business process is what drives costs down: avoiding duplication of effort, removing extra steps, and looking at alternatives in a ‘what-if’ manner in order to make better decisions. These are what make businesses smarter, faster, better. A better business process is like getting a better recipe for the kitchen. Yet no matter how good the recipe, it still relies on the chef to execute it and the ovens to perform properly. Every business is looking for better business processes, just as every chef is looking for new recipes. But putting an expert soufflé recipe, where the soufflé rises higher, in the hands of an inexperienced chef does not always yield a better soufflé. People really do matter more than the process.
Technology is the last of the three aspects that affect change. Better technology enables better processes. A better oven does not make a chef better. The chef gets better when they learn to use the new oven in better ways, when they change the way they make the soufflé because the new oven makes it possible. A better oven does not do it by itself. An oven is just an oven. In the same way, better technology is still just technology. By itself, it changes nothing. New processes can be built that use it, and people can be encouraged to use it in the new process. Technology changes are the least difficult to implement, and it is likely due to this fact that they are often fixed upon as the simple answer to what are complex business problems requiring a comprehensive approach to changing the business via its people, process, and technology.
Change is necessary, but hard and risky. Without change, businesses will miss opportunities to adapt to the unforgiving business world, and decline. However, change can be tamed if the attitude towards it shifts to treat it as a good thing, and if it is addressed with a focus on people, process and technology, in that order. Done right, you can implement the change that will increase the bottom line and avoid a collapse of your soufflé.
“With every passing year, the amount and variety of information available to make business decisions continues its exponential growth. As a result, business leaders have an opportunity to exploit the possibilities inherent in this rich, but complex, stream of information. Alternatively, they can continue with the status quo, using only their good business sense and intuition and thereby risk being left in the dust by competitors. Top-tier companies have learned to harness the available data with powerful decision support tools to make fast, robust trade-offs across many competing priorities and business constraints.”
Read the complete article here: Face Complexity – Making Sound Business Decisions
At Profit Point, we often repeat the mantra “People, Process, Technology.” All three are important for the kinds of projects we work on. You have to have good systems (the technology part) that support good work processes and people that follow the process and use the systems. If your people are not committed to following the process and using the systems, you are going nowhere fast.
Recently we were discussing with a senior manager at one of our clients what makes for a good Sales and Operations Planning (S&OP) process. Being someone who is more of a process and technology guy, I was thinking that he might say something like “You have to have a well thought out work process that is clearly communicated to everyone involved” or “You have to have a system that is easy to work with that supports the work process well.” WRONG!
The first thing he mentioned was that senior management needed to be openly committed to the process and systems. He illustrated this for us by recounting what another senior manager at this same client said during an S&OP meeting with a large group. The group was going back and forth discussing a “potential” order from a customer and this particular senior manager said “If it’s not in the system then it’s a rumor and we don’t plan and schedule for rumors.”
As you can imagine, this cut down on the chatter in the room quite quickly. This client had spent a lot of time and money developing processes and systems that worked well and those two things are necessary but not sufficient. You have to have leadership that says “We have a work process to follow and a system to use to support executing that process. Follow the process and use the system.”
Next you have to have people who do exactly that! If this is not happening then as I heard from another executive “Either the people will change or the people will change!”
You have to be able to trust the data in the system but really at its root this boils down to trusting the people who entered the data in the system. As I was reminded, this starts at the top!
October 20th, 2010 11:37 am Category: Optimization, by: Ted Schaefer
A good friend of mine, who works for a large employer in her city, recently told me that her department’s budget, along with every other department budget that was classified as “Administration” in the ubiquitous SAP system, had to be cut by a large and specific percentage.
It didn’t matter that the “Administration” label was not uniformly applied across her organization and that some departments that were so labeled performed functions very similar to other departments that were not stuck with that label. It didn’t matter what services each department provided, or how efficiently they provided them, they just had to cut the budget and they had to hit the number. Incredibly, it didn’t matter that her group was one of the few “Administration” groups that actually generated revenue; in her case three times their total annual budget spend.
Unfortunately, hers is not the first story like this that I have heard.
There is no doubt that many corporations, organizations, governments and households have been hit hard by the recent economic downturn. Each of these groups has been forced to make some difficult decisions. So what do I have against across-the-board (ATB) budget cuts? Basically, I think it has to be the worst way to reduce costs in an organization, and here’s why.
Let’s take a look at something that is important and familiar to all of us; the family budget. Sadly, many families have been forced to drastically reduce spending as a result of a lay-off or furlough over the past two years. In those cases, an ATB cost-cutting strategy just doesn’t work. Try telling the bank that you’ve had to cut your monthly mortgage payments by 15%. I doubt that they will be impressed when you tell them that you’ve had to do the same with your property taxes, insurance premiums, electricity and water payments, as well. You might get lucky and be able to renegotiate your mortgage and you might get lucky if your state provides utilities assistance for people who have recently lost their jobs, but most tax assessors and insurance companies will not be particularly sympathetic.
But my guess is that you’d probably take a very different type of approach to cost-cutting in your household. You’d probably take a hard look at all of the money that you’re spending over a month or a quarter. You might first examine your spending to see if you could conserve on the amount you consume or if there were ways to get the same goods and services in a cheaper manner. If that didn’t reduce your spending enough, you’d probably divide the remaining spending into different categories. There are many different ways to categorize your expenses, but they’ll probably come down to something like, 1) Essential; 2) Non-essential, but painful to cut; 3) Non-essential and easier to cut. If you’re lucky, you will be able to cut enough of your spending by eliminating or reducing your expenses in the non-essential categories. If not, you might be forced to re-examine what really is “Essential.” For example, your mortgage payment is essential, as long as you plan to stay in your house, but if the situation calls for it, you can reduce your costs by moving into a smaller home or apartment. Not a fun choice, but it could be the right thing to do in certain situations.
Looking back on the family budget example, what did we do? First, we looked for opportunities to conserve and less expensive ways to purchase the same goods and services. Next, we prioritized our spending so we could make good decisions. To find less expensive ways to purchase the same goods and services and to prioritize the spending means that we needed to 1) understand what we were getting for the money we were spending and 2) understand what would happen when we stopped spending that money. After prioritizing our spending we made trade-offs by deciding what we could live without. Some of the trade-offs may have been no-brainers, but some may have been very difficult.
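The prioritize-then-cut logic described above can be sketched in a few lines of code; the expense names, amounts, and priority labels here are invented purely for illustration:

```python
def prioritized_cuts(expenses, target_savings):
    """Cut whole expenses starting with the lowest-priority items
    (priority 3 = non-essential, 2 = painful, 1 = essential) until the
    savings target is met; never touch priority-1 items."""
    cuts, saved = [], 0.0
    # Sort so the most expendable items (highest priority number) come first.
    for name, cost, priority in sorted(expenses, key=lambda e: -e[2]):
        if saved >= target_savings or priority == 1:
            continue
        cuts.append(name)
        saved += cost
    return cuts, saved

household = [
    ("mortgage", 1500, 1),
    ("utilities", 300, 1),
    ("dining out", 250, 3),
    ("streaming services", 60, 3),
    ("gym membership", 80, 2),
]
cuts, saved = prioritized_cuts(household, target_savings=300)
```

The contrast with an across-the-board cut is the point: this approach leaves the essential categories untouched and takes the savings from the items the household has explicitly decided it can live without.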
I would argue that this is the same process that should occur in any organization that needs to reduce its spending. It amazes me how a manager can walk into a large organization and mandate a large cut in the budget for each and every department (as they are defined in the accounting system, but that’s a different blog) without understanding where, how and why the money is spent. It would be laughable if the results weren’t so sad.
ATB budget cuts penalize your best managers. These are the managers that run a lean operation, who have taken the initiative to drive out all of the waste and improve productivity. They are already doing the job you’ve asked them to do with the fewest resources possible, but they are being treated in the same manner as the manager who is either not as effective, or who has become jaded by past ATB cuts, so that he/she keeps some “rainy day” resources in the budget for just such “emergencies.” (… and people wonder why their best managers seem to leave after these types of budget cuts, even when their positions are not eliminated.)
Let’s not forget the knock-on effect of penalizing your best managers. The best managers often assemble the best teams to do the work. If one or more members of a lean, highly productive, well-functioning team is forced out in an ATB cut, the rest of the team is forced to pick up the additional work of the departing team members. This extra work, on top of an already full workload, either forces the quality of the work to suffer, or reduces the total output of the team; that is, if the rest of the team elects to stay in an organization that doesn’t value efficiency.
ATB budget cuts often fail to achieve their savings targets, or result in so much “slash and burn” damage to the organization that “add-backs” must occur after the blood-letting so the organization can survive. It continues to amaze me that these managers have the time to perform an initial ATB cut, followed by another one or by an “add back” program, but don’t have the time to do it right the first time.
ATB cuts suggest that the value of the work performed under each of the budgets is equal to the value of the work performed in all other budgets. I have seen a lot of different organizations over my career and I don’t think I’ve ever observed this to be the case. Take my friend’s case: her group makes money, while others spend it. Is a cost cut that forces a reduction in revenue equal to a cost cut that has no impact on revenue? Probably not.
So, what’s the answer? Clearly, many organizations are forced to radically reduce costs just to survive. I think it goes back to our home budget example: 1) know what you’re spending; 2) understand what you get for it, 3) find ways to get the same or similar things for less money, and 4) make the hard choices about what you can do without.
In the end, my experience has been that managers who drive ATB cost reductions are either unable or unwilling to understand their business processes and organizations sufficiently; lack the imagination or skills to reengineer their business processes; or lack the courage to make the hard choices about what their organization will do, and what it won’t do, in the future.
To all those top level managers who have instituted ATB cuts, or for those who are planning to do so: Don’t do it! Think before you act, and save your company the added burden of bad management.