When we work with clients to implement decision support tools for supply chain scheduling and planning, they often have some unique constraint that is essential to model and may be specific to their environment. Some recent examples we have encountered include the following:
- When producing a batch in a make-to-order environment, the plant always produces some extra amount, called the purge quantity, which is stuck in the piping from the reactor to the packout line. After the line is purged, this material is recycled into the next batch.
- A warehouse can have capacity constraints on both:
  - Throughput, based on the number and type of doors, and
  - Storage, based on material characteristics such as hazardous material classifications.
- When working with a dairy industry client, the bill of materials changes throughout the year based on the component ratios of the milk produced by the cows, which drives the product split.
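As an illustration, the purge-quantity rule above can be sketched as a simple material balance. This is a hypothetical sketch; the function name, quantities, and units are ours, not a client's:

```python
def fresh_material_needed(batch_sizes, purge_qty):
    """For each batch, compute the fresh material to charge, given that
    every batch leaves `purge_qty` stuck in the line and that this
    material is recycled into the following batch.

    The first batch has no recycled material available, so it must be
    charged with its full size plus the purge quantity.
    """
    fresh = []
    recycled = 0.0  # material recovered from the previous batch's purge
    for size in batch_sizes:
        # Total material processed = batch size + purge left in the line;
        # the recycled purge from the prior batch offsets part of it.
        fresh.append(size + purge_qty - recycled)
        recycled = purge_qty  # this batch's purge feeds the next one
    return fresh

# Example: three 100-unit batches with a 5-unit purge
print(fresh_material_needed([100, 100, 100], 5.0))  # [105.0, 100.0, 100.0]
```

Only the first batch pays the purge penalty; at steady state, the purge quantity washes out of the balance, which is exactly why this constraint matters at campaign start-up and changeovers.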
These types of situations are a regular occurrence and require modeling tools that are flexible enough to deal with them. We will implement decision support tools either with a development suite such as AspenTech's Supply Chain Management™ or by developing an application that connects to an optimization engine such as FICO's Xpress™. These tools provide a base starting point but then allow us to add the modeling constraints required to reach a solution that the client can actually implement.
In addition, this flexibility allows the work processes, and the tools that enable them, to evolve over time as the business needs change.
This flexibility, though, has to be balanced with some level of standardization. Therefore we will often build a new application using a previous application as a starting point. For a production scheduling tool, many things are common between different implementations, including how to represent the schedule via an interactive Gantt chart, common basic reports, standard interfaces to external systems, etc. In a production planning tool, there are typically plants, warehouses and transshipment points to be modelled via a network representation; costs and capacities at each of these nodes in the network; and an objective function that either minimizes cost or maximizes profit. All of these would be common elements between different planning model implementations.
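The common core of such a planning model, a cost-minimizing flow over a capacitated network, can be sketched in a few lines. This is a minimal illustration assuming SciPy's `linprog`; the two-plant, two-warehouse network, costs, and capacities are made up for this example and are far simpler than a real planning model:

```python
from scipy.optimize import linprog

def solve_plan():
    # Decision variables: x[p, w] = units shipped from plant p to warehouse w,
    # flattened as [A->W1, A->W2, B->W1, B->W2].
    cost = [4, 6, 5, 3]          # per-unit cost on each plant->warehouse lane
    A_ub = [[1, 1, 0, 0],        # plant A total shipments <= its capacity
            [0, 0, 1, 1]]        # plant B total shipments <= its capacity
    b_ub = [80, 70]              # plant capacities
    A_eq = [[1, 0, 1, 0],        # warehouse W1 demand must be met exactly
            [0, 1, 0, 1]]        # warehouse W2 demand must be met exactly
    b_eq = [60, 50]              # warehouse demands
    # Bounds default to x >= 0, which is what we want for flows.
    return linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)

res = solve_plan()
print(res.x, res.fun)  # optimal lane flows and minimum total cost (390.0)
```

The client-specific constraints discussed above (purge quantities, door throughput, seasonal bills of materials) would enter as additional rows in the constraint matrices, which is why a flexible starting point matters.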
- Flexibility allows for:
  - Modelling essential constraints that may be unique to a particular client's environment but are required to get to a feasible solution that the client can actually implement.
  - Changing the tool over time as the business needs change.
- Standardization allows for:
  - Faster / cheaper implementation.
  - Faster / cheaper support.
  - Ease of training when moving to a different role but using similar tools.
Having a hybrid of flexibility with standardization is the best of both worlds!
Supply Chain Survey 2013:
Gaining Competitive Advantage
If you’re reading our blog, you are probably someone who is deeply interested in supply chain improvement. So we’d like to invite you to participate in this brief survey. And in return, we will send you exclusive, early access to the results of the survey along with our analysis.
Your insights and experiences are very important to us. And we are hosting the survey on a trusted, 3rd-party site so your responses will remain completely confidential. The survey is relatively short and should take only 3-4 minutes to complete. Please take a few moments to complete the Supply Chain Competitive Advantage Survey.
Start the Supply Chain Survey:
Gone are the days when supply chain was merely an expense. These days, savvy decision makers are gaining advantages over the competition by leveraging the data and tools available to them. In this survey, we will be exploring the methods, tools and processes that supply chain professionals utilize to gain competitive advantage via their supply chain.
Building applications, especially custom ones, carries with it the burden of answering the question: Does this do what the customer wants?
In complicated systems with many interacting features and business rules, answering this question can be daunting. In fact, evaluating the answer can be daunting too, from the perspective of the customer. Having the sales guy check some boxes in a questionnaire, or watching a demo, just doesn’t leave you with the assurance that the application will handle all the business requirements, from either perspective, the vendor’s or the customer’s. Everyone I have spoken to who has sold complex software, or who has participated in the purchasing process for software, has expressed the same doubt. They are just not sure that the tool will be a good fit. As we all know, that doubt does not always prevent the purchase of the software, as each organization has its own level of risk tolerance and trust in the vendor’s brand or reputation. Often these other considerations can outweigh the amorphous doubt that some folks might feel. How can one quantify that doubt? Frankly, it’s a quandary.
This thought got us at Profit Point thinking… Wouldn’t it be great if there were another way to evaluate the goodness of fit of an application, or the appropriateness of its parameter settings, to match the business needs of an organization? Wouldn’t it be great if there were a way to eliminate (or greatly reduce) the doubt, and replace it with facts? Either a business rule is obeyed or it is not. Either a decision is made according to the requirements, or it is not. Let’s eliminate the doubt, we thought, and the world would be a better place (well, a little bit, anyway).
There are many processes for testing an application as it is being developed: writing test scripts, running them, and evaluating the results. All these are based on testing little pieces of code, to ensure that each function or subroutine does what it should for each case of input data. These processes work fine in our opinion, but only when the subroutine or function can be considered independently from the others. When the system has functions that interact heavily, this approach doesn’t reduce the doubt that the functions may conflict or compete in a way that makes the whole system suffer. How then to evaluate the whole system? Could we treat the entire application as one black box, run the important business cases, and evaluate the results? This is exactly what we have done, with the effect of reducing the doubt about the suitability of the application for a business to zero.
With several of our clients we have worked out what seems to be a great process for testing a complex software solution for suitability to the business requirements. In this case, detailed function-level testing methods were not open to us, since the solution relied on a linear programming technique.
This process is really just an amplification of the standard testing process.
- Define the test case, with the expected results
- Construct the test data
- Build or configure the application
- Run the Test using the Test Data and Evaluate the results – Pass or Fail
This is the standard process for testing small functions, where the expected results are clear and easy to imagine. However, in some systems with many interacting rules and conflicting priorities, it may not be simple to know what the expected results should be without the help of the tool’s structure to evaluate them. Such is the case with many of our applications, with layer upon layer of business rules and competing priorities… The very reason for using an LP-based approach makes testing more complex.
In the revised process, we have, for each new business requirement:
- Construct the test case with the test data
- Build or configure the application
- Set the expected results using the results of the first pass build
- Re-factor the code and test until all tests are passing
In my next blog I will show you the simple Excel-based tools we use to facilitate the test evaluation.
In practice, the process works well: new versions of the application go into production without any surprises, and with the application management team fully confident that all the business requirements are 100% met.
No doubt – no doubt a better process.
By Jim Piermarini
I was sitting on the plane the other day and chatting with the guy in the next seat when I asked him why he happened to be traveling. He was returning home from an SAP ERP software implementation training course. When I followed up and asked him how it was going, I got the predictable eye roll and sigh before he said, “It was going OK.” Two things were sad here. First, the implementation was only “going OK,” and second, I had heard this same type of response from so many different people implementing big ERP that I was expecting his response before he made it.
So, why is it so predictable that the implementations of big ERP systems struggle? I propose that one of the main reasons is that the implementation doesn’t focus enough on the operational decision-making that drives the company’s performance.
A high-level project history that I’ve heard from too many clients looks something like this:
- Blueprinting with wide participation from across the enterprise
- Implementation delays
- Data integrity is found to be an issue – more resources are focused here
- Transaction flow is found to be more complex than originally thought – more resources are focused here
- Project management notices the burn rate from both internal and external resources assigned to the project
- De-scoping of the project from the original blueprinting
- Reports are delayed
- Operational functionality is delayed
- Testing of transactional flows
- Go-live involves operational people at all levels frustrated because they can’t do their jobs
Unfortunately, the de-scoping phase seems to hit some of the key decision-makers in the supply chain, like plant schedulers, supply and demand planners, warehouse managers, dispatchers, buyers, etc., particularly hard, and it manifests in the chaos after go-live. These are the people that make the daily bread and butter decisions that drive the company’s performance, but they don’t have the information they need to make the decisions that they must make because of the de-scoping and the focus on transaction flow. (It’s ironic that the original sale of these big ERP systems is made at the executive level as a way to better monitor the enterprise’s performance and produce information that will enable better decision-making.)
What then, would be a better way to implement an ERP system? From my perspective, it’s all about decision-making. Thus, the entire implementation plan should be developed around the decisions that need to be made at each level in the enterprise. From blueprinting through the go-live testing plan, the question should be, “Does the user have the information in the form required and the tools (both from the new ERP system and external tools that will still work properly when the new ERP system goes live) to make the necessary decision in a timely manner?” Focusing on this question will drive user access, data accuracy, transaction flow, and all other elements of the configuration and implementation. Why? Because the ERP system is supposed to be an enabler and the only reasons to enter data into the system or to get data out is either to make a decision or as the result of a decision.
Perhaps with that sort of a focus there will be a time when I’ll hear an implementation team member rave about how much easier it will be for decision-makers throughout the enterprise once the new system goes live. I can only hope.
Sales and operations planning (S&OP) is an integrated business management process that enables a company to continually balance and manage the supply chain supply and demand to achieve its strategic and tactical business objectives. More and more business leaders are relying on S&OP to align and improve decision making across the disparate parts of their organization. And, many companies are still adopting and improving the techniques and tools that they use to improve S&OP.
So this year, we conducted an S&OP Survey of key decision makers to learn more about their challenges, concerns and expectations for 2012. Business leaders from a variety of companies and industries were polled. Here’s what we learned:
- Many companies lack the metrics needed to capture the benefits from S&OP
- Scenario and sensitivity analysis is the tool of choice for S&OP planners who understand that sales forecasts are imperfect
- More companies are beginning to collaborate with suppliers and customers to improve S&OP
- For many companies, point-of-sale (POS) data may be the key to effective sales and operations planning
To read the complete report, including our conclusions, click the link below:
“With every passing year, the amount and variety of information available to make business decisions continues its exponential growth. As a result, business leaders have an opportunity to exploit the possibilities inherent in this rich, but complex, stream of information. Alternatively, they can continue with the status quo, using only their good business sense and intuition and thereby risk being left in the dust by competitors. Top-tier companies have learned to harness the available data with powerful decision support tools to make fast, robust trade-offs across many competing priorities and business constraints.”
Read the complete article here: Face Complexity – Making Sound Business Decisions