Archive for the ‘Supply Chain Improvement’ Category

When we work with clients to implement decision support tools for supply chain scheduling and planning, they often have some unique constraint that is essential to model and may be particular to their environment.  Some recent examples we have encountered include the following:

  • When producing a batch in a make-to-order environment, the plant always produces some extra amount, called the purge quantity, which is stuck in the piping from the reactor to the packout line. After purging the line, this material is recycled into the next batch.
  • A warehouse can have capacity constraints on both the
    1. Throughput based on the number and type of doors and
    2. Storage based on material characteristics such as hazardous material classifications.
  • When working with a dairy industry client, the bill of materials changes throughout the year based on the component ratios of the milk produced by the cows, which drives the product split.
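As a sketch of how the first of these constraints might be captured in a model, the purge-quantity material balance can be written as a simple recurrence. The function and numbers below are illustrative only, not a client formulation:

```python
# Hypothetical material balance for a make-to-order batch with a purge
# quantity that is recycled into the next batch (names are illustrative).

def packed_quantities(batch_sizes, purge_qty):
    """Return the quantity actually packed out for each batch.

    Each batch leaves `purge_qty` stuck in the piping; that material
    is recovered and blended into the following batch.
    """
    packed = []
    recycled = 0.0  # material carried over from the previous batch
    for size in batch_sizes:
        available = size + recycled     # fresh batch plus recycled purge
        packed.append(available - purge_qty)
        recycled = purge_qty            # purge is recycled into next batch
    return packed

print(packed_quantities([100.0, 100.0, 100.0], purge_qty=5.0))
# first batch packs 95.0; later batches recover the purge and pack 100.0
```

A real scheduling model would express this as a constraint on the batch balance rather than a post-processing calculation, but the bookkeeping is the same.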

These types of situations are a regular occurrence and require modeling tools flexible enough to deal with them.  We will implement decision support tools either with a development suite such as Aspen Tech’s Supply Chain Management™ or by developing an application that connects to an optimization engine such as FICO’s Xpress™.  These tools provide a base starting point but then allow for adding the modeling constraints required to get to a solution that the client can actually implement.

In addition, having this flexibility allows the work processes, and the tools that enable them, to evolve over time as the business needs change.

This flexibility, though, has to be balanced with some level of standardization.  Therefore we will often build a new application by using a previous application as a starting point.  For a production scheduling tool, many things are common between different implementations, including how to represent the schedule via an interactive Gantt chart, common basic reports, standard interfaces to external systems, etc.  In a production planning tool, there are typically plants, warehouses and transshipment points to be modelled via a network representation; costs and capacities at each of these nodes in the network; and an objective function that either minimizes cost or maximizes profit.  All of these would be common elements between different planning model implementations.
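To make those common planning-model elements concrete, here is a toy cost-minimizing sourcing decision over capacitated nodes with unit costs, solved by brute-force enumeration rather than the MILP solver a real tool would use. All names and numbers are invented for illustration:

```python
# Toy planning model: pick quantities from capacitated plants to meet
# demand at minimum cost.  Real planning models hand this to a MILP
# solver; enumeration just shows the common elements (nodes, capacities,
# unit costs, and a minimize-cost objective).  Data is made up.

from itertools import product

plants = {"PlantA": {"capacity": 60, "unit_cost": 5.0},
          "PlantB": {"capacity": 80, "unit_cost": 7.0}}
demand = 100

def min_cost_plan(plants, demand, step=10):
    best = None
    ranges = [range(0, p["capacity"] + 1, step) for p in plants.values()]
    for qtys in product(*ranges):
        if sum(qtys) != demand:
            continue  # only consider plans that exactly meet demand
        cost = sum(q * p["unit_cost"] for q, p in zip(qtys, plants.values()))
        if best is None or cost < best[0]:
            best = (cost, dict(zip(plants, qtys)))
    return best

print(min_cost_plan(plants, demand))
# (580.0, {'PlantA': 60, 'PlantB': 40}) -- fill the cheap plant first
```

Swapping the objective to maximize profit, or adding warehouses and transshipment nodes, changes the data and objective function but not the basic shape of the model.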

Balancing flexibility and standardization brings the following benefits:

  • Flexibility allows for
    1. Modelling essential constraints that may be unique to a particular client’s environment but are required to get to a feasible solution that the client can actually implement.
    2. Changing the tool over time as the business needs change.
  • Standardization allows for
    1. Faster / cheaper implementation.
    2. Faster / cheaper support.
    3. Ease of training when moving to a different role but using similar tools.

Having a hybrid of flexibility with standardization is the best of both worlds!

In the wake of the recent election, there has been a lot of talk about the types of changes we’ll be facing over the next few years.  The continuing analysis of the election and a recent plane ride have given me a good refresher course on some of the critical factors that enable a successful change in an organization or doom it to failure.

The day after the election, I was traveling home to Houston and I took the advice that I give to my two college-age sons: if you really want to know what’s going on with a particular issue, make sure you get at least two different points of view; the truth is likely to fall somewhere in the middle.  So I bought a copy of The New York Times and The Wall Street Journal to read their analyses of the election results.  Needless to say, the newspapers had some fairly different interpretations of the same sets of facts.  Sometimes it was a matter of drawing different conclusions from the same set of data; other times, which facts were emphasized, and in what order, would lead a reader in two different directions depending on which paper I was reading.

The following week, I was traveling home after our project team delivered the final presentation to our executive sponsors.  Our team had recommended a number of changes to the client’s supply chain; some fairly straightforward and others that would require a significant change in culture.  As it happens, I sat next to a gentleman who helps companies change cultures.  We had a good conversation, helped by our third row-mate, who bought drinks for the row, about a number of different things.  However, one thing that stuck with me was his premise that an organization’s results are determined by its culture.  In this organizational model, actions drive results, but beliefs drive actions.  Thus, to change the results in a company, one must change the beliefs held by the people who impact the results.

Once again, I was reminded that the key to a successful change is the people who run the process.  If they are not engaged and if they don’t believe that the change will be a good one, you’re in for a very rough ride.  Further, when trying to understand the current beliefs that drive the actions that drive the success of your change, it’s best to seek out more than one source of information.

Here at Profit Point, we typically put in a fair amount of effort up front to scope out a project together with our client.  This typically helps us and our client to set appropriate expectations and develop mutually agreeable deliverables.  These are key to project success.  But another key element to project success is getting good quality data that will allow our clients to make cost effective decisions from the analysis work we are doing or the software tool we are implementing.

Decision support models are notorious data hogs.  Whether we are working on a strategic supply chain network design analysis project or implementing a production scheduling tool or some optimization model, they all need lots and lots of data.

The first thing we do (which is usually part of our scoping effort) is identify each of the data types that will be required and what the source of each will be.  To do this we start with the decisions that need to be made and the data required to make them successfully.  From there we identify whether the data currently exists in some electronic form (such as an MRP system) or whether it will have to be collected and entered into some system (say a spreadsheet or database program), and then figure out how the data will get into the tool we are developing.

Second, we try to get sample data from each data source as early as possible.  This allows us to see if the assumptions that were made as part of the scoping effort were valid.  There is nothing like getting your hands on some real data to see if what you and your team were assuming is really true!  Often there are some discoveries and revelations that are made by looking at real data that require design decisions to be made to be able to meet the project deliverables.

Third, to help with data validation we find it extremely helpful to be able to visualize the data in an appropriate way.  This could take the form of graphs, maps, Gantt charts, etc., depending on the type of data and model we are working on.  On a recent scheduling project, we had the schedulers review cycle times in a spreadsheet, but it wasn’t until they saw the data in Gantt chart form that they noticed problems with the data that needed correcting.
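Even a crude, text-based Gantt rendering can surface data problems that a table of cycle times hides. The sketch below, with invented schedule data, makes an overlap between two batches immediately visible:

```python
# Minimal text-based Gantt rendering of schedule data.  In practice we
# use interactive Gantt charts, but even this crude picture exposes
# issues (overlapping tasks, gaps) that a table of numbers hides.
# Times are in hours; the data is illustrative.

tasks = [("Batch 1", 0, 4), ("Batch 2", 4, 9), ("Batch 3", 8, 12)]

def gantt(tasks, width=12):
    lines = []
    for name, start, end in tasks:
        bar = " " * start + "#" * (end - start)  # offset, then duration
        lines.append(f"{name:8s}|{bar.ljust(width)}|")
    return "\n".join(lines)

print(gantt(tasks))
# Batch 3 visibly overlaps Batch 2 -- a data error easy to miss in a table
```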

Identifying data sources, getting data as early as possible and presenting the data in a visualized form are absolutely required to make a project successful.  Omitting any of these steps will at least add to the project cost and / or duration or possibly doom the project to failure.

Profit Point has been helping companies apply mathematical techniques to improve their business decisions for 20 years now, and it is interesting to review some of the advances in technology that have occurred over this time that have most enabled us to help our clients, including:
• The ability for companies to capture, store and access increasingly larger amounts of transaction and anecdotal data that quantify the behavior and motivation of customers, manufacturers, suppliers and other entities
• The improvement in analytical capabilities that help make optimized choices, in areas such as solving mixed-integer optimization problems, and
• The improvement of computing technology, allowing us to perform calculations in a fraction of the time required just a few years ago

A recent post on the Data Science Central website highlights the use of advanced techniques based on these advances by on-line marketplace Amazon, which is generally acknowledged as one of the most tech-savvy companies on the planet. Twenty-one techniques are listed that Amazon uses to improve both its day-to-day operations and planning processes, including supply chain network design, delivery scheduling, sales and inventory forecasting, advertising optimization, revenue / price optimization, fraud detection and many others. For a complete list see the link below:

Like our customer Amazon, Profit Point is committed to using these techniques for the benefit of our clients – we have been concentrating on implementing business improvement for our clients, including optimization in various forms, since our very beginning. Are you, like Amazon, using the best methods to seize the opportunities that are available today?

Over the past week I’ve had two experiences that made me think about what’s required for a successful Organizational Change.  The first was our CSCMP Roundtable tour of a family-owned food distribution company that had built a large, successful regional business by leveraging their founder’s focus on customer satisfaction and valuing his employees as cornerstone of the business.  The company had recently been purchased by another family-owned company and was in the midst of a successful wholesale change in IT systems and work processes.  Having seen many organizations struggle with such a large change, I asked our host about the secret of their organizational change.  In a word, he said, “Culture.”

Immediately after the new owner had completed the purchase, they spent a lot of time with the employees reassuring them that the values of the company wouldn’t change even though the way that they did their jobs might change dramatically.  In the end, the two companies’ cultures valued the same things: customer satisfaction and their employees.  With that in mind the change management effort began as an inclusive effort with a clear set of goals for the new work processes.  Not that there weren’t any bumps in the road, but the two once-separate organizations were able to push towards the new way of doing business as a common team.

So what does that have to do with a bike ride on a windy day?  That’s where the second experience of the week comes in.  Over the weekend, I completed a two-day, 176-mile ride around Galveston Bay.  Just like a good organizational change-management effort, the first day was preceded by a lot of training and preparation and accompanied by excitement and adrenaline.  We had some tough slogs, particularly one 21-mile stretch directly into a 15 mph headwind.  It was grueling, but we knew it was coming and grunted our way through it.  But then came the pay-off: the headwind became a tailwind as we sailed down the coast on our way to the finish line for Day 1.  Again, like an organizational change, we had some tough slogs, but our preparation paid off and we were able to celebrate as we hit our first big milestone.

The second day of the ride promised to be a tough one.  We had already ridden 97 miles on the first day, winds were blowing at almost 20 mph and were forecast to be mostly in our face all the way back to our starting point on Day 1.  I knew it would be a challenging day, so I decided that speed was not very important; just finishing.  In addition, I knew that I needed to find some like-minded riders so we could work together into the wind.  Luckily fate smiled upon me and I found a couple of riders that were taking the same approach to the ride.  We teamed up, taking turns pulling from the front so that the other two could draft and waiting for each other when we had flat tires.  We also got to celebrate when we turned away from the wind and had it at our backs for short stretches before turning into it again.  The parallels to a successful organizational change jumped out at me.

  • We made a realistic assessment of the challenges ahead
  • We set goals that were within our reach, given the challenges
  • We found allies with the same mind-set and worked as a team towards a common goal
  • We celebrated success when we had a downwind leg
  • We finished as a team

I hope to see you out on the road, be it organizational change or riding your bike into the wind.  Good luck, and let me know if you need someone to help pull on the next ride.


Are new products placing strain on your warehouse space and warehouse operations? Are increases in revenue from new products being offset by higher supply chain costs? Are you seeing increasing costs for the disposal of discontinued products? Are you experiencing significantly higher costs for specialty SKUs for specific channels or specific customers? If you answered yes to any of these questions, then it may be time to consider a process of optimizing your SKU portfolio.

SKU optimization is a critical annual process that companies need to develop and execute on an ongoing basis. SKU optimization is the combination of analysis and the realities of the competitive marketplace used to determine the merits of adding, retaining, or deleting items from a company’s product assortment.

It’s simply a systematic and consistent business process for analyzing, evaluating and deciding on how to manage your SKU portfolio in order to better align with your organization’s overall strategies, objectives and goals. An effective SKU optimization program lays the groundwork for important initiatives, such as capacity planning and price optimization. Benefits of this integrated, cross-functional business process include improved profitability, increased product availability (lower out of stocks) and increased labor and asset productivity.

Why is SKU Optimization critically important? Research shows customers only use approximately 340 unique items per year in their households (down from 390 last year) from a pool of more than a million items sold (2014 AMR Research, Gartner). Many consumer product companies have seen an explosion in SKUs over the past decade but don’t have a process for evaluating the merits of individual SKUs. The vast majority of all SKUs become liabilities to an organization at some point in their individual lifecycles.

Smart, proactive companies establish a consistent and repeatable process to identify when that inflection point is reached and execute plans to capture as much profit as possible before discontinuing the item, subject to marketplace constraints and competitive factors. If you’re interested in learning more about SKU optimization, you may either download or view the following presentation on this topic.


Steve Westphal
EDGE Network

Network design analysis is one of our specialties at Profit Point. It answers such questions as:
• how many facilities a business needs,
• how large they should be and where they should be located, and
• how they should change over time.
We have performed this type of analysis for a range of companies across multiple industry types and geographical regions, and we have developed our own network design-focused software package to help us do this type of study. (And we teach folks how to use the software as well, so they can answer their own network design questions, if they want to pursue that.)

Our modeling “toolbox”, our Profit Network software, is designed to be flexible and data-driven, so that the user can focus more attention on a key part of the supply chain where the questions must be answered, without having to define more detail than is really desired in other areas of the supply chain.

One of the key elements in many of the models we or our clients construct is the bill of materials. This data specifies the materials that are required to produce goods along the supply chain, be they intermediate materials or finished goods. For instance, if you are making a finished good such as a loaf of bread, the bill of materials would specify the quantities of flour, yeast, salt and other ingredients that would go into a batch.

To get trustworthy results from a model, it must require that the bill of materials (BOM) data be defined, and be used, in deriving the solution. (In some models we have encountered, the BOM is just a suggestion, or products can be created from thin air if the BOM data is not defined.)

The BOM logic must also be able to capture the reality of a situation. The BOM may need to vary from one machine to another within the same facility. Or it might need to vary over time – as an example, when agricultural or dairy products are ingredients to a manufacturing process, the ingredients might have different characteristics over the seasons of the year, thus requiring different input quantities over time to produce a consistent, standardized output.
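A sketch of what time-varying BOM data might look like, as in the dairy example above: input quantities per unit of output differ by season. The product names, ingredients and quantities below are invented for illustration:

```python
# Hypothetical time-varying bill-of-materials data: the recipe for a
# product depends on the season, so input quantities per unit of output
# change over the year.  Names and numbers are illustrative only.

bom = {
    # (product, season) -> {ingredient: qty per unit of product}
    ("cheese", "summer"): {"milk": 10.0, "culture": 0.5},
    ("cheese", "winter"): {"milk": 9.0, "culture": 0.5},
}

def ingredient_requirements(product, season, output_qty):
    """Scale the seasonal BOM to the planned output quantity."""
    recipe = bom[(product, season)]  # fail loudly if the BOM is undefined
    return {ing: qty * output_qty for ing, qty in recipe.items()}

print(ingredient_requirements("cheese", "winter", 100))
# {'milk': 900.0, 'culture': 50.0}
```

Note the lookup raises an error rather than silently creating product from thin air when BOM data is missing, which is exactly the model discipline argued for above.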

We work closely with our clients to ensure that our software is matched to their needs, and that it gives them the flexibility they need as their businesses change.

We just finished the fall soccer season in my home.  I was thinking about watching my children play soccer when they were younger after a conversation with one of our consultants.  He had just come back from visiting a prospective client where he was doing an assessment of their supply chain work processes and systems.  Speaking frankly, this prospective client really did not have well-defined work processes and certainly didn’t have systems implemented to enable good work processes.  Mostly they seemed to run from one fire to the next and tried to do their best in tamping out the flames enough to be able to move on to the next crisis.  Our consultant came back feeling dizzy from observing how they operated.

When my kids were younger and playing soccer, their style of play could be characterized as “kick and run”.  They either didn’t really understand the concept of trying to possess the ball or couldn’t execute this strategy.  If you have the ball, you have the opportunity to score.  If your opponent does not have the ball, they can’t score.  It’s as simple as that.  After watching my kids play on Saturday mornings with this “kick and run” style, I would really enjoy going to see a local college team play.  They have won numerous national championships and play at a very high level.  They understand and are able to execute this “possess the ball” style of play.  It was always helpful to see how the game should be played and get my perspective straightened out.

Perhaps the “possessing the ball” analog in the operation of a supply chain is “possessing the key information.”  In soccer, you have to get the ball to your attackers at the right time and in the right place in order to score.  Likewise, in the supply chain, you have to get the right information to the right people at the right time to beat the competition.  If you are feeling dizzy from fighting fire after fire (playing “kick and run”) in your supply chain operations and don’t seem to be making any progress on making things better and more stable, it would be our privilege to help assess where you are at and work together to move your organization toward operating in championship form.

In developing a supply chain network design there are many criteria to consider – including such factors as the impact of the facility choices on
• cost of running the system,
• current and future customer service,
• ability to respond to changes in the market, and
• risk of costly intangible events in the future
to name a few.

Frequently we use models to estimate revenues / costs for a given facility footprint, looking at costs of production, transportation, raw materials and other relevant components. We also sometimes constrain the models to ensure that other criteria are addressed – a constraint requiring that DCs be placed so that 80% of demand is within a day’s drive of a facility, for instance, might be a proxy for “good customer service”.
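The “80% of demand within a day’s drive” proxy can be checked directly for any candidate footprint. The sketch below, with invented customers, sites and drive times, computes the fraction of demand covered:

```python
# Sketch of the "80% of demand within a day's drive" service proxy:
# given a candidate set of open DCs and customer demands with drive
# times, compute the fraction of demand covered.  All data is invented.

def coverage(footprint, demands, drive_times, max_hours=10.0):
    """Fraction of total demand within max_hours of some open DC."""
    total = sum(demands.values())
    covered = sum(q for cust, q in demands.items()
                  if any(drive_times[(dc, cust)] <= max_hours
                         for dc in footprint))
    return covered / total

demands = {"C1": 50, "C2": 30, "C3": 20}
drive_times = {("DC1", "C1"): 4, ("DC1", "C2"): 12, ("DC1", "C3"): 8,
               ("DC2", "C1"): 9, ("DC2", "C2"): 6, ("DC2", "C3"): 15}

print(coverage({"DC1"}, demands, drive_times))        # 0.7 -> misses the 80% target
print(coverage({"DC1", "DC2"}, demands, drive_times))  # 1.0 -> passes
```

In an optimization model this check becomes a constraint on the site-selection variables rather than a post-hoc calculation, but the arithmetic is the same.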

Some intangibles, such as political risk associated with establishing / maintaining a facility in a particular location, are difficult to measure and include in a trade off with model cost estimates. Another intangible of great interest for many companies, and that has been difficult to make tangible, is water risk. Will water be available in the required quantities in the future, and if so, will the cost allow the company to remain competitive? For many industry groups water is the most basic of raw materials involved in production, and it is important to trade off water risk against other concerns.

As I wrote in a previous blog published in this forum,

There are several risks that all companies face, to varying degrees, as global water consumption increases, including
• Physical supply risk: will fresh water always be available in the required quantities for your operations?
• Corporate image risk: your corporate image will likely take a hit if you are called out as a “polluter” or “water waster”
• Governmental interference risk: governmental bodies are becoming increasingly interested in water consumption, and can impose regulations that can be difficult to deal with
• Profit risk: all of the above risks can translate to a deterioration of your bottom line.

The challenge has been: how to quantify such risks so that they can be used to compare network design options.

Recently a post entitled “How Much is Water Worth” on LinkedIn highlighted a website developed by Ecolab that offers users an approach to monetization of water risks. This website allows the user to enter information about their current or potential supply chain footprint – such as locations of facilities and current or planned water consumption – and the website combines this information with internal information about projected GDP growth for the country of interest, the political climate and other factors to calculate a projected risk-adjusted cost of water over the time horizon of interest.

This capability, in conjunction with traditional supply chain modeling methods, gives the planner a tool that can be used to develop a more robust set of information that can be used in decision-making.
For more details visit the website

Excel has many advocates as a system for the development and implementation of supply chain processes. At first glance, it is extremely attractive: a flexible application that many people know how to use. It is portable, and the cost of ownership is negligible.

However, this ease of use and ubiquity comes at a cost. As with any product that offers an initially gentle learning curve, Excel becomes extremely complicated to use and maintain in mid- and long-term development cycles. Problems include:

Performance – Excel runs on a single desktop or laptop. As such, its ultimate performance is limited by the device and it cannot take advantage of distributed (cloud) computing.

Development Sprawl – Development of an application in Excel is usually ad hoc, with a single expert developer working part-time on the project. Adequate documentation is rare and version control non-existent.

Code and Data Commingled – Is that a formula or a data cell? There is almost nothing to protect data cells from being overrun by formula cells and vice versa.

Isolation of Calculated Results – These are stored in just one place in the workbook and are often the result of other cascading calculations. Thus changing an intermediate step can cause widespread changes throughout the workbook.

Scoping – What formulae have access to a given cell? Which ones can change its contents (and how would you know)? An object oriented language such as C# utilizes a core principle of encapsulation, which is all about the exposure (and protection) of data. Although spreadsheets have rudimentary methods of data scoping (protected cells, separate worksheets), they are seldom used.

Transactions – Excel does not have a built-in mechanism for recording changes to data and formula cells. This means that changes do not comply with ACID standards. If a large spreadsheet is in the process of updating and Excel crashes, can you reliably restore it to its pre-update state?

Poor Security in Older Versions of Excel — From Wikipedia: “Currently, the 40-bit key protection used in Office 97–2003 can be easily cracked by the password-hacking software. The 128-bit key AES protection employed in Office 2007–2010 can still be considered as a relatively secure one. At the moment, however, cloud computing facilities are capable of unlocking a substantial number of the files saved in the Office 2007–2010 format.”
Corporate password and security policies may not apply to desktop applications such as Excel, meaning that strength, frequency of password change and other standards are not enforced.

Flexibility and Scaling – If the supply chain process changes, how can the code and business logic be changed with assurance? Excel has many “nooks and crannies” in which to hide data and formulae.

For small and even mid-sized solutions, Excel has many strengths and advantages. However, the same elements that make Excel so attractive in these situations prove to be critical flaws for large-scale projects. Companies with sizable logistical planning requirements need to look to more formal and distributed solutions for long-term sustainability.

Many of our activities at Profit Point are focused on helping clients in identifying and implementing changes that improve the efficiency of existing supply chain networks, ranging from planning to operations and scheduling.  In the short term we are usually trying to find ways to use existing capabilities more effectively, but as you look out over longer time horizons supply chains evolve to develop new links, and these must be considered as you plan.

One instance of this evolution was described by my colleague, John Hughes, who recently wrote about the rise of a “New Silk Road” – a rail network stretching through Western China, Kazakhstan, Russia and Belarus to Europe – used for transporting manufactured goods from Asia to meet demand in Europe.

But Asia has a complementary demand that must be met for their manufacturing systems to function, the demand for energy to power their factories and cities.  The growing worldwide demand for energy, and for faster routes to market, is opening up another new link in the global trade routes – the Northern Sea Route, a maritime route connecting Pacific ports with Europe via the Arctic.

Lawson Brigham, professor of geography and Arctic policy at the University of Alaska Fairbanks, was recently quoted on the website as saying “What’s really driving the Northern Sea Route is global commodity prices and natural resource development, particularly in Russia.”

The northern reaches of the earth are currently hotbeds of energy development, and much of the activity is focused on adding Liquefied Natural Gas (LNG) production capacity.  Projects are on-line or in progress stretching from the North Slope in Alaska to the Yamal Peninsula in Siberia to Hammerfest in Norway.  The Northern Sea Route offers quicker shipments of many of these supplies to major Asian ports, shaving ten to twenty days off one-way transit times from Russia and Norway to ports in Korea and China, compared to routes through the Suez Canal.

Climate change has made these routes generally ice-free for several months of each year, and thus more cost effective, but ice-strengthened cargo ships, with icebreaker support, are still required to keep the route open in the colder months, thus driving up the costs.

Supply chain planning activities on a global scale will over time need to expand to consider the potential impact of these types of shipping options.  Keep an eye out for this and other new links in the global chain as they become available – change is inevitable.


For a more information on this route see articles like these:


This quarter’s Supply Chain Quarterly features an article by Dr. Alan Kosansky and Ted Schaefer entitled “A Fresh Approach to Improving Total Delivered Cost.”

“Most companies calculate total delivered cost (TDC) based on inaccurate and outdated assumptions. Using optimization technology to more accurately forecast TDC by product and customer will help them to improve both their supply chain planning decisions and their costs.

Profitability is the engine that drives all successful businesses. To manage profitability, a company must understand and have good control of both its revenues and its costs.

For a long time, companies have had a good understanding of the revenue side of the business at a detailed customer and product level. It is only in recent years, however, that they have begun to understand their costs at the same detailed level by customer and product. To gain that insight, many companies use total delivered cost (TDC)—the complete cost of sourcing, producing, and delivering products to customers. TDC, in turn, has become a critical metric in guiding supply chain planning decisions.”

Read the complete article on Supply Chain Quarterly.

Total Delivered Cost

In recent weeks, I have been thinking about testing our applications, like our popular Profit Network or Profit Vehicle Planner.  When we test, we run data sets that are designed to stress the system in different ways, to ensure that all the important paths through the code are working properly.  Each time we test, our applications get better and better.  There are many good reasons to test; the most important is knowing that an improvement in one part of the code does not break a feature in a different part of the code.


I have been thinking about how we could test our code a bit more, and the means by which we could do that. I have been reading about automated testing, and its benefits. They are many, but the upshot is that if the testing is automated, you will likely test more often, and that is a good thing.  To automate application testing requires the ability to churn out runs with nobody watching. And to do that, the application needs to be able to be kicked off and run in a way that there are no buttons or dialog boxes that must be manually clicked to continue. There can be no settings that must be manually set, or information reviewed to decide what to do next. In addition, the application must then save the results somewhere, either in the instance of the application, or to a log file, or to a database of some sort. Then finally, to really be testing, the results must be compared to the expected results to determine the pass/fail state of the test. This requires having a set of expected results for every test data set.


In looking at the process above, I see numerous similarities to the process used to run a sensitivity analysis, in that many runs are typically made (so automation is a natural help) and the results need to be recorded.  Sensitivity analysis is a typical process for users of our Profit Network tool, and our Profit Planner and Profit Scheduler tools.  An additional step in sensitivity analysis, however, is that you may desire to change the input data in a systematic way (say Demand +5%, and Demand -5%), and to the extent that it is indeed systematic, this too could be folded into the automation.  The results analysis is different too, in that here you would like to look across the final sets of results at the differences, while in testing you just compare one set of test results to its expected results.  I can foresee difficulty in automating the data changes, since each type of data may need to be changed in a very specific way.  Nevertheless, even if the data changes are manual, they could be prepared ahead of the run, and the runs themselves could be grouped in a batch run to generate the results needed for a sensitivity analysis.

Constructing a harness that lashes up to an application, where you can define the number of runs to be made, the settings for each run, the different data sets to be used, and the output location for the results to be analyzed, would be useful not only for testing but also for the type of sensitivity analysis we do a lot of here at Profit Point.

I am going to encourage our developers to investigate this type of system harness, one that can talk to and control our applications so that they can be run automatically and have their results stored in a data store for either testing or sensitivity analysis.

Jim Piermarini  |  CEO Profit Point Inc.


More Links in the Chain to Prosperity

December 16th, 2013 5:10 pm Category: Supply Chain Improvement, Supply Chain Planning, by: Gene Ramsay

Great strides have been made in transportation infrastructure in the last 150 years; such feats as the construction of the Suez and Panama Canals, and the development of long-distance railroad and highway networks, have reduced cost and fostered trade for the world as a whole. And more changes are coming on line today, or are in the pipeline, including significant additional throughput capacity in Panama in 2015, and the gradual development of added rail capacity connecting Asia and Europe. (For more information on the latter, see the article The New Silk Road by my colleague John Hughes.)

Another area where additional transport capacity would greatly benefit trade, and potentially bring a significant improvement in living standards, is central Africa. This area has immense natural resources, such as the copper and cobalt found in the Democratic Republic of the Congo and Zambia, and the coffee grown in Uganda, that require both land and sea transport to reach major manufacturing and consumer areas. A number of projects are under construction or consideration to bring change to the supply chains in that part of the world.

An article in The Economist earlier this year looked at barge and rail transport projects underway in Egypt, Guinea, Ghana and Angola, and profiled plans for future infrastructure improvements by firms such as Citadel Capital of Cairo.

The strategic thinkers at consultancy Stratfor have focused a number of their recent research articles on the different options available to better connect the rich ore in the landlocked central area of Africa to world markets. The South African port of Durban has the best developed cargo handling facilities among the ports in the southern part of the continent, and currently attracts large volumes of minerals, but this requires a long road trip from Katanga in the Congo through Zambia and Botswana to South Africa. Durban is substantially further from the sources than ports such as Dar es Salaam in Tanzania, or Walvis Bay in Namibia, but the road route to Durban is relatively safe and stable, whereas paths to other ports suffer from unreliable or otherwise inadequate infrastructure, or the requirement to switch between rail and truck en route from mine to coast. But that is perhaps just a short term issue – given the potential rewards, several plans to improve these routes are in various stages of planning or development.

As these changes become visible on the horizon, and gradually take place, companies would be well-advised to evaluate their options, and plan for these changes – strategic supply chain planning allows management to keep their firms at the forefront of the business world.

One of the pleasures of working at Profit Point is having the occasion to reflect on and write a blog about a subject that is of interest to me. Taking time to reflect on what is important, especially now around Thanksgiving and the holiday season, has rewards of its own; but the business we are in, helping people make their supply chains better, brings rewards unique to our profession. I am pleased and thankful to be a part of it.

At the grand scale, improving supply chains improves the very heart of most businesses. It reduces waste, reduces cost, increases efficiency and profit, and reduces the detrimental impact of commerce on our world, including reducing our carbon footprint and landfill waste as well as promoting the most efficient use of our natural resources and raw materials. Profit Point routinely has an enormous impact on the business of our customers, and I personally am pleased and thankful to be a part of it.

On a more human level, the people at our customer organizations who interact with Profit Point personnel on our projects come away with a profound sense that they can make a difference in their company; the impact of our network studies, scheduling work, and other projects demonstrates how a better business process can be rolled out and actually change the business for the better. Once involved in a successful project, many of these people move up in their organizations to lead other successful business change projects. Nothing succeeds like success! I am pleased and thankful to be a part of it.

Reflecting on our part in the business world as small actors on a large stage, it is rewarding to be able to know that we have made some difference.

Wishing you the same peace of mind this holiday season, Jim Piermarini, CEO Profit Point Inc.

The primary goal of Big Data Analytics is to help companies make better business decisions by enabling analysts or other users to analyze huge volumes of transaction data as well as other data sources that may be left untapped by conventional business intelligence programs.

Ask yourself, does your company:

  • Use advanced analytical techniques (e.g., simulation, optimization, regression) to improve decision-making?
  • Routinely use data visualization techniques such as dashboards to assist users or decision-makers in understanding complex information?
  • Combine and integrate information from many data sources for use in decision-making?
  • Use systems that automatically make operational changes, based on performance criteria and business rules, in response to signals from sensors?
  • Use systems to give you the ability to decompose information to help root cause analysis and continuous improvement?

If your answers are yes then your company is on the road to implementing Big Data Analytics. If your answer is no, then read on to learn more.

We see companies develop and employ enterprise applications that use advanced analytical techniques to explore Big Data and generate optimal supply chain plans to improve decision making. These applications allow management to visualize their supply chain before and after optimization, helping to identify areas of risk, recommending changes, and allowing management to adjust these plans in a near real-time environment.

The most advanced applications use sophisticated mathematical algorithms, typically mixed integer programming optimization models, to analyze their Big Data and generate optimal schedules and supply chain network designs. In addition, these tools allow the user to modify those plans in real time to align with their tactical or strategic goals.
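As a toy illustration only: the sketch below brute-forces a tiny production-planning decision in pure Python. A real application would hand a model of this shape (an objective plus constraints) to a mixed integer programming solver; the product names, numbers, and SKUs here are made up:

```python
from itertools import product


def best_plan(profit_per_unit: dict, capacity: int, max_units: int):
    """Enumerate integer production quantities and keep the most profitable feasible plan."""
    skus = list(profit_per_unit)
    best, best_profit = None, float("-inf")
    # Try every combination of quantities up to max_units per SKU.
    for qtys in product(range(max_units + 1), repeat=len(skus)):
        if sum(qtys) <= capacity:  # shared plant capacity constraint
            profit = sum(q * profit_per_unit[s] for q, s in zip(qtys, skus))
            if profit > best_profit:
                best, best_profit = dict(zip(skus, qtys)), profit
    return best, best_profit
```

Brute force only works at toy scale; a MIP solver finds the same kind of optimum for networks with thousands of SKUs and constraints.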

For example, in S&OP optimization some of these advanced technologies use color-coded information, which allows users to identify shortages or constraints within their production operations by SKU. The user then has the ability to point and click to drill down for further data analysis and quickly make the changes needed to modify the plan. The key to this type of system is to provide actionable insights into the global supply chain, and to allow worldwide operations the ability to make changes or corrections to their production process at any time, 24/7. This process of examining Big Data using advanced technologies provides global supply chain plan visibility and an improved decision-making process. Contact us to explore a professional approach to taking advantage of Big Data Analytics.
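A minimal sketch of the kind of SKU color coding described above, with a made-up 10% headroom threshold (a real tool would derive shortage and constraint flags from the optimization model itself):

```python
def flag_skus(plan: dict) -> dict:
    """Classify each SKU as 'red' (shortage), 'yellow' (tight), or 'green' (covered)."""
    flags = {}
    for sku, (supply, demand) in plan.items():
        if supply < demand:
            flags[sku] = "red"      # shortage: drill down here first
        elif supply < 1.1 * demand:
            flags[sku] = "yellow"   # less than 10% headroom over demand
        else:
            flags[sku] = "green"    # comfortably covered
    return flags
```

The red/yellow/green result is what drives the point-and-click drill-down: a planner starts from the red SKUs and works outward.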

Profit Point, a leading supply chain optimization firm, adds total delivered cost and margin at the customer location-product level of detail to its supply chain network design software.

Supply Chain Network Design Software

Profit Network™ – Supply Chain Design Software

Profit Point, the leading supply chain optimization consultancy, today announced the release of an update to Profit Network™, a supply chain network design software that is used by supply chain managers all over the world to gain visibility into the trade-offs they will face when designing or optimizing a global supply chain. In addition to several other new enhancements, Profit Network now allows users to analyze and report on the total delivered cost and the resulting gross profit margin for all products delivered to each customer location.

“With the ever-increasing availability of granular data across the supply chain, many of our clients have expressed a strong interest in analyzing and reporting on the total delivered cost of a single product or set of customer products,” said Alan Kosansky, Profit Point’s President. “Previously, it was quite a challenge to understand how costs accumulate over time from raw material procurement through manufacturing, inventory, transportation and customer delivery. Now our customers are able to see the true total cost for each unit of product delivered to each customer. This will be a powerful tool in helping them evaluate their product and customer portfolios.”

In addition to total delivered cost, Profit Network now also enables more control over source-destination matching, as well as over inventory levels, by establishing minimum and maximum numbers of days of inventory demand.

“Profit Network software has been helping Fortune 500 companies around the world build more robust and profitable supply chains for more than 10 years,” said Jim Piermarini, Profit Point’s CEO and CTO. “Over that time, the dramatic increase in data availability across the supply chain has provided us tremendous opportunities to solve unique and critical problems in a variety of supply chain networks.”

In addition to Profit Network, Profit Point’s line of supply chain software also includes Distribution and Vehicle Planning, Sales and Operations Planning (S&OP), Production Planning, Scheduling and Order Fulfillment software.

To learn more about how Profit Network can help you analyze and improve your Supply Chain Network Design, call us at (866) 347-1130 or contact us here.

About Profit Point

Profit Point Inc. was founded in 1995 and is now the leading supply chain software and consulting company. The company’s team of supply chain consultants includes industry leaders in the fields of infrastructure planning, green operations, supply chain planning, distribution, scheduling, transportation, warehouse improvement and business optimization. Profit Point has combined software and service solutions that have been successfully applied across a breadth of industries and by a diverse set of companies, including Dow Chemical, Coca-Cola, Lifetech, Logitech and Toyota.

Supply Chain Survey 2013:
Gaining Competitive Advantage

If you’re reading our blog, you are probably someone who is deeply interested in supply chain improvement. So we’d like to invite you to participate in this brief survey. And in return, we will send you exclusive, early access to the results of the survey along with our analysis.

Your insights and experiences are very important to us. And we are hosting the survey on a trusted, 3rd-party site so your responses will remain completely confidential. The survey is relatively short and should take only 3-4 minutes to complete. Please take a few moments to complete the Supply Chain Competitive Advantage Survey.

Start the Supply Chain Survey:

Gone are the days that supply chain was merely an expense. These days, savvy decision makers are gaining advantages over the competition by leveraging the data and tools available to them. In this survey, we will be exploring the methods, tools and processes that supply chain professionals utilize to gain competitive advantage via their supply chain.

Building a competitive advantage across the supply chain starts with an individual that is unwilling to accept the status quo. Traditionally, these team members might be considered supply chain innovators. We like to refer to this persona as a supply chain “Disruptor”.

Disruptors aren’t nearly as menacing as they sound. They don’t disrupt the people they work with, but rather disrupt a stale, outdated way of thinking and acting. They are the outliers in corporate America. They are not content to maintain the status quo. Making nominal improvements to keep up with industry standards just isn’t that interesting to the disruptor. No, the disruptor sees a very different vision of the future and acts accordingly. The disruptor is not trying to stay with the pack. Their plan is to leave the pack in the dust! Better still, they’re not concerned with the pack, but instead obsess about their customer.

In this series of posts, we’ll discuss what it takes to become a supply chain disruptor, or innovator if you prefer. But let’s start with some basic traits. Here’s a list of dos and don’ts that seem to be common, although not exclusive, to disruptors:

  • Do
    • Recognize that the supply chain can be a competitive advantage
    • Rely on smart people to power the supply chain
    • Obsess about the processes that define the supply chain
    • Rely on technology to improve speed and decision-making
    • Have a “modular” way of thinking, i.e., see the whole picture as one unified system, but be willing to break up the pieces to improve performance
    • Acknowledge that the process and the enabling technology are only as good as the people who will implement and live with them
    • Rely on data-driven metrics
  • Don’t
    • Blindly favor cost-reductions over improving customer service
    • Accept limitations that others do
    • Become complacent with past performance
    • Assume that monolithic, standards-driven technology is good enough for every aspect of the supply chain
    • Believe vanity metrics

In sum, this list embodies two key concepts that we at Profit Point hold to be supply chain truisms:

  • People, Process, Technology – the essential ingredients of any successful supply chain, presented in order of importance.
  • Manage by Metrics – As Peter Drucker suggested, “management by objectives” leads to improvement. Gather and analyze the right data to generate the best decisions.

In future posts, we’ll dive in deeper and provide case study examples to better explain some of the traits that define the disruptor. If you have any ideas or suggestions that you’d like to explore, feel free to leave a comment below.

Are you a supply chain disruptor? To learn more about Profit Point’s Supply Chain Optimization Services, please contact us.

Building applications, especially custom ones, carries with it the burden of answering the question: Does this do what the customer wants?

With complicated systems that have many interacting features and business rules, answering this question can be daunting. In fact, evaluating the answer can be daunting too, from the perspective of the customer. Having the sales guy check some boxes in a questionnaire, or watching a demo, just doesn’t leave you with the assurance that the application will handle all the business requirements, from either perspective, the vendor’s or the customer’s. Everyone I have spoken to who has sold complex software, or who has participated in the purchasing process for software, has expressed the same doubt. They are just not sure that the tool will be a good fit. As we all know, that doubt does not always prevent the purchase of the software, as each organization has its own level of risk tolerance and trust in the vendor’s brand or reputation. Often these other considerations can outweigh the amorphous doubt that some folks might feel. How can one quantify that doubt? Frankly, it’s a quandary.
This thought got us at Profit Point thinking… Wouldn’t it be great if there were another way to evaluate the goodness of fit of an application, or the appropriateness of its parameter settings, to the business needs of an organization? Wouldn’t it be great if there were a way to eliminate (or greatly reduce) the doubt and replace it with facts? Either a business rule is obeyed or it is not. Either a decision is made according to the requirements or it is not. Let’s eliminate the doubt, we thought, and the world would be a better place (well, a little bit, anyway).

There are many processes for testing an application as it is being developed, involving writing test scripts and evaluating the results. These are all based on testing little pieces of code, to ensure that each function or subroutine does what it should for each case of input data. These processes work fine in our opinion, but only when the subroutine or function can be considered independently from the others. When the system has functions that interact heavily, this approach doesn’t reduce the doubt that the functions may conflict or compete in a way that makes the whole system suffer. How then to evaluate the whole system? Could we treat the entire application as one black box, run the important business cases, and evaluate the results? This is exactly what we have done, with the effect of reducing the doubt about the suitability of the application for a business to zero.
With several of our clients we have worked out what seems to be a great process for testing a complex software solution for suitability to the business requirements. In this case, the detailed function-level testing methods were not open to us, since the solution relied on a linear programming technique.
This process is really just an amplification of the standard testing process.

  1. Define the test case, with the expected results
  2. Construct the test data
  3. Build or configure the application
  4. Run the Test using the Test Data and Evaluate the results – Pass or Fail

This is the standard process for testing small functions, where the expected results are clear and easy to imagine. However, in some systems with many interacting rules and conflicting priorities, it may not be simple to know what the expected results should be without the help of the tool’s structure to evaluate them. Such is the case with many of our applications, with layer upon layer of business rules and competing priorities… The very reason for using an LP-based approach makes testing more complex.
In the revised process, we have, for each new business requirement:

  1. Construct the test case with the test data
  2. Build or configure the application
  3. Set the expected results using the results of the first-pass build
  4. Re-factor the code and test until all tests are passing
Profit Point's Software Testing Process


In my next blog I will show you the simple Excel-based tools we use to facilitate the test evaluation.

In practice, the process works well, new versions of the application go into production without any surprises, and with full confidence of the application management team that all the business requirements are 100% met.

No doubt – no doubt a better process.

By Jim Piermarini

