Knowledge Management and the Trouble with Assumptions

A few weeks back I bumped into a friend of mine who works as an application manager at a large food distribution centre. After the usual updates on what was new, she delved into a story about one of their legacy applications that had become a pain point. In essence, the problem was that the tool was not scalable, and in her telling it was this way because of change management. I admit that when I heard “legacy app and issues” my mind started to drift. But when she said it was the fault of the Change Manager, my ears perked up.

“I’ll bite, what did the Change Manager do?” I asked, though I was a bit skeptical.

The background was this…

It turns out that the original application was built in-house to allow the Warehouse Manager to report on performance metrics. Using this data, he could then schedule the next week’s forecasted staffing requirements. Sounds simple enough, right? The requirements changed over time as the organization grew. The Warehouse Manager would no longer shoulder the responsibilities of metrics and scheduling himself; they were split between two leads, one for each. We’ll call them Metrics Lead and Schedule Lead. The Warehouse Manager transferred his knowledge to the two leads and they were off to the races until…

The Operations team for the warehouse bought a nice new off-the-shelf inventory tool. The salesman, who worked the room like a Kennedy, sold them on the idea that at no extra cost they could feed their scheduling information into it, with something for reporting the metrics perhaps coming down the line as well. This sounded agreeable to the business and they made the purchase. Aside from a few minor hiccups, all business functionality seemed to work fairly smoothly at first, and scheduling was driven quite well by the output the new inventory application provided.

After several months, Metrics Lead approached Schedule Lead to see about finally leveraging the reporting functionality as originally outlined. They reached out to the vendor and an integration point was established. There was only one challenge: each day, someone had to manually dump information from the inventory files to the metrics files, covering both items shipped out and items returned to the warehouse because they had been shipped in error. An automated solution could have been built, but the associated professional services were deemed “not worth it” from the business perspective. Without this file dump, the piece count that each warehouse employee shipped out in a day would not be accurate.
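
To make that dependency concrete, here is a minimal sketch, in Python, of how a daily per-employee piece count might be derived from the two dumps; the file names and column layout are hypothetical, since nothing about the actual format was described.

```python
import csv
from collections import defaultdict


def daily_piece_count(shipped_csv: str, returns_csv: str) -> dict:
    """Net pieces handled per employee for one day.

    Assumes (hypothetically) that both files carry two columns:
    employee_id and pieces.
    """
    counts = defaultdict(int)

    # Items shipped out add to each employee's total.
    with open(shipped_csv, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["employee_id"]] += int(row["pieces"])

    # Items returned because they were shipped in error subtract from it.
    # If the returns file is never dumped, this loop never runs and every
    # employee's count is overstated by their mis-shipments.
    with open(returns_csv, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["employee_id"]] -= int(row["pieces"])

    return dict(counts)
```

The point of the sketch is simply that the metric depends on both files; automate one and skip the other, and the numbers look plausible while quietly drifting from reality.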

Later that same year, there was a downturn in the economy and the Warehouse Manager had to let go of middle management (Schedule Lead and Metrics Lead). While he didn’t want to lose them, he was assured that the suite of applications would help him cover the function (which he had once handled by himself) quite well. Unfortunately, nothing was physically documented (knowledge management) about how the applications worked together, and all he had retained from the “brain dump” his subordinates gave him was that these files should be moved each day.

The application was still pretty solid for a few more months. It was at this point that the VP of the company decided to launch a “pay for performance” initiative. “We have a tool that tells us how many cases each staff member selects in a day, and we can have an incentive program based on their performance,” the VP boasted. There was quite a buzz about it. However, the program was never formally discussed with any IT stakeholders.

By this point the company was quite large, and the application was starting to have some challenges with file capacity during the daily transfer. A change request was raised to automate the file transfers to the application at night. There was only one catch: the ‘returns’ file could not be automated due to other process constraints.

“…see, it’s the Change Manager’s fault…” she said. I was a bit puzzled.

“It was because of them that the process had to change, and then the pay for performance wasn’t accurate and the warehouse workers were not getting paid properly.”

“I see,” I said, although I didn’t.

The real trouble from my perspective isn’t that change management failed; rather, there was a lack of suitable information to make appropriate decisions at many levels. From the cheap seats, it looks like some of the following was missed:

  • The lack of knowledge captured about the application, how it works, and how it integrates with other components – knowledge management is important.
  • The fact that the business drove this decision on its own. Risks should have been discussed and signed off before proceeding. Including IT would have been a good start, but having better information on which to base decisions is key.
  • The absence of a risk mitigation plan outlining what to do should issues arise.
  • An understanding that this process was, at its core, a workaround (and therefore a risk).

The company had to terminate the pay for performance program. Despite its potential, the organization was simply not in a position to pay staff accurately based on the data it captured.

What did we learn?

There was no project to implement pay for performance, only a change request, and as a result many important requirements were missed:

  • Had the architecture, process and limitations of the tool been documented, they might have been in a better position to discuss what the tool was actually capable of.
  • While this change did not go as planned, the fault does not fall solely on the Change Manager; several stakeholders allowed it to proceed without being aware of the challenges associated with the tool.
  • This shortcoming should be documented; did I mention knowledge management?

The key thing to remember here is that better information allows us to make better decisions. We cannot operate on assumptions.