G Corporation Story
In my early days as a Delivery Consultant for what was then the leader of the Quality Management Systems industry, I was called, along with several colleagues, to the rescue of one of our customers; let's call it GCorp for the purposes of this article.
Severe Quality Events had recurred across several of GCorp's sites, and as a result the company was issued a consent decree by the FDA and one of its sites was closed for a while. As you can imagine, all hell broke loose and the entire staff went all-hands-on-deck to remedy all findings and improve the quality of manufacturing to prevent this from ever happening again.
Now, the odd thing about this incident is that GCorp wasn't cutting any corners on managing quality before these issues came to light; they had a large internal team developing and implementing new processes in our QMS for years prior to the event, and they were spending significant dollars on outside help (from us) to keep up with the demands of their users and to keep the system in compliance. They had one of the most mature instances of our software and had already captured a significant amount of data.
So how is it possible that, after all this effort and money spent on managing quality, they experienced quality issues so severe that one of their manufacturing facilities was shut down for several months?
It took me many years to realize that the problem was in the mission statement of the system: Quality Management System. The name usually implies "the capturing of data and the tracking of the quality process", which is a nice way of saying that we track the process and capture the data so that we are "insured" against any visit by a regulatory body.
However, it is my belief that the true test of a QMS is whether it improves the quality of the product, not simply whether it can withstand an audit. The QMS should be the main working tool for the entire staff and should therefore strive to improve the quality of the product, not simply track records from start to finish. I believe this can be achieved by considering the following core capabilities.
GCorp's system was one of the most complex ones, with process configuration covering every possible scenario, no matter how rare, but with very little thought given to the user interface. Capturing data wasn't much fun due to the large number of data fields, most of which were irrelevant to the majority of the employees participating in the process, and you never really knew what the next step was because the process was just too long and complex. The system was simply designed with tracking processes in mind, not with being a true productivity tool for production workers and quality managers.
A QMS should be super simple and user friendly, so that users capture data because it assists them in their daily work, not because they have to. A production worker needs to see only the bare minimum of fields relevant to tracking a QE while he's on the production line, while the Quality Manager should see a different view, because he is populating other fields and working in a totally different physical environment.
In GCorp's system, despite, and perhaps because of, its advanced customisation, it was nearly impossible to understand who was responsible for each task and how all the tasks would come together towards a resolution.
Once an event is recorded, it’s all about addressing it ASAP to reduce the potential adverse product impact.
A good QMS must incorporate tools that enable various teams from various sites, and sometimes external suppliers, to collaborate and reach a speedy conclusion of the matter.
Decision Supporting Insights
Amazingly, GCorp's quality issues were already "coded into the metrics", but no one knew how to read them, and we now know that simple decision-supporting tools could have saved patients significant inconvenience and saved GCorp quite a lot of money. As their CIO put it: "I have years of data which is of absolutely no use to me if I can't trend on it".
Digging through their database after the fact, we found a clear trend of an increasing number of deviations and overdue deviations at the site that was closed, right up until the point it was shut down. This should be a simple trend to capture, but in those days it required some database knowledge and reporting capabilities, so these trends weren't looked at. "If only you had shown me this trend a year ago it could have saved us all a lot of grief," their CIO told me, and he was spot on; a good QMS should deliver such basic insights at the click of a button, without requiring technical capabilities; it should be a matter of drag and drop.
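To illustrate just how simple such trending can be, here is a minimal sketch in Python. The records, field layout, and threshold logic are all hypothetical, a stand-in for whatever the QMS actually stores, but the idea is exactly the trend we found after the fact: deviations per month creeping upward at one site.

```python
from collections import Counter
from datetime import date

# Hypothetical deviation records for one site: (site, opened_on, resolved_on or None)
deviations = [
    ("Site A", date(2023, 1, 10), date(2023, 2, 1)),
    ("Site A", date(2023, 2, 5), None),
    ("Site A", date(2023, 3, 3), None),
    ("Site A", date(2023, 3, 20), None),
    ("Site A", date(2023, 4, 2), None),
    ("Site A", date(2023, 4, 15), None),
]

# Count deviations opened per month.
per_month = Counter(opened.strftime("%Y-%m") for _, opened, _ in deviations)
months = sorted(per_month)
counts = [per_month[m] for m in months]

# Flag a sustained upward trend: no month drops below the previous one,
# and the latest month is strictly higher than the first.
rising = all(a <= b for a, b in zip(counts, counts[1:])) and counts[-1] > counts[0]
if rising:
    print("Warning: deviations trending upward:", list(zip(months, counts)))
```

A real system would of course surface this as a chart rather than a print statement, but the point stands: the raw data already contains the warning sign, and extracting it takes a handful of lines, not a database specialist.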
Moreover, in hindsight there were many quality issues and corresponding CAPAs that recurred in several sites, but it was impossible to draw the connection and identify the trend. Sites were implementing the same Corrective Actions without ever knowing that their neighbouring sites had already tried this course of action and failed. A good QMS should have identified these recurrences and alerted top management, as there might be a deeper underlying issue at play.
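The cross-site recurrence detection described above can be sketched just as simply. Again, the record shape and category names below are hypothetical; the technique is just grouping CAPAs by root cause and corrective action and alerting when the same pair shows up at multiple sites:

```python
from collections import defaultdict

# Hypothetical CAPA records: (site, root_cause_category, corrective_action)
capas = [
    ("Site A", "labeling error", "retrain operators"),
    ("Site B", "labeling error", "retrain operators"),
    ("Site C", "labeling error", "retrain operators"),
    ("Site B", "filter clog", "replace filter"),
]

# Group the sites that applied each (root cause, corrective action) pair.
sites_by_action = defaultdict(set)
for site, cause, action in capas:
    sites_by_action[(cause, action)].add(site)

# Any pair applied at two or more sites hints at a deeper underlying issue
# that top management should hear about.
recurrences = {k: v for k, v in sites_by_action.items() if len(v) >= 2}
for (cause, action), sites in recurrences.items():
    print(f"Recurring CAPA '{action}' for '{cause}' at {len(sites)} sites")
```

In this toy data, three sites independently retrained operators for the same labeling error; exactly the kind of repeated, failing fix that no individual site could see on its own.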
A modern QMS should be user friendly for daily users and employ user interfaces they know from the products in their daily lives (e.g. Facebook, Google).
Capturing data should be a breeze and should bring actual value to users' daily work, encouraging them to capture accurate and complete data.
QMSs should also encourage teams to collaborate and contribute to a speedy, accurate, and productive conclusion of the matter, even if they are halfway across the globe or outside the team or organisation.
And finally, QMSs should provide decision-supporting tools to identify hazardous trends in the organisation and flag them ASAP.
Today, simply tracking quality issues is not enough: a good QMS should IMPROVE the quality of your product and increase the efficiency of your team.
Author: Or Tzook, CRO @ Simploud.com