In 1776, Adam Smith explained to posterity how specialization increases productivity, using the now famous example of a pin factory. While one master pin maker, going through all the steps of making pins by himself, could turn out anywhere between one and twenty pins each day, a specialized army of laborers, each fulfilling one step in the pin making process, could increase productivity more than two hundred fold and turn out almost 5,000 pins per person per day. This would have the triple benefit of enriching the factory owner, creating jobs and making pins both affordable and widely available to consumers. What happened to the master pin maker, who used to make a very nice living when pins were expensive and hard to come by? He would most likely be employed in the factory to supervise the smooth flow and quality of the new pin mass production system. He would make sure that each laborer worked at a speed appropriate for feeding the next laborer in line, and he would probably sample a few pins here and there to make sure they were as sharp and sturdy as the ones he used to make in the olden days. When the master pin maker passed away, a new supervisor would be hired, most likely one who had never made an entire pin, but who had a much better understanding of the production process. The profession of pin coordinator was born.
Although Adam Smith put forward the notion of specialized labor, Henry Ford is customarily credited with the invention of the modern assembly line. Interestingly, Ford attributed his invention to his observation of Chicago’s meat packing industry. It seems that while no two cows are identical, the butchering of animal life lends itself rather well to disassembly line methodology. Today, manufacturing assembly lines use human labor where it is cheap and abundant, and are staffed with robotic machinery where human labor is expensive or scarce. In all cases the process is orchestrated and controlled by sophisticated computer software. This is why we are all able to purchase a car, chat on our cell phones and enjoy perpetually fresh slices of white bread in plastic bags, amongst many other wonderful things that were once available only to the wealthy few.
Modern medical care is increasingly out of reach for most people. It is expensive, and adequate resources are scarce in many areas. Medical care also varies widely in quality, and its costs of production are anybody’s best guess, depending on geography, time of year and even workers’ vacation and education schedules. This is very much like making pins in the eighteenth century. In all fairness, some specialization of labor has already occurred in medicine, but there is no coherent method of placing each worker in his or her station along the continuum of care, and there is no standard process by which workers hand off work from station to station. According to experts, this lack of orderly processing, along with the absence of quality control, is creating a terrible waste of resources and a flurry of defects in the finished products. If the advanced methodologies of modern manufacturing are working so well for everything from cars to pins to cows, wouldn’t it make sense to at least try them in medicine?
Fortunately, we already have several pieces of the puzzle in the works. As mentioned above, we do have a certain degree of specialization in medical practice. We also have hospitals, which could function very much like factories, but as Clayton Christensen observes, most have no well-defined assembly lines. And then, of course, we still have the independent small shops that take piece-work home and operate without any standardized quality control. We also have the beginnings of computerized control systems in the form of Electronic Health Records (EHRs), which, according to John Halamka, are quickly moving from just bookkeeping software to dynamic coordination of processes, complete with encyclopedic knowledge of medicine and a good measure of artificial intelligence to devise and “enforce automated care plans”.
The only thing left to do is to lay out proper assembly lines, and we don’t really need to think outside the box too much, because manufacturing has solutions for this dilemma as well. In modern industry, practically no factory starts out with raw materials and ends up with a finished product. Instead, some factories concentrate on producing parts, while others are built to receive parts and assemble them into useful products. Exact specifications for each part, followed by production lines and relied upon by assembly lines, make this geographically dispersed process possible. In health care, the primary care homes will serve as production centers, where people are constantly measured, tracked, tested and evaluated, so that when they are finally shipped to a hospital for a procedure, the hospital knows immediately which assembly line to place them on, and the omniscient EHR will control the most minute detail of the process, from medication dosing to incision size to implantable device brand and model, thus reducing both errors and costs. Once the hospital’s work is done, patients are released back to evaluation and management in the production centers. This is where the cyclical nature of health care differs from a typical manufacturing process, and it is why it is extremely important that EHRs be interconnected, and preferably cloud based, to achieve a high degree of omnipresence.
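To make the loop concrete, here is a minimal sketch, in Python, of the routing step described above. It is purely illustrative: the record fields, procedure names and routing table are hypothetical stand-ins, not any real EHR’s data model or API.

```python
from dataclasses import dataclass, field

# Hypothetical, simplified model of the cycle above: a primary care
# "production center" maintains a patient's spec sheet, and a shared
# EHR controller routes the patient to the matching hospital
# "assembly line", then closes the loop on discharge.

@dataclass
class SpecSheet:
    patient_id: str
    procedure: str                       # e.g. "knee_replacement"
    measurements: dict = field(default_factory=dict)

# The EHR's routing table: procedure -> hospital assembly line.
ASSEMBLY_LINES = {
    "knee_replacement": "orthopedic_line_2",
    "cataract_surgery": "ophthalmic_line_1",
}

def route_to_hospital(spec: SpecSheet) -> str:
    """The 'line controller' step: pick the assembly line from the spec."""
    line = ASSEMBLY_LINES.get(spec.procedure)
    if line is None:
        raise ValueError(f"No standardized line for {spec.procedure}")
    return line

def discharge(spec: SpecSheet) -> SpecSheet:
    """Close the cycle: return the patient to the production center
    with an updated spec sheet for ongoing evaluation and management."""
    spec.measurements["post_op_followup"] = "scheduled"
    return spec

patient = SpecSheet("p-001", "knee_replacement", {"bmi": 27.4})
print(route_to_hospital(patient))   # -> orthopedic_line_2
```

The point of the sketch is the shared spec sheet: because both the production center and the hospital read and write the same record, the routing decision and the post-procedure handoff need no human translation in between.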
Yes, there are many more details to be worked out, like emergencies, accidents and the exact specifications an EHR should maintain for each type of person. We will have to establish quality feedback loops between hospitals and primary care centers to continuously refine processes for both types of entities, so the EHR will need to be able to adapt to, and learn from, new information, in a manner similar to IBM’s Watson software. Since people are not pins, or even cars, the tolerance levels (allowed deviation from specs) will be wide initially, so line workers will need to be highly skilled as well. In all likelihood, physicians will be working those lines for the foreseeable future. As the learning control system improves, portions of the work will be offloaded to less skilled resources and eventually to machines. More significantly, entire tasks could be packaged into deterministic protocols and pushed out from expensive hospitals to the less skilled primary care production centers, which in turn will push the most trivial tasks out to consumer owned devices.
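To illustrate the tolerance idea, here is a toy sketch of such a quality gate and delegation rule. All thresholds, names and the example readings are assumptions made up for illustration, not clinical guidance.

```python
# Each task has a target value and an allowed deviation (the "spec").
# Outcomes outside the band feed the quality loop; a task whose
# observed spread stays well inside the band becomes a candidate for
# delegation down the skill ladder (hospital -> primary care -> device).

from statistics import pstdev

def within_spec(observed: float, target: float, tolerance: float) -> bool:
    """Quality gate: is the outcome within the allowed deviation?"""
    return abs(observed - target) <= tolerance

def can_delegate(outcomes: list[float], target: float, tolerance: float) -> bool:
    """Feedback-loop rule of thumb: every outcome in spec, and the
    spread a small fraction of the tolerance, before pushing the task
    to less skilled resources. The 0.25 factor is an arbitrary choice."""
    in_spec = all(within_spec(o, target, tolerance) for o in outcomes)
    stable = pstdev(outcomes) < 0.25 * tolerance
    return in_spec and stable

# e.g. routine lab readings against a target of 2.5, wide initial band
readings = [2.4, 2.6, 2.5, 2.45, 2.55]
print(can_delegate(readings, target=2.5, tolerance=0.5))  # -> True
```

The wide initial tolerance is what keeps skilled workers on the line early on; the delegation test only starts passing once the process itself has become predictable.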
Obviously, EHRs will prove to be the heart, brain and circulatory system of the health care industry. As we speak, EHRs are increasingly being tasked with care coordination activities (not to be confused with continuity of care, or longitudinal care), which are the precursor to the industrial line controller. Folks wondering why they should use EHRs that are not ready for prime time should understand that we need an EHR in every practice so that the system can gain visibility into current processes and learn, adapt, grow and devise new methods of providing care. After all, you cannot control that which you cannot see.
If you think this is all far-fetched and disastrous, please find a senior citizen who lived through the Great Depression and ask her what she thinks about dinner being prepared months in advance in computer controlled industrial vats, thousands of miles away from home, pumped full of preservative chemicals, freeze dried, shrink wrapped and delivered by airplane to a football field sized department store, with minimal human intervention, ending up in a small irradiation chamber in your home before it hits your dining table (or couch). Yet we all buy the stuff and feed it to our kids with no apologies, because it is cheaper, faster and more convenient than tenderly preparing beef stroganoff and baking pot pie at home, after work, every day. And neither grandma nor you can even fathom the handcrafting of pins by master artisans. Is health care really that much different?