A brief history of Quality Management

The Medieval Guilds of Europe

Craftsmen began organizing into unions called guilds in Europe toward the end of the 13th century.  Medieval guilds were generally of two types: merchant guilds and craft guilds.  Craft guilds were occupational associations of artisans and craftsmen.  In the wool trade, for instance, there were guilds of weavers, dyers, and fullers, and there were also guilds of masons and architects, painters, metal-smiths, blacksmiths, bakers, butchers, soap-makers, and so forth.  Guilds set and maintained standards for the quality of goods and the integrity of trading practices in their industries, developing strict rules for product and service quality.  Inspection committees enforced the rules by marking flawless goods with a special mark or symbol.

Craftsmen often placed a secondary mark on the goods they produced.  Initially this mark was used to trace the origin of defective components, but over time it came to represent a craftsman’s good reputation, in effect a trademark.  Master craftsmen’s marks served as proof of quality for customers throughout medieval Europe. This approach to manufacturing quality remained dominant until the Industrial Revolution in the early 19th century.

The Industrial Revolution and Quality

Until the early 19th century, manufacturing in the industrialized world followed the craftsmanship model.  Product inspections began in Great Britain in the mid-1750s, helping fuel the Industrial Revolution of the early 1800s.


In the early 19th century, manufacturing in the United States tended to follow the craftsmanship model used in Europe. Since most craftsmen sold their goods locally, each had a strong incentive to meet customers’ needs for quality; a craftsman who failed to do so risked losing hard-won customers. Masters therefore maintained pre-sale inspections as a form of quality control.

The Factory System

During the Industrial Revolution, the factory system began to divide the craftsmen’s trades into specialized tasks.  Craftsmen became factory workers, and shop owners became production supervisors.  This led to a decline in employees’ sense of empowerment and autonomy. Quality in the factory system was maintained by audits and inspections; defective products were either reworked or scrapped, and workers were often fired if quality standards were not met.  Quality audits were disliked, as they sometimes are today.

The Taylor System

Late in the 19th century, the United States broke further from European tradition and adopted a new management approach developed by Frederick W. Taylor, whose goal was to increase productivity without increasing the number of higher-paid skilled craftsmen. He achieved this by assigning factory planning to specialized engineers and by using craftsmen and supervisors as inspectors and managers who executed the engineers’ plans.  As one might expect, this focus on productivity had a deleterious effect on quality, so factory managers created inspection departments to keep defective products from reaching customers.

Quality during World War II

After entering World War II, the federal government enacted regulations to shift the civilian economy to military production.  Quality and safety became critical components of the war effort: unsafe military equipment was unacceptable, and the U.S. armed forces inspected virtually every unit produced to ensure it was safe to operate. This required huge inspection forces and caused problems in recruiting and retaining inspection personnel.

To balance safety, quality, and productivity, the armed forces began using sampling inspection in place of unit-by-unit inspection. With help from Bell Laboratories, they adapted sampling tables and published them in a military standard known as MIL-STD-105. These tables were incorporated into military contracts so that suppliers understood clearly what they were expected to produce.
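The idea behind sampling inspection can be sketched with a simple single sampling plan: inspect n randomly drawn units and accept the whole lot only if the number of defectives found is at most an acceptance number c. The plan below (n = 50, c = 2) uses illustrative values, not figures taken from an actual MIL-STD-105 table, and approximates the draw with a binomial model, which assumes the lot is large.

```python
from math import comb

def accept_probability(lot_fraction_defective, n, c):
    """Probability a lot is accepted under a single sampling plan:
    inspect n units and accept if defectives found <= c.
    Uses a binomial approximation, valid for large lots."""
    p = lot_fraction_defective
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# Illustrative plan (n=50, c=2): a 1%-defective lot is almost always
# accepted, while a 10%-defective lot is usually rejected.
good_lot = accept_probability(0.01, n=50, c=2)
bad_lot = accept_probability(0.10, n=50, c=2)
```

Plotting this acceptance probability against the lot's defect rate gives the plan's operating characteristic curve, which is how such sampling plans are usually compared.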

Quality in the Early 20th Century

At the beginning of the 20th century, quality practices expanded to include processes, not just finished products. Walter Shewhart began focusing on controlling processes in the mid-1920s, making quality relevant not only to the finished product but also to the intermediate processes that produced it.

Shewhart recognized that industrial processes yield data that can be analyzed with statistical techniques, and he laid the foundation for the control charts still in use today.

W. Edwards Deming, a statistician with the U.S. Department of Agriculture and Census Bureau, became a proponent of Shewhart’s SQC methods and later became a leader of the quality movement in both Japan and the United States.

At first, Japan had a widely held reputation for poor-quality exports, and its goods were shunned by international markets. This led Japanese organizations to explore new ways of thinking about quality.  Joseph M. Juran predicted that the quality of Japanese goods would surpass product quality in the United States by the mid-1970s because of Japan’s revolutionary rate of quality improvement.

U.S. manufacturers, meanwhile, initially focused on production costs and import restrictions, which did nothing to improve quality.

Price competition declined while quality competition increased.  Finally, the chief executive officers of major U.S. corporations stepped forward to provide personal leadership in the quality movement. The U.S. response, emphasizing not only statistics but approaches that embraced the entire organization, became known as Total Quality Management (TQM).  At Ford Motor Company it even became an advertising campaign, with the slogan "Quality is Job 1."

In 1987, the International Organization for Standardization published the ISO 9000 series of quality-management standards.  American companies were slow to adopt the standards at first but eventually came around.