Long before the first neonatal intensive care units (NICUs) opened in the 1960s, physicians treated premature infants and debated their future. To be sure, the smallest such infants likely died within hours at home, victims of what we would recognize as respiratory distress syndrome. The fate of those born 1 to 2 months early, however, hung in the balance. In 19th-century American cities, where some 15% to 20% of all infants died before reaching their first birthday, premature babies were the most vulnerable of all. Many succumbed to a vicious cycle of hypothermia, poor feeding, and infection in the first weeks of life. Physicians rarely did anything to save them, resigning themselves to seeing this high mortality as a law of nature, not amenable to medical effort.
Evolution of the NICU
All of this began to change in 1880. In that year, the French obstetrician E.S. Tarnier introduced two simple but effective interventions on the wards of Paris's largest maternity hospital: the incubator and the gavage feeding tube. Combined with skilled nursing care and wet nurses, the introduction of these two measures was associated with a remarkable decline in mortality of premature newborns in the 1200- to 2000-g range from 66% to 38%. Tarnier's results ignited widespread excitement in France and around the Western world. Paris's health authorities installed incubators in other maternity hospitals, and a variety of newer and more complex models appeared across Europe and the United States. Most remarkably, so-called incubator baby shows became a staple of world fairs and midways at the dawn of the 20th century (Figure 1-1). Featuring live infants, these shows invited the public to view the new technology in a setting that evoked faith in progress and a dash of sideshow sensationalism.
Early Neonatal Ethics
Yet many in the United States questioned the wisdom of saving premature babies. This ambivalence had nothing to do with fears of crossing the limits of viability: as already noted, incubators were used for infants born following gestations greater than 32 weeks, not the “micropreemies” associated with NICUs today. In the early 1900s, however, prematurity was widely regarded as nature's way of expelling a defective fetus. The colloquial term for the premature infant was “weakling,” an ill-defined category that included both infants born prematurely and those who were small for gestational age (SGA) owing to presumed infectious or hereditary conditions. Particularly in the United States, infant mortality reformers preferred to focus on saving older infants from environmentally related diseases such as gastroenteritis rather than address the newborn period. Such reservations grew with the rise of the eugenics movement and attendant fears that the “unfit” were bearing children at a much higher rate than that of the educated middle classes. Reflecting such sentiments, few hospitals in the United States introduced effective incubator care.