Giants on Clay Feet—COVID-19, infection control and public health laboratory networks in England, the USA and (West-)Germany (1945–2020)

Summary In early 2020, COVID-19 exposed differences in public health laboratory systems' testing abilities. Focusing on Germany, the USA and the UK between 1900 and 2020, this article argues that studying the distinct evolution of laboratory infrastructures is critical to understanding the history of infection control and the limits of template-based reforms in global health. While each analysed laboratory infrastructure was shaped by a unique national context, neoliberal visions of lean public services and declining resources led to significant reform pressure from the 1970s. The US Centers for Disease Control's model of epidemic intelligence provided an attractive template to integrate resources and focus planning on preparedness scenarios. It also helped justify cuts to local laboratory infrastructures. Effects were uneven: in the USA and the UK, improved integration failed to compensate for local laboratory cuts and loss of autonomy. By contrast, Germany's subsidiarity principle allowed for limited federal integration while leaving local services mostly intact.

discrepancies between countries with similar political systems and resource levels, journalists have highlighted funding cuts for Anglo-American laboratory systems, 3 flawed decision-making by the Johnson and Trump administrations, and errors by public health agencies. 4 Social scientists have focused on wider structural factors such as public health responses' lack of sensitivity towards entrenched socio-economic disparities, ineffective communication, and an over-reliance on outsourcing testing. 5 Meanwhile, historians have described the global COVID-19 response as a sign of the fragility of biomedical promises of microbial control and the dangers of over-relying on mathematical modelling and preparedness scenarios. 6 This article complements and goes beyond these analyses by using COVID-19 as an opportunity to survey the longer-term technological trends, epidemiological ideals and political trajectories shaping public health laboratory systems in (West-)Germany, the UK and the USA. Drawing on Susan Leigh Star and Karen Ruhleder's seminal work on invisible infrastructures that become visible in times of crisis, 7 it argues for a broader (re-)engagement with the mundane technical systems underpinning modern public health. In the case of the laboratory, historians of science and medicine have already devoted significant attention to the early history of public health microbiology and international standardisation. 8 Meanwhile, scholars of epidemiology and public health have highlighted how rising amounts of microbial data informed new approaches to modelling disease and vaccine politics. 9 However, few have focused on how these developments fed back into existing laboratory networks or how the later twentieth century gave rise to a new form of networked laboratory infrastructure capable of processing tens of thousands of samples and exchanging information almost in real time. 10

Adaptations and gaps within these new laboratory infrastructures have similarly failed to attract sustained historical analysis. Social scientists and historians of global health have rightly highlighted complications resulting from the simplistic export of templates of laboratory surveillance, epidemic intelligence and preparedness to low-income settings. 11 However, the fact that the very same high-income laboratory infrastructures that underpinned these developmental templates were themselves experiencing problems when it came to routine disease control has not been sufficiently examined. 12 As this article will show, paying close attention to the varying evolution of laboratory infrastructures is key to understanding the complex interplay between local and national disease surveillance as well as the ambivalent public health legacies of twentieth-century templates of epidemic intelligence and preparedness.
After providing an overview of the unsystematic origins of public health laboratories in all three countries, the article's first part traces the factors shaping the evolution of (West-)German, UK and US laboratory networks between ca. 1930 and 1970. During this period, laboratories in all three countries became integrated into increasingly sophisticated networks of microbial processing, storage and exchange. Each network evolved to meet the needs of distinct biosecurity concerns, political environments and national health systems, as well as in response to competition from commercial or hospital-based laboratory services. However, all shared an underlying ideal centred on integrating the national provision of specialised reference services with routine microbial testing and public health work at the local level.
Coordinating these laboratory networks and the growing amount of microbiological data produced by them required significant fiscal investment and new forms of information processing. During the 1950s, the US Centers for Disease Control (CDC) responded to the challenge of coordinating autonomous state laboratory networks by developing a new organisational template of epidemic intelligence and integrated disease reporting. Rather than producing high volumes of microbiological data, the focus was on pooling, sorting and transforming various streams of information on disease prevalence into actionable intelligence.
Part two of the article traces the domestic adoption and subsequent spread of the US epidemic intelligence template to other countries. Starting in the 1970s, authorities in the USA and the UK used the template to justify focused investment in high-tech laboratories for individual pathogens, automated sample processing, electronic reporting systems, centralised epidemiological coordination and rapid response capabilities. Reforms were, in part, driven by visions of real-time surveillance and a strategic reorientation of infection control towards hypothetical preparedness scenarios. However, in both countries, they were also a reaction to neoliberal ideals of lean public services, declining resources for routine communicable disease control, and cuts to public health laboratory services. The result was a growing imbalance of national capabilities, an outsourcing of laboratory services to the private sector and a relative loss of influence on the part of traditional public health microbiologists.
Focusing on the 1990s, part three reconstructs how their growing expertise in molecular typing, electronic reporting and epidemic intelligence allowed centres of excellence like the American CDC and the UK's Colindale campus to serve as international models for a cost-effective, sentinel-based laboratory approach to microbial disease surveillance. It simultaneously tracks how outside of these centres of excellence, streamlined reporting capabilities not only failed to compensate for, but also helped justify further cuts to local public health infrastructures and routine diagnostic services. As described in part four, the growing structural weaknesses of Anglo-American laboratory infrastructures were not fixed by brief funding surges in response to public health emergencies. They were also not picked up by international preparedness reviews that had been shaped around ideals of pandemic preparedness and epidemic intelligence and were dominated by experts from both countries.
In all of this, (West-)Germany provides an interesting contrast to the Anglo-American history of central integration and local erosion. Following the public health establishment's close involvement with Nazi racial hygiene, the post-war years saw competencies handed back from the federal centre to state and communal authorities. Despite Cold War competition over health indicators, public health remained a relatively neglected policy area, and the sub-par performance of West Germany's fragmented laboratory services served to highlight the advantages of more integrated systems. A major push to strengthen and integrate public health and laboratory systems occurred in the wake of reunification. Advised by US experts, Germany began to integrate communal and state-based laboratory networks and (re-)establish the Robert Koch Institute (RKI) as a coordinating hub of epidemic intelligence. Despite this push for centralisation, the subsidiarity principle (Subsidiaritätsprinzip) of Germany's federal constitution, which holds that decisions should be made at the most decentralised level, meant that states continued to have considerable influence and that new powers for Berlin came at the price of federal resources for local and state authorities and laboratories. This system not only helped retain local testing capabilities, but also allowed for a certain flexibility of responses should national centres of excellence be overwhelmed. The article concludes by reflecting on what insights the technical and political history of laboratory infrastructures and surveillance ideals may hold for post-COVID infection control.
Part One: From Patchwork to Control (ca. 1930–1970)

Public health laboratory infrastructures have always been shaped by a mix of politics and science. While the roots of modern microbiology date back to the mid-nineteenth century, it was only after 1900 that bacteriological investigations began to play an increasingly prominent role in public health. 13 The reasons for this are manifold: new technologies such as selective media, biochemical and serological typing and sero-diagnostics were enhancing bacteriologists' ability to survey microbial environments. Collaborative investigations with epidemiologists highlighted previously unknown modes of disease transmission and dynamics as well as new environmental, animal, and human reservoirs of disease. 14 Meanwhile, governments in North America, Europe and parts of Asia and South America were investing more resources in containing microbial threats to their civilian and military populations as well as colonial work forces. 15 Key to the resulting 'hunt' for microbial reservoirs of disease was the integration of laboratory-based bacteriological work with clinical and epidemiological resources. 16 While early efforts were characterised by ad hoc alliances between different academic and medical disciplines, 17 the First World War saw states combine isolation measures, vaccination, epidemiological surveillance, and bacteriological mass-testing to curb the spread of bacterial diseases like typhoid. Wartime bacteriologists, however, proved powerless when it came to explaining, let alone containing, the viral Spanish Flu pandemic (H1N1) or phenomena such as the variation of virulence among isolates from the same bacterial species. Meanwhile, intrusive wartime surveillance systems proved unpopular amongst civilians. 18 Although many wartime surveillance systems were dismantled after 1918, they continued to influence the thinking of interwar planners.
In all three analysed countries, growing state involvement in medical and social care, rising clinical demand for testing, and fears of bacteriological warfare led to a renewed emphasis on expanding and integrating public health laboratories during the 1930s. As will be shown, emerging laboratory networks were shaped both by unique national circumstances and the shared ideal of integrating the local provision of microbiological services with specialised national reference centres, which could identify rare pathogens, store and provide microbial reference strains, establish and enforce standardised laboratory protocols, and survey disease prevalence trends. 19 In the UK, public health laboratories had begun to arise unsystematically at the local level from the late nineteenth century. Although calls for a cohesive national pathology service dated back to 1913, Britain's Ministry of Health (est. 1919) and Medical Research Council (MRC) (est. 1913) only slowly began to consolidate preventive and research activities during the interwar period. 20 A limited amount of central testing and epidemiological coordination of outbreak responses was conducted by London's Lister Institute and a three-man Ministry of Health laboratory on Endell Street, which was 'depressing in [its] appearance and poorly equipped'. 21 Beyond this, most investigations continued to be carried out in a decentralised fashion by local medical officers of health (MoHs), a few dedicated communal public health laboratories, and commercial, hospital and university laboratories, which did not always report findings to the responsible MoH. 22 Outbreak investigations largely followed nineteenth-century traditions of 'shoe-leather' case tracking. 23 In 1939, an incomplete survey counted at least 32 county or municipal labs across England and Wales. 24 Biosecurity fears triggered a gradual move towards more systematised laboratory coverage.
Concerns about the weaponisation of bacteriological agents had existed since the emergence of pure culture techniques around 1880. However, they were heightened during the 1920s as a result of German agents' wartime attempts to infect Allied livestock and post-war bioterrorism allegations against Irish Republicans. 25 Fears that Britain's fragmented laboratory system would not be able to respond to new forms of bacteriological sabotage and warfare were voiced by the Committee of Imperial Defence in 1934 and led to the establishment of a Subcommittee on Bacteriological Warfare in November 1936. In the event of war, it was decided, a centralised emergency laboratory network would be established spanning urban and rural areas across England and Wales (Scotland, Northern Ireland, and London had distinct arrangements). The Emergency Public Health Laboratory Service (EPHLS) would protect troops and civilians from epidemics and bacteriological sabotage. A 'sister' organisation at Porton Down would develop offensive and defensive capabilities. 26 The EPHLS was mobilised as part of the newly centralised Emergency Medical Services one week before the declaration of war on 3 September 1939. An integrated network of 19 laboratories was set up across England and Wales, with provincial laboratories reporting back to a central laboratory headquartered away from enemy aircraft in Oxford (Figure 1). Despite complaints from commercial laboratories, the ability of the EPHLS to offer free epidemiological, microbiological and other laboratory services soon convinced MoHs and health professionals to collaborate with the new organisation. The collaboration held mutual benefits. At the local level, disease control was boosted by free EPHLS laboratories and experts.
Standardised bacteriological data and processing (rising from 929 to 247,214 specimens per month between October 1939 and December 1941) helped EPHLS researchers survey national microbial prevalence and uncover interlinked outbreaks that would have remained invisible to local investigators. 27 In the absence of bacteriological attacks, EPHLS research also developed new laboratory technologies: national phage-typing schemes were rolled out for typhoid, paratyphoid, Salmonella enterica serovar Typhimurium and Staphylococcus aureus; 'Moore swabs' were developed to isolate and trace bacteria in sewer and water systems; centralised 'fingerprint registries' were established; and taxonomies were refined. 28 Its track record secured the EPHLS a post-war future. Although Britain's National Health Service (NHS) (est. 1948) divorced public health from medical care and maintained its own microbiology services, the renamed Public Health Laboratory Service (PHLS, est. 1946) managed to maintain a large-scale laboratory network. Many PHLS laboratories' location in hospitals and close ties to MoHs allowed for rapid and flexible responses to local problems. Between 1947 and 1953, the PHLS expanded from 28 to 50 laboratories (Figure 1). Headquartered in Colindale, it also began to play a prestigious international role through managing international reference laboratories for Salmonella and Staphylococci. 29 In 1960, the PHLS Act transferred responsibility for the PHLS from the MRC to the Ministry of Health, thereby establishing the PHLS as a specific body within Britain's health protection structure. 30 In marked contrast to parallel developments in the US, epidemiology was not a dominant discipline within the early PHLS. Despite having conducted interwar research in experimental epidemiology, 31 the initial co-directors of the service (Graham Selby Wilson, William MacDonald Scott, and William Whiteman Carlton Topley) had focused it on providing basic and specialist laboratory services to MoHs.
Resulting epidemiological outbreak investigations were usually headed by MoHs with PHLS specialists advising. The focus on microbiological service provision was mirrored in the wider organisational structure of the PHLS, which centred on relatively autonomous local laboratories and a select number of powerful reference laboratories in London. The service's main monthly bulletin remained dominated by articles on laboratory techniques and protocols, the identification of new bacterial types, and occasional model investigations of individual outbreaks.
A dedicated PHLS Epidemiological Research Laboratory was eventually established under the directorship of W. Charles Cockburn in 1946. Similar to the American CDC, Cockburn's office began to gather national disease data, first on salmonellosis and then on other diseases, and provided expertise on the efficacy of pertussis vaccine roll-outs and clinical trials involving experimental polio and rubella treatments and prevention. A joint weekly bulletin of microbiological and epidemiological information began to appear from the late 1950s onwards. 32 Overall, however, epidemiology continued to play a secondary role to public health microbiology. Parallel developments in the USA would transform this relatively passive mode of national epidemiological analysis into a new template of active epidemic intelligence centring on integrated data gathering, centralised analysis, and rapid outbreak response capabilities. The new template would not only shift the relative influence of public health microbiologists and epidemiologists in favour of the latter group, but also prove a successful international export.
As in the UK, US public health laboratories had emerged in an unsystematic fashion from the late nineteenth century onwards. In 1887, a National Hygienic Laboratory (predecessor of the National Institutes of Health) was established at the Marine Hospital on Staten Island. By 1907, all US states had boards of health, which were supplemented by local and county health departments. 33 In 1914, 47 of 48 states reported on the activities of their dedicated public health laboratories. National organisations like the State Laboratory Directors Conference (est. 1927, now the Association of Public Health Laboratories (APHL)) and the Association of State and Territorial Health Officials (est. 1942) were established to coordinate activities, and large-scale Rockefeller Foundation funding for disease control further boosted laboratory capacity. 34 However, in a noted difference to England and Wales, responsibility for infection control was not transferred to the federal level but continued to rest with individual states. This meant that federal officials focused efforts on strengthening inter-state surveillance and assisting state-level programmes with earmarked funds and expert services. 35 During the interwar period, the Public Health Service (PHS) and other federal authorities stepped in to stop human, animal and plant diseases from crossing state borders. The PHS also supported states with the control of venereal disease and malaria and collected data on 29 notifiable diseases. 36 Legislative reforms increased ties between federal and state authorities. The 1935 Social Security Act provided permanent machinery for the distribution of federal funds via grants-in-aid to state and larger local health departments, and the 1938 National Venereal Disease Control Act released significant funds for state-based interventions. 37 However, ultimate authority over laboratories and funding continued to rest with individual states.
Despite improved interwar coordination, state-based responsibility meant that there remained considerable variation in the quality, number and speed of laboratory services. Free official public health laboratory services were unevenly distributed with poorer and rural areas enjoying less coverage than wealthier and metropolitan areas. Responding to these gaps, a growing number of commercial mail order and hospital-based pathology laboratories offered for-profit tests for various diseases. Quality concerns led to attempts by professional societies to regulate commercial services via approved laboratory lists and training curricula. 38 Quality assurance problems were not limited to the private sector. In so-called centralised states, state health authorities operated local health departments as branches, which made laboratory quality control and data collection easier. However, in decentralised states, local health departments were financed by local governments and retained more authority and responsibility for the delivery of public health services, which complicated oversight. 39 Resulting variations in data provision and quality made it hard to make robust claims about national or state-level disease incidence.
Problems were compounded by blurred state-level responsibilities. In 1940, a PHS survey came to the 'somewhat startling' realisation that 'within a single State as many as 18 separate agencies contribute something to the health activities' and that 'no jurisdiction' had less than '6 agencies involved'. 40 Communicable disease responsibility was mostly located within state health departments, but additional responsibilities could be located in as many as eight other agencies. Blurred responsibilities negatively impacted notification practices and laboratory quality control. 41 Laboratory work was also heavily skewed towards a limited number of prominent pathogens. As a result of the 1938 Venereal Disease Law and marital examination laws, syphilis testing occupied two-thirds of US states' entire diagnostic laboratory work. However, only 16 states regularly assessed commercial Wassermann tests and only nine controlled diagnostics for other diseases. 42 Funding differences exacerbated performance variations. In 1940, US states spent ca. $285 million on public health (ca. $1.90 per capita). However, Tennessee only spent $0.76 per inhabitant whereas New York spent $3.27 and Hawaii $5.03. 43 For US infectious disease control to improve, there was a need to boost coordination between federal and state authorities, expand laboratory services and standardise laboratory testing and reporting protocols. Similar to the UK, the Second World War provided an important stimulus. Domestically, disease control was an important part of wartime efficiency drives and justified investment in public health and laboratory systems. Externally, the global nature of the war and concerns about Japanese biowarfare justified creating centralised epidemiological coordination and medical intelligence capabilities. 44 These 1940s developments carried over into the post-war period.
With spending increasing at the local, state and federal levels, 86 per cent of US citizens were served by a local health department in 1950 and over 34,000 persons were employed full-time in various public health agencies. 45 The 1944 Public Health Service Act also strengthened central oversight by specifying a federal coordinating role over US public health. 46 From 1951, PHS officials and the Council of State and Territorial Epidemiologists began to meet annually to decide on a list of diseases to be reported by states. Notified data was soon compiled and published on a weekly basis in what would become the Morbidity and Mortality Weekly Report. 47 The biggest innovation to come out of the 1940s reordering of US public health was a national centre of epidemiological and bacteriological expertise. Originating in wartime malaria and typhus control programmes, the CDC was established as the Communicable Disease Center in 1946. As an initially obscure federal department outside Washington DC, the young agency had to convince Congress, the PHS, and individual states to finance its activities and transfer powers. 48 Another challenge lay in expanding its remit from vector-borne to other diseases. Boosted by biological warfare concerns and expanding vaccine schedules, 49 the CDC solved both challenges by providing epidemiological advice, bacteriological reference services and rapid local assistance during outbreaks. Between 1950 and 1960, the young agency acquired responsibility for federal disease responses (1950), established a national network of assigned typing laboratories (1951), acquired responsibility for venereal disease (1957) and tuberculosis control (1960), and oversaw programmes against polio (1955), influenza (1957) and nosocomial infections (1957). 50

The 1969 Clinical Laboratories Improvement Act further expanded CDC power over domestic microbiology by giving it the authority to license clinical laboratories soliciting or accepting specimens in interstate commerce. Internationally, the CDC turned into an increasingly prominent arm of US soft-power diplomacy and was designated International Influenza Center for the Americas, International Shigella Centre, and WHO serological reference centre for syphilis. 51 Although some have described it as a 'dumping ground' 52 for declining public health programmes, the rapid expansion of the CDC was underpinned both by bureaucratic opportunism and an increasingly concrete ideal of centralised epidemic intelligence. Because it never acquired control over state or local laboratory networks, the CDC focused on supporting states and gathering and analysing resulting disease data. This focus on supporting rather than controlling baseline public health services was reflected in its motto: 'We exist to serve the states.' 53 It also led to a distinct distribution of power between the agency's four branches (epidemiology, laboratory, training and technology). In Savannah, the CDC's technology branch acquired significant resources through its involvement with US malaria eradication programmes and by developing equipment for mass vaccination campaigns. In Atlanta, the training branch provided courses and educational material while the bacteriology branch gained international renown through the work of microbiologists like Philip R. Edwards. However, it was in the epidemiology branch that the biggest influence and power within the CDC rested. 54

Headed by famed epidemiologist Alexander Langmuir, the CDC epidemiology branch built on wartime models of 'medical intelligence' by conceiving of infectious disease surveillance as a form of systematised intelligence gathering and analysis. In contrast to the traditional focus of public health surveillance on tracing the transmission of pathogens between individuals, Langmuir focused on surveying larger (human and microbial) populations. 55 According to Langmuir, surveillance was '. . . the continued watchfulness over the distribution and trends of incidence through the systematic collection, consolidation and evaluation of morbidity and mortality reports and other relevant data. Intrinsic in the concept is the regular dissemination of the basic data and interpretations to all who have contributed and to all others who need to know. The concept, however, does not encompass direct responsibility for control activities. These traditionally have been and still remain with the state and local health authorities.' 56 Although Langmuir was careful not to infringe on state competencies, this did not mean that he thought of surveillance as a passive activity. At his behest, a dedicated CDC Epidemic Intelligence Service (EIS) was formed in 1951. With deliberate overtones of military intelligence activities, the EIS was tasked with coordinating responses to disease outbreaks (and biological attacks) that exceeded local and state capabilities. A cadre of EIS officers could rapidly deploy while Atlanta provided centralised coordination. The EIS's logo summed up this combination of 'boots on the ground' field epidemiology, centralised analysis and increasingly international ambition by depicting a globe straddled by a shoe sole. 57 Langmuir's vision of integrated surveillance and epidemic intelligence would play a profound role in shaping the organisational evolution of the CDC and wider international thinking about infection control. 59

Whether the existing organisation and scale of post-war public health laboratory infrastructures were the most cost-effective way of achieving this control became debatable.
In contrast to Britain's centralised laboratory infrastructure and the US model of statebased systems and service-like federal coordination, West Germany's public health laboratory infrastructure was deliberately decentralised. During the Kaiserreich, German bacteriology had achieved international renown but remained decentralised outside of Prussia and the military. By 1911, there were over 50 official bacteriological institutes across the German states. 60 Centralisation efforts occurred during the First World War and the interwar era. However, it was the fusion of public health and racial hygiene during the Nazi era that triggered the most dramatic expansion of state investment in and centralisation of public health activities. Within a year of the 1934 law for the unification of the health system (Gesetz zur Vereinheitlichung des Gesundheitswesens), there were 740 official public health departments (Gesundheitsä mter) across Germany. Local and state-level departments and hygiene institutes frequently employed bacteriologists to screen populations for venereal disease, tuberculosis, and other perceived threats to public and racial health. There were also close connections between leading bacteriologists at the Reichsgesundheitsamt, the military and the RKI at the national level. 61 A further intensification of laboratory and clinical screening occurred following the passage of the regulation to fight transmissible diseases on 1 December 1938-although wartime pressures soon constrained available resources. 62 Their enthusiastic support of Nazi racial hygiene meant that large parts of the German public health and bacteriological establishment fell into disrepute after 1945. In West Germany, prime responsibility for public health laboratories and infection control was decentralised and firmly re-embedded in the Federal Republic's 10 states (Bundeslä nder) and powerful municipalities (Kommunen). 
63 In contrast to communist East Germany's centralised health and public health systems, West Germany's federal government limited itself to acting as a political 'manager' enacting compromises between different parts of a complex health system but developed little epidemiological or bacteriological expertise of its own. The wider West-German health system consisted of ca. 500 local and communal health offices (Gesundheitsä mter)-often with their own public health laboratories-10 state health departments (Landesgesundheitsä mter), over 1,300 statutory health insurers covering about 90 per cent of the population, and powerful practitioner and hospital associations. 64 Federal authority for public health rested with the Federal Office of Health (Bundesgesundheitsamt), 65 the Federal Ministry of Health (Bundesgesundheitsministerium, est. 1961), and the Conference of Health Ministers of the Federal States (Gesundheitsministerkonferenz). Below the federal level, some states like Bavaria had integrated public health laboratory services while others relied on communal arrangements for laboratory provision with loose coordination via state health departments. 66 Fragmented responsibilities and the divorce of state-funded public health from insurance-funded curative medicine led to a long-term stagnation of the former. 67 By the 1970s, relative oversupplies of hospital beds and doctors contrasted with withering investment in West German public health and laboratory capacity. 68 Despite strong inter-German rivalry over health indicators and expanding vaccination schedules, West Germany did not systematically boost or integrate laboratory networks and epidemiological monitoring. 69 According to a 1985 WHO-commissioned review, there was 'virtually no planning explicitly directed towards improving the health of the population, improving the effectiveness and quality of health care, or intensifying protection against illness'. 
70 Although the number of public health positions per 1,000 inhabitants remained stable on paper, low salaries and prestige led to recruitment problems and a significant number of vacancies. 71 While some reviewers noted that West German decentralisation created a certain 'flexibility', 72 the lack of integration and the relative insularity of post-war West German epidemiology and bacteriology negatively impacted disease control. 73 There was limited coordination at the federal and state level via stakeholder conferences and circulars. 74 Local public health laboratories could request specialist assistance from universities, the RKI (part of the Federal Health Office from 1952 to 1994), or Hamburg's Bernhard Nocht Institute for Tropical Medicine. However, health sector rivalries and a lack of federal interest meant that infectious disease control as an integrated national endeavour remained neglected. The Federal Health Ministry only began publishing internationally comparable disease statistics in 1963, there was no regular and comprehensive assessment of health statistics, and no West German institution could rival the PHLS or CDC as an international authority for laboratory standards, public health training, research and epidemic intelligence. 75

Part Two: Neglect

The ongoing post-war decline of infectious disease led to a relative decline of attention to traditional pathogen control and surveillance in all three analysed countries. Although Cold War competition and international health programmes strengthened the influence of the PHLS and CDC over laboratory systems in other parts of the world, 76 public health managers had to contend with a shift of resources away from laboratory-based pathogen surveillance towards chronic and non-communicable diseases. 77 This shift of attention towards non-communicable diseases coincided with a wider economic reordering of public health.
In line with contemporary governments' attempts to control rising expenditure, managerial cost-benefit thinking and neoliberal ideals of 'lean' government led to a restructuring of public health laboratory infrastructures. New technologies like automated sample processing promised to make laboratory systems more efficient whilst simultaneously allowing for personnel cuts. Meanwhile, the CDC's model of epidemic intelligence provided an attractive template to justify the pruning of existing laboratory services. In contrast to 1930s visions of public health laboratory infrastructures with a strong local presence and relative autonomy regarding sampling remits, planners increasingly focused investment on epidemiological hubs and integrated specialist laboratory networks for the surveillance of a select number of important pathogens. The goal was to transform public health surveillance from a mostly 'archival function' to a streamlined system 'in which there is timely analysis of the data with an appropriate response'. In the UK, the 1974 NHS reforms transformed public health. Since 1939, close local collaborations between relatively autonomous PHLS labs, MoHs and clinicians had been a hallmark of British public health. 79 However, in 1974, the once powerful office of the MoH was disbanded. With Britain's welfare state and costs expanding, the 1974 reforms aimed to improve the management, quality, and integration of social and health services by transferring the responsibilities of local authorities for personal health services outside hospitals to new NHS regional and area health authorities. 80 MoHs' importance had already declined as a result of GPs' expanding role in maternal and child health and the removal of social care from their remit. Their remaining duties were now distributed among other local authority and NHS services. 81 The move fragmented and undermined community-level public health. 
82 Local authorities retained environmental and some public health services, GPs remained responsible for maternal and child health, and community physicians with ill-defined preventive health duties were appointed to NHS regional, district, and area authorities. 83 According to later estimates, the 1974 reorganisation also led to a significant loss of expertise by prompting the early retirement or dispersal of ca. 400 highly skilled public health doctors. 84 The severance of traditionally close ties between local public health laboratories and MoHs and growing tensions between public health and clinical microbiologists also impacted the PHLS. Since 1946, the PHLS laboratory network had continued to expand before reaching its maximum extent in 1969. A large part of this growth was achieved by establishing dual-purpose laboratories that were embedded in NHS hospitals and provided both public health and clinical microbiology services. By 1969, 42 of 63 PHLS constituent laboratories were providing general bacteriological services to the hospitals they were based in, 13 were supplying some kind of microbiological service to a hospital, and only 8 were solely conducting public health research. 85 The advantages to the PHLS were obvious: clinical microbiology generated revenue and data on infection levels, enabled close working relations with local clinicians, and facilitated outbreak responses. However, existing arrangements also made NHS clinical pathologists accuse the PHLS of 'poaching' their work. 86 Around 1966, it was agreed that the PHLS should only provide services to local hospitals if invited to do so. 87 Three years later, a review of PHLS activity recommended against further merging clinical and public health microbiology. Public Health Laboratory Service core tasks were now defined as 'to make a continuous study of how microbial diseases are spread, and what advice should be given for their control'. 88
63 laboratories over the next 10 years (see Figure 1). Coupled with the 1974 loss of access to MoHs, this was a first blow to the 1939 vision of public health laboratory services that were closely integrated with local epidemiological and clinical expertise.
Although the PHLS secured the reassignment of most of its laboratory staff, the 1970s fiscal crisis and growing government emphasis on non-communicable diseases created competition for resources and pressure for cost-effective microbiology services. 89 In response, the PHLS invested in the computerisation and automation of routine microbiological testing. 90 PHLS authorities also tried to compensate for the loss of MoHs by developing new ways of integrating microbiological and epidemiological services. The PHLS' fragmented response to the 1973 smallpox release at the London School of Hygiene and Tropical Medicine and concerns about imports of viral haemorrhagic fevers further highlighted the need to strengthen in-house epidemiological expertise and improve outbreak responses. 91 A resulting review called for a national centre of epidemiological expertise. 92 The obvious template for reform seemed to be the CDC's model of epidemic intelligence. In 1977, Britain emulated the US agency by founding the PHLS Communicable Disease Surveillance Centre (CDSC) under the leadership of Nicol Spence Galbraith. Recruited in 1976, Galbraith had previously worked as an MoH and area medical officer in the London area and had long argued for a centrally financed and coordinated national epidemiological service along the lines of the CDC. 93 Initially, it seemed that Galbraith's CDSC would have a similar status to the PHLS Epidemiological Research Department, with only one additional staff member and a seconded senior medical officer working in a repurposed portakabin. However, following a 1977 visit to the CDC in Atlanta, Galbraith embarked on a programme of coordinated expansion. In 1978, the CDSC achieved national prominence by quickly reacting to the Birmingham smallpox outbreak. 94 The episode allowed the CDSC to prove its value as a 'highly active information and co-ordinating centre'.
95 In addition to receiving its own building (1980), the CDSC began publishing a CDC-style weekly Communicable Disease Report (CDR) on national prevalence levels. It also acquired competencies for the national surveillance of immunisation programmes (1985), began offering teaching and training in communicable disease epidemiology (1988), and used the resulting 'cadre' 96 of officers to conduct EIS-style field investigations at the behest of local authorities. However, an expansion into the area of environmental health and non-communicable diseases beyond the traditional microbiological focus of the PHLS was resisted. 97 The rise of the CDSC as the PHLS' epidemiological 'nerve centre' 98 and a growing focus on US-style epidemic intelligence coincided with a relative decline in the perceived importance of the local public health laboratories that had previously formed the backbone of the PHLS. Despite increases in the volume and integration of microbiological sampling, a growing number of voices began to question the cost-effectiveness of existing public health laboratory services: the ad hoc nature of 1970s PHLS lab closures had increased discrepancies between a better served south and a more poorly served north; fiscal constraints were forcing the PHLS to press NHS authorities for more compensation for microbiological services; and the new Thatcher administration included the PHLS in its programme of cost-cutting reviews. 99 In 1985, a Department of Health and Social Security (DHSS) review recommended a radical transfer of the PHLS' 52 area and regional public health laboratories (Figure 2) to NHS health authorities, a move that would have turned Colindale into a CDC-like centre of epidemic intelligence rather than a national provider of public health microbiology. 100 The proposal prompted fierce criticism from British health professionals.
Writing in the British Medical Journal in 1985, former PHLS director James Howie warned against demoting the PHLS to a consulting epidemiological service and breaking up a laboratory network that connected London to the rest of the country. Contrasting Britain and the USA, Howie was adamant that the 'peripheral' contacts provided by its extensive laboratory infrastructure made the PHLS more resilient and attuned to local problems than other systems. Planners should strengthen both epidemiological and microbiological services rather than demoting one in favour of the other: So the PHLS is at a crossroads. Starved of funds, it can maintain its national laboratory service only at the expense of curtailing its epidemiological activities -or conversely maintain or increase its epidemiological activities only at the expense of the laboratory service. (. . .). What seems plain is that [NHS] district health authorities have neither the incentive nor the powers to get going measures of action and investigation covering regions wider than their own. (. . .). [The PHLS] is admired and envied for its effective peripheral contacts by its counterpart in the United States (. . .). Those who would break the essential PHLS links between centre and periphery -who are they? Do they really know what they are doing? 101 The DHSS review's call for strengthened central epidemic intelligence prompted a merger of the PHLS Epidemiological Research Service and CDSC (1985). However, medical criticism, rising numbers of disease outbreaks, and the contemporary HIV/AIDS crisis made the Thatcher government decide to maintain the wider PHLS laboratory network. 
102 Howie's warnings nonetheless proved prescient: by the 1990s, budget pressures, politicised responses to major health crises and hopes that electronic integration and epidemic intelligence would be able to compensate for slimmed down local microbiology services began to pose a serious threat to Britain's public health laboratory infrastructure. Developments in the USA highlighted the dangers of neglecting investments in routine local public health laboratory services in favour of a limited number of national and regional hubs of excellence. Internationally, the CDC remained a poster child of microbiological and epidemiological expertise due to its rapid outbreak responses, research on classic and new pathogens (e.g. cholera and haemorrhagic fevers like Lassa, Marburg and Ebola), and prominent role in the wider global smallpox eradication effort. 104 As described by Etienne Gosselin, its attractive mix of providing assistance to states (both domestically and abroad) via technology, epidemic intelligence and field epidemiology served as an 'interface through which the CDC penetrated international health programs and institutions'. 105 However, outside of Atlanta, US politicians' commitment to routine infectious disease control was waning. In the fiscal year 1973/1974, there were 12,295 clinical and public health laboratories in the USA. Around 3 per cent (392) of these were official state and local public health laboratories, which performed ca. 70 million laboratory tests per year (2 per cent of total laboratory examinations). 106 Per capita expenditure and the number of analysed public health specimens increased throughout the 1960s and 1970s. 107 This increase was driven less by classic work on communicable disease 108 than by toxicological, microbiological and chemical testing related to growing concerns about chronic disease and environmental health. 109 Rising per capita expenditure also belied increasing understaffing in public health laboratories. 
After peaking at 6,598 in 1975, the number of budgeted positions in territorial public health laboratories declined to 5,603 (−15 per cent) in 1982. 110 Recruitment problems meant that many budgeted positions remained unfilled, with a vacancy rate of 11 per cent reported in 1979. 111 Whereas free public health testing had traditionally been viewed as an essential service, fiscal pressure, competition from commercial laboratories and the retreat of universities from unlucrative public health work made an increasing number of public health laboratories charge fees. Additional problems resulting from lack of coordination, underinvestment in laboratory improvement and 'red tape' prompted some commentators to call for an abandonment of all free routine testing and a concentration of public resources on specialist laboratories and microbial reference services. 112 For many decision-makers, the broad local and state public health laboratory services that had helped lay the foundations for the rise of US epidemic intelligence were no longer essential. Initially, it seemed as though the CDC in Atlanta might escape some of these domestic pressures. Promoted from the rank of PHS Division to Bureau in 1968, the agency profited from rising health spending by the Johnson and Nixon administrations and acquired new competencies in areas unrelated to communicable disease including nutrition, occupational safety, smoking and environmental health. 113 The new areas of involvement were a reaction to the contemporary epidemiological transition and declining political attention for infectious diseases. They also diverted agency resources away from traditional public health microbiology towards the epidemiological assessment of risk factors associated with lifestyles, diets and environmental factors. To reflect its shift away from communicable disease, the agency was renamed Center for Disease Control in 1970. 114 The CDC's expansion into new areas was not without challenges.
After moving from success to success during the first decades of its existence, the CDC experienced a series of setbacks and financial pressures in the 1970s. Starting in 1974, successive US administrations responded to surging inflation and economic stagnation (stagflation) by imposing limits and cuts on federal expenditure. In 1976, the CDC suffered additional damage due to its advocacy of mass vaccination against swine flu (H1N1), which resulted in the sacking of its long-standing director David J. Sencer and a major reorganisation. In line with the Surgeon General's 1979 call for a second public health revolution centring on preventive health, the CDC was renamed the Centers for Disease Control and Prevention and focused more of its increasingly constrained resources on lifestyle and environmental issues. 115 Following Ronald Reagan's election, Sencer's successor William Foege was forced to administer severe cuts to federal public health funding. 116 Once again, microbiological expertise did not fare well. Despite the parallel emergence of HIV/AIDS in the USA, the CDC announced in 1982 that it was considering laying off between 350 and 780 of its 3,700 employees, with about half of the lay-offs planned in the area of infectious disease. 117 Between 1981 and 1983, the CDC's budgeted staff-year (or full-time equivalent workload) ceiling sank from 4,249 to 4,045 years (−4.8 per cent). Further cuts between 1985 and 1988 reduced the ceiling for original departments to 3,904 staff years (−8.12 per cent). 118 These cuts further concentrated agency resources on cost-effective epidemic intelligence and strategic response capabilities for high-priority risks.
While the USA and UK thus saw fiscal pressures, declining interest in infectious disease, and an increasing focus on centralised epidemic intelligence erode resources for local and national public health laboratory services, West German arrangements highlighted the dangers of insufficient epidemiological integration and local fragmentation. Throughout the 1960s and 1970s, post-war arrangements had remained relatively static: local Gesundheitsämter were responsible for most laboratory testing and public health enforcement; state offices compiled data, aided outbreak responses and administered health programmes; federal activity mostly focused on the political coordination of state programmes. Epidemiological oversight was limited. According to a 1976 survey of venereal disease control, public health workers lacked power in relation to insurance-financed clinicians. Many clinicians did not adhere to updated 1970 disease reporting requirements and had monopolised lucrative treatments. The small number of local Gesundheitsämter offering consultative and laboratory services referred patients to local practitioners. Systematised laboratory testing was limited, disease incidence was underreported, and data was insufficient to allow for effective contact tracing. 119 Public health's low status and the absence of comprehensive clinical and microbiological data made it difficult to apply new concepts of epidemic intelligence, recruit talented workers or convince politicians of the value of public health investments. 120 Post-war resistance of the West German states to federal centralisation gradually diminished from the 1970s onwards. However, attempts to update disease reporting, improve integration and expand coverage with the 1972 'guidelines for federal state law on public health' and the 1977 'Concerted action in the public health sector' failed to resolve these problems.
121 There were limited updates of state health service provisions, and dental and preventive services were rolled out to wider population segments. 122 However, key responsibilities such as disease screening were assigned not to the Gesundheitsämter but to ambulatory care physicians, who were reimbursed by sickness funds. This further diluted officials' capacity to expand, collate and act on laboratory testing data. It also eroded the wider public health workforce. Between 1970 and 1996, the number of physicians working in West German public health services decreased from 4,900 to 3,300, the number of dentists from 2,500 to 800, and the number of social workers from 4,000 to 2,500. 123 Without epidemiological coordination and the integration of state and communal networks, West German public health laboratory infrastructures remained characterised by fragmentation, localism and poor morale.
A lack of political interest and planning towards 'intensifying protection against illness' 124 was also reflected in poor scientific guidance coming from the federal centre. In 1975, the Bundesgesundheitsamt recruited 50 experts to develop recommendations on nosocomial infections. However, in contrast to contemporary Anglo-American programmes, the resulting guidelines on architecture and technical procedures did not cite systematic reviews or clinical studies. 125 Contemporary verdicts on the West German system were harsh. A 1991 assessment found services to be both 'highly decentralised and highly formalised': public health services were 'weak' and 'something of a poor relation'. 126 Immunisation rates were comparatively low and, despite the significant gaps characterising disease and laboratory reporting, infectious disease mortality (excluding HIV/AIDS) was more than 40 per cent higher than in socialist East Germany. 127 By the end of the 1980s, post-war laboratory infrastructures had thus evolved in very different directions. In both the USA and the UK, budgetary pressures, a redirection of attention away from infectious disease, and visions of integrated epidemic intelligence had strengthened central epidemiological oversight but weakened local public health laboratory services. By contrast, a lack of epidemiological oversight and integration of local and state laboratory infrastructures was hampering infection control in West Germany.

Part Three: Clay Feet (ca. 1990-2000)
Budgetary and political pressure on post-war laboratory infrastructures accelerated during the 1990s. These pressures occurred amidst growing scientific warnings about public health systems' ability to deal with surging rates of antimicrobial resistance (AMR), nosocomial and food-borne infections, and the accelerating HIV/AIDS pandemic. In the USA and the UK, health officials and experts tried to mobilise resources by framing infectious disease hazards as national biosecurity threats. Building on the now firmly established international template of epidemic intelligence, advocates claimed that identified threats could be tackled with improved mathematical forecasting, networked sentinel laboratories and molecular 'real-time' surveillance, and centralised intelligence and response capabilities. The resulting preparedness plans did not sustainably strengthen either domestic laboratory infrastructures or microbiological capacity in the foreign locations they also targeted. Instead, they concentrated resources on high-priority pathogens and individual hubs of excellence. Focusing on microbiological surveillance in Africa and South Asia, anthropologists, microbiologists, and historians have highlighted how ring-fenced, time-limited, and targeted surveillance programmes hollowed out laboratory capacity for other pathogens. 128 The same was true in the countries pushing this agenda abroad. In the UK and the USA, investment in preparedness capabilities coincided with accelerating cuts to and the privatisation of local public health laboratory services. Germany proved an exception. Following a major scandal, the US templates of epidemic intelligence and preparedness were imported but adapted to the political realities of German federalism, whereby strengthening federal powers came at the price of investment in state and communal services.
For the UK's PHLS, the period around 1990 proved particularly challenging. During the 1980s, HIV/AIDS and fatal outbreaks of salmonellosis at a mental health unit at Wakefield's Stanley Royd Hospital (1984) and of legionella at Stafford General Hospital (1985) had triggered a wider review of public health services. Headed by Chief Medical Officer (CMO) Donald Acheson, the review's influential 1988 report stressed the value of epidemiological oversight and stated that the 1974 NHS reorganisation had fragmented public health at the community level. MoHs had once provided a public health anchor point for local authorities, general practitioners, PHLS laboratories and hospitals. However, after this office was disbanded in 1974, ill-defined responsibilities had led to a neglect of local preventive health and communicable disease control. 129 This neglect had been exacerbated by 1982/1983 managerial reforms, which abolished NHS area authorities and resulted in the retirement of 20 per cent of community physicians-many of them former MoHs-recruitment problems, and low morale. 130 In the case of infectious disease control, coordination between local PHLS laboratories, NHS services and local authorities was compromised by the absence of overarching epidemiological oversight. It was 'extremely important (. . .) that someone within the health authority is responsible for linking the vital work undertaken by microbiologists and control of infection teams within hospitals with cases of infection occurring outside'. 131 Effective infection control at the local level was also hampered by national funding and data integration problems within the PHLS: Evidence submitted to us demonstrates almost universal support for the PHLS and its epidemiological 'nerve centre' the CDSC. (. . .). We are concerned to learn that if there were a recurrence of serious outbreaks (. . .) 
in more than one part of the country at the same time, or if a single outbreak spread to more than one major centre of population, the current system would be unable to cope. 132 To strengthen communicable disease control, the 1988 Acheson report recommended re-embedding epidemiological expertise at the local level by creating the post of a medically qualified District Control of Infection Officer (eventually called Consultant for Communicable Disease Control (CCDC)), who would provide advice for local public health and clinical services. A second new post, the Director of Public Health (DPH), would work with the CCDC, coordinate preventive health at the district level and submit public health annual reports. 133 CCDCs would also be part of an advisory District Control of Infection Committee and would receive district and regional NHS support for contact tracing and public health education. 134 At the national level, the PHLS should be strengthened by making it mandatory to report outbreaks to its epidemiological service, the CDSC. In line with its focus on improving data integration, the report also called for a new national electronic disease notification system connecting local laboratories and clinics with the CDSC in Colindale. 135 While the Acheson report revitalised community public health, 136 government promises to create full NHS consultant posts for infection control trainees fell through and calls for a sustained strengthening of wider PHLS services went unheard. 137 Instead of funding both the 'nerve centre' and its microbiological sensors, the 1990s saw the PHLS forced to absorb budget cuts to its laboratory services and prove its cost-effectiveness with value for money reviews and new NHS customer arrangements. 
138 PHLS workers also faced the challenge of updating phenotypic surveillance methods with new genotypic molecular technologies like pulsed-field gel electrophoresis (PFGE) and of replacing paper-based with electronic reporting systems without compromising patient confidentiality. 139 In 1994, the PHLS launched a concerted effort to facilitate real-time surveillance by establishing a National Surveillance Group. The group was commissioned to review protocols for the surveillance of specific infections and launched an Epidemiology Information Technology Strategy to upgrade IT systems. 140 New EU initiatives on transnational AMR monitoring and prospects of rapid near-patient testing triggered the creation of additional surveillance systems to integrate laboratory and clinical information on infectious disease prevalence and microbial characteristics. 141 On the occasion of its fiftieth anniversary in 1996, the PHLS thus remained in direct command of a network of 49 microbiology laboratories organised into nine regional groups with devolved budgets and could also draw on information supplied by NHS laboratories. 142 However, ongoing calls for streamlined and, more importantly, cost-effective microbiology services made the future of this formidable infrastructure seem increasingly uncertain. Critical commentators warned that no amount of centralised epidemiological surveillance and electronic integration would be able to fully substitute for effective local laboratory services. Reflecting on the PHLS' contributions, the Lancet warned: 'At a time when the British government seems keen to be penny wise and pound foolish, the message (. . .) to those who would further pare down an already rationalised PHLS is leave well alone.' 143 Existing public health laboratory infrastructures faced even graver pressures in the USA.
The 1980s cuts to federal and state public health budgets and attempts to commercialise microbiology services had resulted in a marked decline of public health laboratory capacity. 144 146 Public health spending also decreased at the local level. 147 Reduced federal, state and local spending was accompanied by an increase in infectious disease incidence. Between 1980 and 1992, US death rates from infectious diseases rose by over 50 per cent, and infectious disease once again became the third most common cause of death. 148 Alarmed, the Institute of Medicine (IoM) published a damning analysis in 1988: a 'disarray of public health' 149 had resulted from underfunding, reactive policymaking, varying state capabilities and fragmented services. All but one state conducted laboratory analyses for communicable diseases, and all screened their populations for over 30 health problems. 150 However, the diseases being surveyed and the reporting methods varied. 151 To improve public health, the IoM recommended creating regularly reviewed state public health strategies. The US states were 'the central force in public health', 152 but federal and state authorities had to do more to guarantee essential services like laboratory access to citizens, expand and integrate data collection, and coordinate existing assets. Congress and the CDC should use federal funds to balance fiscal disparities between states. 153 Co-chaired by Nobel Laureate Joshua Lederberg, a second IoM review issued an even sterner call for action in 1992. Titled Emerging Infections: Microbial Threats to Health in the United States, the review warned that the USA had no 'comprehensive national system for detecting outbreaks of infectious disease (except for food- and waterborne diseases)' and that there was a 'lack of coordination among the various US government agencies' and 'between government agencies and private organisations'.
154 To prepare for future 'infectious disease emergencies', 155 federal authorities should create national stockpiles and push for a 'comprehensive global infectious disease surveillance system'. 156 To compensate for inconsistencies and growing surveillance gaps at the local and state level, reviewers called on federal authorities to boost epidemic intelligence and response capabilities via improved epidemiological training at state level and integrated electronic data systems. The review also recommended that the PHS establish a 'comprehensive computerised infectious disease database' 157 to consolidate information from domestic and international programmes.
Originating in a 1989 conference on Emerging Viruses, the 1992 IoM report is now seen as a landmark publication of a new age of biosecurity politics, preparedness scenarios and hopes for 'real-time' surveillance. 158 It also signalled a shift in the disciplinary expertise guiding decision-making. In preparing response scenarios and corresponding budgets, planners were increasingly drawing on molecular data from a limited number of sentinel laboratories and mathematical predictions generated by disease modellers. 159 The latter group had used mathematical models to aid the design of vaccination schedules and public health planning since the 1920s. 160 However, the decades after 1990 saw a significant boost of modellers' influence on the overall design of public health strategies as a result of revised epidemiological models, microbiology's shift towards automated sero- and genotyping methods and the digitisation of reporting systems. In an age of budget cuts and securitised health thinking, modellers' ability to transform the significant increase of available clinical and laboratory data into actionable risk scenarios was politically valuable. In conjunction with already prevalent cost-benefit assessments, necessarily hypothetical calculations of worst-case scenarios allowed decision-makers to justify the concentration of spending on a select number of threats. 161 In the case of US laboratory infrastructures, 'smart' investment in a select number of specialist hubs and threats legitimised cuts to the more expensive traditional system of local public health laboratories with broad non-specialist testing remits.
Hopes that new electronic networks, databases, and molecular typing technologies would be able to maintain effective disease surveillance despite growing gaps in baseline microbiological coverage underestimated how difficult this transition would be for many smaller laboratories. During the 1980s, pilots like the CDC's 1984 Epidemiologic Surveillance Project or the WHO's 1988 WHONet for AMR had highlighted the potential of information technology to standardise and accelerate clinical and laboratory reporting. 162 However, the adoption of IT systems below the state level proved slow. 163 By the time of the 1992 IoM report, all states were reporting electronic disease data to the CDC but several state laboratories still relied on paper forms. Even greater disparities in adopting electronic reporting occurred at the international level. 164 In the absence of effective advocacy, the defunding of the US public health laboratory infrastructures was accelerated by the polarisation of health politics following the Clinton Administration's 1993 attempt to establish universal health coverage. 165 According to a 1992 survey, 'only skeletal staff exists in many state and local health departments to conduct surveillance for most infectious disease'. 167 Out of 23 surveyed state laboratories, 22 reported hiring freezes or loss of positions. 168 While planners focused on preparedness scenarios and Hollywood movies like The Hot Zone stoked public concerns about deadly 'foreign' pathogens like Ebola, 169 the domestic public health laboratory infrastructure for the routine control of endemic disease was creaking. In the case of foodborne diseases, 12 states no longer had any dedicated personnel. In the case of HIV/AIDS, tuberculosis and childhood vaccine-preventable diseases, state data had become so unreliable that the CDC had to spend an additional $20 million per year for HIV/AIDS surveillance alone.
Federal surveillance for multidrug-resistant tuberculosis was discontinued in 1986 before being reactivated in 1993 as a result of a significant incidence spike. With the exception of a few states like Washington, public health funding gaps were now the norm. 170 In 1991, 78 towns with a combined population of 882,430 in Connecticut, the wealthiest US state, had no local access to full public health services, which often included a laboratory. 171 According to CDC estimates, US state and PHS spending on public health amounted to ca. $14.4 billion in 1993 (ca. 1.6 per cent of total health care expenditure). 172 Two years later, this figure was corrected downwards to about 1 per cent of total health care expenditure. 173 Privatisation was another challenge. In contrast to the UK, where commercial operators had been displaced by the NHS and PHLS, 174 private sector testing had remained important in post-war America. Commercial laboratories provided testing for physicians, hospitals and local authorities. 175 The CDC and APHL had tried to assure the quality of commercial testing and disease reporting with mixed results. 176 Meanwhile, the growing political emphasis on fiscal restraint and management reforms created pressure on public health departments to outsource or commercialise further parts of their own laboratory services. 177 In 1992, nearly half of surveyed state laboratory directors indicated that some form of privatisation of laboratory activities was under discussion. 178 The commercial playing field was uneven. The ability of private laboratories to offer bulk testing for a select number of profitable diseases allowed them to undercut public laboratories, which also had to recoup the costs of preserving a broad skill reservoir for advanced testing of uncommon pathogens, surge capacity and other public health services.
Many state laboratories already charged user fees, offered for-profit assurance schemes for other laboratories, and accepted Medicare fees and third-party reimbursement for some services. 180 However, payment systems proved difficult to establish and not all laboratories were allowed to keep generated revenues. In some cases, laboratory privatisation had to be reversed after it threatened core public health services, as in the case of commercialised tuberculosis and sexually transmitted disease testing in New Jersey or of vendor instability in Pennsylvania. A 2000 review predicted that uncoordinated laboratory commercialisation and weakened surveillance could have dire public health consequences. 181 The CDC continued to try to compensate for eroding local and state capabilities by improving the efficacy of existing laboratory networks and by investing in new molecular typing technologies and IT systems. In 1996, it established a network of electronically linked sentinel laboratories that could detect spikes of disease and employ PFGE molecular typing for Escherichia coli and non-typhoidal Salmonella (PulseNet). 182 Other federal programmes included the 1988 Public Health Laboratory Information System (PHLIS), the 1991 National Electronic Telecommunications System for Surveillance for the communication of weekly state reports on notifiable disease, the 1996 Tuberculosis Genotyping Network, and the 1996 National Antimicrobial Resistance Monitoring System (NARMS). Integration efforts were coordinated by the Health Information Surveillance System Board (est. 1995). In some states, the introduction of electronic systems reduced disease reporting times from 35 days to 24 hours, but adoption was sporadic and systems were often not interoperable. 183 The USA also explored possibilities of strengthening international and hemispheric disease surveillance. 184 Germany once again diverged from Anglo-American trends.
In the 1980s, politicians had also tried to contain rising health expenditure, streamline processes and introduce market elements in the form of 'manacled competition'. 185 Meanwhile, federal and state politicians faced growing pressure to upgrade public health arrangements in view of WHO priority health goals, the population-health oriented 1986 Ottawa Charter, and German reunification in 1990. 186 However, Germany's federal constitution and the highly decentralised nature of its public health system prevented both aggressive budget cuts and ambitious political shake-ups. In line with the country's subsidiary principle (Subsidiaritätsprinzip), which stipulated that regulation by higher authorities should only occur where problems exceeded the abilities of lower-ranking authorities, the result was a cautious series of public health reforms that simultaneously aimed to strengthen epidemic intelligence at the federal level, state-based epidemiological and laboratory competencies, and interstate collaboration. Although many components of East Germany's health system were decentralised following reunification, parts of its high-performing microbiology infrastructure like the Institute for Experimental Epidemiology in Wernigerode were incorporated into the RKI. In Berlin, the status of health politics was upgraded with the separation of the Federal Ministry of Health (Bundesgesundheitsministerium) from the Federal Ministry of Youth, Family, Women and Health in 1991. Below the federal level, an increasing number of German states responded to national and international pressure by adopting statutory principles for public health and rationalising service provision, including laboratory services, at the communal level. 187 As a result of interstate collaboration, long-standing recruitment problems were partially eased by no longer making public health an exclusively medical speciality and introducing postgraduate university courses.
More ambitious reforms occurred after an international scandal shook the German infectious disease establishment. In 1987, West Germany had launched an Immediate Action Programme to Combat HIV/AIDS, which created ca. 700 specialist positions in public health offices and laboratories throughout the country. However, HIV/AIDS protection and testing infrastructures remained patchy. In 1993, the German public was shocked to learn that around 2,000 haemophiliacs and 600 transfusion recipients had contracted HIV due to contaminated blood and blood products. 189 Although concerns about contaminated transfusions dated back to 1983, the Bundesgesundheitsamt and state officials had failed to halt the distribution of untested HIV-contaminated products by a company called UB Plasma until October 1993. 190 The HIV scandal and concerns about multidrug-resistant and emerging diseases prompted the largest shake-up of German infectious disease control since 1945. In 1994, the 118-year-old Bundesgesundheitsamt was dissolved. Its combined powers for drug regulation and communicable and non-communicable diseases were distributed to the Federal Institute for Drugs and Medical Devices (Bundesinstitut für Arzneimittel und Medizinprodukte), the Federal Office of Consumer Protection and Food Safety (Bundesamt für Verbraucherschutz und Lebensmittelsicherheit) and the RKI. The latter institution now also incorporated Germany's AIDS Centre and Institute of Social Medicine and Epidemiology and thereby became Germany's primary hub for communicable disease control. 191 In response to calls for improved pathogen surveillance, the RKI established the first federal network of reference and consultant microbiology laboratories in 1995.
Comprising 59 laboratories by 2015, the network spanned universities, federal and state laboratories, and private institutes. In keeping with German decentralisation, laboratories remained independent but received financial support and auditing from the RKI. In return, laboratories assumed responsibility for developing diagnostics, typing pathogens, maintaining reference strains, engaging in training and quality assurance for other laboratories, collaborating with international partners and detecting outbreaks. Integration of laboratory and clinical data streams was achieved by closely following the international template of epidemic intelligence and seconding a senior CDC epidemiologist from Atlanta to Berlin to advise German authorities. Starting in 1996, the RKI used the new stream of disease data to issue a weekly epidemiological bulletin, formed a Committee for Infectious Disease Epidemiology, updated nosocomial infection guidelines, and established an EIS-modelled field epidemiology training course. 193 In 1998, it published the first German Health Survey. 194 In the same year, new EU guidelines also triggered inter-party drafting of an ambitious reordering of Germany's infectious disease legislation and plans for integrated AMR monitoring (see Part Four of this article). 195 After four decades of relative neglect, the combined effects of reunification, scandals, and EU integration made decision-makers in Berlin firmly commit to strengthening and integrating Germany's decentralised hinterland of ca. 360 state and communal Gesundheitsämter. 196 However, in contrast to the USA and the UK, boosting epidemic intelligence did not come at the cost of cuts to local public health laboratory services. Bound by the subsidiary principle, reformers resorted to the traditional strategy of overcoming resistance to centralisation by simultaneously investing in state-based systems.
With public health and prevention expenditure making up a stable percentage of rising total health expenditure, the result was more money for the entire public health system. Large-scale reforms occurred in each analysed country. Injections of public money, public health reforms and new genomic technologies dramatically boosted authorities' ability to quickly identify and react to emerging outbreaks. On both sides of the Atlantic, planners continued to hope that targeted high-tech surveillance in specialised laboratory hubs and integrated epidemic intelligence would be able to stop outbreaks in their tracks. What would happen if this line of defence failed was the subject of numerous outbreak movies like Contagion (2011), but such scenarios failed to refocus decision-making on the robustness of underlying public health laboratory infrastructures.
In Britain, the decade between 2003 and 2013 saw the biggest shake-ups of infectious disease surveillance since the creation of the EPHLS in 1939. The first shake-up was the result of the Blair Administration's attempt to reform British health, social care and consumer protection whilst also devolving powers to the national governments of Wales, Northern Ireland and Scotland. In the wake of the 1996 mad cow disease (BSE) crisis, a new Food Standards Agency (est. 2000) was placed in charge of monitoring the food chain. One year later, the Department for Environment, Food and Rural Affairs assumed responsibility for agriculture and environmental health, and the creation of 303 NHS Primary Care Trusts (PCTs) led to a significant restructuring of community-level public health. Taking over from the 95 now dissolved district health authorities, PCTs were tasked with providing and commissioning primary, community and secondary health services. Preserving integrated infection control abilities did not rank high on reformers' agenda. Instead, the creation of PCTs fragmented the post-1988 realignment of local and district-level epidemiology, microbiology, clinical, and local authority services as well as the existing public health workforce of CCDCs and DPHs. 198 The PHLS did not escape reformism either. In 2003, the now 64-year-old organisation was disbanded and merged into a new Health Protection Agency (HPA). The HPA was the brainchild of CMO Liam Donaldson and his 2002 Getting Ahead of the Curve report. 199 Referring to the 2001 US anthrax attacks, Donaldson's report mirrored the preparedness language of the 1992 IoM report by painting a dark picture of infectious disease, bioterrorism, and chemical and nuclear threats. It also stated that a lack of integrated intelligence and response capabilities undermined national security.
In the case of infectious disease, the report acknowledged the successful PHLS response to the limited number of British SARS cases in 2003 but called for an integration of all aspects of health protection (infectious disease, chemical and radiation hazards) under one organisational roof. This would entail major changes to Britain's public health laboratory infrastructure. 200 In a radical move, Getting Ahead of the Curve recommended ending the traditional separation of local public health and clinical microbiology and concentrating public health resources on the centralised provision of specialist services. Control over all local PHLS laboratories should be transferred to the NHS, thereby essentially implementing the 1985 DHSS recommendations and creating a CDC-like agency with one centre and a few regional hubs. Rather than controlling public health microbiology at the local level, the HPA would act as a coordinating centre of epidemic intelligence and provide microbiological reference, consulting and rapid response services to the NHS, British government and local authorities. It was envisioned that HPA experts would closely coordinate with NHS Pathology Services in England, which would now provide all local microbiology services and report relevant public health data to the HPA. Should it be required, additional NHS testing could be commissioned. Reformers hoped that transferring the distinct network of PHLS laboratories to the NHS would rationalise local microbiology services and improve diagnostic, assurance, and reporting standards via centralised NHS oversight. They also hoped that NHS PCTs, Strategic Health Authorities and laboratories would not only support the HPA during outbreaks but also devote sufficient resources towards maintaining routine public health microbiology outside of emergencies. 201 These expectations proved difficult to meet.
The HPA was formally established on 1 April 2003 with four bases at Colindale, Porton Down, 202 Chilton, and South Mimms. It comprised a health protection services division, which would integrate a laboratory and a new national epidemiology service, a microbiology services division, a centre for radiation, chemical and environmental hazards, and a division for biological standards and control. 203 The HPA's own laboratory network comprised only 9 laboratories (of which two were located in London) while the former PHLS' 38 local laboratories were transferred to the NHS (Figure 2). To facilitate this transfer without compromising local public health laboratory services, the government guaranteed a stable budget until April 2005, after which money should come from the NHS or direct HPA commissioning. 204 Similar to the CDC's EIS, specialist Health Protection Units (HPUs) were supposed to act as the 'eyes and ears' of the HPA, liaise with local and NHS authorities, respond to health protection incidents and keep 'a finger on the health protection pulse locally'. 205 The creation of the HPA was greeted with mixed feelings by PHLS staff. 206 Giving evidence to the House of Lords, officials repeated Howie's 1985 warnings about trading off local laboratory coverage against centralised epidemic intelligence and lacking NHS incentives to adequately fund public health microbiology: There is real and general concern about the capacity to deliver enhanced surveillance or to mount an acute response. (...) There appear to be no plans as yet for providing any material incentive for NHS laboratories to rise to the public health challenge, and a fear that in local tussles about funding, money will be diverted from public health purposes. 207 Similar to problems within the CDC, it was also unclear whether HPA management would neglect infectious disease in favour of chemical, radioactive or chronic disease threats.
Reflecting on the PHLS-led response to the 2003 SARS epidemic, 209 virologist Philip Mortimer warned that the UK had been lucky because of the nature of the pathogen and England's integrated public health laboratory infrastructure. Mortimer noted that over-relying on mathematical disease modelling and epidemic intelligence in the absence of sufficient local laboratory capacity, contact tracers and isolation beds could prove costly during future pandemics. What was needed was broad infrastructural investment: ... it should not be assumed that a recurrence of SARS is therefore unlikely, or that a further outbreak would be controllable. (...) If there are weaknesses or deficiencies it should not be thought that they can or should be repaired by quick fixes each time an acute threat materialises. Such expenditures fail to build the infrastructure needed to maintain a comprehensive capacity for rapid and technologically appropriate response to new pathogens, and over time they distort facilities and so hinder the effective management of the laboratory. 210 Mortimer's warnings proved prescient. Later remembered as a 'painful birth', 211 the first years of the HPA were characterised by a slow merging of once independent institutions. In contrast to Donaldson's vision of a 'customer-focussed' integrated public health service, 212 relations within the HPA were described as 'territorial', which 'led to a lot of wasted energy on positioning small politics and defending them'. 213 Coordinating HPA and NHS services was equally challenging. In the case of infectious disease control, recruitment problems and NHS underinvestment in microbiology services meant that building effective collaborations between regional HPA hubs and local laboratories and authorities proved difficult.
Within a year of the end of ring-fenced funding, a 2006 review of NHS Pathology Services in England warned that it was not obvious what kind of accountability NHS Trusts had for financing local public health microbiology services in the medium to long term. 215 Stripped of its network of local laboratories, Britain's flagship public health hub in Colindale was at risk of becoming a giant on clay feet.
The effects of losing local public health laboratory services were compounded by confused official responsibilities for infection control. PCTs were charged with protecting community public health but relied on professionals in the HPA's headquarters and HPUs for response management. Local authorities were responsible for protecting the public from infectious disease and appointing a CCDC. However, CCDCs were no longer responsible to the NHS DPH but were employed by the HPA as part of its HPUs. Meanwhile, the HPA was dependent on PCTs, NHS England Pathology Services and local authorities for routine clinical and microbiological data generation and for mounting outbreak responses. 216 At the national level, the HPA's post-2005 status as a non-departmental public body also risked creating parallel hierarchies with the NHS and Department of Health during public health emergencies. 217 The 2008 financial crash exacerbated these problems with severe funding cuts for the HPA. Between 2009 and 2013, HPA staffing decreased from 4,100 to 3,700 (9.7 per cent). Meanwhile, the agency's core budget declined by over 26 per cent from £193 million (2009/2010) to £142 million (2012/2013). In response, the HPA became increasingly dependent on extramural grants and private funding for special programmes and research. Similar to the USA (see below) and parallel developments in international health, 218 relying on rigid earmarked rather than stable budgets could distort management towards targeting 'profitable' funder priorities while exacerbating funding shortages for routine public health services. 219 In 2010, the new Conservative government launched a policy of fiscal austerity. Despite successful HPA responses to the 2006 polonium poisoning and 2009 H1N1 pandemic, 220 a further economic slimming and centralisation of public health was decided in 2011. In 2013, the HPA and its eight regional laboratories were integrated into Public Health England (PHE).
The goal was to create a mammoth public health service, which would merge over 5,000 staff from 129 organisations. In contrast to the relative political autonomy of PHLS and HPA decision-makers, PHE was an executive agency under direct control of the Department of Health and would be managed by a chief executive officer with no formal training in medicine or public health. 221 Although PHE was supposed to be free to speak on issues relating to the nation's health, 222 the extent of this freedom and the ability to act autonomously in the name of public health were unclear.
The creation of PHE was part of the second shake-up of British health care within a decade. The 2012 Health and Social Care Act abolished New Labour's PCTs and stripped the NHS of its responsibility for local public health services, which were returned to local authorities. According to the new doctrine of democratic localism, 223 upper tier (county councils) and unitary (metropolitan) local authorities in England would establish multiparty Health and Wellbeing Boards (HWBs), draw on expertise from lower district levels (e.g. in environmental health), and appoint a DPH for the combined delivery of public health, health, and social services at the local level. Select primary care services such as screening, immunisation, and public health services for children and pregnancy remained part of the NHS and were coordinated by Clinical Commissioning Groups and NHS England. 224 Similar to the HPA, PHE would act as an epidemic intelligence centre providing specialist reference, global health, and cost-effectiveness services while supporting local HWBs and DPHs with health protection teams. PHE's coordinating role was further boosted by the 2010 Health Protection (Notification) Regulations, which updated disease and reporting requirements and gave local authorities more flexible powers to deal with emergencies. 225 (Re-)implementing effective localism was undermined by ongoing budget cuts. To maintain outbreak preparedness and routine public health services, PHE relied on support from the NHS and local authorities. This included sufficient funding for local public health laboratory services and reporting by NHS Pathology Services as well as the ability and willingness of local authorities and NHS managers to adhere to formal public health outcome frameworks. 226 Shrinking NHS and council budgets made this collaboration challenging. Ministers had initially promised to ring-fence the public health budget for local authorities until 2016.
However, an in-year cut of £200 million in 2015 was followed by a further 9.6 per cent cut over the next 5 years. This amounted to a real-terms reduction of public health spending from £3.5 billion in 2015-16 to just over £3 billion in 2020-21 (a reduction from ca. 4.1 per cent to 2.5 per cent of total health expenditure). 227 PHE itself had to absorb a 30 per cent funding cut between 2013 and 2017. Severe local funding shortages were exacerbated by 'top slicing', whereby councils reallocated ring-fenced public health budgets to other local services impacting health and wellbeing. 228 Meanwhile, retirements and recruitment problems left 17 per cent of DPH posts vacant in 2017. 229 Ahead of the biggest pandemic event since 1918, the once formidable public health laboratory infrastructure in England was smaller than at any point since 1939. Despite access to cutting-edge molecular typing, world-renowned experts, and integrated electronic reporting, the UK's centre of epidemic intelligence in Colindale had reduced political autonomy to act in the name of public health, less contact with local microbiological and public health services, and little surge testing capacity to fall back on should its national and regional reference laboratories be overwhelmed.
The growing centre-hinterland divide and discrepancy between preparedness rhetoric and actual laboratory capacity were mirrored in the USA. Between 1999 and 2020, a brief effervescence of public health funding and reform was followed by a prolonged phase of budget and personnel problems. In 1999, a review by the non-partisan Government Accountability Office (GAO) reported ongoing gaps in US laboratory surveillance and disease reporting. Almost half a century after their creation, it remained voluntary for states to adopt the annually reviewed list of notifiable diseases issued by the CDC and Council of State and Territorial Epidemiologists. While this arrangement enabled states to opt out of testing for diseases that were rare in their territory and focus resources on a national 'core list' of reportable diseases, it also created surveillance gaps for emerging threats like hepatitis C and AMR. 230 Technology differences exacerbated surveillance gaps. In 1999, slightly more than half of state laboratories conducted molecular PFGE-typing. However, genotypic data did not necessarily correlate with phenotypic data produced by non-molecular surveillance in other states. Similar problems characterised the adoption of IT systems. Seven years after the 1992 IoM report, 40 per cent of state laboratories made little or no use of computerised systems to receive surveillance data and 21 per cent barely used computerised systems to transmit data. 231 Concerns were also raised about the ongoing commercialisation of public testing and patchy disease reporting by out-of-state laboratories. 232 Worryingly, over 30 state laboratory directors warned that lucrative fee-based genetic screening and tests for regulatory and licensure programmes were skewing routine laboratory operations 'away from testing services beneficial to the entire community [like influenza surveillance]...'.
Powerless to override state decision-making, the CDC tried to mobilise financial resources and increase the efficiency of existing public health laboratory networks by stressing biosecurity scenarios. A 1999 CDC report declared that Americans were 'a Nation at risk'. 234 The USA was unprepared for new threats posed by the rapid dissemination of disease via global travel, AMR, and bioterrorism. Preparedness could, however, be improved by investing in a new electronic CDC Health Alert Network and by setting out regularly reviewed state improvement plans for public health. Essentially repeating recommendations from the 1988 and 1992 IoM reports, the CDC announced that it would support states with extramural grants, training, technical assistance, and evaluations. Further recommendations centred on updating public health education and creating integrated databanks. In response to the GAO, the need to improve laboratory capabilities in all federal and state health departments was also mentioned. 235 The laboratory capabilities singled out for improvement were defined by contemporary preparedness concerns. In 2000, the CDC announced the creation of a rapid laboratory response network for biological and chemical agents with integrated surveillance and a strategic plan for bioterrorism by 2004. 236 Outside of Atlanta, the APHL tried to use Congress' Healthy People 2010 initiative to promote a broader vision of public health laboratory capacities. Although it also referenced bioterrorism threats, its 2000 consensus vote on core functions of state public health laboratories ranked emergency response capabilities only eighth out of 11, below the provision of diagnostic, surveillance, reference, quality assurance, and data management services. 237 The 2001 World Trade Center and anthrax attacks created further momentum for the enactment of preparedness-oriented reforms of the US public health laboratory infrastructures. According to the IoM, 2001 had revealed: ...
vulnerable and outdated health information systems and technologies, an insufficient and inadequately trained public health workforce, antiquated laboratory capacity, a lack of real-time surveillance and epidemiological systems, ineffective and fragmented communications networks, incomplete domestic preparedness and emergency response capabilities, and communities without access to essential public health services. 238 During fiscal year 2002, the Bush Administration increased already rising federal preparedness spending. Of the $2.9 billion bioterrorism appropriations bill signed on 10 January 2002, $1.1 billion were dedicated to upgrading bioterrorism preparedness, infectious-disease surveillance, network integration, and hospital surge capacity. 239 The CDC and APHL responded by establishing a dedicated Laboratory System Improvement Program (L-SIP). Regulators also developed a Model State Emergency Health Powers Act that could be implemented by states to better integrate interstate emergency capabilities. 240 At the state and local level, post-9/11 funding led to a brief surge of laboratory spending. Many of the ca. 2,000 public health laboratories upgraded equipment, invested in 24/7 availability and IT networks, hired staff, and began offering molecular typing services. 241 At the federal level, the CDC expanded its epidemic intelligence capabilities by promoting existing surveillance systems like NARMS, VetNet for Salmonella in animals, FoodNet for the surveillance of foodborne diseases, and OutbreakNet for PFGE data sharing. 242 243 In 2003, the US public health authorities used their integrated resources to mount an effective response to SARS, 244 although the CDC subsequently highlighted the need for additional local testing capacity.
The Pandemic and All-Hazards Preparedness Act (PAHPA) and the 2006 creation of the Biomedical Advanced Research and Development Authority (BARDA) further enhanced authorities' ability to prepare for projected emergencies with strategic stockpiles and research. 246 Whether the surge of preparedness funding, which peaked in 2004, was sufficient to shore up baseline US public health capacity was debatable. 247 The new money was spent unevenly and biosecurity priorities did not necessarily align with broader infection control needs. At the federal level, Homeland Security and CDC officials repeatedly clashed over how much funding should be earmarked for bioterrorism threats like smallpox and anthrax and who should have control over new surveillance data. 248 Although states and counties used federal preparedness funds to upgrade laboratories, IT systems, and typing technologies, some complained that federal bioterrorism priorities were draining resources from routine laboratory surveillance for other pathogens. 249 The relative neglect of day-to-day laboratory testing in favour of a select number of high-priority targets was exacerbated by the Bush Administration's decision to slash the federal budget for preventing chronic disease, promoting health, and controlling infectious disease by $67 million and impose flat spending on other programmes. 250 Despite parallel preparedness money, state-level fiscal problems and increased Medicaid and Medicare expenditure resulted in public health budget declines in more than half of the US states between 2003 and 2004. 251 According to a 2007 survey of state public health laboratories, 64 per cent had insufficient staff levels, many could not afford regular training, and 43 per cent remained unable to mandate the submission of isolates and specimens from commercial laboratories. 252 Despite these problems, the USA mounted an effective response to the 2009 H1N1 pandemic.
New molecular capabilities meant that it took only two weeks from the identification of the novel virus to the creation, emergency FDA authorisation, manufacture, and distribution of a rapid RT-PCR-based test. Rapid FDA licensing had in part been possible because of prior CDC and BARDA investment in a platform technology (a five-target assay) for rapid influenza testing that could be quickly modified. In line with its preparedness protocols, the USA had also stockpiled reagents. 253 Within a few days of modifying the test, representatives from 44 public health laboratories had visited the CDC for training. Although false negatives resulting from less accurate rapid tests caused problems, PCR-based testing, electronic reporting, and vaccine development bolstered confidence in US pandemic preparedness and the ability of integrated networks of public and private laboratories to overcome chronic laboratory capacity problems in emergency situations. 254 Looking back at the H1N1 response, the APHL highlighted the importance of preparedness research, hybrid public-private lab networks, and genomic technologies for future pandemics:

Before [H1N1], the federal government would have to make a lot of tests, and we'd send them out, and people would use them . . . With people being able to do their own PCR, we could put up the recipe and let people make that themselves. It's a much more robust approach for people able to ramp up testing for a novel pathogen. 255

The fact that non-governmental academic, private and clinical laboratories could validate 'homebrew' tests against a published gold standard had stopped public health laboratories from being overwhelmed by a flood of specimens.
256 In the midst of the resulting optimism about new technologies and hybrid networks, it was easy to forget that future crises might involve less well-known pathogens and that state public health laboratories had played a key role in boosting testing, providing quality assurance, and coordinating responses outside Atlanta. 257 The resilience of this public health laboratory infrastructure faced accelerating threats following the 2008 financial crash. With the US states experiencing severe fiscal pressure, public health laboratories were forced to reduce or commercialise testing services. Federal decision-makers did little to counteract these cuts. In 2009, a large part of the Obama Administration's $5.8 billion of H1N1 surge funding was spent on vaccination, and only a small percentage of funds reached laboratories. 258 Between 2008 and 2010, over 44,000 jobs were lost in state and local health departments, including public health physicians, laboratory specialists and epidemiologists. 259 Staff shortages not only increased pandemic vulnerabilities but also compromised routine infectious disease surveillance. In 2010, 24 per cent of the US states could not submit 90 per cent of E. coli molecular typing results to CDC's PulseNet database within four working days. 260 Despite the importance of hybrid public-private networks during the 2009 pandemic, porous laboratory accreditation frameworks impeded quality assurance and data sharing between private and public laboratories as well as between local, state and federal agencies. 261 Challenges for public health laboratories increased following the enactment of the Obama Administration's 2010 Affordable Care Act. Medicare and Medicaid had been significant revenue sources for public health laboratories.
However, the vast increase in people eligible for basic testing raised fears about the future of direct public financing, necessitated a significant upgrade of public billing systems, and increased competition with commercial laboratories, while state laboratories were still required to maintain core functions and provide services to uninsured Americans. 262 The 2012 decision to reallocate $6.25 billion of the originally promised $15 billion boost to public health and subsequent sequestrations further increased problems for public health laboratories. After briefly rising to ca. 3.18 per cent of total health spending in 2002, US expenditure on public health declined to ca. 2.65 per cent of total health spending in 2014, with a further fall to 2.4 per cent predicted by 2023. 263 Although CDC planners resorted to the familiar approach of trying to compensate for cuts with improved IT and next-generation sequencing, 264 recruitment problems and budget cuts, including President Trump's announcement on 10 February 2020 of a 16 per cent reduction of CDC funding, severely undermined the resilience of the US public health laboratory infrastructure just before it would be tested to its limit. 265 Of the three analysed countries, Germany was the only one to mostly strengthen laboratory services between 2000 and 2020. Driven by EU requirements and demands for integrated European surveillance networks, a new infectious disease law (Infektionsschutzgesetz) came into force in 2001. In line with previous reforms, key public health competences remained in the hands of state and communal authorities, but the 2001 law enhanced authorities' ability to respond to outbreaks with improved vaccination, surveillance and police powers, including the right to impose quarantine and job bans. The federal centre in Berlin was empowered to impose federal control in the case of disease events of national significance.
State-level and federal disease reporting requirements were unified and upgraded, with Berlin now able to demand laboratory and disease surveillance data that included personal information during emergencies. 266 As the federal epidemiological centre, the RKI focused more resources on curbing nosocomial infections, increasing vaccine coverage (by incorporating Germany's permanent vaccine commission), and improving infectious disease surveillance. 267 In doing so, the RKI continued to build on the CDC model of epidemic intelligence. Sentinel and reference laboratories were strengthened. Doctors, hospitals, and private laboratories were required to report both suspected and confirmed cases of notifiable diseases to communal or state health authorities, who would then communicate routine information and test results to the RKI in anonymised form. 268 To facilitate data integration, the RKI emulated US and UK investment in IT systems by rolling out SurvNet@RKI as the first integrated German electronic reporting platform for infectious disease among civilian authorities on 1 January 2001, and in the armed forces in 2006. 269 Preparedness planning featured alongside overall systems' strengthening. Following WHO guidance from 1999, Germany developed its first pandemic influenza plan in 2001, well after similar plans in the USA and the UK. Reporting systems were updated after the 2011 enterohaemorrhagic E. coli (EHEC) epidemic exposed ongoing communication gaps between different states. A new electronic notification system (Deutsches Elektronisches Melde- und Informationssystem für den Infektionsschutz) was trialled between 2016 and 2020. 270 Repeatedly upgraded after 2001, Germany's new infectious disease legislation addressed long-standing organisational constraints at the communal and state level whilst also integrating national networks and strengthening federal oversight.
This did not mean that variations in state public health arrangements ceased or that the RKI in Berlin became dominant. A further attempt to centralise and rationalise regulations and competencies was rejected by the states in 2005. 271 At the political level, the 2001 reform instead solved long-standing problems of fragmented federal, state and communal responsibilities and the relative neglect of infectious disease control. Following the subsidiary principle, reforms at the top did not necessarily lead to cuts at the bottom. While English reforms fragmented and reduced local infection control capabilities and US surge funding prioritised preparedness over routine laboratory services, German reforms strengthened both emergency and routine services. Between 2000 and 2015, state, federal, and health insurance spending on prevention and public health services amounted to between 3.27 and 3.52 per cent of rising total health expenditure. Money was spent at the federal, state and communal levels. 272 The more complicated nature and looser integration of German infection control and laboratory infrastructures meant that the RKI's epidemic intelligence capabilities could not match those of the CDC or PHE. However, German authorities could draw on a public health hinterland with sufficient power, autonomy and coordination to react flexibly should centres of excellence be overwhelmed. Despite coordination problems, 273 the hybrid nature of German systems building, coupled with the early decision to incentivise and incorporate testing by commercial laboratories, allowed for a rapid 2020 surge of testing and access to resulting data by local authorities. 274

Conclusion: Upgrading the Hinterland
On the eve of COVID-19, assessments of the preparedness of different public health infrastructures were thus deceptive: Anglo-American centres of excellence wielded significant international influence, had developed sophisticated molecular typing and electronic notification systems for existing laboratory assets, and had fine-tuned a model of epidemic intelligence that could rapidly deploy experts and stockpiled equipment to quell localised outbreaks. Despite ongoing cuts and reforms, responses to the 2003 SARS and 2009 H1N1 outbreaks engendered confidence that Anglo-American public health systems could protect citizens from new or familiar pathogens.
However, beyond the centre, the wider state of public health laboratory infrastructures was worrying. Since the late 1970s, decades of budget reductions and attempts to streamline laboratory services had eroded local testing capacity and microbiological expertise. Routine laboratory services for many infectious diseases had been cut or outsourced, and staff shortages were significant. Should centres of epidemic intelligence fail to stamp out an emerging epidemic or pandemic event, there was little hinterland capacity to fall back on to protect the public's health.
These infrastructural weaknesses had been highlighted by multiple reports and exercises like the UK's 2016 Operation Cygnus. 275 Interestingly, however, they were not adequately taken into account by international preparedness reviews. Conducted by experts who either originated from or had been trained in the USA or the UK, prominent reviews followed the narrow priority-setting approach of 1990s pandemic planning by concentrating on key performance indicators, including detection, reporting, and response capabilities for specific pathogens as well as health systems' capacity to treat sick workers. 276 Missing from this mode of analysis was a focus on the non-specialist public health laboratory services whose surge capabilities, quality assurance, and expertise in collaborating with other local public health services connected national nerve centres of epidemic intelligence to the rest of the country. The relative political autonomy of public health centres to act swiftly was also not taken into account. The result was a mode of review that was designed to confirm rather than challenge orthodoxies of centralised intelligence and lean surveillance and that consistently ranked countries with more decentralised resources lower than those with individual hubs of excellence. 277

While hindsight is cheap and history provides no simple lessons, focusing on the long-term development of the laboratory infrastructures underpinning public health systems can provide insights for planning going forward. One of these insights is the importance of context and time when it comes to creating and maintaining resilience. As this article has shown, laboratory infrastructures evolved to fit the specific needs, political systems, and infection control visions of their respective national contexts.
While the concerns shaping this evolution were often similar, the degree of central control, integration, and efficiency of resulting infrastructures differed: Britain's PHLS evolved its laboratory network to fit around an already existing system of local epidemiology and public health consisting of MoHs and local authorities; the CDC's concept of epidemic intelligence was developed in response to concerns about biological attack and the need to coordinate autonomous state public health laboratory systems; West Germany's laboratory system was shaped by existing communal traditions and by the post-war focus on de-Nazifying and decentralising public health.
The specificity of each laboratory infrastructure meant that reforming it required careful deliberation about how it fitted into the wider public health ecosystem as well as sufficient time