Inasmuch as the main reason for the establishment of the 1986 Public Health Enquiry was the identification of problems concerned with the control of communicable disease, it was important to ensure that its recommendations were being implemented. By a curious set of coincidences, I found myself almost immediately deeply involved in many of the activities flowing from the Government’s acceptance of the recommendations. Having indicated my intention to take premature retirement from Coventry Health Authority in 1990 in order to develop my teaching activities at Warwick University, where I had been a visiting senior lecturer in community medicine for 10 years, I found that the fact that I was immediately available for part-time work led to a number of interesting invitations from a variety of organisations, all of which were focused on one or other aspect of implementing the Acheson recommendations.

The first two approaches came simultaneously from the Director of CDSC, Dr Chris Bartlett, and Dr Deirdre Cunningham, Head of the Public Health Division at the Department of Health, who jointly recruited me essentially to carry out a survey of the 14 Regional Health Authorities in order to ascertain who would be discharging the duties of CCDC in each district, the professional background of each post holder and what specific training needs might be identified for the competent discharge of the relevant duties. These regional visits would also provide the opportunity to identify those trainees (mainly in public health medicine and medical microbiology) who might be keen to become CCDCs, which in turn might give an indication of the need for training programmes that could be provided by universities or other educational establishments (Department of Health 1990). The third approach was more local. As it was my intention to continue to reside in the West Midlands, the Regional Medical Officer, Dr Michael Harrison, suggested that I should not neglect my “home” Region and I therefore gladly accepted his invitation to take on the part-time role of coordinator of training programmes in communicable disease control for the West Midlands. These three interlocking sets of duties, none of which in itself was too demanding or time-consuming, produced the amusing result that, having retired from Coventry HA on the 31st May 1990, I recommenced work on the following day, operating from three separate locations considerable distances apart. As this was at a time when mobile telephones were not very common, my wife generously provided the coordinating link from our Coventry home.

At the same time, a steering group chaired by Professor David Miller of St Mary’s Hospital was set up by the Faculty of Public Health Medicine, the Royal College of Pathologists and the Royal College of Physicians to examine the existing educational facilities which might contribute to the relevant training programmes. It was thought helpful for a small sub-group to carry out the detailed work on behalf of the steering group and I readily undertook to be its convenor; it was chaired initially by Professor Raj Bhopal of Newcastle University and subsequently by myself. The task took 4 years and the outcome was the report Training for the CCDC Role (Pollock et al. 1994), published by the (then) Faculty of Public Health Medicine in London.

It soon became apparent that England’s response to the Acheson Report, during the first half of the 1990s, was far-reaching and carried out in some depth. There was a strongly perceived need for people to be protected against communicable diseases and, unlike some other good ideas emanating from national committees from time to time, action did in fact follow. Two fields of practice within medicine were clearly able to make the major contribution, namely public health medicine/epidemiology and medical microbiology. Although the specialty of clinical infectious diseases had “equal rights” in this field, it was found in practice that very few of these physicians applied for CCDC posts, although the Royal College of Physicians was strongly represented on the steering group on training referred to above. (It would be idle to try to hide the fact that there was a measure of “healthy rivalry” between the epidemiologists and the microbiologists at this time, but all committees or groups tended to ensure that both specialties had equal representation. A major joint conference held in June 1990, with a view to giving a full airing to these issues, was so well subscribed that the main hall of the Royal Institute of British Architects had to be hired to accommodate all those attending. An epidemiologist chaired the morning session and a microbiologist the afternoon one; much of the discussion was of the kind usually referred to as “full, frank and useful.”)

A further feature of the situation during this period was that the country became much more outward-looking with regard to understanding these problems, showing greater interest in how other nations were tackling them. As one example of this development, in June 1991 I was sent by Dr Harrison to CDC Atlanta, in order to gain a general picture of how the USA was responding to the threats and specifically to learn about the relatively new computerised surveillance system known as Epi-Info and to bring back its software so that it could be introduced locally. In this connection, I was allowed to train at CDC alongside the state epidemiologists, being given the light-hearted title of “honorary state epidemiologist.” The reverse of this situation also applied; other countries were interested in what we were doing, and I was fortunate enough to have the experience of being invited to take part in seminars on the events in England in places as far apart as UCLA, the Oklahoma State Health Department and the University of Helsinki.

Back in England, however, it was felt that the special situation of Greater London deserved particular consideration, taking into account the fact that many people might live in one part of the metropolis, work in quite a different part, and engage in social activities (including eating out) in a variety of locations relatively remote from both home and work. Given also that many thousands arrive in the capital each day by air, rail and road, it was clear that communicable disease surveillance and control were of paramount importance. Accordingly, in 1993 I was commissioned jointly by the four Thames Regional Health Authorities to examine the situation across Greater London—comprising 41 separate District Health Authorities, each with its own CCDC—and to report on it. The four Regional Directors of Public Health kindly arranged a series of appointments in Central London for me to meet the CCDCs and take evidence from them. (Without such an arrangement the project would probably have proved impracticable within a reasonable time-frame.) The main outcome of the study was the acknowledgement of the need for shared surveillance data across Greater London (Pollock 1993) and accordingly a Pan-London Change Management Group was set up to take this forward. As a result, in 1994, the London Communicable Disease Surveillance Project came into being, funded jointly by the four Thames Regions and the Department of Health, based at CDSC Colindale and with its own monthly publication, the Thames Monitor. [Achieving this outcome required thoughtfulness and sensitivity. Individual District CCDCs, quite understandably, felt that their District data belonged to them—they had collected it and they would be utilising it. However, a compromise was readily reached by which District CCDCs would submit their data to a shared surveillance base covering Greater London on the clear understanding that all control activities within a District would remain their responsibility.]

But what was actually happening in England, in terms of communicable diseases themselves, in the wake of the Acheson Report? I had some indication of one issue at my first meeting with Dr Harrison on taking up my part-time duties at the West Midlands RHA in June 1990, when he apologised for offering tuna sandwiches instead of the usual corned beef ones, explaining that this was an instruction from the Authority’s Chairman; in the latter’s words, up to that point there had been no evidence of any condition that might be referred to as “mad tuna disease!” Certainly at that time there was considerable public anxiety that the cattle disease bovine spongiform encephalopathy (BSE)—so-called “mad cow disease”—might have spread to humans, and in that year a national surveillance unit had in fact been set up in Edinburgh to monitor this possibility. The justification for this concern was that the evidence suggested that BSE in cattle had been caused by their being fed meat and bone meal from carcases of sheep suffering from scrapie, and therefore that the infection might cross the species barrier once more and infect humans who had eaten beef (Kimberlin and Walker 1989). The widespread publicity which this matter had received altered the dietary habits of large numbers of people, and food retailers were quick to respond by replacing beef, for example in pies, with other meats such as chicken. In Scotland, the well-known small round individual meat pies, an inexpensive local delicacy, were frequently filled with macaroni cheese instead. The situation even caused me a moment of minor personal embarrassment when my mother, at that time in her early 90s, insisted that it was my duty to instruct the staff of the residential home in which she was then living not to serve beef to any resident!

A condition known as Creutzfeldt-Jakob Disease (CJD), a form of rapidly progressive encephalopathy affecting older people, had for many years been known to exist (Department of Health 1995). But in 1996 the Edinburgh unit began to receive notifications of a similar clinical syndrome affecting much younger persons and running a more protracted course. Investigation led to the conclusion that in these cases consumption of infected beef had taken place before 1988, when a ban on feeding the suspected material to cattle had been introduced. This new syndrome, quickly given the name new variant CJD (vCJD), was therefore regarded as the consequence of exposure to the infected meat. In 1997 it was concluded that the agent which causes vCJD in human beings was the same as that which causes BSE in cattle (Department of Health 1997). This represented a bitter irony as in 1990 the Department of Health, on the basis of professional advice, had issued a statement via the Chief Medical Officer, Sir Donald Acheson himself, that “if there were any hazard for man from oral ingestion of beef or beef products, the risk would be very small indeed” (Department of Health 1990). Surveillance by the Edinburgh unit has continued and at the time of writing (October 2011) 1,647 deaths from CJD of all forms had been notified. Various aspects of medical, surgical and dental practice have had to be modified in the light of this risk of contamination by blood or blood products. A great step forward was taken in February 2011 when the MRC Prion Unit at University College London announced that its scientists had developed the world’s first reliable blood test for vCJD. This should greatly aid diagnosis, and also allow screening and identification of carriers, with great significance for ensuring the safety of blood transfusion (Edgeworth et al. 2011).

Another communicable disease was being recognised as a public health problem about this time—Hepatitis C. Although microbiologists had been aware since the 1970s of a further form of hepatitis, which they referred to as “non-A, non-B,” it was not until April 1989 that the responsible virus was discovered and named Hepatitis C, and even in the following year reliable tests for use on a routine basis were only just becoming available (Department of Health 1990). The virus is spread by blood-to-blood contact and in England the majority of infections are spread by the sharing of needles and other “equipment” by injecting drug misusers. Sexual transmission is not considered common, and specific screening of blood and blood products has cut off that particular path of transmission. The sinister aspect of hepatitis C infection is that the early stages are usually symptomless but, without treatment, about 85% of those infected go on to develop a chronic infection, with the risk of cirrhosis and a form of liver cancer. Routine screening of blood donations for anti-HCV (Hepatitis C virus) began on 1st September 1991 (Department of Health 1991).

It is interesting to note that the coming into being of the CCDC post, as an important recommendation of the Acheson Report, could hardly have made much difference to the incidence of the above two infections in their early stages, as both are relatively “silent” chronic infections without readily identifiable symptoms of onset. On the other hand, both conditions obviously called for surveillance data as a backcloth to measures of prevention and control, including a major component of health education.

The next communicable disease episode, however, could not have presented a greater contrast. The pandemic of Severe Acute Respiratory Syndrome (SARS), a pneumonia-like infection caused by a new member of the coronavirus family (World Health Organisation 2003a), was indeed severe and acute, the disease having an incubation period of no more than 10 days and a case-fatality rate of just under 10%. As coronaviruses are important pathogens of mammals and birds, it was considered that these might have constituted the origin of the human infection (Fouchier et al. 2003). Although England did not have many suspected cases, it would be totally false to suggest that the country was not very heavily involved for a period of a few months. In fact, it would be fair to state that most countries in the world had their public health prevention and control systems tested to the hilt by this phenomenon, as the following account shows.

It is highly probable that the pandemic began in Guangdong Province, China, in November 2002 when a farmer was admitted to a local hospital—the First People’s Hospital at Foshan—and died. It is a matter of note that studies of samples of wild animals sold as food in the local market in Guangdong in May 2003 revealed that the SARS coronavirus could be isolated from palm civets (Wenhui et al. 2006). No notification was made to WHO by the Chinese health authorities at the time, but Canada’s component of the WHO’s Global Outbreak Alert and Response Network picked up reports of a “flu outbreak” in China on 27 November 2002 (Heymann and Rodier 2004).

In February 2003 an American businessman was taken off a China to Singapore flight at Hanoi, Vietnam, because of what appeared to be a serious form of pneumonia. He died in hospital there, and a number of medical staff who had treated him also became ill with the same disease in spite of routine hospital infection control measures. Particularly tragically, the doctor there who identified the condition and alerted WHO was himself fatally affected. This episode, understandably, received much media publicity and on 12 March 2003 WHO issued a global alert. Local transmission of the condition now known as “SARS” occurred within Canada, the United States, Ulan Bator, the Philippines, Taiwan, Vietnam and Hong Kong, in addition to spreading within many parts of China itself. Quarantine of contacts and closure of schools were introduced in many affected countries in an attempt to control further spread of SARS. On 27 March, WHO recommended screening of airline passengers for symptoms of the condition (World Health Organisation 2003b), and on 23 April advised against all but essential travel to Toronto. (This latter recommendation caused me a slight degree of apprehension because, just at that point, an old school friend emailed me from Toronto asking me to meet him at Aberdeen Airport, as he was aware that I would be spending a few days there at the time of his visit. As he didn’t refer to SARS, I felt too embarrassed to raise the subject but heartily wished that he were coming from some other part of the globe, such was the fear, especially among doctors, of the infectiousness and lethal potential of the condition!)

The numbers involved globally were very high indeed. Between November 2002 and July 2003 there were 8,096 known cases with 774 deaths. England escaped comparatively lightly with just four suspected cases (i.e. those which met the agreed case definition), the first two having recently returned from Hong Kong and Taiwan respectively, although there was reasonably good evidence of a number of further infections which did not meet the suspect or probable case definition and, fortunately, were probably of minimal infectiousness (Nicoll A, 2003, Personal communication. Letter to author, 3rd November 2003).
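
Those global figures are, incidentally, consistent with the case-fatality rate of just under 10% mentioned above: 774 deaths among 8,096 known cases gives 774 ÷ 8,096 ≈ 0.096, i.e. a case-fatality rate of about 9.6%.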

As was mentioned earlier, although England was only minimally involved in terms of the number of suspected cases, this does not mean that the country did not play a major part in the unprecedented degree of international collaboration which the pandemic brought about, in addition to responding vigorously internally at national, regional and local level. Chapter 6 explained in detail how an excellent surveillance system had been set up, and this proved its value throughout this episode in the way in which suspected cases and their contacts were promptly and effectively dealt with. The national reference laboratory at Colindale played a major role in this. The system demonstrated considerable “surge capacity” to deal with the additional pressures, but only at the expense of some staff having to be diverted from their usual tasks. [The question of surge capacity for communicable disease control, especially at local level, has represented a matter of major concern since the abolition of the post of Medical Officer of Health in 1974. The point is one of such fundamental importance that it is dealt with in some detail in Chap. 8.]

The situation in England was made more challenging by the fact that the pandemic occurred at a time of great reorganisation in the services provided for communicable disease control. A major change was the establishment of the Health Protection Agency (HPA) on 1 April 2003—in the middle of the whole episode—bringing into one central Agency a number of health protection bodies including the PHLS and incorporating local CCDCs into its local health protection teams. From that point onwards, the HPA coordinated the operational public health response. This response included an on-going process of contingency planning in the belief that SARS might not have “gone away” (Harper 2004). This view, i.e. the need to avoid any complacency, was also expressed by the then Director of the PHLS Communicable Disease Surveillance Centre, Professor Angus Nicoll, who felt that one of the factors which had allowed England to get its preparations in place in good time was the experience gained in such places as Hong Kong and Canada, which had been among the countries earliest affected (Nicoll A, 2003, Personal communication. Letter to author, 3rd November 2003).

The background to the creation of the HPA was as follows. A 2002 Government report, Getting Ahead of the Curve: a strategy for combating infectious diseases (including other aspects of health protection), declared the intention to create a new Health Protection Agency which would provide a more integrated approach to all aspects of health protection: against infectious diseases, together with chemical and radiological hazards. The new Agency was therefore to absorb the existing functions of the PHLS (including its Communicable Disease Surveillance Centre), the Centre for Applied Microbiological Research, the National Focus for Chemical Incidents, and the National Radiological Protection Board. As mentioned above, the new Agency became operational on 1 April 2003, when the PHLS was already heavily engaged in dealing with the national response to the SARS pandemic—“a baptism of fire for the entire Agency,” as it was described (Harper 2004).

The role of the Agency is to identify and respond to health hazards and emergencies caused by infectious disease, hazardous chemicals, poisons or radiation. It works at international, national, regional and local levels. Although set up by the Government, the Agency is independent and provides whatever advice and information is necessary to protect people’s health. Local and Regional Agency services work alongside the NHS. In addition to the Centre for Infections (which absorbed the functions of the PHLS, including its Communicable Disease Surveillance Centre), there are Centres for Emergency Preparedness and Response, Radiation, Chemical and Environmental Hazards, and the National Institute for Biological Standards and Control. The Centre for Infections at Colindale is the base for national communicable disease surveillance and specialist microbiology (Health Protection Agency website 2010a).

It was not very long after the end of the SARS episode that concern began to be focused on the possibility of another pandemic—influenza. Ever since the first pandemic of AH1N1 infection in 1918/1919, referred to as “Spanish influenza,” there had been fears of a further world-wide recurrence of this or a similar virus. But it was not until 1957 that it appeared that such fears might be justified, when the AH2N2 variant emerged, referred to as “Asian influenza” because of its origin in the Far East in May of that year. Cases began to occur in England as early as the following month and during the winter of 1957/1958 there were approximately 50,000 influenza-related deaths (Donaldson and Scally 2009). I remember the situation well, as a vaccine was offered from November 1957. My fiancée and I had the first of the two doses, but the second fell only a few days before we were due to be married and so we declined the follow-up, not wishing to be indisposed on our honeymoon!

A third pandemic began in 1968, yet again in the Far East, and was referred to as Hong Kong influenza (or, as the media called it, “Mao flu,” implying perhaps a political in addition to an epidemiological threat!). The virus on this occasion was AH3N2. I also recall this episode vividly, but for a very different reason: in my capacity as Deputy Medical Officer of Health of the City of Coventry, I had first to convince the City Council’s Finance Committee of the necessity to release funds to purchase the relevant vaccine and then to persuade those in the front-line community and public services of the advisability of being vaccinated. This meant not only the doctors, health visitors, district nurses and midwives, but also many others, such as those who manned the City’s buses and the Fire and Rescue Services. I have clear memories of many busy evenings in the Central Fire Station’s main hall, vaccinating large groups as they came off duty or shift. In the event, this particular epidemic was not quite as serious as might have been expected, at least as far as England was concerned.

A “false alarm” occurred in 1976 in the United States. In January of that year a virus identified as “swine flu,” isolated from four sick Army recruits at Fort Dix, New Jersey, was thought to be similar to the AH1N1 organism responsible for the 1918 pandemic. By March, CDC Atlanta and the United States Public Health Service had been able to persuade President Ford to approve funds of $135 million for the preparation of an effective vaccine to be ready for mass use by October. Within 10 weeks almost 50 million Americans had been vaccinated. Unfortunately, by mid-December, a causal relationship had been identified between the vaccine and a small number of cases of Guillain-Barré paralysis and, as there had been no further cases of “swine flu” since the original four, the vaccination campaign was suspended (Mullan 1989).

Obviously, after three pandemics in just half a century, epidemiologists were convinced that it would be only a matter of time before the next threat appeared, and the Department of Health in London continued to develop contingency plans for such an event. Indeed, in his Annual Report for 2005, the Chief Medical Officer stated: “When the influenza pandemic arrives, health care facilities will be under enormous pressure and will need to be targeted at those most in need. The pandemic will pose a unique challenge to NHS emergency planning in the modern era. Clear national policies and strong, well-rehearsed local plans will be the keys to mitigating its effects.”

It had been felt that the danger signals for a fourth pandemic were emerging when, towards the end of 2003, just a few months after the end of the SARS episode, cases of AH5N1 influenza were reported in Vietnam and China with a number of deaths, all in individuals who were working closely with poultry, and the terms Avian Influenza and “bird flu” came into general use. (Originally it had been considered that avian influenza did not normally infect species other than birds and pigs, but there had been one episode in Hong Kong in 1997 in which 18 persons had been infected, with six deaths. On epidemiological grounds the virus had spread directly from birds to humans, and this was confirmed by genetic studies.) One of the outbreaks in late 2003 was reported in a letter from Chinese scientists to the June 2004 issue of the New England Journal of Medicine, and the Chinese Health Ministry informed WHO of the confirmation of this by laboratory tests. Although there was no convincing evidence that human to human spread was happening—all the cases had had very close contact with poultry—the memory of SARS was sufficiently recent to cause a certain amount of global anxiety; epidemiologists world-wide were expressing fears that the virus might mutate into a form which could allow human to human transmission. What actually transpired was that AH5N1 infections appeared to smoulder on, with poultry-associated cases and deaths mainly in the Far East. The media always seemed to be hinting that a human pandemic might be imminent, and much publicity was given to the only two anti-virals which were considered to have some action against the virus, Tamiflu and Relenza. Aware of the anxieties being quietly expressed in the autumn of 2005 by some of my epidemiology colleagues, I gave way to family pressure and obtained a supply of Tamiflu and a box of face masks which, fortunately, never had to be used.

In contrast to Avian Influenza, so-called “Swine Influenza,” when it arrived in April 2009, seemed on the face of it to present a real threat to humans. This was a global outbreak of a new strain of AH1N1 influenza virus which appeared to have resulted when a previous triple reassortment of bird, pig and human influenza viruses further combined with a Eurasian pig influenza virus (Hellerman 2009). The outbreak began in Veracruz, Mexico, and spread globally. The case-fatality rate appeared high initially, but this was later put in perspective as the view generally developed that many milder cases had not come within any medical ambit and that the actual case denominator was very much larger than had originally been realised. In June 2009 both WHO and CDC Atlanta declared this outbreak a pandemic. In spite of its unofficial label, the AH1N1 infection was not spread by eating pork, but from person to person by droplet spray.
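
To illustrate this denominator effect with deliberately hypothetical round numbers (not the actual Mexican figures): a case-fatality rate is simply deaths divided by known cases, so 60 deaths among 1,000 laboratory-confirmed cases gives 6%, whereas the same 60 deaths among an estimated 100,000 infections, once milder unreported cases are counted in, gives only 0.06%.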

The first cases in England were reported on the 27th April 2009. Children were most affected and adults over 50 years of age had much lower attack rates. There were two waves of activity separated by the closure of schools over the summer, the first in mid/late July 2009 and the second peaking in mid-October 2009. The West Midlands and the London area were most affected in the early stages. The case-fatality rate was estimated to be 0.04%, most deaths being in persons under the age of 65 years. Anti-viral drugs were offered to those with suggestive symptoms and these significantly reduced the length of the illness. They were also valuable in reducing the incidence of secondary cases in household contacts. A vaccine programme was begun in October 2009 initially for front-line health care workers, pregnant women, and those aged between 6 months and 65 years who were in the defined clinical at-risk groups. From December 2009 the vaccine was also offered to healthy children between 6 months and 5 years of age (Health Protection Agency Website 2010b). The pandemic began to taper off in November 2009 and by May 2010 the number of cases was in steep decline. On August 10th 2010, WHO declared that the pandemic was over.

The pandemic presented critical emergency planning, response and recovery challenges at both national and local levels. The public health response required the assessment of hundreds of thousands of cases nationally, the delivery of enormous quantities of anti-viral medicines and a massive vaccination programme, while planning for the anticipated loss of key services and staff through illness. Throughout the progress of the pandemic both public and medical perceptions shifted from near panic to indifference as all struggled to respond to changing and sometimes conflicting public communications. In Sandwell, West Midlands, for example, the scale of the required response regarding both anti-virals and vaccination meant that large numbers of staff of the Primary Care Trust had to be rapidly redeployed from their normal duties, being joined by staff of the Local Authority and other local bodies in developments which rapidly took on a “febrile” tempo (Saunders 2011).

Unfortunately, after seeming to settle down, the infection re-emerged just before Christmas 2010, leaving doctors puzzled; as Watson and Pebody (2011) have pointed out, the 2010/2011 seasonal activity in the UK and other European countries showed that the threat of pandemic infection had not disappeared. The whole episode has not passed without a certain amount of critical comment. For example, in January 2011 the European Parliament launched a strong attack on the World Health Organisation’s handling of the situation, alleging that it had distorted the term “pandemic,” set off a world-wide false alarm and thus given rise to disproportionate public health decisions by European Union countries (Watson 2011).

It is with some regret that I have to end this continuous narrative on a note of considerable uncertainty with regard to the future arrangements for the prevention and control of communicable diseases in England. The new Coalition Government, elected in May 2010, has declared its intention, in a White Paper Equity and Excellence: Liberating the NHS, followed by the Health and Social Care Bill (at present before Parliament), to undertake a radical reform of the NHS, abolishing the existing management structures at both strategic and local levels and making general practitioner consortia responsible for commissioning the majority of health services for their local communities. Commissioning is intended to be based on knowledge of local needs, thus theoretically avoiding previous problems with over-commissioning of services and subsequent financial waste (Vaid 2010).

The Government has also made it clear, in a further White Paper, Healthy Lives, Healthy People, that it intends to dismantle the HPA and place the responsibility for the majority of public health services on Local Authorities, to which Directors of Public Health are now to be accountable. (As an ex-Medical Officer of Health, accountable at the time to Coventry City Council, I cannot but be reminded of the situation prior to April 1974. Plus ça change….!) The rationale behind this change is the belief that such a service will give more power to local people over their health in tackling such problems as obesity, alcohol dependence, smoking, sexually-transmitted infections, and poor mental health. Furthermore, the expectation is that Directors of Public Health will be able to champion cooperation at local level so that health issues are considered alongside services such as housing, transport and education, thus creating an environment conducive to healthy choices. The exercise of such choices will clearly be important, as the Government has made it clear that it will “stay out” of people’s everyday lives wherever possible, and instead will “nudge” people in the direction of choosing healthily. There is currently much professional and public discussion concerning the meaning of this word (new, in this context!).

At a national level, a new core public health service—Public Health England—is to combine experts from public health bodies such as the Health Protection Agency and the National Treatment Agency as a part of the Department of Health itself. Public Health England will actually be accountable to the Head of the Civil Service—and not to the Chief Medical Officer; this means that, for the first time, the CMO will not be responsible for public health in England (Field 2011), a somewhat paradoxical situation as the CMO’s Annual Report was traditionally entitled On The State Of The Public Health. The Local Health Protection Units, broadly similar to those currently provided by the Health Protection Agency, are now to operate within its framework. It is presumed that CCDCs will work within these Units. Public Health England is also to be the appointing agency for Director of Public Health posts and the national source of professional support to such posts in their Local Authority setting.