The history and development of public health in developing countries
Than Sein and Uton Muchtar Rafei
Early public health
Empirical public health
Colonial public health
Foundation of international public health
Establishment of the international health institution
Science-oriented public health
Development of basic health services
Disease prevention and control campaigns
Public health successes
Gaps between the ‘haves’ and the ‘have-nots’
Political public health
Socialized public health
The Health for All movement
Eradication and elimination of disease
Emerging and re-emerging diseases
Viral hepatitis B
Diarrhoeal diseases and acute respiratory infections
Health systems reforms
New public health
Public health transition
Globalization of public health
New facts of socialized public health
Historical reviews of the development of medicine and public health form the basis of our knowledge. Such reviews may provide valuable insights that can contribute to the solution of present and future health problems. Thus, it is useful to regard the evolution of public health from the earliest times as an essential element in modern public health education. Barton (1979) described the development of the health sciences across five major areas: empirical health, basic science, clinical science, public health science, and political science. Ko Ko U (1986) set the tone for integrated health in Public Health Myths, Mysticism and Reality, describing the progress of health development across each of these areas. A study by the Institute of Medicine in the United States (1988) indicated that there had been a growing demand for public health, whether as a profession, a governmental activity, or a commitment to society. The study also indicated that public health was not clearly defined, fully understood, or adequately supported. Public health, as the study expressed it, needed to focus on improving the conditions that have a bearing on the health of the people. The goals of public health, in broad terms, should be to identify problems that affect entire communities or populations, to marshal support to address these problems, and to ensure that the solutions are implemented. Frenk (1993), Curtis and Taket (1996), Detels and Breslow (1997), and many others later defined both the national and international perspectives of the current and future scope and concerns of public health. A series of national, regional, and international conferences, seminars, and workshops on the role of public health in health development in the twenty-first century have been organized by the World Health Organization (WHO) and, more recently, by many other international bodies.
Detels and Breslow (1997) defined public health in simple terms as the process of mobilizing local, state, national, and international resources to ensure the conditions in which people can be healthy. Historically, public health meant health development undertaken by the government as a public sector activity. Public health action was sometimes seen as health interventions addressing more than one individual, such as community hygiene, sanitation and water supply, health education, maternal and child health care, immunization and nutrition promotion, or disease control activities. The people who carried out such measures were known as public health workers. Commonly, public health covered promotive, preventive, curative, and rehabilitative health measures. Most of the steps previously undertaken by governments included actions to promote and protect the health of the people through segregation, quarantine, prohibition, and other sanitary and hygienic practices that were considered to be public health measures. Disease and environmental control measures, or food and drug control, carried out by government agencies were likewise considered public health activities. Similarly, legislative acts and bylaws proclaimed to control various health problems have been regarded as public health measures. Later, the connotation of the term ‘public’ was widened to encompass the involvement of people together with the government in health development efforts. This concept of a wider public role in health development has become more prevalent today, when both non-communicable and infectious diseases present the major public health problems. Without the full involvement of the population, the control of these diseases is ineffective.
Public health problems were historically understood as the diseases or conditions that affected large numbers of people, leading to death or disability. They were usually socially interpreted, and not all people saw the same reality. Diseases such as malaria, tuberculosis, cholera, HIV/AIDS, and respiratory infections, along with injuries and trauma, cancer, and problems such as maternal and infant deaths, were identified as major public health issues in the 1990s, likely to continue into the new millennium. In the late 1990s, the concept of essential public health functions emerged within the context of health sector reforms undertaken in many developing countries (Bettcher et al. 1998). These functions should be understood as comprehensive and closely interlinked. The essence of public health is that it deals with the health of the population in its totality.
This chapter traces the historical development of comprehensive public health in the context of over 140 developing countries, of which about one-third are classified as the ‘least developed’. The first part of the chapter deals with the development of public health before the twentieth century, especially how public health was developed in the former colonial countries and the impact on health of the globalization of trade during the colonial era. It also documents the efforts of developing countries leading to the establishment of an international health organization. The second part presents the attempts made by developing countries, as soon as they achieved independence from colonial rule, to cope with the prevailing high morbidity and mortality, including their major public health achievements and failures in preventing and controlling communicable and non-communicable diseases of global importance.
The third part of the chapter deals with the change in the concept of public health from narrow disease control interventions to multisectoral approaches. This coincides with the period when most countries joined the Health for All movement, which adopted the primary health care approach for reaching the universal goal of health for all by 2000. At the end of the twentieth century, developing countries were moving towards a new era of public health development while suffering the double burden of infectious and chronic disease, and while their health systems tried to function with limited investment from both internal and external sources. The new phenomenon of globalization has posed more complex challenges to the development of public health in the new millennium. The chapter also covers the different phases of public health in different parts of the world. Examples are presented of how communities and countries have mobilized themselves to ensure the health and prosperity of their people. The success of public health measures depends on adhering to the basic principles of equity, social justice, and partnership.
Early public health
Empirical public health
Since ancient times, human life has been threatened with diseases of all kinds. Historical records from the Egyptian, Roman, Greek, Indian, and Mayan civilizations reveal the dreadful nature of infectious diseases and how they were overcome. The teachings of Lord Buddha, as well as the Bible, the Koran, and Judaic literature, covered various aspects of personal hygiene and other public health practices, including civic duties. Sanitation measures were enforced through royal decrees (Sigerist 1951).
Diseases like syphilis, malaria, leprosy, tuberculosis, smallpox, measles, plague, and cholera were rampant in all parts of the world for many centuries. Most diseases occurred locally, killing thousands in certain years. The concept of ‘disease’ was postulated within the limited ‘scientific’ knowledge available. Traditional medicine focused on the management of illness at the individual rather than the public level. The spread of diseases through contact amongst people or through hereditary transmission was, however, recognized centuries ago. The treatise on economics and government by Kautilya (around 300 BC), during the early Maurya dynasty in India, showed how a king ensured the health and prosperity of his subjects through various measures and regulations. Heavy punishments were imposed on those guilty of adulterating food, of sexual violence, or of littering the streets. The royal proclamation also prescribed rules for establishing brothels and entertainment centres (Rangarajan 1992). Quarantine and prohibition were major measures used historically to prevent the transmission of diseases, and they remain public health measures used by governments in many countries.
Jeremy Bentham, Thomas Southwood Smith, Edwin Chadwick, Sir John Simon, John Snow, and William Farr stimulated the public health conscience and its principles in the nineteenth century. Victorian sanitarians of the pre-Pasteur era mainly conformed to the theory that diseases were related to decaying organic matter and its vaporous emanations or ‘miasma’ (Paneth et al. 1998). Max von Pettenkofer, one of the pioneers of public hygiene, also developed modern public health principles in the same period. He believed that an agent in cholera evacuations became infective only after it had spent an extended period in the earth and entered the ground water. He experimented on himself by drinking a glass of water containing the rice-water evacuations of a cholera patient, and suffered no major ill effects (Guthrie 1946). However, most historians of health development have related the development of modern public health to the advent of the basic medical sciences in the nineteenth century. The discovery of the microscope, animal cells and bacteria, chemicals and other substances, and other scientific knowledge and skills, including basic statistical and epidemiological methods, provided the basis for the scientific explanation of the causes of diseases and illnesses as well as their modes of transmission. The Industrial Revolution in the nineteenth century encouraged social interest in the prevention and control of diseases. With the increasing ability to identify the causal factors of disease, interest in the social, environmental, and political aspects of diseases and their prevention grew tremendously.
Owing to the scarcity of records, little is known of the health situation of the world in earlier centuries. However, the few records available from Asia, Europe, and the Middle East have made it possible to determine how diseases occurred and spread around the world, and what the early efforts to control them were. With the expansion of commerce, diseases spread from one area to other regions along the trade routes. For example, epidemics of smallpox and measles were reported in China between AD 37 and AD 653, due to importation from the northwest regions through migration. One Chinese record showed that around AD 640, bubonic plague was common in Kwangtung but rare in the inner provinces. The global pandemic of plague in the mid-fourteenth century, usually referred to as the ‘Black Death’, took the lives of 25 million people in Europe alone. Plague remained endemic in many countries and also spread both east and west, causing millions of deaths (McNeill 1977).
Colonial public health
Trade around the world during the eighteenth and nineteenth centuries, driven by the exploration and exploitation of natural resources, led to the discovery of new territories in different parts of the world. European and American powers engaged in intense rivalry with each other for colonial possessions. In order to expand their control, these colonial powers made massive shifts of people from one continent to another, using both military and economic force. Thousands of Africans and Asians were brought to the Americas during the eighteenth and nineteenth centuries to work on the plantations in the southern part of the present-day United States or at the railway construction sites in the western and northern parts of the country. Later, they were brought to the islands of the West Indies and to South and Central America, and made to work on large plantations as well as in mining industries. Similarly, large numbers of people from the Indian subcontinent were shipped to Africa and to other parts of Asia and the Pacific Islands. The colonial powers established their own administrative, legal, and medical care systems with varying degrees of autonomy and authority. The American government established a military medical corps in the nineteenth century to protect the American army, which had been expanding into new territories, as well as to protect American commercial establishments in Mexico and other Latin American countries. Similarly, the Dutch, Portuguese, British, French, and Spanish colonial rulers first established a series of hospitals and dispensaries amongst their army establishments and later in other commercial centres. The Indian Medical Service in British India and the Gold Coast Medical Department in Ghana are good examples (Harrison 1994; Mills 1998). Medical teams were brought in from the home countries or hired from other nations.
To protect the health of their own people and of the workers, colonial rulers established laws similar to those in their home countries. Specific public health legislation varied with each colonial power, but definite imprints still exist. For instance, the Public Health Acts, Local Government Act, Civil Registration Act, Factory Acts, Food Adulteration Act, Vaccination Act, and Contagious Diseases Acts remained in force for many decades. Some are still in place today in many countries in Asia, the Pacific, the Americas, and Africa where British, Spanish, French, American, or Dutch colonies existed. European countries adopted Bismarck’s model of national social health insurance, which later spread to other countries, especially in the Americas and Asia. The public health measures enforced under these laws and regulations made a considerable impact in these countries. In most countries, expatriates managed administrative and commercial activities. Some colonial powers introduced their social and cultural identity, mainly through religious groups and their educational systems. Most of the educational systems were designed to meet the administrative and commercial interests of the colonial powers. These systems also created a supply of administrative and clerical staff to assist in the management, administration, and commercial activities of the colonial rulers (Jaggi 1979b).
European and American religious missionaries also embarked on expeditions around the world along with the colonial powers. Many of them, having allopathic medical backgrounds, established ‘Western’ medical care institutions as well as general educational systems, including nursing and medical schools. These missionaries established medical clinics or dispensaries at first and, later, hospitals in the colonial countries. The introduction of allopathic and homoeopathic medicines by these missionaries resulted in the first exposure, and increasing access, of people in these countries to so-called ‘Western’ medical care. Clinical and practical training in the management of tropical diseases and in their prevention and control became major subjects in the training of medical professionals and public health workers who had to serve in tropical countries (Uragoda 1987; Harrison 1994). The late nineteenth century saw an increasing momentum in public health education, with the establishment of undergraduate and postgraduate courses designed specifically for public health, first in the home countries and later in the colonies. Pioneer public health schools were established in the colonial countries in the late nineteenth and early twentieth centuries, in order to function as centres for the development of public health policies and to train people who had to serve in the tropics. These schools not only provided academic teaching, but also conducted research in tropical diseases. Discoveries of causative organisms and of ways of stopping the transmission of malaria and sleeping sickness, through clinical and public health intervention research studies initiated by these schools, led to the application and adoption of preventive and curative measures.
Through the support of the Rockefeller Foundation, the London School of Tropical Medicine was transformed into the London School of Hygiene and Tropical Medicine in 1920, expanding the scope of research and teaching on tropical medicine, medical statistics, and epidemiology (Wilkinson and Power 1998). Spain also established its National School of Public Health in 1924 and introduced a public health component into its comprehensive rural medical care network.
Similar public health educational and research institutions, such as the Calcutta School of Tropical Medicine and Hygiene and the All-India Institute of Hygiene and Public Health, also in Calcutta, were established in British India in the early 1920s in order to carry out public health training and research in the region. The Haffkine Institute in Mumbai (Bombay), the King Institute of Preventive Medicine in Chennai (Madras), the Central Vaccine Research Institute in Kasauli, the National Institute of Communicable Diseases in Delhi (previously known as the Malaria Institute of India), the Indian Research Fund Association (later redesignated as the Indian Council of Medical Research), and the National Institute of Nutrition in Hyderabad were the exemplary research and teaching institutions established at that time (Jaggi 1979a). Similar educational and research institutions were established in the colonial and other independent countries, such as Thailand, the Philippines, Malaysia, Singapore, Hong Kong, Indonesia, Sri Lanka, Ghana, Nigeria, South Africa, Mexico, Brazil, and so on. These institutions of public health education worked closely with their counterparts in Western nations in order to strengthen the knowledge on disease causation, mainly with the support of the Rockefeller Foundation and colonial governments. These institutions also helped their own countries to improve the capacity of local public health administrators.
However, the actual development of public health and medical care services for the general public remained rudimentary in these former colonial countries and territories. Moving millions of people to totally unfamiliar areas had led to a high incidence of death and disability. These displaced people frequently died of smallpox, malaria, yellow fever, typhus, typhoid, and cholera, or were disabled by yaws, leprosy, and syphilis. Infectious diseases posed formidable obstacles to the colonization of new areas. The development of science and technology in the early twentieth century, especially in physics, microbiology, biochemistry, pharmacology, and diagnostics, led to an explosion in their application to public health practice. Radio and telephone also facilitated communication amongst people, and some newspapers and magazines had a global reach. The colonial powers launched a major international public health initiative in the prevention and control of smallpox through vaccination, first amongst the people working in the colonial administration and later amongst the workers they employed. Another notable experience was the massive community health development projects for the prevention and control of communicable diseases, mainly initiated with the support of the Rockefeller Foundation in a few Asian and Latin American countries. These were aimed at developing pilot disease control projects that could be replicated in other parts of the world (Foster and Anderson 1978).
Foundation of international public health
International public health efforts intensified in the early eighteenth century, when European nations applied protective legislative measures to prevent the importation of epidemic diseases by trading ships. It became obligatory for all incoming ships, prior to unloading passengers and cargo, to follow strict quarantine measures. Later, business interests in these countries clashed with governments’ concern for the health of their own populations. The First International Sanitary Conference, organized by 12 European nations in Paris in 1851, tried to work out solutions for the ‘Defense of Europe’. This was the first attempt to reach a consensus on drafting international quarantine regulations (Howard-Jones 1974a).
For the next 50 years, a series of similar international sanitary conferences were held, but failed to produce an international sanitary code. The delay in reaching international consensus was due partly to the lack of a sound scientific basis for the prevention and control of epidemics, and partly to the vested political and commercial interests of each colonial power. The Eleventh International Sanitary Conference, held in Paris in 1903, was a major milestone in international health, as it produced the first international sanitary convention for the prevention and control of three epidemic diseases: plague, cholera, and yellow fever. Based on this convention, the French Government hosted the first international health office, L’Office International d’Hygiène Publique (OIHP), in Paris in 1907. At its inception, the main objective of OIHP was to protect Europe against the three notifiable diseases (Howard-Jones 1974b). Ultimately, in 1911, the tasks of OIHP were expanded, making it the first truly international health agency, monitoring and reporting outbreaks of the three notifiable diseases and providing general public health information on measures taken to combat them through a monthly bulletin (McNeill 1977).
Around the time of the establishment of the League of Nations, major epidemics, including the great influenza pandemic of 1918, were rampant in various parts of the world, and some infectious diseases, such as cholera and plague, were threatening to become pandemics. The League had to cope with many other postwar rehabilitation problems, and the Paris-based OIHP, even within its originally assigned tasks, was unable to deal with such pandemics. Based on the proposal of the Brazilian delegation, the League of Nations agreed, in 1920, to the establishment of an international health organization. Finally, after intensive negotiations between the League, the colonial rulers, and other countries, the League of Nations Health Organization was formed in 1923 (Howard-Jones 1977). The League of Nations Health Organization was originally assigned to handle international health matters relating to both technical assistance and clearing-house functions. Its epidemiological information service was strengthened through regional bureaux in Washington, Alexandria, Singapore, and Sydney, in addition to the service provided by OIHP. A series of basic clinical and field research studies on medicine and public health were also undertaken, by organizing committees or commissions of leading public health experts in a wide range of subjects, such as malaria, tuberculosis, leprosy, maternal and child health, health systems, and medical education. In addition to its research promotion function, the League of Nations Health Organization provided technical advice as well as technical assistance to countries and promoted international medical education, including postgraduate education in public health. It also organized international health conferences, conventions, and study tours (WHO 1967).
As early as the 1930s, health administrators had expressed their concerns about the health status of mass populations, especially of those living in rural areas. The international health conferences organized by the League of Nations Health Organization in the early 1930s provided a forum for sharing experiences of public health development in the countries under colonial rule, especially those in Asia and Africa. The Intergovernmental Conference of Far-Eastern Countries on Rural Hygiene, held in 1937 at Bandoeng (Bandung), in the Netherlands East Indies (Indonesia), was a cornerstone of public health and rural health development in Asia (League of Nations Health Organization 1937). The Conference, while noting how rampant communicable diseases and nutritional deficiency disorders were in rural areas, studied the public health interventions of the participating countries. It also defined the central role of health in development and emphasized the need for integrating health care with intersectoral action, an emphasis that remains current today. The countries recognized the heavy socio-economic costs of diseases. They also recognized that approaches such as bringing maternal and child health care, hospitals, and health centres nearer to the people could prevent death and disability. However, the onset of the devastating Second World War delayed effective follow-up of the Bandung Conference principles. Many developing countries became battlefields, experiencing destruction, destitution, and disease, as well as human misery and suffering, with a very heavy death toll. In addition, there were a series of epidemics of smallpox, cholera, typhus, and malaria. Large displacements of people and the existence of very little or no public health infrastructure or public utility distribution systems resulted in more epidemics. The situation was further accentuated by famines, which took many lives.
Establishment of the international health institution
The spirit of international solidarity, peace, security, and tranquillity immediately after the Second World War led to the creation of intergovernmental organizations such as the United Nations and its specialized agencies. The original draft of the United Nations Charter did not include health. The Brazilian and Chinese delegations, however, submitted a joint declaration to the United Nations to include health in its Charter. They also called for an international conference to foster consensus on establishing an international health organization and to bring this organization under the aegis of the United Nations Economic and Social Council. With the unanimous approval of the first United Nations General Assembly, a landmark international health conference was held in New York in June and July 1946. At this conference, a total of 61 nations, many of which were still under colonial rule, approved the Constitution of the WHO on 22 July 1946. This initiated the establishment of the WHO as a specialized agency of the United Nations. After ratification by the member governments, the WHO Constitution came into force on 7 April 1948, and the WHO officially came into being. The attainment by all people of the highest possible level of health was its constitutional mandate. The WHO’s main functional roles are directing and co-ordinating international health work and providing advice and advocacy on international health development. The WHO is also given authority to adopt international regulations, to set international standards for biological and pharmaceutical agents as well as other diagnostic procedures and products, and to adopt international conventions and agreements (WHO 1992). Since then, the membership of the WHO has grown to over 190 countries and territories. The WHO has worked in great harmony for over 50 years, through its six regional organizations and its headquarters, as a single international health organization.
Science-oriented public health
Development of basic health services
Former colonial countries saw the end of the Second World War as the beginning of the end of colonial rule. They all hoped for national development and believed that the period would bring peace and relief from suffering and shortages through liberation from colonialism. There were strong nationalist movements and political agitations in all countries, preventing the reimposition of colonial rule. Within a few years, many countries achieved independence. They all started reconstruction work for immediate economic growth and social development, to catch up with the technological advances of the colonial powers.
A few countries in Asia, the Pacific, and Africa entered the post-Second World War period in a relatively calm and favourable economic position for reconstruction and rehabilitation. A few others, however, were challenged by their own internal ethnic conflicts. Many developed countries demonstrated special consideration for the welfare and economic development of their former colonies. During the first few decades after independence, developed countries assisted newly independent developing countries, especially those devastated by the war, through multilateral and bilateral aid programmes in order to support reconstruction and rehabilitation. The United Nations General Assembly launched a programme of international economic co-operation in 1961. This programme, known as ‘the United Nations Development Decade’, was aimed at promoting self-sustaining growth and social advancement. While countries aimed for sustainable development, the tensions and turmoil of another war, the Cold War, gripped the world. The separation of the Communist bloc (Eastern Europe) from the capitalist states (Western Europe and the United States) created an environment marked by political and social tensions as well as confrontations and conflicts.
These early days of the reconstruction period were termed the age of contradiction and opportunity. It was a time of increasing affluence in the developed countries, in stark contrast to the relentless march of poverty amongst the less fortunate majority in the rest of the world. Yet it was also an age of opportunity, which saw remarkable scientific and technological advances that opened up limitless vistas and unlimited possibilities for solving the age-old problems of poverty and disease (Gunaratne 1977). The various inventions and innovations during and after the Second World War provided tremendous impetus for the application of science and technology. These included the jet aircraft, microwave instruments, radar, and other telecommunication facilities, including satellites. The discovery and mass production of quinine, dichlorodiphenyltrichloroethane (DDT), penicillin, and the sulphonamides, the development of newer and more effective vaccines and other drugs to prevent and control communicable diseases, the introduction of birth-control pills and injectables, the introduction and use of computers, and improvements in imaging technologies (X-ray and CT scanning) facilitated advanced applications in public health practice. Advances in microbiology and immunology contributed greatly to the development of vaccines and diagnostic technologies. An outstanding achievement in the field of food and nutrition was the virtual disappearance of large-scale famines from many developing countries. The timely intervention of the Green Revolution in the 1960s, producing high-yielding varieties of grain with higher standards of farming techniques and good seeds, enhanced agricultural output, promoted self-sufficiency, and increased exports.
After gaining independence, many countries adopted ambitious plans for socio-economic development, including health. Health-care facilities, however, were almost non-existent in the post-Second World War period. There were few health-care professionals, and most were expatriates. Many countries thus initiated reviews of their national health situations and formulated long-term development plans. In India, the Bhore Committee was established in 1945 to review the health situation and to recommend improvements in the Indian health system. In Myanmar, the Sorrenta Villa Plan of 1947 and the Pyidawtha Plan of 1950 were drawn up to achieve rapid socio-economic growth, including the expansion of health and education, immediately after the war. Similar socio-economic plans were initiated by other developing countries.
Regional co-operation for socio-economic and cultural development was also sought in order to increase intercountry collaboration. For example, the Colombo Plan for Cooperative Economic and Social Development in Asia and the Pacific was conceived at the Commonwealth Conference, held in Sri Lanka, in January 1950. The Colombo Plan proved to be a valuable source of technical and financial assistance to the participating countries, in the area of economic and social development and national capacity building, including health. During the same period, different bodies for regional social and economic co-operation were established in quick succession, such as the Council for Mutual Economic Assistance, the Commonwealth Association, and the Common Markets for Central America and Caribbean. Similarly, in the 1980s, the Association of Southeast Asian Nations, the South Asian Association for Regional Co-operation, and other regional political, social, and economic co-operation organizations were established. These regional political and economic groupings were organized with the aim of having common markets and co-operation in socio-economic and cultural areas amongst neighbouring developing countries, sometimes in alliance with developed nations. In order to support economic development activities, regional development banks were also established. The developed countries got together and formed the Organization for Economic Co-operation and Development in 1961. Many developed countries also established their own development agencies, such as the Australian Agency for International Development, the Danish International Development Agency, the German Agency for Technical Co-operation, the Japan International Co-operation Agency, the Norwegian Agency for International Development, the Overseas Development Administration of the United Kingdom (later renamed as the Department for International Development), and the United States Agency for International Development.
Immediately after the Second World War—with advice and support from the WHO, the United Nations International Children’s Emergency Fund (UNICEF), other United Nations agencies, and multilateral and other bilateral donors—developing countries started building up health systems infrastructures based on a network of hospitals and health centres. Minimally trained basic health workers ran these centres, especially in the rural areas. Expansion of basic health services was made through national public health projects on maternal and child health, school health, environmental sanitation, nutrition, and so on. During the early 1950s most countries adopted the Beveridge model of national health and social welfare policy and they initiated ‘free’ health services for all. Health-care facilities such as hospitals, health centres, or dispensaries, managed by medical doctors, were very few and were mainly concentrated in towns and cities. These facilities essentially were an expansion of the institutions already established during the colonial period.
Training of different categories of health auxiliaries, such as health assistants, medical assistants, health visitors, nurses, midwives, vaccinators, sanitary workers, community educators, laboratory technicians, pharmacists, and compounders, was initiated by the establishment of paramedical training institutes. These workers were deployed to serve at the various health institutions, especially those established in the rural areas. A number of rural health development and demonstration centres were also established in many countries. The Kalutara rural health project in Sri Lanka, the pilot project of the Aung San demonstration rural health unit in Myanmar, and the Singur rural health project in India were a few of them. The development of human resources for health, especially by creating medical, nursing, and other paramedical schools, was more intense between the 1950s and 1970s. Most countries did not have adequate personnel with appropriate professional training. Myanmar, Malaysia, and Sri Lanka even had to arrange for medical doctors from abroad to serve in their hospitals and educational institutions. Training institutions and related field training centres were established later to meet the local demand.
Exactly 20 years after the Bandung Conference in 1937, another international rural health conference was organized in New Delhi, India, in October 1957, this time under the auspices of the WHO. This conference reviewed and analysed a wide range of subjects: the concepts and functioning of rural health services, the training and use of multipurpose village workers, the enhancement of prevention and control of epidemic and endemic diseases, the utilization of local resources and promoting intersectoral action, and the participation of local people, including formation of village health committees. The conference recognized that the rural health centres were the basic health units where comprehensive health care could be provided to the rural population, and that they should be strengthened (WHO 1957).
Maternal and child health services in developing countries were very rudimentary during the early 1950s and the 1960s. Only a few countries had an administrative authority for maternal and child health matters at the central government level. Maternal and child health services were mainly provided through health clinics and centres staffed by briefly trained midwives or auxiliary nurse–midwives. Maternal and infant mortality remained higher in some countries than in others where women enjoyed higher social status and better access to health care and other essential services. With advice and support from the WHO and UNICEF, developing countries started establishing separate maternal and child welfare departments in the early 1950s and the 1960s. With financial and technical inputs from the United Nations and other partners, the number of maternal and child health centres expanded rapidly. However, experience within a few decades showed that the vertical approach of opening maternal and child health centres and deploying maternal and child health workers alone did not serve the purpose of improving accessibility. Countries recognized the importance of providing comprehensive basic health care while focusing on the problems of mothers and children. Excessive pregnancies, inappropriate timing and spacing of pregnancies, poor health and nutritional status of the mother, inadequate care during pregnancy and childbirth, and poor educational levels of mothers were identified as the main factors responsible for most maternal and infant mortality, as well as serious morbidity amongst women and children. The United Nations International Women’s Decade (1976–1985) helped to increase awareness of these problems. The tragedy was that maternal health had received far less attention than child health (Rosenfield and Maine 1985).
Figure 1 shows the declining trends in infant mortality rates in developing and developed countries over the last 20 years. The persistence of relatively high levels of infant and maternal mortality and morbidity, together with rapid population growth during the last three decades, has added a heavy burden to improving the social and economic status of the developing countries. More food, more schools and health centres, and more funds from the government were needed to cope with the burden.
Fig. 1 Trends of infant mortality rates for selected countries and country groups, 1980 to 2000.
In the 1960s, most countries started adopting a comprehensive population policy in which family planning was the main strategy. Even though the technology was available, only 9 per cent of women in the developing countries had access to contraceptive services in 1965. Concerted efforts by many governments, with the full support of international multilateral and bilateral agencies, including voluntary organizations, resulted in contraceptive use increasing to 50 per cent by 1990. Nonetheless, wide geographical variations still persisted. The proportion of couples using some form of contraception was approximately 75 per cent in China and East Asia, just over 50 per cent in Latin America, about 30 per cent in South Asia, and less than 15 per cent in Africa (UNICEF 1991). A majority of women in most developing countries were aware of the health risks posed by frequent pregnancies, and thus of the importance of birth spacing, but this awareness was not easily translated into action.
The Green Revolution, which began in the 1960s, had a positive impact on the supply of food and nutrition. However, it also created some distortions in the availability of food. Initially the Green Revolution focused less on pulses and vegetables, which were important sources of protein and micronutrients in Asian and African diets, and completely neglected the development of horticulture. Thus, with seasonal shortages of vegetables, vitamin A deficiency was widespread in Asia (Gopalan 1992). Despite this drawback, the increase in food production globally resulted in substantial reductions in the proportion of people with inadequate access to food, from 35 per cent in the 1970s to 21 per cent in the 1990s. The average daily per capita dietary energy supply increased from less than 9.7 MJ in 1960 to 11.4 MJ in 1990 (WHO 1998a). However, hunger and malnutrition continued to be the most devastating problems in the developing countries. Nearly 30 per cent of people in the world’s poorest nations are currently suffering from one or more of the multiple forms of malnutrition (WHO 1999b). The nutritional status of an individual depends not only on food production and the availability of food but also on food consumption. Protein-energy malnutrition, the major nutritional deficiency disease of developing countries, is due to inadequate energy intake, leading to wasting and stunting. In the 1960s, the prevailing view on protein-energy malnutrition was that it was primarily a problem of protein deficiency and that the solution lay in the distribution of protein-rich foods and protein concentrates. Research in developing countries during this period showed that protein-energy malnutrition was mainly due to calorie deficiency. It took another 10 years for this concept to be accepted by Western institutions (WHO 1986).
The prevalence of underweight amongst children under 5 years of age in developing countries declined from 46 per cent in 1975 to 31 per cent in 1995, but progress was not uniform. Estimates in 1995 indicated that 206 million children in developing countries were stunted and 49 million wasted. The continuing burden of malnutrition is rooted in poverty, underdevelopment, and inequality. However, regional trends suggest that there may be additional reasons behind the persistence of protein-energy malnutrition. The risk of being malnourished, as measured by weight, is 1.2 times higher in Asia than in Africa, and three times higher in Africa than in Latin America (WHO 1998b). Over two-thirds (72 per cent) of an estimated 206 million malnourished children, as measured in terms of height for age (stunted), live in Asia (especially southern Asia), while 25.6 per cent are found in Africa and only 2.3 per cent in Latin America (WHO 1999b).
The essential food factors, ‘vitamins’, identified by Frederick Hopkins in 1906, led the way towards nutrition research and promotion. Deficiency diseases in micronutrients, such as vitamins and minerals, had been well-known public health problems since those days and these diseases were more widespread than protein-energy malnutrition. Around 1999, an estimated 5 billion people had iron deficiency. The prevalence of anaemia was highest (over 50 per cent) in pregnant women and preschool children in developing countries. Iodine-deficiency disorders and iron-deficiency anaemia have profound effects on human health and development, including increased maternal and neonatal mortality, impaired health and development of infants and young children, limited learning capacity, impaired immune function, and reduced productive capacity. Apart from inadequate dietary intake of iron, poor bioavailability of iron from cereal-based diets and high worm infestations were mainly responsible for iron-deficiency anaemia.
Estimates in 1995 indicated that iodine-deficiency disorders were significant public health problems in 118 countries, with approximately 43 million people affected by some degree of brain damage. Fifty per cent of these handicapped people lived in South and Southeast Asia. Universal iodization of salt, advocated by the WHO and UNICEF as the main eradication strategy for the control of iodine-deficiency disorders, has been successful in most countries. For example, Bhutan, a land-locked Himalayan kingdom, witnessed a remarkable drop in the prevalence of iodine-deficiency disorders from 65 to 14 per cent within a decade, using a multisectoral approach which included adequate distribution of iodized salt, monitoring the iodine content at various points of distribution and at the consumers’ homes, and promoting social mobilization (WHO 1999a).
Hundreds of thousands of children are going blind every year as a result of vitamin A deficiency. WHO estimates in 1999 showed that around 250 million children were affected by vitamin A deficiency globally. After extensive field clinical trials in developing countries, vitamin A supplementation programmes were introduced on top of the programmes for the promotion of breast feeding and dietary improvement, with the support of United Nations and bilateral agencies. Many countries adopted the vitamin A deficiency supplementation programme as part of their national immunization programmes, though questions were raised about its technical soundness. Although there was a 30 to 60 per cent reduction from the 1985 level by 1995, vitamin A deficiency remains a public health problem in many countries (WHO 1999a).
Simple nutrition education of mothers, protection and promotion of breast feeding, dietary improvement, supplementation, and fortification, good nutrition surveillance, making vegetables accessible at affordable prices, and early diagnosis and treatment of childhood illnesses, such as measles, acute respiratory infections, and diarrhoea, were the combination of interventions identified to reduce nutrition deficiency disorders. Improving food availability and providing adequate quantities of safe and nutritious food were the long-term goals. Recognizing the magnitude of the problem, the International Conference on Nutrition, organized jointly by the WHO and the Food and Agriculture Organization in Rome in 1992, pledged to reduce severe and moderate malnutrition amongst children under 5 years of age to half of the 1990 level by the year 2000. By mid-1999, a total of 152 countries and five territories (representing 91 per cent of the WHO’s member states) had finalized or drafted their national plans of action for nutrition. Most of them continued the process of development, implementation, and monitoring of intensified strategies for the promotion of nutrition. There was a decline in the global prevalence of protein-energy malnutrition from 37.4 per cent in 1980 to 26.7 per cent in 1999 (WHO 1999b).
Promotion of environmental sanitation was given the same priority as the development of basic health services in the early days of public health development in developing countries. Developed countries had shown from their experience in the eighteenth and nineteenth centuries that improvements in personal hygiene and environmental sanitation could prevent and control the spread of communicable diseases, especially water-borne and vector-borne diseases. Environmental sanitation work started with the deployment of public health officers, sanitary workers, inspectors, and engineers to promote the provision of a safe water supply and adequate sanitation facilities, particularly in rural areas. Insufficient financial resources, inappropriate application of technology, and lack of community involvement, however, hindered progress. Municipal and local authorities could not provide an adequate water supply or simple and proper sanitary facilities in urban or rural areas. A WHO survey, carried out in 1970 in 91 developing countries, showed that only 29 per cent of the total population had access to safe drinking water and only 50 per cent of the urban population had access to a safe water supply. Access to adequate sanitation facilities was even worse, particularly in the rural areas (WHO 1998b).
Disease prevention and control campaigns
While attempts were made up until the 1970s to consolidate the patchy public health services and expand the available basic health facilities, the application of scientific and technological advances to the prevention and control of communicable diseases and other public health problems began. This led to the introduction of special control campaigns in the areas of maternal and child health, nutrition, school health, environmental sanitation, occupational health, and infectious diseases. The launching of national projects for the control of major infectious diseases proceeded in parallel with the development and expansion of basic health services. Disease control campaigns were initiated on the basis of new scientific and technological advances, a better understanding of disease aetiology, and the availability of appropriate tools for successful intervention. Developed nations also helped developing countries to contain infectious diseases at their source rather than through quarantine restrictions.
The WHO, UNICEF, and other United Nations agencies, together with international non-governmental organizations, such as the Rockefeller Foundation and the Save the Children Fund, promoted global disease control campaigns against major epidemic diseases. Demonstration projects were established in different corners of the world to identify appropriate disease prevention and control strategies. International experts were recruited and assigned to these projects. National experts were also trained through fellowships and on-the-job training. Many educational and research institutions were established to cope with emerging needs. The following sections highlight a few successes and failures in the prevention and control programmes against certain tropical diseases.
Yaws

The control of yaws (framboesia tropica) is amongst the great success stories of public health. The discovery that this painful, disabling disease could be controlled by arsenic and bismuth compounds accelerated efforts to control yaws before the Second World War. Mass campaigns were carried out in different parts of the world. Nevertheless, owing to the inaccessibility of such treatment and inadequate epidemiological knowledge, the prevalence of yaws remained high in most countries in Africa, Asia, the Pacific, and Latin America until the 1950s. The WHO and UNICEF initiated a global control project against yaws as early as 1948. Mass treatment campaigns using long-acting penicillin and the increasing access of the local populace to Western medical facilities transformed the situation. In the early 1950s, there were an estimated 20 million cases of yaws worldwide, half of them in Asia. Using mass treatment with penicillin as a weapon for cure, the global programme had eliminated yaws as a major public health problem by the early 1970s. Only a few scattered foci of infection persisted in Latin America, the Pacific, and South and Southeast Asia.
Malaria

The spectacular success of the global yaws control programme provided a boost to other disease control campaigns. There were 110 to 115 million cases of malaria, with about a million deaths, in Southeast Asian countries alone during the 1950s. It was estimated that nearly 40 per cent of the people in that region were at risk of contracting malaria annually, and that malaria accounted for more than 6 per cent of deaths from all causes. Economic losses due to malaria were also considerable. Most governments invested large amounts of funds and human resources in its prevention and control. The availability of quinine and the newer synthetic aminoquinolines, as well as a better understanding of malaria transmission and vector biology at the local level, helped in improving prevention and control strategies. Before DDT (a long-lasting insecticide) became available, a variety of preventive and curative measures had been tried. Bioenvironmental measures against vectors were not effective in many countries owing to the lack of adequate investment and supervision. At the same time, access to antimalarial drugs was also limited because of insufficient health-care facilities.
During the early 1950s, the WHO demonstrated in various pilot projects that residual spraying of human dwellings with DDT insecticide and the effective treatment of malaria cases with 4-aminoquinolines could interrupt the transmission of malaria effectively and efficiently. Encouraged by successful pilot projects, assured of massive support from international and bilateral agencies, and as recommended by the WHO, many governments launched large-scale national malaria control campaigns that expanded progressively in scope and coverage. National malaria institutes were also established in order to provide technical direction, research development, and training. Some countries adopted large-scale national malaria eradication campaigns using insecticide-based control strategy as the main approach.
Initially, national DDT-based malaria control or eradication programmes in most countries were a dramatic success. The reduction in the number of malaria cases was spectacular. In the Southeast Asian countries, for example, malaria cases declined from over 100 million in 1950 to a low of 230 000 in 1965, and the incidence of malaria decreased from 40 000 per 100 000 population to less than 5 per 100 000. Some countries benefited from the transfer of technology for the domestic production of DDT. Later, the United Nations Expanded Programme of Technical Assistance (the precursor of the present-day United Nations Development Programme) joined the global effort for malaria eradication by creating a Malaria Special Account. Despite these efforts, the erratic supply of DDT and inadequate planning and supervision of spraying operations led to increasing resistance amongst malaria vectors. The substantial amounts of DDT and spraying equipment required for large-scale operations were beyond the means of developing countries, as were the personnel and transport requirements. The technical problems of insecticide-resistant mosquitoes, and later of drug resistance amongst malaria parasites, also increased. One beneficial side-effect of DDT spraying, however, was its impact on a chronic systemic disease, visceral leishmaniasis (kala-azar), which was highly endemic in Central and South Asia, Latin America, and Africa. By the early 1960s, kala-azar had completely disappeared in many areas (WHO 1992).
Leprosy

Leprosy, one of the oldest and most widely prevalent diseases, was well recognized in the developing countries because of its chronic symptoms and signs. It affected individuals at their most productive period and imposed a significant social and economic burden on society. In the absence of effective control methods, people with leprosy were isolated from others. The discovery of the sulphone drug dapsone (diamino-diphenyl-sulphone or DDS) in 1943, and its availability for leprosy treatment, provided a welcome boost to leprosy control. With the active collaboration of highly endemic countries and with full support from international non-governmental organizations, the WHO and UNICEF jointly initiated the global leprosy control programme in the early 1950s. The main strategies adopted were early case detection through mass screening and treatment with dapsone, together with health education. Millions of leprosy cases were detected and given long-term dapsone therapy. Many leprosy patients were relieved of their symptoms and discharged from the register after long years of treatment with dapsone. For many leprosy patients, the 1960s and subsequent decades brought hope and promise (WHO 1992).
Sexually transmitted diseases
In the early days, venereal diseases or sexually transmitted diseases, especially syphilis (lues), were regarded as diseases of Europeans. Historic literature in Asia (India, China, and Japan) referred to syphilis as Farangi Roga (Sanskrit for ‘foreigners’ disease’); fifteenth-century Indian literature refers to the disease as the ‘Portuguese disease’. The limited information available shows that sexually transmitted diseases had been rampant amongst the populace in Asia since those early periods. Treatment in those days was with mercury compounds for a prolonged period, which led to the saying, ‘Two hours with Venus and two years with Mercury’. Community intervention studies for venereal disease control with penicillin in Europe showed good results in the 1940s. The use of long-acting penicillin from 1948 as the main treatment strategy changed the outlook for the control of syphilis and other endemic treponematoses, including yaws (WHO 1992). A few pilot venereal disease control projects were initiated in developing countries in the 1950s to identify appropriate community intervention strategies. These included early case detection, effective contact tracing, early treatment, continuous surveillance, and health education. With the success of these demonstration projects, many developing countries started national venereal disease/sexually transmitted disease control projects. However, the effectiveness of these projects was short-lived. The availability of effective treatment conveyed a false sense of public health security, which ignored the increasing prevalence of prostitution, promiscuity, and homosexuality. Rapid urbanization, industrialization, and the migration of labourers further contributed to the spread of syphilis and other sexually transmitted diseases.
Cholera

Cholera, an acute bacterial disease, is one of the notifiable diseases under the International Health Regulations because of its severity and ease of spread. Public health experience during the eighteenth and nineteenth centuries showed that the provision of adequate sanitation and safe drinking water, adequate personal and food hygiene, and appropriate quarantine measures could contain cholera effectively. The use of new therapeutic methods and newer antibiotics, supported by adequate rehydration therapy and early case detection, demonstrated that many deaths from cholera could be averted. The development of a cholera vaccine in the 1950s also helped to arrest the further spread of cholera. A new cholera biotype, El Tor, appeared in 1961 in Indonesia and invaded the whole of Asia, creating waves of epidemics. The El Tor epidemic took thousands of lives and affected millions of people for more than two decades. By the 1990s, it had spread to other regions and caused major public health crises in Africa and the Americas. Although the development of a newer cholera vaccine is in progress, a new bacterial strain, Vibrio cholerae O139, has been identified in India and other parts of Asia and has spread slowly to other parts of the world (WHO 1978a).
Public health successes
The greatest success achieved by developing countries in the twentieth century was the prevention and control, as well as the ultimate eradication, of smallpox, a dreadful communicable disease that had existed since antiquity. As a public health preventive measure, the inoculation of pus taken from smallpox cases into healthy people had been practised in Asia since ancient times. This method of variolation spread to Europe and other parts of the world in the early eighteenth century. It was then simplified and widely used for the prevention and control of smallpox. In 1796 Edward Jenner introduced a modified technique of variolation using cowpox material. The scientific community in Europe slowly accepted the results of Jenner’s experiments. Later, mass inoculation using cowpox material (called vaccination) was introduced extensively, first in Britain and then in all of Europe and other parts of the colonial world. The vaccine material, dried on thread, glass, or ivory points, was despatched to all parts of the world. The wide acceptance of mass vaccination meant that smallpox had ceased to be a major threat in most countries of Europe and the Americas by the early twentieth century (Henderson 1997). Although the colonial rulers introduced vaccination against smallpox, vaccination coverage of the population in their colonies was inadequate. Control could not be achieved on account of the variable purity and potency of the vaccine as well as poor vaccination techniques. The non-availability of large quantities of safe smallpox vaccine and the absence of appropriate methods of preserving it in hot climates were major impediments (Kiat 1978). Thus, even a century after the discovery of smallpox vaccination, the disease continued to rage throughout the developing countries.
Early in the twentieth century, the French, later followed by the Dutch, produced large quantities of freeze-dried smallpox vaccine, and these were supplied annually to their own colonies in Africa and Asia. The Lister Institute in London further improved the technology of freeze-dried vaccine production in the early 1950s. Since then, large-scale commercial production of a stable freeze-dried smallpox vaccine has spread to other developed countries and later to the newly independent developing countries. Early in the 1950s, the WHO advocated a worldwide smallpox vaccination campaign, aimed at the eradication of the disease. However, many countries were sceptical about the global campaign, mainly because of inadequate supply of vaccines and inaccessibility of health services to a large proportion of the population. After reviewing the global situation, including technical and logistic feasibility, the WHO recommended launching the Eradication of Smallpox Programme at the World Health Assembly in 1958 through resolution WHA11.54 (WHO 1964). This time, all members pledged to fight against smallpox with the guidance and supervision of the WHO. Although the tension between the two superpowers—the United States and the then USSR—was at its height around 1960, they, together with other developed countries, agreed to work with the WHO to fight smallpox (Henderson 1998).
Some countries had already achieved smallpox-free status before 1960 through routine and extensive vaccination backed by legislation and mass campaigns. Initially, the strategy for control was to increase routine smallpox vaccination coverage through mass campaigns. Based on the recommendations of the WHO Expert Committee on Smallpox, many developing countries resolved to eradicate smallpox with intensified mass vaccination as the main strategy (WHO 1964). Beginning in 1960, mass smallpox vaccination campaigns were launched in all endemic countries. The WHO, with the support of developed countries, helped them by extending technical assistance and logistic support. In Myanmar, a 3-year cycle of primary vaccination, with mass vaccination of one-third of the country in each round, was used to cover the entire population without heavy investment. The introduction of the jet injector and bifurcated needles for mass vaccination had a great impact on expanding coverage. Indonesia succeeded in eradicating smallpox using the mass vaccination strategy. Its experience of having a special programme of village-to-village searches for cases, using smallpox recognition cards and providing rewards for reporting and educating the people, made a significant impact on smallpox eradication efforts in Asia as well as globally. Other countries adopted similar approaches in order to intensify their efforts (WHO 1978a). The experience of the smallpox eradication campaigns in Western and Central Africa in the mid-1960s showed that smallpox epidemics could be successfully contained through a surveillance–containment strategy (active case detection and mass vaccination around cases), even when the incidence was high and relatively few people were vaccinated (Foege et al. 1971; Fenner et al. 1988).
India launched a massive public health campaign called Operation Smallpox Zero in the early 1970s. The last case of smallpox in India occurred in May 1975 (Basu et al. 1979). Bangladesh, Bhutan, and Nepal also launched similar ‘zero-transmission-targeted’ eradication campaigns and, after 1975, reported no cases. The last case of smallpox in Asia was reported in Bangladesh in October 1975. The last cases of smallpox in Africa were reported in the late 1970s, mainly in the eastern parts of the continent. The last naturally acquired case of smallpox in the world was reported in Somalia in October 1977. The Global Commission for the Certification of Smallpox Eradication gave its final report to the 33rd World Health Assembly in May 1980. Based on the findings of the Commission, the World Health Assembly declared the world free from natural transmission of smallpox. This was the most spectacular public health achievement of the twentieth century (WHO 1992).
Gaps between the ‘haves’ and the ‘have-nots’
Great efforts made by governments and international organizations had helped in reducing the burden of many epidemic and endemic diseases, and strengthened health-care delivery systems. Many diseases ceased to be major public health problems globally. However, the joint WHO/UNICEF worldwide empirical study in 1975 (Djukanovic and Mach 1975) showed that approximately two-thirds of the population in the developing countries did not have reasonable access to any permanent form of health care. Health-care facilities were mostly concentrated in urban areas. Poor organization and management of the existing health facilities further compounded the situation. Since educational institutions for health workers were not properly developed, hospitals and health centres had to rely on poorly qualified people or foreign health workers. Professionals graduating from local educational institutes were not eager to work in rural areas. These professionals sometimes opposed new types of paramedical health workers on the grounds that providing medical care was too important and too complex to be left in the hands of less trained or differently trained personnel, and that it would be dangerous to do so. Inadequacy and maldistribution of resources for health services was another obstacle. Government investment in health care remained low for decades, compared with other sectors. Loans or grants from foreign sources had been used mainly for building specialized or large general hospitals and procuring sophisticated equipment. In general, there was not much expansion of public health facilities such as small rural hospitals, maternity homes, or rural health centres. Compounding these problems was the duplication of efforts by health workers functioning under many vertical disease control campaigns. There was weak development of the concept of a comprehensive health system.
The central health administration had taken over the major authority and executive responsibilities, thus preventing effective and adequate delivery of services at the periphery. The integration of specialized disease control programmes into general health services moved slowly; most of them remained as autonomous bodies after more than three to four decades of operation. There was little co-ordination in planning and management between the health and health-related sectors, as well as between various sections of the health sector itself. Much of the health planning was done at the central level without closer involvement of the people responsible for implementation (WHO 1978b).
In most countries, a high proportion of people, especially those in rural areas, had no access to even minimum essential health care. Many were also prone to diseases due to a hostile environment, poverty, and the lack of knowledge of preventive measures. There were glaring contrasts in health status between developed and developing countries, as well as within developing countries. According to 1971 data, life expectancy at birth was 43 years in Africa and 50 years in Asia, while it was 71 years in Europe and North America. Infant and maternal mortality rates were declining steadily in many developing countries, but overall rates remained high because a few countries showed only a slow decline. Up to the mid-1980s, the situation remained unchanged. The average life expectancy at birth in the developing countries was around 55 years, and in Africa and some parts of Asia it was only 50 years.
Infant mortality in most developing countries was 10 to 15 times higher than in the developed world. Most deaths in the developing countries resulted from infectious and parasitic diseases. The high occurrence of such diseases was closely related to specific socio-economic and environmental health conditions, and impeded overall socio-economic development. The proportion of gross national product spent on health ranged from 1 to 6 per cent in many developing countries, as compared to more than 10 per cent in the developed world. Another reason for these disparities was the decision to continue the colonial systems of health care. Since many developing countries had inherited health-care institutions from their colonial rulers, their first attempt at health development after becoming independent was to continue running the existing facilities and to build similar health infrastructures, mainly in towns and cities.
On a different note, the use and promotion of traditional medicine and medical practices was almost abandoned in most developing countries during and after the colonial days, even though they had served as alternative forms of health care for thousands of years. These alternative health-care systems included Ayurveda, Siddha, Unani-Tibbi, Chinese, Tibetan, and others. Herbalists, bonesetters, and spiritualists also practised non-formalized systems of traditional medicine. Most traditional medical practitioners had been trained through informal systems of education. In addition, home remedies, yoga, nature cure, and homoeopathy were also used in many countries, and there were millions of such practitioners all over the world. Soon after achieving independence, many developing countries started promoting traditional medicine through new legislation. They established formal educational institutions and accorded recognition to the services of traditional medicine practitioners. A few countries set up national control bodies and central departments under their health ministries, or under separate ministries, to promote the development of traditional medicine. Some countries promoted research and development on traditional medicine.
During the late 1970s, at the time of intensification of primary health care, many countries recognized the importance of traditional medicine. They began to develop it in conjunction with ‘modern’ medicine. National programmes for the development of traditional medicine were initiated as a complementary strategy for promoting primary health care. Some countries even attempted integrated health-care delivery so that the basic health staff could provide both traditional and allopathic systems at the same facility. People still saw traditional medicine as an alternative to modern health care for individuals rather than as a public health measure. Newer findings on the use of various types of traditional medicine and traditional practices led to the understanding that many of these could be used as public health interventions. These traditional medicines and practices were related to good personal hygiene, healthy behaviour, and proper nutrition for the protection and prolongation of life. Much more research needs to be undertaken for establishing the effectiveness of traditional medicine and practices. Furthermore, there is a need for continued research on the application of traditional medicine in the promotive and preventive aspects of health, especially in view of the expansion in public health interventions.
Political public health
Socialized public health
The 1970s and 1980s were regarded globally as economically and politically unstable decades. There were armed conflicts within and between countries and ethnic groups in various parts of the world. Of these, the war in the Middle East in 1973 had serious consequences on the world economy due to the oil crisis. Developing countries continued to face traditional health problems, such as high morbidity and mortality, as a result of maternal and childhood diseases, infectious diseases, and malnutrition and, at the same time, they were not able to cope with the rapidly increasing population. There was too much focus on technological advancements and the use of technical interventions. Dominance of the biomedical science approach, without adequate focus on the community, led to the failure of many mass disease control campaigns.
By 1970, developing countries and the international community at large had become increasingly aware that, despite 20 years of large foreign investments and top-down development efforts, the socio-economic status of the people, including their health status, had not risen to the desired level. Many countries, especially those that had achieved independence in the late 1960s and the 1970s, were still struggling for socio-economic growth. While the New World economic order was being formulated, a new philosophy of public health development with the principles of social justice and equity slowly evolved.
The initial ideas of social medicine, or the social dimension of public health, had emerged around the early twentieth century. Lord Dawson stated in 1919 that ‘preventive and curative medicine cannot be separated in any sound principle and in any scheme of medical science. They must be together in close coordination’ (Ko Ko U 1996). Professor Winslow defined public health as ‘the science and art of preventing disease, prolonging life, and promoting physical health and efficiency through organized community efforts’. Whether this definition belonged to preventive medicine or to public health was continuously debated. Social medicine emerged as a new discipline in the late 1940s and the early 1950s. Sir John Ryle described it as deriving its inspiration largely from clinical experience, seeking always to assist the discovery of a common purpose for remedial and preventive services (Ryle 1943). The leaders of both clinical medicine and public health questioned the polarization of curative and preventive medicine, and the specialization in each field as if they were mutually exclusive. Social medicine, as conceptualized by Iago Galdston (the United States), Sir John Ryle (the United Kingdom), and René Sand (Belgium) in the early and mid-1950s, failed for various reasons, and the social medicine movement was overtaken by other developments. The concept of risk factors as determinants of disease returned strongly later (Ko Ko U 1996).
New knowledge about non-communicable diseases, such as cancer, diabetes, and tobacco-related diseases, became available in the late 1950s and the 1960s. The social and behavioural aspects of disease were also recognized, and many social interventions were proposed as part of health promotion. Without considering the basic concept of social medicine, many medical universities and faculties converted their departments or schools of public health into units for preventive and social medicine. Instead of teaching either social medicine or conventional public health, educational institutions treated public health as synonymous with preventive and social medicine. Most of the associated changes were changes of designation rather than of the evolving concept of public health.
Many health planners came to believe that the task before them required fitting clinical medicine into a social context because of the political needs and demands of the community, the widening gaps between health needs and available resources, and the rising pressure of societal factors. Socialized health care had become the most reasonable, workable, and acceptable approach. Virchow had stated in the mid-1800s that medicine was a social science, and politics was medicine on a large scale (Ko Ko U 1996). Over the years, the definition of health and the means of attaining it were widely debated. The relationships between health and poverty were debated in an effort to find appropriate solutions. People became more aware of the social and economic determinants of health. Empirical evidence was collected from both the developed and the developing world. The value of health as a fundamental human right and its attainment as an essential social goal were firmly recognized. The need to rationalize the allocation of financial resources and the deployment of staff was voiced. There was a consensus that vast numbers of people remained vulnerable and a large proportion needed essential health care. A strong national commitment was required to expand and strengthen national health systems in order to ensure the greatest health benefit to the greatest number of people at the lowest cost. Halfdan Mahler, Director-General of the WHO, said, ‘Without health, life has little quality, for even if health is not everything, without it the rest is nothing’. As a result of debates on the links between health and social, environmental, economic, and political factors, there were many comments on the need to give a political dimension to international public health (WHO 1992).
The Health for All movement
These debates yielded fruitful results when, in 1977, the World Health Assembly adopted the historic resolution on ‘Health for All by the Year 2000’. It had been clarified at the very beginning that Health for All did not mean that everyone in the world would be healthy and receive treatment for all ailments. It also did not mean that nobody would be sick or disabled. Health for All by the year 2000 meant that, by 2000, people would use better approaches than they had before for preventing and controlling diseases and alleviating unavoidable illness and disability. They would find better ways of growing up, growing old, and dying gracefully. It was intended that there would be an even distribution of health resources. Essential health care would be accessible to all individuals and families, in an acceptable and affordable way, and with their full involvement. The adoption of the universal goal of Health for All helped many countries to recognize new ways of reaching higher health status and to place greater emphasis on adherence to health goals (WHO 1978a).
While almost all countries aimed at the universal goal of Health for All, they also realized that the gap was widening between the health ‘haves’ in the affluent countries and the health ‘have-nots’ in the developing world. The obstacles in closing such wide gaps in health status were clearly recognized by the world community at the international conference jointly organized by the WHO and UNICEF, at Alma-Ata in the then USSR, in September 1978. This conference was another landmark in public health development and heralded a new era. After detailed deliberations, the conference agreed on the ground-breaking Alma-Ata Declaration (WHO 1978b). This Declaration called for urgent action by all governments, health and development workers, and the world community, to protect and promote the health of all the people of the world, using the primary health care approach.
The underlying principles of health development such as equity, community involvement, appropriate technology, and a multisectoral approach were further expanded and broadened at this conference. Health as a fundamental human right was reaffirmed and every country was asked to aim at the attainment of the highest possible level of health as the most important universal social goal. Governments needed to incorporate and strengthen primary health care within their national development plans with special emphasis on rural and urban development programmes and the co-ordination of the health-related activities of different sectors. The principles of basic health services, such as accessibility, availability, acceptability, and appropriateness of health services, were retained. Primary health care was seen as a practical, scientifically sound, and socially acceptable approach. Keeping in view the need for providing essential health care to the unserved and underserved population, primary health care was expected to promote actions to reach out to these populations (WHO 1978b).
Developing countries saw the Alma-Ata primary health care conference as an opportunity for restructuring their health systems to reach the goal of Health for All by 2000. They formulated new health policies and strategies, as well as plans of action to launch and sustain primary health care. With the full support of United Nations agencies and bilateral and multilateral donors, the countries attempted to organize and manage comprehensive national health systems within the Health for All and primary health care framework. They tried to manage health resources better in order to expand their health systems. Many started emphasizing primary health care as a major public health approach in their national health development plans. Some countries concentrated on a few health-care interventions, while others tried to encompass as many elements as possible in their public health development. A few others made attempts to address the quality aspects of health-care services and programmes.
A series of innovative approaches aimed at intensifying primary health care was organized. UNICEF concurrently advocated a comprehensive child survival approach, which was later presented to the United Nations as child survival, protection, and development. The WHO emphasized health systems development, with a focus on strengthening district health systems based on primary health care. The Health for All movement in the 1970s and the 1980s resulted in many more public health interventions in developing countries. Access to the major elements of primary health care had risen to cover over 80 per cent of the people in most countries. Yet progress in covering mothers and children with appropriate health care, or in promoting safe water supply and adequate sanitation, remained too slow in most countries, particularly the least developed ones. In these countries, trained health personnel attended fewer than 25 per cent of childbirths. The trend of expansion in curative care continued to be stronger than ever; more and more hospitals (including major specialty and subspecialty hospitals) were established at the expense of prevention and promotion.
There was considerable reorganization of comprehensive health systems based on the primary health care approach in most developing countries. During implementation, integration of health care met with varying degrees of success, and some difficulties still persist. For example, despite widespread acceptance by national health administrations of the idea of ‘integration’, there were many practical operational constraints in transforming semi-autonomous ‘vertical’ or ‘selective’ health development programmes, coexisting with the general health services, into an integrated health infrastructure. On many occasions, especially when reviewing or planning disease control programmes, alternatives were sought between an apparently selective approach to health development (vertical health system) and a systematic integrated approach (comprehensive health system). Many countries, using both approaches, had demonstrated positive and negative consequences, depending upon the socio-economic circumstances and the availability of basic health infrastructure. Resource constraints and external pressure forced governments to be more selective in health development. The issues for them were: (a) whether the selective health-care interventions gave priority attention to those types of health problems affecting the poor and underprivileged populations; (b) whether their health systems addressed all essential health-care functions; and (c) whether the country could emphasize comprehensive health care while, at the same time, protecting the technical quality and efficiency of specialized or selective health programmes.
Most nationwide health development programmes promoted increasing community awareness and the creation of active and effective mechanisms for community involvement. Many successful programmes showed that the conventional approach of merely extending basic health services had proved inadequate. It was also proving economically impossible to bear the cost of extension and expansion of public sector health services to the entire population in the face of resource constraints. Large-scale use of health volunteers, after receiving a minimal training programme, proved successful in many countries. These community health volunteers constituted a third force of human resources for health. With their involvement in community health action, many public health interventions—such as disease prevention and control, immunization, maternal and child health care including nutrition promotion, health education, treatment of minor ailments, and environmental health promotion—were undertaken.
Empowering the communities, including local leaders for health action and also through links with other development sectors, had proved successful. Such public health initiatives also received international attention and recognition. The WHO, with the support of partners, promoted the primary health care and Health for All movements by instituting the Sasakawa Health Prize, the Health for All medals, and other forms of recognition. Individuals, groups of experts, local and national institutions, and international associations were amongst the recipients of such coveted prizes and recognition.
The public health experience of the 1970s and 1980s gave rise to one notable programme: the expanded programme of immunization. This global programme, initiated by the WHO and UNICEF, has been termed by many the ‘silent public health revolution’ of the twentieth century. When the expanded programme of immunization was launched in 1974, fewer than 5 per cent of children under 5 years of age in developing countries were being immunized against the major childhood diseases. Nearly 5 million young children were dying every year of measles, tetanus, whooping cough, diphtheria, tuberculosis, and poliomyelitis, childhood diseases that could be prevented by simple and effective immunizations. Inspired by the successful smallpox eradication efforts, the WHO and UNICEF promoted the expanded programme of immunization to protect all children of the world from the six main vaccine-preventable diseases. The goal of universal childhood immunization was to immunize 80 per cent of the world's children under 2 years of age by the turn of the century. Until the early 1980s, this goal seemed impossible to many countries. Nevertheless, concerted efforts by many developing countries in the 1980s and the 1990s achieved remarkable results (Fig. 2).
Fig. 2 Trends in immunization coverage in the WHO Southeast Asia region, 1980 to 1998.
This significant achievement was witnessed not only in the WHO Southeast Asia region but also throughout the world, and was made possible due to the full support of the WHO and the United Nations agencies, bilateral and multilateral donors, and international and national non-governmental organizations. In 1990, 80 per cent of all children in the world were successfully immunized against six major vaccine-preventable diseases before they reached the age of 2 years. Immunization coverage in developing countries alone was estimated at 85 per cent for a third dose of poliomyelitis vaccine, 83 per cent for combined diphtheria, pertussis, and tetanus (DPT), 90 per cent for tuberculosis (bacille Calmette–Guérin or BCG), and 79 per cent for measles vaccine. As a result, the lives of approximately 2 million children were saved. This outstanding coverage was possible because of improvements in the production, transport, and storage of vaccines. A major boost to increasing coverage was also provided by extended social mobilization efforts (UNICEF 1989; WHO 1993).
Eradication and elimination of disease
In the 1980s, the eradication of smallpox aroused worldwide interest in adopting the eradication and elimination of other diseases as plausible public health strategies. A number of candidate diseases have been examined for possible eradication or elimination. This required not just the availability of safe, effective, and affordable vaccines, but also the delivery of appropriate medical or public health interventions directly to the community. The global smallpox eradication programme demonstrated that an effective mass campaign required the development of a comprehensive health infrastructure extending to the remotest parts of the world. Disease control efforts could only be sustained effectively by an efficient surveillance system and an extensive basic health infrastructure. Many developing countries used the expanded programme of immunization established in the 1970s as an initial step towards disease elimination or eradication. For instance, the concept of eradicating poliomyelitis regionally, and later globally, was developed only in the late 1980s. Community involvement in public health measures emerged as a significant strategy, along with approaches for bringing together private and public sector agencies to intensify disease eradication or elimination (WHO 1998b).
Most successful disease control programmes had two main aspects: eradicating or eliminating the diseases, and strengthening and further developing comprehensive health systems. The successes provided powerful examples of how effectively disease control management could supplement the basic health infrastructure. Viable surveillance systems, established as a result of disease eradication and elimination campaigns, were capable of adapting to other national priority programmes. Some communicable and non-communicable diseases were candidates for elimination (zero cases but with continuing risk) and some for eradication (zero cases and zero risk) (Goodman and Foster 1998). The following are a few diseases that developing countries had prioritized to eliminate or eradicate by 2000.
The 41st World Health Assembly in 1988 resolved that the global eradication of poliomyelitis by 2000 represented not only a fitting challenge to undertake, but also an appropriate goal at the end of the twentieth century (WHO 1992). With the increasing coverage of the national expanded programme of immunization/universal childhood immunization programmes, poliomyelitis had been eradicated from the Americas and many other parts of the world by 1991. Yet, many countries in Asia and Africa reported large numbers of cases, infected by wild poliovirus. They also had continuing low coverage of immunization because of inadequate supply of vaccines, lack of appropriate basic health staff, and poorly managed programme implementation. Figure 3 shows the trends of polio immunization coverage amongst different sets of countries from 1980 to 1996.
Fig. 3 Percentage of infants immunized against poliomyelitis in countries of different economic groups, 1980 to 1996.
In the wake of the poliomyelitis eradication programme, an additional significant strategy emerged with intensive advocacy by the WHO: national immunization days. The national immunization day strategy, adopted in polio-endemic countries, supplemented routine coverage for all children under 5 years of age on certain fixed dates of the year. Usually, national immunization days were spread over 2 or 3 days, or even 4 or 5 days in highly endemic areas, about 4 to 6 weeks apart, during periods of good weather, every year for 4 to 5 successive years. Polio cases reported in China dropped from 5000 in 1990 to zero in 1995 as a result of national immunization days organized in 1993 and 1994. During 1996, in addition to the 500 million routine immunizations of children under 1 year old, a record 450 million children (almost half of the world's children under 5 years of age) were immunized with polio vaccine during these national immunization day campaigns. In Sri Lanka, for example, a one-day truce, a ‘day of tranquillity’, was agreed in the midst of civil strife in parts of the country, to enable thousands of children to be immunized. Since then, a number of strife-torn countries have followed similar approaches (Bland and Clements 1998).
Surveillance of acute flaccid paralysis in endemic countries has been intensified through good vigilance, proper case investigation, and prompt laboratory support. A majority of countries around the world are now free from polio infection. Poliomyelitis is on the verge of eradication in a few countries in Asia and the Pacific, but the major reservoir of wild poliovirus transmission remains in South Asia and sub-Saharan Africa. The WHO and UNICEF have urged the leaders of the countries where polio persists to intensify their campaign to eradicate it (WHO/UNICEF 2000). An extraordinary effort to intensify campaigns is required using stronger political will as well as national and international resources. Until the remaining area where transmission of wild poliovirus still occurs is made free from the virus, everybody in the world is at risk. Successful efforts to eradicate wild poliovirus from the world could represent the greatest achievement in public health in the first years of this new millennium (WHO 1998b).
While success may be achieved in the polio eradication efforts, there are reports of disturbing declines in routine immunization. Donor-driven national expanded programmes of immunization, organized solely for the purpose of improving immunization coverage, were short-lived following the withdrawal of donor inputs. Each country needs to adopt a strategy for a sustainable immunization programme. Experience has shown that coverage of pregnant mothers with tetanus toxoid vaccination never reached expected levels in most developing countries. Similarly, there has been a decline in measles vaccination coverage. Intensified efforts are needed to achieve the elimination of measles and tetanus. With the advocacy of the World Bank in 1993, many national programmes started adopting the ‘expanded programme of immunization plus’. Vaccines against mumps, meningitis, rubella, hepatitis B, Haemophilus influenzae, and yellow fever were added to national immunization programmes. Yet these additional vaccines were not readily available in many developing countries where they were needed most.
The new goal of eliminating leprosy, an ancient chronic disease, by the end of the twentieth century stemmed from the emergence and extensive adoption of effective multidrug therapy in the 1990s. With multidrug therapy, many leprosy patients were cured and the total number of infectious cases was reduced within a shorter period, thereby interrupting transmission. Early detection and prompt treatment with multidrug therapy prevented disabilities, thus reducing the burden of disability. Around 1985, there were 10 to 12 million leprosy cases in 122 countries, half of them in South and Southeast Asia. Of these, India alone had an estimated 4 million patients. By 1990, the number had come down to closer to 7 million leprosy patients globally. With increasing multidrug therapy coverage, there was a remarkable reduction of registered leprosy cases, from 5.4 million in 1985 to 3.7 million in 1990, a decrease of 31.5 per cent (WHO 1993). Thus, in 1991, the world community resolved to eliminate leprosy as a public health problem by the year 2000 (Noordeen et al. 1996). Generous contributions of financial, material, and human resources were made by the International Federation of Anti-Leprosy Associations, the Nippon Foundation of Japan, the United Nations and its specialized agencies, multilateral and bilateral donors, and other philanthropic societies, to help all endemic countries achieve this ambitious goal. As a result, the geographical coverage of multidrug therapy improved tremendously and leprosy cases decreased significantly. By mid-1999, about 1 million leprosy cases remained in the world, mainly concentrated in 24 countries, mostly in South Asia and sub-Saharan Africa. The majority of cases are still in a few states in India. If the current trend is sustained and intensified, the goal of leprosy elimination will be achieved.
Dracontiasis (guinea-worm disease, caused by Dracunculus medinensis) is a parasitic infestation aggravated by poor sanitation and hygiene. Before 1980, there were more than 10 million cases per year in Africa and South Asia. With accelerated efforts to provide safe drinking water and sanitation facilities during the International Drinking Water Supply and Sanitation Decade, the case load of 3.3 million in 1986 was reduced to 1 million by the early 1990s, even though there is no drug for the disease. In 1991, the World Health Assembly called for the eradication of dracontiasis by 2000. Since then, the endemic countries have adopted control strategies such as clean water supply, community awareness and involvement in surveillance and case containment, and larval control, with the addition of cash incentives to identify cases. The number of endemic villages decreased tremendously in most endemic countries. Pakistan, for instance, claimed eradication status in 1997, while India was declared free of the disease in January 2000. It is expected that most endemic countries except Sudan will soon be certified dracontiasis-free. The achievement of global eradication is expected in the early twenty-first century.
Onchocerciasis (river blindness) is a highly debilitating disease caused by a filarial worm transmitted by female blackflies in the tropical countries of Africa, Latin America, and the Arabian Peninsula. The WHO, the World Bank, the United Nations Development Programme (UNDP), the Food and Agriculture Organization, and a coalition of more than 20 donors and agencies sponsored the global Onchocerciasis Control Programme, initiated in endemic countries in 1974. The main approach at that time was to break the cycle of transmission by eliminating the vector, and an estimated 36 million people in West Africa have been protected from the disease. In 1990, some 86 million people were still at risk and about 18 million were infected. A million people were visually impaired and over 350 000 were blind as a consequence of infection. Almost 95 per cent of those infected were in Africa. The control programme was converted into an elimination campaign in the Americas in 1991 and in Africa in 1996. The new strategy adopted was to eliminate severe pathological manifestations of the disease and to reduce morbidity through wider use of case management with an effective microfilaricidal drug (ivermectin). The pharmaceutical industry pledged to provide adequate quantities of ivermectin at no cost for as long as the disease persisted as a public health problem. Still, many endemic developing countries could not implement their disease control programmes owing to a lack of financial resources to maintain staff and logistic support. With sustained national commitments, complete case treatment, selective vector control, and mass health education campaigns, as well as sustained international support, global elimination of onchocerciasis as a public health problem is envisaged in the next few years (WHO 1998b).
Emerging and re-emerging diseases
The last two to three decades have witnessed the re-emergence of a number of communicable and non-communicable diseases. Simultaneously, many new or previously unrecognized infections or disease conditions, such as HIV/AIDS and viral hepatitis C, have been reported in both developed and developing countries. Complacency towards the prevention and control of many infectious diseases, such as tuberculosis, dengue and dengue haemorrhagic fever, malaria, and plague, over the last three decades resulted in disease control programmes being neglected in many countries. The main factors aggravating this situation were changes in human demographics and job opportunities (mass migration), human behaviour (e.g. sexual relations), advances in technology and industry (air-conditioning, food processing and preservation, and so on), environmental degradation, microbial adaptation and resistance, the continued legacy of financial and resource constraints, and the breakdown of public health measures (especially surveillance). The latter factors were most common in developing countries, especially during periods of civil strife and natural disasters. Thousands of people were displaced, disabled, or died from infectious diseases during natural disasters or ethnic conflicts. As a result, a number of diseases or conditions that had been controlled have re-emerged and many new ones have emerged, causing grave concern in developing countries.
AIDS was unknown before 1981; the causal organism, HIV, was discovered in 1983. HIV infection emerged as an explosive pandemic by the mid-1980s and is by far the most profound infectious disease of the latter half of the twentieth century. It will continue to be the most devastating disease of the twenty-first century. Every year millions of people are newly infected, and most of them will die. As soon as the main modes of transmission of AIDS were identified, a global programme on AIDS was initiated in the mid-1980s by the WHO, with the full support of the World Bank, United Nations agencies, and other interested partners. After a decade, its work was consolidated into the Joint United Nations Programme on HIV/AIDS (UNAIDS), co-sponsored by United Nations agencies and their development partners.
In the absence of an effective vaccine, the main strategies of global and national HIV/AIDS control programmes were political advocacy, mass education including sex education, behavioural intervention and social mobilization, and integrated social development. Trends in AIDS incidence have differed in various parts of the world. The overwhelming majority of people with HIV and AIDS are in the developing countries of Asia and sub-Saharan Africa. In industrialized countries, the number of deaths due to AIDS has dropped rapidly owing to the increasing accessibility of proper care and expensive therapy. The progress in improving life expectancy over the past three decades in many developing countries has been reversed by HIV/AIDS, especially in the most severely affected countries (WHO 1999c). In Africa, HIV/AIDS has resulted in an increase in both infant and adult mortality. The impact on human development has gone beyond mortality, as the epidemic affects the sustainability of households and the socio-economic resources of communities (UNDP 1998). An effective vaccine is still a long way off, and effective drug therapy is far too expensive to be widely applicable and accessible in developing countries, especially with the forthcoming implementation of multilateral trade agreements such as TRIPS (the Agreement on Trade-Related Aspects of Intellectual Property Rights). The problem of HIV/AIDS is likely to become even worse in the early part of the twenty-first century unless more effective public health interventions are identified and implemented.
Tuberculosis was declared a global emergency in 1993. At the end of the twentieth century, it was one of the leading causes of death globally, with 2 to 3 million deaths taking place each year. Annually, about 3 million cases occur in Southeast Asia and nearly 2 million in sub-Saharan Africa, with 340 000 cases in Europe. One-third of the increase in incidence between 1994 and 1999 was attributed to HIV/AIDS. Effective tuberculosis case management, vaccination of infants with BCG, and preventive therapy were the main control strategies. The WHO promoted the use of the ‘directly observed treatment, short-course’ (DOTS) strategy for the effective management of infectious tuberculosis cases. More than 119 endemic countries had adopted this approach by 1998. When DOTS was introduced in 1993, no more than 2 per cent of active tuberculosis cases worldwide were being treated by this new method. By 1998, DOTS population coverage had reached 43 per cent. The worldwide goal is to achieve universal DOTS coverage by the year 2005, successfully treating 85 per cent of all active sputum-smear-positive tuberculosis cases and detecting 70 per cent of such cases. However, resource constraints and the lack of political commitment have hindered programme implementation, thus impeding coverage of all tuberculosis cases under DOTS (WHO 1998b, 2000b).
Improvements in sanitary measures, the use of antibiotics and insecticides, and vigilance over dead rats (rat falls) resulted in a dramatic decline in cases of human plague in many developing countries. Sporadic reports of rat falls and bubonic plague cases have been noted in many parts of Asia, the Americas, and Africa in the last few decades. The outbreak of plague in India in 1994 created a wave of public panic around the world. Inadequate preventive and control measures caused India to lose a few billion dollars of export earnings. This and similar situations led to an increasing awareness of the relationship between international trade and health, especially the implications of multilateral trade agreements for health (Kinnon 1998).
The largest epidemic outbreak of another internationally notifiable infectious disease, yellow fever, occurred in Ethiopia in the early 1960s, causing 30 000 deaths. Even though yellow fever can be prevented and controlled through effective immunization, a similar number of deaths due to yellow fever still occurs globally every year. The disease is still endemic in 34 African countries, including 14 of the world’s poorest. Several countries in West Africa reported yellow fever outbreaks during 1994 to 1995. Some countries in South America are still at risk, and Peru reported an outbreak in 1995. At present, air travel is the easiest route of disease transmission. The presence of vectors and non-immune people could lead to a possible epidemic of yellow fever in various parts of the world. Thousands of lives could be lost unless a good surveillance system and higher immunization coverage are sustained.
Viral hepatitis B
Infection with hepatitis B virus was recognized as a global health problem in the late 1980s. In 1990, more than 2000 million people, or two out of every five people on Earth, had been infected with hepatitis B virus. About 350 million of them became chronically infected carriers, and of these, one-quarter were at high risk of serious illness and eventual death from cirrhosis of the liver and primary liver cancer (WHO 1998b). Although a safe and effective hepatitis B vaccine, the first vaccine capable of preventing a major human cancer, has been available since 1982, coverage of immunization with hepatitis B vaccine is still relatively low. The WHO and UNICEF recommended hepatitis B vaccination as an integral part of national immunization programmes. Over 90 countries, most of them in Asia, the Pacific Islands, parts of South America, and sub-Saharan Africa, have established national programmes so far. Yet some countries with a high prevalence of hepatitis B infection, especially the developing countries of Asia, are not able to implement hepatitis B immunization as a nationwide programme. The delay is due to the lack of an appropriate and affordable hepatitis B vaccine. With sustained national and international commitments and the resource support of the international community, there is a possibility of achieving high hepatitis B vaccination coverage so that the disease can be eliminated by 2025 (WHO 1999c).
Even after the integration of extensive eradication campaigns into national control programmes, malaria remains a major public health threat in about 100 countries, most of which are located in the tropics and subtropics. There are 300 to 500 million cases and 1.5 to 2.7 million deaths each year, 90 per cent of them reported in tropical Africa. Chloroquine-resistant Plasmodium falciparum infection, which appeared in Thailand and Colombia in the 1970s, has spread to most countries of the tropical world. Recently, chloroquine-resistant P. vivax infection has also been reported in Sumatra and Oceania. Many countries have introduced multidrug therapy as the first- and second-line treatment for malaria. The malaria situation in Asia, especially in South and Southeast Asia, and also in some parts of the American and African regions, is alarming, with an increasing spread of multidrug-resistant P. falciparum infection. In addition, the vectors of malaria transmission are developing resistance to a variety of insecticides. Inappropriate use of insecticides, the fluid movement of people in border areas and other malarious areas, and irrational use of drugs are the major factors behind the continuing high prevalence.
Rapid deterioration in the malaria situation in many countries calls for greater efforts by governments of the endemic countries and the full support of the international agencies. In 1998, the WHO initiated a global programme called Roll Back Malaria, as a new health sector-wide partnership, to combat the disease at global, regional, country, and local levels. The programme anticipated a 50 per cent reduction in the number of deaths from malaria within a decade through better access by all people in malaria-affected areas to a range of effective interventions. Early detection and rapid treatment, multipronged interventions, well co-ordinated strategies, focused research, and the development of a dynamic social movement are the major strategies adopted in this global programme. Development partners would work together with the malaria-affected countries to achieve this new goal (WHO 1999c).
Diarrhoeal diseases and acute respiratory infections
Developing countries had adopted the prevention and control of diarrhoeal diseases and acute respiratory infections, mainly pneumonia, as part of major public health interventions, as these diseases have always been major causes of morbidity and mortality in infants and young children. Poor sanitation and housing conditions, including unclean and smoky kitchens, inadequate supply of safe water, and improper personal hygiene resulted in a high incidence of diarrhoeal diseases and acute respiratory infection. The total number of acute respiratory infection episodes in young children throughout the world has been estimated to be around 2000 million a year. A child in a developing country suffered from an average of two or three episodes of diarrhoea per year. This resulted in 1500 million episodes of illness, and more than 3 million deaths each year in children under 5 years of age. Acute respiratory infection added another 4.3 million deaths annually. Diarrhoea and acute respiratory infection cases accounted for two-thirds or more of outpatient and hospital admissions in most countries and they often required expensive medications (WHO 1998b).
The advent of inexpensive and effective oral rehydration therapy for diarrhoea and simplified case management for acute respiratory infection and other childhood illnesses in the early 1970s resulted in considerable progress in implementing effective clinical interventions. Even so, progress has been very slow in promoting the integrated management of childhood illness. Imparting the knowledge and skills of integrated management of childhood illness to basic health workers has required substantial professional and financial resources. Though integrated management of childhood illness is low cost, simple, acceptable, and effective, only one-third of the developing world’s families know about and have access to it. Greater efforts are required to ensure that 80 per cent of families in developing countries have access to integrated management of childhood illness, in order to reduce the burden of diarrhoeal diseases and acute respiratory infection (WHO 1998a, b).
While infectious diseases continued to be a major public health problem, there were ominous signs that non-communicable diseases were increasingly prevalent in the developing world (Fig. 4). Improved longevity, together with changes in lifestyles and diets, and increased use of tobacco and alcohol, had contributed to a sharp rise in the incidence and prevalence of respiratory and cardiovascular disorders, malignancies, and mental illnesses. Many developing countries are not adequately prepared to tackle this double burden of diseases, as evidenced by relatively weak programmes for the prevention and control of non-communicable diseases. Recent estimates of global burden of disease and injury indicated that over 24 million people were expected to die due to non-communicable diseases in the developing world in the year 2000 (actual figures not yet published), and if uncontrolled, another 38 million will die by 2020 (Murray and Lopez 1996).
Fig. 4 Causes of death: distribution of deaths by main causes in developing regions for 1985, 1990, and 1997.
Although insulin to treat diabetes was discovered in 1921, the disease was long reported as an uncommon condition in the developing world. Current estimates indicate that 143 million people are affected by diabetes worldwide. The WHO estimates that by 2025 there will be about 300 million people with diabetes. The increase in the number of cases is mainly due to demographic change, changing dietary patterns and lifestyles, and urbanization. With early diagnosis and effective management, diabetes can be controlled.
During the 1920s, George Papanicolaou introduced the Pap smear test for screening cervical and uterine cancer. This pioneering test is still used for identifying early cases of cervical and uterine cancer. Yet, this type of screening is rarely accessible to young and ageing women who are at greatest risk in rural areas of the developing world. An estimated 425 000 new cases of cervical cancer are diagnosed each year globally, most of them from urban areas. Regular periodic screening, early diagnosis, and prompt treatment can reduce mortality by 85 per cent (WHO 1998b).
Cancer of the lung was the highest-ranking cancer in 1997, both in the total population and in the population aged 15 to 64 years. The link between lung cancer and tobacco smoking has been known for decades; however, the implementation of tobacco control activities has been slow. Other tobacco-related conditions include cardiovascular diseases, which are responsible for more than 5 million out of 12 million deaths in developed countries. They are rapidly emerging as a major public health concern in many developing countries as well. Cardiovascular disease already accounts for 10 million deaths annually in the developing world. The most common cardiovascular diseases are hypertension (high blood pressure), coronary heart disease (heart attack), and cerebrovascular disease (stroke). If current patterns of tobacco use continue, an estimated 500 million people alive today will eventually be killed by tobacco. To mount a global effort against tobacco-related diseases, the WHO recently initiated the global Tobacco Free Initiative (WHO 1999c).
Health systems reforms
The WHO and other agencies have continuously evaluated health systems development since the 1970s. Annual reviews by the WHO and its development partners reported impressive progress of health development in many countries. Concerns were expressed, however, on the widening gap in health status between and within countries. The following are the main factors that slowed down progress in implementing Health for All strategies using primary health care as the key approach:
a lack of full understanding of the fundamental policies and principles of primary health care and Health for All as applied to national health systems development, which meant that universal access to essential health care on a continuing basis was not achieved;
lack of co-ordination and collaboration between specific health intervention campaigns and the development of basic health infrastructure (district health systems development);
difficulties in involving communities in health action;
slow pace of integration of vertical disease control campaigns into the general health infrastructure;
weak planning and management of health-care delivery, especially at the operational levels;
imbalance and irrelevance of human resources for health.
Investments in health from government funds (from either internal or external resources) have declined in many developing countries. The same is also the case for investments by donor agencies in health. Donor support for the development of health systems is usually linked to specific objectives. Donors are generally keen on supporting selective public health intervention projects, especially for control of specific diseases. Thus, the gap between the principles of global solidarity at various international conferences and the implementation of those principles has remained very wide (WHO 1993, 1998b; Tarimo and Webster 1994; UNICEF 1996).
In the last decade, there has been widespread democratization in most developing countries. This has led to a certain amount of devolution of power and responsibility to the people, thereby increasing their involvement in the planning and management of development programmes, especially in social, environmental, and economic development. This change in overall government policy and organization, when undertaken, usually extended across all sectors. The World Bank, the International Monetary Fund, and many other multilateral and bilateral donors used these changes as a condition for extending their loans or grants. Many health sector reforms initiated by developing countries in the last few decades included decentralization or devolution as an important strategy. However, the approach varied between countries depending on the extent of devolution and decentralization of authority, the division of responsibility and resources, and the management capacity at each level of the health system (Rafei 1993).
Evaluation reports showed that policy formulation and development, capacity building for planning and management at the local level, and the development of operational procedures are the main strategies being adopted to strengthen decentralization efforts. While training in management skills and knowledge is necessary, other facets, such as performance appraisals and quality assurance, including procedural reviews and revisions, delegation of authority and responsibility, assurance of accountability, and on-the-job training and problem solving, also usually need to be in place. Policy and organizational changes across all government sectors are also important for the success of decentralization.
Another trend in health systems development worldwide is the increasing role of the private sector, both for-profit and non-profit. During the last few decades, the appropriate public and private mix in health systems has been extensively debated. This debate stemmed from the fact that the major proportion of health expenditure came from private sources and that governments could not increase their expenditure on health. The role of the private sector in health care was mainly concentrated in the provision of medical care, including health insurance. However, many private agencies are now involved in resource mobilization, health promotion, and the prevention and control of major health problems.
Various health systems reforms have been initiated during the last few years in both developing and developed countries in order to achieve the optimal mix of private and public health provision. Some developing countries started with the introduction of user charges for both outpatient and inpatient care or the provision of private beds. Some have even attempted privatization of public health facilities, whereas others promoted the increasing involvement of private health-care providers in national health development. Some have introduced national health insurance schemes or expanded existing social insurance coverage. The major changes introduced required consumers to pay more for health care, especially in poorer countries (Creese 1994; Hanson and Berman 1998). The choice between the Beveridge and Bismarck models for health-care financing reform is still under debate. The question is not a simple either/or choice, but how the most appropriate mix of the two models can fit within the existing socio-economic, political, and health situation of each country. Governments need to play a strong role in this decision.
Health research and development advance together with technology. Previously, developing countries relied on the results of research and development carried out by developed countries, especially in the area of science and technology. Nonetheless, many scientific breakthroughs actually came from experience gained in developing countries, for example in malaria transmission, dengue vaccine development, contraceptives, and multidrug therapy. Since its early years, the WHO has urged the promotion of research capability in developing countries. In the mid-1970s, after the establishment of regional and global advisory committees on medical research by the WHO, the scientific communities of the developing world have played significant roles in international research promotion and development. A series of research studies has been initiated on the prevention and control of tropical diseases, the promotion of human reproduction including contraception and other fertility control measures, the strengthening of health systems, the protection of environmental health, the control of non-communicable diseases, and the testing of other essential health and medical care interventions.
Partnerships in health research were established in the areas of cancer, human reproduction, and tropical diseases. The International Agency for Research on Cancer was established in 1965, under the aegis of the WHO, to conduct research on environmental biology and cancer. In 1972, the WHO launched a special programme of research, promotion, and development on human reproduction to meet the needs of the developing countries. Later, in 1988, the UNDP, the United Nations Population Fund, and the World Bank joined the WHO in this programme as co-sponsors. This Human Reproductive Research Programme made major contributions to the improvement of reproductive health, especially in the development of different methods of fertility regulation. Currently, a major network of over 110 institutions, mainly in developing countries, is involved in research on human reproduction. Over 1700 scientists from developing countries have been trained in various disciplines in human reproductive research and are participating in global research efforts.
The WHO, together with the UNDP and the World Bank, also established a special programme for research and training in tropical diseases in 1975. This programme concentrated on major tropical diseases, including leprosy, malaria, onchocerciasis, yellow fever, Chagas’ disease, dengue/dengue haemorrhagic fever, and tuberculosis, all of which affect millions of people in the tropics. Over the last 25 years, the Programme for Research and Training in Tropical Diseases has created more than 30 products related to drugs, diagnostics, and vector control tools, and has another 30 advances (including vaccines) in the pipeline. Research training grants have been provided to about 1000 scientists from developing countries to improve their scientific and research capabilities. After training and assuming influential positions in ministries of health and national research institutions, many of these scientists have introduced significant managerial, technical, and political changes (WHO 1999c).
In the innovation and intensification phases of primary health-care implementation during the post-Alma-Ata era, people working in public health realized that health systems research was an important tool for innovation. Many countries strengthened their health systems research units within the ministries of health. Some even established separate national institutes to provide appropriate scientific information to the decision-makers. Considerable progress was made in capacity building and capability strengthening in promoting health systems research. There was an enormous increase in the number of health systems research studies carried out. However, when compared with investments in basic science research, the budget allocations for health systems research remain relatively small.
The Ad Hoc Committee on Health Research, established by the WHO and interested development partners, concluded in its report on Investing in Health Research and Development that the central problem in health research promotion and development was the ‘10/90’ disequilibrium (WHO 1996). According to the WHO–World Bank study in 1990 on the burden of disease (World Bank 1993), the top 20 diseases and their risk factors affect 90 per cent of the world population (Murray and Lopez 1996). An estimated US$56 billion was invested globally in health research, yet only 5 to 10 per cent of this was spent on the issues that affect the large majority of the world’s population. The Ad Hoc Committee cautioned the global community about the enormous economic and social consequences to society as a whole of such misallocation of resources. This concern became even more important in the context of the public health challenges of the twenty-first century (GFHR 1999). The world community will need to increase efforts to complete the unfinished agenda of preventing unnecessary deaths, sickness, and disability. Essential public health interventions are needed to tackle new, emerging, and re-emerging health problems. A special effort is required to strengthen global capacity and capability to deal with the increasing burden of non-communicable diseases and conditions, and to promote healthy ageing. Both developed and developing countries need to work together to address the inequity and inefficiency of health systems. The success of current and future efforts in health sector reform demands international solidarity.
Technical co-operation amongst developing countries, as a mechanism for solidarity, emerged in the 1970s. The United Nations General Assembly in 1972 invited the UNDP Governing Council to identify the best ways of enabling developing countries to share their capabilities and experience, with a view to increasing and improving development assistance. The concept of technical co-operation amongst developing countries was soon recognized as an integral part of international co-operation for development. The United Nations and its agencies, including the WHO, took active steps to promote the concept and its application amongst developing countries. Within the intercountry and regional co-operation mechanisms, technical co-operation in health, through technical expertise, exchange of faculty members, training facilities, and other resources, was promoted. The meeting of health ministers of non-aligned and other developing countries, in 1984, further endorsed the technical co-operation amongst developing countries mechanism and even brought forth an initial plan of action on technical co-operation amongst developing countries for Health for All.
New public health
Public health transition
The final decades of the twentieth century witnessed rapidly changing political situations and severe economic upheavals, especially towards the end of the Cold War. Strong demands were made for pluralistic democracy, good governance, social justice, respect for human rights, a clearly defined role for the state, and economic globalization. Social expectations and awareness increased in many developing countries. Primarily, progress and development depended on the inherent strength of a nation and its people, their ability to adopt new and appropriate patterns of behaviour, and to embrace the social and other forms of freedom which they wished to pursue. The existing trends in developing countries are towards the adoption of more democratic forms of government, the achievement of peace and prosperity, often after prolonged civil and military strife, the dismantling of rigid central planning and economic systems, and the adoption of a more market-oriented economic approach. Many countries are still preoccupied with domestic problems, but a few have been benefiting from a period of stability and sustainable growth. Such political and socio-economic changes are not achieved without painful choices. Costly repercussions in terms of civil disturbances and economic crises occurred in Eastern Europe, Africa, the Americas, and Asia.
Most developed nations and a few developing countries experienced demographic and epidemiological transitions. These countries had seen parallel reductions in birth and death rates. The age structure of their population had changed from the stage of numerous young and few elderly to nearly equal numbers in most age groups. While developed countries went through these epidemiological and demographic transitions over two centuries, developing countries were experiencing the same transitions within decades (WHO 1999c). Enormous social, economic, and epidemiological changes rapidly followed the demographic transition. Added to this are the deteriorating environmental conditions that hinder sustainable development and human well-being. People in developing countries are not only still exposed to traditional environmental health hazards, such as poor water supply and sanitation and inadequate food hygiene, but are also at high risk from the hazards of uncontrolled industrialization. Mismanagement of natural resources and poor living conditions in rural areas and urban slums have also led to a higher risk of diseases arising from degraded environmental conditions (WHO 1997).
There is no doubt that people in developing countries are healthier than their counterparts a century ago. Life expectancy at birth in the least developed nations improved from 31 years in 1950 to 51 years in 1990; compared with developed countries, where life expectancy at birth exceeds 70 years, developing countries still have some way to go (WHO 1999c). The majority of developing countries in South Asia, the Pacific, Latin America, and sub-Saharan Africa still account for the major proportion of the global burden of communicable diseases. In addition to the higher prevalence of new, emerging, and re-emerging communicable diseases, the prevalence of non-communicable diseases such as cancer, cardiovascular diseases, and other chronic degenerative diseases and conditions is also increasing. Poor road conditions and inadequate enforcement of traffic rules and vehicle standards in developing countries have resulted in a high incidence of accidents and trauma. This double burden of disease has made it difficult for health policy-makers and administrators to decide on the equitable allocation of scarce resources. Addressing this rapid epidemiological transition may be the biggest challenge for public health at the start of the new century. In addition, there have been problems in implementing many international treaties and conventions in health-related areas. One notable example is the difficulty in implementing the biological and chemical weapons conventions. Eradication of smallpox in the human population is an outstanding achievement, but the final destruction of the smallpox virus stocks remaining in a few laboratories has remained controversial from the scientific point of view.
Able and dedicated scientists, the general public, and governments, together with the international community, have continued to play their part in meeting the challenge to address the problems associated with the combined burden of communicable and non-communicable diseases in developing countries (Kaplan 1999).
Globalization of public health
Globalization is a phenomenon characterized by worldwide interdependence in all aspects: economic, political, social, and cultural. The transformation of local societies into a global society has resulted in the blurring of territorial frontiers. An analysis of the impact of globalization in recent years is now available in the health and development literature (Frenk et al. 1997; Chen et al. 1999; Society for International Development 1999). It is accepted that globalization has enhanced opportunities for human advancement as well as for the economic and cultural growth of all peoples. It also offers great potential for achieving health for all. Global markets, finances, technology, knowledge, solidarity, and governance have the ability to improve the health of people everywhere. The UNDP and the World Bank have examined the opportunities and challenges for human and socio-economic development within the present context of globalization and localization, and have suggested ways in which the global society should react to these issues, especially in the areas of governance, markets, and sustainability (UNDP 1999; World Bank 1999). The threats and opportunities for public health have been analysed extensively within the context of globalization (Jamison et al. 1998; Walt 1998; Yach and Bettcher 1998; Berlinguer 1999; Navarro 1999). Similar policy and programme debates in various international arenas have dealt with the changing perceptions of decision-makers on national health development within the context of globalization, the possibility of exploiting technological advances, the need for national and global solidarity, and the need to sustain moral and social values.
Berlinguer used the term ‘microbial unification of the world’ for the phenomenon of the transmission of epidemics across countries and continents as part of globalization. Today, local control of a disease outbreak has become a global struggle. The rapidly increasing number of people travelling and migrating to neighbouring countries or across the world poses the threat of the spread of communicable diseases. As disease agents know no physical boundaries, transmission can take place at any time, anywhere in the world. With the proliferation of information networks, news of epidemic outbreaks spreads globally almost instantaneously, and exaggerated and sensational reports often capture the audience rather than an accurate account of the situation. The WHO, in consultation with its member states, is trying to address this problem by establishing a global disease surveillance system and by revising the International Health Regulations. Debates continue on how this revision can be accomplished within the framework of globalization.
Microbial and chemical hazards in food are of major concern worldwide. Certain infectious diseases, especially food-borne diseases like cholera, can be carried with food exports and tourism. If food and food products are prepared in accordance with good manufacturing practices, and no reported cases of infectious disease are associated with the consignments, food imports should not be restricted. However, there are many trade-related cases from Asia, the Americas, and Africa in which exports from developing countries were restricted or sanctioned by developed nations because of the occurrence of infectious diseases. Peru lost over US$ 770 million during the 1991 cholera outbreak because of trade sanctions imposed by several countries on imports of Peruvian fish and fishery products. Similarly, India lost around US$ 4000 million in export earnings because of the plague outbreak in a few states in 1994. Several African countries lost millions of dollars because of an embargo on certain fishery products during the cholera outbreaks of 1998 (Kinnon 1998; Miyagishima and Kaferstein 1998). In all these cases, the WHO provided appropriate information on the disease outbreaks and certified that there was no relationship between the outbreaks and the production of foodstuffs. The WHO also advised the respective governments to apply protective measures based on scientific knowledge and evidence. These activities served to further the international harmonization of sanitary measures aimed at the protection of human life and health. The WHO could work closely with the World Trade Organization with respect to the public health aspects of international trade disputes arising from disease outbreaks.
The WHO could also work with the World Trade Organization in revising the International Health Regulations, while the possibility of the International Health Regulations being formally recognized in future amendments of the multilateral trade agreements could also be explored. The international community needs to work together to ensure that reporting nations, mainly developing countries, are not unfairly penalized.
New facts of socialized public health
It is clear that health and economic development are mutually dependent. Policy-makers in both developed and developing countries are becoming increasingly concerned with finding equitable, realistic, and sustainable approaches to improving the health situation. Unfortunately, experience has shown that many governments in developing countries regard health expenditure by the public sector as purely consumption spending and seek to minimize it. Total spending on health as a proportion of gross domestic product in these countries is still below 5 per cent. The main issue is not just the low level of expenditure, but also the lack of efficiency in the allocation of limited funds. International agencies have encouraged developing countries to set priorities and improve resource allocation.
The WHO has advocated a new paradigm for health that sees health as central to development and to the quality of life. Health development must be achieved through a dynamic yet harmonious balance between health in terms of consumption and health as an investment in the future of humankind (WHO 1992). The World Bank has initiated priority setting in health by introducing the notion of cost-effective public health intervention packages, tailored to the public financial realities of each country (World Bank 1993). Many developing countries, especially those receiving substantial foreign investment in health, now design essential health-care packages as part of their national health-sector-wide programmes.
The two decades since the Alma-Ata Conference have shown that sustaining the Health for All movement, using primary health care as the main approach towards the universal goal of health for all, is a complex and difficult task for many developing countries. A few developing countries in Asia, Latin America, and Africa have shown that major improvements in health outcomes can be achieved while total public health spending is kept at modest levels. Governments in these countries have consistent development policies and programmes for reaching the poor and vulnerable populations with the most effective and appropriate preventive, promotional, and curative health interventions, in addition to other social services. The use of health volunteers to provide essential primary health care and to spread health knowledge was a notable success of the decade. Underprivileged people have access to health care (from either public or private providers), where the right kind of health care is available for common ailments. Such universal health-care coverage was one of the major strategies adopted at the Alma-Ata Conference.
In spite of the rapid expansion and improvement in health systems development over two decades, not all citizens of developing countries have access to minimum essential health care. Many children in isolated areas of the world still miss polio, measles, and tetanus immunizations. Many mothers deliver their babies unattended by trained personnel. A majority of tuberculosis and leprosy cases go without multidrug therapy. The challenge during this century is to ensure access to essential health care for all citizens of the world, irrespective of their race, religion, citizenship, or residence. The WHO has advocated a ‘new universalism in health’: health care must cover all people, but this does not imply coverage of everything (WHO 1999c). Studies have indicated that out-of-pocket payment for health care penalizes the poor and the underprivileged, whereas prepayment or any system of risk pooling allows a wide range of incentives for the efficient purchase of services. Essential public health interventions can be efficient and of good quality regardless of who provides them. The essential interventions for each country may need to be defined on the basis of its health financing mechanisms, health systems infrastructure, social behaviour, and other aspects of socio-economic development.
Public health development over the last few centuries has clearly shown the need for increasing interdependence between and amongst developed and developing countries in order to promote the health of the people of the world. Health risks are shared by every citizen of the world, and can also be seen as opportunities for improvement. Technological interventions for health care may not be the only solution; socio-economic and ethical dimensions are also relevant for the successful implementation of such interventions. Debates have now been initiated on the global governance of health. During the last 50 years, many international agencies (both within and outside the United Nations system) have emerged out of the need to deal with international developmental issues, in both policy and programme terms, including health and other social development. Each has its own comparative advantages.
Recently, the Director-General of the WHO defined a new role that the WHO could play as a step towards becoming a truly international as well as a global health organization (Robbins 1999; WHO 1999c). While the WHO’s main mission remains the attainment by all people of the highest possible level of health, it has adopted a corporate strategic framework intended to enable it to make the greatest possible contribution to world health by increasing its technical, intellectual, and political leadership (WHO 2000a). The WHO would like to work with developing countries to focus global efforts on building healthy populations and communities. This will be achieved by addressing the excess burden of sickness and suffering resulting from both communicable and non-communicable diseases, especially in poor and marginalized populations. Partnerships will be established to sustain and support health system development so that equitable health outcomes are achieved and people’s demands are met. The vision of the new universalism will be realized through the development of enabling policies and improved institutional environments, both nationally and globally.
The efforts of national and international communities should aim at promoting healthy living, reducing the double burden of disease, and making essential health care accessible to all. Since the Health for All movement, health, equity, and social justice have remained the main themes of social policy. The social values and principles of solidarity, social justice, and ethics underpinning primary health care and Health for All will remain relevant now and in the future. All public health professionals and the international community need to sustain that vision. They need to commit and rededicate themselves to meeting the opportunities and challenges of health development in the twenty-first century.
Barton, W.L. (1979). Alma-Ata: signpost to the new health era. World Health, July, 10–4. WHO, Geneva.
Basu, Z., Jezek, Z., and Ward, N.A. (1979). Eradication of smallpox from India. WHO Regional Publication Southeast Asia Series No. 5. WHO–SEARO, New Delhi.
Berlinguer, G. (1999). Globalization and global health. International Journal of Health Services, 29, 579–95.
Bettcher, D.W., Sapirie, S., and Goon, E.H.T. (1998). Essential public health functions: results of an international Delphi study. World Health Statistics Quarterly, 51, 44–54.
Bland, J. and Clements, J. (1998). Protecting the world’s children: the story of WHO’s immunization programme. World Health Forum, 19, 162–3.
Chen, L.C., Evans, T.G., and Cash, R.A. (1999). Health as a global public good. In Global public goods (ed. I. Kaul, I. Grunberg, and M.A. Stern). Oxford University Press, New York.
Creese, A. (1994). Global trends in health care reform. World Health Forum, 15, 317–22.
Curtis, S. and Taket, A. (1996). Health and societies: changing perspectives. Arnold, London.
Detels, R. and Breslow, L. (1997). Current scope and concerns in public health. In Oxford textbook of public health (3rd edn) (ed. R. Detels, W.W. Holland, J. McEwen, and G.S. Omenn), Vol. 1, p. 3. Oxford University Press, New York.
Djukanovic, V. and Mach, E.P. (ed.) (1975). Alternative approaches to meeting basic health needs in developing countries: a joint UNICEF/WHO study. WHO, Geneva.
Fenner, F., Henderson, D.A., Arita, I., Jezek, Z., and Ladnyi, I.D. (1988). Smallpox and its eradication, pp. 473–516. WHO, Geneva.
Foege, W.H., Millar, J.D., and Lane, J.M. (1971). Selective epidemiologic control in smallpox eradication. American Journal of Epidemiology, 94, 311–15.
Foster, G.M. and Anderson, B.G. (1978). Medical anthropology, pp. 224–5. John Wiley, New York.
Frenk, J. (1993). The new public health. Annual Review of Public Health, 14, 469–90.
Frenk, J., et al. (1997). The future of world health: the new world order and international health. British Medical Journal, 314, 1404.
GFHR (Global Forum for Health Research) (1999). The 10/90 Report on Health Research 1999. GFHR, Geneva.
Goodman, R.A. and Foster, K.L. (ed.) (1998). Global disease elimination and eradication as public health strategies. Proceedings of a conference held in Atlanta, Georgia, United States, 23–25 February 1998. Bulletin of the World Health Organization, 76 (Supplement 2), 5–162.
Gopalan, C. (1992). Nutrition in developmental transition in Southeast Asia. Regional Health Paper, SEARO, No. 21, pp. 9–11. WHO–SEARO, New Delhi.
Gunaratne, V.T.H. (1977). Challenges and response: health in Southeast Asia Region. WHO–SEARO, New Delhi.
Guthrie, D. (1946). History of medicine. Lippincott, London.
Hanson, K. and Berman, P. (1998). Private health care provision in developing countries: a preliminary analysis of levels and composition. Health Policy and Planning, 13, 195–211.
Harrison, M. (1994). Public health in British India: Anglo-Indian preventive medicine 1859–1914. Cambridge University Press.
Henderson, D.A. (1997). Edward Jenner’s vaccine. Public Health Reports, 112, 116–21.
Henderson, D.A. (1998). Smallpox eradication—a cold war victory. World Health Forum, 19, 113–19.
Howard-Jones, N. (1974a). 1: The scientific background of the international sanitary conferences 1851–1938. WHO Chronicle, 28, 159–71.
Howard-Jones, N. (1974b). 5: The scientific background of the international sanitary conferences 1851–1938. WHO Chronicle, 28, 455–70.
Howard-Jones, N. (1977). 2: International public health: the organizational problems between the two World Wars. WHO Chronicle, 31, 449–60.
Institute of Medicine, United States (1988). The future of public health. Committee for the Study of the Future of Public Health. Division of Health Care Services, Institute of Medicine. National Academy Press, Washington, DC.
Jaggi, O.P. (1979a) History of science, technology and medicine in India. Vol. 13: Western medicine in India: medical education and research. ATMA RAM, Delhi.
Jaggi, O.P. (1979b) History of science, technology and medicine in India. Vol. 14: Western medicine in India: public health and its administration. ATMA RAM, Delhi.
Jamison, D., et al. (1998). International collective action in health: objectives, functions, and rationale. Lancet, 351, 514–17.
Kaplan, M.M. (1999). The efforts of WHO and Pugwash to eliminate chemical and biological weapons—a memoir. Bulletin of the World Health Organization, 77, 149–55.
Kiat, L.Y. (1978). The medical history of early Singapore, SEAMIC publication No. 14, pp. 221–35. SEAMIC, Tokyo.
Kinnon, C.M. (1998). World trade: bringing health into the picture. World Health Forum, 19, 397–406.
Ko Ko U (1986). Public health: myth, mysticism and reality. WHO–SEARO, New Delhi.
Ko Ko U (1996). Closing the gaps in health care—a holistic approach to medical education. In SEA regional conference on medical education, 7–9 February 1996, pp. 152–67. Faculty of Medicine, Chulalongkorn University, Bangkok.
League of Nations Health Organization (1937). Report of the intergovernmental conference of Far-Eastern countries on rural hygiene, Bandoeng (Java), 3–13 August 1937. League of Nations Health Organization, Geneva.
McNeill, W.H. (1977). Plagues and peoples. Basil Blackwell, Oxford.
Mills, A. (1998). Health policy reforms and their impact on the practice of tropical medicine. British Medical Bulletin, 54, 503–13.
Miyagishima, K. and Kaferstein, F.K. (1998). Food safety in international trade. World Health Forum, 19, 407–11.
Murray, C.J.L. and Lopez, A.D. (ed.) (1996). The global burden of disease: a comprehensive assessment of mortality and disability from diseases, injuries and risk factors in 1990 and projected to 2020. Global Burden of Diseases and Injury Series, Vol. I. Harvard University Press, Cambridge, MA.
Navarro, V. (1999). Health and equity in the world in the era of ‘globalization’. International Journal of Health Services, 29, 215–26.
Noordeen, S.K., et al. (1996). Eliminating leprosy as a public health problem—is the optimism justified? World Health Forum, 17, 109–44.
Paneth, N., et al. (1998). A rivalry of foulness: official and unofficial investigations of the London cholera epidemic of 1854. American Journal of Public Health, 88, 1545–53.
Rafei, U.M. (1993). Primary health care in changing world—Southeast Asia regional perspectives. Paper presented at the 15th anniversary celebration of the Alma-Ata Conference on Primary Health Care, Almaty, 13–14 December 1993. WHO–SEARO, New Delhi.
Rangarajan, L.N. (ed.) (1992). Kautilya: the arthashastra. Penguin Books, New Delhi.
Robbins, A. (1999). Brundtland’s World Health Organization: a test case for United Nations reform. Public Health Reports, 114, 30–9.
Rosenfield, A. and Maine, D. (1985). Maternal mortality—a neglected tragedy. Where is the M in MCH? Lancet, ii, 83–5.
Ryle, J.A. (1943). Social medicine: its meaning and scope. British Medical Journal, ii, 633–6.
Sigerist, H.E. (1951). A history of medicine. Vol. 1: Primitive and Arabic medicine. Oxford University Press, New York.
Society for International Development (1999). Responses to globalization: rethinking health and equity. Development, 42, 1–158.
Tarimo, E. and Webster, E.G. (1994). Primary health care concepts and challenges in a changing world: Alma-Ata revisited. Current Concerns: SHS paper No. 7. Document WHO/SHS/CC/94.2. WHO, Geneva.
UNDP (United Nations Development Programme) (1998). Human development report 1998. UNDP, New York.
UNDP (United Nations Development Programme) (1999). Human development report 1999. UNDP, New York.
UNICEF (United Nations International Children’s Emergency Fund) (1989). States of the world’s children, 1989. Oxford University Press, New York.
UNICEF (United Nations International Children’s Emergency Fund) (1991). States of the world’s children, 1991. Oxford University Press, New York.
UNICEF (United Nations International Children’s Emergency Fund) (1996). States of the world’s children, 1996. Oxford University Press, New York.
Uragoda, C.G. (1987). A history of medicine in Sri Lanka—from the earliest times to 1948. Sri Lanka Medical Association, Colombo.
Walt, G. (1998). Globalization of international health. Lancet, 351, 434–7.
Wilkinson, L. and Power, H. (1998). The London and Liverpool Schools of Tropical Medicine 1898–1998. British Medical Bulletin, 54, 281–92.
WHO (World Health Organization) (1957). Report on rural health conference, 14–26 October 1957. Document SEA/RH/9. WHO–SEARO, New Delhi.
WHO (World Health Organization) (1964). WHO Expert Committee on smallpox: first report. World Health Organization Technical Report Series, 283, 9–11, 15, 24, 31.
WHO (World Health Organization) (1967). Twenty years in Southeast Asia 1948–1967. WHO–SEARO, New Delhi.
WHO (World Health Organization) (1978a). A decade of health development in Southeast Asia 1968–1977. WHO–SEARO, New Delhi.
WHO (World Health Organization) (1978b). WHO, Alma-Ata 1978—Primary health care, Geneva, 1978. HFA series No. 1. WHO, Geneva.
WHO (World Health Organization) (1986). Regional Advisory Committee on Medical Research for Southeast Asia. Proceedings of the special session commemorating the tenth anniversary on 12 April 1985. WHO Regional Publications, Southeast Asia Series No. 15. WHO, New Delhi.
WHO (World Health Organization) (1992). WHO Collaboration in Health Development in Southeast Asia: 1948–1988. WHO–SEARO, New Delhi.
WHO (World Health Organization) (1993). Implementation of global strategy for Health for All by the year 2000: Eighth report on the world situation, Vol. 1. WHO, Geneva.
WHO (World Health Organization) (1996). Investing in health research and development: report of the Ad Hoc Committee on Health Research Relating to Future Intervention Options. Document TDR/Gen/96.1. WHO, Geneva.
WHO (World Health Organization) (1997). Health and environment in sustainable development: five years after the Earth Summit. Document WHO/EHG/97.8. WHO, Geneva.
WHO (World Health Organization) (1998a). Evaluation of the implementation of the global strategy for Health for All by the Year 2000, 1979–1996. Document WHO/HST/98.2. WHO, Geneva.
WHO (World Health Organization) (1998b). World Health Report 1998. Life in the twenty-first century: a vision for all. WHO, Geneva.
WHO (World Health Organization) (1999a). Health situation in the Southeast Asia Region 1994–1997, p. 155. WHO–SEARO, New Delhi.
WHO (World Health Organization) (1999b). Nutrition for health and development: progress and prospects on the eve of the twenty-first century, Document WHO/NHD/99.9. WHO, Geneva.
WHO (World Health Organization) (1999c). World health report 1999. WHO, Geneva.
WHO (World Health Organization) (2000a). A corporate strategy for the WHO Secretariat. Report by the Director-General to the 105th Session of the Executive Board. Document EB105/3. WHO, Geneva.
WHO (World Health Organization) (2000b). Global tuberculosis control. WHO Report 2000. Document WHO/CDS/TB 2000.275. WHO, Geneva.
WHO/UNICEF (2000). Final push in campaign to eradicate polio. WHO Press Release WHO/1, 6 January 2000. WHO, Geneva.
World Bank (1993). The world development report 1993: investing in health. World Bank, Washington, DC.
World Bank (1999). The World Development Report 1999/2000: entering the twenty-first century. World Bank, Washington, DC.
Yach, D. and Bettcher, D. (1998). The globalization of public health. I: Threats and opportunities. American Journal of Public Health, 88, 735–43.