1.2 The history and development of public health in developed countries (continued)

Traditions of market regulation affected public health more broadly. Concern about water quality in metropolitan London, for example, reflected consumer outrage at high prices and poor quality and quantity well before there was any epidemiological evidence that such water was causing cholera. Equally, public willingness to accept that epidemiological evidence was tied to anger at paying too much for an irregular and visibly dirty water supply (Hamlin 1990). It is also likely, though difficult to show, that the ready acceptance of the new scientific forms of food inspection in the late nineteenth century reflected consumer expectations that the service was necessary and appropriate for government to undertake.
In the case of environmental nuisances too, institutions of public health took over from long-standing institutions for settling civil disputes. The term ‘nuisance’, drawn from the Anglo-French for annoyance, is peculiar to the English common law tradition, but analogous concepts operated in other cultures. ‘Nuisance’ referred to an accusation, and later to the legal determination, that actions on one person’s property or in the public domain interfered with another’s enjoyment of the rights of property, which included a right to enjoy health (Blackstone 1892; Novak 1996; Hamlin, in press). While in earlier centuries the concept had been very broad—including excessive noise, disturbances of the peace, the blocking of customary light—by the middle of the nineteenth century it had come to refer chiefly to urban dung, human and animal. At the same time, action against nuisances acquired a basis in statute law that supplemented its status in civil law. Beginning with the first English Nuisances Removal Act of 1847, passed in expectation of the return of cholera, doctors, and later a new functionary called an inspector of nuisances (later a sanitary inspector), were charged with identifying nuisances and taking steps to have them removed (Wilson 1881; Hamlin, in press). The legislation reflects concern that a legal tradition built upon the power of property was unsuited to a situation in which most property was not occupied by its owners, and that one which depended upon an outrage to sensibility was unsuited to a situation in which people’s sensibilities were insufficiently attuned to the particular states of environment presumed to be associated with cholera. But while this change was an emergency response to cholera, its effects were more far-reaching; in effect, it represented the investing of community standards in a permanent institution with enforcement powers, rather than leaving them to be worked out, incident by incident, through the common law of nuisance and tort.
The inspectors of nuisances did not restrict themselves to the causes of cholera; they and their successors responded to community complaints, which sometimes were primarily aesthetic. They became the defenders of the ever rising and increasingly universal standards of middle-class life, and however far their activities might stray from any direct relation to disease control, the inspectors carried the authority of public health (Hamlin 1988, 1994; Kearns 1991). Towards the end of the nineteenth century some epidemiologists, recognizing that the tracing of cases and contacts provided a more exact means of disease control, suggested that concern with these broad measures of environmental quality was an unjustified expense that deflected the attention of public health departments from what really mattered (Casseday 1962; Rosenkrantz 1974). In some cases they were effective in severing sanitation and public works from public health, but often they found that the public, which tended to support clean streets and pleasant neighbourhoods, continued (and continues) to appeal to public health as justification for their concern. Here too, medicine, however distantly it might be linked to the environmental condition under scrutiny, gave public action a legitimacy that would otherwise have been difficult to create.
The medicalization of public police that these examples suggest was clearly underway by the middle of the eighteenth century. The concept of medical police arose first in Germany and Austria, later in Scotland, Scandinavia, Italy, and Spain; in France the rough equivalent was hygiène publique. In America and in England the term and concept never really caught on. Medicine’s rise to prominence reflected an alliance between medical practitioners who sought state patronage and the ‘enlightened despots’—rulers who, like Austria’s Joseph II, sought a science of good government that would significantly strengthen their states. Increasingly, rulers like Joseph felt obliged to test their policies against some tenets of rationality; health seemed to offer a well-defined arena of rational government, a set of means to improve the state and to measure the progress of that improvement (Rosen 1974a, b). How much the regulation of personal behaviour could improve the health of soldiers and sailors was becoming recognized; why not practise the same techniques on the rest of society? The effect of this medicalization was to move matters of police further from the realm of local social relations—for example, the determination and enforcement of community standards over cleanliness or food quality—and towards that of scientific rationality.
The classic text of eighteenth-century medical police is Johann Peter Frank’s six-volume System einer Vollständigen Medicinischen Polizey, or A System of Complete Medical Police, which appeared between 1779 and 1819 (Frank 1976). Frank (1745–1821) had a distinguished career as a medical professor and a public health and hospital administrator, mainly in Vienna. He began his giant work with a discussion of reproductive health (two volumes), including suggestions for the regulation (and encouragement) of marriage, prenatal care, obstetrical matters, and infant feeding and care. He turned then to diet, personal habits, public amusements, and healthy buildings. The fourth volume covered public safety, which involved everything from accident prevention to the injuries supposedly inflicted by witches; the fifth volume dealt with safe means of interment, and the sixth with the regulation of the medical profession. In Frank’s cameralist view, anything that adversely affected health was a matter for public policy and an appropriate subject for regulation—rights, traditions, property, and freedoms had no status if they interfered with the welfare of the population.
In its most far-reaching definitions, modern public health approaches the domain of a comprehensive police. It also recognizes that a wide range of factors are implicated in health conditions—current public health concerns include the effects of violent entertainment, the prevention of gun violence, and the conditions of the work place. But in modern liberal democracies, much of what Frank saw as the obvious business of the state is deeply problematic. For, in the nineteenth century, public health shifted radically in mission and constituency. It became less a means of maintaining the state, and more a means by which the state served its sovereign citizens with an (increasing) standard of health that they (increasingly) took as a right of citizenship.
The public health of human potential
The emergence of a public health that is not merely reactive or regulative, but that takes as its goal the reduction of rates of preventable mortality and morbidity, is a product of the eighteenth century. It is also one of the most remarkable changes of sensibility in human history. Its causes are complex and poorly understood. It clearly required the development both of knowledge of the problem and of the means to solve it. The concepts of preventable mortality and excess morbidity required being able to show that death and illness existed at much higher rates in some places than in others. Whilst there were a few attempts in seventeenth- and eighteenth-century Europe to determine local bills of mortality, they were too few to provide a basis for comparison. In contrast, by the late nineteenth century annual mortality rates were an important focus of competition amongst English towns. The central government’s public health officials, notably John Simon, chief medical officer of the Privy Council from 1857 to 1874, badgered towns with poor showings to analyse the reasons for their excess mortality and to take appropriate action (Brand 1965; Lambert 1965; Eyler 1979; Wohl 1983). By the end of the century, and during the twentieth century, reliable morbidity statistics were available to provide a better understanding of the remediable causes of disease. The gathering and analysis of such data has become a central part of modern public health.
The mission of prevention was also tied to a very real growth in knowledge of the means of prevention. The widespread adoption of inoculation and, after 1800, of vaccination for smallpox was the first clearly effective means to intervene decisively to prevent a deadly disease. Initially through the development of the numerical method and the cultivation of pathological anatomy in the Paris hospitals in the first decades of the nineteenth century, and subsequently through bacteriological and later serological methods, infectious diseases were distinguished and their discrete causes and vectors identified (Ackerknecht 1967; Bynum 1994). Such recognition led ultimately not only to the ‘magic bullet’ thinking of vaccine development; it also underwrote campaigns to improve water quality and provide other means of sanitation, and sometimes, as with tuberculosis and typhoid, programmes to identify, monitor, and regulate carriers.
Yet these factors alone cannot account for the widespread conviction that human health could, and must, be significantly improved. They are means, not ends. Whatever the symbolic significance of effective action against smallpox in boosting confidence, vaccination successes did not imply that all infectious diseases were amenable to a similar strategy. In most cases the new medical knowledge did not precede the determination to improve the health of all but was developed in the process of achieving that goal. A great deal of success was achieved despite quite erroneous conceptions of the nature of the diseases and their causes. The great sanitary campaign against urban filth (based on a vague and flexible concept of pathogenic miasms) is the best-known example.
Recognition of differential mortality was not new in the early 1800s, but it did not necessarily convey a need for action. That there was a mortality penalty associated with poverty, infancy, and urban living was clear; but some regarded the town as a necessary corrective to the overfecundity of the countryside, and characterized the poor as occupying a fixed station in life whose biological characteristics included, amongst compensating benefits (like less anxiety and a simple, healthy diet), higher mortality than the virtuous middle classes (though not necessarily than the profligate aristocracy) (Sadler 1830; Weyland 1969). And even humane and optimistic writers saw infant mortality rates of 25 per cent or more as providential (Roberton 1827). To the influential eighteenth-century Lutheran clergyman Christoph Christian Sturm, God’s providence was evident in the symmetry of the curve of mortality by age: mortality rates were high amongst the very young and very old, and low in between (Sturm 1832). This is in contrast with the modern sensibility, which admits no justifiable reason (beyond, perhaps, the climatic factors that determine the range of some disease vectors) for differential mortality or morbidity.
The age of liberalism: health in the name of the people 1790 to 1880
The rise of liberalism changed all this. Whilst ‘liberalism’ covered a wide range of philosophical, political, economic, and religious ideas, at its heart were notions of individual freedom and responsibility, and usually, of equality in some form. In 1890, when John Simon, the pioneer of English state medicine, surveyed progress in public health during the past two centuries in his English Sanitary Institutions, he included a lengthy chapter on the ‘New Humanity’. In it he covered the antislavery movement, the rise of Methodism, growing concern about cruelty to criminals and animals, legislation promoting religious freedom, the replacement of patronage by principle as the motor of parliamentary democracy, the introduction of free markets, the rationalization of criminal and civil law, and efforts towards international peace. Simon saw little need to explain how this concerned public health; he was sketching a fundamental change in ‘feeling’ that underlay changes in public health policy.
Society had become readier than before to hear individual voices which told of pain or asked for redress of wrong; abler … to admit that justice does not weight her balances in relation to the ranks, or creeds, or colours, or nationalities of men.
No longer were humans so much cannon fodder; the best policies were those which maximized ‘human worth and welfare’ (Simon 1890; compare with Pettenkofer 1941; Coleman 1974; Haskell 1985).
What Simon recognized was that with the granting of equal political and economic rights and responsibilities, it had become impossible to see health status as the birthmark of class, race, or sex. Nineteenth-century French and English liberals recognized that some—particularly women, children, or the poor—still suffered ill health disproportionately owing to the workings of the labour market, but they saw such consequences as incidental, accidental, and often, as temporary; in principle all had an equal claim to whatever version of human and health rights a society was prepared to recognize. As Simon also recognized, this change in feeling was both cause and product of the widening distribution of the political power it sanctioned.
And yet liberalism was no clear and compact doctrine, and its implications for public health were, and still are, by no means clear. Few of the pioneers of liberal political theory bothered to translate human rights into terms of health. They wrote mainly with middle-class men in mind, and saw the threats to life, liberty, and property as political rather than biosocial. The expansion (or translation) of political rights into rights to health was gradual, piecemeal (it has never been the rallying cry of revolution), complicated, and even fundamentally conflictual—it was and is not always the case that the choices free individuals make will be compatible with protecting the public’s health, or even their own. Concern with public health arose accidentally, and quite differently and at different times, in the developed nations. At the beginning of the twenty-first century an obligation to maintain and/or improve the health of all citizens exists only to varying degrees in the politics of developed nations.
Many early liberals found health rights hard to recognize because so much of public health had been closely associated with the medical police functions of an overbearing state. In revolutionary France the first instinct was to free the market in medical practice by abolishing medical licensing, a policy quickly recognized as disastrous for maintaining the armies of citizen-soldiers who were protecting the nation (Foucault 1975; Riley 1987; Weiner 1993; Brockliss and Jones 1997). Even after new, meritocratic and science-based medical institutions had been established, the cadre of public health researchers that they fostered—at the time the world’s leaders in public health epidemiology—found it difficult to conceive how their findings of the preventable causes of disease could be translated into proposals for preventive legislation. Poverty, and to some degree working and living conditions, were dictated by the market; government mandates would induce dependence or simply shift the problem elsewhere. Thus France was the scientific leader in public health for the first half of the nineteenth century without finding a viable political formula for translating that knowledge into prevention (Coleman 1982; LaBerge 1992).
In early nineteenth-century Britain the ideas of T.R. Malthus led a broad range of learned public opinion, liberal and conservative, to similar conclusions. Disease was amongst the natural checks that kept population within the margins of survival. Successful prevention of disease would be temporary only; it would postpone an inevitable equilibration of the food–population balance that would occur through some other form of human catastrophe (Dean 1991; Hamlin 1998). Malthusian sentiment blocked attempts to establish foundling hospitals. Notwithstanding the fact that such institutions were notoriously deadly to their inmates, it was felt that their existence encouraged irresponsible procreation—faced with full economic responsibility for their actions, men (or women, depending on how one viewed the prevailing legal arrangements for child support) would stifle their urges (McClure 1981).
By 1850, in both France and England it was no longer possible to maintain what for many was a complacent and convenient faith in the welfare-maximizing actions of a completely free society. A number of factors shattered this faith. Firstly, no government ever adopted the programme of the early nineteenth-century liberals in full. In central, eastern, and southern Europe the old concerns of state security continued to govern public health. In Sweden and later France, concern about a state weakened by depopulation fostered attention to the health and welfare of individuals. Secondly, working-class parties, while often generally sympathetic with political liberalism, saw no advantage in economic liberalism. Often they demanded adherence to the moral economy of the old order, which damped fluctuations in grain prices and backed up the working conditions that craft guilds had established. Thirdly, and most importantly, many liberals themselves arrived at what is properly called a biosocial vision, a concept of society which recognized that it was impractical, inhumane, and injudicious to impose economic and political responsibilities on people who were biologically incapable of meeting those responsibilities: liberty had biological prerequisites.
These considerations were central to debates in France and Britain in the 1830s and 1840s. Governments in both countries were apprehensive about revolution and wary of an alienated underclass, urban and rural, of people who could not be trusted with political rights and seemed immune to the incentives of the market. Such people represented a reservoir of disease, both literal physical disease and metaphorical social disease, that could infect those clinging precariously to the lower rungs of the respectable working classes. Reformers proposed somehow to transform these dangerous classes, usually with Bibles, schools, or experimental colonies. Such was the political background against which Edwin Chadwick (1800–1884), secretary of the English bureau charged with overseeing the administration of local poor relief, developed ‘the sanitary idea’ in the late 1830s (Finer 1952; Lewis 1952; Chadwick 1965; Richards 1980; Hamlin 1998). Chadwick justified public investment in comprehensive systems of water and sewerage on the grounds that saving lives—particularly of male breadwinners—would be recompensed in lowered costs for the support of widows and orphans. But he also suggested that sanitation would remoralize the underclass, and for many supporters this was its most important feature. Politically, sanitation was a brilliant idea, since every other general reform was deeply controversial: proposals for religion and education were plagued by sectarianism; calls to improve welfare by allowing free trade in grain (leading to lower food prices) ran afoul of powerful agricultural interests; proposals for regulating working conditions were unacceptable to powerful industrial interests. Notwithstanding complaints that towns should be permitted to undertake it in their own way and in their own good time, sanitation achieved remarkable popularity in nineteenth-century Britain as the locus of hope not just for improved health but, in general, for a prettier, happier, and better world.
In treating insanitation as the universal cause of disease, Chadwick hoped to establish a public health that was truly liberal. He sought to deflect attention from other causes of disease, such as malnutrition and overwork, for these were areas of great potential conflict between public health and liberal policy. For many, the liberty of the free (and, in the case of women, unmarried) adult to bargain in the market for labour without state intervention to limit hours or kinds of work was axiomatic. And the need for food was to be the spur for work and self-improvement. Interventions by what has recently been called a ‘nanny state’ seemed to imply an obligation to the state and to affirm the desirability of dependence and subjugation. There were grounds for such concern: the relations of political status to health were fraught with ambiguity. Frank had written passionately of misery as a cause of disease amongst the serfs of Austrian Italy, but had not advocated the elimination of serfdom. Virchow argued in 1848 that liberal political rights were the answer to typhus in Silesia, while in Scotland W.P. Alison argued on the contrary that too rigorous a liberal regime was the cause of poverty-induced typhus (Frank 1941; Rosen 1947; Weindling 1984; Hamlin 1998).
For about a generation, from 1850 to 1880, sanitation was unchallenged in Britain (and in much of its empire) as the keystone of improved health. Chadwick’s campaigns led to a series of legislative acts, beginning with the Public Health Act of 1848 and culminating with a comprehensive act in 1875, that established state standards for urban sanitation and a bureau of state medicine, staffed by medical officers in central and local units of government and charged with detecting, responding to, and preventing outbreaks of disease (Wohl 1983). Outside Britain, sanitation did not have the same purchase. While continental towns and states took on sanitary projects for a variety of pragmatic reasons, adopting eventually the English paradigm of a water-centred sanitary system, the sanitary idea did not dominate public health (Simson 1978; Göckenjan 1985; Goubert 1989; Labisch 1992; Münch 1993; Ramsay 1994; Hennock 2000; Melosi 2000). Continental states concerned themselves more with establishing networks of local medical officers and with controlling the transmission of contagious diseases through the regulation of travel and prostitution. Through the 1880s, the United States remained an exceptional case, coming closest to treating an individual’s health as a private matter alone. The national government maintained a system of marine hospitals along the coasts and navigable rivers, but less for controlling the spread of epidemics than for relieving ports of the burden of caring for sick seamen. In 1879 it established a National Board of Health to advance knowledge on public health issues of common import, but despite a superb research performance, it was scrapped within a few years on the grounds that public health was the business of the individual states (Duffy 1990). Often dominated by rural interests, many state legislatures had little enthusiasm for public health.
Louisiana, which established a state board of health to combat yellow fever, was an exception (Ellis 1992). Towns and cities were more active, but often only sporadically, taking steps when faced with epidemics. States that did establish boards of health usually focused on specific problems rather than on public health generally: in Massachusetts the allotment of pure water resources was a key issue; elsewhere it was food quality, care for the insane, vital statistics, or the threat of immigrants (Rosenkrantz 1972; Shattuck 1972; Kraut 1994). In Michigan concern about kerosene quality (it was being adulterated with volatile and explosive petroleum fractions) and arsenical wallpaper dyes spurred the establishment of a state board of health in 1873 (Duffy 1990).
1880 to 1970: the golden age of public health?
By the 1880s the classic liberalism of the first half of the nineteenth century was giving way to a resurgent statism. The European nations, the United States, and later Japan competed for colonies and international influence. If the newly liberated or the newly enfranchised had some claim to a right to health, they also had a duty to the state to be healthy. In most of the industrialized nations there was renewed interest in monitoring social conditions. While the emerging techniques of empirical social research gave this inquiry the aura of quantitative precision, the surveys disclosed little that was distinctly new about the lives or health of the mysterious poor, the usual targets of public health and social reform. Much of it seemed new, however, because it now registered as problematic. For example, the enormous contribution of infant deaths to total mortality had long been clear, but only towards the end of the century did infant mortality, persistently high even in relatively well-sanitized Britain, become a problem in itself as distinct from an indicator of sanitary conditions in general. The health conditions of women too, and of workers, began to command attention in a way that they had not done previously.
While these newly recognized public health problems partly reflected the changing distribution of political power, they also reflected anxiety about the nation’s vulnerability, and even the decadence of its population. Worried about the strength of their armies, states like Britain discovered in the 1890s that too few of those they would call up were fit to be mobilized, and they attributed the problem to a vast range of causes: poor nutrition (coupled with lack of sunlight in smoky cities), bad sanitation, bad mothering, and bad heredity (Soloway 1982; Pick 1989; Porter 1991a, 1999). Epidemics of smallpox following the Franco-Prussian War of 1870 and again in the 1890s disclosed the gaps in vaccination programmes (Baldwin 1999; Brunton, in press). The usual response was to redouble the state’s efforts to take responsibility for the immune status of its population. The persistence of syphilis registered at a new level of unacceptability (Brandt 1985; Baldwin 1999).
This led to an expanded public health, one highly successful in terms of reduced mortality and morbidity. It was undertaken jointly in the name of the state and the people, but it involved the regulation of an individual’s life—home, work, family relations, recreation, sex—that went beyond the medical police of the previous century. From a contemporary standpoint such intimate regulation of the individual by the state may seem overbearing, but, with some notable exceptions, the populations of developed countries accepted it as an appropriate and even desirable role for the state.
New diseases, or old diseases that were (or seemed) more prevalent or virulent, new institutions for the practice of public health medicine, and advances in medical and social science contributed to this new relation between states and people. During the 1860s a long-standing analogy of disease with fermentation matured into the germ theory of disease as the research of Louis Pasteur and John Tyndall made clear the dependence of fermentation on some microscopic living ferment (Pelling 1978; Worboys 2000). During the 1880s, primarily through the work of emerging German and French schools of determinative bacteriologists, it became possible to distinguish many microbe species from one another, to ascertain the presence of particular species with some degree of confidence, and therefore to link individual species with particular diseases (Bulloch 1938). Through serological tests developed in the succeeding decades, the presence of a prior infection could be determined, regardless of whether anyone had noticed symptoms. Notwithstanding the increasing recognition of the many ways microbial agents of disease were transmitted from person to person, the effect of the rise of the germ theory was to focus attention on the body that housed and reproduced the germ—for example, the well-digger working through a mild case of typhoid—even when there were alternative strategies (water filtration or, by the second decade of the twentieth century, chlorination) that protected the public reasonably well most of the time (Hamlin 1990). The general interest in the human as germ bearer and culture medium brought with it an emphasis on the labour-intensive business of case-tracing, of keeping track not only of those who showed symptoms of the disease but also those with whom they had contact (Winslow 1980; Coleman 1987). 
In the key diseases of typhoid fever, syphilis, and tuberculosis concern with the inspection and regulation of people was exacerbated by the recognition that not all who were infected were symptomatic. The case of ‘Typhoid Mary’ Mallon, the asymptomatic typhoid carrier who lived for 26 years as an island-bound ‘guest’ of the City of New York, is notorious, but it was also important in the working out of both legal limits and cultural sensibilities with regard to the trade-off between civil rights and public health (Leavitt 1996). Newly virulent forms of diphtheria and scarlet fever, deadly childhood diseases transmitted person to person or by common domestic media, also gave immediacy to decisive public health intervention.
Such monitoring could not have occurred without a large rank and file of local public health officers. It was during the late nineteenth century that public health was identified as a distinct division of medicine and that most of the developed countries solidified a reasonably complete network of municipal and regional public health officers: in Germany, the Kreisarzt; in France, the officier de santé; and in Britain, the Medical Officer of Health, assisted by the sanitary inspector. Increasingly these officers worked as part of hierarchical national health establishments to which they reported local health conditions and from which they received expert guidance. While preceding generations of public doctors had often been drawn from the ranks of undercapitalized young doctors, beginning in the mid-1870s many were specially trained and certified for public health work (Novak 1973; Watkin 1984; Acheson 1991; Porter 1991b). A commitment to public health was increasingly incompatible with ordinary medical practice, not so much because of its specialized knowledge, but because it was built upon a quite different ethic. There had long been economic tension between public and private medicine in areas of practice like vaccination, in which public authorities either took over entirely or inadequately compensated private practitioners for services that had traditionally been part of the ordinary medical marketplace (White 1991; Brunton, in press). But monitoring healthy carriers and those who might be susceptible to disease introduced a new regime of medicine—one which responded to an ethic of public good, even if there were no client-defined complaint. Effectively, bacteriology, epidemiology, and associated measures of immunological status redefined disease away from patient complaint. The healthy carrier might see no need to seek medical care, but to the public health doctor that person was a social problem.
On occasion private doctors were appealed to for a diagnosis (bronchitis, pneumonia) that would protect one from the health officer’s diagnosis of tuberculosis, which would bring loss of employment and social stigma (Smith 1988).
Rivaling the germ theory as the major motif of public health thinking from the 1890s to the 1930s was the application of the emerging science of heredity to the improvement of the human populations, the science and practice of eugenics (Paul 1995; Kevles 1995). Whether or not eugenic concerns were the source of the greatest anxiety about the public’s health is debatable, but they were the locus of the greatest hope for health progress, the home of a residue of utopianism that had coloured the medical police and sanitary literature. Even more than other forms of public health, eugenics exposed a class, and sometimes a racial, division that had long been a part of public health: much of public health practice was predicated on a distinction between those, usually the poor, who were seen to represent the objects of public health efforts and those, often the well to do, who authorized intervention, whether to improve the lot of the poor, to protect ‘society’, or perhaps even to block the physical or moral contagia that might infect their own class (Kraut 1994; Anderson 1995). Eugenics appealed mainly to those with wealth and power: those others who were to improve their lot rarely identified heredity as the source of their unfortunate circumstances.
Such an attitude is reflected in the most infamous application of the eugenic viewpoint, the attempt by Nazi Germany to exterminate Jews and other ‘races’ regarded as inferior and unfit not only to intermarry with so-called ‘true Aryans’, but even to survive. While historians’ views of the origins of the Holocaust differ, some of the immediate precedents for a state policy of negative eugenics
