<b>A Voice of Reason</b> - In an Age of Emotion... - Carolyn J. Blakelock<br />
<br />
<b>Origins of the Cold War</b> (2012-10-21)<br />
<div dir="ltr" style="text-align: left;" trbidi="on">
With the 50th anniversary of the Cuban Missile Crisis I thought I would publish an essay that I wrote back in 2001. <br />
<br />
In 1914 H. G. Wells’ novel The World Set Free was published. While its literary quality at times takes a back seat to the message that Wells was trying to promulgate, it was in this work that he coined the name ‘atomic bomb’, depicted its discovery and nature, and described its use in a world war. The story is told from the perspective of the 1970s and describes a world war that occurred in the 1950s. In an interesting intersection of fiction and reality, Wells’ fictional scientist Holsten first achieves artificial atomic disintegration of bismuth in 1933, and it was in 1933 that the Joliot-Curies first produced radioactive phosphorus by bombarding aluminum with alpha particles. Upon hearing of their work, Leo Szilard immediately knew what it meant, because he had read Wells. With the dropping of the atomic bombs on Hiroshima and Nagasaki, a year before Wells’ own death, one war was ended, and a new one begun.<br />
<br />The Cold War may have officially begun with the dawning of the nuclear age, but its origins lie further in the past than those pivotal acts. In his 1994 book, The Specter of Communism, Melvyn P. Leffler concentrates on the geopolitical events of the years 1917-1953 in his quest for those origins. What began as an ideological clash following the Bolshevik revolution turned into a more aggressive anti-Communism in the 1940s with Stalin’s selling of raw materials to the Nazis and the 1941 Soviet nonaggression pact with Japan. Stalin’s direction to Communist parties abroad to end their alliances with antifascist parties, and the resulting actions of the American Communist Party, further spurred anti-Communist feelings in the U.S.<br />
<br />Fear of Communism did not, however, translate into a fear of the Soviet Union, especially when compared to the Nazi threat. In a realization of the axiom that politics sometimes makes for strange bedfellows, the Allies found themselves in a wartime alliance with the Soviet Union against Germany. But the alliance was never an easy one. Each side had its own agenda and reasons for engaging in it, and after the war it was not long before it began to unravel under the forces of mutual distrust and misapprehensions. As the fragile alliance crumbled the specter of Communism raised its head once more and the ideological clash was transformed into a power struggle that would continue for over forty years and leave very few aspects of our world untouched.<br />
<br />While the brevity of the work precludes an in-depth analysis of the complexities of the origins of the Cold War, Leffler does point out the role that misapprehensions played in the early years of its waging when the stakes were being identified and the battle lines were being drawn. As the title implies the image that the United States had of Communism was a combination of fantasy and reality, an apparition of frightening demeanor. The Communist versus capitalist, autocracy versus democracy, bipolar scenario did form a convenient framework for articulating the conflict, but the world is seldom, if ever, that black and white. If, at times, his portrayal of the Soviet Union seems a trifle naive or simplistic, perhaps that merely reflects the naivete with which it was perceived in popular American culture. As the blurbs on the back indicate, his book is a good introduction to the subject, but it is not the whole story.<br />
<br />In fact, as John Lewis Gaddis points out in his book We Now Know, the whole story has not yet been told, and could not be told before that war was over and the archives of the players opened and their secrets revealed. This 1997 book is one of several to appear in recent years that examine the Cold War from a perspective that includes not only the knowledge of how it ended, but also how some of the other players perceived its waging. Any one of his central chapters could easily be turned into a book in its own right, and some of them already have been. As with Leffler, Gaddis is covering the early years of the Cold War, but his coverage extends for another ten years, stopping at the Cuban Missile Crisis rather than the death of Stalin. The greater length of his book allows for a more detailed treatment of the origins of the Cold War, and the extended scope results in greater coverage of the use of the threat of nuclear weapons in its waging and also the impact of the conflict on the Third World, where a new Great Game was being played out.<br />
<br />This is a work of synthesis, and an attempt on his part to relate what we now know about the Cold War to what we thought we knew, and as such it is interesting to read it in conjunction with Gaddis’ earlier work The United States and the Origins of the Cold War 1941-1947, published in 1972. With access to foreign sources opening up and the continuing declassification of documents, individuals that perhaps seemed a bit flat in Leffler’s book gain depth and complexity in Gaddis’. And with this added understanding of the players, the events of the Cold War become more understandable, if no less tragic.<br />
<br />Lack of access to these sources, however, does not mean that you could not write a penetrating historical analysis of the Cold War, even before that war was over, as Barton J. Bernstein’s American Foreign Policy and the Origins of the Cold War amply demonstrates. Written in 1970, this work was considered to be somewhat radical in its interpretation, but in light of the sources available in a post-Cold War, post-Soviet world, it now comes across as being rather perceptive. Perceptive, too, was Henry Wallace’s assessment of how the postwar actions of America might be seen by other nations (in the letter to Truman, quoted on pp. 381-82), and one can’t help wondering what would have happened had Truman been more receptive to Wallace’s analysis. Unfortunately, Wallace was fired for publicly criticizing Truman’s foreign policy. With the firing of Wallace, the way was opened for George Kennan, a respected foreign service officer who wound up providing the intellectual power behind the Policy Planning Staff and formulating the policy of containment that would dominate U.S.-Soviet relations. The story of his influence on U.S.-Soviet relations is ably told in Wilson Miscamble’s 1988 book George Kennan and the Making of American Foreign Policy, 1947-1950.<br />
<br />Bernstein’s analysis of Truman’s foreign policy also provides a nice counterpoint to the analysis put forth in Leffler’s book A Preponderance of Power : National Security, the Truman Administration, and the Cold War, which examines the grand strategy of the Truman years. In this in-depth analysis he is seeking to answer what he considers to be the perennial questions of the Cold War regarding US-Soviet relations, the spread of the conflict to the Third World, the arms race that it spawned and especially the question of whether or not the U.S. policy was wise or foolish. He began the research for this book in 1979, and the book itself was published in 1992, so perhaps the optimism of its conclusions is not too surprising, considering the events that were taking place in the world during that time.<br />
<br />In contrast to Leffler’s optimism is the much more critical look at American foreign relations given by William Appleman Williams in his 1959 book The Tragedy of American Diplomacy, in which he broke with the traditional view of America as isolationist and argued that although we may not have had an empire in the sense of the French or the British, our economic policy did reflect an imperialist motivation. In an argument since echoed elsewhere the conflict between Communism and capitalism becomes a competition for economic markets and influence.<br />
<br />For over forty years the struggle between the United States and the Soviet Union dominated world politics and dictated the economic priorities in both nations, and as we look beyond its origins to its eventual outcome we must also come to terms with its enduring legacy. Perhaps the most prominent symbol of that legacy is the nuclear arsenal that many analysts insist prevented the Cold War from heating up. Regardless of what is finally determined about the veracity of that claim, living under the threat of nuclear annihilation in the form of mutual assured destruction for almost half a century has had a lasting impact upon our culture. From the 1950s craze for bomb shelters (my brother John lives in a house that contains one) to the sudden upsurge in the sighting of UFOs (a phenomenon that Carl Jung attributed to the uncertainty of living under the threat of nuclear annihilation), the images of nuclear fear have permeated our society. It is no coincidence that Japan came out with the Godzilla movie, featuring a monster created from the effects of atomic bomb radiation, and who can forget Stanley Kubrick’s dark satire Dr. Strangelove?<br />
<br />As the images of the Cold War permeated our society the lines between fiction and reality continued to blur. The war was fought in many ways and on many fronts. It was fought with secrets, threats and images and, as in most conflicts, propaganda and control of information were important factors. As the conflict stretched out over decades, evolving into a stalemate, the conception of its nature crystallized into a battle between good and evil, light and dark, although which side was which was surely a matter of perspective.<br />
<br />But as we penetrate that history and gain the perspective that only the distance of time can provide, we may find that distinction failing, as it usually does in any conflict. In its effort to contain the Soviet Union the United States interfered in the internal affairs of other countries, supported dictatorships, and even conspired in the assassination of heads of state and opposition leaders. The irony that many of these actions were carried out under the aegis of promoting democracy was not lost on the peoples of the countries in whose affairs the U.S. meddled, and in a post-Cold War world the dragon’s teeth that were sown during its waging are continuing to sprout, further complicating the enduring legacy of this U.S.-Soviet conflict.<br />
<br />
<b>Further Readings on the Cold War</b><br /><br /><b>John Lewis Gaddis, The United States and the Origins of the Cold War, 1941-1947, Columbia University Press, 1972</b><br /><br />A history of U.S. policy toward the Soviet Union during and immediately after World War II. Attempts to examine the many forces - domestic politics, bureaucratic inertia, individual personalities of the major players, as well as the perceptions of the intentions of the Soviets - that influenced the key decisions being made.<br /><br /><br /><b>Melvyn Leffler, A Preponderance of Power: National Security, the Truman Administration, and the Cold War, Stanford University Press, 1992</b><br /><br />An in-depth analysis of Truman’s “grand strategy” that attempts to answer the big questions of the Cold War regarding the formation of U.S. foreign policy, written even as that war was coming to an end. One can’t help wondering if the euphoria of that sudden victory affected the tone of his conclusions.<br /><br /><br /><b>Wilson Miscamble, C.S.C., George Kennan and the Making of American Foreign Policy, 1947-1950, Princeton University Press, 1988</b><br /><br />A study of George Kennan’s influence on foreign policy as the head of the Policy Planning Staff, his formulation of the policy of containment, the implementation of the Marshall Plan and the formation of NATO.<br /><br /><br /><b>William Appleman Williams, The Tragedy of American Diplomacy, Norton, 1959</b><br /><br />A revisionist history that presents the U.S. as a tough, sometimes ruthless, promoter of its own economic power and influence. Points out that even if the U.S. did not have an empire in the sense that the British and French did, its economic policy was a form of imperialism.<br /><br /><br /><b>John Lewis Gaddis, The United States and the End of the Cold War: Implications, Reconsiderations, Provocations, Oxford University Press, 1994</b><br /><br />A look back that asks, now that the Cold War is over, what’s next? Includes an interesting reassessment of John Foster Dulles and Ronald Reagan and new interpretations of how America waged the Cold War, including the role of morality, nuclear weapons and espionage.<br /><br /><br /><b>John Lewis Gaddis, Strategies of Containment: A Critical Appraisal of Postwar American National Security Policy, Oxford University Press, 1982</b><br /><br />In-depth assessment of American post-war foreign policy, focusing on George Kennan’s policy of containment. But he goes beyond its formulation to claim that American leaders misunderstood Kennan’s intentions, resulting in policy actions that Kennan did not approve of.<br /><br /><br /><b>John Lewis Gaddis, Jonathan Rosenberg, Ernest R. May & Philip H. Gordon, eds., Cold War Statesmen Confront the Bomb: Nuclear Diplomacy since 1945, Oxford University Press, 1999</b><br /><br />Attempts to answer one of the most debated questions of the Cold War (did nuclear weapons prevent World War III?) by examining the careers of 10 Cold War statesmen - Harry Truman, John Foster Dulles, Dwight D. Eisenhower, John F. Kennedy, Josef Stalin, Nikita Khrushchev, Mao Zedong, Winston Churchill, Charles de Gaulle, and Konrad Adenauer - and their perceptions of war in light of nuclear weapons.<br /><br /><br /><b>David Holloway, Stalin and the Bomb: The Soviet Union and Atomic Energy, 1939-1956, Yale University Press, 1994</b><br /><br />A penetrating account of the development of the Soviet atomic bomb, and Soviet nuclear policy, covering the war years and the origins of the Cold War.
Holloway draws upon sources only recently available, and his glimpse of America through Soviet eyes, specifically Stalin’s, is absolutely fascinating. Equally fascinating is the insight he provides into the culture of the Soviet Union during this period.<br /><br /><br /><b>Hugh Gusterson, Nuclear Rites: A Weapons Laboratory at the End of the Cold War, University of California Press, 1996</b><br /><br />An anthropologist’s look at the culture of Lawrence Livermore National Laboratory, the place that, along with Los Alamos National Laboratory, made the nuclear warheads that may or may not have preserved the peace. Remember the military-industrial complex? This is an important part of it, and an important legacy of the Cold War.<br /><br /><br /><b>Spencer R. Weart, Nuclear Fear: A History of Images, Harvard University Press, 1988</b><br /><br />As the title implies, Weart is examining the psychological aspects of the nuclear legacy of the Cold War. While his analysis is a bit too Freudian for my tastes, it makes for some fascinating reading, and will probably make you take a second look at some of those old monster movies.</div>
<b>Opera in America</b> (2012-04-07)<br />
<div dir="ltr" style="text-align: left;" trbidi="on">From 2002 to 2004 I lived in a loft apartment in Center City Philadelphia. I had always wanted to live the life of a city sophisticate, and I finally had the chance. It was a lot of fun, but it was expensive. One of the many cultural activities that I enjoyed while living in the city was the Opera Company of Philadelphia. I had season tickets for a seat in one of the Proscenium Boxes. It gave me an excuse to sew elegant evening gowns, make matching jewelry, and wear the beaver coat that I inherited from my grandmother. Even after I moved down to Virginia I maintained my box seat until they stopped offering the Saturday evening performance. I loved going to the opera, but reading the surtitles was always awkward.<br />
<br />
Then one day I saw <a href="http://en.wikipedia.org/wiki/La_Grande-Duchesse_de_G%C3%A9rolstein">The Grand Duchess of Gerolstein</a> performed in English. It was a revelation. Why, I asked myself, don't they perform all operas in America in English? It was hilarious, especially when the Grand Duchess snagged her costume on the scenery during one of her entrances and was briefly stuck. When she pulled free she was trailing about 8 feet of trim, which the General promptly stepped into. She actually lost it, doubling over in laughter, while the General gamely persevered. It was then that I realized that forcing us to read translations vastly diminished our enjoyment of the performance for the simple reason that you can't really listen and read at the same time. And you definitely can't watch what is going on on the stage while you are busy reading the surtitles.<br />
<br />
So why don't the folks that produce operas in America translate them into English? Is it some outmoded idea of remaining faithful to the original? Or of maintaining the purity of the work? Do they think it is too hard? Or do they simply not think of it at all? As I watch the Metropolitan Opera struggle to keep going, it occurs to me that presenting opera in America in English could revitalize the art form and revitalize the Met. And why stop with translating operas into English? Why not make it a practice to translate the opera into the native tongue of whichever country it is being performed in? After all, there is nothing magical about the language of the original; it just happened to be the language of the librettist. I have a feeling the Wagner fans will think I am a heretic for saying that.</div>
<br />
<b>A New NASA</b> (2011-07-10)<br />
I grew up reading science fiction, watching Star Trek, and dreaming of a future in space. I became an Aerospace Engineer, studied orbital mechanics and control systems, and later I earned an MS in Physics (Observational Astronomy). I was starstruck. Needless to say, I have been disappointed with our lack of progress in space exploration.<br />
<br />
The Apollo program was a Cold War effort motivated by beating the Russians, and while it was an amazing feat of technological development, it did not leave us with any sustainable capability. The original concept of the Space Shuttle would have been much better. It would have been totally reusable, with a flying delta-winged booster carrying the shuttle up before detaching and returning to Earth. But Congress nickel-and-dimed NASA to death, literally. If that original design had been used, there would have been no solid rocket boosters to blow up and no external tank to shed insulation onto vulnerable heat tiles. But when Atlantis touches down, we won't even have that capability anymore.<br />
<br />
In more recent years, there have been efforts at the privatization of space exploration, and I cheered when SpaceShipOne won the X Prize, but more needs to be done, and it is still hard to get the financial resources needed to create the infrastructure that a sustainable presence in space requires. NASA still possesses remarkable facilities and, despite their brain drain, they still have a lot of very smart people working for them, but they are being run by bean counters who lack vision and are being strangled by the Government. They need to break free, create a vision for sustainable space exploration, and take their case to the American people, perhaps even to the world. There is an organization called <a href="http://www.kickstarter.com/">Kickstarter</a> that allows people who need financial backing to reach out to the public in an effort to gain that backing. I recently supported an independent publisher through Kickstarter, and it has occurred to me that this could be a new model for a number of endeavors, including space exploration.<br />
<br />
Another possible method of funding would be to allow taxpayers to actually have some discretion as to where their tax dollars go. I read a short story many years ago that used this as the basic premise (it was a Christmas story, and I think it was called <i>World Peace</i>). When the people in that world filed their tax returns, they could go through all the possible programs and select those that they wanted to support.<br />
<br />
Of course, if our government is really serious about turning our economy around and creating jobs, they could just increase NASA's budget instead of spending money paving roads that don't really need it. Economic studies (see this <a href="http://er.jsc.nasa.gov/seh/economics.html">one</a>, for example) have shown the benefits of spending on space (as opposed to spending on defense). And for those who think that NASA gets lots of money already, they don't. NASA's budget for 2011 is 19 billion dollars, which is less than 1 % of the Federal Budget.<br />
<br />
And for the skeptics out there, it is already being done:<br />
<a href="http://www.nytimes.com/2011/07/12/science/12crowd.html?hpw=&pagewanted=all"><span style="font-size: small;">Scientists Turn to Crowds on the Web to Finance Their Projects</span></a><br />
<br />
<b>Paul Weindling, “A Virulent Strain: German bacteriology as scientific racism, 1890-1920”</b> (2011-05-24)<br />
<div dir="ltr" style="text-align: left;" trbidi="on"><i>Race, Science and Medicine, 1700-1960</i>, Waltraud Ernst & Bernard Harris, eds. London: Routledge, 1999, pp. 218-34.<br />
<br />
This book is a collection of essays centering on the issue of race in science and medicine. This particular chapter argues that bacteriology became racialized in reaction to transmigrants crossing from the East in the 1890s. This racialization of disease became even more pronounced during the German occupation of eastern territories during World War I.<br />
<br />
By the time of the 1892 cholera outbreak in Hamburg, epidemics were seen as belonging to a more primitive time, when Europeans were considered to be more or less on the same cultural level as the ‘colonial’ races. But even as outbreaks of cholera were becoming increasingly rare, bacteriologists were aware of other ‘Asian’ diseases that were threatening the European races. Leprosy was on the rise on the Baltic fringes of Germany, and there were fears about the importation of typhus, smallpox and the plague by transmigrants from the East traveling to Ellis Island.<br />
<br />
As bacteriological knowledge increased, the identification of pathogens with diseases resulted in a more objective specificity, but it also opened the door for the possibility that susceptibility to a pathogen was a racial attribute. If this could be shown, it could be used to give an objective scientific basis to the notion of different human species. In cases where the animal vector (such as the louse) was identified before the pathogen, anyone infested with the animal vector was considered a threat to the overall population, and it was easy to conflate the ethnicity of the carriers with the contagion.<br />
<br />
In response to the threat of disease from transmigrants, medical stations were set up on Germany’s eastern borders that inspected and disinfected transmigrants. These stations did not, however, have a consistent policy for such cleansing, nor did they necessarily have adequate facilities. Policies could be quite draconian and dehumanizing, and the attendants were often coarse and ill-mannered. Women and men were separated, breaking up families, and sick children were often removed to distant hospital facilities, with no information provided to the relatives, and no visitation by family members allowed. Although there was no official interest in religious background (religion was not generally recorded) the German and American press sensationalized the idea that Eastern European Jews were importing infections. Public prejudice against Russian Jewish refugees increased during the 1890s, and the stereotype of Eastern Europeans as living in filth and squalor was reinforced.<br />
<br />
With German and Austro-Hungarian occupation of eastern territories during World War I, the situation only worsened. Occupying troops enforced sanitation standards upon the population, often targeting specific groups within that population (e.g., Polish Jews, Serbian Muslims). Anti-lice pamphlets, prepared in Yiddish with the help of rabbis, urged the cutting of hair, the shaving of beards, and the burning of the (infested) wigs of Orthodox women, but they were not effective. Some attributed this lack of effectiveness to the ‘primitive’ religious culture of the Jews. In turn, the local communities resented this intrusion into their private lives and viewed the delousing installations with hatred, even burning some down. The Germans compiled lists of Jews who were to be forcibly washed and deloused every week, and closed shops if their owners refused to be deloused.<br />
<br />
Labor shortages during the harvest led to the importation of large numbers of Eastern European workers into Germany. Although the workers were deloused, the delousing was not effective, and in 1917 there was a severe typhus epidemic in Warsaw, which aroused further racist hostility towards Jews.<br />
<br />
In contrast to the attitude of German and Austrian medical officers in Poland, the Austrian medical officers in Serbia prided themselves on respecting local religion. They used Serbian women to inspect Muslim women for typhus. They viewed themselves as “apostles of civilization” (p. 230), and the overall tone of their actions was more moralistic and religious than racial.<br />
<br />
Unfortunately, by 1918 the prejudice of German and Austrian authorities against Polish Jews as carriers of typhus had increased to the point that refugees arriving in Vienna were held in concentration camps, under atrocious conditions that caused deaths among the inmates. Medical officers believed that Polish Jews constituted an epidemic risk, and the Reich authorities closed the borders to these workers. They were vilified as immoral, lazy, opportunistic, dirty and unreliable.</div>
<br />
<b>Beatrice Webb</b> (2011-04-18)<br />
<div dir="ltr" style="text-align: left;" trbidi="on"><i>My Apprenticeship</i>, AMS Press, New York, 1977 (reprint of the 1926 London edition published by Longmans, Green & Co.)<br />
<br />
For Beatrice Webb (née Potter) the underlying controversy of life is the struggle between the Ego that affirms and the Ego that denies, and it is upon the course of this controversy that the attainment of inner harmony and consistent conduct in personal and public affairs rests. For Beatrice this debate was resolved into two questions: Can there be a science of social organization, analogous to mechanics or chemistry, that would enable mankind to forecast what will happen in society and allow us to alter those events? And, if there is such a science, is science all we need? Or do we also need religion? This book is a tentative attempt to answer those questions and describes her journey towards socialism, the Fabian Society and her marriage to Sidney Webb.<br />
<br />
She concludes, finally, that society is a vast laboratory in which experiments in human relationships are constantly being carried out, consciously or unconsciously, and that to survive and prosper we should equip ourselves with the knowledge of how things happen, and that this knowledge can only be obtained by persistent research into the past and present behavior of humanity. But knowing how things happen does not settle the question of what ought to happen, nor should it, because with regard to that question science has no answer. Answering the question of ‘ought’ depends upon human values, which alter from society to society and over time.<br />
<br />
For Beatrice, answering the question of ‘ought’ led her to socialism. Her research in the East End revealed to her the physical misery and moral debasement that was the legacy of the rack-renting landlord and the capitalist profit-maker of nineteenth-century commerce and industry. Some of these ills (low wages, long hours, unsanitary working conditions) she felt could be remedied by appropriate legislative action and pressure from the Trade Unions. This meant a move from early Victorian individualism to an all-pervading control, in the interest of the community, of the economic activities of landlords and capitalists.<br />
<br />
But even if this regulation did succeed in alleviating the worst injustices of the capitalist system, there still must be some way to ensure a minimum state of civilized existence for every citizen via some form of socialism that would provide public education, public health, public parks, and public provision for the elderly and the ill, and some form of support for the involuntarily unemployed, paid for out of rates and taxes.<br />
<br />
To address what she considered the psychological evil of a society divided into the haves and have-nots, or the rich and the poor, a schism that would not be remedied by a rise in wages, as the United States demonstrated, she recommended an alternative to the modern business model based upon the co-operative movement. In the co-operative she saw the invention of a new type of industrial organization in which an industry was governed by the community of consumers for the common benefit of those consumers. To this organization she wished to add Trade Unions or professional societies, whose purpose it was to protect personal dignity and individual freedom by giving workers the means to participate in the administration of their trades and services.</div>
<br />
<b>Social Control in Nineteenth Century Britain</b> (2011-03-20)<br />
<div dir="ltr" style="text-align: left;" trbidi="on">ed. by A. P. Donajgrodzki. London: Croom Helm, 1977
‘Social Police’ and the Bureaucratic Elite: A Vision of Order in the Age of Reform (pp. 51-76).<br />
<br />
This book is a collection of essays concerned with the application of the concept of social control (borrowed from sociology) to the study of relationships between the classes in nineteenth century Britain. Although the contributors have different perspectives on social control, all share a fundamental assumption that social order is not only maintained through legal systems (police and prison) but is also expressed through a wide variety of social institutions, both formal and informal. The book purports to be the first collection of historical essays to make use of this concept.<br />
<br />
The essay under consideration here, written by Donajgrodzki, is concerned with examining the common foundations in the thought of Hugh Tremenheere, a traditionalist, and Edwin Chadwick, a Benthamite (in fact he was Bentham’s amanuensis and a devoted adherent). Both men, he claims, approached the problems of social control from the perspective of social police. This perspective was characterized by the belief that it was a common morality that produced social order, so that any policy aimed at maintaining it would have to take into consideration not just the legal systems, but also religion, morality, education, leisure activities and even housing and public health. It further held that if left to themselves, the poor were liable to be led astray, that is, they were normless. They were, perhaps, like errant children who do not know any better and must be guided in their moral development as well as in the everyday acts of life.<br />
<br />
Donajgrodzki believes that the notion of social police may be an adaptation and intensification of pre-industrial beliefs about the proper relationship between the classes and the control of the poor. In some parts of the country civil authority was matched by ecclesiastical authority, and this permitted extensive scrutiny of the lives and behavior of the poor and the possibility of social control through the power and influence of the clergy. With industrialization the poor often gained some measure of economic independence, but this independence was not seen to carry over into a right to individualism, which was seen as incompatible with social order. Industrialization tended to intensify the feeling that the poor needed to be guided and taught, and led to speculations about how this could best be achieved.<br />
<br />
Hugh Seymour Tremenheere, the first mines inspector, was a traditionalist whose duties included reporting both on the technical aspects of mining and on the state of the people in the mining districts. He felt that the way to maintain social order was to create a controlling and sustaining environment in which all factors, even the most trivial, were carefully considered. His was a theory of reciprocal obligation: employers had a moral obligation to their employees, and the interests of both were the same. He did not fear the intellectual and moral development of the poor and felt that it would contribute to social stability, because once they had been properly educated the poor would understand the nature of the proper relationship between themselves and the rich. Industry would thus play the leading role in creating the proper environment for the working poor, with the state merely ensuring that the socially destructive practices of industry were curtailed. The state should also contribute to the social welfare of the people by increasing the numbers of schools and staffing them with appropriate role models. He also wanted an increase in the number of clergy, seeing them as front-line enforcers of proper social behavior.<br />
<br />
Whereas Tremenheere approached the problem of social control from a paternalistic perspective, Chadwick approached it from a Benthamite one. They both saw order as being the product of a variety of social processes and thought that it was attainable only if the poor were watched over and guided. But Chadwick believed that harsher and more coercive measures of enforcement were as important as the benevolent provision of the proper environment. And unlike Tremenheere, he felt that the state should take a much more active role in creating a systematic, humane and efficient social police. The role of the police was to include not just the apprehension of criminals but also the supervision of public leisure and the enforcement of public health measures. To offset their role as enforcers, police should also take on humanitarian and benevolent roles in a community, such as acting as firemen. He felt that Tremenheere placed too much emphasis on the role of the church and that he was not hard enough on the trade unions, which Chadwick saw as a disruptive force. While he is often remembered for his advocacy of state intervention, he was also enthusiastic about a paternalistic role for the industries for many of the same reasons that Tremenheere was.</div>
<br />
<b>Sick and Tired of Being Sick and Tired</b> (2011-02-21)<br />
<div dir="ltr" style="text-align: left;" trbidi="on">Susan L. Smith, <i>Sick and Tired of Being Sick and Tired: Black Women’s Health Activism in America, 1890-1950</i>, Philadelphia: University of Pennsylvania Press, 1995.<br />
<br />
As the subtitle of this book implies, it is about the role of black women in black health care. The time period under consideration, 1890-1950, was a time of legalized segregation, but it was also a time when the American welfare state was expanding. Unfortunately those benefits generally did not cross the color line. In response to this, and as part of the political agenda for black rights and equal access to government resources, black activists attempted to draw attention to black health issues.<br />
<br />
The creation of a black health movement began as a private crusade instituted primarily by black club women. These women constructed the infrastructure of their communities through their work in religious and secular groups, groups that included not only church associations, but also female auxiliaries and women’s clubs. These clubs started day nurseries and kindergartens. They opened working girls’ homes in the North and the Midwest to help young black migrants from the South with housing, employment information, and moral instruction. But because segregation and racism prevented African Americans from getting even the most basic health care, these clubs focused most of their interest on public health work. Despite personnel and monetary limitations, they provided health education and some basic health services to impoverished communities and in Atlanta and Chicago they tried to provide African Americans with the same basic urban amenities that white communities received as a matter of course via tax-supported city services.<br />
<br />
In 1915 these reform efforts became part of a national black health movement when Booker T. Washington launched a health education campaign from the Tuskegee Institute in Alabama. This campaign, known as National Negro Health Week, was seen by black leaders and community organizers as a way for advancing the race through the promotion of black health education and cooperation across racial lines. The Tuskegee Institute served as the headquarters for the campaign until it was taken over in 1930 by the United States Public Health Service (USPHS) and turned into a year-round program.<br />
<br />
In the 1930s the statistical information newly available revealed the plight of black Americans in the form of higher mortality and morbidity rates as compared to the white population. Growing awareness of the problem among health officials did not necessarily lead to better health treatment for blacks, but rather led white officials to blame the African Americans themselves for their illness by saying that it was due to their behavior and, in the case of venereal disease, to their sexual immorality and promiscuity. In response to these accusations, black leaders countered that a population was only as healthy as its sickest members and called for an end to racist practices and the integration of health services, seeing these measures as the only real solution for the health issues facing black Americans.<br />
<br />
By the 1940s a medical civil rights movement had arisen as black health workers struggled to integrate hospitals and medical and nursing schools and associations. The effort was met by resistance within both the white and black communities. But in 1950 the USPHS pronounced the end of the National Negro Health Movement and the Office of Negro Health Work on the grounds that the nation was moving towards integration.</div>
<br />
<b>The Impact of the Plague in Tudor and Stuart England</b> (2011-01-30)<br />
<div dir="ltr" style="text-align: left;" trbidi="on"><i>by Paul Slack, Routledge & Kegan Paul, 1985</i><br />
<br />
When he wrote this book Paul Slack (at the time a Fellow and Tutor in Modern History at Exeter College in Oxford) was more interested in the social response to the disease than in the disease itself, devoting almost half of the book to this subject. But he also realized that to understand the social response he needed to understand aspects of the disease itself, such as frequency of occurrence, which social groups and locales were affected and the mortality rates. The addressing of these questions occupies the first half of the book. The time period that he is covering, as indicated by the title, is the sixteenth and seventeenth centuries.<br />
<br />
Part I of the book gives us an introduction to the disease and its manifestations, as well as an overview of its impact on society and the attitudes and actions that resulted. The cause of the plague was not known, and it was attributed to both natural and supernatural agents. Similarly, the treatment of it incorporated both natural and supernatural elements. Books and pamphlets were published that included both herbal remedies and prayers. Special sermons were preached during plague outbreaks, and sometimes plague fasts were held. Diagnosis was not exact and there were other diseases prowling the populations such as typhus, the sweating fever and malaria, leading to further confusion and uncertainty. The plague itself was manifested in several ways. There was the bubonic plague with its carbuncles, buboes and spots, which sometimes occurred in a mild form, without marked visible symptoms. A more deadly variant of the disease was septicaemic plague, in which the bacilli invaded the blood stream, causing death before the external symptoms of plague had time to appear. A third variety is pneumonic plague, which may begin as a case of bubonic plague that becomes complicated by pneumonia. This latter variant changes the disease vector from fleas to humans as the bacteria is coughed out in the sputum of the victims and inhaled by the people around them. It was highly contagious, had a shorter incubation period than bubonic plague, and left untreated was almost 100 percent fatal.<br />
<br />
Along with the fear and uncertainty that an outbreak of plague invoked, it also placed a much more practical strain on the society in the loss of its members, sometimes in large numbers (a quarter to a third of a town’s population). This depressed the economy as well as straining the infrastructure as those as yet untouched by the disease struggled to deal with the dead bodies that needed to be disposed of as quickly as possible. Knowledge of the plague was passed down essentially unchanged from the time of the Black Death. The first medical book printed in English was a Little Book on plague, published in 1486 probably as a result of an outbreak of the sweating sickness. Outbreaks of diseases often seemed to inspire the printing of books on the plague, and in the second half of the sixteenth century a growing number of them were religious tracts and sermons.<br />
<br />
Part II of the book examines the frequency and severity of outbreaks using parish records and the number of wills probated as indicators of the presence of plague. Slack examines the records of Essex and Devon counties in an attempt to understand what kinds of communities were most likely to be affected by the plague. From there he moves on to the urban settings of Exeter, Bristol and Norwich and then to the metropolitan setting of London. From his case studies in the counties of Essex and Devon he draws two conclusions: 1) bubonic plague could cause a greater number of mortalities in a shorter time span than any other epidemic disease; and 2) that most communities suffered at least one epidemic during the course of a century and were lucky if they did not suffer more. The risk was generally greater in towns than in rural areas, although living in the country was not a guarantee of safety.<br />
<br />
He finds a more consistent picture when he examines the records of Exeter, Bristol and Norwich. Although they differ in the timing and severity of the epidemics, in all three cities the occurrence of plague was connected with the economic and social conditions of the communities. Plague was a part of urban life. It was a regular visitor to all three cities. It struck Norwich in 1544, 1554, 1579, 1584 and 1589. The frequency and severity in Norwich may be due to its nearness to the Low Countries and its large immigrant population. It struck Bristol in 1565 and 1575 and Exeter in 1570 and 1590. It tended to be concentrated in fringe parishes that were primarily inhabited by poor laborers. In urban areas the occurrence of plague had a definite social dimension.<br />
<br />
By the sixteenth century London had already gained a reputation as being filthy and plague was seldom completely absent from it. The best records come from the city itself in the bills of mortality that it published in the seventeenth century. These documented not only the number of deaths but also their locations, making possible the charting of the progress of the disease through the city. A fresh outbreak would often begin, as would be expected, in the east, near the river and the docks, although that was not always the case. Once again, the most affected parishes were on the fringes of the city where the poor resided.<br />
<br />
Part III of the book examines the social reactions to the plague and the actions that resulted. The strategies to battle the plague began in London as part of the government’s general pursuit of social policies that might benefit the common man and improve social order. England lagged behind other countries in its adoption of measures to control the spread of plague and often simply adopted and adapted strategies already in use abroad. In 1518 Cardinal Wolsey founded the College of Physicians to improve English medical care, which also marked the beginning of public policy regarding plague. Those policies primarily focused upon removing the infected to pesthouses or shutting them up in their own homes. Neither policy was rigorously enforced, the former because of the cost of establishing and running pesthouses and the latter in part because of the humanitarian issues raised and in part because of the difficulty of enforcing the isolation. It was much better to prevent the outbreak of plague itself, and to that end quarantines were enforced on ships and goods arriving from areas where a plague outbreak was known to have occurred.<br />
<br />
The publication of the bills of mortality in London documented for all to see the incidence and location of plague deaths. The advent of newspapers helped to spread this information outside of the city. This unprecedented supply of information allowed patterns of infection to be seen and helped to rationalize the reactions to plague, at least among the educated. Although the carriers of the plague were not identified, and without a germ theory of disease its cause remained unknown, this information did help to destroy the claims of its supernatural origins.<br />
<br />
Plague broke down the social order, and existing divisions were often exacerbated. The people resisted the efforts of the officials to impose plague regulations because they saw them as being as threatening as the disease itself. As the public resisted the imposition of the regulations and the plague rate rose, the government went to greater and greater lengths to enforce them. While the public were concerned with the suffering of themselves and their fellows, the officials were concerned with maintaining order, and they viewed the plague as part of the broader problem of poverty.</div>
<br />
<b>Mission and Method</b> (2011-01-09)<br />
<i>The early nineteenth-century French public health movement</i><br />
Ann F. La Berge, Cambridge University Press, 1992<br />
<br />
<br />
Drawing from official archives, this book is more a history of institutions than of the people who were affected by those institutions. It focuses on a select group of men who served on the Paris health council and as editors of the <i>Annales d’hygiène publique et de médecine légale</i>, and on how they created and institutionalized the idea of public health and hygiene, as well as how they put those ideas into practice through their work on health councils, in their publications and in their investigations.<br />
<br />
Public health measures have generally been dominated by two different missions: emergency measures whose primary purpose is to deal with epidemics, and regulations for dealing with public nuisances and waste disposal. The former measures were usually temporary, enforced only in times of crisis, and the latter measures were applied mainly to larger towns and cities, where the higher population density made such regulations a necessity. The idea of public health prominent in late-eighteenth-century France was dominated by an Enlightenment approach that emphasized progress, rational reform, education, natural law, empiricism and humanitarianism. It included preventive medicine as well as practices aimed at improving the quality of life and reducing mortality and morbidity.<br />
<br />
While it had its foundation in the Enlightenment, the public health movement developed amidst the competing ideologies of liberalism, conservatism, socialism and statism, with liberalism and statism dominating. The liberals wanted a minimal amount of state intervention, preferring solutions that were local and individual with the private practice of medicine, while the statists felt that the state should assume the primary role in public health reform and management and that public health experts should serve as advisors to the state, even proposing a medical civil service. The debate between liberalism and statism took place within the context of scientism, the idea that science was the key to progress and that the scientific approach was the best way to achieve positive knowledge.<br />
<br />
During the Revolution the national government had accepted responsibility for national health, and both Napoleon and the Bourbons had continued the tradition. By the 1820s several public health programs were in place including a nationwide vaccination program, a national health care program of both epidemic physicians and health officers, a national administration of sanitation and a Royal Academy of Medicine to replace the Royal Society of Medicine. There were institutions at the national level as well as at the local level with municipal and departmental health councils. There also arose the idea of a public hygienist. These were not simply physicians, but rather physicians who were willing to practice empirical science in order to understand the causes of disease and death, who would undergo special training for their job and who would work in cooperation with other specialists including chemists and engineers.<br />
<br />
A major component of the mission of public hygienists was to investigate all possible causes of disease and death and to make recommendations for their solution. In the process they encountered a wide range of health problems and issues. Not only were they involved in sanitary reform and ensuring the purity of food and drink, but they also examined more complicated social welfare issues such as prostitution, wet nursing, foundlings and child labor laws. Their approach to such problems varied but they all recommended regulation, inspection, and legislation to help improve public health.<br />
<br />
They were aided in their work by the existence of the <i>Annales d’hygiène publique et de médecine légale</i>, which was unique to France. It was the first journal in the West devoted to public health and legal medicine. In it ideas were exchanged and research published. The journal also reviewed or published most of the major French works on public hygiene and served as an international forum on public health issues, including the coverage of foreign developments and publications. This commitment to promoting and publishing their ideas was also mirrored by their educational efforts at the local level. Because many programs were voluntary (vaccination against smallpox, for example), their effectiveness depended upon the public understanding the advantages of compliance, and the public hygienists were instrumental in that education process by providing reports of the benefits that were based on more scientific foundations.<br />
<br />
This period also saw the application of statistics to the effort to understand the contributing factors of disease and death, if not their causes. Louis-René Villermé did extensive statistical studies of Paris, and his findings linking poverty and death contributed to the notion of death as a social disease.
<br />
<b>Health Reform in 19th Century America</b> (2010-11-27)<br />
Ronald Numbers, <i>Prophetess of Health: Ellen G. White and the Origins of the Seventh-Day Adventist Health Reform</i>, Knoxville: University of Tennessee Press, 1992<br />
<br />
Ellen G. White is one of four nineteenth-century founders of a major American religious sect (the others are: Joseph Smith - Mormon, Mary Baker Eddy - Christian Science and Charles Taze Russell - Jehovah’s Witnesses), but she is not widely known outside of her church. Yet when she died in 1915 she left behind a legacy that consisted not only of the Seventh-day Adventist Church, but also sanitariums and hospitals located throughout the world. She also inspired an educational system that is still highly regarded, traveled, lectured, and wrote dozens of books. She was born Ellen Gould Harmon, along with her twin sister Elizabeth, on November 26, 1827.<br />
<br />
Her influence sprang from the visions that she began experiencing in 1844, when she was seventeen. These trances lasted anywhere from a few minutes to several hours, and during them she received messages about events both in the future and the past, heavenly and earthly. These visions were accepted as genuine revelations from God, and her followers (with her encouragement) regarded her as a true prophetess on a par with the prophets of the Bible.<br />
<br />
On June 5, 1863, in Otsego, Michigan, she received her vision regarding health, in which God revealed to her the hygienic laws that should be followed by Seventh-day Adventists. They were to give up eating meat and other stimulating food, neither drink alcohol nor use tobacco, and avoid medical drugs. When they were sick they were supposed to rely on the remedies of Nature, including fresh air, sunshine, rest, proper diet, exercise and water. Women were to cease wearing the fashionable clothing of the time (including hoop skirts and corsets) and wear “short” skirts and pantaloons. Followers were also supposed to curb their “animal passions” (masturbation was an especial evil leading to deformity of mind and body, not to mention spirit).<br />
<br />
Health reform was not new. In the early nineteenth century, America was not a healthy or hygienic place. Americans ate too much meat and not enough vegetables and fruits. Their food was heavy with grease and fats, and they drank too much Brazilian coffee. Public sanitation was horribly inadequate, and personal hygiene wasn’t much better. Most Americans seldom, if ever, bathed.<br />
<br />
<br />
In the 1830s, Sylvester Graham launched a full-blown health crusade. In the summer of 1830 the Pennsylvania Society for Discouraging the Use of Ardent Spirits invited him to come and lecture under its auspices. He accepted and was soon giving lectures featuring his scientific and moral arguments against consumption of alcohol. Reverend William Metcalfe was also preaching in Philadelphia at this time. He was the author of the first American tract on vegetarianism and had brought his English congregation over in 1817 and established the vegetarian Bible Christian Church. Graham added the vegetarianism to his lectures on temperance. In 1831 he broke away from the Society and was lecturing at the Franklin Institute on a broad range of topics including proper diet and the control of the passions. The 1832 cholera epidemic thrust Graham and his health reforms into the spotlight.<br />
<br />
Another reformer, important partly because he was associated with the Millerites (as was Ellen White) and also because her reforms mirror many of his, was Larkin B. Coles. His claim to health reform fame lay in two books: <i>Philosophy of Health: Natural Principles of Health and Cure</i> and <i>The Beauties and Deformities of Tobacco-Using</i>. His moralistic view of health reform was not unique among health reformers, but both Coles and White saw obedience to these laws of health mainly as a requirement for entry into heaven rather than as a means for living a more enjoyable and healthy life on earth.<br />
<br />
Science as a Social Construct<br />
Douglas, Mary. “Environments at Risk,” in <i>Science in Context</i>, Barry Barnes & David Edge, eds., Cambridge, MA: MIT Press, 1982, pp. 260-75<br />
<br />
<i>Science in Context</i> is a collection of essays focusing on the sociology of science. The purpose of the collection, as stated in the General Introduction, is to “provide a tolerable indication of what is going on in the sociology of science, and, more importantly, of what kind of social activity science is, and what its significance is.” The primary focus of the collection is on the relationship between the sub-culture of science and the wider culture that surrounds it, especially as it relates to science as a source of knowledge and competence and as a cognitive authority for evaluating knowledge claims.<br />
<br />
Central to the ideas of sociology of science are the writings of Thomas Kuhn, especially his book <i>The Structure of Scientific Revolutions</i>. From Kuhn, sociologists of science have concluded that science is a social construct, and that even statements of scientific fact have a conventional character. Because it is constructed and not intrinsic to the natural world, they conclude that it cannot be self-sustaining, and if it cannot be self-sustaining in the sub-culture of science, then neither can it be self-sustaining in mainstream culture. There is nothing in science that implicitly reveals its correctness and so its standing in society depends upon the degree of trust and authority with which society imbues scientists and institutions.<br />
<br />
In her essay, Mary Douglas examines the issue of credibility in the context of the ecology movement. She is concerned with how beliefs arise and how they gain support. The approach she takes is that of an anthropologist from Mars, a hypothetical being that is agnostic when it comes to beliefs about the Earth’s environment. In her view this suspension of belief is what allows us to confront the fundamental question of credibility. She asserts that civilizations throughout history have viewed their environments as being at risk, and although the risks they identified were generally not the same, she claims that all civilizations pin responsibility for the crisis in the same way: the environment is put at risk by human folly, hate and greed.<br />
<br />
In the present, however, we have an added factor: self-knowledge. Because we can compare our beliefs with those of others we lose the filtering mechanism that those earlier civilizations possessed. We no longer have anything to restrict our perception of the sources of knowledge. Credibility is easier in a limited belief system, but how do you determine credibility when opposing sides of an issue both make sense? This is the question confronting environmentalists in our age.<br />
<br />
Through various anthropological examples she endeavors to show that the credibility of a belief regarding how the environment will react to human action depends upon the moral commitment of the community to a particular set of institutions. For example, bison do not like fratricide (murder within the tribe), so such an act endangers the well-being of the tribe and as a result has special sanctions. So long as the institutions in question maintain the loyalty of the community, nothing can overthrow the beliefs that support those institutions. If those institutions lose the support of the community, she claims that the beliefs are easily changed. A particular view of the universe and the society holding that view are thus interdependent. They form a single system and neither can exist without the other. Any given environment that we know thus exists as a structure of meaningful distinctions.<br />
<br />
In this credibility debate the role of laymen and social scientists is to examine the sources of our own bias. Because we lack the moral consensus that gives credibility to ecological warnings, we do not listen to the scientists. Similarly, because we lack a discriminating principle, we are easily overwhelmed by our pollution fears. This discriminating principle comes from social structures, and it allows a culture to select which dangers it will fear and to set up a belief system that will address those dangers. Without that structure we are prey to every dread, and right and wrong cease to exist. This is the price of full self-consciousness, but it is a price that she feels we must pay. When we pay it, the classifications of social life will be gone and we will recognize that every environment is simply a mask and support structure for a certain kind of society. Understanding both the nature and value of that society is as important as understanding the sources and nature of the pollution that puts our environment at risk.<br />
<br />
Mary Douglas deliberately picks an area of science where our understanding is incomplete and in which the debate over competing theories has become politically charged. Consensus is not the final arbiter of a scientific theory or hypothesis. Unfortunately, in the case of the environment, politicians and advocates have created a situation where that is the level at which the discussion of the various theories and hypotheses is taking place.<br />
<br />
Michael Kater, <i>Doctors under Hitler</i>, Chapel Hill: University of North Carolina Press, 1989<br />
This monograph is a sociohistorical study of the medical profession under the Third Reich and rests on the author’s previous work analyzing doctors and medicine from Wilhelm II to Hitler. It draws upon documents in the Federal Archive of Koblenz and the Berlin Document Center. Primary material was also drawn from the student archive in Würzburg and other regional West German archives. He also drew on the papers of the former panel physicians’ association, the KVD, as well as the predominant professional journals and the memoirs of physicians who lived beyond 1945.<br />
<br />
At the dawn of the Third Reich, in 1933, there was a surplus of physicians, inherited from the republican era. These doctors were at first hopeful that the new regime would address issues left over from the health administration of the Weimar Republic, but their hopes were not fulfilled. Under the republic medical graduates had to spend three years as assistants in a hospital, where they were poorly paid and forbidden to seek other sources of income. Establishing themselves as independent practitioners was almost impossible for doctors straight out of medical school. One of the complaints lodged by spokesmen for this group was that medical institutions should stop advertising junior positions for bachelors only. They also emphasized that, after public school teachers, high school teachers, and jurists, they represented the fourth largest group of academically trained professionals born after 1900.<br />
<br />
But under the Third Reich, the medical profession became a microcosm of the larger Nazi sociopolitical system, governed by the Nazi leadership principle and redefined in National Socialist terms. Physicians now had to present every private contractual arrangement to the Reich Physicians’ Chamber for approval, register with the Nazi medical agencies, and keep them informed of any changes in their family status or medical qualifications. They also had to report on their patients: all serious cases of alcoholism, ‘incurable’ hereditary or congenital illness (such as imbecility), and highly contagious diseases such as venereal disorders were recorded and reported to the appropriate authority.<br />
<br />
The doctors themselves were required to undergo continued training. Partly this was to break down the distinction between general practitioners and medical specialists, but it was also to teach them National Socialist concepts of health and medicine. The unpopularity of these courses was perhaps offset by another change in the profession implemented by the Nazi legislators: its redefinition. By stating that the medical occupation was not a business, the Reich Physicians’ Chamber was able to exclude anyone who was not properly schooled or licensed.<br />
<br />
This did not do away with medical quacks, however, for the Nazi conception of medicine favored the lay element over ‘school’ medicine. Instead they created a new class titled “physician of natural healing” open to anyone who could demonstrate the requisite ability. Anyone in this group with extraordinary talent could enter a medical facility without the usual professional medical qualification, and could even receive a license as a doctor medici. The Nazis further required that regular doctors had to assist registered lay healers at the latter’s request.<br />
<br />
Under the Third Reich medicine became the preeminent academic discipline, with medical teachers making up approximately 30 percent of all university faculty by 1935. Medical faculty also became dominant in university power politics: between 1933 and 1945 the percentage of rectors drawn from the medical faculty increased from 36 to 59 percent. Along with this increase in power and significance came a new discipline that became part of the medical curriculum after 1933: Rassenkunde, or Rassenhygiene, race hygiene or eugenics. This ‘science’ consisted of three parts, anthropological, sociological, and medical, and its goal was to improve the superior race while eliminating the inferior ones.<br />
<br />
Kater thus links the professionalization of medicine in the Third Reich with its corruption. West German doctors saw these events as a struggle between the forces of freedom and democracy and the totalitarianism of the Nazi regime, a battle which the latter eventually won. East German doctors, on the other hand, saw these events as the result of a premeditated conspiracy between fascist-minded German doctors and Nazi political leaders. Kater feels that the truth is somewhere in between, but that it lies closer to the East German perspective than the West German one.<br />
<br />
Treating the Disease vs Treating the Patient<br />
While I was pursuing my History of Science studies at Notre Dame I took a seminar course on Medicine and Society. My last two posts are from that class. I came to hate that class and it was a large factor in my decision to drop out of the program, but I did learn some important lessons during it. The crux of the message that the professor was trying to get across to us was the way that the medical profession dehumanizes the patient and ends up treating the disease, not the human being. If you want to see this message in a very disturbing but highly distilled form, just watch the film "Wit" with Emma Thompson.<br />
<br />
This lesson was reinforced for me this past week when I had to rush home to Ohio because my father was in the hospital. He went in for something relatively minor but ended up in the hospital for a week being treated for another condition, one that was due, at least in part, to actions taken by the hospital staff in their treatment of his original issue. I am not saying that the staff was malicious in their treatment, but they were aggressive and interventionist, so that rather than considering that the change in his condition might be due to the drugs they had given him, they kept chasing symptoms. It quickly became apparent that the treatment was reactive: x happened, so they did y, without ever really trying to understand the whole picture, the patient. In the end my father spent a week in the hospital and underwent a procedure that was probably not really necessary.<br />
<br />
<br />
It is hard to challenge the medical profession when you are a patient; they are so authoritative, and when there is something wrong you get swept up into their treatment course and it takes over your life. I saw this myself when I was undergoing treatment for breast cancer. I tried hard to be an informed patient and question the treatment, but there was one week in which I had a CAT scan, a PET scan and two biopsies. Everything checked out as fine, but that week was quite an ordeal, both physically and emotionally. My oncologist's conclusion after all of that was that if you do tests and scans you will always find something that is odd, and if you let yourself, you will chase these oddities for quite some time before concluding that while odd, they are not dangerous or unhealthy. My oncologist now uses me as a poster child for not doing more than is necessary. He still feels bad about putting me through that ordeal.<br />
<br />
There is a lot of debate going on right now about how to fix the health care system. Well, one of the things they should do is treat the patient, not the disease. One of the hardest things about being a doctor is the process of diagnosis (this is actually a place where expert systems could be useful), yet rather than being thoughtful or logical about ordering tests, doctors often just order a whole suite of them. It is as if they are throwing a whole bunch of darts at a dart board in the dark, hoping that one of them hits the target. That is simply not a rational or cost-effective approach to treatment. It isn't good for society and it isn't good for the patient.<br />
<br />
Demographics<br />
<i>Fertility, Class and Gender in Britain 1860-1940</i><br />
Simon Szreter, Cambridge University Press, 1996<br />
<br />
In the early part of the 20th century there was a growing awareness of a declining birthrate in the industrialized nations. In Austria-Hungary and France the birth rate in some rural areas had begun to decline substantially during the 18th century, with similar declines taking place among the aristocratic and bourgeois groups as early as the 17th century. In 1945 a theory of demographic transition was published. It proposed three stages of demographic development: an initial pre-industrial stage of high birth rates and high death rates, an industrial phase of high birth rates and declining death rates (leading to substantial population growth), and a post-industrial phase of low birth rates and low death rates.<br />
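The arithmetic behind the three-stage model is worth making concrete. The sketch below is only an illustration of the model's logic, not anything drawn from Szreter or from the 1945 theory itself: the stage names follow the description above, but the birth rates, death rates, and durations are invented values, chosen simply to show why the middle, industrial stage is the one that produces substantial population growth.<br />

```python
# Toy illustration of the three-stage demographic transition model.
# Rates are invented for illustration (per 1,000 population per year).
STAGES = [
    # (stage name, birth rate, death rate, years spent in stage)
    ("pre-industrial",  35, 33, 50),   # high births, high deaths -> near-zero growth
    ("industrial",      35, 18, 50),   # high births, falling deaths -> rapid growth
    ("post-industrial", 12, 10, 50),   # low births, low deaths -> near-zero growth
]

def project(population):
    """Compound a starting population through each stage in turn."""
    for name, births, deaths, years in STAGES:
        growth = (births - deaths) / 1000.0      # net annual growth rate
        population *= (1 + growth) ** years      # compound over the stage
        print(f"{name:15s} net {growth:+.1%}/yr -> {population:,.0f}")
    return population

project(1_000_000)
```

With these made-up numbers only the industrial stage moves the total appreciably, which is the growth spurt the transition theory was devised to explain.<br />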
<br />
This theory was based upon a single case, that of Britain. It utilized the findings of the 1911 census, which analyzed the fertility patterns of the British population from 1851 to 1911, and the newly released study conducted for the Royal Commission on Population that covered the period 1901-1946. The 1911 census used what has become known as the professional model of social classification, in which all male occupations are assigned to one of five grades: I, the professional upper and middle class; II, intermediate; III, skilled workers; IV, intermediate; and V, unskilled workers. The 1911 census analysis found that the higher the social class, the earlier and more rigorously it controlled its fertility.<br />
<br />
This classification scheme was based upon three assumptions: 1) that the occupation of the male head of household was the best way to classify families; 2) that a primary division existed between the higher-status non-manual occupations (regarded as more professional) and the lower-status manual occupations (graded according to skill); and 3) that a single hierarchical social grading system was a valid classification scheme. It should be noted that this scheme excludes women and their labor, both paid and unpaid. It should also be noted that those living off private means, and thus listing no personal occupation, were classified alongside paupers in a residual category labeled the unproductive class.<br />
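Viewed as a data structure, the scheme amounts to a lookup table keyed on the occupation of the male head of household. The sketch below is a rough illustration of the assumptions just listed, not a reproduction of the census's actual occupation lists; the example occupations and their grade assignments are hypothetical.<br />

```python
# A minimal sketch of the 1911 census "professional model" as a lookup table.
# The five grade labels follow the description above; the example occupations
# and their assignments are hypothetical, chosen only for illustration.
GRADES = {
    "I": "professional upper and middle class",
    "II": "intermediate",
    "III": "skilled workers",
    "IV": "intermediate",
    "V": "unskilled workers",
}

OCCUPATION_TO_GRADE = {   # hypothetical assignments
    "physician": "I",
    "shopkeeper": "II",
    "carpenter": "III",
    "agricultural labourer": "IV",
    "dock labourer": "V",
}

def classify_family(male_head_occupation):
    """Assumption 1: the whole family takes the grade of the male head of
    household's occupation; women's work, paid or unpaid, never enters."""
    grade = OCCUPATION_TO_GRADE.get(male_head_occupation)
    if grade is None:
        # Those listing no recognized occupation (e.g. living off private
        # means) fell into the residual 'unproductive' category.
        return "unproductive class"
    return f"Class {grade}: {GRADES[grade]}"

print(classify_family("carpenter"))        # Class III: skilled workers
print(classify_family("private means"))    # unproductive class
```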
<br />
In 1869 Francis Galton published <i>Hereditary Genius</i>, in which he examined the families of ‘eminent men’ in England in an effort to determine the heritability of both mental and physical qualities. He went on to coin the term eugenics in 1883. By the end of the 19th century there was widespread concern that modern society was reversing evolution, leading to the degeneration of the English people. This was partly driven by an increase in the recorded rates of lunacy from 2.26 per 10,000 in 1807 to 29.26 per 10,000 in 1890 (Mathew Thomson, <i>The Problem of Mental Deficiency: Eugenics, Democracy and Social Policy in Britain c. 1870-1959</i>). By the first decade of the 20th century mental defectives had become defined as the central eugenic threat facing the nation. Greater social awareness plus universal education led to the growing realization of the presence of mentally deficient people in the population. This heightened awareness coincided with growing fears about the fitness of the population. In 1907 the Eugenics Education Society was formed.<br />
<br />
During the period 1875-1883, the Anthropometric Committee of the British Association for the Advancement of Science provided an hereditary basis for the professional model. The professional model thus acquired the status of an empirically tested theory. Despite the fact that it was based upon unexamined social conventions it had been turned into a naturalistic theory of British society’s essential structure.<br />
<br />
In the beginning of the 20th century an environmentalist countermovement emerged, opposing the idea of the eugenicists that the poor were poor because of the way they were, rather than because of social or environmental factors. At the forefront of this movement were the Fabians who, although they shared a nationalistic interpretation of social Darwinism with the hereditarian biometricians, did not agree with them as to the causes or the appropriate political means to achieve the optimal nation. They held that poverty was not the manifestation of inherited biological deficiencies but rather that the environment was responsible for the moral and material degradation of the working man.<br />
<br />
The Anatomy Act<br />
<i>Death, Dissection and the Destitute</i><br />
by Ruth Richardson, Penguin Books, 1988, 426 pp., appendices, notes, bibliography, index.<br />
<br />
<br />
In 1518, the College of Physicians was founded to improve the state of medical knowledge in England, but improvements were hampered by one very simple fact: the lack of human bodies for dissection. In 1540, the companies of Barbers and Surgeons were united by Royal Charter and Henry VIII granted them the rights to the bodies of four hanged felons per year. Charles II increased that number to six. But these dissections were ostensibly public affairs and were part of the sentence inflicted upon the criminals. Thus, from the start, dissection was seen in the public eye as a punishment for criminals and as a defilement of the corpse, not as a means of gaining medical knowledge.<br />
<br />
This shortfall in supply was met by one very simple solution: robbing graves. This was done either by disinterring freshly buried corpses, or by waylaying the bodies before they were buried. Workhouses, charity hospitals and asylums were favorite sources, as their occupants were poor, indigent, or had no relatives to claim their bodies. The supplying of anatomists and surgeons with bodies sometimes involved the collusion of grave diggers, sextons, administrators at the facilities mentioned, undertakers and even clergy.<br />
<br />
The men who plied this trade were called resurrectionists. Grave robbing was not a crime per se, since the body was not considered property. While a man could be hanged for poaching, he would not be hanged for stealing a dead body, unless he also stole the personal effects of the corpse. It was a lucrative business, and it is perhaps not entirely surprising that at some point someone would see the advantage of using the anatomists as a means of disposing of murder victims. The most celebrated case was that of Burke and Hare in Edinburgh.<br />
<br />
Mrs. Hare was the owner of a cheap lodging house in which an elderly man died while still owing her money. To pay off this debt, Burke and Hare sold his body to an anatomist for £7.10s. When another lodger fell very ill, Burke and Hare eased him on his way and sold his body for £10. In all they killed 16 people before they were discovered, and introduced a new verb into the English vocabulary: to burke. Burke was hanged and dissected on 28 January 1829; Hare turned King’s evidence and was spared; and the anatomist to whom they sold the bodies, Knox, was never charged.<br />
<br />
The first Anatomy Bill (Bill for preventing the Unlawful Disinterment of Human Bodies, and for Regulating Schools of Anatomy) was submitted to Parliament by Henry Warburton on 12 March 1829. It did not pass, partly because of its length, partly because it used the word dissection, and partly because it obviously singled out the poor as the primary source of bodies. In 1831 Bishop and Williams, the London Burkers, were discovered. They had been supplying bodies to schools for some time when they decided to help matters along. They confessed to killing three people before their trial, although on the eve of their execution on 5 December 1831, Williams supposedly confessed that the number was closer to sixty.<br />
<br />
Warburton introduced his second Anatomy Bill ten days after their execution. This one was called simply A Bill for Regulating Schools of Anatomy, and the word dissection had been replaced with the phrase anatomical examination. It was shorter than his previous bill and, though it still targeted the poor, it did not do so directly. It merely said that unless you, your executor, or another lawful party expressly forbade it, your body was liable to undergo anatomical examination. It was eventually passed, but it did little to increase the supply of legitimate bodies. For the most part it simply cut out the middleman, the resurrectionist.<br />
<br />
In this book, Ruth Richardson has given us a detailed social and political history of the events leading up to and surrounding the Anatomy Act, using numerous primary sources including government documents, official reports, pamphlets and newspapers. She links the Act with a general change in attitudes towards the poor, culminating in the New Poor Law, which stigmatized poverty by connecting the deaths of the poor with a fate that had previously been reserved for criminals. She claims that it also led to a fear among the poor of the pauper’s funeral, helping to spur the growth of burial clubs and friendly societies. In addition, the role of workhouses as suppliers of bodies to anatomists led to a general mistrust of these institutions. Other factors that are mentioned are the corruption and nepotism of the Royal College of Surgeons, the establishment of the <i>Lancet</i> by Thomas Wakley as a means of promulgating medical knowledge and as a vehicle for medical reform, and the role of the Benthamites in the passing of the Anatomy Act itself.<br />
<br />
David Noble - America by Design<br />
A neo-Mumfordian, Noble enlists science in the conspiracy of Big Money to take over the world and turn us all into parts of its machine, whose sole motives are profit and power.<br />
<br />
C. Hamlin & P. Shepard - Deep Disagreement in U.S. Agriculture<br />
As we watch policy debates, and technology debates, and all the debates on environmental issues, the question arises: how can we ever hope to resolve all of these differences? How can we ever find a solution that will satisfy all the parties concerned? This book presents a method for doing just that.<br />
<br />
By creating a neutral ground, and translating between the various interest groups in a disagreement, academics can enable a rational dialogue between the parties concerned that might actually lead to mutual understanding and maybe even a solution.<br />
<br />
Bruno Latour, Aramis<br />
When a technology fails, how do we explain what happened? How do we understand what happened? In <i>Aramis</i> Latour uncovers the multiple narratives that underlie failure, and perhaps, by implication, success.<br />
<br />
Among the themes that he addresses is the sexuality of technology. Latour wants to refute the idea that the theory of evolution can be applied to technological progress, which assumes that later technology is an improvement over earlier technology and that it better meets or serves the needs of “the environment” (i.e., humanity).<br />
<br />
He also advocates heterogeneous engineering in which major social questions concerning the spirit of the age or the century and “properly” technological questions are blended into a single discourse. This leads to the notion of translation, in which a global problem is transformed into a local problem through a chain of intermediaries that are not “logical” in the formal sense.<br />
<br />
In addition, in order for a project to succeed, an engineer has to stimulate interest and convince the public. They must market innovation and technology. All of which leads to the question: is technological reality rational? Consumers, like technology, are invented, displaced, and translated through chains of interest.<br />
<br />
He recommends two kinds of charts to help understand technology: sociograms, which chart human interests and translations; and technograms, which chart nonhuman interests and translations. Both people and technology (human and nonhuman actors) are alike in that just as you have to compromise when dealing with a number of people, so you have to compromise when integrating any new technology.<br />
<br />
But one of the problems of an innovative project is that the number of actors that need to be taken into account is not known from the beginning. If you don’t have enough actors, the project loses reality; if you have too many actors, the project becomes over-complicated and will probably fail.<br />
<br />
Jerry Mander, In the Absence of the Sacred (1991)<br />
Jerry Mander is asking “what happened to the future?” and challenging the idea that advances in technology equate to progress and that this is a good thing.<br />
<br />
He defines a minimally successful society as one that:<br />
1) keeps its population healthy, peaceful and contented;<br />
2) has sufficient food, shelter, and a sense of participation in a shared community experience;<br />
3) permits and encourages access to the collective wisdom and knowledge of the society and whose members have a spiritually and emotionally satisfying existence.<br />
<br />
Mander wants to encourage awareness, care and respect for the earth’s life support system.<br />
<br />
But while technology has given us an improved standard of living, with greater speed, greater choice, greater leisure, greater luxury (bigger, better, faster, more), we haven’t eliminated poverty or crime and we don’t even have universal education. So, while our society may be a material success, it doesn’t work. And, even worse, the technological advances that have made this all possible have led to environmental degradation, but no one (except Mander?) seems to be questioning the price of technology.<br />
<br />
In response, Mander wants to challenge what he calls the Pro-Technology Paradigm that is characterized by:<br />
1) dominance of best-case scenarios<br />
2) the pervasiveness and invisibility of technology<br />
3) the limitations of the personal view - we don’t see the wider effects of our tools, only how they help us<br />
4) the inherent appeal of the machine - its flash and promise<br />
5) the assumption that technology is neutral and the idea of a scientific priesthood - nuclear power leads to autocratic systems, while solar energy leads to democratic systems (centralized power vs. distributed power)<br />
<br />
Philip Scranton, “Determinism and Indeterminacy in Science & Technology”<br />
<i>Does Technology Drive History? The Dilemma of Technological Determinism</i> (1994)<br />
<br />
How do we write the history of science and technology? If we set aside technological determinism, then we must also abandon the idea that changes and shifts in technology govern the restructuring of social formations and organizations or of cultural practices. So how do we capture the dynamics of the interactions of science and society without resorting to new reductionisms that substitute a new universal for an old one? Neither can we assume that technical change represents a unified process.<br />
<br />
Deterministic approaches to the history of technology have meant that the situational links between technical changes and social and political relations have often been left unspecified and under-investigated, because technological determinism insulates technological change from extra-technical initiatives. But once we move past linear and reductionist accounts of technological change we can begin to fill in some of the gaps and silences of the history of technology and science: gaps such as non-Western concepts of technology and technical practice, technical-environmental relations, technologies of sexuality and family limitation, or technologies for the management of the incarcerated or the dead.<br />
<br />
It is Scranton’s belief “that technological change proceeds in the absence of overarching rationalities; that it proceeds along multiple coexistent trajectories; that links between technical change and sociopolitical relations are intimate and underspecified; and that stepping beyond reductionist teleologies reveals an array of intriguing silences in the history of technology.” (p. 163)<br />
<br />
Daniel Boorstin - The Americans: The Democratic Experience<br />
The meta-narrative: that the century following the Civil War was an Age of Revolution in which the meaning of community, of time and space, of present and future was being continually revised. It was a century in which a new democratic world was being invented and discovered by Americans.<br />
<br />
This is the story of how we got to where we are. The creation of the everywhere community. Mass production, chain stores, suburbia: everywhere you go, there you are. The same commodities, the same merchandisers. The same television shows, the same radio shows. The homogenization, the democratization of American society. For Boorstin it is a good thing. Remember those Popular Science reels that showed us what the future was going to be like? The glorification of the gadget, gee-whiz science.<br />
<br />
But by the end even Boorstin seems a little out of breath. The pace of scientific and technical progress has become so fast. Is it out of our control? Has it become its own power?<br />
<br />
Edelstein, Michael - Contaminated Communities - 1988<br />
The aim of this book is to identify the major social and psychological impacts that stem from residential toxic exposure and to examine their significance. Edelstein bases his analysis on four postulates: 1) that the social and psychological impacts of toxic exposure involve complex interactions among the different levels of society as well as differing across time and with environmental context; 2) that these impacts affect how victims behave and how they understand their lives both in the short and the long term; 3) that toxic exposure incidents are traumatic and invoke coping responses in their victims; and 4) that contamination is inherently stigmatizing and the very possibility of such contamination arouses fear in the public.<br />
<br />
Toxic exposure undermines the very fabric of society. It leads to a loss of trust, the inversion of the home (formerly seen as a safe haven, now hopelessly poisoned), a sense of a loss of control over one’s personal life and over the present and the future, a different relationship to and assessment of the environment (now seen as dangerous, and insidious in its dangers), and a pessimistic attitude towards one’s expectations about health. It places the adults in contaminated families under a great deal of stress as they become isolated and stigmatized by their contamination, and it teaches children to fear.<br />
<br />
Toxic victims also become absorbed by government agencies and bureaucracies that threaten their social identity. This is compounded by the fact that the government’s aims and values may not be the same as the values of the victims, especially with regard to acceptable risk, which has more to do with economic and political forces. The regulators, on the other hand, have their own restrictions under which they operate. They are bound by regulations, political realities and limited resources.<br />
<br />
Toxic contamination in other communities leads to anticipatory fear in communities as yet untouched, resulting in the “not-in-my-backyard” (NIMBY) response, which serves to articulate citizens’ frustration over the manner by which projects are sited. It arises, in part, from the failure of the regulators to take the psychosocial impact of these facilities seriously. The citizens, seeing no room for compromise in the response of the regulators, regard the situation as an all-or-nothing, win-or-lose, battle.<br />
<br />
One of the lessons that Edelstein draws from his study is the engineering fallacy, which involves the assumption that problems can be solved in isolation, away from the complicating factors and uncertainties of the real world: if we narrow a problem enough, it will be controllable and solvable. He points to Bateson’s 1972 book, <i>Steps to an Ecology of Mind</i>, as a source for an alternative method to that of traditional science and engineering. In this approach, any learning is done within a context. Called metalearning, it seeks to recognize the context of a problem, rather than deduce and isolate it.<br />
<br />
Wynne, Brian, “Misunderstood misunderstandings” (pp. 19-46)<br />
<i>Misunderstanding Science? The Public Reconstruction of Science & Technology</i> (1996)<br />
<br />
A case study of the hill sheep-farmers of the Lake District of northern England who were affected not only by Chernobyl but also by the nuclear reactors at Sellafield (formerly Windscale). Wynne examines the interplay between social and cultural identities, especially those of the sheep-farmers, as they see themselves threatened by the scientists’ interventions. What is revealed is arrogance on the part of the scientists, who discount the local knowledge of the sheep-farmers even when that knowledge is essential to understanding the scientific issue at hand, namely radioactive contamination.<br />
<br />
The relationship between the scientists and the sheep-farmers is further undermined by the lack of full disclosure on the part of the scientists, and also by the changing assertions of the scientists. At first they said there would be no effects from Chernobyl, but six weeks later (20 June 1986) the Minister for Agriculture announced a ban on sheep sales and movement in several of the affected areas. Once the scientists had admitted the contamination, they insisted that the initially high cesium levels would fall soon, but their predictions were based upon a false scientific model: their model rested upon empirical data from alkaline clay soils, not the acid peaty soil found in these upland areas.<br />
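The effect of that substitution is easy to see in a toy calculation. The sketch below is purely illustrative: it assumes a simple exponential decline in the cesium available to vegetation, governed by an "effective half-life" that depends on the soil, and the half-life values are invented. The point is only that a forecast of how quickly activity will fall depends entirely on that soil-dependent parameter, so data gathered on alkaline clay could not simply be carried over to acid peat.<br />

```python
import math

# Toy illustration (invented numbers): predicted decline of cesium activity in
# vegetation, assuming simple exponential loss with a soil-dependent
# "effective half-life". On clay soils cesium binds to the soil and soon
# becomes unavailable to plants; on acid peat it stays mobile, so the same
# forecast window gives a very different answer.
def remaining_fraction(days, effective_half_life_days):
    return math.exp(-math.log(2) * days / effective_half_life_days)

CLAY_HALF_LIFE = 30.0    # hypothetical: rapid fixation, activity falls quickly
PEAT_HALF_LIFE = 300.0   # hypothetical: cesium stays biologically available

for days in (30, 90, 180):
    clay = remaining_fraction(days, CLAY_HALF_LIFE)
    peat = remaining_fraction(days, PEAT_HALF_LIFE)
    print(f"after {days:3d} days: clay model {clay:5.1%} remaining, "
          f"peat model {peat:5.1%} remaining")
```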
<br />
The degree of certainty that the scientists expressed in their statements denied the ability of the farmers to cope with ignorance and lack of control, and the degree of standardization of knowledge denied the variation of the conditions in the region from their models and even from farm to farm. We thus see scientists inappropriately applying their specialized knowledge and not acknowledging the specialized knowledge of the sheep-farmers, but where the farmers were willing to work with the scientists, the scientists did not seem to be willing to work with the farmers.<br />
<br />
The distrust that the farmers soon came to have for the scientists, and then for the government that was employing them, also caused the farmers to question the government’s assertions about Sellafield. The scientists asserted that contamination from Chernobyl could be distinguished from contamination due to Sellafield, thus making Chernobyl a convenient cover or scapegoat for previous misdeeds. As the distrust grows, one moves from considering the scientists merely arrogant to suspecting some kind of coverup or conspiracy, all of which only serves to further undermine the public’s trust in and understanding of science.<br />
<br />
Wynne sees the conflict as one between social identities: both groups, the scientists and the sheep-farmers, have their identity threatened by the other. These sorts of conflicts bring to light the whole issue of knowledge systems and the problems that arise when formal knowledge systems interact with informal ones. The formal knowledge systems often don’t know how to acknowledge or understand the informal knowledge systems because the former have a hard time quantifying the latter. The problem may simply be one of communication: these two types of knowledge systems may not speak the same language, or, even worse, they may speak the same language but mean subtly different things.<br />
<br />
B. Campbell - “Uncertainty as Symbolic Action in Disputes among Experts” - <i>Social Studies of Science</i> 15 (1985): 429-53<br />
Campbell claims that uncertainty does not cause controversy, because the content of scientific knowledge is a social construction; uncertainty is therefore something that is negotiated, discussed and argued about. He further argues that the adequacy of empirical evidence thus becomes a prop in the social negotiations that occur over the credibility of expert statements made in public arenas, where the authority of a scientist as an expert is connected to the image of the relationship between scientific understanding and empirical evidence.<br />
He is attempting to establish five points.<br />
1) uncertainty is a strategic element of argument as opposed to something that causes argument;<br />
2) adequacy of evidence and knowledge is relative and varies with the social situation of experts;<br />
3) the social structuring of expert arguments does not mean that the scientists’ arguments have been ‘distorted’ by the social circumstances of their expertise;<br />
4) uncertainty arguments don’t necessarily undermine the credibility of scientific expert knowledge;<br />
5) the approach that he takes emphasizes the political dynamics of expertise and the complex relationships between scientific and policy issues.<br />
<br />
E. Chargaff - Heraclitean Fire (1978)<br />
An idealist, a romantic, a classical man in a modern world. He never found his place, never found himself. [Oscar Levant: It’s not what we are that hurts, it’s what we aren’t.] In the 20th century, especially in the latter half, the pace of scientific advances became inhuman, and science became inhuman, accumulating facts but not understanding. Increasingly fragmented, increasingly specialized, a Red Queen’s race. There is no time to understand the ramifications or the importance of a discovery, because a new one is just around the corner. Even the sciences are forgetting their history. The citations and references of research papers rarely go further back than a decade. Everything is compressed, eternally in the present. Without a past, how can there be a future? Scientists need to ask why they are doing what they are doing, to what ends they are working, and how their knowledge will be used.<br />
<br />
Echoing Mitroff, Chargaff is a humanist scientist making a plea for the development of Dionysian science.<br />