Thursday, April 29. 2010
It may not have been Gerald Imber's primary intention, but his recent biography of William Halsted, Genius on the Edge, sheds light on how the U.S. developed the world's best-performing clinical care system, and even hints at a better way forward. It started with a group of medical pioneers who were committed to both research and clinical practice, who had a deep respect for repeatable and verifiable scientific findings, and who held themselves and others to high standards. That, plus an unfettered market, proved a recipe for success.
William Halsted can justifiably be called the Father of Modern Surgery. He pioneered local anesthesia, raised aseptic surgery to a higher level, and invented procedures such as hernia and aneurysm repair. But his overall contribution extends far beyond these technical achievements. Halsted transformed surgery from a brutal act of desperation into a gentle, life-saving art. He and his colleagues at Johns Hopkins not only developed new methods of diagnosis and treatment; they also set new standards for physician training and proficiency.
As the title suggests, there is another and quite disturbing side to Halsted. The man was a drug addict. But Halsted did not set out to get high; he became addicted as a result of experiments he performed on himself with local anesthetics. Little was known about treating drug addiction at the time. Though his addiction spanned 38 years of a lengthy career, it did not stop him from performing hundreds of operations and achieving a series of breakthroughs.
It's a shame that our current political leaders are too arrogant to consult history. If they were truly interested in ensuring affordable health care for all, they would study the careers of surgeons such as William Halsted and Harvey Cushing. Both men could and did command exorbitant fees from those who had the means. They could have sat around waiting for the occasional wealthy patient, but they understood that it was in their own interest to treat everyone, regardless of ability to pay. They charged nothing to the poor, moderate fees to the middle class, and high fees to the wealthy. That kept their skills sharp and their coffers full.
Though Imber explores Halsted's personality and personal life, he also describes in detail a number of the medical advances achieved by Halsted and his colleagues. Included are Halsted's gallbladder surgery and Walter Dandy's pneumo-ventriculography for locating brain tumors. These were huge developments in their day. After reading this book, you'll also understand why Johns Hopkins is one of the world's best hospitals--if not the best.
Sunday, January 17. 2010
Jaron Lanier’s You Are Not A Gadget: A Manifesto is an interesting read, but overall a disappointment. In fact, it isn’t about you at all. It’s a member of the digerati having second thoughts about Web 2.0 and the emergent hive mind.
Lanier is a guy with boundless curiosity—and that’s something I admire. He takes the reader on a fast-paced tour of some of the digital realm’s most exotic nooks and crannies. However, he doesn’t argue for individualism as much as against what he calls “cybernetic totalism.”
The book’s second-to-last paragraph is probably the best summary of the tome--if you can make sense of it. In Lanier’s own words:
The most important thing about postsymbolic communication is that I hope it demonstrates that a humanist softie like me can be as radical and ambitious as any cybernetic totalist in both science and technology, while still believing that people should be considered differently, embodying a special category.
If everyone wants to upload themselves to the Web’s collective consciousness, then I’m willing to help them figure out how to do it, but personally I reserve the right to remain an old-fashioned person with a physical body.
Lanier points out, helpfully, that there is a downside to Creative Commons and open source software. He even dares to suggest that proprietary systems are sometimes superior. (This is a point I have been making for years.) Unfortunately, Lanier presents the latter idea almost as an aside. Instead of shouting it from the rooftops, he whispers it from a sidebar off the book’s main text.
There are some things about the book that I found annoying. Lanier demonstrates his own hive mentality. For example, he takes a few obligatory (but gentle) swipes at George W. Bush. He claims that there wasn’t an independent press during the Bush presidency and that the Bush years “are almost universally perceived as having been catastrophic.” I’ve criticized G.W. Bush more harshly, but at least my criticisms were based on facts.
More annoying is Lanier’s hive writing style. He is a member of a generation of science writers who revel in nuance but shun grand ideas. He makes overly generous use of dubious words such as “retropolis” and “computationalism.” And there are cute-sounding section titles (such as “Schlock Defended”) every several paragraphs.
Still, Lanier is an interesting fellow and this is an interesting book. Just don’t buy it expecting an impassioned defense of the individual in the face of rising collectivism.
Wednesday, January 6. 2010
"I just put down a book that I had a hard time putting down: The History of Wireless: How Creative Minds Produced Technology for the Masses by Ira Brodsky, KC9TC..."
--Stan Horzepa, WA1LOU, Contributing Editor, ARRL.org, in "Surfin': The Ghosts of Surfin' Past"
Friday, November 13. 2009
Book Review: The Cult of the Amateur, by Andrew Keen
Andrew Keen sounds the alarm: the Internet is being used to spread false information, cheat artists and authors, steal our identities, destroy our reputations, violate our privacy, and lower our standards. He’s right about all of that. As a society, we need more discussion of the Internet’s corrupting influence, and we need to update some laws to reflect the new online reality.
However, Keen is wrong about two crucial points, and that detracts from the value of an otherwise impassioned and clearly written book.
Web 2.0 is the Internet after the bubble: a brave new world in which everyone is always connected, everyone has access to the world’s information, and everyone gets to participate. It’s also what Keen calls “the great seduction,” because it’s debasing our culture.
Keen is right that our children spend too much time online. He’s right that the Internet is teeming with false information. He’s right that it’s wrong to illegally redistribute digital music just because it’s easy to do. And he’s right that people we meet on the Internet often are not who they pretend to be.
To hear Keen tell it, however, the Old Media offers all of the virtues that Web 2.0 lacks: trained journalists, qualified experts, fact-checking, a clear line between reporting and editorializing, and ethical standards with teeth. For example, Keen points to Thomas Friedman of the New York Times and Robert Fisk of the Independent in the UK as bona fide experts on the Middle East. Reasonable people might disagree about whether Friedman is as knowledgeable as he is opinionated. But Robert Fisk is well known for his bias against Israel and his belief that journalists should advocate rather than just report the news.
Ironically, Keen uses the example of doctored photos submitted by Reuters photographer Adnan Hajj during the 2006 war between Israel and Hezbollah. Keen wants us to remember that Hajj was fired for violating Reuters’ ethical standards. He seems oblivious to the fact that Internet-based citizen-journalists (such as Charles Johnson) have discovered and exposed a series of faux photos, memos, and stories that got past the Old Media’s vaunted gatekeepers.
Indeed, the circulation of major daily newspapers is dropping like a rock not just because the Internet is timelier and less expensive, but because consumers are tired of shoddy reporting and of editorializing that starts on Page One.
There is one other thing missing from Keen’s analysis: caveat emptor. Sure, there are people selling bad stuff on the Internet, just as there have always been swindlers. We will never enjoy total protection against fraud and deception—unless we are willing to start giving away our freedoms. Personal responsibility has to be factored into the equation.
Keen is right about the dangers of Web 2.0. Instead of trying to revive the Old Media or regulate ourselves to death, however, we need to push for higher standards and update laws where necessary.
Sunday, March 22. 2009
Malcolm Gladwell's best-selling book Outliers is masterfully written, thought-provoking, and an entertaining read.
It also offers bad advice based on dubious conclusions.
Relying mainly on anecdotal evidence, Gladwell sets out to convince us that success is not just the result of intelligence, ambition, and hard work. The surprising truth, according to Gladwell, is that success has more to do with the opportunities provided by one's family, local community, and society at large.
Gladwell revisits the familiar "nature versus nurture" debate and comes down squarely on the side of nurture. Outliers is a cleverly repackaged version of the "It's not what you know but who you know" message.
What makes this book exceptional is Gladwell's argument that even ambition and hard work are more the product of upbringing than of some unexplainable inner drive. I will grant there is some truth to what he says. But I'm not ready to discount the often overriding influence of individual initiative.
The problem with anecdotes is that it's easy to pick the ones that support your position and ignore the ones that don't. Gladwell tells us that a key success factor in professional hockey, soccer and baseball is birth date. He explains that cutoff birth dates in each age group favor the oldest kids. For example, in Canada the cutoff date for junior hockey is January 1, which means that a kid who just turned ten competes with kids who won't turn ten for another 10, 11, or 12 months. He offers as proof the fact that 17 out of 25 players on the Medicine Hat Tigers' roster were born in January, February, March or April.
I won't dispute Gladwell's conclusion that success in Canadian hockey is skewed by birth date. But towards the end of the book he slips in this little gem: "If Canada had a second hockey league for those children born in the last half of the year, it would today have twice as many adult hockey stars." Sorry, but that does not follow. A second hockey league might reduce the skew, but it would not necessarily double (or even increase) the demand for professional hockey players.
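The roster statistic, at least, holds up to a sanity check. Here is a quick sketch (my own, not Gladwell's) that asks how likely a 17-of-25 split would be if birthdays fell uniformly across the year:

```python
# Back-of-the-envelope check of the Medicine Hat Tigers statistic.
# Assumption (mine, not Gladwell's): absent a relative-age effect,
# birthdays are uniform across the year, so each player has roughly
# a 4/12 = 1/3 chance of a January-April birthday.

from math import comb

n, k, p = 25, 17, 1 / 3

# P(X >= k) for X ~ Binomial(n, p): the chance of a skew at least
# this strong arising purely by accident
tail = sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))
print(f"Chance of {k}+ of {n} players born Jan-Apr by luck: {tail:.4%}")
```

The answer is a tiny fraction of one percent, which is why I don't dispute the skew. It's the doubling claim that fails.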
Gladwell suggests that success in any field requires about 10,000 hours of practice. But it's not just about working hard. Bill Gates had access to a computer terminal at the tender age of 13, and that allowed him to acquire 10,000 hours of programming experience. Likewise, the Beatles landed a gig in Hamburg, Germany, which gave them 10,000 hours of experience performing live. It's all about the opportunity, you see.
I'm not sure Gladwell's math is accurate, but that's not the issue. Bill Gates succeeded as a businessman--not as a programmer. The Beatles got the gig in Hamburg because they were good from the start and were willing to invest long hours playing in seedy clubs.
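For what it's worth, here is the arithmetic behind my doubt, under a timeline assumption of my own (not a figure from the book):

```python
# Rough arithmetic on the 10,000-hour figure. Assumption (mine, not
# Gladwell's): roughly seven years elapsed between Gates first touching
# a terminal at 13 and his leaving Harvard for Microsoft at about 20.

hours, years = 10_000, 7
per_week = hours / (years * 52)
print(f"Required pace: about {per_week:.0f} hours of programming per week,")
print(f"every week, for {years} straight years.")
```

That works out to roughly 27 hours a week, every week, for a teenager with classes to attend. Possible, but it strains belief as a precise figure.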
There is one distinction that seems to escape Gladwell entirely: the distinction between modest success and extraordinary success. Certainly the factors discussed by Gladwell contribute to modest success. Children raised with high expectations will generally do better than those raised with low expectations. But extraordinary success often transcends factors such as birth date, economic class, ethnic group, and education. Consider Michael Faraday: there is no place in Gladwell's scheme for someone who became one of the greatest scientists in history despite an impoverished childhood and limited education.
Gladwell tells us that successful people really aren't outliers at all. They are the beneficiaries of opportunities provided to them by others. But he cherry-picks his examples. I can cherry-pick many more counterexamples.
Wednesday, December 10. 2008
James Gleick’s bestseller Chaos: Making a New Science was published more than twenty years ago. The book introduced the public to an intriguing theory, the mavericks who pioneered it, and some of its most promising applications. So where does the new chaos science stand?
Gleick’s book is interesting and well written, but I found it unsatisfying. To begin with, the word “chaos” is misleading. Chaos implies a total lack of order, yet the phenomena described in the book are all deterministic. In fact, some people call the field “deterministic chaos.” Sounds like an oxymoron to me.
Near the end of the book, Gleick admits that some of the researchers he portrays are themselves uncomfortable with the word “chaos.” The systems they study appear to behave randomly only because they are extremely sensitive to small perturbations in initial conditions. That is true both of complex systems and of simple nonlinear ones.
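To see what that sensitivity looks like, here is a minimal sketch (mine, not Gleick’s) of the logistic map, one of the simple nonlinear systems the book describes. The rule is completely deterministic, yet two starting values differing by one part in a billion soon bear no resemblance to each other:

```python
# The logistic map x -> r*x*(1-x) in its chaotic regime (r = 4).
# Two trajectories start one part in a billion apart; the gap roughly
# doubles each step until the two sequences decorrelate entirely.

r = 4.0
x, y = 0.400000000, 0.400000001

for step in range(1, 51):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: x = {x:.6f}  y = {y:.6f}  gap = {abs(x - y):.6f}")
```

Run it and the trajectories track each other for a few dozen steps, then go their separate ways. Deterministic, yes; predictable in practice, no.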
Don’t get me wrong. I’m not against studying systems that are complex or nonlinear. Some of my best friends are complex or nonlinear.
What bugs me is that Gleick and others use the word “chaos” simply for its mystique. I guess we are supposed to imagine that chaos theory transcends ordinary science and discovers higher truths. Still, I’m willing to cut the chaos kids a little slack. Brochures present products in their best light and resumes do the same for job seekers. Why shouldn’t complex/nonlinear system researchers toot their own horn?
What worries me is that chaos theory promises to revolutionize important applications such as diagnosing and treating heart arrhythmia--but mainly delivers more research proposals. I realize that fractals—which have many practical applications—are now associated with chaos theory, but the mathematics underlying fractals has been around a long time. Chaos theory also promises advances in such far-flung fields as economics, meteorology, and ecology; good luck with that!
An article in Complexity Digest suggests that chaos theory “inspired” development of a method for predicting epileptic seizures. No one can argue with that claim. But it boils down to searching for and finding patterns amidst what appear at first to be random fluctuations. Isn’t that called “pattern recognition”?
It’s hard to trust a science with a misleading name--particularly when so much of its research is funded by taxpayers.
Friday, November 21. 2008
Self-doubt is essential for scientists and inventors. Doubt helps us avoid mistakes. Doubt leads us to discoveries. Unfortunately, it’s easy to say we constantly question our own beliefs--and hard to do.
People try to convince themselves and others that they are healthy skeptics merely because they question and challenge other people’s ideas. Sorry, but that doesn't fly. It’s easy to doubt ideas in which we have little if anything invested. A genuine healthy skeptic challenges his or her own convictions and leanings, and does so repeatedly.
I just finished reading The Pleasure of Finding Things Out, a compilation of speeches and other short works by the American Nobel Prize-winning physicist Richard Feynman. The importance of being skeptical is a recurring theme in the book. My favorite Feynman quote on the subject:
"The first principle is that you must not fool yourself - and you are the easiest person to fool."
Specifically, Feynman said that it’s important for scientists to report all of the details of their investigations, highlighting areas of uncertainty. (Yes, even scientists tend to "spin" their results.) And when following up on recent research by others, they should carefully repeat the original experiments. In Feynman’s view, there is a continuum of certainty. At one extreme is complete doubt, and at the other extreme is complete certainty; science always operates between these extremes. Scientists like to give others the impression they deal with absolute certainty, but that's not true.
I encounter many people who say they are skeptical thinkers, but I encounter very few people who really live by the ideas above. In fact, I see people proudly displaying a distinct lack of skepticism.
For example, I often hear scientists say “We now know (fill in the blank).” What they should say is “We now believe…” or “We now think we know…”
As Edwin H. Armstrong, America’s second greatest inventor (after Thomas Edison), was fond of saying: “It ain’t ignorance that causes all the trouble in this world. It’s the things people know that ain’t so.”
We tend to reduce any topic we are investigating to a few key facts and principles. It helps us to get our arms around the issues. Unfortunately, there is inevitably some loss of information. The lost information doesn’t seem crucial at the time, but it often becomes important later. That's why it's necessary to remain vigilant.
Apply this thinking to controversies such as global warming, intelligent design, superstrings, and health care reform. I'm not suggesting that the accepted view is necessarily wrong. Nor am I saying both sides are partly right. But we should recognize that our own positions are never completely unassailable, that the other person’s position may have merit we have overlooked, and that as healthy skeptics we should be willing to revisit ideas supported by people who are otherwise intelligent and reasonable—no matter how wrong we previously thought they were.
Note: several changes (including the title) were made Dec. 9 to clarify this post.
Tuesday, July 15. 2008
I just read How the Laser Happened, physicist Charles Townes’ memoir about the development of the maser and laser. In addition to chronicling Townes’ seminal contributions to “amplification by stimulated emission of radiation,” the book provides a unique window into the process of discovery and invention. Townes reinforces some conventional ideas about the creative process while totally demolishing others.
For example, Townes recognizes the value of collaboration. However, it’s not clear that he believes individuals become more creative when they collaborate with others. Rather, he understands that ideas from multiple sources may be required to solve a complex problem. Collaboration may entail working together closely—or it may simply mean sharing ideas at a conference. Townes ends the book by concluding that discoverers and inventors must think what no one else has thought and take paths that no one else has traveled. To me, that sounds more like going it alone than collaborating.
Townes describes how respected scientists at first doubted his work. When he began developing the maser at Columbia University, both I.I. Rabi and Polykarp Kusch tried to dissuade him, arguing that it was a waste of resources. One Columbia professor went even further, insisting that elementary physics precluded the maser from working as Townes envisioned. Niels Bohr told Townes, “But that is not possible,” and John von Neumann exclaimed, “That can’t be right.”
To say that most physicists came around once Townes demonstrated a working maser would be a tremendous understatement. Suddenly everyone wanted to climb on the maser bandwagon. A popular joke at the time was that maser stood for “means of acquiring support for expensive research.”
Townes denies that maser and laser development were carefully planned and managed. He attributes the inventions, instead, to the freedom granted researchers in those days to pursue their own ideas. Townes does not come right out and say that planning sometimes impedes scientific discovery, but I think that is what he means.
One of Townes’ most fascinating ideas concerns the scientific method. While most scientists like to test their ideas in the laboratory as soon as possible, Townes prefers to get the theory right first. That way, developing a device is simply a matter of correctly applying the theory. It isn’t that he considers empirical research the wrong way to do science—it’s just that it isn’t the right way for him.
Though Townes describes his method as if it were merely a matter of personal preference, there are obviously some big issues here. The orthodox view is that the scientific method requires experimentation and observation. Townes evidently believes that the underlying principles can all be worked out in advance.
This leads me to the conclusion that throughout his career Charles Hard Townes was primarily a theoretician in the tradition of Maxwell and Einstein. That is, Townes believed he could make discoveries working exclusively in the realm of ideas. Perhaps it was because he wanted to ensure his theories were proved correct that he continued on to the implementation stage. Or perhaps it was because he knew devices based on his theories would open further avenues of research.
Two things are certain. “Amplification by stimulated emission of radiation” has led to breakthroughs in fields ranging from precision time measurement to radio astronomy to fiber optic communication to eye surgery. And Townes made it happen not by following the rules, but by (gently) breaking them.
Monday, May 12. 2008
The latest flare-up in the evolution versus intelligent design dispute has brought out some questionable assertions regarding the role of theory in science. One such claim is that a theory is a hypothesis that has passed a round of tests. That is, there are different degrees of certainty in science; ‘theory’ is simply the intermediate state between ‘hypothesis’ and ‘law’.
That is certainly good news for the proponents of reigning theories. But is it good for science? The history of science shows that relatively few theories endure—and even fewer endure without modification. More important, theory-as-partially-verified-hypothesis blurs the distinction between the facts a theory attempts to explain and the explanation itself.
The French scientist Claude Bernard, widely considered the Father of Physiology, recommended that researchers treat all theories with skepticism. He had good reasons for this attitude. Bernard was at the forefront of the movement to liberate medicine from scholasticism and refashion it as a dynamic, experimental science. His classic work An Introduction to the Study of Experimental Medicine is worth quoting at length (all of the following quotes are found in Part 1, Section III):
The first condition to be fulfilled by men of science, applying themselves to the investigation of natural phenomena, is to maintain absolute freedom of mind, based on philosophic doubt. Yet we must not be in the least skeptical; we must believe in science, i.e., in determinism; we must believe in a complete and necessary relation between things, among the phenomena proper to living beings as well as in all others; but at the same time we must be thoroughly convinced that we know this relation only in a more or less approximate way, and that the theories we hold are far from embodying changeless truths. When we propound a general theory in our sciences, we are sure only that, literally speaking, all such theories are false. They are only partial and provisional truths which are necessary to us, as steps on which we rest, so as to go on with the investigation; they embody only the present state of our knowledge, and consequently they must change with the growth of science, and all the more often when sciences are less advanced in their evolution. On the other hand, our ideas come to us, as we said, in view of facts which have been previously observed and which we interpret afterward. Now countless sources of error may slip into our observations, and in spite of all our attention and sagacity, we are never sure of having seen everything, because our means of observation are often too imperfect. The result of all this is, then, that if reasoning guides us in experimental science, it does not necessarily force its deductions upon us. Our mind can always remain free to accept or to dispute these deductions. If an idea presents itself to us, we must not reject it simply because it does not agree with the logical deductions of a reigning theory. We may follow our feelings and our idea and give free rein to our imagination, as long as all our ideas are mere pretexts for devising new experiments that may supply us with convincing or unexpected and fertile facts.
Bernard is not saying that scientists should be extreme skeptics. Nor is he saying that theories are mere guesses. What he is saying is that theories are attempts to connect the factual dots within a logically consistent model. Theories can help lead us to new discoveries. But we must not let them blind us to facts that don’t fit the model.
Bernard proceeds to describe how unwarranted attachment to theories can lead to error:
If a doctor imagined that his reasoning had the value of a mathematician's, he would be utterly in error and would be led into the most unsound conclusions. This is unluckily what has happened and still happens to the men whom I shall call systematizers. These men start, in fact, from an idea which is based more or less on observation, and which they regard as an absolute truth. Then they reason logically and without experimenting, and from deduction to deduction they succeed in building a system which is logical, but which has no sort of scientific reality. Superficial persons often let themselves be dazzled by this appearance of logic; and discussions worthy of ancient scholasticism are thus sometimes renewed in our day. The excessive faith in reasoning, which leads physiologists to a false simplification of things, comes, on the one hand, from ignorance of the science of which they speak, and, on the other hand, from lack of a feeling for the complexity of natural phenomena. That is why we sometimes see pure mathematicians, with very great minds too, fall into mistakes of this kind; they simplify too much and reason about phenomena as they construct them in their minds, but not as they exist in nature.
He goes so far as to suggest that science can give rise to a unique form of superstition:
...We must trust our observations or our theories only after experimental verification. If we trust too much, the mind becomes bound and cramped by the results of its own reasoning; it no longer has freedom of action, and so lacks the power to break away from that blind faith in theories which is only scientific superstition.
Or as the great American inventor Edwin H. Armstrong was fond of saying, “It ain’t ignorance that causes all the trouble in this world. It’s the things people know that ain’t so”:
It has often been said that, to make discoveries, one must be ignorant. This opinion, mistaken in itself, nevertheless conceals a truth. It means that it is better to know nothing than to keep in mind fixed ideas based on theories whose confirmation we constantly seek, neglecting meanwhile everything that fails to agree with them. Nothing could be worse than this state of mind; it is the very opposite of inventiveness...
Educators would do well to heed Bernard’s words:
In scientific education, it is very important to differentiate, as we shall do later, between determinism which is the absolute principle of science, and theories which are only relative principles to which we should assign but temporary value in the search for truth. In a word, we must not teach theories as dogmas or articles of faith. By exaggerated belief in theories, we should give a false idea of science; we should overload and enslave the mind, by taking away its freedom, smothering its originality and infecting it with the taste for systems.
Finally, he points out that not only religious beliefs, but scientific beliefs, can obstruct the search for truth:
To sum up, two things must be considered in experimental science: method and idea. The object of method is to direct the idea which arises in the interpretation of natural phenomena and in the search for truth. The idea must always remain independent, and we must no more chain it with scientific beliefs than with philosophic or religious beliefs; we must be bold and free in setting forth our ideas, must follow our feeling, and must on no account linger too long in childish fear of contradicting theories...
Today, Bernard is revered by some as a champion of vivisection and shunned by others as an opponent of human clinical studies. Though he was wrong to assume animal research is sufficient, his skepticism towards the use of statistics in human studies was not entirely unwarranted. Scientists would do well to read his book today.
UPDATE May 13, 2008 - 10:00am Eastern:
According to Wikipedia, even the National Academy of Sciences has decided it's time to redefine theory. The Academy reasons that if a theory purports to explain a sufficiently large number of facts, then somehow that validates the explanation.
Never mind that there can be any number of theories to explain the same facts. Never mind that the Academy is essentially tossing out the bulk of philosophy of science.
NOTE: This post was originally titled Claude Bernard’s Take on “What is a Theory?”
Wednesday, February 13. 2008
Today I am inaugurating a new feature focusing on the old: reviews of classic books, articles, and documents in the history of technology.
I just finished reading Michael Pupin's engaging and inspiring autobiography, From Immigrant to Inventor. Pupin's creations included loading coils (which extended the range of telephone lines) and a technique for reducing x-ray exposure times (which made x-ray imaging safe for humans). Pupin was a pioneer in every sense of the word, and it shows in his life story, his straightforward writing style, and his indomitable spirit.
Pupin's intellect quickly outgrew the place of his birth, the small Serbian village of Idvor in the Austrian Empire, so he was packed off to school in the nearby town of Panchevo. His new teachers, recognizing his potential, recommended that the fourteen-year-old be sent to Prague. About a year later, Pupin decided he had outgrown Prague as well, and sold most of his belongings (including his only coat) to buy a steerage-class ticket to New York City. He spent the two-week crossing clinging to the ship's smokestack for warmth and arrived in America with just five cents in his pocket. After a series of menial jobs, he took the entrance exam for Columbia College at age 20 and earned free tuition. Pupin proved to be an outstanding student and athlete, and after graduating from Columbia he pursued further studies at the University of Cambridge and the University of Berlin.
Pupin's idealism, more than anything else, made this a great book (he won the Pulitzer Prize for biography in 1924). Though confronted by much adversity, Pupin always saw the glass as half full. He believed that America was a land of opportunity, and his own life was a shining example. He studied not only science, but the great scientists. He defended America against European charges of crass materialism.
From Immigrant to Inventor reminds me in some ways of Laura Ingalls Wilder's Little House on the Prairie. Pupin tells his story in simple and direct language. He calmly and patiently describes the many hardships he faced--and how he overcame them. He offers a great deal of common sense, even when discussing science.
Unfortunately, the last chapter of Pupin's autobiography focuses on the National Research Council. Though he advocated a balance between pure research and applied science most of his life, in later years he seemed obsessed with big research. I liked Pupin better when he was defending American ingenuity and industry.
I recommend this book to all aspiring scientists, inventors, and entrepreneurs. It's as relevant today as it was in 1923.