Friday, September 17. 2010
Google CEO Eric Schmidt recently said “Between the birth of the world and 2003, there were five exabytes of information created. We [now] create five exabytes every two days.”
I don’t know how reliable the first number is, but I’m sure that digital, online technology has spurred phenomenal growth. It makes content easier and more cost-effective to produce, store, and distribute. That much is obvious.
What’s less well known is that it’s now possible to measure everything. By that I mean (for starters) everything you do, everywhere you go, everything you say, and everything you read, listen to, and watch. It’s also possible to measure the behavior of groups of people, companies, countries, markets, and the entire world.
Isaac Asimov foresaw the possibility of using math to predict the future—at least in its broad outline. He based his Foundation science fiction series on what he called the science of psychohistory. However, Asimov believed that psychohistory would not be accurate enough to predict smaller scale events in the distant future.
An important question emerges: are predictions about smaller scale events unreliable because the laws of physics get in the way? It may be impossible to predict small scale events well into the future, but there’s plenty of evidence that we can predict near-term, small scale events. This is what game theory, behavioral targeting, and sociophysics are all about.
I’m concerned that the propensity to measure and record everything could be harmful to individuals. This information could be used against us in subtle and not-so-subtle ways. Plus, those who collect and analyze the data would have a huge advantage over individuals.
If privacy is dead, so is individual freedom. Avoid, as much as possible, being tracked and profiled.
Monday, August 30. 2010
I remember when people were responsible for their own actions. If you committed a violent crime, people weren't as willing as they are today to accept explanations, such as past traumatic experiences, designed to diminish your culpability. And you certainly couldn't get away with saying that a debate about the location of a mosque caused you to embrace violent extremism.
Something has gone very wrong.
In an Associated Press article by Rachel Zoll, "NYC mosque debate will shape American Islam," graduate student Adnan Zulfiqar suggests that the debate over a proposed mosque at the site where nearly 3,000 Americans were massacred could "make" some American Muslims turn radical:
"They're already struggling to balance, 'I'm American, I'm Muslim,' and their ethnic heritage. It's very disconcerting," said Zulfiqar, 32, who worked for former U.S. Sen. Max Cleland, a Georgia Democrat, and now serves Penn's campus ministry. "A controversy like this can make them radical or become more conservative in how they look at things or how they fit into the American picture."
Is this a plea for fairness or a threat?
Consider the facts. There are thousands of mosques in the U.S. Almost immediately after the 9/11 attacks, our leaders declared that "Islam is a religion of peace" and that the attackers represented a perversion of that religion. There have been very few attacks against Muslims in the U.S. since then. The American people are fair-minded and tend to judge people, as Martin Luther King implored, by the content of their character.
Opposition to the proposed Ground Zero mosque is perfectly legitimate. There are well over 100 mosques in New York City. Opponents urge the developers to build anywhere but the site of the World Trade Center attack. A number of Muslims have spoken out against the proposed mosque at Ground Zero.
Part of the problem is that radical Muslims do not accept individual rights as enshrined in the US Constitution. We adhere to the principle expressed in the saying "I disapprove of what you say, but I will defend to the death your right to say it." Radical Muslims have responded violently to books and even cartoons that disparage Islam. When Adnan Zulfiqar suggests the controversy surrounding the proposed Ground Zero mosque could "make" some Muslims embrace extremism, he is simply demonstrating that he is not comfortable living in a free society.
The US Constitution protects every citizen's right to practice his or her religion. But it's important to understand that the Founders believed that religion and the proper functions of the state are two distinct spheres. The US Constitution does not guarantee the right of Muslims—or anyone else—to impose their values on others or to threaten violence when they don't get their way.
Friday, July 9. 2010
Esther Dyson always brings a unique and nuanced perspective to issues. Speaking at the Health 2.0 Goes to DC conference, Esther proposed that the health ecosystem consists of three markets: Health Care 1.0, Bad Health, and Health Care 2.0. I'm glad she emphasized some of the challenges faced by the nascent Health 2.0 market. But lumping drug abuse, processed foods, and lack of exercise together and calling it the "Market for Bad Health" is a bad idea.
Whether she intended it or not, Esther implies that to varying extents the tobacco, processed foods, automobile, alcoholic beverage, and television industries all make at least some of their money by damaging people's health. I don't think that's true. For example, some processed foods contain added vitamins; others remove natural ingredients that are harmful to people with specific allergies or medical conditions. There is also a legitimate place for foods offering benefits such as convenience or long shelf life.
You could certainly argue that the tobacco industry makes money by damaging people's health--though some smokers live long lives. But there is nothing inherently wrong with food processing (or automobiles and even alcoholic beverages). Some processed foods contain potentially harmful ingredients, just as some natural foods may be improperly handled or stored. For most people, it's probably fine to eat foods containing additives or preservatives once or twice a week. Can the same be said about eating contaminated natural foods?
But what worries me most is that calling a range of products and services the "Market for Bad Health" is an invitation for excessive government intervention and perhaps even social regimentation. We have to accept that some people will choose unhealthy lifestyles regardless of how many educational programs and regulations are created. Plus, many government programs and regulations have unintended consequences. Some things are harmful in ways that are obvious, but beneficial in ways that are not well recognized.
Because if we want to empower individuals to manage their own health and health care--and that to me is the primary virtue of Health Care 2.0--we need to let them make real choices.
Thursday, June 24. 2010
I am somewhat dismayed that less than 50% of Americans recognize that big government endangers individual rights. The good news is that 15% are undecided. These are the results of the Rasmussen Poll announced today.
You only need to study history or visit a cross section of the world's countries to know that wherever there is a lack of freedom of speech, freedom of religion, or freedom of assembly there is sure to be big, despotic government.
Sadly, young people today are rarely taught that the Founders of our country were the first to approach the design of government from a scientific perspective. The Founders examined forms of government past and present with the purpose of architecting a government that recognized the people's unalienable right to "life, liberty, and the pursuit of happiness." It was the brilliantly conceived system of checks and balances that made it work and, for the most part, endure.
It's also sad that 37% of adults believe that big government is a protector of individual rights. Though it's not surprising: even police states have supporters. We've seen this recently in Iran, where a brutally oppressive theocratic government still manages to stage mass rallies in its support and has no trouble finding people willing to beat and kill peaceful protesters. I doubt that 37% of Americans would support a police state, but I'm fairly sure they would be willing to trade some of their individual freedoms for the illusion of government-guaranteed financial security.
The real question is: how determined are the 48% to fix things?
Thursday, June 3. 2010
The following is a transcript of Benjamin Netanyahu's statement regarding the attempt to create an unrestricted supply line to the Hamas terrorists:
Once again, Israel faces hypocrisy and a biased rush to judgment. I'm afraid this isn't the first time.
Sunday, May 23. 2010
Collectivists have been with us since the beginning of recorded history, and will probably be with us for as long as the human race exists. But the popularity of collectivism ebbs and flows.
My theory is that the more prosperous a civilization becomes, the more its idle rich (and just plain idle) are able to indulge in Utopian dreams of a more egalitarian (and necessarily regimented) society. It’s only after the dead bodies are counted that most people recoil from collectivism.
Sadly, we seem to be experiencing another rising tide of collectivism. We are confronted at every turn by exhortations to be good team players, to serve others, and to “give back to the community.” We are asked, as individuals, to make sacrifices to save the planet, reduce the cost of health care, and support an ever-expanding roster of government programs. If you act in your own self-interest, you are bad. If you serve the faceless crowd, you are good.
But questions continue to nag me. If it is better to give than to receive, shouldn’t we do more to ensure that only those truly in need receive? With so much giving going on, it’s hard to imagine there isn’t a great deal of illicit receiving. Personally, I wonder about politicians whose wealth cannot be explained by their comfortable (but hardly extravagant) salaries.
One of the most important indicators of the success of a false ideology is how people frame the issues. You know you are in trouble when the ideology’s language and assumptions are built into almost every discussion. False ideologies are often draped in “fairness” and “the common good,” while anyone who disagrees is labeled “greedy” or “an extremist.”
It never ceases to amaze me how collectivists invent new theories, lexica, and excuses whenever they need them. During the 1930s, it was the workers versus the bosses. During the 1960s, it was the students versus The Establishment. Now it is members of the community versus anyone who believes in limited government.
You know it’s time to worry when collectivists start rewriting (or worse, inventing) history to prop up their ideas. In fact, I was prompted to write this post by Matt Ridley’s essay in this weekend’s Wall Street Journal: Humans: Why They Triumphed.
Ridley claims the reason humans have been so successful (compared to other species) is “collective intelligence.” That phrase is no doubt music to the ears of collectivists. To individualists, however, it is (like “the wisdom of crowds”) an oxymoron. Worse, it leads Ridley to conclude that “innovation is a collective enterprise.”
Ridley conflates the accumulation and exchange of knowledge with collective action. Here is my illustration of how this gets it all wrong. Danish scientist Hans Christian Oersted noticed in 1820 that a compass needle was deflected when an electric current in a nearby wire was switched on or off. British experimenter Michael Faraday followed up that finding and made a series of discoveries of his own about electromagnetism. While it’s fair to say that Faraday built on Oersted’s work, it would be quite a stretch to say they were engaged in collaborative research.
When you boil it all down, Ridley claims that printing, communicating, trading, and even specialization all show that innovation is a group activity. He’s right when he asserts that most of today’s products—whether a computer or a simple pencil—are too complex to be made by an individual. (Sure, manufacturing is generally a group activity.) But he appears blind to the fact that most products result from innovations made by individuals—often in the face of collective resistance.
Saturday, May 15. 2010
In 2007, Garry Kasparov published How Life Imitates Chess, a book drawing parallels between success on the chessboard and success in the boardroom. I’ve gotten glimpses of how he operates in both realms.
In 1999, Kasparov played an online chess match titled “Kasparov versus the World.” The game provided a great opportunity to observe a creative individual battling against the wisdom of the crowd (in this case, with a panel of experts on their side). As you might expect, the world played a very safe and sound game. Kasparov employed just enough subtlety and surprise to score a victory. He understood who he was up against and fine-tuned his strategy for a very smart but ultimately predictable opponent.
The contest reminded me of the difference between a standards committee and an inventor. A standards committee is good for dotting the i’s and crossing the t’s. But if you want to do something that has never been done before, you need imagination and daring.
I had my next encounter with Kasparov ten years later. In 2009, my son entered the U.S. Chess Federation SuperNationals chess tournament in Nashville, Tennessee. Kasparov was the keynote speaker at the opening ceremony. But first, we had to listen (for what seemed an eternity) to USCF dignitaries thanking the many people who put the event together. Finally, Garry Kasparov was introduced, and his words immediately resonated with the predominantly young audience. He talked about how, during his youth, he hated opening ceremonies because of the boring speeches—he couldn’t wait to start competing. The audience roared with delight.
There is much wisdom in How Life Imitates Chess. My favorite line is “Better decision-making can’t be taught, but it can be self-taught.” Modern educators either don’t believe it or won’t admit it, but most learning is an individual activity.
Saturday, May 8. 2010
Many of us welcomed the World Wide Web as a liberating technology. The Web promised to perfect the capitalist system: from now on, all markets would be global and all transactions would occur at the speed of light. Consumers would have instant access to information on any subject. No government would be able to stem the free flow of facts and opinions. With anonymous but verifiable digital cash, even oppressive taxes could be circumvented.
It turned out we were just a tad too optimistic.
It’s not acknowledged as often as it should be, but a falsehood travels as fast as the truth on the Internet. Breaking news stories often contain inaccuracies that spread quickly across the Web—even after the original report has been corrected. There is much good information on Wikipedia, but there is also some bad information. You should not always believe what you read—whether it’s in a book or on the Web.
My wakeup call came about ten years ago when online activists began promoting “direct democracy.” The activists claimed that the United States was established as a representative democracy primarily because at that time there was no simple and timely way to poll a geographically scattered population on the issues as they came up. Thanks to the Internet, it was now possible to directly poll the people, so we could do away with elected representatives. I knew that wasn’t the real reason we are a representative republic and that “direct democracy” was simply mob rule in new clothing.
On balance, it still seemed that the Internet was a liberating technology. A falsehood could spread quickly, but the truth would never be far behind. Though some people were calling for government regulation of the Internet, most users understood the Internet succeeded because it was self-regulated. Besides, no single entity could really control the Internet. There is no central point of control; on the Internet, most of the power resides at the edges. Getting on the Internet is easy and inexpensive for content publishers. And digital technology makes it easy to operate anonymously—beyond the reach of censors.
Still, one point continued to nag me. During the 20th century, several writers warned that if we weren’t careful technology would be used to pacify, manipulate, and even oppress us. Their books, films, and television programs depicted several different dystopian futures. The warnings were particularly hard to ignore because some of what they predicted was already happening.
Patrick McGoohan’s television series The Prisoner, Aldous Huxley’s novel Brave New World, and George Orwell’s novel 1984 are three important examples.
In The Prisoner, McGoohan plays a secret agent who resigns only to be abducted and brought to a place called The Village. Both wardens and prisoners are addressed only by their assigned numbers. A variety of high-tech psychological techniques are used in a relentless effort to get Number Six (McGoohan) to disclose why he resigned. The Prisoner foreshadows the Web in that individuals are known by numbers (analogous to IP addresses) and elaborate schemes are used in order to extract private information (analogous to phishing attacks).
Huxley’s Brave New World depicts a futuristic, class-based society in which people are created in test tubes and trained from infancy to serve defined roles. Technology is used to keep everyone happy—and somewhat numb to reality. Huxley also believed that in the future people would be controlled through subliminal suggestion. In retrospect, the most worrisome thing about Brave New World is how willingly people give up their individual sovereignty for hedonistic pleasures. Though some Internet users still worry about privacy, many more seem comfortable trading their privacy for perks such as free email (example: Gmail).
Orwell’s novel 1984 graphically depicts how brutal leaders might use technology to brainwash, control, and spy on citizens. Big Brother’s primary tool for controlling the upper and middle classes is the “telescreen”—a combination television and surveillance camera. That was a pretty good guess—given that Orwell wrote his book in the late 1940s. Even today, most users probably don’t realize how much information they reveal about themselves when they use search engines, social networks, and cloud computing. A video camera can see your body; a Web access device is a window into your soul.
Do we really have something to worry about? A handful of online companies—most notably Google but also companies such as Facebook—have the ability to gather data on hundreds of millions of users day in and day out. Most of us can hardly begin to imagine what that information reveals about the behavior of individuals, groups, and humanity as a whole. Meanwhile, the U.S. is moving rapidly towards Crony Capitalism, a system in which leading politicians and powerful government agencies forge special relationships with a few large corporations. And there are many examples of how the Web is being used to mislead us—from altering digital photographs to manipulating search results.
Yes, it’s time to start worrying.
Saturday, March 13. 2010
I confess that I have done a complete turnaround on this issue.
Ten years ago, I felt that privacy activists were trying to make life difficult for online businesses and were inhibiting the development of powerful "personalization" technology. I saw little difference between visiting a Web site and walking into a brick-and-mortar store. The word "privacy" does not appear in the Bill of Rights.
Now I fear that PCs and other devices with Web access are becoming the Telescreens depicted in George Orwell's classic dystopian novel, 1984. These devices are two-way, and they are increasingly used to gather information about us—more than most people realize.
And wouldn't you know it: as my concern grows others seem to be backing off. They don't care that their e-mail is being scanned. They don't mind storing their personal documents in the cloud. They don't mind that their location is being tracked by their mobile phone. See Declan McCullagh's article at CNET news.
Now we really have something to worry about.
Sunday, February 28. 2010
Through her Little House series of books, author Laura Ingalls Wilder became one of the leading exponents of self-reliance. The original eight volumes depict life growing up on a frontier family homestead during the 1870s and 1880s. I first read these books as an adult, fascinated by Ingalls Wilder’s description of how her father built a log cabin in Little House on the Prairie.
Technology is, at its most basic level, simply the application of knowledge. Ingalls Wilder’s books are filled with great examples of how resourceful pioneers lived off the land, worked out a system of division of labor with their neighbors, and overcame adversity. A log cabin is hardly what anyone would consider advanced technology, but it demonstrates how a little knowledge plus simple materials provided by nature can be used to great effect. Though it seems Ingalls Wilder’s father moved the family whenever he felt their current location was getting too crowded, at one point he worked for a railroad, and the family benefitted from transportation technology both as a source of supplies and for access to distant markets. Ingalls Wilder’s daughter Rose Wilder Lane would later take advantage of another technology, the printing press.
Today’s technology should be a much greater enabler of individualism—and to some extent it is. Think of the Internet as a virtual printing press and virtual railroad rolled into one. However, the ability to make a living on your own is endangered by tax increases, copyleft, inflation, open source, net neutrality, and crony capitalism. And the reason is simple: many of our leaders have discovered that it is easier for them to amass unearned power and wealth under collectivism.