Thursday, April 7. 2011
Business experience is key to doing things right the first time, avoiding predictable mistakes, and riding out storms. But some situations call more for guts. Three leading figures from the History of Wireless—Guglielmo Marconi, Paul Galvin, and David Sarnoff—are great examples.
Marconi was a courageous visionary. When others said that wireless wouldn’t work, and even if it did there was no need for it, he only became more determined. That story has been retold many times. Less well known is how Marconi built a reputable business using incredibly primitive and unreliable wireless technology.
Marconi’s wireless technology was a step above smoke signals. The receiver could only detect the presence of a signal. To send a coded message, the transmitter had to be turned on and off at specific intervals and the receiver had to record the signals on a moving tape. Plus, at the time there weren’t separate frequency channels, so competing users had to take turns.
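The on-off keying scheme described above can be sketched in a few lines of Python. This is a toy illustration of the general principle, using the standard Morse timing conventions (one unit for a dot, three for a dash, one unit of silence between elements, three between letters); it is not a reconstruction of Marconi's exact operating practice, and the letter table is truncated for brevity.

```python
# Standard Morse timing: dot = 1 unit on, dash = 3 units on,
# 1 unit off between elements, 3 units off between letters.
MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}  # truncated table

def to_keying(message):
    """Return a list of (state, units) pairs: 'on' while the key is
    pressed, 'off' for the silent gaps the receiver must time."""
    out = []
    for i, letter in enumerate(message):
        if i > 0:
            out.append(("off", 3))          # inter-letter gap
        for j, element in enumerate(MORSE[letter]):
            if j > 0:
                out.append(("off", 1))      # inter-element gap
            out.append(("on", 1 if element == "." else 3))
    return out

# Marconi's famous transatlantic "S":
to_keying("S")  # [('on', 1), ('off', 1), ('on', 1), ('off', 1), ('on', 1)]
```

Note that everything the receiver learns is carried in the durations of signal and silence, which is why a detector that could only sense the presence or absence of a signal was enough.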
Marconi hid his technology’s weaknesses by hiring and training his own operators and offering wireless strictly as a turnkey service. He carefully avoided competition with cable-based systems. Still, he understood that the only way to grow the market was to continuously extend the range of wireless communication. He claimed the first transatlantic wireless transmission and milked it for what it was worth—and then some. The first message consisted merely of the letter “S” sent over and over in Morse code. There was no independent confirmation of the achievement.
But so what? Marconi created an important niche market and quickly dominated it. He wasn't afraid of anything or anyone. In fact, he made a convincing argument for granting his company a monopoly: Competition would endanger the safety of existing maritime users by causing interference.
Paul Galvin was another gutsy entrepreneur. His original goal was to build a successful small business. However, he learned the hard way that you have to aim much higher just to survive. After a string of failures, he started yet another business during the Great Depression. His company, Motorola, manufactured and sold car radios. He proved that a successful business could be built even during the worst of times by offering a lower cost solution.
His son Robert turned Motorola into a corporate giant. But it was Paul Galvin who got the boulder rolling by refusing to give up. Failure simply wasn't an option.
David Sarnoff deserves much of the credit for building the broadcast industry. But he is often remembered as a hard-hearted businessman. That’s an unfair verdict. Sarnoff’s job was to achieve the best financial results for RCA’s employees and shareholders. Even critics acknowledge that amassing personal wealth was never his top priority. He has been accused of cheating Edwin Armstrong (who invented FM radio) and Philo Farnsworth (who invented one of the first television cameras). However, years earlier Sarnoff made Armstrong a rich man by acquiring patents from him for cash and stock. RCA eventually paid Farnsworth a modest sum.
Lost in all of the recriminations is the simple fact that Sarnoff built the consumer market for radio broadcasting based on instinct and guts. He understood that to ensure success he had to think and act big. He needed volume to drive down radio manufacturing costs and attract advertisers. Years later, he introduced television despite fears that it would cannibalize radio.
I’m not saying that the wireless pioneers relied exclusively on guts. They surrounded themselves with experienced people and acquired their own experience on the fly. But what really set them apart was their courage—the courage to look ahead, to never give up, and to act in a big way.
Saturday, August 14. 2010
I've known for some time that America's early industrialists were not "robber barons" but brilliant entrepreneurs and noble philanthropists. For example, John D. Rockefeller created the mass market for kerosene home lighting by driving down prices, and he gave back to society by funding hugely successful efforts in education and medical research.
I was pleased to see that Burton W. Folsom, Jr. provides a nice framework for understanding the early industrialists in his book The Myth of the Robber Barons. Folsom sorts them into two groups: the political entrepreneurs and the market entrepreneurs.
The political entrepreneurs were people like Robert Fulton (steamships) and Thomas C. Durant (railroads). Their reliance on government assistance led to high prices, technological stagnation, and corruption. Market entrepreneurs such as Cornelius Vanderbilt and James J. Hill were innovators who drove down prices and improved service.
I have seen both types of entrepreneurs throughout my career. Today, the political entrepreneurs (better known as "crony capitalists" on the right and "socially responsible entrepreneurs" on the left) are clearly on the ascent. Legitimate entrepreneurs don't need to add "socially responsible" to their job descriptions because they already provide products and services that people want and value. "Socially responsible" is simply another way of saying "political" and it means seeking public money distributed by government officials--often to people with the right connections.
It's sad to see our country repeating this mistake. Our education system, media, and political elite share the blame. The way to turn things around is to once again honor individual initiative and achievement, the source of most generous and effective giving.
Saturday, July 31. 2010
These are evil times. The US economy is in critical condition. The consensus is that North Korea torpedoed a South Korean warship and Iran is developing nuclear weapons. Confidence in Congress has hit a record low of 11 percent, and thanks to Charles Rangel and Maxine Waters it will probably sink even lower.
Many years ago, I had a revelation. It’s simple, obvious, and common sense. People have made fortunes just by repackaging it. Most important, it works.
The revelation may have been said best by Walt Kelly in his Pogo comic strip: “We are confronted with insurmountable opportunities.”
The problem for most of us is not our lack of opportunities, it’s our inability to spot and seize them. And the most frustrating part is that the solution requires just 1% talent and 99% attitude.
Life presents most people with a constant stream of opportunities. But it takes a positive attitude to see and exploit them. Maintaining a positive attitude day in and day out requires a great deal of mental energy. To paraphrase one of the guys who made a fortune selling this bit of common sense, people create videos in their minds of blown opportunities that they keep replaying. The trick is to ignore or stop playing those video memories.
Admittedly, life presents some people with more opportunities than others. And a severe disability can put opportunities out of reach. But I’m convinced that most people get enough opportunities; it’s their attitude that is the biggest obstacle.
The History of Wireless and The History & Future of Medical Technology contain many examples of people who used the power of positive thinking to overcome hardship—including economic adversity. Here are four:
Michael Faraday – Whenever anyone suggests that we are cheating our children by not spending enough on education, I think of Michael Faraday. The son of a blacksmith who suffered from poor health, Faraday received only the most rudimentary education. As an apprentice bookbinder, he educated himself by reading the books he bound in his spare time. Faraday was also the victim of discrimination: rarely in that era did anyone rise above the class into which they were born.
Monday, November 16. 2009
I’ve known Dewayne Hendricks for years as a fellow wireless entrepreneur but only recently had a chance to meet up. Just back from Saipan, he was on his way to speak at an IEEE conference at Southern Illinois University in Carbondale, Illinois. My son and I met Dewayne at Flaco’s Cocina where we talked about the FCC, entrepreneurial spirit, Buckminster Fuller, and ham radio.
Dewayne exudes two things that are in short supply these days—enthusiasm for technology and high performance standards. (Actually, high standards are always in short supply.) He’s spent much of the past 30 years trekking the globe, bringing broadband Internet access to users in developing countries (such as Mongolia and the Kingdom of Tonga), rural areas (such as Indian reservations), and other challenging locations (the Northern Mariana Islands). Think of Dewayne as a wireless IMF (Impossible Missions Force) agent.
Dewayne first came to SIU for a chance to interact with futurist Buckminster Fuller at Fuller’s World Game. Fuller invented the geodesic dome (the world’s first geodesic dome greenhouse can be visited at Shaw’s Garden in St. Louis) and wrote the book Operating Manual for Spaceship Earth. Dewayne ended up as Assistant Director of SIU’s computer center and is proud to call Fuller and Paul Baran (co-inventor of packet switched networks) his mentors.
Like my son and me, Dewayne got hooked on amateur (ham) radio as a teenager. While he pursues new technologies, Dewayne (callsign: WA8DZP) feels that too many hams are only interested in operating and not enough are interested in pushing the technology envelope. He was involved in early efforts to grow the use of spread spectrum in amateur radio, but encountered resistance. That reminds me of the CDMA debate during the early 1990s. When Qualcomm proposed that mobile phone operators use code division multiple access (based on spread spectrum), it was accused of technology fraud. Now there are 500 million users defying the laws of physics...
I was particularly struck by Dewayne’s take on amateur radio’s digital/packet data/Internet capabilities. Hams used to lead the adoption of new technologies; now they are playing with stuff that’s light years behind the commercial sector.
Which brings me to an idea. In order for amateur radio to grow, it needs to attract more young people. One way to do that might be to build a global broadband/mobile amateur radio Internet access network using spread spectrum technology. Perhaps the first steps would be to form a group to study the applications that are likely to appeal to young people; determine what is technologically feasible; and recommend changes to amateur radio service spread spectrum rules.
If that sounds like a wireless Mission: Impossible, then we better call Dewayne Hendricks.
Tuesday, October 13. 2009
This post is the final installment in a series based on my book, The History of Wireless: How Creative Minds Produced Technology for the Masses, published in 2008.
Most Creative People Are Not Team Players
I’ve read a number of books about creativity, and my research tells me that most of them are wrong. The authors go on and on about things such as collaboration, juxtaposition of ideas, and looking for patterns. There may be some truth in what they say, but they miss the point.
The simple truth: most creative people are highly individualistic.
A recurring lesson from the history of wireless is that creative people don’t accept conventional wisdom. As inventor Edwin H. Armstrong was fond of saying, “It ain’t ignorance that causes all the trouble in this world. It’s the things people know that ain’t so.” For example, Armstrong invented frequency modulation (FM) radio after a respected Bell Labs engineer said it wasn’t worth doing.
Creative people don’t have to be right about everything. James Clerk Maxwell’s theory correctly predicted electromagnetic waves. But it was based on the false assumption that space is permeated with a medium for the waves to propagate through—the luminiferous ether.
There are a couple of myths that need to be dispelled. One is that not knowing too much can help—that people who are too well trained develop blind spots. My research suggests the exact opposite. It’s the people who know a field inside and out who are most likely to push beyond its existing limits.
Another myth is that great ideas sometimes just come to discoverers and inventors by chance. Scientists and inventors will often tell you that, but it’s not true. The typical story goes like this: “I was working on idea x for a long time without making progress. So I decided to give it a rest. A little while later, the solution suddenly popped into my mind.” The lesson is that creative ideas don’t just roll off an assembly line. They may need to percolate for a while. I don’t call that luck—I call it checking back later.
A particularly popular myth is that new technologies can’t take off until there are industry-wide standards. This confuses cause and effect. The standards-setting process is extremely political and companies often use it to jockey for better position. If a small company invents a better mousetrap, then the first thing the market leaders will do is call for a standard. At a minimum, it buys them time to catch up. If they are shrewd, they can use the standards-setting process to offset or completely undermine the newcomer’s competitive advantage.
The final lesson is that timing is everything. This is something Thomas Edison understood quite well. It’s not enough to have a great idea; it has to be the right idea at the right time. Nor does the best technology always succeed. Sometimes just good enough for the moment wins.
Coming: The History & Future of Medical Technology
Sunday, September 20. 2009
The National Venture Capital Association (NVCA) recently published a new edition of their Economic Impact of Venture Capital Study. Given the timing, I thought the purpose was to undermine calls for regulating the VC industry.
An article by Vivek Wadhwa (What Have VCs Really Done for Innovation?) claims the NVCA exaggerates the role of VCs in creating and driving innovation--to the detriment of entrepreneurs who often risk everything they own. He claims the NVCA is looking for handouts:
What’s behind the NVCA’s voodoo economics? Even though they vehemently deny it, VCs are looking for bailout money and tax-breaks. After spending so much time, energy and breath in the past decade arguing that government subsidies distort markets, now the wealthy, bloated VC community wants its own handouts.
Wadhwa also links to a story at BusinessWeek (Venture Capitalists Head to Washington) in which he is quoted. The article claims the NVCA is trying to tap the Small Business Innovation Research program. I don't generally trust BusinessWeek, but I must assume the quote by NVCA president Mark Heesen is accurate:
"This is an Administration with individuals who understand and respect technology," says Mark Heesen, president of the National Venture Capital Assn. (NVCA), the industry's primary lobbying group. "And from Obama on down, there is a view that innovation is key to getting us out of this economic situation."
On the contrary, the Obama administration is pushing primitive technologies such as windmills and lifestyle/preventative medicine, while dreaming up new ways to punish advanced technologies such as medical devices (re: AdvaMed) and the Internet ("Net Neutrality").
Saturday, September 12. 2009
Kudos to George Gilder (The Israel Test) for cutting through the fog and showing everyone what the Israeli-Palestinian conflict is really all about.
It’s not about denying Palestinians their land, right to self-determination, or dignity. It’s about the Palestinians’ hatred of Jews and the Left’s hatred of free enterprise.
If you believe that European Jews swooped into Palestine and stole the Arabs’ land, you need to study Middle East history, because that’s simply not what happened. I suggest you start with Mark Twain’s travelogue, Innocents Abroad. Mark Twain did not have a dog in the Jews versus Arabs fight because he wrote his book in 1869—twenty-seven years before the founding of the modern Zionist movement. The book is about Twain’s journey to the Holy Land and his astonishment upon discovering that outside of Jerusalem it was all but uninhabited. Study further and you’ll learn that it was the Ottoman Empire that ruled Palestine up until World War I, and that the British gave the majority of Palestine (76%) to the Arabs in 1922. Considering that Israel has offered to relinquish the West Bank and Gaza since 1967, it’s clear that the Arabs could own 90% of Palestine today—were that what they were really after.
The Israel Test is also unique in that it celebrates Israeli high tech entrepreneurship. I couldn’t help but experience feelings of déjà vu as I read about some of the people and companies. Though I was first introduced to Israel’s fledgling high tech industry in the early 1980s, like Gilder I later met two of Israel’s best ambassadors of high tech, the late David Medved (Chairman of JOLT) and his son Jonathan (venture capitalist and CEO of Vringo). Israel is busy inventing products that save lives, make life easier, and make life more pleasant. What positive contributions are her enemies making?
That leads me to a new idea. Given that Israel is increasingly hospitable to high tech startups, and that the U.S. is increasingly inhospitable (with the exceptions of not-for-profit and “green” enterprises), perhaps this would be a good time for Israel to offer itself as a business haven for American high tech entrepreneurs. I don’t know what if any barriers there are to American entrepreneurs setting up shop in Israel, but I suspect the current Israeli administration would be open to lowering or removing them.
Yesterday I attended a fascinating lecture by James L. Cox, MD on prosthetic heart valve design at Barnes-Jewish Hospital in St. Louis. However, there was one thing about the lecture that struck me as odd—yet consonant with the times. Dr. Cox (who retired from clinical practice) expurgated the names of the private ventures with which he is involved from his slides. For example, in a photograph of the headquarters of one venture the firm’s name on the building was blacked out.
I found that silly. There is nothing wrong with profiting from products that prove useful to others. Plus, there are better ways to allay fears that a presentation is merely a disguised sales pitch. First, provide useful and accurate background information. Second, describe what you feel are your product’s strengths and what competitors and critics say are its weaknesses. Third, trust your audience’s natural skepticism and deal with it directly and honestly.
The main thrust of the presentation was that form should follow function. While some artificial heart valves mimic the appearance of natural heart valves, it’s more important that they mimic the performance of natural valves. Dr. Cox (also known for developing the Cox maze procedure for treating atrial fibrillation) explained that this requires looking not only at basic valve function but also at factors such as turbulence, stresses on adjoining tissue, and so forth. His company, ATS Medical, offers both mechanical and biological valves.
One of the most interesting ideas discussed was percutaneous valve replacement—deploying a replacement heart valve using catheters. The valve is contained in a stent which, once in position, is expanded to push the natural valve leaflets aside. This isn’t a totally new concept—nor is it in widespread use. The “form follows function” design approach can be beneficial here, as well.
Friday, March 13. 2009
Letting yourself become consumed by priority disputes is one of the biggest mistakes that a scientist or inventor can make. While creators and discoverers should seek proper credit for their achievements, it’s important to recognize the most effective ways of securing credit, and to avoid getting caught up in prolonged public spats.
Many disputes arise when a scientist or inventor suggests an idea to someone who then builds on it, acquiring wealth and fame as a result. A good example is the 17th century argument between Robert Hooke and Isaac Newton. Hooke protested that he was first to describe the forces that determine the orbital motion of planets and that Newton failed to recognize his contribution in his great book, Principia.
There is compelling evidence that Hooke was first to suggest the ideas, because Newton acknowledged the fact in private correspondence. However, it was Newton who produced and published a comprehensive theory, and Hooke freely admitted that Newton took the ideas much further than he had considered.
Hooke’s position was weak. The best he could have hoped for was an acknowledgment from Newton. However, Hooke was in some ways his own worst enemy, publicly and aggressively challenging some of Newton's other ideas. It’s understandable that Newton lost whatever sympathy he had for Hooke’s claim based on Hooke’s subsequent behavior.
The lesson of history is that it is not enough to be first to propose an idea. The greatest credit goes to those who conduct a thorough study and either publish their findings or produce an invention based on those findings. In short, history rightly favors those who do something with ideas.
Hooke only made matters worse by continuing to argue his case. His lobbying efforts must have made colleagues uncomfortable—given that most probably wanted to maintain friendly relations with both men. Hooke was a prolific scientist; what he should have done was be sure to follow through the next time he had a good idea. (In fact, Hooke made a habit of jumping from one line of inquiry to another, and rarely carried any through to completion.)
There have been many similar cases throughout history. There comes a time when the plaintiff needs to let go. Being first is not the only determining factor. When Marcel Gley publicly protested that he and not Frederick Banting was first to discover insulin, Oscar Minkowski replied “I know just how you feel. I could also have kicked myself for not having discovered insulin, when I realize how close I came to it.”
Monday, February 2. 2009
What makes a scientific experiment beautiful? A few years ago Physics World asked readers to nominate the “most beautiful experiment in physics.” The results were not terribly surprising. Readers picked Thomas Young’s double-slit experiment demonstrating that light interferes with itself as the most beautiful. My favorite—Rutherford’s experiment discovering the atomic nucleus—came in ninth place.
If beauty is in the eye of the beholder, then I have no reason to complain. However, Physics World also asked readers the reasons behind their selections. Their answers show that readers do not consider this type of beauty purely subjective. Robert P. Crease summarized their reasons as “transformative,” “economy,” and “deep play.”
Those reasons strike me as partly right and partly wrong. What makes an experiment beautiful is that it is cleverly designed, dramatic, and reveals something fascinating about nature. Everything else is fluff. For example, when someone calls an experiment “transformative” it tells us more about modern conceit than the experiment's merit. Like "disruptive" and "path breaking," it's pure cliché.
Thomas Young’s double-slit experiment was beautiful. Unfortunately, it was also somewhat misleading. One purpose of science experiments is to resolve controversies. Thomas Young’s experiment showed the wave nature of light. Other experiments demonstrated the particle nature of light. It wasn’t until much later that a modified double-slit experiment demonstrated wave-particle duality.
I have reservations about some of the other top ten choices. Newton’s “decomposition of sunlight with a prism” illustrated a fundamental concept, but it was hardly very clever. Similarly, the only thing original about Galileo’s “experiment on falling bodies” was his explanation. And Eratosthenes’ “measurement of the Earth’s circumference” relied on a combination of observation and theory—not experimentation.
Rutherford’s experiment showed that while atoms consist mostly of empty space they contain a tiny and relatively massive core he dubbed the “nucleus.” In this experiment, a beam of alpha particles was used to bombard a very thin gold foil. Studying the scattering of the alpha particles, Rutherford found most traveled straight through, some were modestly deflected, and one in 8,000 bounced straight back. (He verified that the bounce-backs were not just a surface phenomenon.) Rutherford famously stated “It was almost as incredible as if you fired a 15-inch shell at a piece of tissue paper and it came back and hit you.”
Rutherford followed that up with another beautiful experiment. He wanted to prove that alpha particles are helium nuclei. He directed an assistant to make a glass cylinder and place another, slightly smaller glass cylinder inside. The wall of the smaller glass cylinder was extremely thin: 1/100th of a millimeter thick. This allowed alpha particles emanating from inside the inner cylinder to enter the space between the two cylinders (after the air had been removed) but not escape through the outer cylinder’s wall. After collecting alpha particles in the space, he zapped them with electricity and examined them with a spectroscope. They showed the spectrum of helium as expected.
A beautiful experiment doesn’t have to be perfect, but it must significantly increase our knowledge of the natural world in a way that is repeatable and verifiable. With all of the distractions that surround modern science, researchers would do well to revisit beautiful experiments from time to time.
Saturday, December 20. 2008
The history of science and technology is riddled with claims and counterclaims regarding discoveries and inventions. Today's Wall Street Journal features this front page article: Alfred Russel Wallace's Fans Gear Up for a Darwinian Struggle. The claim is that Darwin stole many key ideas about evolution from Wallace.
A similar complaint has recently been revived concerning Alexander Graham Bell and the telephone. It's not yet available, but The Telephone Gambit: Chasing Alexander Graham Bell's Secret by Seth Shulman claims Bell stole the key idea for his invention from Elisha Gray.
History usually does a good job sorting out priority. The biggest problem is that people confuse thinking of an invention (or theory) and making it happen. Sure, Darwin borrowed ideas from others, but he put them together in a more convincing package. And Alexander Graham Bell fought off no fewer than 600 challenges to his patents as he and his partners transformed his simple device into a thriving business.
Inevitably, some people's thinking evolves from "Why didn't I think of that?" to "I did think of that." As I describe in my book, The History of Wireless: How Creative Minds Produced Technology for the Masses, Charles T. Jackson claimed he and not Samuel F.B. Morse invented the telegraph. Jackson also claimed it was he and not William T.G. Morton who was first to conceive the use of anesthesia in surgery. Another prolific retroactive inventor was Daniel Drawbaugh; he waited until 1880 to announce that he invented the telephone years earlier. He resurfaced in 1903 claiming he, not Marconi, invented radio.
Wednesday, September 17. 2008
There are compelling reasons to believe that homeschooling is more likely to produce original and innovative thinkers than conventional classroom-based education.
Some of the greatest scientists and inventors were either homeschooled or self-taught. Michael Faraday received a rudimentary education in Sunday school, but taught himself science by reading the books he encountered as an apprentice bookbinder. Inventors Thomas Edison and Alexander Graham Bell were homeschooled. Others such as Joseph Henry, first secretary of the Smithsonian Institution, and I.I. Rabi, winner of a Nobel Prize in physics, acquired their love of science by exploring libraries on their own.
The case for homeschooling is vigorously debated. Most parents homeschool their children for one of three reasons: they are dissatisfied with their local schools, they want to incorporate their religious beliefs, or they are convinced they can provide their children with a superior education. Roughly 1.3 million students (less than 3% of school-age children) are currently homeschooled in the U.S.
Opponents of homeschooling raise several objections. Some claim that most parents are not qualified to teach their children. However, studies show that homeschoolers perform well above average on standardized tests. Given that, many opponents focus on other concerns, the most common of which is that homeschooled children fail to acquire “socialization” skills.
Opponents tend to overlook important facts. Homeschool parents tend not to rely on classroom-style teaching; they prefer that their children learn through self study. Most children do not homeschool in isolation; there are homeschooling groups in many communities and homeschoolers often participate in classes and other educational activities outside the home.
The Education Establishment is quick to point out that its mission is not to stuff children’s heads full of “static facts” but to give them the skills they need to become “life-long learners.” Ironically, it’s homeschoolers who must learn to be self-motivated and to study independently, while children in classrooms are often fed the same material at the same pace.
If fostering creativity is a key goal of education, then we must reconsider some basic assumptions. We tend to believe that public schools are necessary to ensuring universal education—but is that true? The British philosopher John Stuart Mill argued in his essay On Liberty that government-run schools are neither necessary nor advisable. He predicted they would devolve into indoctrination centers, which by nature serve to stifle creativity.
Classroom-style learning may help some students and hinder others. In general, the pace of learning is adjusted to serve the majority of students. While well-intentioned, the slogan “Let no child be left behind” often means in practice “Let no child get too far ahead.”
Classrooms not only encourage group learning, they encourage group thinking. That is not always a bad thing; many employers are looking for people who work well with others. But group thinking tends to promote sameness and discourage contrarian ideas. The point is not that we should abolish classroom learning, but that we should remember its limitations and make selective use of it.
Homeschooling is not right for all parents and all children. People have different education goals and needs. But homeschooling is an important alternative, particularly for parents who want their children to learn how to study and think independently, and for children who would benefit more from opportunities to explore their own interests at their own pace.
There is certainly a time and place for sailing in formation. But to be creative, an individual must be free to chart his or her own course.
Friday, July 25. 2008
I am often amazed by the distrust towards companies and disdain of free markets that pervade articles in publications such as BusinessWeek, Fortune, and sometimes even The Wall Street Journal. The articles assume that businesses are by nature unethical, that free markets work for just a privileged few, and that only aggressive government intervention can safeguard the public.
Two articles about the just announced out of court settlement between wireless technology innovators Qualcomm and Nokia are good examples. In “Qualcomm plays favorites with Nokia,” Fortune Magazine writer Scott Moritz says the deal signals that Qualcomm is no longer licensing its technology on a non-discriminatory basis. And in “Why Qualcomm folded to Nokia,” BusinessWeek writer Jennifer Schenker assures us the deal is not the “win-win” outcome that Qualcomm makes it out to be.
First, here’s a little background about the dispute. Cellular telephone service was launched in the early 1980s. By the late 1980s, the industry began planning the migration to second generation (2G) cellular technologies. Europe and North America selected different standards based on the same underlying digital technology, time division multiple access (TDMA). However, a small company based in San Diego, California—Qualcomm—boldly proposed a technology that was in some ways radically different and that, the company claimed, promised significantly greater capacity along with other benefits. That technology is generically known as “CDMA” (code division multiple access).
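The idea behind CDMA is easy to illustrate: each user's bits are multiplied chip-by-chip by a distinct spreading code, the users' signals overlap on the air, and each receiver recovers its own user's bits by correlating the combined signal with that user's code. Here is a toy sketch in Python using length-4 orthogonal (Walsh) codes and two users; real systems use far longer codes plus power control and other machinery, so treat this only as a conceptual illustration, not a model of Qualcomm's implementation.

```python
# Two users share the channel using orthogonal spreading codes
# (length-4 Walsh codes). Bits are represented as +1 / -1.
CODE_A = [1, 1, 1, 1]
CODE_B = [1, -1, 1, -1]

def spread(bit, code):
    """Multiply a data bit by every chip of the user's code."""
    return [bit * chip for chip in code]

def despread(signal, code):
    """Correlate the received signal with one user's code and
    decide the bit from the sign of the correlation."""
    corr = sum(s * c for s, c in zip(signal, code))
    return 1 if corr > 0 else -1

# User A sends +1, user B sends -1; the channel simply sums the chips.
tx = [a + b for a, b in zip(spread(1, CODE_A), spread(-1, CODE_B))]

despread(tx, CODE_A)  # 1  (A's bit recovered despite the overlap)
despread(tx, CODE_B)  # -1 (B's bit recovered from the same signal)
```

Because the codes are orthogonal (their chip-by-chip products sum to zero), each correlation cancels the other user's contribution, which is how "competing" transmissions can occupy the same spectrum at the same time.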
Qualcomm failed to persuade the industry-at-large to embrace CDMA at that time, but it did manage to convince several key operators to take a closer look. Over the next few years, a series of field tests and trials convinced a number of operators and equipment manufacturers that CDMA was worth the wait. This annoyed some companies backing the already agreed upon 2G standards, and a few consultants and academics started a noisy campaign against CDMA—some going as far as suggesting that CDMA could not possibly work and Qualcomm was engaged in stock fraud.
It was a classic illustration of the saying “The pioneers are the guys with arrows in their backs.” Though Qualcomm was a successful small company, its founders were willing to risk everything for CDMA. They hoped to license the technology to larger companies, but decided to manufacture chips, subscriber units, and network equipment to ensure all of the necessary pieces were in place. Since then, Qualcomm has licensed CDMA to roughly 200 companies, and has sold its network infrastructure and handset businesses. CDMA was selected as the preferred “air interface” for third generation (3G) systems, and there are now about 500 million subscribers using various flavors of CDMA.
It’s widely believed that Qualcomm has been licensing CDMA at a royalty rate of 4.5% (applied to the selling price of devices incorporating the technology). Though this is not an onerous rate by historical standards, and it certainly hasn’t been an obstacle to market development, the mobile phone industry is very large and very cost-sensitive and there has been tremendous pressure on Qualcomm to lower its royalty rates. Going forward, I wouldn’t be surprised if Qualcomm reduces its royalty rates in exchange for other more favorable terms and conditions.
Back to the two articles…
In “Qualcomm plays favorites with Nokia,” writer Scott Moritz flatly states that Qualcomm gave Nokia “a steep discount on royalties that other phone makers won’t get.” I don’t know how Moritz can know this, but I do know why he thinks he can say it. Like many technology licensers, Qualcomm does not disclose the full terms of its agreements.
Moritz bases his assertion on Qualcomm President Steve Altman’s response to a question during the firm’s recent earnings call. Altman said Qualcomm would not automatically extend the same rate to other licensees. But he quickly added that he would look favorably on deals incorporating a similar package of terms and conditions.
Perhaps Moritz is unaware that fair, reasonable, and non-discriminatory (FRAND) licensing does not require uniform royalty rates. Or perhaps he doesn’t understand that licensers may offer discounts in return for other forms of value, such as longer duration agreements. However, I suspect Moritz was simply so determined to find a negative angle that he was willing to create one.
It is possible that Qualcomm made the final accommodation as suggested in Jennifer Schenker’s article, “Why Qualcomm folded to Nokia.” However, it’s clear that Nokia also made major concessions and this complex deal must have been in the works for some time. If Qualcomm “folded,” why did Nokia agree to pay royalties until 2023, acknowledge that Qualcomm possesses essential intellectual property (IP) for 4G, grant Qualcomm free use of related Nokia patents, and even transfer ownership of certain Nokia patents to Qualcomm? Based on that evidence, it would be just as easy to conclude that Nokia folded.
Neither Moritz nor Schenker is willing to accept the deal at face value. It is simply inconceivable to them that two big companies could reach an equitable and mutually beneficial agreement on their own. But there is reason to believe that is exactly what happened. Qualcomm and Nokia occupy different positions in the value chain, so they have different strategic goals. As the dispute dragged on, it became clear they were suffering vis-à-vis their respective competitors. To wit, Qualcomm needs the largest handset maker as one of its customers, and Nokia needs the leading 3G chipset maker as one of its suppliers.
I understand that business journalists feel they must maintain a critical posture. But it is one thing to challenge specific business strategies and tactics, and another to constantly insinuate that businesses are lying or cheating. Qualcomm and Nokia are responsible for many praiseworthy innovations, and the stories behind the two companies are quite inspiring. Unfortunately, business journalists are more interested in attributing bad motives to them than celebrating their achievements.
Tuesday, April 8. 2008
Two frequently cited examples of foolish ideas that were once quite popular are spontaneous generation and eugenics. We now know that maggots and mice don’t spontaneously arise in meat and grain, and that trying to improve the human race through controlled breeding is a dangerous idea.
These ideas are ridiculed today, but they were once supported by intelligent and even illustrious individuals—and not without good reason.
Spontaneous generation was the rational alternative to biblical creation and the perfect complement to Charles Darwin’s theory of evolution. Supporters of spontaneous generation said that given the proper raw materials and environmental conditions, life arises as the result of natural processes.
Eugenics was also a natural outgrowth of evolution. If humans and other species are the result of ongoing evolution, then we must learn as much as we can about the process, and apply that knowledge to the benefit of individuals and society. That was certainly the intention of early enthusiasts such as Alexander Graham Bell.
What brought these ideas into disrepute? Belief in spontaneous generation was an obstacle to understanding the spread of disease-causing germs. The fiercely conservative Louis Pasteur used swan-necked flasks to show that microbes do not arise spontaneously. He influenced Joseph Lister, the developer of antiseptic surgery. And eugenics is now associated with racial discrimination, forced sterilization, and genocide.
It’s easy to portray these ideas as silly or immoral, but neither has been totally rejected. Scientists have shown that simple organic molecules can be created from inorganic substances. Genetic screening, counseling, and abortion are now common practices.
Monday, March 31. 2008
One of the greatest figures in the history of technology was an amphibian. Numerous frogs were martyred to the discovery of bioelectricity and the invention of electrocardiography. Frogs also played a supporting role in the development of the first source of continuous current, a turning point in the study of electricity.
In my History of Wireless, I discuss the “animal electricity” experiments of Luigi Galvani and his debate with Alessandro Volta, driving the latter to construct his “voltaic pile.” Galvani noticed that a frog’s crural nerve could be stimulated wirelessly to create a muscle contraction. This was first demonstrated indoors using a static electricity generator and then outdoors during lightning storms. Later, Galvani discovered he could trigger contractions merely by completing a circuit containing dissimilar metals, though he stubbornly refused to acknowledge that the source of electricity was external to the frog.
Galvani’s mistake drove Volta to perform further experiments demonstrating dissimilar metals can be used to generate electricity. (However, Volta failed to see that a chemical reaction rather than mere contact was the cause.) Armed with Volta’s invention, natural philosophers were empowered to make a series of further discoveries.
Was Luigi Galvani the first person to encounter wireless communication via electromagnetic waves? Close examination of Galvani’s research papers (as explained in the December 1971 issue of IEEE Spectrum by L.A. Geddes and H.E. Hoff) reveals that the muscle contractions in his first “wireless” experiments were due to electrostatic induction. In his book The Ambiguous Frog, Marcello Pera shows that lightning storms also triggered contractions via electrostatic induction. Galvani happened upon a wireless effect, but it was not due to electromagnetic waves. (This is not as odd as it might seem: in the late 19th century Joseph Henry, William Preece, and Oliver Lodge all pursued wireless communication via magnetic induction, which also does not involve electromagnetic waves.)
Galvani lost the debate with Volta but is now rightly considered the discoverer of bioelectricity. His work inspired Carlo Matteucci, who invented the “rheoscopic frog”—a severed sciatic nerve and its innervated gastrocnemius muscle that could be used as a sensitive electricity detector. In 1856, Kölliker and Müller used the rheoscopic frog to observe the electrical activity associated with the beating heart of another frog.
The rheoscopic frog was crucial to the development of electrocardiography. Though early researchers had galvanometers for detecting and measuring electrical current, the response time of those devices was too slow for observing the heart’s electrical activity. The rheoscopic frog was the best electrical test and measurement instrument available for that purpose—until it was replaced decades later by the capillary electrometer and then the string galvanometer.