Monday, April 28, 2003
Blogging: Quantity vs. Quality?
Voltaire famously corresponded with the potentates and literati of all Europe, and still managed to write brilliant plays, write even more brilliant novels, and speak possibly the best French ever.
However, the rest of us need to make choices. The impressive but not-Voltaire-quality William Gibson has recently decided to give up blogging.
Neal Stephenson, who I hereby predict will become exponentially more famous over the next 20 years, has all but forsworn correspondence in order to concentrate on his longer-form, more-structured writing.
Blogging (particularly the near-real-time, stream-of-consciousness variant most practiced at present) is clearly way out on one end of the bell curve of quantity/rapidity vs. quality. That's fine, in and of itself; but where does the value of information and communication lie, and what is the appropriate context for blogging? Some great works happen because of their timeliness; others take decades of strenuous effort. Now that we have the ability, with the Internet, to link and organize many (or all?) sorts of information, we need to think about *how* we're going to do that -- for the first time since, oh, the invention of libraries and bookshelves.
That's a big task. Any ideas?
Monday, April 21, 2003
Distributed Production, User Enablement
This story in the New York Times caught my eye: "3,000 Amateurs Offer NASA Photos of Columbia's Demise"
Quote:
"In the video nation, almost no moment goes untaped... The nearly ubiquitous cameras grab images of mothers slapping their children in parking lots and Rodney King being beaten — and, as it turns out, the space shuttle Columbia during its descent. Some 3,000 people contacted NASA in the days after the shuttle disaster to offer their firsthand reports, still photographs and videos of the shuttle's entry into the atmosphere. Ultimately, some 12,000 videos and images streamed in... And it has paid off: the images have provided a trove of data that has helped investigators piece together what happened in the final minutes of the flight... The imagery has helped NASA guide debris searchers, and has helped to show that the shuttle was shedding parts before the signs of serious trouble appeared on sensors that could be read by mission control on the ground."
So let's get this straight. NASA spends hundreds of millions of dollars on each and every shuttle launch. The Space Shuttle program has cost billions. The U.S. Air Force has also spent billions of dollars on spy satellites. But NASA's centralized bureaucracy failed to field any sensors that captured the shuttle breakup as well as the amateurs did; and that same centralized bureaucracy countermanded a request to the Air Force to turn its extremely powerful imaging satellites onto the Shuttle.
If we assume that each of those 3,000 people spent an egregious $10,000 on their equipment (amateur astronomers can be obsessive!), we are talking a grand whopping total of $30 million, and a bunch of users, to beat the pants off the best efforts of the cubic-dollar-enabled federal government. Realistically, $3 million is probably more like it.
So "pecked to death by ducks" triumphs again.
This is more than a metaphor for the future -- this IS the future. Even the large, centralized bureaucracies are moving towards networks of smaller, cheaper stuff -- from Predator drones to sensors that you can sprinkle from airplanes. Certainly, the almost ubiquitous presence of people with video cameras has changed the world. What happened on Sept. 11th? A cheap parking-lot camera captured the best video of an airliner smashing into the Pentagon.
This is all about a future of user-created content, because the means of production are increasingly in all our hands. Technology compresses and expands, centralizes and distributes, in waves decades in length; in 1909, England had one piano per 10 people, but I would be shocked if one American in 100 plays a piano today, much less owns one. The music of the last century was purchased from highly centralized record companies; now new technologies of production and re-production allow people to make music, or to steal it, with nearly equal ease.
Distributed music production (1909)
Centralized music production (1959)
Distributed music production (2009)
Open Source is yet another example of this trend -- from a computer world controlled by the "high priests of mainframes" in the 1950s, we've moved on to such a distributed future that foreign 22-year-olds who speak Ugro-Finnic languages can rock the foundations of the corporate world. How'd that happen? Put a highly distributed, cheap means of production (a cheap PC plus free coding software) in the hands of lots of clever people, stand back, and watch what happens! User-created code, that's what.
User-created content...
Here I blog. Yet another cheap means of production.
So what does this mean? Well, if we look at the clothing and fashion industry (where production has been cheap for between 50 and 150 years, depending on how you measure it), we can predict a few things:
1) Fashion will become more important than "real" factors such as quality or durability;
2) The people who will become rich will be those who can set fashion;
3) Owning the means of distribution will be a more anonymous, but highly effective, way of making money;
4) Producers will get squashed.
We can already see the fashion/distribution trend emerging in blogging. "Names" like Instapundit and Drudge have a powerful position; and thousands, or tens of thousands, of toiling strivers work feverishly to get an all-important link/break from the big boyz.
Luckily, I don't give a crap if anyone reads this. Striving is such hard work...
Friday, April 18, 2003
Who is that guy?
By the way, I've mentioned Duncan a few times, so in the grand tradition of blogging, there's a link to his blog. Quid pro quo, Duncan?
Craig Venter. Genius. Pain in the Neck.
A few days ago, the Human Genome Project announced that it had completed the full sequence of the human genome. Interestingly, mention of either Craig Venter or Celera Genomics was largely missing from these press releases. It's a far cry from the tensions and "war of words" between Venter/Celera and the bureaucrats of the HGP which Nature reported back in 2000, the first (previous) time that the completion of the genome was announced with great fanfare.
That somewhat preliminary announcement was driven by the HGP's very real fear that Venter and Celera would steal their thunder entirely. The announcements of the past few weeks, which are equally a publicity stunt in that they nicely coincide with the 50th anniversary of the discovery of DNA, conveniently ignore the enormous contribution that Venter and Celera made.
About a year ago, I heard Craig Venter speak in London; at that [pre-blog] time, I wrote up a report. It's interesting, it's technology [sort of -- as much as the war, anyway] and most importantly, it goes against the self-inflating tendency of the world's science bureaucrats; so here it is:
Wednesday, Feb 13th 2002
I attended Craig Venter’s talk at the Royal Institution. The R.I., along with the Royal Society, is one of the two oldest scientific societies in London (and thus the world) and the former home of greats such as Michael Faraday. The R.I. has slowly shifted its mission towards science popularization, and regularly hosts public lectures of varying magnitude, including a UK-famous Christmas lecture which is televised on BBC1 as a holiday special. The speaker this evening, Dr. Craig Venter, is perhaps the most important person in the world at the moment working on DNA. Venter is a combination of famous and infamous as something of an enfant terrible, who in the early- to mid-1990s challenged the entire basis of the long-running, government-funded Human Genome Project. When he was rebuffed by the NIH, which told him his proposed “whole genome shotgunning” methods wouldn't work, he founded Celera Genomics, took in about $100 million of venture capital, and demonstrated in dramatic fashion that his methods did in fact work, by producing an entire DNA map of not only humans, but a long list of other organisms (from plants and animals to bacteria, etc.) in record-breaking fashion. Using Venter’s methods (which the NIH grumpily and quietly adopted, so as to avoid being completely beaten and embarrassed) the current timeline to analyze a new organism’s DNA is about three months, as opposed to the 10-15 years it used to take.
Venter is interesting not only for what he can relate about the current state of DNA research, but to watch as a person -- he's a quintessential abrasive, self-confident, counter-orthodoxy guy who has made huge, important things happen despite a lot of resistance; and in the process, created a lot of critics, if not outright enemies.
A small incident at the beginning of the talk showed this side of him quite clearly. After he walked out into the presentation area of the hall -- the same small, circular, high-balconied theater that Faraday used in the 1800s -- he needed to begin with a formal greeting to the audience. In the UK, when certain persons are present, the form is, "Your Royal Highness(es), Lords, Ladies, and Gentlemen..." As it happened, in attendance this evening was HRH the Duke of Kent (a cousin of the Queen) and plenty of lords and ladies (Lady Archer, the wife of disgraced and jailed politician/Lord/author Jeffrey Archer, was sitting directly in front of me) and thus Venter had to use this greeting. He choked on the word ‘royal’. He simply could not get it out. "Your hrgh-al highness" was as close as he could get, and off he went with his talk. The guy is a completely self-possessed, polished speaker, so I can't imagine it was nervousness -- just the rebelliousness of his soul that has made him a great scientist, made explicit in a single syllable. He’s a large, broad-shouldered man with a huge head, and his entire demeanor screamed, “I’ll be damned if I’ll say ‘royal’ to anyone.”
Another glimpse came in the question session at the end, when someone asked him, "What motivates you?" He went through a long blah-de-blah, about how when you're discovering this wonderful, important stuff that will save lives and make the world better, getting out of bed every day is no problem, etc., etc.; and then, having finished his official answer, grinned like a kid and said, "Plus, I like to win." Which was immediately obvious as the real answer, and everything before it merely a polite dressing of the simple truth.
He gave a masterful -- only word for it -- one-hour talk on the current state of DNA research, and revealed all sorts of interesting facts that I (who follow this sort of thing moderately, not closely) had been unaware of. I had coincidentally just finished reading Richard Dawkins's famous 1976 book The Selfish Gene, which is a paean to DNA determinism, and raises the competition half of Darwin's thesis upon an altar. DNA determinism, which is prominent if not predominant today, relies upon the crucial assumption of a close and programmatic link between DNA ‘genes’ and the expression of very subtle traits in an organism, such as behavior in certain specific situations (e.g. a mother defending her young).
Venter noted that the biggest news out of the Human Genome Project, which caused a bit of a stir last year when it was announced, is that humans only have between 25,000 and 40,000 genes -- which is simply too few pieces of information to be able to express tiny subtleties of construction of one’s body, much less actions, behaviors, and thoughts, in the DNA itself. What this means is that a) the varied (and possibly chaotically fuzzy) *expression* of DNA is far more important than the determinists would like to believe, and b) both nurture and culture are far more important as well. It's sort of a victory for free will, as it were.
Venter also noted a few numbers on how close all we mammals are to each other at the DNA level:
* Humans and mice share 94.5% of their DNA
* Humans and chimpanzees share 98.73% of their DNA
* The *average* difference between one human and another is one 'base pair' (or single code letter) out of every 1,200; of the roughly 3 billion code letters that make up your entire DNA, that means that the average difference between you and someone else is only about 2.5 million letters, which is just a few hundred genes' worth of difference, since much of your DNA is basically wasted space. Just for some context, this piece of writing contains almost 8,000 letters and spaces; the likely DNA difference between you and me amounts to a few hundred pieces this length.
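Just to sanity-check those numbers, here's a back-of-the-envelope calculation (a sketch of mine; the only inputs are the rough genome size and the one-in-1,200 difference rate):

```python
# Back-of-the-envelope check of the human-variation numbers above.
# Assumed inputs: roughly 3 billion code letters in a genome, and one
# differing letter per 1,200 between two random people.
GENOME_LETTERS = 3_000_000_000
DIFF_RATE = 1 / 1200

identity = 1 - DIFF_RATE                   # fraction of letters shared
diff_letters = GENOME_LETTERS * DIFF_RATE  # letters that differ

print(f"identity between two people: {identity:.2%}")            # ~99.92%
print(f"differing letters: {diff_letters:,.0f}")                 # ~2.5 million
print(f"essays of 8,000 characters: {diff_letters / 8000:.0f}")  # ~300
```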
The most interesting part of the talk was Venter's discussion of the ongoing work to build an actual ‘map’ of evolution by creating a hierarchical "tree" of diversification and separation in genes over time. For instance, it's pretty well established (through various other methods) that mammals diverged from other animals 600 million years ago. All mammals share a certain set of genes that can thus be 'dated' to 600 million years, and based on similar techniques (such as carbon- or sediment-dated fossils) all splits (and thus gene differences) between various mammals can be dated with a good degree of accuracy to when in the last 600 million years they occurred. The fact that all humans are basically identical -- 99.92% the same -- means that, in the past 10,000 years or so, there has been effectively zero evolution of people. By performing these sorts of difference checks -- which bits of DNA are shared between people and birds, people and frogs, people and mushrooms, people and bacteria -- you can build a map of which changes (in both modern people and modern mushrooms) happened when, and both the general rates and the exact timeline of evolution.
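The dating trick is essentially a molecular clock: calibrate how fast differences accumulate from one fossil-dated split, then invert that rate to date other splits. Here's a toy sketch of that logic -- the calibration figures are invented for illustration, not Venter's numbers:

```python
# Toy molecular-clock dating: calibrate a rate of change from one
# fossil-dated divergence, then use that rate to date another split.
# The calibration numbers below are invented for illustration.

def rate_from_calibration(frac_diff, divergence_years):
    """Fraction of DNA letters that change per year, from a known split."""
    return frac_diff / divergence_years

def date_split(frac_diff, rate):
    """Estimate when two lineages diverged, given their DNA difference."""
    return frac_diff / rate

# Hypothetical calibration: a split fossil-dated to 25 million years ago
# that shows about 5.3% sequence difference.
rate = rate_from_calibration(0.053, 25e6)

# The human-chimp difference of ~1.27% (the 98.73% figure above) then
# dates that split to roughly 6 million years ago.
print(f"human-chimp split: ~{date_split(0.0127, rate) / 1e6:.0f} million years ago")
```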
Venter put up all sorts of really pretty visual maps of this, which sadly I can't reproduce.
One particularly fascinating example of this involves a mutation of the CCR5 gene (the so-called delta-32 deletion). This mutation exists in about 9% of Caucasian people, but only 0.01% of blacks -- so a significant minority in one population, effectively nonexistent in the other. The mutation is notable, firstly, because it provides almost 100% immunity to catching AIDS. It's thus part of the explanation why AIDS is more of a plague in Africa -- it acts as a ‘damping’ factor in person-to-person transmission among Caucasians, and no analogous damper exists in Africans. What's most interesting, however, is that using the same population divergence techniques described above, the emergence of this mutation in the Caucasian population can be dated to about 700 years ago. Statistically, of course, such a recent emergence is necessary to explain such a significant spread in the incidence of the mutation between two different groups of humans. Circa 700 years ago was, of course, when the Black Death hit Europe (1348 - 1350) and killed off about a third of the population; and the CCR5 mutation, in addition to conferring immunity to AIDS, is highly effective at conferring immunity to the plague. Thus this mutation got a big boost in the Caucasian population as many people who didn't have it died; whereas in Africa, without the plague, it remained a minor factor within the genetic mix.
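To get a feel for how a plague could produce that kind of boost, here is a deliberately crude selection model of my own (a haploid sketch with invented survival rates; the real genetics, with heterozygotes and homozygotes, is messier):

```python
# Crude haploid model of how recurring plague epidemics could inflate
# the frequency of a protective mutation. All survival rates invented.

def after_epidemic(p, carrier_survival, other_survival):
    """Frequency of the protective variant after one epidemic."""
    with_variant = p * carrier_survival
    without_variant = (1 - p) * other_survival
    return with_variant / (with_variant + without_variant)

p = 0.005  # assume the variant starts rare: 0.5% of the population
for wave in range(1, 9):  # plague returned repeatedly, 14th-17th centuries
    # Assumed: carriers nearly all survive; roughly a third of others die.
    p = after_epidemic(p, carrier_survival=0.99, other_survival=0.68)
    print(f"after wave {wave}: {p:.1%}")
# Eight severe waves take the variant from 0.5% to roughly 9%.
```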
Overall, it was a fascinating talk that I've done only a partial job of conveying. Venter is, and will remain, one of the most important scientists of our time, and hearing him speak at length and up close was great. At 7 pounds for the tickets (!) it's an example of the great cultural benefits of being in London.
Thursday, April 10, 2003
Outsourcing War
The precepts of globalization are being applied to our current war in the Gulf.
* Rule 1 of globalization is that intellectual property and capital have power; people don't.
I don't present that in a value-laden way; since I'm in the technology business, which is all about intellectual property and capital replacing people, I can hardly stand on firm ground and say "this is a bad thing". I don't exactly rest easy with the concept, but it's part of the world as we know it, and certainly a part of historical trends going back at least 500 years.
* Rule 2 of globalization is that with technology and capital the limiting factors in production, people (labor) are forced to compete with each other in order to participate in the system.
This is what drives both the desperate attempts of third-tier localities in America to attract manufacturing jobs with tax breaks (I am particularly reminded of the competitions which led to BMW's U.S. manufacturing being based in Spartanburg, South Carolina) and also the desperate attempts of third-tier countries to out-price and out-position each other in order to similarly attract manufacturing jobs. Some of my Gap T-shirts are sewn in Indonesia; some in Costa Rica; some in Thailand; but all of them are $9.95, because each of those very different regions has competed to drive down its costs (i.e. wages) of manufacturing while building the best possible infrastructure to attract foreign capital and technology. Although I don't regularly check the "made in" label on my Intel microprocessors, the same litany of third-tier global locations drives high-tech silicon just as it does low-tech cotton.
* Rule 3 of globalization is that creating legal entities (corporations) to hold rights to capital and technology/intellectual property is the key way in which benefits (profits) are channeled to investors, while reducing as much as possible the benefits (wages) which are channeled to workers. After all, it's called capitalism, not laborism.
Since all contracts and legal obligations are associated with a particular legal entity, this means that workers (who can't create shell personalities, declare bankruptcy and reform, and reconstitute themselves as new versions of themselves) are always at a distinct disadvantage. United Airlines (UAL) has entered bankruptcy protection; this is a formal process by which the corporation (which holds rights to the capital and the technology that is UAL) can alter its relationship with its workers, while in the meantime, it keeps flying and taking customers' money.
So how are we applying the precepts of globalization to war?
Well, as a back story, a long time ago, the U.S. military figured out that the best way to prevent more student riots and burning of draft cards and so forth, as happened in Vietnam, is to create a military largely without people. Nothing against the Vietnam protestors, but most of them were apparently a bit more worried about dying than they were about killing Vietnamese. Ergo: no draft, no protestors. The technology of our armed forces is awesome, and largely it replaces people; lots of the "other guys" still die, and few of us do. But that's just the back story. What about the present narrative?
Last December, the NY Times noted that the U.S. would run its "war games" out of a brand-new command and control center in Doha, Qatar. This was just the foreshadowing; in fact, the war has indeed been run from this base. Qatar is pretty much next door to Saudi Arabia, where we have a very fancy and expensive airbase called Prince Sultan, which was built for the last time we fought a Gulf war. So why did we move our operations to Qatar?
Well, as we're all aware, the Saudis have a difficult internal problem with that bane of globalization, people. Their citizens don't exactly like the current geopolitical moves of the U.S., and the Saudi regime is in the awkward position of either further suppressing internal dissent, or not being able to provide America with what it needs to fight Iraq. What America needs, of course, is a location to base its capital infrastructure (tanks and planes and missiles) and intellectual property (all the cleverness in those weapons, various computer systems, plus the contents of Gen. Tommy Franks and his staff's heads).
So we invest in a new legal entity (Qatar) and transfer all our capital and IP into it, leaving the old legal entity (Saudi Arabia) and its annoying people behind. Kind of like reforming UAL, isn't it? Or, kind of like GM moving plants to Mexico, and ditching all those high-priced, low-productivity workers in Flint. It's all of a kind.
Spartanburg, I mean Qatar, which incidentally is running out of oil and is thus keen to play the globalization game, built *on a speculative basis* that billion-dollar airbase at Doha, in the hope that America would bring its capital and IP over to play. That's a better deal than we offered Turkey! Qatar has only about 750,000 people, and thus is largely free of the public-protest issues which dog the more-populated Saudi Arabia. This is a twist on the normal run of globalization, in which countries (legal entities, remember, just like corporations) usually pimp out their citizens as cheap labor in return for foreign IP and capital; in this case, Qatar is presenting as its key competitive differentiator its *lack* of grumpy people protesting the overwhelming local presence of Western capital and IP.
It's a fascinating case study in the way the world works. The question I keep asking myself is:
If the overwhelming trend in the global marketplace today is Western capital and intellectual property married to cheap and malleable third-world people, where exactly does that leave the people (formerly known as citizens, currently known as consumers) of the West? There's not a clear productive role for us in this wonderful efficient system; and while we're useful for the moment as consumers, surely that can't last forever.
As this war unfolded, I sat back and watched while the capital and technological power of America, plus the generous globalized offerings of Qatar, did my fighting for me. Now, perhaps, I'll get exactly what I've earned through my participation.
Monday, April 07, 2003
Google: The Backlash Begins
Americans love an underdog, and fear and loathe overdogs. Google's emergence from nowhere, with an appealing and seemingly non-commercial public image, touched a chord with a lot of people. Google's increasing dominance of the Web today is starting to scare those very same people. At the same time that Yahoo is changing its strategy [plus bonus link] to be more like fast-rising Google, real questions about the monopolistic power of Google to manage information and even create meaning are starting to emerge. At this rate, Google will quickly become the new Microsoft in ways which are somewhat less appealing than near-infinite positive cashflow and world dominance.
First, there was the GoogleWashing controversy, which I commented on a couple of days ago, below. Now, Andrew Orlowski at The Register -- seemingly a one-man anti-Google-hegemony bandwagon -- has found another crack in the new Darth Vader's armor. In a piece from April 5th, Orlowski points out that Google News is mixing press releases and other corporate PR in with "straight" news. As he frames it, this calls into question the basic journalistic bargain: readers knowing what they can trust to be true. Orlowski gets to the heart of the issue in this snippet:
"Tools you can trust? Transparency in the instruments we use is vital, to ensure the integrity of the system. So we need to know how these editorial decisions are made."
In a completely unscientific sampling of reader letters [which tend to overweight the offended-enough-to-write-in], The Reg's readers get all huffy -- saying things like, "Why is Google shooting itself in the foot? In providing information, credibility is everything."
They're right. And one reason that traditional centralized, hierarchical media retains its hold on our collective consciousness is that it goes to extraordinary and sometimes arcane lengths to preserve our collective trust -- as demonstrated by the self-flagellation of the LA Times over its recent (pretty banal) altered-photo flap.
Contrast this to Sean-Paul Kelley's somewhat cavalier "you caught me" over at the Agonist (see my previous post) and ask yourself who you're going to trust to be straight in the future -- the LA Times, which wants your trust badly enough to grovel for it; the Agonist, which shrugs its shoulders and says, "you decide" -- or Google, which says nothing at all.
I read a lot of good newspapers, and other traditional news sources.
Trouble in Blogistan
It looks like noted blogger Sean-Paul Kelley over at the Agonist has run into a patch of trouble with plagiarism. According to this Wired article by Daniel Forbes, "Much of his material was plagiarized -- lifted word-for-word from a paid news service put out by Austin, Texas, commercial intelligence company Stratfor." You can read the whole story there, so I won't bother repeating it.
This raises the question: What is blogging really good for? If we look at how news is gathered in the world, a great deal of it comes from the creation of a centralized, hierarchical, funded structure of news-gatherers -- like ABC, the New York Times, etc. The "promise" of blogging is three-fold:
1) Since ordinary people are often in the middle of "news" and six degrees of connection link most of us, first-hand reports of "news" can be sourced through a network of blogs.
2) Blogging is an outstanding way to add commentary on top of sourced news from other [often traditional] places. Which is what I'm doing at the moment, presumably.
3) The inter- and cross-linking of blogging provides, in a meta-analytical way, a method for determining what news is the most important. This is all based on some fairly sophisticated information theory about "authorities" and "hubs" and how in-bound and out-bound links connect them. Suffice to say, once all of us have blathered and pontificated, Google and Inktomi and others can come in and derive meaning not only from what was said, but by whom, and how it was all linked together.
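The core of that "authorities and hubs" idea -- Jon Kleinberg's HITS algorithm -- fits in a few lines. Here's my own toy sketch over a made-up link graph, not anything Google or Inktomi has published:

```python
# Toy hubs-and-authorities (HITS) iteration over a made-up link graph.
# Pages pointed at by good hubs become good authorities; pages that
# point at good authorities become good hubs.

links = {  # hypothetical blogosphere: who links to whom
    "instapundit": ["agonist", "smallblog"],
    "drudge": ["agonist"],
    "smallblog": ["instapundit", "drudge", "agonist"],
    "agonist": [],
}

hub = {page: 1.0 for page in links}
auth = {page: 1.0 for page in links}

for _ in range(20):  # a few iterations are enough for scores to settle
    auth = {p: sum(hub[q] for q in links if p in links[q]) for p in links}
    hub = {p: sum(auth[q] for q in links[p]) for p in links}
    a_norm, h_norm = sum(auth.values()), sum(hub.values())
    auth = {p: s / a_norm for p, s in auth.items()}
    hub = {p: s / h_norm for p, s in hub.items()}

# The Agonist, linked by everyone in this toy graph, tops the authorities.
print(sorted(auth.items(), key=lambda kv: -kv[1]))
```

Run over the real link graph of the blogosphere, that same iteration is how a search engine could separate the Instapundits from the toiling strivers.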
1) is the most questionable supposition of all. The Agonist has simply demonstrated yet again that commercial, centralized, hierarchical news organizations are often a superior form of information-gathering. If they weren't, he wouldn't have been scamming their stuff. In the pre-blog days, sites like Ain't It Cool News promised to bring this sort of real-person-driven, six-degrees insider news to the movie industry -- which, after all, in addition to the froth of stars, is made up of a bunch of ordinary craftspeople doing ordinary jobs, who have a very good idea what Keanu Reeves' new Matrix costume will look like, or whatever. Well, it will take you about five minutes at AICN to see that didn't work out very well after four or five years. In addition to Harry Knowles' complete lack of discipline, the insider sources have dried up; what's the benefit to an insider of telling Harry anything? One of his most regular items on the site, Elston Gunn's Weekly Recap, is a report of what's in that week's Variety magazine. So much for competing with the news-gathering power of the traditional media.
2) requires that the blogger have some commentary to add. Duncan [a man with some commentary to add to nearly anything] has some interesting thoughts related to this: the U-shaped curve of information value as related to time. Frothy now-ness provides a lot of the immediate information value; this degrades, and is replaced by the archival value of "true quality" stuff -- be it a Jane Austen novel, or Citizen Kane, or the writings of Winston Churchill or Dean Acheson. I think that this misses the commentary angle; my preferred form is the short essay or column, and I think that one can bring a lot of sense to the backwash, nearly in real-time, through this sort of forum. I would hope that has value somewhat independent of whether I posted it first, or four-hundred-and-thirty-third. I would also hope that it has some lasting value -- though when the Agonist has outlived his fifteen minutes and the link is dead, who will care what I thought about his travails?
3) is the most interesting to me. We may start a company to exploit 3), which means I'm not going to talk about it right now, out here on the public Web.
Friday, April 04, 2003
No, this is a TechBlog. Really.
Duncan writes:
"good stuff! on the war, was the pause a head fake, an attempt to draw the
RG out so it could be destoyed, and to draw out the fedayeen so they could
be identified?"
I reply:
No "head fake" -- I don't think that war is anywhere near as arcane and frothy as our hypermedia perception of it. It's just kicking the shit out of the other guy, but you have to draw back your leg between kicks. The pause was just a part of the normal operational and logistical flow of things. You only have so many trucks, which can only carry so much stuff, and at some point, those same trucks have to stop, unload, and go back in the other direction for more stuff.
Multiply this by the operational planning cycles (what did we use up? What do we need?), the strategic planning cycles (OK, that went well -- now what?), the need to protect the supply lines (which means hauling troops and ammo in six directions for a while), etc., and a pause of a week seems perfectly reasonable. Patton, a man known for rapid advance, had plenty of pauses in his "dash" across France in 1944; he didn't have to deal with media backwash while he built up his forces and planned his next move, because in those days, everyone was still comparing him to how fast Napoleon had moved (plus the fact that in WWI, *no one* had moved).
Much as I dislike Donald Rumsfeld, I have a lot of sympathy for his bewilderment at the media lashing he received after less than two weeks of war, when the worst thing that had happened was a few bullet holes in some helicopters.
At least we sort of got rid of Richard Perle. While he stepped down as chairman, he's still on the Defense Policy Board; a fact which everyone in the media seems to be conveniently ignoring or significantly under-playing.
Thursday, April 03, 2003
The Rogue War Piece on OnoTech -- The Queen of Battle
While this is my very first blog, I've been running a community-blog-like Yahoo Group called LibertyPolitics since Sept. 12th, 2001. I posted the piece below to that forum about 50 hours ago -- on Tuesday, April 1st. If you want to know just how fast our modern media world moves, observe that what I wrote then has gone from shocking to banal in just 50 hours. When I made similar observations to this piece on the 30th of March (100 hours ago) people looked at me like I was honestly crazy. Enjoy my now-conventional wisdom.
---------------------
I've been doing my best to stay away from the reportage of the war. I can probably count on both hands the number of significant events which have occurred thus far. Despite 13 days of hysterical near-round-the-clock coverage by at least 100 global news organizations, there are simply not that many significant events to report.
The most significant event, currently extremely under-reported, is that the American and British forces are winning handily. Now, I know that the backwash of media blather is suggesting that the day after tomorrow, it will be Vietnam all over again, and the Army will be faced with a grim choice between ignominious retreat and victory via many My Lais...
Nonsense.
I have gone on at great length in this [LibertyPolitics] forum about the "democratized power to destroy". We have seen good evidence of this in the troubles that the Fedayeen and other irregular troops have caused the American and British forces by harrying their supply lines. I continue to believe it's one of the most fundamental strategic shifts and societal problems of the coming century.
But you don't win wars with democratized destruction. In a wartime context, this sort of action can only delay losing.
Winning wars is simply concentrating force. Artillery, Napoleon's 'queen of battle.' Tactical air power, which is an incredibly powerful extension of artillery. Strategic air power. Tanks. Soldiers. Guns.
Thus far in the conflict, the Iraqis have completely failed to concentrate any force at all. At every attempt -- five tanks moving together, 20 personnel carriers, 600 Republican Guards -- they are quickly destroyed by the American and British forces. Typically in such engagements, casualties are effectively zero on the English-speaking side. Poorly publicized CIA estimates put the number of dead Iraqi soldiers (a number both sides in the war are hesitant to talk about, for contrasting PR reasons) at about 50,000, and rising. That is more than 10% of the entire Iraqi army, an increasingly disabling rate of casualties.
What is happening now is that the Iraqis are losing the war, quickly. I think that there's a good chance (50+%) the war will be over in three weeks. An even better chance it will be over in six weeks. As in, Saddam dead or captured, the Baath leadership overthrown, the Republican Guards largely destroyed or surrendered as well.
Use your metaphor of choice -- a ratchet, a vise, the incoming tide. The Americans and the British are headed north, and they destroy any opposing force. They may pause, but they never go back. They may be engaged by the enemy, but they never lose. They have been somewhat stymied by the interiors of the cities along the way, but you don't hear stories about irregular Fedayeen forces interdicting supply lines from the cities any more -- that was a one-time surprise, and they're all dead now. Baghdad itself has plenty of regular and irregular forces, but they're getting hammered day in and day out by highly concentrated, deadly force.
I expect at least one significant attempt to use chemical and/or biological weapons in the next couple weeks, as the English-speaking coalition closes with Baghdad. By the standards of the war thus far, there will be plenty of casualties -- maybe a couple hundred, maybe a thousand. Donald Rumsfeld will have photogenic apoplexy. After appropriate off-camera mugging practice, George W. Bush will look stern and pronounce something along the lines of "we told you so". And then the war will go on, briefly, until it ends. By the deadly standard of our explosives and artillery and bullets, these "WMD" will turn out to be nothing more than Jim's "Weapons of Mass Annoyance".
However, it's not all good.
My continuing fear is that once the war is over, the postwar "peace" will start. And that's where we'll see the triumph of democratized destruction. Force and opinion that's not strong or widespread or positive enough to build anything will still be able to survive, and disrupt, and destroy. That's when I worry about America and Britain getting bogged down. Being hated. Being bled, slowly. Not knowing who the enemy is, what the objective is, which way to turn, much less whom to fight. The peace is the problem here, not the war. Can you say, "West Bank"? Can you say, "Northern Ireland"? How about "Aceh" or "Colombia"?
I am not thrilled that we are killing tens of thousands of Iraqis to accomplish our questionable objectives. It is fewer dead Iraqis than the sanctions produced, but it is still a lot of people killed young, in a terrible way. They won't be forgotten by their friends and relatives. I will be very happy to see the world (and Iraq) rid of Saddam Hussein. But most of all, I fear what comes next. I remain convinced this war was a very large mistake, for which the West, and America, will pay dearly for a very long time.
Fears of the future aside, make no mistake about the present. With deadly, unimaginable destructive force -- good, old-fashioned high explosives and troops in the field, killing people and blowing things up -- this war is quickly being won.
If only peace were so easy.
Entropy and Information - the Deletion Problem
Tor Norretranders' excellent book The User Illusion is about, more or less, information technology, consciousness, and reality. In other words, it's broad.
I've been thinking about this book's information-theory-driven claim that the hard part of entropy and information is not gathering data, but throwing it away... this relates strongly to blogging in general, and to our enterprise blogging idea.
The "publish" model of both ERP and traditional news does a good job of circumscribing what's true/real from the vast majority of maybe, early drafts, variants and versions, unproven, disproven, etc. While we chafe at its limitations, we get a lot of benefit from the assurance that the limited dataset within it is highly trustworthy.
Once something enters the blogosphere, it never leaves... even if disproven, or expired because time has marched on. Is the best way to measure the truth/value of something the ongoing, continued, active linking by people back to "old items" of real value, versus the gradual decay of the "link-liveliness" of items which are either wrong or increasingly irrelevant over time?
Studies of academic footnotes have (on a much longer timescale) done a good job of identifying "canon" texts vs. the vast majority of useless research (most published items are never cited by anyone else), plus the whole continuum in between. It should be possible to build a similar hierarchy of trust/truth over time, both across multiple data sources (e.g., blogs) and within a single blog (e.g., I regularly refer and link back to the "good stuff" I have written, because it's relevant to my current writing, but I tend not to refer back to the less-good stuff).
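For what it's worth, here's a minimal sketch of what that "link-liveliness" measure might look like -- the names, the half-life, and every other specific are my own assumptions, not anyone's working system:

    import time

    HALF_LIFE_DAYS = 90.0  # assumed: how fast an un-refreshed link "cools off"
    DAY = 86400.0

    def liveliness(link_timestamps, now=None):
        """Score an item by its inbound links, each discounted by age.

        An item that keeps attracting fresh links stays hot; one whose
        links all date from its fifteen minutes of fame decays toward zero.
        """
        now = now if now is not None else time.time()
        return sum(0.5 ** ((now - ts) / (HALF_LIFE_DAYS * DAY))
                   for ts in link_timestamps)

    now = time.time()
    # Two items with the same raw link count, very different trajectories:
    steady = [now - d * DAY for d in (10, 100, 400, 800)]   # cited again and again
    flash = [now - d * DAY for d in (800, 805, 810, 815)]   # all links are old
    print(f"steady: {liveliness(steady, now):.2f}")  # ~1.4 -- still alive
    print(f"flash:  {liveliness(flash, now):.2f}")   # ~0.01 -- expired

The raw link counts are identical; only the item that people keep coming back to stays above the noise. That's the deletion problem handled by decay rather than by an editor.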
User-Created Content and the Future of Entertainment
The NY Times has just published a somewhat puffy "state of the company" piece about Nobuyuki Idei's vision for Sony.
While there are some interesting elements in the article, including the point that Idei has been unsuccessful in his attempts to get Sony engineers to focus more on software vs. hardware [sure rings true with me!], what I find most interesting is what's missing.
The "convergence" vision that the article discusses is very 1995 -- it's the old, centralized, push-it-to-the-consumer model of music+movies+hardware, with formats and players and content interconnecting in one seamless vertical stack.
* Why did I buy the SACD player? Because the Sony artist CD I wanted was in SACD.
* Why did I buy that other new SACD release? Because I wanted to play something new in my Sony SACD player. Etc.
I think we're all pretty clear that this is dead. Rich's "bits is bits" mantra and the mostly-networked world have pretty much killed off hardware/format-driven verticalization (although the Memory Stick just might have enough legs to make it, as a last gasp for the concept). This is far too fragile a stalk for Sony to hang its hat on, and they really need to come up with something else.
That something is what's missing from the article -- any recognition whatsoever that the biggest trend in the entertainment/media/communications world is user-created content (UCC). The Web started it, with home pages, forum news such as Slashdot, and AOL chat rooms; blogging has extended it enormously, and now much of many people's time online is spent doing email (UCC), surfing eBay (a UCC mall, effectively), or Match.com (where users *are* the content). The success of "reality TV" shows from Survivor to American Idol is yet another aspect of this trend -- both the user-like people on the TV, and the viewers' ability to vote and thus participate, are critical aspects of the experience.
Already, we are seeing the fragmentation of major single-"channel" UCC forums (from CBS's Survivor to eBay) into smaller, self-forming communities. I've found it fascinating that Duncan, Rich, and I are all part of separate, semi-formalized forums for discussing the war and/or world politics. I've been running mine for about two years, and it has about 40 members; I would say that at least 30 minutes a day, and sometimes two hours, are taken up by my reading the forum, researching an essay I want to write, or writing for the forum. You can look at "the Fray" on Slate, or Mickey Kaus's blog, and say, "that's user-created content, and it has a certain relationship to the 'centralized convergence' content of Slate" [Microsoft + MSN + Slate + Comcast is also a classic convergence play], but what you miss is all the little micro-forums like mine and Rich's and Duncan's... which by my measure (and a survey of friends that I've taken) occupy at least 50% of people's time and attention... and growing.
We made some suggestions around "community-based" video ideas to Sony... community is a somewhat discredited concept, because it's been tough to monetize in the ways people have tried in the past. But user-created content is a slightly different (and I think more accurate) take on the concept, and both self-organizing and centralized "editing" methods of selecting which content gets "promoted" (and which doesn't) are crucial to it.
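To make that less abstract, here's one hypothetical way the two selection methods might be blended into a single "promote it or not" score -- the function, the weights, all of it is my own invention, not anything we pitched:

    def promotion_score(user_votes, views, editor_boost=0.0, editor_weight=0.5):
        """Blend self-organizing and centralized selection.

        user_votes / views is the community's verdict; editor_boost (0..1)
        is the centralized "editing" hand on the scale; editor_weight sets
        how much the center is allowed to override the crowd.
        """
        crowd = user_votes / max(views, 1)
        return (1.0 - editor_weight) * crowd + editor_weight * editor_boost

    # An item the crowd loves, vs. one an editor hand-picks over the crowd:
    print(promotion_score(user_votes=900, views=1000))                   # 0.45
    print(promotion_score(user_votes=50, views=1000, editor_boost=1.0))  # 0.525

Tune editor_weight toward zero and you get Top 40 radio before the conglomerates; tune it toward one and you get Top 40 radio after.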
Sony has a giant hole in the middle of its strategy -- that hole is where all the action and ferment is. Not only do they need to do the video TV show that both we and Duncan have thought of, they should be thinking about how to mediate and drive communities of content-creating users in any number of different ways. After all, Top 40 radio used to more or less honestly reflect what people around the country wanted to listen to -- before it got co-opted by the media conglomerates. On scales both large and small, Sony should be thinking about how to return to that world, re-create that world, and make money from it.
If we're not careful, this will turn into classic relationship marketing -- get close to your customers, let them tell you what they want, and then give it to them ;-)
GoogleWash?
I'm becoming a big fan of Orlowski these past few days. Aside from seriously questioning one of my basic premises -- that powerful contextual search can effectively and accurately replace hard-coded data relationships -- this piece mentions in passing that the current estimate of blog readers on the web is just 4%. Interesting stuff.
Ellison says OS will kill MS. In other news, dog bites man
Larry's been predicting again. He's been predicting the demise of MS forever. He's saying things we already know to be true -- that Linux and (eventually) OpenOffice will really complicate MS's life. The two most interesting points here aren't stated:
1) Microsoft IIS is free because Apache is free. Thus IIS's 30% market share is irrelevant, because its value has been deflated to zero by open-source competition. What this means is that the market-share question isn't the most important one -- the most important question is Jack Welch's much-ballyhooed "pricing power," and this example demonstrates that with a viable open-source alternative -- never mind its market share -- pricing power for proprietary software erodes dramatically. I think the odds of people like me shifting to OpenOffice are very small. However, I think the odds of my paying a hell of a lot less for Office 2005 are extremely good, because of the shifting power dynamic.
2) Open source is at least as threatening to Oracle as it is to Microsoft. Where's Ellison's glee on that point? It will be funny to watch the two putative "richest men in the world" and their tit-for-tat competition race each other's fortunes to the bottom over the next 5-10 years. Warren Buffett will laugh.
Ellison says, "The computer industry is finally moving from a cottage industry to an industrial industry. We're moving at breakneck pace toward the 19th century."
Has he checked what happened to manufacturing margins when the industrial revolution occurred? They went from high double-digits within guilds to high single-digits in factories. Volume massively increased. The magic of software over the past 20 years has been that it's retained high margins but achieved massive volume. This combination has bought Ellison many yachts. I really don't think he wants software margins to collapse to industrial levels... but he's right, it's going to happen anyway.
What are Oracle shares worth with 9% margins and near-zero forward revenue visibility? Hmmm.
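Back-of-the-envelope, with every number assumed purely for illustration (roughly Oracle-scale revenue, a multiple compressed to reflect the lost visibility):

    def equity_value(revenue, margin, pe_multiple):
        """Crude valuation: earnings = revenue * margin; value = earnings * P/E."""
        return revenue * margin * pe_multiple

    revenue = 9.5e9  # assumed: ~$9.5B in annual revenue
    software_world = equity_value(revenue, margin=0.35, pe_multiple=25)
    industrial_world = equity_value(revenue, margin=0.09, pe_multiple=12)
    print(f"software margins:   ${software_world / 1e9:.0f}B")   # ~$83B
    print(f"industrial margins: ${industrial_world / 1e9:.0f}B") # ~$10B

An order of magnitude disappears. Hence the "Hmmm."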