sharing to minicomputer-based local area networks to broader Ethernet networks and on to the Internet. Each stage in that progression has involved greater standardization of the technology and, at least recently, greater homogenization of its functionality. For most business applications today, the benefits of customization would be overwhelmed by the costs of isolation.
IT is also highly replicable. Indeed, it is hard to imagine a more perfect commodity than a byte of data – endlessly and perfectly reproducible at virtually no cost. The near-infinite scalability of many IT functions, when combined with technical standardization, dooms most proprietary applications to economic obsolescence. Why write your own application for word processing or e-mail or, for that matter, supply-chain management when you can buy a ready-made, state-of-the-art application for a fraction of the cost? But it’s not just the software that is replicable. Because most business activities and processes have come to be embedded in software, they become replicable, too. When companies buy a generic application, they buy a generic process as well. Both the cost savings and the interoperability benefits make the sacrifice of distinctiveness unavoidable.
The arrival of the Internet has accelerated the commoditization of IT by providing a perfect delivery channel for generic applications. More and more, companies will fulfill their IT requirements simply by purchasing fee-based “Web services” from third parties – similar to the way they currently buy electric power or telecommunications services. Most of the major business-technology vendors, from Microsoft to IBM, are trying to position themselves as IT utilities, companies that will control the provision of a diverse range of business applications over what is now called, tellingly, “the grid.” Again, the upshot is ever greater homogenization of IT capabilities, as more companies replace customized applications with generic ones. (For more on the challenges facing IT companies, see the sidebar “What About the Vendors?”)
Finally, and for all the reasons already discussed, IT is subject to rapid price deflation. When Gordon Moore made his famously prescient assertion that the density of circuits on a computer chip would double every two years, he was making a prediction about the coming explosion in processing power. But he was also making a prediction about the coming free fall in the price of computer functionality. The cost of processing power has dropped relentlessly, from $480 per million instructions per second (MIPS) in 1978 to $50 per MIPS in 1985 to $4 per MIPS in 1995, a trend that continues unabated. Similar declines have occurred in the cost of data storage and transmission. The rapidly increasing affordability of IT functionality has not only democratized the computer revolution, it has destroyed one of the most important potential barriers to competitors. Even the most cutting-edge IT capabilities quickly become available to all.
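The cost-per-MIPS figures quoted above imply a steep compound annual rate of decline, which a quick back-of-the-envelope calculation makes concrete (the function below is an illustrative sketch, not anything from the article itself):

```python
def annual_decline(price_start: float, price_end: float, years: int) -> float:
    """Compound annual rate of price decline between two observations."""
    return 1 - (price_end / price_start) ** (1 / years)

# Cost-per-MIPS figures cited in the text
print(f"1978-1985: {annual_decline(480, 50, 1985 - 1978):.1%} per year")
print(f"1985-1995: {annual_decline(50, 4, 1995 - 1985):.1%} per year")
print(f"1978-1995: {annual_decline(480, 4, 1995 - 1978):.1%} per year")
```

Over the full 1978–1995 span this works out to roughly a 25% annual price decline – close to the halving-every-two-years pace (about 29% per year) that Moore's density observation implies for the cost of a fixed amount of computing.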
It’s no surprise, given these characteristics, that IT’s evolution has closely mirrored that of earlier infrastructural technologies. Its buildout has been every bit as breathtaking as that of the railroads (albeit with considerably fewer fatalities). Consider some statistics. During the last quarter of the twentieth century, the computational power of a microprocessor increased by a factor of 66,000. In the dozen years from 1989 to 2001, the number of host computers connected to the Internet grew from 80,000 to more than 125 million. Over the last ten years, the number of sites on the World Wide Web has grown from zero to nearly 40 million. And since the 1980s, more than 280 million miles of fiber-optic cable have been installed – enough, as BusinessWeek re-
harvard business review
HBR AT LARGE • IT Doesn’t Matter
Too Much of a Good Thing
As many experts have pointed out, the overinvestment in information technology in the 1990s echoes the overinvestment in railroads in the 1860s. In both cases, companies and individuals, dazzled by the seemingly unlimited commercial possibilities of the technologies, threw large quantities of money away on half-baked businesses and products. Even worse, the flood of capital led to enormous overcapacity, devastating entire industries.

We can only hope that the analogy ends there. The mid-nineteenth-century boom in railroads (and the closely related technologies of the steam engine and the telegraph) helped produce not only widespread industrial overcapacity but a surge in productivity. The combination set the stage for two solid decades of deflation. Although worldwide economic production continued to grow strongly between the mid-1870s and the mid-1890s, prices collapsed – in England, the dominant economic power of the time, price levels dropped 40%. In turn, business profits evaporated. Companies watched the value of their products erode while they were in the very process of making them. As the first worldwide depression took hold, economic malaise covered much of the globe. “Optimism about a future of indefinite progress gave way to uncertainty and a sense of agony,” wrote historian D.S. Landes.
It’s a very different world today, of course, and it would be dangerous to assume that history will repeat itself. But with companies struggling to boost profits and the entire world economy flirting with deflation, it would also be dangerous to assume it can’t.