For most people, the digital economy refers to the economy conducted over the Internet, but it is much broader than this. The digital economy encompasses the pervasive use of IT (hardware, software, applications, and telecommunications) in all aspects of the economy, including the internal operations of organizations (business, government, and non-profit); transactions between organizations; and transactions between individuals (acting as both consumers and citizens) and organizations.
Just as the development of cheap, hardened steel 100 years ago enabled a host of tools that drove economic growth, today information technology enables the creation of a host of tools to create, manipulate, organize, transmit, store, and act on information in digital form, in new ways and through new organizational forms (Cohen, Delong, Weber, and Zysman 2001).
IT is in everyday consumer products like washing machines, cars, and credit cards, and in industrial products like computer numerically controlled machine tools, lasers, and robots. Indeed, in 2006, 70 percent of microprocessors did not go into computers but rather went into cars, planes, HDTVs, and other devices, enabling their digital functionality and connectivity. Connecting these IT tools is a robust and growing wireless and wireline telecommunications network. Moreover, the technology is anything but static.
As IT continues to get cheaper, faster, better, and easier to use, organizations find new and expanded uses for it every day, as the recent emergence of YouTube illustrates. As some keen observers of the digital economy point out, "At each point in the last 40 years the critical step in the transformation of technological potential into economic productivity has been the discovery by IT users of how to employ their ever greater and ever cheaper computing power to do the previously impossible" (Cohen, Delong, Weber, and Zysman 2001). Cataloging even one-tenth of the new applications being created today in a wide array of application areas and sectors would be a monumental task.
If prices had fallen without performance improvements, the result would have been cheap but not very effective technologies. If performance had improved without price declines, IT would have proven too expensive to put into everyday devices and applications. If both had happened but the technology had remained hard to use, adoption rates would have been significantly lower. Fortunately, all three happened: prices fell, performance improved, and the technology became easier to use.
In 1965, Intel co-founder Gordon Moore observed that as transistors got smaller, the number of transistors that fit onto an integrated circuit grew exponentially. He "challenged" the semiconductor industry to continue this exponential growth, a challenge to which the industry has risen time and again. Each doubling requires innovation, capital expenditure, and risk. In practical terms, the result has been that the computing power of a chip doubles roughly every 18 months. This prediction has held true for over 40 years.
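To make the compounding concrete, a simple back-of-the-envelope calculation (our illustration, not a figure from the text) shows what doubling every 18 months implies over four decades:

```python
# Illustrative sketch (not from the source): the cumulative effect of computing
# power doubling every 18 months over roughly 40 years.
years = 40
doubling_period = 1.5            # 18 months, expressed in years
doublings = years / doubling_period
growth_factor = 2 ** doublings   # about 2^26.7

print(f"{doublings:.1f} doublings over {years} years -> roughly {growth_factor:,.0f}x")
# -> on the order of 100 million times more computing power
```

In other words, sustaining the trend Moore described yields not incremental gains but roughly a hundred-million-fold increase in computing power over the period.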
In 1978, the price of Intel's 8086 processor was $480 per million instructions per second (MIPS). By 1985, the cost of the 386 processor had fallen to $50 per MIPS. Ten years later the Pentium Pro cost just $4 per MIPS. In 2003 the Itanium 2 processor cost half that, at $2 per MIPS. We can see this trend by examining the growth in the number of transistors on Intel processors.
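As a rough cross-check of these figures (our own calculation, not one reported in the source), the implied average annual rate of price decline can be computed from the cited data points:

```python
# Rough illustration (ours, not the report's): the compound annual rate of
# decline implied by the cited dollars-per-MIPS data points.
price_per_mips = {1978: 480.0, 1985: 50.0, 1995: 4.0, 2003: 2.0}

start_year, end_year = min(price_per_mips), max(price_per_mips)
ratio = price_per_mips[end_year] / price_per_mips[start_year]
annual_change = ratio ** (1 / (end_year - start_year)) - 1   # about -0.20

print(f"Implied price decline of about {abs(annual_change):.0%} per year, {start_year}-{end_year}")
```

On these numbers, the price of a MIPS fell by roughly 20 percent per year, compounded, for a quarter century.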
This exponential progress is continuing across many core IT technologies (memory, processors, storage, sensors, displays, and communication). The real price of servers fell approximately 30 percent per year between 1996 and 2001 (Van Reenen 2005). Hard drive storage capacity has doubled every 19 months while the cost of a stored megabyte of data has fallen 50 percent per year.
As a result, the cost of storing one megabyte of information fell from $5,257 in 1975 to 17