The Pulse of Technology - Keeping Pace With Continuous Change - November, 1998

Gordon Moore, the co-founder of Intel Corporation, first postulated the now-famous Moore's Law in 1965 and refined it in the mid-nineteen-seventies. Moore's Law states that the processing or computational power of silicon chips will double every twenty-four months, while pricing for these chips will halve in the same period. This law has held relatively constant for over twenty years. We are now approaching a time when this seemingly immutable law is becoming outdated. In fact, new silicon chips are doubling in power every twelve to eighteen months, while pricing is being halved in even less time. What has happened to the underlying technology that drives these silicon chips, and what market forces have dictated rapidly declining prices?
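To make the arithmetic concrete, here is a minimal sketch in Python (the language, function name, and parameters are illustrative assumptions, not part of the original article) projecting relative power and price under a fixed doubling period:

```python
# Minimal sketch: project Moore's-Law-style growth, where power doubles
# and price halves every `period` months. Purely illustrative.

def project(months: int = 120, period: int = 24):
    """Yield (month, relative power, relative price) at each period."""
    for month in range(0, months + 1, period):
        doublings = month / period
        yield month, 2 ** doublings, 1 / 2 ** doublings

for month, power, price in project():
    print(f"month {month:3d}: power x{power:6.1f}, price x{price:.4f}")
```

Shortening `period` from 24 to 12-18 months, as described above, makes the same curve steepen dramatically.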

There are several factors that lead to this inexorable increase in processing power, just as the same factors exert downward pressure on prices. Let's look at several of them in the context of hardware developments, software developments, and the rise of the Internet as the ubiquitous network that many people predicted would be necessary to make computers universally acceptable in daily life.

Hardware Development

When Intel was founded by ex-Fairchild developers, the mid-range computer - as personified by the DEC PDP series, Data General machines, the IBM System/32 and System/34, and the first HP boxes - was the emerging standard in the computer industry. Machines of this period were often viewed as departmental machines, required to perform quick, hands-on computing applications free from the control of the centralized (i.e., mainframe) I.T. staffs of the time.

The idea of a small, nimble machine that could be programmed and developed by local departments was extremely appealing at the time. Because of the diversity of manufacturers and proprietary operating systems, standards were largely undeveloped, and competing platforms jockeyed for position. Migration from one machine to another was largely unheard of, due to the high costs of moving data and application programs - not to mention the high training costs for I.T. staff.

The acceptance of UNIX as an open standard marks a watershed in the history of computing. For the first time, applications programs could be developed that were cross-platform - that is, capable of running on alternate hardware platforms. This newfound freedom allowed software programmers to write a single application that could be run on multiple machines. The importance to hardware developers was simple - they could spend more time on the refinement of the underlying silicon, and less time developing proprietary hardware systems. It is this process of refinement that has marked the decrease in cost of silicon that we know today.

The advent of the personal computer in the late nineteen-seventies and early nineteen-eighties marked another watershed in the development of hardware. Where mid-range computers allowed entire departments to break free of the constraints of mainframe computing, the PC brought computing to the thousands of business users who wanted the ability to perform analysis and data gathering at their convenience, not that of the I.T. department. For the first time, individuals could analyze, store, and retrieve large amounts of data without having to master a computer language, and they could perform these tasks at their own pace. This device transformed the business world, putting computations once performed by large mainframe computers into the hands of everyday users. This breakthrough spirit was best embodied by Apple Computer, and symbolized in its famous "1984" Big Brother commercial. Aside from its edgy attitude, Apple also pioneered consumer use of the floppy drive, the mouse, and the graphical user interface, making computing more accessible to everyday users. Ergonomics had been largely ignored in computer design and manufacture; Apple changed all that with the introduction of the Macintosh line of PCs, and drove hardware device design in a way previously unknown.

For all its innovation and edge, Apple made a mistake similar to that made by the competing mid-range computer makers of the mid-seventies: its OS (operating system) and architecture were proprietary. Fearing that licensing would erode its technological leadership, Apple kept its systems and hardware closed, and opened the door for a technically inferior product to gain a foothold that it has not yet relinquished.

In 1981, IBM introduced the first IBM PC. This device was, by most standards, technically inferior to the Apple. It possessed a slower processor, was bulky, and used a text-based approach to computing. Yet, despite these shortcomings, it and its brethren, the so-called IBM-compatible machines, have dwarfed the Apple offerings over the past two decades. Why? Unlike Apple, the IBM-compatible machines were based on an open architecture. The specifications for these machines were designed so that third-party vendors could develop hardware and software for them. In a sense, the best ideas from the best manufacturers get adopted and become the de facto standard for that particular piece of hardware.

The final piece of the hardware development puzzle emerged in 1985 or 1986 in a somewhat unheralded manner: the adoption of PC networking. Initial reactions to the PC network concept were, for the most part, negative. Individual users feared that networked computers would once again lead to I.T. control of what had been, until then, personal computers. Once PCs were networked, control would be wrested from users back to the large mainframe computing departments of the sixties and seventies.

As it turns out, the PC network actually allowed individual users to communicate effectively once the infrastructure was in place to wire offices. Instead of wresting control away from users, the PC network allowed sharing and collaboration at previously unheard-of levels. A new concept developed as a result of the PC network, known as the "network effect": the more people share information in a group, the more powerful the group becomes. Users gain more utility as more people, data, and ideas are shared, and those left out of the network see their productivity and connectivity suffer. It is now important to be connected, and users face the prospect of being stranded if they are not part of the larger network. The "network effect" is similar to the way large public libraries or databases become more useful as more information is stored there.
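One common way to formalize the "network effect" - often attributed to Metcalfe, though the article does not name him - is to count the possible pairwise links among n connected users. A hypothetical sketch:

```python
# Hypothetical illustration of the "network effect": the number of
# distinct user-to-user links grows roughly with the square of the
# number of members, so each new user adds more value than the last.

def pairwise_links(n: int) -> int:
    """Distinct connections possible among n networked users."""
    return n * (n - 1) // 2

for n in (2, 10, 100, 1000):
    print(f"{n:5d} users -> {pairwise_links(n):7d} possible links")
```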

To summarize, several trends can be seen in hardware development that drive the pace of change in silicon. First, the move away from mainframe systems to mid-range systems supporting open standards. Next, the development of personal computers that encouraged users to take control of data manipulation, storage, and retrieval. Then, the development of an open architecture and OS that allows standards to be set on the merits of the product, not through a proprietary system. Finally, the development of a networked office in which the power of the network grows as more users are added.

These trends will continue, and likely accelerate, as users demand more functionality in a smaller and smaller footprint. The acceptance of PDAs (personal digital assistants), cell phones, and pagers will fuel consumer demand for devices that are easier to use and always connected. The convergence of data and voice transmission over the same carrier network will lead to increasing features and lower price points for machines that serve multiple uses - telephone, pager, PC, Internet access - at the same time.

Software Development
Early software languages were developed to instruct computers in binary machine code. These machine and early assembler languages were very basic in function and instructed computers to perform what we would now consider routine run-time and maintenance tasks. Tedious to write and compile, they had none of the programmer conveniences we take for granted today, such as debugging and writing tools that make the programmer's job easier. These languages have become known as first-generation computing languages.
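As a loose modern analogue - Python's dis module, which obviously postdates the era described - the instruction-by-instruction stream hiding behind even a one-line function hints at why hand-writing such code was tedious:

```python
# Illustrative only: Python bytecode is not 1950s machine code, but the
# low-level, one-instruction-at-a-time style is a reasonable stand-in
# for what early programmers wrote by hand.
import dis

def add(a, b):
    return a + b

dis.dis(add)  # prints the LOAD/ADD/RETURN instruction stream
```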

As engineers struggled to make the interaction between computer and user more intuitive, a new series of languages was developed, such as Fortran and Cobol - the first designed primarily as a scientific programming language, the second as a business programming language. These languages added editing and debugging features and were written in something resembling English-language commands.

The development of Cobol coincided with the widespread commercial use of mainframe and, later, mid-range computers. Other languages, such as PL/I and RPG II, were also adopted on mid-range computers and could arguably be called the first examples of third-generation computing languages. These newer languages incorporated more English-like commands and syntax, and built new debugging and editing features directly into the language. As the basic language structure evolved, so too did the application programs being developed. Early in the development of computer languages, a schism formed between the class of software that performed routine maintenance and run-time chores, which came to be known as the operating system (or OS), and a second class that performed specific tasks, such as running a payroll or updating inventory, which became known as application software.

The widespread use and adoption of second- and third-generation programming languages corresponded with the growing use of mid-range computer systems. So too, the proliferation of application programs led to a growing acceptance of these departmental computer systems. In fact, the use of departmental computers was tied to efficiently designed and executed single-purpose programs - such as inventory control or payroll processing - that were often run within a self-contained business unit.

As computer hardware evolved from mainframe to mid-range systems, the need for a computing system that allowed multiple users to access the machine and perform independent tasks increased greatly. A group of Bell Labs scientists created such a system in the late nineteen-sixties: an operating system that supported multiple users and performed multiple tasks at the same time. This system, known as UNIX, seemed ideally suited to the new computing environment and caught on quickly. The combination of departmental computers and UNIX led to the development of distributed computing.

The advent of the personal computer accelerated the trends that were beginning to emerge in the distributed computing model. This model allowed computing power to be located in the hands of those people who required immediate use and manipulation of stored data, while at the same time providing network connectivity.

The original PC operating system, or DOS (disk operating system), was hardly the blueprint for distributed processing. The introduction of the IBM PC in the early eighties married an under-powered processor, memory configuration and hard disk drive (when available) to an anemic OS. Yet, this original machine would morph in rapid succession to a robust group of machines with network capabilities.

The catalyst for this change came in the form of early network cards that allowed PC users to connect to mid-range machines or other PCs. These early adopters were driven primarily by a desire to share files or hardware devices (such as printers or larger hard drives) among work groups. Within a short period of time, a specialized class of OS - the network operating system - was developed to handle these chores more efficiently, with Novell being the most recognized provider. As the capabilities of network operating systems expanded, new hardware devices were developed to take advantage of the specialized nature of network computing. In short order, file servers, print servers, and application servers (PCs dedicated to hosting application programs in one location) became commonplace.
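As a very rough modern analogue of that file-server role (using Python's standard http.server; the port is an arbitrary assumption, and Novell-era network operating systems worked quite differently), a few lines can expose a directory to other machines on a network:

```python
# Crude modern stand-in for a file server: share the current directory
# with other machines on the local network. Illustrative only.
from http.server import HTTPServer, SimpleHTTPRequestHandler

server = HTTPServer(("0.0.0.0", 8000), SimpleHTTPRequestHandler)
print("serving current directory on port 8000")
server.serve_forever()
```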

At about the same time as the development of the network-computing model, a sea change occurred in the way users interacted with their machines. Until then, most application programs were relatively unchanged from their mainframe and mid-range counterparts: for the most part text-based, with some graphical elements thrown together in a jumbled, clumsy way. Once again Apple led the change, in the form of the Macintosh graphical interface, which was intuitive to use. Instead of invoking arcane command-line instructions, users could point and click at an object on the screen and launch a file, program, or document with ease. The basis for the Apple graphical user interface, along with the point-and-click device (the mouse), was conceived, but not commercialized, at the Xerox PARC facility in Palo Alto in the seventies. Microsoft developed its own version of the graphical user interface with its Windows "operating environment." The first two versions of this environment ran on top of its famous DOS system in a somewhat ungainly manner; Microsoft finally got the user interface right in Windows 3.0. In similar fashion, Microsoft incorporated many of the benefits of the network operating system into Windows 3.11, and later improved both the operating system and the network features with Windows 95.
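The shift from command lines to point-and-click is, at bottom, a shift to event-driven programs. A minimal sketch using Python's tkinter (a stand-in that shows the interaction model, not the Macintosh or Windows APIs of the time):

```python
# Event-driven sketch: instead of typing a command, the user clicks a
# widget and the program reacts to the event. Illustrative only.
import tkinter as tk

root = tk.Tk()
root.title("Point and click")
tk.Button(root, text="Open file",
          command=lambda: print("file opened")).pack(padx=40, pady=20)
root.mainloop()  # hand control to the event loop
```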

The stage was now set for the next "big thing" in computing. Once again, this next wave had its origins in the nineteen-sixties, only to appear as a full-blown implementation in the nineteen-nineties.

The Rise of the Internet
The Internet was conceived in the nineteen-sixties as a way to link the computing resources of several west-coast universities. At the time, computational power was expensive, and shared resources were a way to defray the costs of deploying large systems. At the same time, the U.S. government realized that if it could build a similar networked structure, it would be difficult for a national disaster to totally disrupt computer operations and data. Over the next two decades, more universities and governmental agencies were added to this patchwork quilt of networked machines.

In order to link disparate machines running different operating systems, a common file transfer procedure was required. FTP (the File Transfer Protocol) was developed for this purpose, allowing different machines to communicate effectively. Similarly, a method of routing these files and messages across different locations was also required. Out of this requirement came the TCP/IP protocols, which determine how a file is routed through the system. These two developments supplied the backbone of what was to become the Internet.
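A minimal sketch of the file-transfer idea, using Python's standard ftplib (the host name is a placeholder, not a server from the article):

```python
# Sketch of an FTP session: connect, log in anonymously, and list files.
# The same exchange works regardless of the server's operating system,
# which was the point of a common transfer protocol.
from ftplib import FTP

with FTP("ftp.example.com") as ftp:  # placeholder host
    ftp.login()                      # anonymous login
    ftp.retrlines("LIST")            # print the remote directory listing
```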

Throughout the eighties, the Internet remained the domain of the scientific and academic communities. Visionaries imagined that this network could be used to connect people easily across great distances and multiple computing platforms. This vision awaited the development of a device that allowed files to be easily viewed across multiple platforms. A group of computer scientists at the University of Illinois came up with the idea of a web browser, a program that allowed people to view files in a graphical manner. The first widely used web browser, known as Mosaic, was released in 1993. This browser allowed people to easily locate and view files on the Internet, and helped turn the WWW (World Wide Web) into a mass phenomenon.
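Half of what Mosaic did - fetch a document from a remote machine so it could be rendered - can be sketched in a few lines with Python's urllib (the URL is a placeholder):

```python
# The fetch half of a browser's job: retrieve a document over HTTP.
# Rendering it graphically was Mosaic's real contribution.
from urllib.request import urlopen

with urlopen("http://example.com/") as response:  # placeholder URL
    html = response.read().decode("utf-8", errors="replace")
print(html[:300])  # show the start of the raw HTML
```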

Essentially, the development of the WWW has allowed users to find files and communicate across a worldwide network. The web has transformed an arcane file-messaging system into a new medium, one that is growing faster than any other medium in history. The growth of the web is based on several of the trends noted earlier:

1. The open nature of the web allows contributions from multiple sources and computing platforms.

2. Contributions are not limited to programming professionals. People with very little computer training can contribute files, articles, and information on the web.

3. The web is suited to the dynamic nature of business and personal life. It no longer takes weeks, months, or even years to develop applications - these tasks can now be performed easily and in a short period of time.

4. As more people become accustomed to the web, and as access prices drop, PC purchases increase, causing further downward pressure on hardware prices. The costs of hardware and web access have been declining by 15% to 20% per year for the past several years (see the sketch after this list).

5. The web is the ultimate "network effect." The more people participate, the more information is available, and the more critical it becomes to be included in the network.

6. The web has developed a new concept of speed. "Internet time" is a recently coined term for development cycles that run roughly seven times faster than "real" time. This notion of speed has spilled over into Internet business life, where all aspects of running an Internet business - sales, procurement, deal making - occur at warp-speed rates.

7. The economics of web space seem to defy business logic and gravity. People have developed a notion, rightly or wrongly, that information and services provided on the web are free. This has led web companies to develop unusual approaches to raising revenue in this new medium. At the same time, the stock prices of web-based companies have achieved phenomenal valuations, seemingly unsupported by any need to have revenues or make earnings. This seeming dichotomy between a lack of tangible earnings and high stock valuations will continue for a while. The race for position on the web is a market-share grab in what could become the largest medium invented to date. In addition to sheer size, the web promises the Holy Grail of media - the ability to interact directly with a consumer to influence purchasing behavior.

8. The notion of competitive advantage - the idea that a company can gain a foothold over competitors through focus on a series of core values or competencies, as Wal*Mart has built with logistics and deployment, or GE with developing management talent - is being dismantled by the web. The web is the ultimate leveling force. A site can be developed and released on the web and, in a matter of a few months, spawn dozens of competitors, many with improved features or benefits. In such an environment, the notion of sustainable competitive advantage has no real meaning unless it is measured in weeks or months, not years or decades.
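For the price decline cited in item 4, a hypothetical bit of compound arithmetic (the $2,000 starting price is an assumption for illustration):

```python
# Hypothetical compounding of a 15-20% annual price decline, starting
# from an assumed $2,000 machine.

def declining_price(start: float, annual_drop: float, years: int) -> float:
    """Price after `years` of a fixed annual percentage decline."""
    return start * (1 - annual_drop) ** years

for rate in (0.15, 0.20):
    prices = [round(declining_price(2000, rate, y)) for y in range(5)]
    print(f"at {rate:.0%}/yr: {prices}")
```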

A Vision of the Future

Given the developments that have occurred in computing during the past thirty years, how will we be affected by technology in the future? What are the trends that will affect us in the next several years, and how can we prepare for what many believe is a tumultuous, if exciting, future?

The most important trend we face is the pace of change occurring in hardware, software, and bandwidth on the Internet. Eighteen-month development cycles are a thing of the past. Hardware and software manufacturers and developers now operate on six- to nine-month development cycles, from concept formation through manufacture and distribution. The rise of the web has accelerated development times, and will continue to do so for the foreseeable future. To adapt, developers and manufacturers will have to plan for multiple critical paths and be able to react quickly to changes in business trends as they plan, develop, and implement projects. People can already plan this way on an intellectual level; in many industries it has been the accepted norm for the past few decades. On an emotional level, however, the cost of large-scale disruption, change, and constant redeployment can and will be unsettling.

The second trend that will come to dominate our lives is the constant downward pressure on hardware and software prices, coupled with the ever-increasing demand for hardware and software that works easily. Again, there is a seeming contradiction between lower price and ease of use. As computer hardware and software become more mainstream, the need for simplicity and power will dominate every other consideration. Despite dire predictions that the end of the PC era is near, nothing could be further from the truth. PCs will remain with us for a long time to come, but their usage patterns will change. They will become file repositories, akin to vast research libraries. Users will gravitate toward more specialized devices to communicate (combination phones, electronic address books, web skimmers, message boards); process information (voice-activated pads, storage devices, intelligent dictation systems); and be entertained (3D game players, downloadable video and music players, web-enabled real-time games connected to anywhere in the world, personalized concerts viewed from wearable stereo receivers).

The third trend that will come to dominate our thinking and beliefs about technology is the notion of ownership of intellectual property. When Netscape made the unique and courageous decision to give away its commercial browser technology, it essentially validated the concept of open computing - but it also set the notion of intellectual property rights on its ear. The foundation of intellectual property rights - that an author or inventor owns the writing or invention - has been the cornerstone of trademark and patent protection for the last 400 years. To give away this right - to make intellectual property free to be distributed, modified, and shared - is a sea change in the way we view human capital. If knowledge is power, the free distribution of knowledge will enable a new level of empowerment and use of human talent. Make no mistake, we will struggle mightily with how to value, reward, and allocate resources to the developers and users of knowledge. Throughout history, this tension - struggle, if you will - has led to heightened levels of creativity and knowledge.

The fourth trend is more disturbing in its implications. There have always been classes in human social structures. These classes have developed along economic lines, with variations in the methods used to acquire greater economic resources (knowledge, brute power, ruthlessness, etc.). Over the next several decades we have the potential to develop a new social class distinction: between the connected and the unconnected. As the "network effect" of the Internet expands, those who are not connected stand to lose out on many of the benefits available to the connected. Training, education, development, and entertainment will all be provided over the Internet, and for those not connected, the lost opportunities will be tremendous. We must ensure that this class distinction does not in fact develop, and that everyone shares in the "network effect" equally.

The fifth trend that will occur is the death of market economies as we know them. Market economies were developed to bring together groups of willing buyers and sellers in sufficient numbers to conduct business transactions efficiently. Over time, the emphasis on a single "market" shifted to specialized markets based on transactional need: consumer goods markets developed for retail selling; money markets evolved into banking and financial institutions; specialized financial institutions, such as stock and futures markets, developed; and, over time, business-to-business markets evolved. All of these markets, of whatever form, developed around centralized physical locations. With the rise of the Internet, markets no longer require physical presence - witness the success of e-commerce: auction sites, computer and software purchases over the web, and so on. This trend was actually postulated by Faith Popcorn several years ago when she noticed a trend toward "cocooning." She theorized that people wanted more privacy and less social interaction, or at least social interaction on their own terms. The Internet allows people to cocoon, while at the same time interacting when and how they choose.

The final trend that will affect our lives is the commercial expansion of the Internet. The web has touched our lives in many ways, and is literally growing up before our eyes. How will we resolve Internet privacy issues? How will companies make money on the web? Are Internet stock valuations realistic, and sustainable? What information should be free, and what information should be paid for? How will we compensate people for their intellectual capital, if that capital is freely given away? What role should government play in determining Internet policy? How should Internet sales be taxed, and how do tax laws based on the notion of nexus (the physical location of a place of business or agency) apply to an essentially location-less entity? These questions are being asked by countless industry, think-tank, and governmental institutions on a daily basis, and over time, they will be resolved.

In Shakespeare's The Tempest, Miranda, upon seeing the shipwrecked members of the royal party for the first time, declares to her father, "O brave new world, that hath such people in't." Shakespeare was profoundly aware of the effect that the discovery of the New World had upon his audience. It was a time of intense excitement - "O brave new world" - but it was an excitement mixed with fear and uncertainty - "that hath such people in't." The landing party in The Tempest was driven aground by a violent storm, and the storm is a symbol of the change that was sweeping through Europe in the 1500s. The Tempest is Shakespeare's attempt to explain the forces at work in creating a New World - the forces of discovery, uncertainty, doubt, and ultimately hope in creating a better world.

We are poised on the brink of a New World. For the first time in several hundred years, we have the ability to make major changes in the way we view the world, human capital, and the sharing of knowledge. O brave new world, that hath such people in't.