
Lars Nielsen - Computing: A Business History


Lars Nielsen Computing: A Business History
  • Book:
    Computing: A Business History
  • Author:
    Lars Nielsen
  • Publisher:
    New Street Communications, LLC
  • Year:
    2012

Computing: A Business History: summary, description and annotation


Lars Nielsen engagingly shows why we've got an unlikely partnership - the American military-industrial complex teamed with a generation of pot-smoking hippie whiz-kids - to thank for today's digital economy.


COMPUTING

A Business History

Lars Nielsen

2011
New Street Communications, LLC
Wickford, RI
Copyright © 2011 New Street Communications, LLC

All rights reserved under International and Pan-American Copyright Conventions. Except for brief quotations for review purposes, no part of this book may be reproduced in any form without the permission of New Street Communications, LLC.

Cover photograph of Bill Gates by Severin Nowacki. Copyright by The World Economic Forum. Used by permission. Photo of Steve Jobs copyright by Matthew Geer. Used by permission.

Published 2011 by New Street Communications, LLC
Wickford, Rhode Island

newstreetcommunications.com
Preface

"Ready or not, computers are coming to the people. That's good news, maybe the best since psychedelics."
- Stewart Brand, "Spacewar: Fanatic Life and Symbolic Death Among the Computer Bums," fifth anniversary issue of Rolling Stone, December 7, 1972

The years since 1946 have seen one of the greatest revolutions in the history of mankind. In less than seven decades, computer technology has advanced from the primitive and cumbersome ENIAC mainframe to the compact and elegant PC, thence to the extensive personal and business use of the Internet, and finally to a point where things digital have become a fundamental part of society's DNA. "Computing is not about computers anymore," writes Nicholas Negroponte of the MIT Media Lab. "It is about living."

During these years, the speed of digital innovation grew in step with its ambition and - no small thing - excellence. It also moved at the speed of Moore's Law (first articulated by Intel co-founder Gordon E. Moore in 1965). According to Moore's Law - which proved to be a quite accurate prediction - the number of transistors that could be placed inexpensively on an integrated circuit would double approximately every two years going forward. This exponential improvement in processing capacity, which only grew more rapid with the adoption of the silicon chip in the 1970s, has provided the vital foundation on which all digital innovation (requiring ever-increasing processing speed and memory capacity) has been based.
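Moore's prediction, as stated above, can be sketched as a simple doubling rule. A minimal illustration in Python - the 2,300-transistor starting point is the widely cited count for Intel's 1971 4004 microprocessor, an assumption for illustration and not a figure from this book:

```python
def moores_law(start_count, years, doubling_period=2):
    """Project a transistor count forward under Moore's Law:
    the count doubles once every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# Ten years of doubling every two years: five doublings, a 32x increase.
print(round(moores_law(2300, 10)))  # -> 73600
```

The compounding is the point: five doublings already yield a 32-fold increase, which is why a rule articulated in 1965 could underwrite decades of innovation.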

Steve Jobs, Bill Gates and most other great innovators of the new digital era had not yet even been born as of 1949, the year Popular Mechanics breathlessly and ambitiously predicted: "Computers in the future may weigh no more than 1.5 tons." Coming into the world during the mid-1950s, these future technologists and entrepreneurs grew up during the period of America's greatest historical prosperity - a time of profound excellence in education, and a time of robust corporate growth (in part enabled by, and in part itself enabling the development and broad-based adoption of early business data systems). Indeed the greatest looming corporate success story of the 1950s and 1960s was IBM. Despite Dustin Hoffman's character in The Graduate being advised to go into "plastics," most young people coming of age in educated households during the 1960s were actually advised to look toward something quite different: computers.

The revolution was on , and no number of nay-sayers could stop it. "I have traveled the length and breadth of this country and talked with the best people, and I can assure you that data processings [sic] is a fad that won't last out the year." So said Prentice-Hall's editor-in-chief for business books in 1957. "But what ... is it good for?" asked an engineer at IBM's Advanced Computing Systems Division when confronted with the idea of the microchip in 1968. As late as the 1980s, technologist Ken Olsen, cofounder of Digital Equipment Corporation (DEC), was heard to insist: "There is no reason anyone would want a computer in their home."

In the end, the greatest of all digital innovations - the personal computer - evolved from outside the corporate culture rather than within it: a product of the Woodstock generation. "So we went to Atari and said, 'Hey, we've got this amazing thing, even built with some of your parts, and what do you think about funding us? Or we'll give it to you. We just want to do it. Pay our salary, we'll come work for you,'" recalls Steve Jobs. "And they said, 'No.' So then we went to Hewlett-Packard, and they said, 'Hey, we don't need you. You haven't got through college yet.'" And so it went.

*

Ultimately, the history of business computing is characterized by great successive creative leaps of [informed] imagination: of thinking outside the veritable box; of intellectual bootstrapping from one paradigm to another, the latter always (or nearly always) an improvement. (It is prudent to remember Woody Allen's famous comment with reference to comedy: "If you're not failing every now and again, it's a sign you're not doing anything very innovative.")

"Innovation distinguishes between a leader and a follower," notes Jobs. To this, Bill Gates adds: "Never before, in history, has innovation offered the promise of so much to so many in so short a time." And that innovation is nowhere near over. "The Web as I envisaged it," writes its inventor Tim Berners-Lee, "we have not seen it yet. The future is still so much bigger than the past."

In chronicling the first six and a half decades of the digital revolution, this volume comprises what will be, in the final analysis, but the beginning of a far longer tale than we can today imagine. Shakespeare told us that what is past is but prologue. Here, then, is one man's version of that prologue.

Lars Nielsen, 1 July 2011

Amsterdam, Holland

1
From Military to Marketplace

"If you don't want to be replaced by a computer, don't act like one."
- Arno Penzias, computer scientist

As shall be seen, research and development by the military-industrial complex of the United States has played a major role in various aspects of computer development throughout the decades. This was never so much the case, however, as in the creation of the very first major computer, the ENIAC (Electronic Numerical Integrator and Computer).

Funded by the U.S. Army in 1943 and developed at a cost of $500,000 by John Mauchly and J. Presper Eckert at the University of Pennsylvania's Moore School of Electrical Engineering, the ENIAC was a cumbersome, massive machine, but also a technological wonder at the time of its public debut in 1946.

Consider the heft: 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors and around 5 million hand-soldered joints. Taking up 1,800 square feet, the ENIAC weighed more than 27 tons. (Writing in 2004 for the IEEE Annals of the History of Computing , David Allen Grier, a technology historian at George Mason University, said the machine was best described "as a collection of electronic adding machines and other arithmetic units, which were originally controlled by a web of large electrical cables.") Today, UPenn's School of Engineering and Applied Science has on display four of the original 40 panels of the ENIAC.

Input was via IBM card reader, and output via IBM card punch. Once the painstaking job of input was finished, actual computing took place at what seemed at the time like warp-speed. A multiplication of a 10-digit number by a single-digit number took 1400 microseconds (714 per second), a 10- by 10-digit multiplication took 2800 microseconds (357 per second), and so forth. A division or square root problem took approximately 28,600 microseconds (35 per second). These speeds were no less than one thousand times faster than those offered by the previous generation of electro-mechanical calculator machines. This quantum leap in computing power has never since been matched with reference to the introduction of any single new machine.
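The per-second rates quoted above follow directly from the operation times. A quick sanity check of the arithmetic, converting each ENIAC operation time in microseconds to operations per second:

```python
MICROSECONDS_PER_SECOND = 1_000_000

def ops_per_second(op_time_us):
    """Operations per second for an operation taking op_time_us microseconds."""
    return MICROSECONDS_PER_SECOND / op_time_us

# The three figures quoted in the text:
for name, us in [("10 x 1-digit multiply", 1400),
                 ("10 x 10-digit multiply", 2800),
                 ("division / square root", 28600)]:
    print(f"{name}: {ops_per_second(us):.0f}/s")
# Rounded, these reproduce the rates given above: 714/s, 357/s, 35/s.
```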

