
Ceruzzi - Computing: a concise history



The history of computing could be told as the story of hardware and software, or the story of the Internet, or the story of smart hand-held devices, with subplots involving IBM, Microsoft, Apple, Facebook, and Twitter. In this account of the invention and development of digital technology, the author, a computer historian, offers a broader and more useful perspective. He identifies four major threads running throughout all of computing's technological development: digitization, the coding of information, computation, and control in binary form, ones and zeros; the convergence of multiple streams of techniques, devices, and machines, yielding more than the sum of their parts; the steady advance of electronic technology, as characterized famously by Moore's Law; and the human-machine interface. He guides us through computing history, telling how a Bell Labs mathematician coined the word "digital" in 1942 (to describe a high-speed method of calculating used in anti-aircraft devices), and recounting the development of the punch card (for use in the 1890 U.S. Census). He describes the ENIAC, built for scientific and military applications; the UNIVAC, the first general-purpose computer; and ARPANET, the Internet's precursor. His account traces the world-changing evolution of the computer from a room-size ensemble of machinery to a minicomputer to a desktop computer to a pocket-sized smart phone. He describes the development of the silicon chip, which could store ever-increasing amounts of data and enabled ever-decreasing device size. He visits that hotbed of innovation, Silicon Valley, and brings the story up to the twenty-first century with the Internet, the World Wide Web, and social networking.

Contents: The digital age -- The first computers, 1935-1945 -- The stored program principle -- The chip and Silicon Valley -- The microprocessor -- The Internet and the World Wide Web -- Conclusion.



Computing

MIT Press Essential Knowledge Series

Computing: A Concise History, Paul E. Ceruzzi

Information and the Modern Corporation, James Cortada

Intellectual Property Strategy, John Palfrey

Open Access, Peter Suber

Computing

A Concise History

Paul E. Ceruzzi

The MIT Press
Cambridge, Massachusetts
London, England

© 2012 Smithsonian Institution

All rights reserved. No part of this book may be reproduced in any form by any electronic or mechanical means (including photocopying, recording, or information storage and retrieval) without permission in writing from the publisher.

For information about special quantity discounts, please email .

Library of Congress Cataloging-in-Publication Data

Ceruzzi, Paul E. Computing : a concise history / Paul E. Ceruzzi.

p. cm. (MIT Press essential knowledge)

Includes bibliographical references and index.

ISBN 978-0-262-51767-6 (pbk : alk. paper)

ISBN 978-0-262-31039-0 (retail e-book)

1. Computer science -- History. I. Title.

QA76.17.C467 2012

004 -- dc23

2011053181

10 9 8 7 6 5 4 3 2 1

Series Foreword

The MIT Press Essential Knowledge series presents short, accessible books on need-to-know subjects in a variety of fields. Written by leading thinkers, Essential Knowledge volumes deliver concise, expert overviews of topics ranging from the cultural and historical to the scientific and technical. In our information age, opinion, rationalization, and superficial descriptions are readily available. Much harder to come by are the principled understanding and foundational knowledge needed to inform our opinions and decisions. This series of beautifully produced, pocket-sized, soft-cover books provides in-depth, authoritative material on topics of current interest in a form accessible to nonexperts. Instead of condensed versions of specialist texts, these books synthesize anew important subjects for a knowledgeable audience. For those who seek to enter a subject via its fundamentals, Essential Knowledge volumes deliver the understanding and insight needed to navigate a complex world.

Bruce Tidor

Professor of Biological Engineering and Computer Science

Massachusetts Institute of Technology

Introduction

A familiar version of Zeno's paradox states that it is impossible for a runner to finish a race. First, he must traverse one-half of the distance to the finish, which takes a finite time; then he must traverse one-half of the remaining distance, which takes a shorter but also finite time; and so on. To reach the finish line would thus require an infinite number of finite times, and so the race can never be won. The history of computing likewise can never be written. New developments transform the field while one is writing, thus rendering obsolete any attempt to construct a coherent narrative. A decade ago, historical narratives focused on computer hardware and software, with an emphasis on the IBM Corporation and its rivals, including Microsoft. That no longer seems so significant, although these topics remain important. Five years ago, narratives focused on the Internet, especially in combination with the World Wide Web and online databases. Stand-alone computers were important, but the network and its effects were the primary topics of interest. That has changed once again, to an emphasis on a distributed network of handheld devices, linked to a cloud of large databases, video, audio, satellite-based positioning systems, and more. In the United States the portable devices are called smart phones (the name coming from the devices from which they descended), but making phone calls seems to be the least interesting thing they do. The emphasis on IBM, Microsoft, and Netscape has given way to narratives that place Google and Apple at the center. Narratives are compelled to mention Facebook and Twitter at least once in every paragraph. Meanwhile the older technologies, including mainframe computers, continue to hum along in the background. And the hardware on which all of this takes place continues to rely on a device, the microprocessor, invented in the early 1970s.

Mathematicians have refuted Zeno's paradox. This narrative will also attempt to refute Zeno's paradox as it tells the story of the invention and subsequent development of digital technologies. It is impossible to guess what the next phase of computing will be, but it is likely that whatever it is, it will manifest four major threads that run through the story.
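The mathematical refutation rests on the fact that Zeno's ever-shorter intervals form a geometric series with a finite sum. A small Python sketch (my own illustration, not from the book) makes the convergence concrete:

```python
def zeno_partial_sum(terms: int) -> float:
    """Sum the first `terms` of Zeno's series: 1/2 + 1/4 + 1/8 + ..."""
    return sum(0.5 ** k for k in range(1, terms + 1))

# The partial sums approach 1: an infinite number of finite times
# can still add up to a finite total, so the race does finish.
for n in (1, 4, 10, 50):
    print(n, zeno_partial_sum(n))

assert abs(zeno_partial_sum(50) - 1.0) < 1e-12
```

Each added term halves the remaining gap to 1, which is exactly why the total never exceeds a finite bound.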

The Digital Paradigm

The first of these is a digital paradigm: the notion of coding information, computation, and control in binary form, that is, a number system that uses only two symbols, 1 and 0, instead of the more familiar decimal system that human beings, with their ten fingers, have used for millennia. It is not just the use of binary arithmetic, but also the use of binary logic to control machinery and encode instructions for devices, and of binary codes to transmit information. This insight may be traced at least as far back as George Boole, who described laws of logic in 1854, or before that to Gottfried Wilhelm Leibniz (1646-1716). The history that follows discusses the often-cited observation that digital methods of calculation prevailed over the analog method. In fact, both terms came into use only in the 1930s, and they never were that distinct in that formative period. The distinction is valid and is worth a detailed look, not only at its origins but also at how that distinction has evolved.
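The same two symbols serve all three roles the paragraph names: arithmetic, Boolean logic, and codes for transmitting information. A minimal Python illustration (mine, not the book's):

```python
# Binary arithmetic: the decimal number 13 written in base two.
assert format(13, "b") == "1101"      # 8 + 4 + 0 + 1
assert int("1101", 2) == 13           # and back again

# Binary logic in the sense of Boole: AND, OR, NOT on 1s and 0s.
a, b = 1, 0
assert (a & b, a | b, 1 - a) == (0, 1, 0)

# Binary codes for transmitting information: the letter "A"
# as eight bits in the ASCII code.
assert format(ord("A"), "08b") == "01000001"
```

One alphabet of two symbols, three very different uses; the digital paradigm is the recognition that they can all be handled by the same machinery.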

Convergence

A second thread is the notion that computing represents a convergence of many different streams of techniques, devices, and machines, each coming from its own separate historical avenue of development. The most recent example of this convergence is found in the smart phone, a merging of many technologies: telephone, radio, television, phonograph, camera, teletype, computer, and a few more. The computer, in turn, represents a convergence of other technologies: devices that calculate, store information, and embody a degree of automatic control. The result, held together by the common glue of the digital paradigm, yields far more than the sum of the individual parts. This explains why such devices prevail so rapidly once they pass a certain technical threshold, for example, why digital cameras, almost overnight around 2005, drove chemical-based film cameras into a small niche.

Solid-State Electronics

The third has been hinted at in relation to the second: this history has been driven by a steady advance of underlying electronics technology. That advance has been going on since the beginning of the twentieth century; it accelerated dramatically with the advent of solid-state electronics after 1960. The shorthand description of the phenomenon is Moore's law: an empirical observation made in 1965 by Gordon Moore, a chemist working in what later became known as Silicon Valley in California. Moore observed that the storage capacity of computer memory chips was increasing at a steady rate, doubling every eighteen months. That rate has held steady over the following decades. Moore was describing only one type of electrical circuit, but variants of the law are found throughout this field: in the increase in processing speeds of a computer, the capacity of communications lines, memory capacities of disks, and so forth. He was making an empirical observation; the law may not continue, but as long as it does, it raises interesting questions for historians. Is it an example of technological determinism: that technological advances drive history? The cornucopia of digital devices that prevails in the world today suggests it is. The concept that technology drives history is anathema to historians, who argue that innovation also works the other way around: social and political forces drive inventions, which in turn shape society. The historical record suggests that both are correct, a paradox as puzzling as Zeno's but even harder to disentangle.
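To see what a doubling every eighteen months compounds to, here is a small illustrative calculation (my own sketch, not from the book):

```python
def doublings(years: float, period_years: float = 1.5) -> float:
    """Growth factor after `years` at one doubling per `period_years`."""
    return 2 ** (years / period_years)

# One period is exactly one doubling.
assert doublings(1.5) == 2.0

# Over a single decade, 10 / 1.5 ≈ 6.7 doublings: roughly a
# hundredfold increase in capacity.
assert 100 < doublings(10) < 105
```

That relentless compounding, not any single invention, is why each decade of this history looks so different from the one before it.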

