Stefan Hollos - Information Theory: A Concise Introduction
  • Book:
    Information Theory: A Concise Introduction
  • Author:
    Stefan Hollos
  • Publisher:
    Abrazol Publishing
  • Genre:
    Computer / Science
  • Year:
    2015

Information Theory: A Concise Introduction
Preface

Books on information theory tend to fall into one of two extreme categories. There are large academic textbooks that cover the subject with great depth and rigor. Probably the best known of these is the book by Cover and Thomas. At the other extreme are the popular books such as the ones by Pierce and Gleick. They provide a very superficial introduction to the subject, enough to engage in cocktail party conversation but little else. This book attempts to bridge these two extremes.

This book is written for someone who is at least semi-mathematically literate and wants a concise introduction to some of the major concepts in information theory. The level of mathematics needed is very elementary. A rudimentary grasp of logarithms, probability, and basic algebra is all that is required. Two chapters at the end of the book provide a review of everything the reader needs to know about logarithms and discrete probability to get the most out of the book. Very little attention is given to mathematical proof. Instead we try to present the results in a way that makes them almost obvious or at least plausible.

We start in the introduction with a discussion of how information theory has its roots in the field of communication systems design. This leads to the question of how to quantify information and how a logarithmic measure is the most sensible. The concept of entropy is introduced at this point but only for the case where all messages are equally probable. The introduction ends with two examples of how information concepts come up in areas seemingly unrelated to communication. The first is a number guessing game and the second is the problem of finding a counterfeit coin.
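
For a concrete taste of the logarithmic measure, here is a minimal Python sketch (our illustration, not the book's code) of the two examples just mentioned: with N equally probable possibilities, identifying one takes log2(N) bits, and a three-outcome weighing yields log2(3) bits. The 12-coin version of the counterfeit puzzle is assumed for the numbers below.

```python
import math

# With N equally probable messages, identifying one requires
# log2(N) bits -- also the number of yes/no questions needed
# in the number guessing game.
def bits_needed(n_messages: int) -> float:
    return math.log2(n_messages)

# Guessing a number between 1 and 64 takes log2(64) = 6 questions.
print(bits_needed(64))  # 6.0

# Counterfeit coin puzzle: each weighing has 3 outcomes (left pan
# heavy, right pan heavy, balanced), so it yields log2(3) bits.
# Finding the odd coin among 12, heavier or lighter (24 cases),
# needs at least log2(24)/log2(3) = 2.89..., i.e. 3 weighings.
print(math.log2(24) / math.log2(3))
```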

The next chapter looks at the problem of encoding messages as efficiently as possible. This is the source coding or data compression problem. The ideas of prefix-free codes and the Kraft-McMillan inequality are introduced. It is shown how the entropy is a lower limit for the average length of a code word.
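
Both facts are easy to check numerically. The sketch below (our example, with made-up probabilities and code lengths) verifies the Kraft-McMillan sum for a set of codeword lengths and compares the average codeword length with the entropy.

```python
import math

def kraft_sum(lengths):
    # A prefix-free binary code with these lengths exists
    # iff sum(2^-l) <= 1 (Kraft-McMillan inequality).
    return sum(2.0 ** -l for l in lengths)

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs   = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]          # e.g. codewords 0, 10, 110, 111

print(kraft_sum(lengths))       # 1.0 -> a prefix-free code exists
avg_len = sum(p * l for p, l in zip(probs, lengths))
print(avg_len, entropy(probs))  # 1.75 1.75: average length >= entropy
```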

The following two chapters discuss specific coding techniques. The first is Huffman coding. Three detailed examples of constructing a Huffman code are worked out. Software for constructing Huffman codes can be found on the book's website.
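
The construction itself is compact enough to sketch here (our illustration, not the book's software; the symbol probabilities are made up). The two lowest-probability entries are repeatedly merged, and each merge prepends one bit to the codewords in its two subtrees.

```python
import heapq
from itertools import count

def huffman(probs):
    # Each heap entry: (probability, tiebreaker, {symbol: codeword}).
    tiebreak = count()  # keeps entries comparable when probabilities tie
    heap = [(p, next(tiebreak), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        # Prepend 0 to one subtree's codewords and 1 to the other's.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

print(huffman({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
# e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
```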

The next coding chapter discusses a powerful technique called arithmetic coding. This technique encodes a string of symbols as a single code word. For long strings of symbols it gets very close to the per-symbol entropy.
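
The central idea can be shown in a few lines: each symbol narrows an interval in [0, 1) in proportion to its probability, and any number inside the final interval identifies the whole string. This bare-bones sketch (floating point, so only suitable for short strings; our example, not the book's) encodes one string.

```python
def arithmetic_encode(symbols, probs):
    # Cumulative probability sub-intervals for each symbol.
    cum, edges = 0.0, {}
    for s, p in probs.items():
        edges[s] = (cum, cum + p)
        cum += p
    low, high = 0.0, 1.0
    for s in symbols:
        span = high - low
        s_lo, s_hi = edges[s]
        low, high = low + span * s_lo, low + span * s_hi
    return low, high  # any value in [low, high) encodes the string

probs = {"a": 0.8, "b": 0.2}
low, high = arithmetic_encode("aaab", probs)
# Interval width = 0.8^3 * 0.2 = 0.1024, the probability of the string.
print(low, high)
```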

There is a long chapter devoted just to the concept of entropy. How to calculate joint and conditional entropy is covered along with a detailed example of how to use them. There is a discussion of mutual information, what it means and how it is calculated. This is followed by a simple example of how to calculate the entropy of a Markov chain. The chapter ends with an elementary example using the principle of maximum entropy to infer a probability distribution.
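
The first three of those quantities follow directly from a joint distribution, as this sketch shows (a small made-up p(x, y), not the book's worked example):

```python
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A made-up joint distribution p(x, y).
joint = {("x1", "y1"): 0.25, ("x1", "y2"): 0.25,
         ("x2", "y1"): 0.40, ("x2", "y2"): 0.10}

px, py = {}, {}
for (x, y), p in joint.items():      # marginal distributions
    px[x] = px.get(x, 0) + p
    py[y] = py.get(y, 0) + p

H_xy = H(joint.values())
H_x, H_y = H(px.values()), H(py.values())
print("H(X,Y) =", H_xy)
print("H(Y|X) =", H_xy - H_x)        # chain rule: H(X,Y) = H(X) + H(Y|X)
print("I(X;Y) =", H_x + H_y - H_xy)  # mutual information
```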

Calculating the entropy of English is covered in the next chapter. It shows how to use the statistics of n-grams to get a series of increasingly accurate estimates for the entropy. It also shows how to use the statistics of words to estimate the entropy.
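
A rough sketch of the n-gram method (illustrative only; honest estimates of English need large corpora, and the repeated toy text below makes the estimates fall quickly): the per-symbol entropy is approximated by the entropy of the n-gram distribution divided by n.

```python
import math
from collections import Counter

def ngram_entropy_per_symbol(text, n):
    grams = [text[i:i + n] for i in range(len(text) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    H = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return H / n  # bits per symbol

text = "the quick brown fox jumps over the lazy dog " * 50
for n in (1, 2, 3):
    print(n, ngram_entropy_per_symbol(text, n))
```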

The next chapter covers channel capacity and the noisy channel coding theorem. It shows in general how to calculate channel capacity but only the capacity of a binary symmetric channel is worked out in detail. There is a brief discussion of the noisy channel coding theorem with no proofs. The chapter ends with an unusual example of the use of channel capacity in gambling and investing. This is called the Kelly gambling system.
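
The binary symmetric channel result is a one-liner: a channel that flips each bit with probability p has capacity C = 1 - H(p), where H is the binary entropy function. In Kelly's even-odds setting the same quantity reappears: a gambler whose predictions are right with probability p bets a fraction 2p - 1 of the bankroll, and the bankroll's doubling rate is 1 - H(p). A small sketch (our numbers):

```python
import math

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel with crossover prob p.
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.11))  # ~0.5 bit per channel use
print(bsc_capacity(0.5))   # 0.0: output tells you nothing about input
```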

The final chapter is a brief introduction to the topic of error correction or channel coding. Repetition codes, parity check codes, and Hamming codes are covered.
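
The simplest of these, a 3x repetition code, fits in a few lines (our sketch): every bit is sent three times and decoded by majority vote, which corrects any single error per triple.

```python
def rep3_encode(bits):
    # Send each bit three times.
    return [b for b in bits for _ in range(3)]

def rep3_decode(coded):
    # Majority vote over each triple.
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

msg = [1, 0, 1, 1]
noisy = rep3_encode(msg)
noisy[0] ^= 1                      # flip one transmitted bit
print(rep3_decode(noisy) == msg)   # True: the error is corrected
```

The price of this reliability is rate: the code sends three channel bits per message bit, which is what the more efficient Hamming codes improve on.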

We hope this book is useful for someone looking for a fast introduction to most of the major topics in information theory. An introduction that is concise but not superficial.

Introduction

The field of information theory was created almost entirely by one man, an American mathematician and engineer named Claude Elwood Shannon. This is a man whose list of intellectual accomplishments is so impressive, you have to wonder if he's not a figment of someone's imagination.

[Image: Claude Elwood Shannon - The Father of Information Theory.]

Shannon was born in 1916 and grew up in the small town of Gaylord, Michigan. In 1936 he graduated from the University of Michigan with a bachelor's degree in mathematics and another in electrical engineering. He then went on to graduate school at the Massachusetts Institute of Technology. His 1937 master's thesis was on the use of Boolean algebra to design relay and switching circuits. This work provided the foundation for what would later become known as digital circuit design, the field of electrical engineering concerned with the design of digital computers and other kinds of digital circuits. His 1940 PhD thesis was on theoretical genetics.

After spending some time at the Institute for Advanced Study in Princeton, Shannon went to work at the Bell Telephone Labs. There he was exposed to problems in communication theory and cryptography, which he worked on during World War II. In 1948 he published a paper in two parts in the Bell System Technical Journal titled "A Mathematical Theory of Communication." This paper is widely considered to be the founding document of information theory.

Shannon created information theory to solve communication problems. The goal of communication is to transmit information from one point in space/time to another point in space/time. Some common examples are radio broadcasts, the interaction of a web browser with a web server, and the storage of a file on disk for later access.

Communication problems fall roughly into two categories. First we have the source coding problem, where the goal is to represent information as efficiently as possible. This is also known as data compression, and it is widely used in computer and communication systems. Most image, audio, and video file formats use some form of data compression. The second category is called channel coding, where the goal is to encode information so that it can be transmitted over a channel with as low a probability of error as possible. Televisions, cell phones, the internet, CDs and DVDs all use some form of channel coding to reliably transmit information.

One of the first things information theory must do is provide a way to measure and encode information. In this chapter we will look at how to measure information and how the measurement is related to the way the information is encoded. The next chapter looks at the more general case where some messages in a communication system are more probable than others. This chapter and the next three address the source coding problem.

Measuring information mathematically, or objectively in any way, seems at first to be almost impossible. Humans attach differing degrees of importance to information. "Your hair is on fire" is generally regarded as more important than "you have ketchup on your nose." So how do you measure importance? Luckily we don't have to answer this question.

Shannon realized that any coherent and objective measurement of information must be independent of its importance or meaning. The only thing that matters is that information consists of a message or series of messages drawn from some set of possible messages. Some of these messages may be more probable than others. It is more probable that you have ketchup on your nose than that your hair is on fire. The probability of a message is what must be taken into account when measuring information. Information theory does not consider the meaning or importance of a message.
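
That point of view leads directly to the standard measure of a single message's information, -log2 of its probability: the less probable the message, the more information it carries. A tiny sketch (the probabilities here are made up for illustration):

```python
import math

def self_information(p):
    # Information in bits of a message with probability p.
    return -math.log2(p)

print(self_information(0.1))   # "ketchup on your nose": ~3.3 bits
print(self_information(1e-6))  # "hair on fire": ~19.9 bits
```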
