
David Ellerman - New Foundations for Information Theory: Logical Entropy and Shannon Entropy

Here you can read online David Ellerman - New Foundations for Information Theory: Logical Entropy and Shannon Entropy, the full text of the book, in English for free. Download the PDF and EPUB, and find the description, cover, and reviews of this ebook. Year: 2021; publisher: Springer; genre: Non-fiction. The description of the work (preface) as well as reviews are available. The literature library LitArk.com was created for fans of good reading and offers a wide selection of genres:

Romance novel, Science fiction, Adventure, Detective, Science, History, Home and family, Prose, Art, Politics, Computer, Non-fiction, Religion, Business, Children, Humor

Choose a favorite category and find books that are really worth reading. Enjoy immersing yourself in the world of imagination, feel the emotions of the characters or learn something new for yourself, and make a fascinating discovery.

David Ellerman - New Foundations for Information Theory: Logical Entropy and Shannon Entropy
  • Book:
    New Foundations for Information Theory: Logical Entropy and Shannon Entropy
  • Author:
    David Ellerman
  • Publisher:
    Springer
  • Genre:
    Non-fiction
  • Year:
    2021
  • Rating:
    3 / 5

New Foundations for Information Theory: Logical Entropy and Shannon Entropy: summary, description and annotation

Here you can read an annotation, description, summary, or preface (depending on what the author of the book "New Foundations for Information Theory: Logical Entropy and Shannon Entropy" provided). If you haven't found the information you need about the book, write in the comments and we will try to find it.

This monograph offers a new foundation for information theory based on the notion of information-as-distinctions, which is directly measured by logical entropy, and on its re-quantification as Shannon entropy, the fundamental concept for the theory of coding and communication.
Information is based on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image of a random variable, so they represent the pre-probability notion of information. Logical entropy is then a probability measure on the information sets: the probability that, on two independent trials, a distinction or "dit" of the partition will be obtained.
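In symbols, the definition just described takes the following standard form (a rendering based on the description above, in notation common in the logical-entropy literature, not a formula quoted from this page): for a partition π on a finite set U with blocks B, and for a probability distribution p = (p_1, ..., p_n),

```latex
h(\pi) = \frac{|\mathrm{dit}(\pi)|}{|U \times U|}
       = 1 - \sum_{B \in \pi} \left( \frac{|B|}{|U|} \right)^{2},
\qquad
h(p) = \sum_{i} p_i (1 - p_i) = 1 - \sum_{i} p_i^{2}
```

where dit(π) is the set of ordered pairs of elements of U lying in distinct blocks of π, so h(π) is literally the probability that two independent equiprobable draws from U yield a distinction.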
The formula for logical entropy is a new derivation of an old formula that goes back to the early twentieth century and has been re-derived many times in different contexts. As a probability measure, all the compound notions of joint, conditional, and mutual logical entropy are immediate. The Shannon entropy (which is not defined as a measure in the sense of measure theory) and its compound notions are then derived from a non-linear dit-to-bit transform that re-quantifies the distinctions of a random variable in terms of bits, so the Shannon entropy is the average number of binary distinctions or bits necessary to make all the distinctions of the random variable. And, using a linearization method, all the set concepts in this logical information theory naturally extend to vector spaces in general, and to Hilbert spaces in particular, for quantum logical information theory, which provides the natural measure of the distinctions made in quantum measurement.
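To make the dit-to-bit transform concrete, here is a minimal Python sketch (my own illustration based on the description above; the example distribution is arbitrary and the code is not from the book). Term by term, replacing each factor (1 - p_i) in h(p) = Σ p_i(1 - p_i) with log2(1/p_i) yields H(p) = Σ p_i log2(1/p_i); the script computes both quantities and checks the two-draw interpretation of logical entropy by simulation:

```python
import math
import random

def logical_entropy(p):
    # h(p) = sum_i p_i * (1 - p_i) = 1 - sum_i p_i^2:
    # the probability that two independent draws from p are distinct.
    return 1.0 - sum(pi * pi for pi in p)

def shannon_entropy(p):
    # H(p) = sum_i p_i * log2(1/p_i): the dit-to-bit transform replaces
    # each factor (1 - p_i) with log2(1/p_i), the number of binary
    # distinctions (bits) needed to single out outcome i.
    return sum(pi * math.log2(1.0 / pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]  # an arbitrary example distribution

# Monte Carlo check of the two-draw interpretation of h(p).
random.seed(0)
trials = 100_000
draws = random.choices(range(len(p)), weights=p, k=2 * trials)
distinct = sum(draws[2 * t] != draws[2 * t + 1] for t in range(trials))

print(f"h(p) = {logical_entropy(p):.4f}")   # 0.6250
print(f"MC   = {distinct / trials:.4f}")    # ~0.625
print(f"H(p) = {shannon_entropy(p):.4f}")   # 1.5000
```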
Relatively short but dense in content, this work can serve as a reference for researchers and graduate students conducting investigations in information theory, maximum entropy methods in physics, engineering, and statistics, and for all those with a special interest in a new approach to quantum information theory.


[Image: Book cover of New Foundations for Information Theory]
SpringerBriefs in Philosophy

SpringerBriefs present concise summaries of cutting-edge research and practical applications across a wide spectrum of fields. Featuring compact volumes of 50 to 125 pages, the series covers a range of content from professional to academic. Typical topics might include:

  • A timely report of state-of-the-art analytical techniques

  • A bridge between new research results, as published in journal articles, and a contextual literature review

  • A snapshot of a hot or emerging topic

  • An in-depth case study or clinical example

  • A presentation of core concepts that students must understand in order to make independent contributions

SpringerBriefs in Philosophy cover a broad range of philosophical fields including: Philosophy of Science, Logic, Non-Western Thinking and Western Philosophy. We also consider biographies, full or partial, of key thinkers and pioneers.

SpringerBriefs are characterized by fast, global electronic dissemination, standard publishing contracts, standardized manuscript preparation and formatting guidelines, and expedited production schedules. Both solicited and unsolicited manuscripts are considered for publication in the SpringerBriefs in Philosophy series. Potential authors are warmly invited to complete and submit the Briefs Author Proposal form. All projects will be submitted to editorial review by external advisors.

SpringerBriefs are characterized by expedited production schedules with the aim for publication 8 to 12 weeks after acceptance and fast, global electronic dissemination through our online platform SpringerLink. The standard concise author contracts guarantee that

  • an individual ISBN is assigned to each manuscript

  • each manuscript is copyrighted in the name of the author

  • the author retains the right to post the pre-publication version on his/her website or that of his/her institution.

More information about this series at http://www.springer.com/series/10082

David Ellerman
New Foundations for Information Theory
Logical Entropy and Shannon Entropy
1st ed. 2021
[Image: Logo of the publisher]
David Ellerman
Faculty of Social Sciences, University of Ljubljana, Ljubljana, Slovenia
ISSN 2211-4548 e-ISSN 2211-4556
SpringerBriefs in Philosophy
ISBN 978-3-030-86551-1 e-ISBN 978-3-030-86552-8
https://doi.org/10.1007/978-3-030-86552-8
Mathematics Subject Classification (2010): 03A05, 94A17
SpringerBriefs in Philosophy
© The Author(s), under exclusive license to Springer Nature Switzerland AG 2021
This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG

The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland

To the memory of Gian-Carlo Rota: mathematician, philosopher, mentor, and friend.

About this book

This book presents a new foundation for information theory where the notion of information is defined in terms of distinctions, differences, distinguishability, and diversity. The direct measure is logical entropy, which is the quantitative measure of the distinctions made by a partition. Shannon entropy is a transform or re-quantification of logical entropy that is especially suited for Claude Shannon's mathematical theory of communications. The interpretation of the logical entropy of a partition is the two-draw probability of getting a distinction of the partition (a pair of elements distinguished by the partition), so it realizes a dictum of Gian-Carlo Rota (the quotation is rendered as an image in the original). Andrei Kolmogorov suggested that information should be defined independently of probability, so logical entropy is first defined in terms of the set of distinctions of a partition, and then a probability measure on the set defines the quantitative version of logical entropy. We give a history of the logical entropy formula, which goes back to Corrado Gini's 1912 index of mutability and has been rediscovered many times. In addition to being defined as a (probability) measure in the sense of measure theory (unlike Shannon entropy), logical entropy is always non-negative (unlike three-way Shannon mutual information) and finitely valued for countable distributions. One perhaps surprising result is that, in spite of decades of MaxEntropy efforts based on maximizing Shannon entropy, the maximization of logical entropy gives a solution that is closer to the uniform distribution in terms of the usual (Euclidean) notion of distance. When generalized to metrical differences, the metrical logical entropy is just twice the variance, so it connects to the standard measure of variation in statistics. Finally, there is a semi-algorithmic procedure to generalize set concepts to vector-space concepts. In this manner, logical entropy defined for set partitions is naturally extended to the quantum logical entropy of direct-sum decompositions in quantum mechanics. This provides a new approach to quantum information theory that directly measures the results of quantum measurement.
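The claim above that metrical logical entropy is twice the variance can be checked directly: for a random variable X with values x_i and probabilities p_i, the two-draw expected squared difference Σ_{i,j} p_i p_j (x_i - x_j)^2 equals 2·Var(X). A small Python sketch (an illustration with arbitrary numbers, not code from the book):

```python
# Check: sum_{i,j} p_i * p_j * (x_i - x_j)^2 == 2 * Var(X).
# Values and probabilities below are arbitrary illustrative choices.
x = [1.0, 3.0, 4.0, 7.0]
p = [0.1, 0.4, 0.3, 0.2]

# Metrical logical entropy: expected squared difference between
# two independent draws of the random variable X.
mle = sum(pi * pj * (xi - xj) ** 2
          for pi, xi in zip(p, x)
          for pj, xj in zip(p, x))

mean = sum(pi * xi for pi, xi in zip(p, x))
var = sum(pi * (xi - mean) ** 2 for pi, xi in zip(p, x))

print(mle, 2 * var)  # the two numbers agree (6.18 and 6.18)
```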

Acknowledgements

After Gian-Carlo Rota's untimely death in 1999, many of us who had worked or coauthored with him set about developing parts of his unfinished work. My own work focused on developing his initial efforts on the logic of equivalence relations or partitions. Since the notion of a partition on a set is the category-theoretic dual to the notion of a subset of a set, the first product was the logic of partitions (see the Appendix), which is dual to the Boolean logic of subsets (usually presented as propositional logic). Rota had additionally emphasized that the quantitative version of subset logic was logical probability theory and that probability is to subsets as information is to partitions. Hence the next step in this Rota-suggested program was to develop the quantitative version of partition logic as a new approach to information theory. The topic of this book is this logical theory of information based on the notion of the logical entropy of a partition, which then transforms in a uniform way into the well-known entropy formulas of Shannon. The book is dedicated to Rota's memory to acknowledge his influence on this whole project.

