Volume 39
Emergence, Complexity and Computation
Series Editors
Ivan Zelinka
Technical University of Ostrava, Ostrava, Czech Republic
Andrew Adamatzky
University of the West of England, Bristol, UK
Guanrong Chen
City University of Hong Kong, Hong Kong, China
Editorial Board
Ajith Abraham
MirLabs, USA
Ana Lucia
Universidade Federal do Rio Grande do Sul, Porto Alegre, Rio Grande do Sul, Brazil
Juan C. Burguillo
University of Vigo, Spain
Sergej Čelikovský
Academy of Sciences of the Czech Republic, Czech Republic
Mohammed Chadli
University of Jules Verne, France
Emilio Corchado
University of Salamanca, Spain
Donald Davendra
Technical University of Ostrava, Czech Republic
Andrew Ilachinski
Center for Naval Analyses, USA
Jouni Lampinen
University of Vaasa, Finland
Martin Middendorf
University of Leipzig, Germany
Edward Ott
University of Maryland, USA
Linqiang Pan
Huazhong University of Science and Technology, Wuhan, China
Gheorghe Păun
Romanian Academy, Bucharest, Romania
Hendrik Richter
HTWK Leipzig University of Applied Sciences, Germany
Juan A. Rodriguez-Aguilar
IIIA-CSIC, Spain
Otto Rössler
Institute of Physical and Theoretical Chemistry, Tübingen, Germany
Vaclav Snasel
Technical University of Ostrava, Czech Republic
Ivo Vondrák
Technical University of Ostrava, Czech Republic
Hector Zenil
Karolinska Institute, Sweden
The Emergence, Complexity and Computation (ECC) series publishes new developments, advancements and selected topics in the fields of complexity, computation and emergence. The series focuses on all aspects of reality-based computation approaches from an interdisciplinary point of view, especially from applied sciences, biology, physics, or chemistry. It presents new ideas and interdisciplinary insight on the mutual intersection of subareas of computation, complexity and emergence and its impact and limits to any computing based on physical limits (thermodynamic and quantum limits, Bremermann's limit, Seth Lloyd's limits) as well as algorithmic limits (Gödel's proof and its impact on calculation, algorithmic complexity, Chaitin's Omega number and Kolmogorov complexity, non-traditional calculations like the Turing machine process and its consequences, ...) and limitations arising in artificial intelligence. The topics are (but not limited to) membrane computing, DNA computing, immune computing, quantum computing, swarm computing, analogic computing, chaos computing and computing on the edge of chaos, computational aspects of dynamics of complex systems (systems with self-organization, multiagent systems, cellular automata, artificial life, ...), emergence of complex systems and its computational aspects, and agent-based computation. The main aim of this series is to discuss the above-mentioned topics from an interdisciplinary point of view and present new ideas coming from the mutual intersection of classical as well as modern methods of computation. Within the scope of the series are monographs, lecture notes, selected contributions from specialized conferences and workshops, and special contributions from international experts.
Indexed by zbMATH.
More information about this series at http://www.springer.com/series/10624
Editors
Ivan Zelinka, Massimo Brescia and Dalya Baron
Intelligent Astrophysics
1st ed. 2021
Editors
Ivan Zelinka
Faculty of Electrical Engineering and Computer Science, VŠB-TU Ostrava, Ostrava, Czech Republic
Massimo Brescia
INAF, Astronomical Observatory of Capodimonte, Napoli, Italy
Dalya Baron
School of Physics and Astronomy, Tel Aviv University, Tel Aviv, Israel
ISSN 2194-7287 e-ISSN 2194-7295
Emergence, Complexity and Computation
ISBN 978-3-030-65866-3 e-ISBN 978-3-030-65867-0
https://doi.org/10.1007/978-3-030-65867-0
© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2021
This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface
During the development of humankind and of our knowledge of the world, a large amount of information has been produced. This information, of both a scientific and a technical nature, has in turn contributed to the further development of our civilization. The amount of data that humanity produces during scientific development increases exponentially with time; in the last five years, humankind has produced more information and data than in all of its previous existence. With this growth in data comes the need for methods that can process it effectively and present it to humans. Processing here means cleaning the data of noise, searching for useful information, visualization, and the like.

In the last hundred years, with the development of computers and information science, efficient algorithms have begun to emerge that today belong to so-called artificial intelligence and that are able to perform just such tasks. At first, classical algorithms were used for filtering, compression, dimension reduction, and many other tasks. Later, this class was enriched by algorithms that today belong to machine learning, which is essentially a part of artificial intelligence; with the help of these methods and algorithms, we can process very large data in real time, which is important for science as such, as well as for technology and society.

The areas where these methods are most needed include physics and astrophysics. In physics, let us mention the CERN accelerator in Geneva, Switzerland, where experiments produce an enormous amount of data in fractions of a second, all of which must be processed precisely. Another such area is astrophysics, which, thanks to a high degree of robotics and automation, has become a field that is literally flooding us with data. Today, it is a common fact that robotic telescopes produce petabytes of data in a single night. If we realize that at the time of Johannes Kepler's discoveries, about 400 KB of data was enough to derive the famous Kepler laws, it is clear that many discoveries may be hidden in these petabytes, which can literally slip through our fingers. Therefore, especially in astrophysics, all the important methods of machine learning are applied very effectively, most notably neural networks and various filtering algorithms; recently, evolutionary algorithms have also been shown to be a very capable tool for processing such data and for its possible modeling.