Minrui Zheng - Spatially Explicit Hyperparameter Optimization for Neural Networks
  • Book:
    Spatially Explicit Hyperparameter Optimization for Neural Networks
  • Author:
    Minrui Zheng
  • Publisher:
    Springer
  • Year:
    2021

Spatially Explicit Hyperparameter Optimization for Neural Networks: summary, description and annotation


Neural networks, such as artificial neural networks (ANNs) and convolutional neural networks (CNNs), are commonly used machine learning algorithms that have been extensively applied in the GIScience domain to explore nonlinear and complex geographic phenomena. However, few studies have investigated the parameter settings of neural networks in GIScience, even though the model performance of neural networks often depends on the parameter setting for a given dataset, and adjusting the parameter configuration increases the overall running time. An automated approach is therefore needed to address these limitations. This book proposes an automated spatially explicit hyperparameter optimization approach that identifies optimal or near-optimal parameter settings for neural networks in the GIScience field and improves computing performance at both the model and computing levels. The book is written for researchers in GIScience as well as in the social sciences.

Book cover of Spatially Explicit Hyperparameter Optimization for Neural Networks
Minrui Zheng
Spatially Explicit Hyperparameter Optimization for Neural Networks
1st ed. 2021
Logo of the publisher
Minrui Zheng
School of Public Administration and Policy, Renmin University of China, Beijing, China
ISBN 978-981-16-5398-8 e-ISBN 978-981-16-5399-5
https://doi.org/10.1007/978-981-16-5399-5
© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2021
This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Singapore Pte Ltd.

The registered company address is: 152 Beach Road, #21-01/04 Gateway East, Singapore 189721, Singapore

To my parents and my advisor (Dr. Wenwu Tang). Their endless support has encouraged me to continue my research.

Preface

Neural networks, such as artificial neural networks and convolutional neural networks, are commonly used machine learning algorithms that have been extensively used in the GIScience domain to explore nonlinear and/or complex geographic phenomena. However, how to automatically adjust the parameters of neural networks is still an open question in GIScience. Moreover, the model performance of neural networks often depends on the parameter setting for a given dataset, and adjusting the parameter configuration of neural networks increases the overall running time. In this book, the author proposes an automated spatially explicit hyperparameter optimization approach to identify optimal or near-optimal parameter settings for neural networks and to accelerate the search process at both the model and computing levels. Two spatial prediction models are used to examine the utility of spatially explicit hyperparameter optimization. The results demonstrate that the proposed approach improves computing performance at the model and computing levels and addresses the challenge of finding optimal parameter settings for neural networks in the GIScience field.
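To make the notion of hyperparameter optimization concrete before the chapters that follow, here is a minimal sketch of a plain random search over two common neural network hyperparameters (the learning rate and the hidden layer size). It is a generic illustration only, not the spatially explicit approach developed in this book; the synthetic data, search ranges, and use of scikit-learn's MLPRegressor with cross-validation are assumptions made purely for demonstration.

```python
# Minimal random-search sketch for neural network hyperparameters.
# Generic illustration only: the data, search ranges, and model choices are
# assumptions for demonstration, not the book's spatially explicit method.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

# Synthetic regression data standing in for a spatial prediction dataset.
X = rng.normal(size=(300, 5))
y = X[:, 0] ** 2 + np.sin(X[:, 1]) + 0.1 * rng.normal(size=300)

best_score, best_config = -np.inf, None
for _ in range(20):  # 20 random trials
    config = {
        "learning_rate_init": 10 ** rng.uniform(-4, -1),    # hyperparameter
        "hidden_layer_sizes": (int(rng.integers(4, 64)),),  # hyperparameter
    }
    model = MLPRegressor(max_iter=1000, random_state=0, **config)
    score = cross_val_score(model, X, y, cv=3, scoring="r2").mean()
    if score > best_score:
        best_score, best_config = score, config

print("best configuration:", best_config, "mean CV R^2:", round(best_score, 3))
```

Even this naive search has to refit the network for every trial and every cross-validation fold, which is why the model- and computing-level acceleration discussed in this book matters as the search space grows.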

In the remainder of this book, Chap. concludes this book.

Minrui Zheng
Beijing, China
Acknowledgements

My deepest gratitude goes to my committee members, Drs. Wenwu Tang, Elizabeth Delmelle, Minwoo Lee, and Akin Ogundiran, for their support and guidance on this work.

I owe many thanks to former and current members of the Center for Applied GIScience (Dr. Michael Desjardins, Dr. Alexander Hohl, Yu Lan, Dr. Jianxin Yang, Tianyang Chen, Zachery Slocum) and faculty (Drs. Heather Smith, Eric Delmelle, Craig Allan, Yu Wang, and Lisa Russell-Pinson) at the University of North Carolina at Charlotte who have helped and encouraged me. I would also like to express my appreciation to my friends (Greg Verret, Amanda Verret, Mark Verret, Nathan Verret, Yi Zhang, Li Liu, Qiang Li, and Jiayang Li).

Contents
List of Figures
List of Tables
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2021
M. Zheng, Spatially Explicit Hyperparameter Optimization for Neural Networks, https://doi.org/10.1007/978-981-16-5399-5_1
1. Introduction
Minrui Zheng
School of Public Administration and Policy, Renmin University of China, Beijing, China
1.1 Background

In the past decades, with the increasing volume of spatial data and the development of cutting-edge techniques, a number of spatial models have been created to investigate complex spatial phenomena and explore spatial processes (Goodchild ).

Spatial modeling embraces a series of models and techniques that explore relationships, patterns, and phenomena across space and time. The steps of spatial modeling often proceed in a sequence from problem specification, model theory, data preparation, model verification, calibration, and evaluation to prediction (see Fig. 1.1).
Fig. 1.1 Spatial modeling process (adapted from Shannon )

Although spatial modeling exists in a number of research areas, one of the vital parts of spatial modeling is algorithms. Figure 1.2 illustrates the process of an algorithm from input X to output Y. Algorithms are a sequence of computational steps that transform the input into the output. Each spatial modeling exercise includes one or more model units, each of which consists of a single algorithm and its related parameters. There are two types of parameters, distinguished by their contributions to spatial modeling: standard parameters and hyperparameters. Standard parameters are an internal component of spatial modeling, and their values are usually derived from the models themselves, such as the coefficients of regression models and the coefficients of the objective functions of optimization models.
Fig. 1.2 Illustration of algorithms (Y = f(X); Y = [y1, y2, ..., yn]; X = [x1, x2, ..., xm]; f is the algorithm between X and Y), revised from Gahegan ()
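As a small, hedged illustration of the Y = f(X) view and of standard parameters, the sketch below fits an ordinary least-squares regression with scikit-learn on invented data: the fitted coefficients are standard parameters because their values are derived from the data by the fitting procedure itself rather than chosen by the user. The data and variable names are hypothetical and are not taken from this book.

```python
# Standard parameters illustration: regression coefficients are derived from
# the data by the fitting procedure, not set by the user.
# Synthetic data; all names here are hypothetical, for demonstration only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                                    # inputs X = [x1, x2]
y = 3.0 * X[:, 0] - 1.5 * X[:, 1] + 0.1 * rng.normal(size=200)   # output Y

f = LinearRegression().fit(X, y)   # f: the algorithm mapping X to Y
print("standard parameters (learned coefficients):", f.coef_, f.intercept_)
```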

Hyperparameters are an external component of spatial modeling, and their values are user-defined or pre-defined by other algorithms. Hyperparameters usually influence the algorithms themselves and the derivation of standard parameters. Examples of hyperparameters include the learning rate of artificial neural networks (ANNs), the initial number of clusters in k-means clustering, and the number of trees in the random forest algorithm. Hyperparameters also exist in rule-based spatial modeling, such as the cellular automata (CA) model: the stochastic disturbance term in the transition rules is a hyperparameter that allows the patterns generated by the CA model to be closer to reality (Yeh and Li ).
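By contrast, hyperparameters must be fixed before fitting begins. The minimal sketch below sets the number of clusters for k-means and the number of trees for a random forest up front; the particular values (3 clusters, 100 trees) are arbitrary assumptions for illustration, which is precisely why an automated search such as the one this book proposes is useful.

```python
# Hyperparameters illustration: these values are chosen by the user (or by a
# search procedure) before fitting. The specific values below are arbitrary.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(150, 2))
y = X[:, 0] + rng.normal(scale=0.1, size=150)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)             # k: hyperparameter
forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)  # tree count: hyperparameter
print("points per cluster:", np.bincount(kmeans.labels_))
print("number of trees:", len(forest.estimators_))
```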
