
David Foster, Generative Deep Learning: Teaching Machines to Paint, Write, Compose, and Play, 2nd Edition (Seventh Early Release). O'Reilly Media, Inc., 2022.



Summary and Description


Generative modeling is one of the hottest topics in AI. It's now possible to teach a machine to excel at human endeavors such as painting, writing, and composing music. With this practical book, machine learning engineers and data scientists will discover how to re-create some of the most impressive examples of generative deep learning models, such as variational autoencoders (VAEs), generative adversarial networks (GANs), Transformers, normalizing flows, energy-based models, and diffusion models.

Author David Foster demonstrates the inner workings of each technique, starting with the basics of deep learning before advancing to some of the most cutting-edge algorithms in the field. Through tips and tricks, you'll understand how to make your models learn more efficiently and become more creative.

A generative model can be broadly defined as follows: generative modeling is a branch of machine learning (ML) that deals with the creation of models that can generate new data points that are similar to the training data.

What does this mean in practice? Suppose we have a dataset containing images of horses. We may wish to build a model that can generate a new image of a horse that has never existed but still looks real, because the model has learned the general rules that govern the appearance of a horse. This is the kind of problem that can be solved using generative modeling.

In order to truly understand what generative modeling aims to achieve and why this is important, it is useful to compare it to its counterpart, discriminative modeling. If you have studied machine learning, most problems you will have faced will most likely have been discriminative in nature.

Chapter 3 will first lay out the theoretical underpinning of generative adversarial networks (GANs). You will then learn how to build your own GANs using Keras. We use the Keras function image_dataset_from_directory to create a TensorFlow Dataset pointed at the directory where the images are stored. This allows us to read batches of images into memory only when required (e.g., during training), so that we can work with large datasets and not worry about having to fit the entire dataset into memory.

  • Discover how VAEs can change facial expressions in photos
  • Build practical GAN examples from scratch to generate images based on your own dataset
  • Create autoregressive generative models, such as LSTMs for text generation and PixelCNN models for image generation
  • Build music generation models, using Transformers and MuseGAN
  • Explore the inner workings of state-of-the-art architectures such as StyleGAN, GPT-3, and DDIM
  • Dive into the detail of multimodal models such as DALL·E 2 and Imagen for text-to-image generation
  • Understand how generative world models can help agents accomplish tasks within a reinforcement learning setting
  • Understand how the future of generative modeling might evolve, including how businesses will need to adapt to take advantage of the new technologies
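The lazy-loading pattern described above can be sketched in a few lines of Keras. The dummy-image setup, directory name, and sizes below are illustrative placeholders, not taken from the book:

```python
import os
import tempfile

import numpy as np
import tensorflow as tf

# Build a throwaway directory of dummy PNGs so the snippet is self-contained.
root = tempfile.mkdtemp()
for i in range(4):
    img = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)
    tf.io.write_file(
        os.path.join(root, f"img_{i}.png"), tf.io.encode_png(img)
    )

# Stream batches from disk only when needed, rather than loading the
# whole dataset into memory up front.
dataset = tf.keras.utils.image_dataset_from_directory(
    root,
    labels=None,          # generative modeling: no class labels required
    label_mode=None,
    image_size=(32, 32),  # resize each image on load
    batch_size=2,
)
batch = next(iter(dataset))
print(batch.shape)  # (2, 32, 32, 3)
```

Because the result is a regular tf.data.Dataset, the usual pipeline methods (shuffling, mapping a preprocessing function, prefetching) can be chained on afterwards.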


Generative Deep Learning

by David Foster

Copyright 2023 Applied Data Science Partners Ltd. All rights reserved.

Printed in the United States of America.

Published by O'Reilly Media, Inc., 1005 Gravenstein Highway North, Sebastopol, CA 95472.

O'Reilly books may be purchased for educational, business, or sales promotional use. Online editions are also available for most titles (http://oreilly.com). For more information, contact our corporate/institutional sales department: 800-998-9938 or corporate@oreilly.com.

  • Acquisitions Editor: Nicole Butterfield
  • Development Editor: Michele Cronin
  • Production Editor: Christopher Faucher
  • Interior Designer: David Futato
  • Cover Designer: Karen Montgomery
  • July 2019: First Edition
  • June 2023: Second Edition
Revision History for the Early Release
  • 2022-06-28: First Release
  • 2022-08-08: Second Release
  • 2022-08-29: Third Release
  • 2022-10-11: Fourth Release
  • 2022-11-16: Fifth Release
  • 2023-01-24: Sixth Release
  • 2023-03-07: Seventh Release

See http://oreilly.com/catalog/errata.csp?isbn=9781098134181 for release details.

The O'Reilly logo is a registered trademark of O'Reilly Media, Inc. Generative Deep Learning, the cover image, and related trade dress are trademarks of O'Reilly Media, Inc.

The views expressed in this work are those of the author and do not represent the publisher's views. While the publisher and the author have used good faith efforts to ensure that the information and instructions contained in this work are accurate, the publisher and the author disclaim all responsibility for errors or omissions, including without limitation responsibility for damages resulting from the use of or reliance on this work. Use of the information and instructions contained in this work is at your own risk. If any code samples or other technology this work contains or describes is subject to open source licenses or the intellectual property rights of others, it is your responsibility to ensure that your use thereof complies with such licenses and/or rights.

978-1-098-13412-9

Dedication

For Alina, the loveliest noise vector of them all.

Preface

What I cannot create, I do not understand.

Richard Feynman

Generative AI is one of the most revolutionary technologies of our time, transforming the way we interact with machines. Its potential to revolutionize the way we live, work, and play has been the subject of countless conversations, debates, and predictions. But what if there was an even greater potential to this powerful technology? What if the possibilities of generative AI extend beyond our current imagination? The future of generative AI may be more exciting than we ever thought possible.

Since our earliest days, we have sought opportunities to generate original and beautiful creations. For early humans, this took the form of cave paintings depicting wild animals and abstract patterns, created with pigments placed carefully and methodically onto rock. The Romantic Era gave us the mastery of Tchaikovsky symphonies, with their ability to inspire feelings of triumph and tragedy through sound waves, woven together to form beautiful melodies and harmonies. And in recent times, we have found ourselves rushing to bookshops at midnight to buy stories about a fictional wizard, because the combination of letters creates a narrative that wills us to turn the page and find out what happens to our hero.

It is therefore not surprising that humanity has started to ask the ultimate question of creativity: can we create something that is in itself creative?

This is the question that generative AI aims to answer. With recent advances in methodology and technology, we are now able to build machines that can paint original artwork in a given style, write coherent blocks of text with long-term structure, compose music that is pleasant to listen to, and develop winning strategies for complex games by generating imaginary future scenarios. This is just the start of a generative revolution that will leave us with no choice but to find answers to some of the biggest questions about the mechanics of creativity, and ultimately, what it means to be human.

In short, there has never been a better time to learn about generative AI, so let's get started!

Objective and Approach

This book assumes no prior knowledge of generative AI. We will build up all of the key concepts from scratch in a way that is intuitive and easy to follow, so do not worry if you have no experience with generative AI. You have come to the right place!

Rather than only cover the techniques that are currently in vogue, this book serves as a complete guide to generative modeling that covers a broad range of model families. There is no one technique that is objectively better or worse than any other - in fact, many state-of-the-art models now mix together ideas from across the broad spectrum of approaches to generative modeling. For this reason, it is important to keep abreast of developments across all areas of generative AI, rather than only focus on one particular kind of technique. One thing is certain - the field of generative AI is moving fast and you never know where the next ground-breaking idea will come from!

With this in mind, the approach I will take is to show you how to train your own generative models on your own data, rather than relying on pre-trained off-the-shelf models. Whilst there are now many impressive open-source generative models that can be downloaded and run in a few lines of code, the aim of this book is to dig deeper into their architecture and design from first principles, so that you gain a complete understanding of how they work and can code up examples of each technique from scratch using Python and Keras.

In summary, this book can be thought of as a map of the current generative AI landscape that covers both theory and practical applications, including full working examples of key models from the literature. We will walk through the code for each step by step, with clear signposts that show how the code implements the theory underpinning each technique. This book can be read cover-to-cover or used as a reference book that you can dip into. Above all, I hope you find it a useful and enjoyable read!

Note

Throughout the book, you will find short, allegorical stories that help explain the mechanics of some of the models we will be building. I believe that one of the best ways to teach a new abstract theory is to first convert it into something that isn't quite so abstract, such as a story, before diving into the technical explanation. The story and the model explanation are just the same mechanics explained in two different domains; you might therefore find it useful to refer back to the relevant story while learning about the technical details of each model!

Prerequisites

This book assumes that you have experience coding in Python. If you are not familiar with Python, the best place to start is through LearningPython.org. There are many free resources online that will allow you to develop enough Python knowledge to work with the examples in this book.

Also, since some of the models are described using mathematical notation, it will be useful to have a solid understanding of linear algebra (for example, matrix multiplication) and general probability theory. A useful resource is the book Mathematics for Machine Learning (Deisenroth, Faisal, Ong), which is freely available online.
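As a quick self-check on the level of linear algebra assumed, a matrix multiplication in NumPy looks like this (the matrix and vector values are arbitrary illustrative numbers):

```python
import numpy as np

# A 2x2 matrix applied to a 2-vector: y[i] = sum_j A[i, j] * x[j]
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([1.0, 0.5])

y = A @ x  # [1*1 + 2*0.5, 3*1 + 4*0.5]
print(y)   # [2. 5.]
```

If each step here feels familiar, you have the background needed for the notation used in the later chapters.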

