Docker: Up and Running
by Sean P. Kane with Karl Matthias
Copyright 2023 Sean P. Kane and Karl Matthias. All rights reserved.
Printed in the United States of America.
Published by O'Reilly Media, Inc., 1005 Gravenstein Highway North, Sebastopol, CA 95472.
O'Reilly books may be purchased for educational, business, or sales promotional use. Online editions are also available for most titles (https://oreilly.com). For more information, contact our corporate/institutional sales department: 800-998-9938 or corporate@oreilly.com.
- Acquisitions Editor: John Devins
- Development Editor: Michele Cronin
- Production Editor: Elizabeth Faerm
- Copyeditor: Sonia Saruba
- Proofreader: TO COME
- Indexer: TO COME
- Interior Designer: David Futato
- Cover Designer: Randy Comer
- Illustrator: Kate Dullea
- June 2015: First Edition
- September 2018: Second Edition
- April 2023: Third Edition
Revision History for the Early Release
- 2022-10-25: First Release
- 2023-01-23: Second Release
- 2023-02-16: Final Release
See https://oreilly.com/catalog/errata.csp?isbn=9781098131821 for release details.
The O'Reilly logo is a registered trademark of O'Reilly Media, Inc. Docker: Up and Running, the cover image, and related trade dress are trademarks of O'Reilly Media, Inc.
The views expressed in this work are those of the author, and do not represent the publisher's views. While the publisher and the author have used good faith efforts to ensure that the information and instructions contained in this work are accurate, the publisher and the author disclaim all responsibility for errors or omissions, including without limitation responsibility for damages resulting from the use of or reliance on this work. Use of the information and instructions contained in this work is at your own risk. If any code samples or other technology this work contains or describes is subject to open source licenses or the intellectual property rights of others, it is your responsibility to ensure that your use thereof complies with such licenses and/or rights.
978-1-098-13176-0
[LSI]
Chapter 1. Introduction
Docker was first introduced to the world, with no pre-announcement and little fanfare, by Solomon Hykes, founder and CEO of a company then called dotCloud, in a five-minute lightning talk at the Python Developers Conference in Santa Clara, California, on March 15, 2013. At the time of this announcement, only about 40 people outside of dotCloud had been given the opportunity to play with Docker.
Within a few weeks of this announcement, there was a surprising amount of press. The project was quickly open-sourced and made publicly available on GitHub, where anyone could download and contribute to the project. Over the next few months, more and more people in the industry started hearing about Docker and how it was going to revolutionize the way software was built, delivered, and run. And within a year, almost no one in the industry was unaware of Docker, but many were still unsure what it was exactly, and why people were so excited about it.
Docker is a tool that promises to easily encapsulate the process of creating a distributable artifact for any application, deploying it at scale into any environment, and streamlining the workflow and responsiveness of agile software organizations.
The Promise of Docker
Initially, many people who were unfamiliar with Docker viewed it as some sort of virtualization platform, but in reality, it was the first widely accessible tool to build on top of a much newer technology called containerization. Docker and Linux containers have had a significant impact on a wide range of industry segments that include tools and technologies like Vagrant, KVM, OpenStack, Mesos, Capistrano, Ansible, Chef, Puppet, and so on. There is something very telling about the list of products that have had their market share directly impacted by Docker, and maybe you've spotted it already. Looking over this list, most engineers would recognize that these tools span a lot of different use cases, yet all of these workflows have been forever changed by Docker. This is largely because Docker has significantly altered everyone's expectations of how a CI/CD workflow should function. Instead of each step involving a time-consuming process managed by specialists, most people expect a DevOps pipeline to be fully automated and flow from one step to the next without any human intervention. The technologies in that list are also generally acclaimed for their ability to improve productivity, and that's exactly what has given Docker so much buzz. Docker sits right in the middle of some of the most enabling technologies of the last decade and can bring significant improvements to almost every step of the pipeline.
If you were to do a feature-by-feature comparison of Docker and the reigning champion in any of these individual areas (e.g., configuration management), Docker would very likely look like a middling competitor. It's stronger in some areas than others, but what Docker brings to the table is a feature set that crosses a broad range of workflow challenges. By combining the ease of application testing and deployment tools like Vagrant and Capistrano with the ease of administering virtualization systems, and then providing interfaces that make workflow automation and orchestration easy to implement, Docker provides a very enabling feature set.
Lots of new technologies come and go, and a dose of skepticism about the newest rage is always healthy. When Docker was new, it would have been easy to dismiss it as just another technology that solves a few very specific problems for developers or operations teams. If you look at Docker as a pseudo-virtualization or deployment technology alone, it might not seem very compelling. But Docker is much more than it seems on the surface.
It is hard and often expensive to get communication and processes right between teams of people, even in smaller organizations. Yet we live in a world where the communication of detailed information between teams is increasingly required to be successful. Discovering and implementing a tool that reduces the complexity of that communication while aiding in the production of more robust software is a big win. And that's exactly why Docker merits a deeper look. It's no panacea, and the way that you implement Docker within your organization requires some critical thought, but Docker and Linux containers provide a good approach to solving some real-world organizational problems and helping to enable companies to ship better software faster. Delivering a well-designed Linux container workflow can lead to happier technical teams and real savings for the organization's bottom line.
So where are companies feeling the most pain? Shipping software at the speed expected in today's world is hard to do well, and as companies grow from one or two developers to many teams of developers, the burden of communication around shipping new releases becomes much heavier and harder to manage. Developers have to understand a lot of complexity about the environment they will be shipping software into, and production operations teams need to increasingly understand the internals of the software they ship. These are all generally good skills to work on because they lead to a better understanding of the environment as a whole and therefore encourage the designing of robust software, but these same skills are very difficult to scale effectively as an organization's growth accelerates.
The details of each company's environment often require a lot of communication that doesn't directly build value for the teams involved. For example, requiring developers to ask an operations team for