Guest author Ben Hall is the lead technical developer of C# .NET at gov.uk (the UK public sector information site) and a member of the .NET Foundation. He worked for nine years as a school teacher, covering programming and computer science. Ben enjoys making complex topics accessible and practical for busy developers.
Choosing between Docker Desktop and a DIY solution
The Docker Engine is at the heart of the Docker experience. Docker Desktop’s ready-to-use solution for building containerized applications includes the Docker Engine and all the other tools and settings you need to start development right away.
Developers can create a Docker “DIY” implementation around the Docker Engine manually. Some organizations may prefer the flexibility and control to do this themselves. But choosing a DIY Docker Engine solution requires more engineering, development, and setup. Docker and its Windows companion, WSL, are relatively complex, so the DIY approach isn’t right for everyone.
In this article, we’ll help you decide which approach is right for you and your organization. For clarity, we will be making comparisons between what Docker Desktop offers and DIY Docker setup on Windows.
Setting up Docker on Windows
This article on failingfast.io describes the main steps of a manual installation on Windows: creating a WSL 2 distro, setting up a Docker repository, and installing the Docker Engine into that distro. The process is a bit shaky, so be prepared to troubleshoot before you get everything running. And it's only a guide to get you started – most use cases will need further setup, including:
- Configuring Docker to start on boot
- Configuring logging
- Allowing Docker to accept connections from remote hosts
- Configuring remote access
- Fixing IP forwarding problems
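To give a sense of what that post-install configuration looks like, here is a sketch of a daemon config file covering startup, logging, and remote connections. The file path, port, and values are illustrative, and the keys follow Docker's documented `daemon.json` options; a real setup should secure any TCP endpoint with TLS.

```shell
# Draft a daemon.json locally before copying it into place (illustrative values).
mkdir -p demo-etc-docker
cat > demo-etc-docker/daemon.json <<'EOF'
{
  "log-driver": "json-file",
  "log-opts": { "max-size": "10m", "max-file": "3" },
  "hosts": ["unix:///var/run/docker.sock", "tcp://0.0.0.0:2376"]
}
EOF

# Validate the JSON before restarting the daemon:
python3 -m json.tool demo-etc-docker/daemon.json >/dev/null && echo "daemon.json OK"

# On the real host you would then install it and enable the service:
#   sudo cp demo-etc-docker/daemon.json /etc/docker/daemon.json
#   sudo systemctl enable --now docker
```

Note that `tcp://0.0.0.0:2376` exposes the daemon to the network; without TLS certificates this is unsafe, which is exactly the kind of detail a DIY setup leaves on your plate.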
Setting up Docker Desktop is an entirely different experience. Simply download and run the latest Docker Desktop installer – it completes all the work automatically. You're up and running in a few minutes, ready to develop with containers.
Cutting edge and stable
Docker Desktop and the DIY application we linked to share a common core on Windows Subsystem for Linux (WSL) 2 that enables developers to run a Linux environment directly on Windows.
WSL 2 significantly improved memory usage, code execution speed, and compatibility. It achieved this through an architectural shift to a full Linux kernel, which supports running Linux containers natively, without emulation.
Working closely with Microsoft and the Windows Insider program, Docker was quick to adopt this useful emerging technology as a backend for Docker Desktop, releasing a Technical Preview long before WSL 2 hit general availability in Windows. Every effort was also made to maintain feature parity with the previous Hyper-V-based version.
We can add Docker Desktop to our developer tools, confident that it will continue to support the latest technologies while avoiding disruptive changes to the experience we are accustomed to.
Docker Desktop manages everything, from setup to future kernel patches. And because it is a complete package, automatic software updates keep all the tools it installs up to date and secure, including the Docker Engine itself. That's one less thing to manage internally!
With a DIY Docker setup, it's up to you to keep up with all the security patches and other updates. A DIY solution will also leave you with a steady stream of recurring issues to solve. So, be sure to multiply those developer hours across a large organization when calculating the ROI of Docker Desktop.
Networking features

Docker Desktop automatically propagates the host's HTTP/HTTPS proxy server settings to Docker, for use when pulling images.
It will also work properly when connected to a VPN. It achieves this by intercepting traffic from containers and injecting it into Windows as if it originated from the Docker application itself.
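For comparison, on a DIY setup you maintain the proxy settings yourself in the Docker client config file. The `proxies` section below follows the documented `~/.docker/config.json` format; the proxy host and port are made-up examples, and the file is written to a demo directory rather than the real home directory.

```shell
# Manual equivalent of what Docker Desktop handles automatically:
# per-client proxy settings in ~/.docker/config.json (demo path used here).
mkdir -p demo-docker-home
cat > demo-docker-home/config.json <<'EOF'
{
  "proxies": {
    "default": {
      "httpProxy": "http://proxy.example.com:3128",
      "httpsProxy": "http://proxy.example.com:3128",
      "noProxy": "localhost,127.0.0.1"
    }
  }
}
EOF
python3 -m json.tool demo-docker-home/config.json >/dev/null && echo "config OK"
```

These settings are injected into containers as environment variables when they start, so every developer on a DIY setup has to get this file right on their own machine.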
Pause and resume
This feature was requested by a user on Docker Desktop's public roadmap. It's not the biggest feature ever, but it's another good reminder that Docker Desktop is under active development, constantly improving in response to user feedback, with monthly releases.
Users can now pause a Docker Desktop session to reduce CPU usage and preserve battery life. When paused, the current state of all your containers is saved in memory and all processes are frozen.
Volume management

Volumes are the standard way to persist data that Docker containers work with, including files shared between containers. Unlike bind mounts, which work directly with files on the host machine, volumes are managed by Docker, which offers several advantages.
You will face two major challenges when working with Docker volumes manually through the Docker CLI:
- It can be difficult to determine which container each volume belongs to, so cleaning up old volumes can be a slow process.
- Moving content into and out of volumes is more complicated than it needs to be.
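The challenges above become clear from the plain CLI workflow. The commands below are a sketch that assumes a running Docker daemon and an existing volume named `myvol` (both illustrative), so they are shown rather than meant to run as-is:

```shell
docker volume ls          # lists volumes, with no hint of which container owns what
docker system df -v       # shows volume sizes and which containers reference them

# Copying files out of a volume requires spinning up a throwaway container:
docker run --rm -v myvol:/data -v "$PWD":/backup busybox \
  cp -r /data /backup/myvol-copy
```

Every file transfer means writing a one-off container invocation like the last command, which is exactly the friction the Dashboard view removes.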
Docker Desktop solves this by providing a view in the Dashboard for exploring volumes. In this view, you can:
- Easily select which volumes to work with
- See which containers are using a volume
- Create and delete volumes
- Explore the files and folders in a volume, including file sizes
- Download files from volumes
- Search and sort by name, date, and size
Kubernetes integration

There are too many features to explore them all in one article, but we should take a look at the Kubernetes integration in Docker Desktop.
Kubernetes has become the standard for container orchestration, with 83 percent of respondents to the 2020 CNCF survey reporting that they use it in production.
We certainly don't need Kubernetes to get the advantages of Docker in local development, such as isolation from the host system. Additionally, we can use Docker Compose 2.0 to run multiple containers with some neat networking features. But if you are working on a project that will be deployed to Kubernetes in production, using a similar environment locally is a wise choice.
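As a small illustration of the multi-container case, here is a minimal Compose file. The services and images are just examples; Compose places them on a shared default network, so they can reach each other by service name with no extra configuration.

```shell
# Write a minimal Compose file (example services; images are illustrative).
cat > compose.yaml <<'EOF'
services:
  web:
    image: nginx:alpine
    ports:
      - "8080:80"      # host:container
  cache:
    image: redis:alpine
EOF
# With a running daemon, start both services with:
#   docker compose up -d
# Inside the "web" container, the cache is reachable simply as "cache:6379".
```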
In the past, a local Kubernetes instance was one more thing to set up, and for some, the cost in developer time didn't justify the benefit. The same is likely true for a DIY Docker solution.
In contrast, Docker Desktop comes with a standalone Kubernetes server and client for local testing. It is an uncomplicated, zero-configuration, single-node cluster. You can enable it through the user interface, as the image below shows, or switch to it in the usual way with kubectl config use-context.
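Switching to the built-in cluster from the command line looks like this. The context name `docker-desktop` is the one Docker Desktop registers; these commands assume kubectl is installed and the Kubernetes feature is enabled, so treat them as a sketch:

```shell
kubectl config get-contexts              # list the clusters kubectl knows about
kubectl config use-context docker-desktop
kubectl get nodes                        # should report a single local node
```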
Apple Silicon support
In 2021, the release of Docker Desktop for Mac that takes full advantage of the Apple M1 chip reached general availability. There are already over 145,000 ARM-based images on Docker Hub. The Apple Silicon version supports multi-platform images, which means you can build and run images for both x86 and ARM architectures without complex cross-compilation environments.
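Building one of those multi-platform images is a single Buildx invocation. The image name, tag, and platform list below are examples, and the command assumes a running daemon and registry credentials, so it is shown as a sketch rather than run here:

```shell
# Build for both architectures in one go and push the multi-platform
# manifest to a registry (example image name).
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  -t example/app:latest \
  --push .
```

A developer on an M1 Mac and a colleague on an x86 machine can then pull the same tag and automatically get the right variant.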
This was very well received, because the emulation provided by Rosetta 2, while acceptable for many common applications, is not sufficient for running containers.
Costs and scalability
The DIY alternative requires a significant amount of engineering time to build and configure, plus an ongoing maintenance commitment to update, patch, and troubleshoot the container environment. And most of this work is repeated by each developer in the organization every time they set up a new environment.
This approach does not scale well! Time spent on it is time developers don't spend on activities that directly benefit the business, such as new features. None of us enjoys a sprint review where we have to explain that we didn't deliver a feature because we were troubleshooting or setting up development environments.
Containers should make product delivery easier. What Docker Desktop aims to achieve is nothing new: we have always invested in IDEs and other tools that combine functionality into one handy package to improve productivity.
To help you decide if Docker Desktop is right for your organization from a cost perspective, Jeremy Castile has some guidelines to help you assess your return on investment.
Work with multiple environments
Developers widely accept that build artifacts should be immutable – the application, once built, should pass unchanged through quality assurance to production. The next level, if you like, is bundling the application together with its dependencies. This helps maintain consistency between the development, testing, and production environments.
We risk losing this benefit if the process is too complex. Organizations have handed teams plenty of great tools and processes, only for those tools to gather dust because the skills bar for entry was too high.
This situation is most evident in QA teams. Many testers are technical, but they typically have a testing-oriented skill set. Since QA is one of the groups that benefits most from consistent test environments, consider which tools they are realistically going to use.
Introduction to development environments
To further enhance the experience of these scenarios, Docker Desktop has added a new collaborative development feature, currently in preview, called Development Environments.
Switching git branches or environments usually requires a lot of manual changes to configuration, dependencies, and other environment settings before the code can be run.
The new feature makes it easy to keep the details of the environment itself in source control, as code. With the click of a button, a developer can share their work in progress, along with its dependencies, via Docker Hub. Teammates can then switch to a fully working instance of each other's work – for example, to review a pull request – without leaving their own local branch or making all those environment changes themselves.
Get started with Development Environments using the preview documentation.
Bret Fisher, an author who writes about Docker, sums up the need for Docker Desktop: "It's really a testament to Docker Desktop that there is no comparable tool for local Linux containers on macOS/Windows that solves 80% of what people usually need in terms of container running time locally."
We've explored what Docker Desktop has to offer, touching along the way on cost, return on investment, setup, maintenance, and scalability. Although some prefer the flexibility and control of DIY Docker, Docker Desktop requires far less effort to set up and maintain, offering a gentle learning curve for everyone from development to quality assurance.
Perhaps the biggest challenge with a DIY solution is the business-value perspective. Developers love figuring these things out, so they won't necessarily keep track of how many hours a week they spend maintaining a DIY solution – and the company has no visibility into the lost productivity.
If you're still using a DIY solution for local development with Docker on Windows or macOS, learn more about Docker Desktop and download it to get started.