Insights

Using Magento 2 and Docker for Local Development

Have your cake, and eat it too, in a container?
If you’re part of, or own, an agency, wouldn’t it be nice for your whole team to be able to spin up projects, including all their dependencies, in a setup that replicates your production sites as closely as possible? If you’re only working on one or maybe two sites, you’re probably saying, “Well yeah, we do that already, but running everything locally. What’s the big deal?”

But what happens when you have a team that doesn’t all work on the same hardware or OS? To complicate things further, what if you have multiple projects, each with a varying technical stack: different versions of Varnish, PHP, Nginx, and so on. What then?

Setting up local environments for each member of your team so they can easily switch between differing stacks is cumbersome and prone to fragmentation, as updates to one developer’s setup might be forgotten and never passed on to the rest of the team. Worse still, this ideal of a consistent developer setup is often abandoned or ignored because of the constant maintenance it requires, with most developers reverting to whatever stack “just works” on their machine.

Read on to see how C3 tackled this problem, with some encouraging results.

The Problem

As with many agencies, client projects can vary drastically in their technology requirements, with anything from nuanced to substantial differences between their tech stacks. This is easily explained by each project’s upgrade cycle and by which technologies were available at the time of launch. As the number of projects grows and evolves, keeping the development team’s environments in sync with what is currently in production for each project becomes a challenge. Fragmentation sets in, ultimately leading to the dreaded “well, it works on my machine, but not on the production server” bug.

The Goal

The ideal setup would be a system by which all developers can easily change their current working stack to match what is currently in production for each project. As project requirements change, those changes need to be applied easily and quickly to all development environments, ensuring all project work and testing is done against the technology currently in use.

Options

Scripts

Maybe just create scripts for PHP, Nginx, Varnish, and Elasticsearch that allow switching versions on the fly.

Pros:
+ Gives developers easy ways to switch up their tech stack

Cons:
– Difficult to maintain, and hard to ensure all members of the team are on the most up-to-date versions
– Easy for developers to forget to switch stacks between projects
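As a minimal sketch of what such a switcher could look like, here’s a hypothetical shell helper that repoints a shared symlink at the requested version. The paths, the `PHP_ROOT`/`PHP_LINK` locations, and the `switch_php` name are all illustrative, not tools we actually ship:

```shell
#!/bin/sh
# Hypothetical version switcher: repoint a symlink at the requested PHP
# version. All paths are illustrative examples.
set -eu

PHP_ROOT="${PHP_ROOT:-/usr/local/opt}"     # where versioned installs live
PHP_LINK="${PHP_LINK:-/usr/local/bin/php}" # the symlink developers' shells resolve

switch_php() {
    target="$PHP_ROOT/php@$1/bin/php"
    if [ ! -x "$target" ]; then
        echo "php $1 is not installed under $PHP_ROOT" >&2
        return 1
    fi
    ln -sfn "$target" "$PHP_LINK"
    echo "php now points at $target"
}
```

Multiply this by every service in the stack, and by every developer machine, and the maintenance burden listed above becomes clear.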

Be Draconian on Update Cycles

Basically, force the same stack and upgrade cycle on every client at the same time.

Pros:
+ All projects always run the same stack

Cons:
– Very inflexible to clients
– Could work, but likely isn’t realistic: few clients will stick around if there is no flexibility on technologies and update schedules, especially when those don’t make sense for their business

Go Virtual

Create virtual machines for each project environment.

Pros:
+ Ensures the same stack

Cons:
– Difficult to roll updates out to the team, adds an extra step to each developer’s workflow
– Virtual machines add more overhead to the developers’ hardware
– A new VM would need to be created for every flavour of tech stack a client may use.

Docker

Use Docker, and more specifically Docker Compose, to boot up the required tech stack for each project on the fly.

Pros:
+ Ability to customize the stack per project
+ Include a single file per project to boot the required tech stack on the fly
+ Controlled via Version Control System (VCS)
+ Easy to update and maintain along with project code.

Cons:
– Potential to add more overhead to the developers’ hardware (read on)
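To give a flavour of what that single per-project file can look like, here’s a hypothetical docker-compose.yml sketch; the service names and image tags are illustrative, not our actual configuration:

```yaml
# Hypothetical per-project stack definition; image tags are examples only.
version: "3"
services:
  nginx:
    image: nginx:1.15
    ports:
      - "80:80"
    depends_on:
      - php
  php:
    image: php:7.2-fpm
    volumes:
      - ./src:/var/www/html
  varnish:
    image: varnish:6.0
  elasticsearch:
    image: elasticsearch:6.8.23
```

Each developer simply runs `docker-compose up -d` in the project root; changing a version for the whole team becomes a one-line edit committed to the project repository.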

There will be Cake

Three cheers for Docker!

With the ability for our team to simply update the required stacks for each project via VCS, along with the project code itself, we’re home free. Developers are (or should be) used to regularly pulling down changes into their current working branches, which will update the tech stack on the fly each time they boot their local copy of the project. #winning

The Cake is a Lie

Docker + Magento = slow on MacOS…

As with many PHP-based web development agencies, we’ve opted to use macOS as our primary development operating system. Unfortunately, we discovered that Docker for Mac is notoriously slow to work with, as the file sharing layer between macOS and the Docker virtual machine (osxfs, running on top of the HyperKit hypervisor) is highly resource intensive. As a result, Docker’s performance ended up being too slow for practical development on large Magento 2 projects.

While various resources have blogged about helpful speed improvements, ultimately these either didn’t improve performance enough to be viable or simply introduced more complexity to maintain, negating the point of the endeavour. Docker themselves acknowledge the performance issues with Docker on macOS and assure us they’re working hard to resolve them. For some time, we thought Docker was a lost cause as a solution to our developer tech stack fragmentation problem.

Note:
To our understanding, there used to be similar performance issues with the Microsoft Windows version of Docker as well; however, with Microsoft’s recent addition of the Windows Subsystem for Linux to Windows 10, containers can now run natively, as if on a regular Linux-based OS.

Or is it?

Native Network File Share to the Rescue

Recently, our Technical Director Jamie Saunders discovered a script that boots up a local network share on macOS. With this tool in hand, we were able to circumvent the file synchronisation overhead between Docker and macOS, drastically improving the speed of Docker on macOS!
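For reference, the general shape of such a workaround (assuming the share in question is an NFS export, as the heading suggests) is to have Docker mount project files over the network share instead of the default file sharing layer. A hypothetical Compose volume definition; the host path, address, and mount options are illustrative and will differ per machine:

```yaml
# Hypothetical: mount project source over an NFS export from the host
# instead of the default Docker for Mac file share. All values are examples.
volumes:
  project-src:
    driver: local
    driver_opts:
      type: nfs
      o: "addr=host.docker.internal,rw,nolock,hard,nointr,nfsvers=3"
      device: ":/Users/dev/projects/shop/src"
```

The host also has to export the directory (via /etc/exports) and have its NFS daemon running, which is exactly the sort of setup the script automates.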

The dream lives on and, at the time of writing, we’re building out Docker implementations of all our current projects, with satisfying results.

About the author

Michael Smith