As you can guess, this has been a fun work day.
Docker image layering and nightlies for the heavier installs have worked pretty well for me. Dependencies from npm, Composer, etc. are still installed at build time, but more of the base stuff is on a weekly build cycle. We just send notifications if the nightlies fail and resolve it manually, which happens very seldom.
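A rough sketch of that kind of layering, as I read it — image names, packages, and commands here are all placeholders, not the actual setup:

```dockerfile
# Base image, rebuilt on the nightly/weekly schedule.
# Heavy system-level installs live here so per-commit builds stay fast.
FROM node:20-slim AS base
RUN apt-get update && apt-get install -y --no-install-recommends \
        git imagemagick \
    && rm -rf /var/lib/apt/lists/*

# App image, built per commit: only the fast, dependency-level
# steps (npm install etc.) run at build time.
FROM base
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
CMD ["node", "server.js"]
```

The split means a broken nightly only affects the base layer, which is why a manual-resolution policy is cheap enough to live with.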
Yeah, I’ve got it configured well enough, but this is a project I took over from an ex-coworker. It’s overly “clever” and complicated, so untangling that mess is gonna take a while, and I didn’t really wanna start it before the end of the year. But I had to do some work on that project, and the cache that slows deployments down was just the latest of the gifts he left us.
Basically, to avoid pulling and pushing docker images, he exports them to an archive that gets stored in the cache and loaded back into docker on restore. Sounds smart, right? To minimize traffic, the compression is fucking aggressive and takes more time than the rest of the job. As a bonus, we store the cache in S3, so the pull and push of the docker image pretty much still happen, just much slower.
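For anyone who hasn't seen this pattern: the cache dance is roughly these CI steps. This is a sketch of the technique described above, not his actual config — image names, paths, and the exact compression level are assumptions on my part:

```shell
# Save step: export the image and compress hard to shrink the cache blob.
# An aggressive level like -9 is where most of the job's wall time goes.
docker save my-app:latest | gzip -9 > image-cache/my-app.tar.gz
# The archive then gets uploaded to the S3-backed cache, so the bytes
# still cross the network -- just wrapped in an extra save/compress cycle.

# Restore step on the next run: download from the cache, then load.
gunzip -c image-cache/my-app.tar.gz | docker load
```

A registry pull/push already gets per-layer caching and compression for free, which is why this scheme mostly re-implements the slow parts of what it was trying to avoid.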