Jérôme Petazzoni 1/19/2015

Putting data in a volume in a Dockerfile


This technical article details a Docker performance issue: when large files (e.g., 10+ GB) are added to a directory that is later declared as a VOLUME in a Dockerfile, both builds and container startup become slow. The author explains that Docker copies the data from the image layer into a new anonymous volume on every container creation, as well as during certain build steps. The solution is to avoid placing large datasets in volume directories: keep read-only data in the image's copy-on-write filesystem instead, or decouple application and data into separate images/volumes when native I/O performance is needed.
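The pattern the article warns about can be sketched as follows (paths and filenames here are illustrative, not from the original):

```dockerfile
# Anti-pattern (sketch): the dataset is baked into an image layer,
# and the VOLUME declaration then forces Docker to copy it into a
# fresh anonymous volume for every container created from the image.
FROM ubuntu
ADD bigdata.tar.gz /data    # hypothetical 10+ GB dataset in a layer
VOLUME /data                # each container creation re-copies /data

# Preferred (sketch): omit the VOLUME so read-only data is served
# straight from the copy-on-write filesystem; or ship the data in a
# separate image/volume and attach it at run time.
# FROM ubuntu
# ADD bigdata.tar.gz /data  # no VOLUME declaration follows
```

With the VOLUME declaration removed, nothing is copied at container creation; if containers need to write to the data at native speed, mounting a separately managed volume at run time avoids paying the copy on every start.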


