Docker and Virtualenv? A clean way to locally install python dependencies with pip in Docker

If you’ve ever developed a python application, you’ve probably installed your python dependencies in a virtualenv. A simple way to do so is:

# build a virtualenv
virtualenv venv

# activate the virtualenv
source venv/bin/activate

# install some dependencies
pip install flask

Thanks to virtualenv your project dependencies are now isolated from your other projects and from the operating system packages. Simple, isn’t it?

Another way to isolate your project locally is to install your dependencies in a docker container (actually, the best practice would be to use virtualenv inside a docker container, as described here).

In this use case, you’ll want to store the python packages required by your application in a mounted folder, to avoid re-installing them every time you reset your container during development. In other words, you’ll want to store the python dependencies in a specific folder.

The first obvious solution is the pip option -t, --target <dir> (“Install packages into <dir>”).

However, this option turns out to be a trap. With --target, the installer changes its behaviour in an undesirable way for our use case, and becomes incompatible with the --upgrade option, as described here.

A better solution, in line with PEP 370, is to use the PYTHONUSERBASE environment variable.

You then just need to use pip install --user, and your packages will be installed in a specific folder without any of the strange side effects of the --target option.
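For instance, the mechanism boils down to this (a minimal sketch; /mnt/vendor is the folder used in the docker-compose file below):

```shell
# point the user base at a project-local folder
export PYTHONUSERBASE=/mnt/vendor

# python now derives the user site-packages path from it
python3 -c "import site; print(site.getuserbase())"   # prints /mnt/vendor

# from here, `pip install --user <package>` would install under
# /mnt/vendor/lib/pythonX.Y/site-packages -- no --target quirks involved
```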

Here is the detailed step-by-step solution.

Your docker-compose file should look like this:

# docker-compose.yml
services:
  vendors:
    image: python:3
    working_dir: /mnt
    volumes:
      - .:/mnt
    environment:
      PYTHONUSERBASE: /mnt/vendor
    command: pip install -r requirements.txt --user --upgrade

  server:
    image: python:3
    working_dir: /mnt
    volumes:
      - .:/mnt
    ports:
      - '5000:5000'
    environment:
      PYTHONUSERBASE: /mnt/vendor
    command: python src/

Install your vendors (do it twice just to check!):
docker-compose run --rm vendors

Run your app:
docker-compose up -d server


The PYTHONUSERBASE environment variable is used to compute the path of the user site-packages directory. Use it together with the pip --user option to install python packages in a custom directory.
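You can watch this computation happen from Python itself. Here is a small sketch (the /mnt/vendor path is the one from the compose file above; the variable must be set before the interpreter starts, hence the subprocess):

```python
import os
import subprocess
import sys

# Spawn a fresh interpreter with PYTHONUSERBASE overridden: the site module
# reads the variable at startup to compute the user base and the user
# site-packages directory.
env = dict(os.environ, PYTHONUSERBASE="/mnt/vendor")
out = subprocess.run(
    [sys.executable, "-c",
     "import site; print(site.getuserbase()); print(site.getusersitepackages())"],
    env=env, capture_output=True, text=True,
).stdout.splitlines()

print(out[0])  # /mnt/vendor
print(out[1])  # e.g. /mnt/vendor/lib/python3.11/site-packages on Linux
```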

You liked this article? You'd probably be a good match for our ever-growing tech team at Theodo.

Join Us

  • florian klein

    Hey there :)

I struggled a lot with this too, and after some research I found out the best approach was to use the “onbuild” python image, which runs a hook during image creation.
This way, pip is run at build time, the dependencies are available in the image itself, and there is no need to mess with the env variables anymore!

    Hope it helps,

  • - “command: pip install -r requirements.txt --user --upgrade“ runs pip as root.
    - pip then runs each as root
    - you can specify package hashes in requirements.txt:
    - In a Dockerfile, you can specify USER
    - In docker-compose.yml, you can su (or sudo, which should create a log, if sudo is already installed)