

How to update all pulled Docker images that are tagged as latest

Recently, we moved a client to Docker and we needed to give them a way to automagically update all “latest” Docker images.
Since Docker does not have a single command to update all pulled images, we used this one-liner to update all of them at once:

docker images --format "{{.Repository}}:{{.Tag}}" | grep ':latest' | xargs -L1 docker pull;

The above command will:

  1. Print all images in the format RepositoryName:Tag
  2. Then it will keep only the lines that contain the tag :latest (which is the tag we are interested in)
  3. Finally, each resulting line (one image per line) is fed via xargs -L1 to docker pull, as shown in the equivalent loop below
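
The same pipeline can also be written as a plain loop, which some people find easier to read; this is just an equivalent sketch of the one-liner above, with the grep pattern anchored to the end of the line:

docker images --format "{{.Repository}}:{{.Tag}}" \
  | grep ':latest$' \
  | while read -r image; do
      # Pull the newest version of every image that is tagged as latest.
      docker pull "$image";
    done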

Please note that you cannot really update an existing container using docker commands; what you actually need to do is:

  1. Stop the container whose image was updated
  2. Delete it
  3. Recreate it using the parameters of the previous container

As you can see, it is good practice to keep all of your data in volumes outside of the container, so that this update process remains easy.
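
A named volume, for instance, survives the remove-and-recreate cycle; the volume name app_data, the container name my-app, and the image my-image below are placeholders for illustration only, not part of our actual setup:

docker volume create app_data;
docker run -d --name my-app -v app_data:/data my-image:latest;
# After pulling a newer my-image, the container can be removed and recreated
# with the same command and the data in app_data will still be there.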

For example, below you will find the commands we used to update the jwilder/nginx-proxy and jrcs/letsencrypt-nginx-proxy-companion images, along with the two containers that were using them:

docker container stop nginx-proxy nginx-letsencrypt;
docker container rm nginx-proxy nginx-letsencrypt;
docker run -d -p 443:443 \
     --name nginx-proxy \
     --net reverse-proxy \
     -v $HOME/certs:/etc/nginx/certs:ro \
     -v /etc/nginx/vhost.d \
     -v /usr/share/nginx/html \
     -v /var/run/docker.sock:/tmp/docker.sock:ro \
     -v $HOME/my_proxy.conf:/etc/nginx/conf.d/my_proxy.conf:ro \
     --label com.github.jrcs.letsencrypt_nginx_proxy_companion.nginx_proxy=true \
     jwilder/nginx-proxy:latest;

docker run -d \
     --name nginx-letsencrypt \
     --net reverse-proxy \
     --volumes-from nginx-proxy \
     -v $HOME/certs:/etc/nginx/certs:rw \
     -v /var/run/docker.sock:/var/run/docker.sock:ro \
     jrcs/letsencrypt-nginx-proxy-companion:latest;
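
After recreating them, you can quickly confirm that both containers are up again; the filter values below simply match the container names used above:

docker container ls --filter 'name=nginx-proxy' --filter 'name=nginx-letsencrypt';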


How do ‘git pull’ and ‘git fetch’ differ?

In simple terms, the difference between the two Git commands is that git pull is a git fetch followed by a git merge.

When you use git pull, Git will automatically merge any pulled commits into the branch you are currently working on, without letting you review them first. You will only get a prompt if a conflict is found during the automatic merge.

When you use git fetch, Git retrieves any commits from the target branch that do not exist in your current branch and stores them in your local repository. However, it will not merge them into your current branch; to integrate the new commits, you need to run git merge manually.
This command is particularly useful when you need to keep your repository up to date but are working on something that might break if you merged the new changes.
For example, if you are about to go offline and need those commits available to you but cannot merge them at the time, git fetch will download them to your machine without affecting your current code; later you can merge the new commits into your branch as/when you please.
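
As a quick illustration, assuming a remote named origin and a branch named master (neither of which is specified above), the following two sequences are roughly equivalent:

# Fetch and merge in one step.
git pull origin master;

# Fetch first, review the new commits, then merge manually.
git fetch origin;
git log ..origin/master;
git merge origin/master;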


Pull all Git repositories you have access to

ssh git@git.bytefreaks.net info | cut -f 2 | tail -n +3 | xargs -I_repository -- sh -c 'cd "_repository" && git pull'

The above command will connect to the git server (git.bytefreaks.net), which runs gitolite, and get a list of all the repositories you have access to using ssh git@git.bytefreaks.net info.

The command should return a list similar to this:

hello bytefreaks, this is git@git.bytefreaks.net running gitolite3 v3.5.3.1-1-gf8776f5 on git 1.7.1

 R W	Repo1
 R W	Repo2
 R W	Repo3
 R  	Repo4

From these results, cut -f 2 keeps only the second tab-separated column of each line, which is the one that holds the repository name as it is stored on the server, and tail -n +3 discards the first two lines of the output (the greeting and the blank line that follows it), so that only the lines describing repositories we have access to remain.

At the last stage of the pipe, we have a list of repository names; using xargs, each name is substituted for the _repository placeholder and, one result at a time, a shell is spawned that navigates into the folder of that repository with cd and calls git pull.

Note: We assume that all repositories have already been cloned under the current folder, each in a sub-folder of its own that is named after the repository.
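
If you prefer something easier to debug than the one-liner, a plain loop does the same job; this is only a sketch that assumes the same gitolite server and the same folder layout as above:

ssh git@git.bytefreaks.net info | cut -f 2 | tail -n +3 | while read -r repository; do
  if [ -d "$repository" ]; then
    # The repository is already cloned locally, so update it.
    (cd "$repository" && git pull)
  else
    # Skip repositories that have not been cloned in the current folder.
    echo "Skipping '$repository': no such folder" >&2
  fi
done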