Ubuntu – Overwrite dockerd default settings

While trying to create a new bridge network on Docker, we got the following error:

$ docker-compose up -d;
Creating network "docker-compose_new_bridge" with driver "bridge"
ERROR: could not find an available, non-overlapping IPv4 address pool among the defaults to assign to the network

After investigating, we realized that the error was caused by Docker's default address pools, which limit how many virtual networks can be created. To overcome the problem, we had to give the daemon access to more address space through the /etc/docker/daemon.json configuration file.

On Ubuntu, that file did not exist, so we created it and added the following content to it:

{
  "default-address-pools": [
    {
      "base": "172.80.0.0/16",
      "size": 24
    },
    {
      "base": "172.90.0.0/16",
      "size": 24
    }
  ]
}

Source: https://docs.docker.com/engine/reference/commandline/dockerd/

This configuration allowed Docker to reserve the network address spaces 172.80.[0-255].0/24 and 172.90.[0-255].0/24, which provided the daemon with a total of 512 networks, each containing 256 addresses.
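
Before restarting the daemon, it can be useful to confirm that the file is valid JSON, since a syntax error will prevent dockerd from starting. One quick way to check, assuming python3 is installed on the host:

# Pretty-prints the file; a parse error here means the configuration is malformed.
python3 -m json.tool /etc/docker/daemon.json;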

To apply the changes to the daemon, we restarted it:

sudo systemctl restart docker.service;

and then we applied our changes to our Docker ecosystem:

docker-compose up -d;
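
To confirm that the daemon picked up the new pools, one can create a throw-away network and inspect the subnet it was assigned; a minimal sketch, assuming the name test-pool is not already in use:

docker network create test-pool;
# The reported subnet should be a /24 inside 172.80.0.0/16 or 172.90.0.0/16, e.g. 172.80.0.0/24.
docker network inspect --format '{{ (index .IPAM.Config 0).Subnet }}' test-pool;
docker network rm test-pool;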

How to retrieve the SSL cert expiration date from a PEM encoded certificate?

We use the following command to get the expiration date of PEM-encoded certificates that were generated using certbot and Let's Encrypt:

openssl x509 -enddate -noout -in fullchain.pem;
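
openssl can also tell whether a certificate is about to expire: the -checkend flag checks if it expires within a given number of seconds. A small sketch using a 30-day threshold (2592000 seconds):

# Exit code 0 means the certificate will still be valid after the given period.
if openssl x509 -checkend 2592000 -noout -in fullchain.pem > /dev/null; then
  echo "The certificate is valid for at least 30 more days.";
else
  echo "The certificate expires within 30 days (or has already expired).";
fi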

To get a list of all certificates and their expiration dates, we issue the following find command, which executes the above snippet on each result while printing the name of the file first.

find ~/certificates/ -name "fullchain.pem" -print -exec openssl x509 -enddate -noout -in '{}' \;

In this example, the certificates are in our home folder under the name ‘certificates’. The results will look like the following sample:

/home/tux/certificates/example.com/fullchain.pem
notAfter=Aug 22 10:12:55 2021 GMT
/home/tux/certificates/site2.example.com/fullchain.pem
notAfter=Nov 22 03:22:44 2021 GMT
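
Building on the above, the same expiration check can be automated for every certificate that find locates. The sketch below warns about certificates that expire within the next 30 days; the path and the threshold are just the example values used in this post:

find ~/certificates/ -name "fullchain.pem" | while read -r certificate; do
  # -checkend fails when the certificate expires within the given number of seconds.
  if ! openssl x509 -checkend 2592000 -noout -in "$certificate" > /dev/null; then
    echo "WARNING: $certificate expires within the next 30 days:";
    openssl x509 -enddate -noout -in "$certificate";
  fi
done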

Using scp to copy a folder on a custom port

while true;
do
date;
scp -rp -P 2222 $SOURCE_DIRECTORY $REMOTE_USER@$REMOTE_SERVER:$DESTINATION_DIRECTORY;
sleep 60;
done;

The above code was used to copy the contents of a local folder to a remote one every minute. We did not want to lose the metadata of the files (including their modification dates), so we used the -p parameter to preserve that information.

The -P 2222 parameter instructs scp to use a different port rather than the default (22).

The -r parameter instructs scp to copy recursively, including all the contents of the folder and its sub-folders.

The above code as a one-liner is:

while true; do date; scp -rp -P 2222 $SOURCE_DIRECTORY $REMOTE_USER@$REMOTE_SERVER:$DESTINATION_DIRECTORY; sleep 60; done;
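
For completeness, the variables used above would be defined before the loop starts; the values below are placeholders for this example and should be replaced with your own paths, user, and host:

SOURCE_DIRECTORY="/home/tux/data";
REMOTE_USER="tux";
REMOTE_SERVER="192.168.1.10";
DESTINATION_DIRECTORY="/home/tux/backup";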


Download Large Jupyter Workspace files

Recently, we were working on a Jupyter Workspace at anyscale-training.com/jupyter/lab. As there was no option to download all the files of the workspace, nor a way to create an archive from the GUI, we followed the procedure below (which we also use on Coursera.org, where it works like a charm):

First, we clicked on the blue button with the + sign in it.
That opened the Launcher tab.
From there, we clicked on the Terminal button under the Other category.

In the terminal, we executed the following command to create a compressed archive of all the files we needed to download:

tar -czf Ray-RLLib-Tutorials.tar.gz ray_tutorial/ Ray-Tutorial/ rllib_tutorials/;
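
Before moving on, the contents of the archive can be reviewed without extracting it, for example to confirm that nothing was missed:

# List the files stored in the archive.
tar -tzf Ray-RLLib-Tutorials.tar.gz;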

After the command completed its execution, we could see our archive in the list of files on the left. By right-clicking it, we were able to initiate its download. Unfortunately, after the first 20MB the download would always crash! To fix this issue, we split the archive into multiple archives of 10MB each, downloaded them individually, and finally merged them back together on our PC. The command to split the compressed archive into multiple smaller archives of fixed size was the following:

tar -czf - ray_tutorial/ Ray-Tutorial/ rllib_tutorials/ | split --bytes=10MB - Ray-RLLib-Tutorials.tar.gz.;
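
Listing the directory afterwards shows the individual pieces, each at most 10MB in size (the exact number of pieces depends on the size of the workspace):

ls -lh Ray-RLLib-Tutorials.tar.gz.*;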

After downloading those files one by one, by right-clicking on them and selecting the Download option, we recreated and extracted the original archive on our PC using the following command:

cat Ray-RLLib-Tutorials.tar.gz.* | tar xzvf -;
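
Since a single missing or corrupted piece would break the archive, it can be worth comparing a checksum of the reassembled stream on both machines before cleaning up; a minimal sketch, assuming sha256sum is available on both ends:

# Run on the remote server and on the local PC; the two hashes must match.
cat Ray-RLLib-Tutorials.tar.gz.* | sha256sum;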

To clean up both the remote server and our local PC, we issued the following command on each machine:

rm Ray-RLLib-Tutorials.tar.gz.*;

This is a guide on how to download a very big Jupyter workspace by splitting it into multiple smaller files using the console.