Skyfield.api loader behaves differently in docker container


I wish to specify a download directory to Skyfield, as documented here:

http://rhodesmill.org/skyfield/files.html

Here is my script:

from skyfield.api import Loader
load = Loader('~/data/skyfield')
# Next line downloads deltat.data, deltat.preds, Leap_Second.dat in ~/data/skyfield
ts = load.timescale()
t = ts.utc(2017,9,13,0,0,0)
stations_url = 'http://celestrak.com/NORAD/elements/stations.txt'
# Next line downloads stations.txt in ~/data/skyfield AND deltat.data, deltat.preds, Leap_Second.dat in $PWD !!!
satellites = load.tle(stations_url)
satellite = satellites['ISS (ZARYA)']

Expected behaviour (works fine outside docker)

The three delta-T files (deltat.data, deltat.preds and Leap_Second.dat) are downloaded into ~/data/skyfield by load.timescale(), and stations.txt is downloaded to the same place by load.tle(stations_url).

Behaviour when run in a container

The three delta-T files get downloaded twice:

  • once into the specified folder, during the call to load.timescale()
  • a second time into the current directory, during the call to load.tle(stations_url)

This is frustrating because the files already exist at that point, and they pollute the current directory. Note that stations.txt ends up in the right place (~/data/skyfield).
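One way to check whether the files really exist in both places, or whether the duplicates are only a printing artifact, is a small diagnostic like the following; `locate` is a hypothetical helper written for this question, not part of Skyfield:

```python
import os

def locate(filename, directories):
    """Return the directories (after ~ expansion) that contain filename."""
    return [d for d in directories
            if os.path.exists(os.path.join(os.path.expanduser(d), filename))]

# Check each delta-T file in the Loader directory and the current directory.
for name in ('deltat.data', 'deltat.preds', 'Leap_Second.dat'):
    print(name, '->', locate(name, ['~/data/skyfield', '.']))
```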

If the container is run interactively, then calling exec(open("script.py").read()) in a Python shell gives the normal behaviour again. Can anyone reproduce this issue? It is hard to tell whether it comes from Python, Docker or Skyfield.

The Dockerfile is just these two lines:

FROM continuumio/anaconda3:latest
RUN conda install -c astropy astroquery && conda install -c anaconda ephem=3.7.6.0 && pip install skyfield

Then (assuming the built image is tagged astro) I run it with:

docker run --rm -w /tmp/working -v $PWD:/tmp/working astro:latest python script.py

And here is the output (provided the folders are empty before the run):

[#################################] 100% deltat.data
[#################################] 100% deltat.preds
[#################################] 100% Leap_Second.dat
[#################################] 100% stations.txt
[#################################] 100% deltat.data
[#################################] 100% deltat.preds
[#################################] 100% Leap_Second.dat

EDIT

Adding -t to docker run did not solve the issue, but it made the problem even easier to illustrate. I think it may come from Skyfield, because some recent issues on GitHub look quite similar, although not exactly the same.

1 Answer

Andy Shinn

The simple solution here is to add -t to your docker run command to allocate a pseudo-TTY:

docker run --rm -t -w /tmp/working -v $PWD:/tmp/working astro:latest python script.py

What you are seeing is caused by the way the progress lines are printed, combined with the buffering of non-TTY stdout. The percentage up to 100% is likely printed on a single line using carriage returns, without newlines; then, once it reaches 100%, the line is printed again with a newline. With block buffering, this makes the same line appear twice.

When you run the same command with a TTY, there is no block buffering; the lines are printed in real time, so the carriage returns and newlines work as desired.
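The progress-bar pattern described above can be sketched like this; `show_progress` is a hypothetical stand-in for Skyfield's download indicator, not its actual code:

```python
import io
import sys

def show_progress(out, name, steps=(25, 50, 75, 100)):
    """Mimic a carriage-return progress bar, then finish with a newline."""
    for pct in steps:
        out.write('\r[%3d%%] %s' % (pct, name))
    out.write('\n')

# On a TTY, each "\r" returns to the start of the line, so the bar
# redraws in place. In a block-buffered, non-TTY stream every rewrite
# is kept verbatim, so the same filename can appear repeated in logs.
print('stdout is a TTY:', sys.stdout.isatty())

buf = io.StringIO()
show_progress(buf, 'deltat.data')
print(repr(buf.getvalue()))  # all four "\r" rewrites are preserved
```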

The code path isn't actually running twice :)

See Docker run with pseudoTTY (-t) gives instant stdout, buffering happens without it for another (possibly better) explanation.
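If adding -t to docker run is not an option, a hedged alternative (my suggestion, assuming the duplicates really are a buffering artifact) is to force line buffering from inside the script; `reconfigure` has been available on text streams since Python 3.7:

```python
import sys

# Force line buffering when stdout is not a TTY (e.g. `docker run`
# without -t), so each progress line is flushed as soon as it is
# written instead of being held in a block buffer.
if hasattr(sys.stdout, 'reconfigure') and not sys.stdout.isatty():
    sys.stdout.reconfigure(line_buffering=True)

print('stdout configured')
```

Running the script with `python -u script.py`, which disables buffering entirely, should have a similar effect.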