inotifywait -m does not process more than one file after a long-running process


I have a script that watches a directory for close_write events and runs a roughly five-minute process on each new file. The files are written to the directory in batches of up to 100. The issue is that inotifywait only detects the first file in the batch and does not process the subsequent files unless they are removed from the directory by hand and put back. Here is my script:

#!/bin/bash

# Watch $TARGET recursively and hand each finished .mp4 to the processing script
inotifywait -m -r -e close_write -e moved_to --format "%f" "$TARGET" | while read -r file
    do
        if [[ "$file" =~ \.mp4$ ]]; then
            echo "Detected $file"
            /usr/bin/python3 LongRunningProgram.py -i "$TARGET/$file" -o "$PROCESSED" -u "$UPLOADPATH" -c "$C"
        fi
    done

It is run by a systemd service, defined like so:

[Unit]
Description=Description
After=network.target

[Service]
Type=idle
User=pi
WorkingDirectory=/home/pi
ExecStart=/bin/bash /home/pi/notify.sh OutPath C
Restart=on-failure

[Install]
WantedBy=multi-user.target

I am confused as to why it only recognizes the first file, and not the subsequent files, when run like this; however, if I replace the long-running program with sleep 300, everything works fine.

1 Answer

Answered by Yllier123:

My fault for neglecting to explain what the "long running process" was doing. The Python script invoked by inotifywait spins up an ffmpeg process to work on the MP4 files that were written to the directory. The problem is that ffmpeg, when run inside a while-read loop, reads from standard input and swallows the rest of the filenames queued up in the pipe, so the loop never sees the later events. I followed this answer's solution of adding the -nostdin flag, and that appears to have solved the issue. Hopefully this answer helps someone else with this problem :)
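For anyone hitting the same symptom, here is a minimal sketch of the two usual fixes. The ffmpeg command line is illustrative only (the question never shows the options LongRunningProgram.py actually passes), and the second line assumes the redirect is applied to the call inside notify.sh.

# Inside the long-running program: -nostdin stops ffmpeg from reading
# standard input, so it can no longer swallow the filenames that
# inotifywait is piping into the while-read loop.
ffmpeg -nostdin -i "input.mp4" -c copy "output.mp4"

# Equivalent fix from the shell side, inside notify.sh: point the
# long-running command's stdin at /dev/null so nothing it spawns can
# drain the pipe.
/usr/bin/python3 LongRunningProgram.py -i "$TARGET/$file" -o "$PROCESSED" -u "$UPLOADPATH" -c "$C" < /dev/null

Either approach keeps the inotifywait stream intact, so the loop receives every batch member rather than only the first.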