I am using a Raspberry Pi as a (very simple) home weather station.
I currently have a small Python script that, when run, takes a temperature reading and posts the result as a new row to a Google Spreadsheet via the Google Docs API. This script needs to run as root, and I currently get good results by scheduling it to run every hour from the superuser's crontab.
I have also just gotten this Pi to run Nginx and uWSGI to serve up Django-powered websites, and I'd like to start logging the temperatures to a local database in addition to the Google Spreadsheet. I would like to do this via Django's ORM, since it's what I am most familiar with.
I have two specific goals:
- Set up a regularly scheduled task that takes a new temperature reading and posts it both to the local DB (via the Django ORM) and to the Google Spreadsheet.
- Serve a web page (presumably requiring a login) that will allow remote users to "take a temperature reading now" and both report the result back (via a webpage) and insert the result into the server-side DB.
My question is: What is the best way to do this, knowing that any code that will access the temperature probe must be run as root?
For #1, I guess the question boils down to: what's the best way to load up a Python environment that "feels like the Django shell", in that it can easily do all the same imports (specifically things like the ORM)? If I could do that, I could write a single script that does both the Spreadsheet upload and the DB write. Can I/should I do this via a Django management command that would then be scheduled in the superuser's crontab? Or should I do this via a regular old Python script that loads the right Django-specific modules itself?
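Either route works, but a custom management command is the idiomatic way to get a "Django shell"-like environment on a schedule: `manage.py` bootstraps settings and the ORM for you. A minimal sketch, assuming a Django app called `weather` with a `Reading` model and a `read_sensor()` probe helper (all of these names are hypothetical):

```python
# weather/management/commands/log_temperature.py  (hypothetical app/paths)
from django.core.management.base import BaseCommand

from weather.models import Reading      # hypothetical ORM model
from weather.sensor import read_sensor  # hypothetical probe wrapper


class Command(BaseCommand):
    help = "Take a temperature reading, store it locally, and upload it"

    def handle(self, *args, **options):
        temp = read_sensor()  # needs root to talk to the probe
        Reading.objects.create(celsius=temp)
        # ... post the same value to the Google Spreadsheet here ...
        self.stdout.write("Logged %.1f C" % temp)
```

It can then be run from the superuser's crontab with something like `0 * * * * python /path/to/manage.py log_temperature`, which keeps the root requirement confined to cron rather than the web process.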
For #2, I was looking for suggestions about how to handle the "must run as root" issue for the temperature sensor readings. I clearly don't want to run the whole Django process as root. So what are my best options here?
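One common pattern for #2 is privilege separation: keep the probe code in a tiny standalone script owned by root, grant the web server's user permission to run only that script via sudo, and have the Django view shell out to it. The script path `/usr/local/bin/read_temp` and the `www-data` user below are assumptions; the sudoers entry would look something like `www-data ALL=(root) NOPASSWD: /usr/local/bin/read_temp`. A sketch of the Django-side call:

```python
import subprocess


def take_reading(cmd=("sudo", "/usr/local/bin/read_temp")):
    """Invoke the root-owned reader script (hypothetical path) and
    return the temperature in degrees C that it prints to stdout."""
    out = subprocess.check_output(list(cmd), text=True)
    return float(out.strip())
```

This way the Django process never runs as root; only the one-line probe script does, and sudo limits the web user to exactly that command.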
Thanks.
django-celery is definitely worth looking into for running scheduled tasks.
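For reference, a rough sketch of what an hourly periodic task looks like in modern Celery (the app name, broker URL, and task body are illustrative only; note that the worker running this task would still need access to the probe, so the root question applies to it as well):

```python
# tasks.py -- illustrative names; requires a Celery worker plus celery beat
from celery import Celery
from celery.schedules import crontab

app = Celery("weather", broker="redis://localhost:6379/0")  # broker is an assumption


@app.task
def log_temperature():
    # take a reading here and write it to the DB / Google Spreadsheet
    pass


app.conf.beat_schedule = {
    "hourly-temperature": {
        "task": "tasks.log_temperature",
        "schedule": crontab(minute=0),  # at the top of every hour
    },
}
```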