I want to use multiprocessing to separate two tasks in pygame: pulling an image from a webcam and displaying it on a screen. The reasons I want to do this are:
- I want to have fancy image processing
- I want to separate the webcam polling from the screen and user input
and I want to decouple the time delay of both, ideally optimizing the load. Using threading I have no issues sending images from one thread to the other; however, the image is laggy. I would like to test whether using multiprocessing can reduce the delay of the image.
Here's the snag: I get an error after sending the image from the camera process to the screen process. After pulling the image from the queue with
imgmsg = img_q.get()
I check the image's size with
imgmsg.img.get_width()
As said, with threads this returns the correct image width. However, with multiprocessing I get the following error:
Process Process-1:
Traceback (most recent call last):
  File "/usr/lib/python3.2/multiprocessing/process.py", line 267, in _bootstrap
    self.run()
  File "/usr/lib/python3.2/multiprocessing/process.py", line 116, in run
    self._target(*self._args, **self._kwargs)
  File "test_photoBoothMultiProc.py", line 21, in consumer
    photoBoothScreen.screenThread(in_q, img_q)
  File "/home/pi/pyBooth/thread_photoBoothScreen.py", line 68, in screenThread
    print(imgmsg.img.get_width())
pygame.error: display Surface quit
So it seems the image is lost in the queue? I have tried to read into this, and there seem to be problems when transmitting larger objects through a queue. Is this correct? How would I circumvent or fix this?
Serialize the images first (with
pygame.image.tostring
/
pygame.image.fromstring
) before sending them to another process. That should work. This way you send only the raw pixel data of the image, not a
Surface
instance, so the payload is completely independent of pygame. Note that you could also compress the data further with
zlib.compress
/
zlib.decompress
(the old str.encode("zlib") / str.decode("zlib") shortcut only exists in Python 2).
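A minimal sketch of the round trip (the helper names serialize_surface / deserialize_surface are made up for illustration; adapt them to your imgmsg wrapper):

```python
import zlib
import pygame

def serialize_surface(surface):
    # Convert the Surface to raw RGB bytes plus its size.  Bytes pickle
    # cleanly through a multiprocessing.Queue, unlike a Surface instance.
    data = pygame.image.tostring(surface, "RGB")
    return zlib.compress(data), surface.get_size()

def deserialize_surface(payload):
    # Rebuild a fresh Surface in the receiving process from the raw bytes.
    data, size = payload
    return pygame.image.fromstring(zlib.decompress(data), size, "RGB")
```

In the camera process you would put serialize_surface(img) on the queue instead of the Surface itself, and in the screen process call deserialize_surface(img_q.get()) before blitting.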