I want to detect people without relying on face detection. In cases where lighting conditions are poor, or Pepper is facing away, people are not detected. The memory events 'PeoplePerception/JustArrived' and 'EngagementZones/PersonApproached' seem to rely on faces being detectable by the camera. Is there a memory event which is triggered by changes in laser/infrared/sonar distance?
I wonder if there is a better solution than:
from time import sleep

while True:
    # Front sonar distance in metres
    floatDist = self.memory.getData('Device/SubDeviceList/Platform/Front/Sonar/Sensor/Value')
    if floatDist < 1.0:
        doSomething()
    sleep(0.5)  # poll every 0.5 s
You can use the front sonar together with the "FaceDetected" event for human detection.
But instead of a while loop, use a PeriodicTask: it will check the values every 0.5 seconds and you can stop it whenever you want.
I would do it like this:
Here is an example of a Python class that uses the ALMemory module to check the front sonar and whether a face has been detected.
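A minimal sketch, assuming the NAOqi Python SDK (the naoqi and qi modules), a robot reachable at "ROBOT_IP" on port 9559, and qi.PeriodicTask for the periodic check; the class name HumanDetector, the constant SONAR_KEY and the on_human handler are illustrative names, not part of the NAOqi API:

import qi
from naoqi import ALProxy

SONAR_KEY = "Device/SubDeviceList/Platform/Front/Sonar/Sensor/Value"

class HumanDetector(object):
    def __init__(self, ip, port=9559, threshold=1.0):
        self.memory = ALProxy("ALMemory", ip, port)
        self.threshold = threshold
        # Periodic task instead of a blocking while loop
        self.task = qi.PeriodicTask()
        self.task.setCallback(self.check)
        self.task.setUsPeriod(500000)  # period in microseconds = 0.5 s

    def start(self):
        self.task.start(True)  # True = run the callback right away

    def stop(self):
        self.task.stop()

    def check(self):
        # Front sonar distance in metres
        dist = self.memory.getData(SONAR_KEY)
        # "FaceDetected" is empty when no face is currently seen
        try:
            face = self.memory.getData("FaceDetected")
        except RuntimeError:
            face = None
        if dist < self.threshold or face:
            self.on_human(dist, bool(face))

    def on_human(self, dist, face_seen):
        # Hypothetical handler: replace with your own behaviour
        print("Human detected at %.2f m (face seen: %s)" % (dist, face_seen))


if __name__ == "__main__":
    detector = HumanDetector("ROBOT_IP")  # replace with your robot's IP
    detector.start()
    raw_input("Running, press Enter to stop\n")  # Python 2, as used by the NAOqi SDK
    detector.stop()

The periodic task runs check() on the qi thread pool every 0.5 seconds, so you can stop it at any time with stop() instead of breaking out of a blocking loop. Note that the "FaceDetected" key is typically only refreshed while face detection is actually running (for example subscribed to by ALFaceDetection or Basic Awareness).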