a-ichikura opened 5 months ago
I also have Pepper 2.5 and I found that the same code is fine with Pepper 2.5.
Why does the problem occur only with Pepper 2.9?
Disclaimer: I'm not affiliated with this repository, just a random dude, and the following is only from memory; you'd need to verify it yourself.
With the NAOqi update to version 2.9, some things have changed in the architecture. Accordingly, version 2.9 no longer includes some services that were still available in version 2.5. There are now also new services, some of which partially replicate old functionality.
In your case, you are running into the error because the ALTracker service simply no longer exists in version 2.9. I believe the service that has replaced ALTracker is Human or HumanAwareness.
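One way to make this failure mode explicit in your own code is to check the registered service names before requesting a service. Here is a minimal sketch; the service sets are hardcoded for illustration (on a live robot you would build them from `app.session.services()`), and using HumanAwareness as the fallback is only my guess from above:

```python
def find_service(available, wanted, fallbacks=()):
    """Return `wanted` if it is registered, else the first available fallback."""
    if wanted in available:
        return wanted
    for name in fallbacks:
        if name in available:
            return name
    raise RuntimeError(f"none of {[wanted, *fallbacks]} is available")

# Hardcoded subsets for illustration; on a live robot, build these from
# the names returned by app.session.services().
services_25 = {"ALTracker", "ALMemory", "ALTextToSpeech"}
services_29 = {"HumanAwareness", "HumanPerception", "BasicAwareness", "ALMemory"}

print(find_service(services_25, "ALTracker", fallbacks=("HumanAwareness",)))
print(find_service(services_29, "ALTracker", fallbacks=("HumanAwareness",)))
```

On the 2.5-style set this picks ALTracker; on the 2.9-style set, where ALTracker is gone, it falls back to HumanAwareness instead of raising the "service not found" error you would otherwise hit.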
You can view the services and their methods on your robot (even those that are no longer really functional) via qicli and, in some cases, even print a few docstrings using the --show-doc option.
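For example, from a shell on (or connected to) the robot, something like the following should work. These commands are from memory, so double-check the exact flags with `qicli --help`:

```shell
# List all services currently registered with the service directory
qicli service

# Show the methods, signals and properties of one service,
# printing docstrings where available
qicli info HumanAwareness --show-doc

# Watch a signal or property to see live updates
qicli watch HumanAwareness.humansAround
```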
For Pepper robots with NAOqi version 2.9, the intended/recommended approach is actually to use the QiSDK as part of an Android app. You can still use the old services (at least those that still work) from libqi-python with the identical code as for NAOqi 2.5 (e.g. ALTextToSpeech or ALLeds). The new services, however, cannot be used this way, as they now also expect a context object. There is (if I remember correctly) a RobotContextFactory (?) service that may be able to provide one. However, there is no official support for this, and you should be aware that you are travelling absolutely blind here.
Be that as it may, I hope that these explanations will help you a little to understand your problem. If I have misunderstood/interpreted something here, I hope that others can clear up any mistakes.
@DrR955 you are completely correct with your explanation and answer, thank you for the help 😊 I'm not exactly sure myself about Context objects. The design of these APIs is outside of my expertise. I believe robot contexts are built from focus handles. There is indeed the RobotContextFactory service, which is related to the Focus service.
The goal of this design is to ensure exclusive access to some API by the Android activity that uses them, to avoid concurrency issues. The Focus service makes sure that only one client is allowed to have a handle at a time.
@DrR955 @nyibbang
Thank you so much! Now I understand the difference between Pepper 2.5 and 2.9.
As @DrR955 said, I found the Human Awareness API. This is what I was looking for, so I managed to continue with my project. Thanks again.
The following is the output of what we can use on Pepper 2.9 with the libqi-python API, plus the methods of the services related to tracking.
>>> list(map(lambda x: x['name'], app.session.services()))
['ServiceDirectory', 'LogManager', 'PackageManager', 'ALServiceManager', 'ALMemory', 'Mapping', 'ContextFactory', 'AccessControl', '_ALNotificationAdder', 'ALNotificationManager', '_ALExpressiveness', 'MainConversation', '_ALNotificationReader', 'ALSystem', 'AutonomousAbilities', 'ALConnectionManager', 'ALPreferences', 'ALKnowledge', 'ALUserInfo', 'HumanAwareness', 'ALAudioDevice', 'ALAudioPlayer', 'Knowledge', 'LoLA', 'ALRobotModel', 'ALPreferenceManager', '_ConditionChecker_pepper_3234_0', '_NaoqiInformationForSemantic', 'ALExpressionWatcher', 'ALTactileGesture', '_ALCloud', '_SemanticEngine', 'Semantics', 'ALStore', 'ALDiagnosis', 'ALBodyTemperature', 'ActuationPrivate', 'ALRobotPosture', 'ALMotion', 'Touch', 'Actuation', 'ALTextToSpeech', 'ALTouch', 'ALVoiceEmotionAnalysis', 'ALFrameManager', 'ALBattery', 'ALLeds', 'ALResourceManager', 'ALAutonomousBlinking', 'ALLauncher', '_AsrCompiler2', 'ALBehaviorManager', 'ALPythonBridge', '_AsrCompiler1', 'ALVideoDevice', 'ALRobotHealthMonitor', 'ALSpeechRecognition', 'Conversation', 'LanguageManager', 'ALModularity', '_ALBrightnessStatistics', 'Camera', 'ALNavigation', 'HumanPerception', 'ALMood', 'BasicAwareness', 'LifeStarter', '_ConditionChecker_pepper_3233_0', 'Focus', 'ALAutonomousLife', 'ALPodDetection', 'ALAnimatedSpeech', '_ALMovementScheduler', 'ALSpeakingMovement', 'ALDialog', 'ALRecharge', 'ALListeningMovement', 'ALBackgroundMovement', 'ALAnimationPlayer', 'ALSignsAndFeedback', 'ALRobotMood', 'ALTabletService']
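For reference, the tracking-related entries can be picked out of that list with a simple filter. A minimal sketch; the list below is a hardcoded subset of the output above, not a live query:

```python
# Subset of the services reported by a Pepper 2.9 robot (hardcoded from the
# listing above; on a live robot you would use app.session.services()).
service_names = [
    "ALMemory", "HumanAwareness", "HumanPerception", "BasicAwareness",
    "ALTextToSpeech", "ALMotion", "Focus", "ContextFactory", "ALTabletService",
]

def tracking_related(names):
    """Return the services whose names hint at people tracking/awareness."""
    keywords = ("Human", "Awareness", "Tracker")
    return [n for n in names if any(k in n for k in keywords)]

print(tracking_related(service_names))
# -> ['HumanAwareness', 'HumanPerception', 'BasicAwareness']
# Note that ALTracker does not appear at all on 2.9.
```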
Human Awareness:
['__bool__', '__class__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattribute__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', 'async', 'call', 'engagedHuman', 'humansAround', 'isValid', 'makeEngageHuman', 'metaObject', 'recommendedHumanToEngage']
Human Perception:
['__bool__', '__class__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattribute__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', 'async', 'blobs', 'bodies', 'call', 'dumpBlackBox', 'faces', 'humansAroundPrivate', 'isValid', 'metaObject', 'setDebugMode']
Basic Awareness:
['__bool__', '__class__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattribute__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '_getEmptyFrame', 'async', 'call', 'isEnabled', 'isRunning', 'isValid', 'metaObject', 'setEnabled', 'setStrategies', 'state', 'trackedHuman', 'zoneOfInterest']
Unfortunately, I could not track people with HumanAwareness or HumanPerception in Python.
If my understanding is correct, there is a getHumanAround function in Java, but I could not find such a function in Python.
I know we have a way to track people with BasicAwareness by calling setEnabled(True), but my purpose is to move only the head while another arm motion is running.
I believe there is a humansAround property on HumanAwareness that you can get, or connect to, to track the humans detected by the robot.
Have you tried using the humansAround property? Looking at the EngageHuman docs, recommendedHuman might also be of use. I'm not sure how that would work with callbacks, though (if at all). Remember, this is all a (somewhat educated) guess on my part, but if you need updates or a callback (for example, when the detected human changes), then you should probably check (e.g. via qicli) whether the HumanAwareness service has signals you can subscribe to.
Regarding your problem with BasicAwareness, you could maybe make use of HolderBuilder.withDegreesOfFreedom, which would probably be part of the AutonomousAbilities service.
On a side note, if you have already managed to successfully use functions of 2.9 services (that require the context object), would you please share the code? Maybe it will help some poor soul in the future (myself included).
You don't have to go through an ALMemory subscriber in this case. You can just connect to the property directly:
humanAwareness = app.session.service("HumanAwareness")
humansAround = humanAwareness.humansAround
# Get the current list of detected humans
humans = humansAround.value()
print(f"current humans: {humans}")
# Or connect to changes on human detection. The lambda is called every time the list of humans around changes.
humansAround.connect(lambda humans: print(f"new humans detected: {humans}"))
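One caveat with the connect approach: the callback only runs while your process is alive and the qi event loop is running, so a script that exits right after connecting will never print anything. A minimal sketch of keeping it alive, assuming the usual qi.Application setup (untested on a real robot):

```python
import sys
import qi

# Connect by passing e.g. --qi-url=tcp://<robot-ip>:9559 on the command line.
app = qi.Application(sys.argv)
app.start()

human_awareness = app.session.service("HumanAwareness")
human_awareness.humansAround.connect(
    lambda humans: print(f"new humans detected: {humans}")
)

app.run()  # block here so the subscription stays alive and callbacks keep firing
```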
Here is how you can build a Context object for the API you want to use:
ctxFactory = app.session.service("ContextFactory")
focus = app.session.service("Focus")
ctx = ctxFactory.makeContext()
focusOwner = focus.take()
ctx.focus.setValue(focusOwner)
ctx.identity.setValue("my-app") # Unsure about this, let me know if it works for you
# You can now use the ctx object for the API that you want
# "human" here is one of the Human objects obtained from humansAround
engageHuman = humanAwareness.makeEngageHuman(ctx, human)
engageHuman.run()
@nyibbang @DrR955
Thank you all. I had found humansAround before, but I could not understand how to use it in Python.
Now, I have succeeded in getting a human object with this code:
humanAwareness = app.session.service("HumanAwareness")
humansAround = humanAwareness.humansAround
# Get the current list of detected humans
humans = humansAround.value()
print(f"current humans: {humans}")
# Or connect to changes on human detection. The lambda is called every time the list of humans around changes.
humansAround.connect(lambda humans: print(f"new humans detected: {humans}"))
I'll try BasicAwareness later, adjusting the DegreesOfFreedom, and I'll share a link to my code here if I can make it work.
I want to make my Pepper track people, but I got an error.
I tried to subscribe to "ALTracker" following your example, but I got the following error.
I confirmed that the same error occurs with "ALSonar", "ALExtractor", and so on.
My qi version is 3.1.5 and I use Python 3.8. How can I solve this error?