aldebaran / libqi-python

qiSDK python bindings

How to use `ALTracker` of Pepper 2.9 with libqi-python? #26

Open a-ichikura opened 5 months ago

a-ichikura commented 5 months ago

I want to make my Pepper track people, but I got an error.

I tried to access the "ALTracker" service based on your example, as follows:

import argparse
import qi
import sys

class Authenticator:

    def __init__(self, username, password):
        self.username = username
        self.password = password

    # This method is expected by libqi and must return a dictionary containing
    # login information with the keys 'user' and 'token'.
    def initialAuthData(self):
        return {'user': self.username, 'token': self.password}

class AuthenticatorFactory:

    def __init__(self, username, password):
        self.username = username
        self.password = password

    # This method is expected by libqi and must return an object with at least
    # the `initialAuthData` method.
    def newAuthenticator(self):
        return Authenticator(self.username, self.password)

# Reads a file containing the username on the first line and the password on
# the second line. This is the format used by qilaunch.
def read_auth_file(path):
    with open(path) as f:
        username = f.readline().strip()
        password = f.readline().strip()
        return (username, password)

def make_application(argv=sys.argv):
    """
    Create and return the qi.Application, with authentication set up
    according to the command line options.
    """
    # create the app and edit `argv` in place to remove the consumed
    # arguments.
    # As a side effect, if "-h" is in the list, it is replaced with "--help".
    app = qi.Application(argv)

    # Setup a non-intrusive parser, behaving like `qi.Application`'s own
    # parser:
    # * don't complain about unknown arguments
    # * consume known arguments
    # * if the "--help" option is present:
    #   * print its own options help
    #   * do not print the main app usage
    #   * do not call `sys.exit()`
    parser = argparse.ArgumentParser(add_help=False, usage=argparse.SUPPRESS)
    parser.add_argument(
        "-a", "--authfile",
        help="Path to the authentication config file. This file must "
        "contain the username on the first line and the password on the "
        "second line.")
    if "--help" in argv:
        parser.print_help()
        return app
    args, unparsed_args = parser.parse_known_args(argv[1:])
    logins = read_auth_file(args.authfile) if args.authfile else ("nao", "nao")
    factory = AuthenticatorFactory(*logins)
    app.session.setClientAuthenticatorFactory(factory)
    # edit argv in place.
    # Note: this might modify sys.argv, like qi.Application does.
    argv[1:] = unparsed_args
    return app

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("--msg", default="Hello python")
    app = make_application()
    args = parser.parse_args()
    logger = qi.Logger("authentication_with_application")
    logger.info("connecting session")
    app.start()
    logger.info("fetching ALTextToSpeech service")
    tts = app.session.service("ALTextToSpeech")
    test = app.session.service("ALTracker")
    logger.info("Saying something")
    tts.call("say", args.msg)

However, I got the following error:

[I] 1720225369.133268 10238 authentication_with_application: connecting session
[I] 1720225369.362969 10238 authentication_with_application: fetching ALTextToSpeech service
Traceback (most recent call last):
  File "connection.py", line 86, in <module>
    test = app.session.service("ALTracker")
RuntimeError: Cannot find service 'ALTracker' in index
[W] 1720225369.433052 10253 qitype.signal: disconnect: No subscription found for SignalLink 18446744073709551615.
[W] 1720225369.433066 10247 qitype.signal: disconnect: No subscription found for SignalLink 13.
[W] 1720225369.433136 10253 qitype.signal: disconnect: No subscription found for SignalLink 18446744073709551615.

I confirmed that the same error occurs with "ALSonar," "ALExtractor" and so on.

My qi version is 3.1.5 and I use Python 3.8. How can I solve this error?

a-ichikura commented 5 months ago

I also have Pepper 2.5 and I found that the same code is fine with Pepper 2.5.

Why does the problem occur only with Pepper 2.9?

DrR955 commented 5 months ago

Disclaimer: I'm not affiliated with this repository, just a random dude, and the following is only from memory; you'd need to verify it yourself.

With the update to NAOqi 2.9, some things changed in the architecture. As a result, version 2.9 no longer includes some services that were still available in version 2.5. There are also new services, some of which partially replicate the old functionality.

In your case, you are running into this error because the ALTracker service simply no longer exists in version 2.9. I believe the service that replaced ALTracker is Human or HumanAwareness.

You can view the services and their methods on your robot (even those that are no longer really functional) via qicli, and in some cases even print a few docstrings using the --show-doc option.
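
From Python, a rough equivalent of that qicli overview is possible with libqi-python itself. This is an untested sketch that assumes an authenticated `app.session`, as in your snippet above:

# Hedged sketch: list every registered service by name, then inspect one
# proxy's methods/properties with dir() (not a substitute for real API docs).
for info in app.session.services():
    print(info["name"])

humanAwareness = app.session.service("HumanAwareness")
print([m for m in dir(humanAwareness) if not m.startswith("_")])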

For Pepper robots with NAOqi 2.9, the intended/recommended approach is actually to use the QiSDK as part of an Android app. You can still use the old services (at least those that still work) from libqi-python with the same code as for NAOqi 2.5 (e.g. ALTextToSpeech or ALLeds). The new services, however, cannot be used this way, as they now also expect a context object. There is (if I remember correctly) a RobotContextFactory (?) service that may be able to provide this. However, there is no official support for this, and you should be aware that you are travelling absolutely blind here.

Be that as it may, I hope these explanations help you understand your problem a little better. If I have misunderstood or misinterpreted something here, I hope others can clear up any mistakes.

nyibbang commented 5 months ago

@DrR955 you are completely correct with your explanation and answer, thank you for the help 😊 I'm not exactly sure myself about Context objects; the design of these APIs is outside my area of expertise. I believe robot contexts are built from focus handles. There is indeed a RobotContextFactory service, which is related to the Focus service.

The goal of this design is to ensure exclusive access to some APIs by the Android activity that uses them, to avoid concurrency issues. The Focus service makes sure that only one client at a time is allowed to hold a focus handle.

a-ichikura commented 4 months ago

@DrR955 @nyibbang

Thank you so much! I now understand the difference between Pepper 2.5 and 2.9.

As @DrR955 said, I found the HumanAwareness API. This is what I was looking for, so I can continue with my project. Thanks again.

Below is the list of services available on Pepper 2.9 through the libqi-python API, plus the methods of the services related to tracking.

>>> list(map(lambda x: x['name'], app.session.services()))
['ServiceDirectory', 'LogManager', 'PackageManager', 'ALServiceManager', 'ALMemory', 'Mapping', 'ContextFactory', 'AccessControl', '_ALNotificationAdder', 'ALNotificationManager', '_ALExpressiveness', 'MainConversation', '_ALNotificationReader', 'ALSystem', 'AutonomousAbilities', 'ALConnectionManager', 'ALPreferences', 'ALKnowledge', 'ALUserInfo', 'HumanAwareness', 'ALAudioDevice', 'ALAudioPlayer', 'Knowledge', 'LoLA', 'ALRobotModel', 'ALPreferenceManager', '_ConditionChecker_pepper_3234_0', '_NaoqiInformationForSemantic', 'ALExpressionWatcher', 'ALTactileGesture', '_ALCloud', '_SemanticEngine', 'Semantics', 'ALStore', 'ALDiagnosis', 'ALBodyTemperature', 'ActuationPrivate', 'ALRobotPosture', 'ALMotion', 'Touch', 'Actuation', 'ALTextToSpeech', 'ALTouch', 'ALVoiceEmotionAnalysis', 'ALFrameManager', 'ALBattery', 'ALLeds', 'ALResourceManager', 'ALAutonomousBlinking', 'ALLauncher', '_AsrCompiler2', 'ALBehaviorManager', 'ALPythonBridge', '_AsrCompiler1', 'ALVideoDevice', 'ALRobotHealthMonitor', 'ALSpeechRecognition', 'Conversation', 'LanguageManager', 'ALModularity', '_ALBrightnessStatistics', 'Camera', 'ALNavigation', 'HumanPerception', 'ALMood', 'BasicAwareness', 'LifeStarter', '_ConditionChecker_pepper_3233_0', 'Focus', 'ALAutonomousLife', 'ALPodDetection', 'ALAnimatedSpeech', '_ALMovementScheduler', 'ALSpeakingMovement', 'ALDialog', 'ALRecharge', 'ALListeningMovement', 'ALBackgroundMovement', 'ALAnimationPlayer', 'ALSignsAndFeedback', 'ALRobotMood', 'ALTabletService']
Human Awareness:
['__bool__', '__class__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattribute__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', 'async', 'call', 'engagedHuman', 'humansAround', 'isValid', 'makeEngageHuman', 'metaObject', 'recommendedHumanToEngage']
Human Perception:
['__bool__', '__class__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattribute__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', 'async', 'blobs', 'bodies', 'call', 'dumpBlackBox', 'faces', 'humansAroundPrivate', 'isValid', 'metaObject', 'setDebugMode']
Basic Awareness:
['__bool__', '__class__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattribute__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '_getEmptyFrame', 'async', 'call', 'isEnabled', 'isRunning', 'isValid', 'metaObject', 'setEnabled', 'setStrategies', 'state', 'trackedHuman', 'zoneOfInterest']
a-ichikura commented 4 months ago

Unfortunately, I could not track people with HumanAwareness or HumanPerception in Python.

If my understanding is correct, there is a getHumanAround function in Java, but I could not find such a function in Python.

I know there is a way to track people with BasicAwareness by using setEnabled(True), but my goal is to move only the head while another arm motion is running.
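
For completeness, the BasicAwareness route I mean is just something like this (a minimal sketch; setEnabled is the only call I have actually verified, via the dir() output above):

# Hedged sketch: enable/disable autonomous people tracking as a whole.
# This moves more than just the head, which is exactly my problem.
basicAwareness = app.session.service("BasicAwareness")
basicAwareness.setEnabled(True)
# ...later, to stop tracking:
basicAwareness.setEnabled(False)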

nyibbang commented 4 months ago

I believe there is a humansAround property on HumanAwareness that you can get, or connect to, in order to track the humans detected by the robot.

DrR955 commented 4 months ago

Have you tried using the humansAround property? Looking at the EngageHuman docs, recommendedHuman might also be of use. I'm not sure how that would work with callbacks, though (if at all). Remember, this is all a (somewhat educated) guess on my part, but if you need updates / a callback - for example when the detected human changes - then you should probably check (e.g. via qicli) whether the HumanAwareness service has signals you can subscribe to.

Regarding your problem with BasicAwareness, you could maybe make use of HolderBuilder.withDegreesOfFreedom, which would probably be part of the AutonomousAbilities service.

On a side note, if you have already managed to successfully use functions of the 2.9 services that require the context object, would you please share the code? Maybe it will help some poor soul in the future (myself included).

nyibbang commented 4 months ago

HumanAwareness and properties

You don't have to go through an ALMemory subscriber in this case. You can just connect to the property directly:

humanAwareness = app.session.service("HumanAwareness")
humansAround = humanAwareness.humansAround
# Get the current list of detected humans
humans = humansAround.value()
print(f"current humans: {humans}")
# Or connect to changes on human detection. The lambda is called every time the list of humans around changes.
humansAround.connect(lambda humans: print(f"new humans detected: {humans}"))
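
Note that the connected callback only fires while the application keeps processing events, so keep it alive after setting up the connection, for example (assuming the qi.Application from your first snippet):

# Block until the application is stopped, so the property callback keeps firing.
app.run()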

Context and focus objects

Here is how you can build a Context object for the API you want to use:

ctxFactory = app.session.service("ContextFactory")
focus = app.session.service("Focus")
ctx = ctxFactory.makeContext()
focusOwner = focus.take()
ctx.focus.setValue(focusOwner)
ctx.identity.setValue("my-app") # Unsure about this, let me know if it works for you
# You can now use the ctx object for the API that you want
engageHuman = humanAwareness.makeEngageHuman(ctx, human)
engageHuman.run()
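
The human argument in the last two lines is not defined above; it would come from the humansAround property, so combining both snippets might look roughly like this (again, untested):

# Hedged sketch: engage the first detected human, if any.
humans = humanAwareness.humansAround.value()
if humans:
    engageHuman = humanAwareness.makeEngageHuman(ctx, humans[0])
    engageHuman.run()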
a-ichikura commented 4 months ago

@nyibbang @DrR955 Thank you all. I found humansAround before, but I could not understand how to use it in Python.

Now I have succeeded in getting the human objects with this code:

humanAwareness = app.session.service("HumanAwareness")
humansAround = humanAwareness.humansAround
# Get the current list of detected humans
humans = humansAround.value()
print(f"current humans: {humans}")
# Or connect to changes on human detection. The lambda is called every time the list of humans around changes.
humansAround.connect(lambda humans: print(f"new humans detected: {humans}"))

I'll try BasicAwareness later, adjusting the DegreesOfFreedom, and I'll share a link to my code here if I can make it work.