Closed · timtre closed this issue 3 years ago
Did you find a solution? I have the same problem running from Turkey.
@selimbayhan, thanks for your info. Before we look into the specific Roobo issue, could we step back and do a simple verification of the speech key and region using the following Java sample code? https://github.com/Azure-Samples/cognitive-services-speech-sdk/tree/master/quickstart/java/jre/from-microphone
On a Mac or laptop, follow the README and fill in your speech key and region info. Then check whether they work (i.e., the app can recognize speech to text) in this Java sample app.
@HuapingLiu , thanks for your response. I used the sample code to verify that my key and region info are correct. It is working on my Mac; it recognized what I spoke.
@selimbayhan , thanks for trying. It looks like this is a specific issue on the RooboV1 device. Could you please help confirm some info?
1. My Android studio version is 4.1.2
2. Yes, I installed Google Play services, but nothing changed. Here is the error from the logs:

```
03-17 23:11:55.755 4702-4744/com.microsoft.cognitiveservices.speech.samples.sdsdkstarterapp W/GooglePlayServicesUtil: Google Play Store is missing.
03-17 23:11:55.755 4702-4744/com.microsoft.cognitiveservices.speech.samples.sdsdkstarterapp E/GooglePlayServicesUtil: GooglePlayServices not available due to error 9
```
3. Recognize Continuously is working; there is no problem with it. My problem is with conversation transcription. I added one participant with a signature (v4) generated from the REST service, but there is no transcribed data. The log is below. I am also attaching the signature that I used; I store signatures in a Firestore DB.
```
I/CTS: TransResult Recognized result received: $ref$ : ; Tick: 7900000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 0
I/CTS: TransResult Recognized result received: Guest : ; Tick: 7900000
I/CTS: TransResult Recognized result received: $ref$ : ; Tick: 100100000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 100100000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 100300000
I/CTS: TransResult Recognized result received: $ref$ : ; Tick: 200100000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 200100000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 200100000
I/CTS: TransResult Recognized result received: $ref$ : ; Tick: 305300000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 300100000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 304900000
W/InputEventReceiver: Attempted to finish an input event but the input event receiver has already been disposed.
I/CTS: TransResult Recognized result received: $ref$ : ; Tick: 400100000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 400100000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 400100000
I/CTS: TransResult Recognized result received: $ref$ : ; Tick: 500100000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 500100000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 511900000
I/CTS: TransResult Recognized result received: $ref$ : ; Tick: 600100000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 600100000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 600100000
I/CTS: TransResult Recognized result received: $ref$ : ; Tick: 700100000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 700100000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 706900000
I/CTS: TransResult Recognized result received: $ref$ : ; Tick: 893300000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 800100000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 806700000
```
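[Editor's note: the `Tick` values in these results are offsets in 100-nanosecond units, the Speech SDK's standard unit for offsets and durations, so `Tick: 7900000` is 0.79 s into the session. A small helper to make the logs easier to read (the class and method names here are illustrative, not part of the SDK):

```java
public class TickConverter {
    // Speech SDK offsets and durations are reported in 100-ns "ticks",
    // i.e. 10,000,000 ticks per second.
    private static final double TICKS_PER_SECOND = 10_000_000.0;

    public static double ticksToSeconds(long ticks) {
        return ticks / TICKS_PER_SECOND;
    }

    public static void main(String[] args) {
        System.out.println(ticksToSeconds(7_900_000L));   // 0.79
        System.out.println(ticksToSeconds(100_100_000L)); // 10.01
    }
}
```

With that conversion, the empty Guest results above arrive roughly every 10 seconds of audio, which suggests the service is emitting periodic empty results rather than real transcriptions.]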
Signature: [Huaping removed this personal info]
4. I will try the basic console app and tell you the result. Thanks for your help.
@HuapingLiu
I answered your first three questions in the previous comment. For the 4th question: I just tested your basic console app. It works for speech recognition from the microphone and produces output like the one below. Although it prints linker warnings, speech recognition works. I also tested it with Conversation Transcription (first choice 5, then choice 3). It works, but very late. I put those logs below as well. So I think my problem is related to my Android app. I used a v4 signature, but I see you use a v0 signature in your console app. Is it a problem to use a v4 signature?
```
Choice (0 for MAIN MENU): 2
WARNING: linker: libMicrosoft.CognitiveServices.Speech.extension.pma.so: unused DT entry: type 0x6ffffffe arg 0x5c9c
WARNING: linker: libMicrosoft.CognitiveServices.Speech.extension.pma.so: unused DT entry: type 0x6fffffff arg 0x3
WARNING: linker: libpma.so: unused DT entry: type 0x1d arg 0x121a
WARNING: linker: libpma.so: unused DT entry: type 0x6ffffffe arg 0x2764
WARNING: linker: libpma.so: unused DT entry: type 0x6fffffff arg 0x2
WARNING: linker: libunimic_runtime.so: unused DT entry: type 0x6ffffffe arg 0xc484
WARNING: linker: libunimic_runtime.so: unused DT entry: type 0x6fffffff arg 0x3
WARNING: linker: libMicrosoft.CognitiveServices.Speech.extension.kws.so: unused DT entry: type 0x6ffffffe arg 0x2bd0
WARNING: linker: libMicrosoft.CognitiveServices.Speech.extension.kws.so: unused DT entry: type 0x6fffffff arg 0x3
Say something...
Recognizing:this
Recognizing:the speech
Recognizing:the speech recognition
Recognizing:the speech recognition is
Recognizing:the speech recognition is working
RECOGNIZED:The speech recognition is working.
Recognizing:but
Recognizing:but the
Recognizing:but the conversation
Recognizing:but the conversation transaction
Recognizing:but the conversation transaction has
Recognizing:but the conversation transaction has a
Recognizing:but the conversation transaction has a problem
RECOGNIZED:But the conversation transaction has a problem.
```
Conversation Transcription result:

```
WARNING: linker: libMicrosoft.CognitiveServices.Speech.extension.pma.so: unused DT entry: type 0x6ffffffe arg 0x5c9c
WARNING: linker: libMicrosoft.CognitiveServices.Speech.extension.pma.so: unused DT entry: type 0x6fffffff arg 0x3
WARNING: linker: libpma.so: unused DT entry: type 0x1d arg 0x121a
WARNING: linker: libpma.so: unused DT entry: type 0x6ffffffe arg 0x2764
WARNING: linker: libpma.so: unused DT entry: type 0x6fffffff arg 0x2
WARNING: linker: libunimic_runtime.so: unused DT entry: type 0x6ffffffe arg 0xc484
WARNING: linker: libunimic_runtime.so: unused DT entry: type 0x6fffffff arg 0x3
WARNING: linker: libMicrosoft.CognitiveServices.Speech.extension.kws.so: unused DT entry: type 0x6ffffffe arg 0x2bd0
WARNING: linker: libMicrosoft.CognitiveServices.Speech.extension.kws.so: unused DT entry: type 0x6fffffff arg 0x3
Added Tester
TRANSCRIBING: Text=Unidentified : first one
TRANSCRIBING: Text=Unidentified : first one first
TRANSCRIBING: Text=Unidentified : first one first five
TRANSCRIBING: Text=Unidentified : first one first five ten
TRANSCRIBING: Text=Unidentified : first one first five ten three
TRANSCRIBING: Text=Unidentified : first one first five ten three first
TRANSCRIBING: Text=Unidentified : first one first five ten three first five
TRANSCRIBING: Text=Unidentified : first one first five ten three first five ten
TRANSCRIBING: Text=Unidentified : first one first five ten three first five ten three
Transcribed: Text= Offset=33500000 Duration=0 UserId=Unidentified
Transcribed: Text=First one first 5, then three first 5, then three. Offset=33600000 Duration=55600000 UserId=Unidentified
Transcribed: Text= Offset=0 Duration=100100000 UserId=Unidentified
```
@selimbayhan, thanks for your info.
@HuapingLiu thanks. I just installed Android Studio v3.6.3 and wrote the participant list in the format you provided. I got the following error:
```
I/CTS: Session started event. Start recognition
I/CTS: CANCELED: Reason=Error, ErrorCode=ConnectionFailure, ErrorDetails=Connection was closed by the remote host. Error code: 1011. Error details: Protocols.Core.BadClientRequestException: Failed to parse user signature. at Microsoft.Speech.CTS SessionId: e5db49a3937746e498f58f396e1b8ae1
I/CTS: Session stopped event. Stop recognition
```
My Signature format is:
```
PARTICIPANTSLIST = <user1@\
 {\
 "Type": "Regular",\
 "Version": "V4",\
 "Data": {"Duration":612400000,"Language":"en-us","SignatureData":"[Huaping removed the personal info]"}\
 }>
```
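[Editor's note: as a reference for the entry format above, here is a minimal sketch of assembling one `<userId@{json}>` element for the participant list. The class name and the signature value are placeholders; the real `SignatureData` comes from the voice-signature REST service, and the backslash line continuations in the properties file are omitted here:

```java
public class ParticipantEntry {
    // Builds one <userId@{json}> element of the PARTICIPANTSLIST property.
    // signatureData is a placeholder; use the value returned by the
    // voice-signature generation REST service.
    public static String buildEntry(String userId, long duration,
                                    String language, String signatureData) {
        String json = String.format(
            "{ \"Type\": \"Regular\", \"Version\": \"V4\", "
            + "\"Data\": {\"Duration\":%d,\"Language\":\"%s\","
            + "\"SignatureData\":\"%s\"} }",
            duration, language, signatureData);
        return "<" + userId + "@" + json + ">";
    }

    public static void main(String[] args) {
        System.out.println(buildEntry("user1", 612400000L, "en-us", "SIG_PLACEHOLDER"));
    }
}
```

The "Failed to parse user signature" error above suggests the service rejected the serialized form, so getting this exact shape right is what the rest of the thread works through.]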
@selimbayhan, thanks for your patience. I checked with CTS and attached a new participants.properties file. When you get time, could you please give it a try? participants.properties.txt
@HuapingLiu thanks
First I got a permission-related error, so I added the following line to the manifest file:
```xml
<permission android:name="android.permission.INTERACT_ACROSS_USERS_FULL" android:protectionLevel="signature"/>
```
After that, the permission error was gone.
**Now it can parse the signature, but the problem continues. Here are the logs. The app crashed and produced a dump; I am also attaching the dump file (/data/anr/traces.txt). traces.txt As you can see from the logs, there is nothing in the TransResult.**
```
I/CTS: Participants enrollment
I/CTS: add participant: user1
W/AudioRecord: AUDIO_INPUT_FLAG_FAST denied by client
I/CTS: Session started event. Start recognition
I/CTS: Recognition started. 1616319390007
I/art: Thread[5,tid=13681,WaitingInMainSignalCatcherLoop,Thread*=0xb7820b88,peer=0x12c000a0,"Signal Catcher"]: reacting to signal 3
I/art: Wrote stack traces to '/data/anr/traces.txt'
I/CTS: TransResult Recognized result received: Guest : ; Tick: 0
I/CTS: TransResult Recognized result received: Guest : ; Tick: 30200000
I/CTS: TransResult Recognized result received: Guest : ; Tick: 100100000
I/CTS: CANCELED: Reason=Error, ErrorCode=ServiceTimeout, ErrorDetails=Due to service inactivity, the client buffer exceeded maximum size. Resetting the buffer. SessionId: 3fdd3b8b1443425aa89452dc58150c29
I/CTS: Session stopped event. Stop recognition
```
@selimbayhan, thank you for your useful log. When I looked into it, I did not find "I/PMA: Opened microphone PCM". The following is my log with the same participant format:

```
I/CTS: Participants enrollment
I/CTS: add participant: user1
I/CTS: Recognition started. 1616303443025
I/PMA: Opening PCM
I/PMA: Opened microphone PCM
I/CTS: Session started event. Start recognition
I/CTS: TransResult Intermediate result received: ...: think that; Tick: 11900000
I/CTS: TransResult Intermediate result received: ...: think that's; Tick: 11900000
I/CTS: TransResult Intermediate result received: ...: i think that's; Tick: 11900000
I/CTS: TransResult Intermediate result received: ...: thing testing; Tick: 11900000
I/CTS: TransResult Intermediate result received: ...: testing testing; Tick: 11900000
```
So could you please check your Android app folder structure? It should look like this:
These 3 lib*.so files under the jniLibs folder help collect Circular 6+1 microphone array audio. If you do not have them, could you please try to get them from this link (the latest version is currently 1.15)? https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/speech-devices-sdk
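[Editor's note: the screenshot referenced above is not preserved in this thread. Based on the linker warnings earlier in the thread and the folder rename discussed below, the layout is roughly the following; the thread does not name all three libraries, so the third entry is left as a placeholder:

```
app/
└── src/
    └── main/
        └── jniLibs/
            └── armeabi-v7a/
                ├── libpma.so
                ├── libunimic_runtime.so
                └── (third lib*.so file)
```

The exact contents depend on the Speech Devices SDK version.]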
[BTW, on a Windows system there is no additional permission needed to run this Android sample app. I hope it is not a blocking issue; I will look into it later.]
@HuapingLiu thanks.
Actually, I have the same folder structure and those 3 lib*.so files. Here is the screenshot.
@selimbayhan, cool. Please rename the armeabi folder to armeabi-v7a, then try again and check whether you see log lines like the following, which come from the PMA libs:

```
I/PMA: Opening PCM
I/PMA: Opened microphone PCM
```

[After updating the folder name, I recommend rebooting the RooboV1 and then deploying the sample app to the device.]
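[Editor's note: a quick sketch of the rename being suggested, run against a throwaway copy of the layout; the paths assume the standard Android sample structure, and the `touch` stands in for the real libraries:

```shell
# Recreate the problematic layout in a temp dir, then apply the fix:
# rename the ABI folder from armeabi to armeabi-v7a so Android loads
# the PMA *.so libs from the expected location.
tmp="$(mktemp -d)"
mkdir -p "$tmp/app/src/main/jniLibs/armeabi"
touch "$tmp/app/src/main/jniLibs/armeabi/libpma.so"   # stand-in for the real libs

mv "$tmp/app/src/main/jniLibs/armeabi" "$tmp/app/src/main/jniLibs/armeabi-v7a"

ls "$tmp/app/src/main/jniLibs"
```

On a RooboV1 (a 32-bit ARM device), `armeabi-v7a` is the ABI directory name the loader actually searches, which is why the misnamed `armeabi` folder silently left the microphone-array libs unloaded.]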
@HuapingLiu thank you very much for your patience. With the folder renamed it works now, and it works with Android Studio 4.1.2, so I think there is no Android Studio version issue. But as I said before, I added the following line to the manifest to get rid of the error below:
```
java.lang.SecurityException: Permission Denial: Component com.google.android.gms/.chimera.container.SharedModuleProvider requests FLAG_SINGLE_USER, but app does not hold android.permission.INTERACT_ACROSS_USER
```
Added line to manifest to get rid of the error:
```xml
<permission android:name="android.permission.INTERACT_ACROSS_USERS_FULL" android:protectionLevel="signature"/>
```
And I think there is no issue with Google Play Services. Previously I got the signature from a Firebase Firestore DB, and its Android library depends on Google Play Services. When I deleted the Firebase implementation from the build files, the Google Play Services errors went away.
Thank you very much.
@selimbayhan , great. I appreciate your collaboration and the detailed info here, which is very useful. I consider this issue resolved and will close this case.
This issue is for a: (mark with an `x`)

Minimal steps to reproduce
Any log messages given by the failure
Might the problem be that it's just not possible to run the Speech Devices SDK when being based in Europe?
Expected/desired behavior
OS and Version?
Versions
Mention any other details that might be useful
I tried changing the time zone on the device to West US, Central US, and Central European Time. At the same time I entered the corresponding speech region codes in the Java code (MainActivity.java). I even tried using a VPN on the Roobo device to simulate a US location.
I'd be happy if you could help me out here. Is it even possible to run the Speech Devices SDK outside the US or China?