Closed acgacgacgacg closed 1 year ago
@acgacgacgacg Just tested and could not reproduce on my device. Did you use client-sdk-flutter/example, or implement it in your own project? Have you set the mic permission key in iOS/Runner/Info.plist?
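For reference, the microphone permission entry in `iOS/Runner/Info.plist` uses the `NSMicrophoneUsageDescription` key (the description string below is just an example):

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone to publish audio to the room.</string>
```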
@cloudwebrtc
I implemented it in my own project, and I have set the mic permission in Info.plist.
Here is my code for toggling the mic:
```dart
Future<void> toggleMic() async {
  final hasPermission = await Permission.microphone.isGranted;
  if (!hasPermission) {
    final result = await Permission.microphone.request();
    if (result != PermissionStatus.granted) {
      await AppSettings.openSoundSettings();
      return;
    }
  }
  final isMicEnabled = room?.localParticipant?.isMicrophoneEnabled() ?? false;
  try {
    await room?.localParticipant?.setMicrophoneEnabled(!isMicEnabled);
  } on Exception catch (e, s) {
    await CrashReportHelper.report(e, s);
    await CrashReportHelper.log(
      'publish audio failed, connection state is ${room?.connectionState}',
    );
    await showAlertDialog(
      _context,
      'toggleMicFailed'.tr(),
      'leaveRoomThenJoinAgain'.tr(),
    );
  }
  notifyListeners();
}
```
Ah, this is weird. Do you have a minimal project that can reproduce the issue? I suspect it has something to do with a wrong setting, but since I can't reproduce it in the example app, I can't locate the buggy code.
@cloudwebrtc I will check the difference between the example and my code. Here is the code I use to connect to a room:
```dart
Future<void> _connectRoom() async {
  final options = RoomOptions(
    adaptiveStream: true,
    dynacast: true,
    defaultAudioCaptureOptions:
        const AudioCaptureOptions(highPassFilter: true),
    defaultScreenShareCaptureOptions:
        ScreenShareCaptureOptions(useiOSBroadcastExtension: Platform.isIOS),
  );
  try {
    room = Room(roomOptions: options);
    await room!.connect(
      LIVEKIT_SERVER,
      _token,
    );
  } on Exception catch (e, s) {
    unawaited(CrashReportHelper.report(e, s));
    unawaited(CrashReportHelper.log('room connection failed'));
  }
  debugPrint('room connected');
}
```
The code looks fine.
You can try creating a listener before `room.connect`, because the listener handles asynchronous event interaction internally:
```dart
// create a new room
room = Room(roomOptions: options);
// create a listener before connecting
_listener = room.createListener();
await room!.connect(
  LIVEKIT_SERVER,
  _token,
);
```
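As a usage sketch (event class names taken from the livekit_client Dart SDK; `_listener` and `room` are assumed to be the same fields used in the snippets above), handlers can then be registered on the listener, and the listener should be disposed together with the room:

```dart
// Register handlers on the listener that was created before connect.
_listener
  ..on<RoomDisconnectedEvent>((event) {
    debugPrint('room disconnected');
  })
  ..on<ParticipantConnectedEvent>((event) {
    debugPrint('participant joined: ${event.participant.identity}');
  });

// On teardown, dispose the listener together with the room.
await _listener.dispose();
await room!.dispose();
```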
@cloudwebrtc I modified my code and removed the following two lines that ran after connecting (in an older version of livekit, this worked around a bug when returning from the background):

```dart
await room?.localParticipant?.setMicrophoneEnabled(true);
await room?.localParticipant?.setMicrophoneEnabled(false);
```
I tested again and found that the first `setMicrophoneEnabled(true)` works just fine.
However, after setting it to false and then to true again, it becomes extremely slow, and sometimes the `AUIOClient_StartIO failed` issue occurs.
Also, the audio takes almost 1 second to be heard by another participant. This only happens from the second `setMicrophoneEnabled(true)` onward.
When I disconnect from the room, the following error is sometimes thrown:

```
flutter: Concurrent modification during iteration: _LinkedHashMap len:1.
flutter:
#0  _CompactIterator.moveNext (dart:collection-patch/compact_hash.dart:706:7)
#1  RoomPrivateMethods._cleanUp (package:livekit_client/src/core/room.dart:597:45)
<asynchronous suspension>
#2  Room.disconnect (package:livekit_client/src/core/room.dart:365:5)
<asynchronous suspension>
```

After this error is thrown, the `AUIOClient_StartIO failed` error occurs the next time I connect to a room.
> Also, the audio takes almost 1 second to be heard by another participant. This only occurs from the second `setMicrophoneEnabled(true)`.

This may be a known issue: if your audio is outputting, the second call to `setMicrophoneEnabled` will restart the AudioUnit inside the WebRTC Audio Device Module, which takes some time.
Okay, this may be related to room cleanup, where the audio device was not successfully released by the previous session. Can you try changing these lines of code https://github.com/livekit/client-sdk-flutter/blob/main/lib/src/core/room.dart#L596-L601 to

```dart
// clean up RemoteParticipants
var participants = _participants.values.toList();
for (final participant in participants) {
  // RemoteParticipant is responsible for disposing resources
  await participant.dispose();
}
_participants.clear();
```

? This may be a concurrent operation issue.
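For background, a standalone Dart illustration (not SDK code): Dart throws a `ConcurrentModificationError` when a map is mutated while one of its iterators is active, which is exactly what the stack trace above shows; snapshotting with `toList()` before iterating avoids it.

```dart
void main() {
  final participants = <String, int>{'alice': 1, 'bob': 2};

  // This would throw ConcurrentModificationError, because the map is
  // mutated while its values iterator is still active:
  // for (final _ in participants.values) {
  //   participants.remove('alice');
  // }

  // Snapshotting the values with toList() first makes mutation safe:
  for (final sid in participants.values.toList()) {
    participants.removeWhere((name, value) => value == sid);
  }
  print(participants.isEmpty); // true
}
```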
> Also, the audio takes almost 1 second to be heard by another participant. This only occurs from the second `setMicrophoneEnabled(true)`.
>
> This may be a known issue: if your audio is outputting, the second call to `setMicrophoneEnabled` will restart the AudioUnit inside the WebRTC Audio Device Module, which takes some time.

I see. The previous version doesn't have this issue. Since it causes a bad UX, I think it would be better to roll back to an earlier version. Which version should I roll back to?
> Can you try changing these lines of code https://github.com/livekit/client-sdk-flutter/blob/main/lib/src/core/room.dart#L596-L601 to
>
> ```dart
> // clean up RemoteParticipants
> var participants = _participants.values.toList();
> for (final participant in participants) {
>   // RemoteParticipant is responsible for disposing resources
>   await participant.dispose();
> }
> _participants.clear();
> ```
>
> ? This may be a concurrent operation issue.

Let me try.
> Okay, this may be related to room cleanup, where the audio device was not successfully released by the previous session. Can you try changing these lines of code https://github.com/livekit/client-sdk-flutter/blob/main/lib/src/core/room.dart#L596-L601 to
>
> ```dart
> // clean up RemoteParticipants
> var participants = _participants.values.toList();
> for (final participant in participants) {
>   // RemoteParticipant is responsible for disposing resources
>   await participant.dispose();
> }
> _participants.clear();
> ```
>
> ? This may be a concurrent operation issue.
The concurrent error no longer occurs, but the `AURemoteIO.cpp:1668 AUIOClient_StartIO failed (-66637)` error still appears after several rounds of `setMicrophoneEnabled(true)` / `setMicrophoneEnabled(false)`.
And once `AUIOClient_StartIO failed` occurs, even disconnecting and then reconnecting won't fix it.
I think it would be better to roll back to an earlier version. Which version should I roll back to?
Actually, the delay is due to a bug we fixed where the indicator would still light up after the mic was muted. The earlier version of mute only stopped transmitting audio frames on the AudioTrack (so although the track was muted, the indicator stayed on), while the fixed version goes further and turns off audio unit capture in the audio device module.
> The concurrent error no longer occurs, but the `AURemoteIO.cpp:1668 AUIOClient_StartIO failed (-66637)` error still appears after several rounds of `setMicrophoneEnabled(true)` / `setMicrophoneEnabled(false)`. And once `AUIOClient_StartIO failed` occurs, even disconnecting and then reconnecting won't fix it.

Let me stress test this issue in e2e. This may be a timing bug in the native SDK.
@cloudwebrtc Any progress on this issue?
Please try livekit_client v1.2.0. This version updates to flutter-webrtc 0.9.22, which includes a settings update for iOS sound management.
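For reference, pinning the suggested version in `pubspec.yaml` would look like this (a sketch; check pub.dev for the exact release to use):

```yaml
dependencies:
  livekit_client: 1.2.0
```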
Closing this issue, as the bug no longer occurs on my device. If it shows up on our customers' devices, I will report it and ask to reopen. Thanks!
**Describe the bug**
The error `AUIOClient_StartIO failed` occurs on iOS after a client joins an existing room and calls `setMicrophoneEnabled(true)`. This prevents the audio track from being published. Also, `setMicrophoneEnabled(true)` is extremely slow compared to version 1.1.0.

**To Reproduce**
Create a room by connecting from another client, then connect to the room and call `setMicrophoneEnabled(true)`.

**Expected behavior**
No error, and the audio track can be published.
**Platform information**

```
[✓] Android toolchain - develop for Android devices (Android SDK version 32.1.0-rc1)
    • Android SDK at /Users/gokaten/Library/Android/sdk
    • Platform android-33, build-tools 32.1.0-rc1
    • Java binary at: /Applications/Android Studio.app/Contents/jre/Contents/Home/bin/java
    • Java version OpenJDK Runtime Environment (build 11.0.12+0-b1504.28-7817840)
    • All Android licenses accepted.
[✓] Xcode - develop for iOS and macOS (Xcode 14.0.1)
    • Xcode at /Applications/Xcode.app/Contents/Developer
    • Build 14A400
    • CocoaPods version 1.11.3
[✓] Chrome - develop for the web
    • Chrome at /Applications/Google Chrome.app/Contents/MacOS/Google Chrome
[✓] Android Studio (version 2021.2)
    • Android Studio at /Applications/Android Studio.app/Contents
    • Flutter plugin can be installed from: 🔨 https://plugins.jetbrains.com/plugin/9212-flutter
    • Dart plugin can be installed from: 🔨 https://plugins.jetbrains.com/plugin/6351-dart
    • Java version OpenJDK Runtime Environment (build 11.0.12+0-b1504.28-7817840)
[✓] VS Code (version 1.73.1)
    • VS Code at /Applications/Visual Studio Code.app/Contents
    • Flutter extension version 3.54.0
[✓] Connected device (2 available)
    • macOS (desktop) • macos • darwin-arm64 • macOS 12.5.1 21G83 darwin-arm
    • Chrome (web) • chrome • web-javascript • Google Chrome 108.0.5359.94
[✓] HTTP Host Availability
    • All required HTTP hosts are available
```
Log output:

```
flutter: room connected
AudioSession override via Earpiece/Headset is successful
AudioSession override via Earpiece/Headset is successful
[aurioc] AURemoteIO.cpp:1668 AUIOClient_StartIO failed (-66637)
```