aws / amazon-chime-sdk-ios

An iOS client library for integrating multi-party communications powered by the Amazon Chime service.
https://aws.amazon.com/chime/chime-sdk/
Apache License 2.0

Followed documentation but cannot hear or speak in voice call #639

Open Banhdit1 opened 5 months ago

Banhdit1 commented 5 months ago

Describe the bug
The dial-out works and the callee can pick up the call, but the iOS caller using the app has no sound or voice input (though the orange microphone indicator does come on on the iPhone). I think I might be missing something, but I am not sure; any help would be greatly appreciated. All the iOS code is below.

To Reproduce
Steps to reproduce the behavior:
1. Installed the SDK via CocoaPods using AmazonChimeSDK-Bitcode and AmazonChimeSDKMedia-Bitcode.
2. Info.plist includes Privacy - Microphone Usage Description and Privacy - Camera Usage Description (a quick runtime check for these keys is sketched after this list).
3. Dialed a landline using the Amazon Chime SDK for iOS, SIP media applications, and Voice Connectors.
4. The iOS class is below (combined with UI buttons that call the function):
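A small sanity check (illustrative, not part of the issue) that the two usage strings actually made it into the built app's Info.plist; their raw keys are NSMicrophoneUsageDescription and NSCameraUsageDescription:

import Foundation

// Illustrative check: if either key prints MISSING, the permission prompt cannot
// be shown and capture will not work.
let micUsage = Bundle.main.object(forInfoDictionaryKey: "NSMicrophoneUsageDescription") as? String
let camUsage = Bundle.main.object(forInfoDictionaryKey: "NSCameraUsageDescription") as? String
print("NSMicrophoneUsageDescription:", micUsage ?? "MISSING")
print("NSCameraUsageDescription:", camUsage ?? "MISSING")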

Swift code:

import Foundation
import Amplify
import AmazonChimeSDK
import AmazonChimeSDKMedia
import CallKit
import AVFoundation

class Calltest: ObservableObject {

@Published var joinMeetingResponse: JoinMeetingResponse?

func audio_security_request () {
    switch AVAudioSession.sharedInstance().recordPermission {
    case AVAudioSession.RecordPermission.granted:
        print("voice granted")
    case AVAudioSession.RecordPermission.denied:
        print("denied")
    case AVAudioSession.RecordPermission.undetermined:
        // You must request permission.
        AVAudioSession.sharedInstance().requestRecordPermission({ (granted) in
          if granted {
            print("granted")
          } else {
            print("rejected")
          }
        })
    @unknown default:
        print("Fatal error")
    }
}

func requestCameraAccess() {
    AVCaptureDevice.requestAccess(for: .video) { granted in
        DispatchQueue.main.async {
            if granted {
                print("camera granted")
            } else {
                print("denied")
                // The user did not grant access to the camera
            }
        }
    }
}

func initiateCall(phone_number:String) async {
    let message = #"{"toNumber": "\#(phone_number)"}"#
    let request = RESTRequest(path: "/chime", body: message.data(using: .utf8))
    print("request", request)
    var data: RESTTask.Success
    do {
        data = try await Amplify.API.post(request: request)
    }
    catch {
        print("Failed due to API error: ", error)
        return
    }
    do {
        print("data", data)
        let jsonDecoder = JSONDecoder()
        joinMeetingResponse = try jsonDecoder.decode(JoinMeetingResponse.self, from: data)
        print(joinMeetingResponse)
    }
    catch {
        print("Failed due to decoding: ", error)
        return
    }

    let meetingResp = CreateMeetingResponse(meeting: Meeting(
            externalMeetingId: joinMeetingResponse!.responseInfo.meeting.externalMeetingId,
            mediaPlacement: MediaPlacement(
                audioFallbackUrl: joinMeetingResponse!.responseInfo.meeting.mediaPlacement.audioFallbackUrl,
                audioHostUrl: joinMeetingResponse!.responseInfo.meeting.mediaPlacement.audioHostUrl,
                signalingUrl: joinMeetingResponse!.responseInfo.meeting.mediaPlacement.signalingUrl,
                turnControlUrl: joinMeetingResponse!.responseInfo.meeting.mediaPlacement.turnControlUrl
            ),
            mediaRegion: joinMeetingResponse!.responseInfo.meeting.mediaRegion,
            meetingId: joinMeetingResponse!.responseInfo.meeting.meetingId
        )
    )
    let attendeeResp = CreateAttendeeResponse(attendee:
        Attendee(attendeeId: joinMeetingResponse!.responseInfo.attendee.attendeeId,
            externalUserId: joinMeetingResponse!.responseInfo.attendee.externalUserId,
                 joinToken: joinMeetingResponse!.responseInfo.attendee.joinToken
        )
    )
    let meetingSessionConfig = MeetingSessionConfiguration(
        createMeetingResponse: meetingResp,
        createAttendeeResponse: attendeeResp
    )
    let logger = ConsoleLogger(name: "MeetingViewController")

    let currentMeetingSession = DefaultMeetingSession(
        configuration: meetingSessionConfig,
        logger: logger
    )
    let audioDevices = currentMeetingSession.audioVideo.listAudioDevices()

    currentMeetingSession.audioVideo.chooseAudioDevice(mediaDevice: audioDevices[0])

    do {
        try currentMeetingSession.audioVideo.start(callKitEnabled: false)
        print("success")
    } catch PermissionError.audioPermissionError {
        print("no permissions")
    } catch {
        print("audioVideo.start failed:", error)
    }
}

}

Expected behavior
Expected to be able to dial a number and exchange sound through the microphone and the speaker.

Logs

[ERROR] MeetingViewController - audio cues: others_joined.wav err: 7
[ERROR] MeetingViewController - audio cues: others_left.wav err: 7
[ERROR] MeetingViewController - audio cues: reconnecting.wav err: 7
[ERROR] MeetingViewController - audio cues: reconnected.wav err: 7
[ERROR] MeetingViewController - audio cues: reconnect_failed.wav err: 7
[ERROR] MeetingViewController - audio cues: rumble_strips.wav err: 7
[ERROR] MeetingViewController - audio cues: first_caller.wav err: 7
[ERROR] MeetingViewController - audio cues: remote_muted.wav err: 7
[ERROR] MeetingViewController - audio cues: call_waiting.wav err: 7
[ERROR] MeetingViewController - audio cues: ring_back.wav err: 7
[ERROR] MeetingViewController - audio cues: first_caller_beep.wav err: 7
[ERROR] MeetingViewController - audio cues: welcome.wav err: 7
[INFO] MeetingViewController - AudioClient State: connecting Status: ok
[INFO] MeetingViewController - Initializing VideoClient
[INFO] MeetingViewController - Starting VideoClient
[INFO] MeetingViewController - API/DefaultAudioVideoFacade/start(audioVideoConfiguration: audioMode: stereo48K, callKitEnabled: true, enableAudioRedundancy: true)

[INFO] MeetingViewController - videoClientIsConnecting
[INFO] MeetingViewController - AudioClient State: finishConnecting Status: ok
[INFO] MeetingViewController - videoClientDidConnect, 0
[INFO] MeetingViewController - videoClientCameraSendIsAvailable true
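Since the log shows the AudioClient reaching finishConnecting while nothing is heard, one way to narrow this down is to attach a realtime observer and check whether attendee joins and volume updates ever arrive. This is a minimal diagnostic sketch (not code from the issue), assuming the SDK's RealtimeObserver protocol and audioVideo.addRealtimeObserver(observer:):

import Foundation
import AmazonChimeSDK

// Diagnostic observer: if no volume updates ever arrive for the local attendee,
// microphone audio is not reaching the audio client; if none arrive for the
// remote attendee, no remote audio is being received.
class AudioDiagnostics: NSObject, RealtimeObserver {
    func volumeDidChange(volumeUpdates: [VolumeUpdate]) {
        for update in volumeUpdates {
            print("volume:", update.attendeeInfo.attendeeId, update.volumeLevel.rawValue)
        }
    }
    func signalStrengthDidChange(signalUpdates: [SignalUpdate]) {}
    func attendeesDidJoin(attendeeInfo: [AttendeeInfo]) {
        print("joined:", attendeeInfo.map { $0.attendeeId })
    }
    func attendeesDidLeave(attendeeInfo: [AttendeeInfo]) {}
    func attendeesDidDrop(attendeeInfo: [AttendeeInfo]) {}
    func attendeesDidMute(attendeeInfo: [AttendeeInfo]) {
        print("muted:", attendeeInfo.map { $0.attendeeId })
    }
    func attendeesDidUnmute(attendeeInfo: [AttendeeInfo]) {}
}

// Usage: keep a strong reference to the observer and register it right after
// creating the session, before calling start().
// let diagnostics = AudioDiagnostics()
// currentMeetingSession.audioVideo.addRealtimeObserver(observer: diagnostics)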

Test environment Info (please complete the following information):

Banhdit1 commented 5 months ago

import Foundation
import Amplify
import AmazonChimeSDK
import AmazonChimeSDKMedia
import CallKit
import AVFoundation

class CallManager: NSObject, CXProviderDelegate {

func providerDidReset(_ provider: CXProvider) {
    self.hangup()
}

let provider: CXProvider
let callController = CXCallController()
var currentMeetingSession: MeetingSession?
@Published var joinMeetingResponse: JoinMeetingResponse?

override init() {
    // Provide an initial value for provider
    let providerConfiguration = CXProviderConfiguration(localizedName: "MyApp")
    providerConfiguration.supportsVideo = false
    providerConfiguration.maximumCallsPerCallGroup = 1
    providerConfiguration.supportedHandleTypes = [.generic]

    self.provider = CXProvider(configuration: providerConfiguration)

    // Important: Call super.init before accessing methods or properties on self
    super.init()

    // After super.init, you can now set the delegate for the provider
    provider.setDelegate(self, queue: nil)
}

func audio_security_request () {
    switch AVAudioSession.sharedInstance().recordPermission {
    case AVAudioSession.RecordPermission.granted:
        print("voice granted")
    case AVAudioSession.RecordPermission.denied:
        print("denied")
    case AVAudioSession.RecordPermission.undetermined:
        // You must request permission.
        AVAudioSession.sharedInstance().requestRecordPermission({ (granted) in
            if granted {
                print("granted")
            } else {
                print("rejected")
            }
        })
    @unknown default:
        print("Fatal error")
    }
}

func requestCameraAccess() {
    AVCaptureDevice.requestAccess(for: .video) { granted in
        DispatchQueue.main.async {
            if granted {
                print("camera granted")
            } else {
                print("denied")
                // The user did not grant access to the camera
            }
        }
    }
}

private func joinMeeting(phone_number: String) async throws -> JoinMeetingResponse {
       // Construct the API request
       let message = #"{"toNumber": "\#(phone_number)"}"#
       let request = RESTRequest(path: "/chime", body: message.data(using: .utf8))

       // Send the API request and decode the response
       let data = try await Amplify.API.post(request: request)

       let jsonDecoder = JSONDecoder()
       let response = try jsonDecoder.decode(JoinMeetingResponse.self, from: data)

       return response
   }

func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
    // CallKit has activated the audio session; pick an audio device and start audio/video.
    if let audioDevices = currentMeetingSession?.audioVideo.listAudioDevices(),
       let firstDevice = audioDevices.first {
        currentMeetingSession?.audioVideo.chooseAudioDevice(mediaDevice: firstDevice)
    }
    do {
        // Use `try` (not `try?`) so a failure actually reaches the catch block.
        try currentMeetingSession?.audioVideo.start(callKitEnabled: true)
    } catch {
        print("audioVideo.start failed in didActivate:", error)
    }
}

func provider(_ provider: CXProvider, perform action: CXStartCallAction) {
        // Configure the audio session, but do not start audio here
        configureAudioSession()

        // Start your meeting/joining process using an async task
        Task {
            do {
                let joinMeetingResponse = try await joinMeeting(phone_number: action.handle.value)

                // Call succeeded to initiate, inform CallKit that it can proceed
                let meetingResp = CreateMeetingResponse(meeting:
                    Meeting(
                        externalMeetingId: joinMeetingResponse.responseInfo.meeting.externalMeetingId,
                        mediaPlacement: MediaPlacement(
                            audioFallbackUrl: joinMeetingResponse.responseInfo.meeting.mediaPlacement.audioFallbackUrl,
                            audioHostUrl: joinMeetingResponse.responseInfo.meeting.mediaPlacement.audioHostUrl,
                            signalingUrl: joinMeetingResponse.responseInfo.meeting.mediaPlacement.signalingUrl,
                            turnControlUrl: joinMeetingResponse.responseInfo.meeting.mediaPlacement.turnControlUrl
                        ),
                        mediaRegion: joinMeetingResponse.responseInfo.meeting.mediaRegion,
                        meetingId: joinMeetingResponse.responseInfo.meeting.meetingId
                    )
                )
                let attendeeResp = CreateAttendeeResponse(attendee:
                    Attendee(attendeeId: joinMeetingResponse.responseInfo.attendee.attendeeId,
                        externalUserId: joinMeetingResponse.responseInfo.attendee.externalUserId,
                        joinToken: joinMeetingResponse.responseInfo.attendee.joinToken
                    )
                )
                let meetingSessionConfig = MeetingSessionConfiguration(
                    createMeetingResponse: meetingResp,
                    createAttendeeResponse: attendeeResp
                )
                self.currentMeetingSession = DefaultMeetingSession(
                    configuration: meetingSessionConfig,
                    logger: ConsoleLogger(name: "MeetingConsoleLogger")
                )
                action.fulfill()

                // Provide CallKit with a call update, if needed
                let callUpdate = createCallUpdate(for: action)
                provider.reportCall(with: action.callUUID, updated: callUpdate)

            } catch {
                print("Join meeting error: \(error)")

                // Inform CallKit that starting the call has failed
                action.fail()

                // Handle the error appropriately
                handleJoinMeetingError(error)
            }
        }
    }

private func handleJoinMeetingError(_ error: Error) {
        if let apiError = error as? APIError {
            // Handle Amplify APIError, possibly with custom messaging or retries
            print("API Error occurred: \(apiError)")
        } else if let decodingError = error as? DecodingError {
            // Handle decoding errors with more specific messages or error handling
            print("Decoding Error occurred: \(decodingError)")
        } else {
            // Generic error handling for other types of errors
            print("An unexpected error occurred: \(error)")
        }

        // Perform any necessary cleanup or user notifications here
        // ...
    }

private func createCallUpdate(for action: CXStartCallAction) -> CXCallUpdate {
        let callUpdate = CXCallUpdate()
        callUpdate.remoteHandle = action.handle
        callUpdate.hasVideo = action.isVideo
        // Set other callUpdate fields as necessary
        return callUpdate
    }

private func configureAudioSession() {
    let audioSession = AVAudioSession.sharedInstance()
       do {
           try audioSession.setCategory(.playAndRecord, mode: .voiceChat, options: [])
           try audioSession.setActive(true)
       } catch {
           print("Failed to configure audio session: \(error)")
       }
  }

public func startCall(with handle: String, isVideo: Bool = false) {
    let handle = CXHandle(type: .phoneNumber, value: handle)
    let startCallAction = CXStartCallAction(call: UUID(), handle: handle)
    startCallAction.isVideo = isVideo

    let transaction = CXTransaction(action: startCallAction)

    let controller = CXCallController()
    controller.request(transaction) { error in
        if let error = error {
            print("Error requesting transaction: \(error)")
        } else {
            print("Requested transaction successfully")
        }
    }
}

func hangup () {
    self.currentMeetingSession?.audioVideo.stop()
}

}
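For completeness, a hypothetical call site (not part of the issue; the phone number is a placeholder) showing how this CallManager is intended to be driven:

// Request microphone permission up front, then let CallKit perform the
// CXStartCallAction, which joins the Chime meeting in provider(_:perform:).
let callManager = CallManager()
callManager.audio_security_request()
callManager.startCall(with: "+15555550100", isVideo: false) // placeholder number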

Requested transaction successfully
2024-02-03T16:19:28-0800 info HttpContent : [Logging] seeking to offset 0 in data
2024-02-03T16:19:28-0800 info HttpContent : [Logging] read 27 bytes from data
2024-02-03T16:19:28-0800 info HttpContent : [Logging] read 0 bytes from data
2024-02-03T16:19:28-0800 info HttpContent : [Logging] seeking to offset 0 in data

[ERROR] MeetingConsoleLogger - audio cues: others_joined.wav err: 7
[ERROR] MeetingConsoleLogger - audio cues: others_left.wav err: 7
[ERROR] MeetingConsoleLogger - audio cues: reconnecting.wav err: 7
[ERROR] MeetingConsoleLogger - audio cues: reconnected.wav err: 7
[ERROR] MeetingConsoleLogger - audio cues: reconnect_failed.wav err: 7
[ERROR] MeetingConsoleLogger - audio cues: rumble_strips.wav err: 7
[ERROR] MeetingConsoleLogger - audio cues: first_caller.wav err: 7
[ERROR] MeetingConsoleLogger - audio cues: remote_muted.wav err: 7
[ERROR] MeetingConsoleLogger - audio cues: call_waiting.wav err: 7
[ERROR] MeetingConsoleLogger - audio cues: ring_back.wav err: 7
[ERROR] MeetingConsoleLogger - audio cues: first_caller_beep.wav err: 7
[ERROR] MeetingConsoleLogger - audio cues: welcome.wav err: 7
device model is iPhone13,3
hardware model is D53pAP
[INFO] MeetingConsoleLogger - Initializing VideoClient
[INFO] MeetingConsoleLogger - Starting VideoClient
[INFO] MeetingConsoleLogger - API/DefaultAudioVideoFacade/start(audioVideoConfiguration: audioMode: stereo48K, callKitEnabled: true, enableAudioRedundancy: true)
[INFO] MeetingConsoleLogger - AudioClient State: connecting Status: ok
[INFO] MeetingConsoleLogger - videoClientIsConnecting
[INFO] MeetingConsoleLogger - AudioClient State: finishConnecting Status: ok
[INFO] MeetingConsoleLogger - attendeesJoined: [<AmazonChimeSDK.AttendeeInfo: 0x2820b22b0>]
[INFO] MeetingConsoleLogger - videoClientDidConnect, 0
[INFO] MeetingConsoleLogger - videoClientCameraSendIsAvailable true