VapiAI / flutter

MIT License

Flutter deploy, error on calling assistant: πŸ†˜Vapi - Failed to create client after 5 attempts. #3

Open RonanJehnna opened 2 months ago

RonanJehnna commented 2 months ago

Hi @nikhilro, I'm getting the same error on Flutter, but I am using a public key only.

I/flutter (31150): πŸ”„ 2024-04-10 17:55:13.088052: Vapi - Requesting Mic Permission...
I/flutter (31150): πŸ†— 2024-04-10 17:55:13.253083: Vapi - Mic Permission Granted
I/flutter (31150): πŸ”„ 2024-04-10 17:55:13.255974: Vapi - Preparing Call & Client...
I/flutter (31150): πŸ”„ 2024-04-10 17:55:13.331110: Vapi - Creating client (Attempt 1)...
I/org.webrtc.Logging(31150): WebRtcAudioManager: ctor@[name=Thread-5, id=27073]
I/org.webrtc.Logging(31150): WebRtcAudioManager: Sample rate is set to 48000 Hz
D/CompatibilityChangeReporter(31150): Compat change id reported: 263076149; UID 10283; state: DISABLED
I/org.webrtc.Logging(31150): WebRtcAudioEffects: canUseAcousticEchoCanceler: true
I/org.webrtc.Logging(31150): WebRtcAudioEffects: canUseNoiseSuppressor: true
W/org.webrtc.Logging(31150): WebRtcAudioManager: AAudio support is currently disabled on all devices!
I/org.webrtc.Logging(31150): WebRtcAudioManager: Android SDK: 34, Release: 14, Brand: Nothing, Device: Spacewar, Id: UP1A.231005.007, Hardware: qcom, Manufacturer: Nothing, Model: A063, Product: Spacewar
I/AudioManager(31150): In isSpeakerphoneOn(), calling application: com.example.vapi_ai
I/AudioManager(31150): In isBluetoothScoOn(), calling application: com.example.vapi_ai
I/org.webrtc.Logging(31150): WebRtcAudioManager: Audio State: audio mode: MODE_NORMAL, has mic: true, mic muted: false, music active: false, speakerphone: false, BT SCO: false
I/org.webrtc.Logging(31150): WebRtcAudioManager: Audio State: 
I/org.webrtc.Logging(31150): WebRtcAudioManager:   fixed volume=false
I/org.webrtc.Logging(31150): WebRtcAudioManager:   STREAM_VOICE_CALL: volume=1, max=7, muted=false
I/org.webrtc.Logging(31150): WebRtcAudioManager:   STREAM_MUSIC: volume=5, max=16, muted=false
I/org.webrtc.Logging(31150): WebRtcAudioManager:   STREAM_RING: volume=16, max=16, muted=false
I/org.webrtc.Logging(31150): WebRtcAudioManager:   STREAM_ALARM: volume=14, max=16, muted=false
I/org.webrtc.Logging(31150): WebRtcAudioManager:   STREAM_NOTIFICATION: volume=10, max=16, muted=false
I/org.webrtc.Logging(31150): WebRtcAudioManager:   STREAM_SYSTEM: volume=1, max=16, muted=false
I/org.webrtc.Logging(31150): WebRtcAudioManager: Audio Devices: 
I/org.webrtc.Logging(31150): WebRtcAudioManager:   TYPE_BUILTIN_EARPIECE(out): channels=[1], encodings=[2], sample rates=[48000], id=2
I/org.webrtc.Logging(31150): WebRtcAudioManager:   TYPE_BUILTIN_SPEAKER(out): channels=[2], encodings=[2], sample rates=[48000], id=3
I/org.webrtc.Logging(31150): WebRtcAudioManager:   TYPE_BLUETOOTH_SCO(out): channels=[1], encodings=[2], sample rates=[8000, 16000], id=8150
I/org.webrtc.Logging(31150): WebRtcAudioManager:   TYPE_BLUETOOTH_A2DP(out): channels=[2], encodings=[2], sample rates=[48000], id=8158
I/org.webrtc.Logging(31150): WebRtcAudioManager:   TYPE_TELEPHONY(out): channels=[1, 2], encodings=[2], sample rates=[8000, 16000], id=13
I/org.webrtc.Logging(31150): WebRtcAudioManager:   TYPE_BUILTIN_MIC(in): channels=[1, 2], encodings=[2], sample rates=[8000, 12000, 16000, 24000, 32000, 48000, 11025, 22050, 44100], id=20
I/org.webrtc.Logging(31150): WebRtcAudioManager:   TYPE_BLUETOOTH_SCO(in): channels=[1], encodings=[2], sample rates=[8000, 16000], id=8153
I/org.webrtc.Logging(31150): WebRtcAudioManager:   TYPE_HDMI(in): channels=[2], encodings=[2], sample rates=[48000, 32000, 44100], id=31
I/org.webrtc.Logging(31150): WebRtcAudioManager:   TYPE_TELEPHONY(in): channels=[1, 2], encodings=[2], sample rates=[8000, 16000, 48000, 12000, 24000, 32000, 11025, 22050, 44100], id=21
I/org.webrtc.Logging(31150): WebRtcAudioManager:   TYPE_BUILTIN_MIC(in): channels=[1, 2], encodings=[2], sample rates=[8000, 12000, 16000, 24000, 32000, 48000, 11025, 22050, 44100], id=22
I/org.webrtc.Logging(31150): WebRtcAudioManager:   TYPE_UNKNOWN(in): channels=[2], encodings=[2], sample rates=[48000], id=33
I/org.webrtc.Logging(31150): WebRtcAudioManager:   TYPE_FM_TUNER(in): channels=[1, 2], encodings=[2], sample rates=[48000, 8000, 12000, 16000, 24000, 32000, 11025, 22050, 44100], id=23
D/OpenSLESPlayer(31150): ctor[tid=32291]
D/OpenSLESRecorder(31150): ctor[tid=32291]
D/OpenSLESPlayer(31150): AttachAudioBuffer
D/OpenSLESPlayer(31150): SetPlayoutSampleRate(48000)
D/OpenSLESPlayer(31150): SetPlayoutChannels(1)
D/OpenSLESPlayer(31150): AllocateDataBuffers
D/OpenSLESPlayer(31150): native buffer size: 192
D/OpenSLESPlayer(31150): native buffer size in ms: 4.00
D/OpenSLESRecorder(31150): AttachAudioBuffer
D/OpenSLESRecorder(31150): SetRecordingSampleRate(48000)
D/OpenSLESRecorder(31150): SetRecordingChannels(1)
D/OpenSLESRecorder(31150): AllocateDataBuffers
D/OpenSLESRecorder(31150): frames per native buffer: 192
D/OpenSLESRecorder(31150): frames per 10ms buffer: 480
D/OpenSLESRecorder(31150): bytes per native buffer: 384
D/OpenSLESRecorder(31150): native sample rate: 48000
I/org.webrtc.Logging(31150): WebRtcAudioManager: init@[name=Thread-5, id=27073]
I/org.webrtc.Logging(31150): WebRtcAudioManager: audio mode is: MODE_NORMAL
D/OpenSLESPlayer(31150): Init[tid=32291]
D/OpenSLESRecorder(31150): Init[tid=32291]
D/OpenSLESRecorder(31150): EnableBuiltInAEC(1)
E/OpenSLESRecorder(31150): Not implemented
D/OpenSLESRecorder(31150): EnableBuiltInNS(1)
E/OpenSLESRecorder(31150): Not implemented
I/example.vapi_ai(31150): Background concurrent copying GC freed 396719(20MB) AllocSpace objects, 3(60KB) LOS objects, 89% free, 2778KB/26MB, paused 3.116ms,19us total 141.194ms
W/example.vapi_ai(31150): ApkAssets: Deleting an ApkAssets object '<empty> and /data/app/~~UYR4jSMcuU7FZzq4bWlLJw==/com.google.android.tts-trjJgo9-en7YNu8NMXOngA==/base.apk' with 1 weak references
I/DailyCore(31150): daily_core::soup::signalling: s0_name: calls0_target=daily_core::native::ffi::call_client::lifecycles0_file=daily-core/src/native/ffi/call_client/lifecycle.rss0_line=37s0_call_id="0"Soup send queue starting
I/DailyCore(31150): daily_core::call_manager: s0_name: calls0_target=daily_core::native::ffi::call_client::lifecycles0_file=daily-core/src/native/ffi/call_client/lifecycle.rss0_line=37s0_call_id="0"CallManager event loop starting
I/flutter (31150): ⏳ 2024-04-10 17:55:14.355232: Vapi - Client creation timed out.

Flutter and Dart versions used (screenshot attached).
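For anyone debugging this, one way to surface the underlying failure instead of waiting out the SDK's retry loop is to bound the call with an explicit timeout. This is a rough sketch, not the SDK's own behavior: the `vapi.start` call shape is taken from the example below, and the 30-second budget and the assistant payload are assumptions.

```dart
import 'dart:async';

import 'package:vapi/Vapi.dart';

Future<void> startWithDiagnostics(Vapi vapi) async {
  try {
    // Bound the call so a hung client creation fails fast with a
    // TimeoutException instead of retrying silently.
    await vapi
        .start(assistant: {'firstMessage': 'Hello, I am an assistant.'})
        .timeout(const Duration(seconds: 30));
  } on TimeoutException {
    print('Vapi client creation timed out - check network/WebRTC logs');
  } catch (e, st) {
    print('Vapi start failed: $e\n$st');
  }
}
```

Catching the error this way at least distinguishes a hung client creation from an exception thrown during setup.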

nikhilro commented 2 months ago

Could you try with the vanilla example?

RonanJehnna commented 2 months ago

Yes, I tried it with the code provided in the example (screenshots attached):

import 'package:flutter/material.dart';
import 'package:vapi/Vapi.dart';

const VAPI_PUBLIC_KEY = 'VAPI_PUBLIC_KEY';
const VAPI_ASSISTANT_ID = 'VAPI_ASSISTANT_ID';

void main() {
  runApp(MyApp());
}

class MyApp extends StatefulWidget {
  @override
  _MyAppState createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {
  String buttonText = 'Start Call';
  bool isLoading = false;
  bool isCallStarted = false;
  Vapi vapi = Vapi(VAPI_PUBLIC_KEY);

  _MyAppState() {
    vapi.onEvent.listen((event) {
      if (event.label == "call-start") {
        setState(() {
          buttonText = 'End Call';
          isLoading = false;
          isCallStarted = true;
        });
        print('call started');
      }
      if (event.label == "call-end") {
        setState(() {
          buttonText = 'Start Call';
          isLoading = false;
          isCallStarted = false;
        });
        print('call ended');
      }
      if (event.label == "message") {
        print(event.value);
      }
    });
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(
          title: Text('Test App'),
        ),
        body: Center(
          child: ElevatedButton(
            onPressed: isLoading
                ? null
                : () async {
                    setState(() {
                      buttonText = 'Loading...';
                      isLoading = true;
                    });

                    if (!isCallStarted) {
                      await vapi.start(assistant: {
                        "firstMessage": "Hello, I am an assistant.",
                        "model": {
                          "provider": "openai",
                          "model": "gpt-3.5-turbo",
                          "messages": [
                            {
                              "role": "system",
                              "content": "You are an assistant."
                            }
                          ]
                        },
                        "voice": "jennifer-playht"
                      });
                    } else {
                      await vapi.stop();
                    }
                  },
            child: Text(buttonText),
          ),
        ),
      ),
    );
  }
}
RonanJehnna commented 2 months ago

I am using the Dart package https://pub.dev/packages/vapi/example for this, and then replaced the main.dart code with the vanilla example.

RonanJehnna commented 2 months ago

I also tried cloning this repo and running it, still getting the same error.

RoyalCoder88 commented 2 months ago

> I also tried cloning this repo and running it, still getting the same error.

Hi @RonanJehnna, if you're using the VS Code IDE, try running the app from the terminal with `flutter run` instead of through the IDE; the same applies to Android Studio.
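As a concrete version of that suggestion (standard Flutter CLI commands, nothing project-specific assumed), running from a terminal with a clean build makes the full device log visible, including the Vapi/Daily errors:

```shell
flutter clean          # drop stale build artifacts
flutter pub get        # re-fetch the vapi package
flutter run --verbose  # run on the attached device with full logs
```

This requires the Flutter toolchain on PATH and a connected device or emulator.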