dmrschmidt / DSWaveformImage

Generate waveform images from audio files on iOS, macOS & visionOS in Swift. Native SwiftUI & UIKit views.
MIT License

how to refresh the wave when going back to that level #59

Closed ducduy20 closed 1 year ago

ducduy20 commented 1 year ago

I'm using the waveform to show a recording. When I press "Use", I go to another screen, but when I go back and press record, the wave continues from last time. I'm using NavigationLink and SwiftUI.

dmrschmidt commented 1 year ago

Hey ducduy20,

this sounds like you are describing an issue with your app logic rather than with the library itself. I can try to see if I can help you of course, but without any code samples this may be tricky. For example, what does "when I press use" mean? What is that button doing? Where does it sit?

You are also not explicitly saying what you expect to happen.

So I can only guess what you are trying to achieve: I am assuming that you are using the WaveformLiveCanvas in SwiftUI to show live recording progress.

That would mean that you must have some sort of @Published var samples: [Float] = [] or @State var samples: [Float] = [] somewhere in your code, which you are passing in like WaveformLiveCanvas(samples: samples).

If my deduction is correct so far, then your issue is that you are expecting samples to get emptied whenever you press the record button, and that is not happening anywhere. So somewhere you'd have to do something like samples = []. Where or when that would be, I cannot say, as that depends entirely on your app's logic, flow, etc.
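
For illustration, a minimal sketch of that setup (only WaveformLiveCanvas and Waveform.Configuration come from this library; RecorderModel and everything else are assumptions about your own code):

    import SwiftUI
    import DSWaveformImage
    import DSWaveformImageViews

    // Hypothetical model object: publishes the samples the live waveform draws.
    final class RecorderModel: ObservableObject {
        @Published var samples: [Float] = []
    }

    struct LiveRecordingView: View {
        @StateObject private var recorder = RecorderModel()
        private let liveConfiguration = Waveform.Configuration()

        var body: some View {
            // Redraws whenever recorder.samples changes; setting samples back to []
            // is what clears the drawn waveform.
            WaveformLiveCanvas(samples: recorder.samples, configuration: liveConfiguration)
        }
    }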

ducduy20 commented 1 year ago

Here, when I press the record button everything is OK. When I press "Use", it automatically switches to another screen via NavigationLink, but if I go back and press the record button again, it plays back the old recording. What I want is that after the recorder stops recording, the wave is refreshed.

ducduy20 commented 1 year ago

Here is the code that demonstrates the above:

    NavigationStack{
            VStack{
                Text(changeText ? "Recording": "Record")
                    .font(.system(size: 19,weight: .bold))
                    .padding(.top, 50)

                Spacer()
                if audioRecord.isRecording {
                    WaveformLiveCanvas(samples: audioRecord.samples, configuration: liveConfiguration)
                        .frame(width: UIScreen.width - 100, height: 350)
                } else {
                    Rectangle()
                        .fill(Color.gray.opacity(0.1))
                        .frame(width: UIScreen.width - 20, height: 250)
                }

                Spacer()

                ZStack{
                    Image("ic_button_record")
                        .resizable()
                        .scaledToFit()
                        .frame(width: UIScreen.width)
                        .edgesIgnoringSafeArea(.bottom)

                    VStack{

                        HStack{

                            if audioRecord.minutes < 10{
                                Text("0" + "\(audioRecord.minutes )" + " :")
                            }else{
                                Text("\(audioRecord.minutes )" + " :" )
                            }
                            if audioRecord.seconds < 10{
                                Text("0" + "\(audioRecord.seconds )")

                            }else{

                                Text("\(audioRecord.seconds )")

                            }
//                            Text(audioRecord.timeText)

                        }.font(.system(size: 50, weight: .heavy))
                            .foregroundColor(.white)
                            .opacity(audioRecord.isRecording ? 1 : 0.6 )
                            .frame(width: UIScreen.width, alignment: .center)

                        HStack(){
                            if audioRecord.isRecording == false{
                                Button {
                                    //todo
                                    withAnimation {
                                        self.importFile = true
                                    }
                                } label: {
                                    VStack{
                                        Image("ic_record1")
                                        Text("Open File")
                                            .foregroundColor(.white)
                                    }
                                }
                                .fileImporter(isPresented: $importFile, allowedContentTypes: [.audio], allowsMultipleSelection: false) { result in
                                    if case .success(let value) = result {
                                        changeImport = true
                                        do{
                                            guard let selectFile: URL = try result.get().first else { return }
                                            if selectFile.startAccessingSecurityScopedResource(){
                                                let model = Recording(fileURL: selectFile, createdAT: selectFile.lastPathComponent)
                                                print(value)
                                                audioEffect.audioFile = model

                                                data.getData1 = selectFile
                                                data.fileName = selectFile.lastPathComponent
                                                print(data.getData1)
//                                                selectFile.stopAccessingSecurityScopedResource()
                                            }
                                        }catch{
                                            print("Failed Open File")
                                        }
                                    }
                                }
                            }

                            Spacer()

                            Button {
                                changeText.toggle()
                                switch AVAudioSession.sharedInstance().recordPermission {
                                case .denied:
                                    permissionCheck = true
                                case .granted:
                                    startRecording()
                                case .undetermined:
                                    AVAudioSession.sharedInstance().requestRecordPermission { granted in
                                        if granted {
                                            DispatchQueue.main.asyncAfter(deadline: .now() + 0.25) {
                                                startRecording()
                                            }
                                        }else{
                                            //todo
                                            permissionCheck = true
                                        }
                                    }
                                default:
                                    break
                                }
                            } label: {
                                ZStack{
                                    Circle()
                                        .stroke(lineWidth: 1)
                                        .frame(width: 114, height: 114)
                                        .foregroundColor(.white)
                                        .opacity(0.1)
                                        .scaleEffect(animation1 ? 1 : 1.15)
                                        .onAppear {
                                            withAnimation(Animation.easeInOut(duration: 0.6).repeatForever(autoreverses: true)) {
                                                self.animation1.toggle()
                                            }
                                        }
                                    Circle()
                                        .stroke(lineWidth: 1)
                                        .frame(width: 102, height: 102)
                                        .foregroundColor(.white)
                                        .opacity(0.2)
                                        .scaleEffect(animation2 ? 1 : 1.1)
                                        .onAppear {
                                            withAnimation (Animation.easeInOut(duration: 0.6).repeatForever(autoreverses: true)){
                                                self.animation2.toggle()
                                            }
                                        }
                                    Circle()
                                        .stroke(lineWidth: 2)
                                        .frame(width: 92, height: 92)
                                        .foregroundColor(.white)
                                        .opacity(0.3)
                                        .scaleEffect(animation3 ? 1 : 1.05)
                                        .onAppear {
                                            withAnimation(Animation.easeInOut(duration: 0.6).repeatForever(autoreverses: true)) {
                                                self.animation3.toggle()
                                            }
                                        }
                                    Circle()
                                        .fill(.white)
                                        .frame(width: 82, height: 82)
                                    Image( audioRecord.isRecording ? "bt_recording" : "bt_record")
                                }
                            }
                            .alert(isPresented: $permissionCheck) {
                                Alert(title: Text("Microphone access is required to record"),
                                      message: Text("Go to Settings?"),
                                      primaryButton: .default(Text("Settings"), action: {
                                          DispatchQueue.main.async {
                                              UIApplication.shared.open(URL(string: UIApplication.openSettingsURLString)!)
                                          }
                                      }),
                                      secondaryButton: .default(Text("Cancel")))
                            }

                            Spacer()
                            if audioRecord.isRecording == false{
                                Button {
                                    dismiss.wrappedValue.dismiss()
                                } label: {
                                    VStack{
                                        Image("ic_record2")

                                        Text("Close")
                                            .foregroundColor(.white)
                                    }.padding()
                                }
                            }
                        }
                        .frame(width: UIScreen.width - 40 ,alignment: .center)
                    }

                }
            }
            .frame(width: UIScreen.width, height: UIScreen.height)
        }
        //MARK: navigationLink
        .navigate(to: AudioChangeView(recording: recording, recordAudio: self.audioRecord, playEffect: audioEffect, viewModel: data , voiceChange: EffectFactory.ChangeVoice.allCases, enumCase: .open), when: $changeImport,isHidenNavigationBar: true, navigationBarTitle: "Audio Change")

        .navigate(to: AudioChangeView(recording: recording, recordAudio: self.audioRecord, playEffect: audioEffect, audioRcording: AudioChange(), viewModel: data, voiceChange: EffectFactory.ChangeVoice.allCases, enumCase: .record), when: $changeView, isHidenNavigationBar: true,navigationBarTitle: "Audio Change")
    }
    func startRecording() {
        if audioRecord.isRecording {
            // Already recording: stop and move on to the next screen.
            audioRecord.stopRecording()
            changeView = true
        } else {
            // Not recording yet: start a new recording and remember its file.
            audioRecord.startRecording()
            let fileRecording = audioRecord.audioFile
            data.getData2 = fileRecording!.fileURL
            data.fileName = fileRecording!.fileURL.lastPathComponent
        }
    }
dmrschmidt commented 1 year ago

Hey @ducduy20,

so yeah, this problem really sits outside of the support I can provide here, as it's SwiftUI & application logic related and not about the library.

Your issue is that you want audioRecord.samples to reset to [] when the user navigates back from that navigation link.

So that's what you will have to find out: "how to perform an action when a user navigates back from a NavigationLink". I haven't used NavigationLink myself yet, so I don't know the answer personally, but I'm sure the documentation or StackOverflow will be helpful here.

Once you know how to hook into the "navigate back action", all you'd need to do is have a method like audioRecord.reset() which restores your state. IMHO, how you describe what you want is over-complicating the situation though. A possibly much simpler way that doesn't have any edge cases (because calling reset() on navigating back would clear the view as soon as the user comes back) would be to have startRecording() do the clearing whenever it is called. That's also better UX, IMHO: offer the user different buttons to "continue current recording", "reset", "record new", or something along those lines.
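
To make that concrete, a rough sketch of the second approach (AudioRecord, reset() and the other names are assumptions about your own recorder class, not part of this library):

    import SwiftUI

    // Hypothetical recorder: clears its own state whenever a new recording starts.
    final class AudioRecord: ObservableObject {
        @Published var samples: [Float] = []
        @Published var isRecording = false

        /// Drops everything left over from the previous recording.
        func reset() {
            samples = []
        }

        func startRecording() {
            reset() // every new recording starts from an empty waveform
            isRecording = true
            // ... start AVAudioEngine / AVAudioRecorder here and append new
            // amplitude values to `samples` as they arrive ...
        }

        func stopRecording() {
            isRecording = false
            // ... stop the audio engine ...
        }
    }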