EZAudio is an iOS and macOS audio visualization framework built on Core Audio, useful for anyone doing real-time, low-latency audio processing and visualization.
I want to extract the data used to plot my audio file so I can plot it later in Excel on my computer. In other words, EZAudioPlot has x and y components for drawing the buffered waveform; I want to extract those into an array and plot them with amplitude on the y-axis and time on the x-axis.
Here is my code; I don't know how to use getWaveformData():
func setupPlot() {
    // NOTE: `plot` is local to this function; store it in a property
    // if you need to update it later (as the code below does).
    let plot = AKNodeOutputPlot(microphone, frame: audioInputPlot.bounds)
    plot.plotType = .buffer
    plot.shouldFill = false
    plot.shouldMirror = false
    plot.shouldCenterYAxis = true
    plot.color = Gold // `Gold` is a custom color defined elsewhere
}
let documentsUrl = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
let audioFilePath = documentsUrl.appendingPathComponent("AudioFile-2017-03-04-13-38-45.m4a")
// `audioFilePath` is already a file URL, so it can be passed directly
let audioFile = EZAudioFile(url: audioFilePath)
let waveformData = audioFile?.getWaveformData()
// `plot` must still be in scope here (e.g. stored as a property)
plot.updateBuffer(waveformData?.buffers[0], withBufferSize: waveformData?.bufferSize ?? 0)
print(waveformData ?? "no waveformData")
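One way to get those x/y values into Excel is to write them out as a CSV file. The sketch below is a hypothetical helper (the function name `exportWaveformCSV` and the CSV layout are my own, not part of EZAudio): it reads the `EZAudioFloatData` returned by `getWaveformData()`, treats each downsampled point as covering an equal slice of the file's `duration`, and writes `time,amplitude` rows. It assumes channel 0 of `buffers` holds the mono/left waveform; adjust if your file is stereo.

```swift
import Foundation
import EZAudio

// Hypothetical helper: converts the waveform points EZAudio computes
// for plotting into (time, amplitude) CSV rows that Excel can open.
func exportWaveformCSV(from audioFile: EZAudioFile, to csvURL: URL) throws {
    guard let waveformData = audioFile.getWaveformData(),
          let samples = waveformData.buffers[0] else {
        throw NSError(domain: "WaveformExport", code: 1, userInfo: nil)
    }
    let pointCount = Int(waveformData.bufferSize)
    // Each waveform point spans an equal fraction of the file's duration.
    let secondsPerPoint = audioFile.duration / Double(pointCount)

    var csv = "time,amplitude\n"
    for i in 0..<pointCount {
        let t = Double(i) * secondsPerPoint
        csv += "\(t),\(samples[i])\n"
    }
    try csv.write(to: csvURL, atomically: true, encoding: .utf8)
}

// Usage (e.g. after loading the file as in the question):
// let csvURL = documentsUrl.appendingPathComponent("waveform.csv")
// if let audioFile = EZAudioFile(url: audioFilePath) {
//     try? exportWaveformCSV(from: audioFile, to: csvURL)
// }
```

You can then copy the CSV off the device (e.g. via iTunes file sharing or the Files app) and open it in Excel, with the first column as the x-axis (time) and the second as the y-axis (amplitude).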