Closed timaschew closed 4 years ago
It's expecting raw PCM audio in chunked transfer encoding. I designed it while trying to understand how Home Assistant/Ada worked with streaming audio.
Do you have any sense of how people might generally want to use it? It may also be fairly redundant given the GStreamer input.
It's expecting raw PCM audio in chunked transfer encoding.
As I've said, I've got it working with VLC, and it is indeed a chunked transfer (stream) in PCM format. Since enabling logging on my server, I was able to see that Rhasspy was not calling the HTTP endpoint at all.
So what I would expect:
As a reference: I'm using the node-record-lpcm16 module and just pipe the stream from the microphone to the response via Express:
```javascript
const recorder = require('node-record-lpcm16')
const express = require('express')

// Start recording from the default microphone (raw 16-bit PCM)
const recording = recorder.record()
const microphoneStream = recording.stream()

const app = express()
app.get('*', (req, res) => {
  console.log('client connected')
  res.status(200)
  res.set('Content-Type', 'audio/wav')
  // No Content-Length is set, so Express streams the audio chunked
  microphoneStream.pipe(res)
})
app.listen(7000)
```
The HTTP microphone is removed in 2.5. I suggest using GStreamer instead.
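For reference, the GStreamer input runs a pipeline that emits raw audio on stdout. A hedged sketch of such a pipeline (the element names and caps here are my assumptions, not taken from the Rhasspy docs; adjust to your audio setup):

```shell
# Sketch only: capture the default microphone and emit raw 16 kHz mono
# 16-bit PCM on stdout, which is roughly the format Rhasspy expects.
gst-launch-1.0 \
  autoaudiosrc ! audioconvert ! audioresample \
  ! audio/x-raw,format=S16LE,channels=1,rate=16000 \
  ! fdsink fd=1
```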
I've tried to stream the audio via HTTP with a self-written Node.js application. I was able to receive the stream with VLC, but it didn't work with Rhasspy.
Then I found a solution using VLC (as a server), but I only got it working with RTP and the WAVE codec:
and with HTTP and the MPEG codec:
Anyway, I don't know exactly what Rhasspy is expecting: when using HTTP, which path does it request, for instance? It would be nice to have an example or a reference.