livekit / server-sdk-go

Client and server SDK for Golang
Apache License 2.0

Can this client push local video streams in the future? #11

Closed: zhuwei closed 3 years ago

davidzhao commented 3 years ago

Yeah, it's already capable of doing that. One caveat: the current SDK doesn't encode video like other client SDKs do. You can publish packets, provided they are already encoded.

Are you thinking about publishing an existing video file? or is this a live source like a camera?

zhuwei commented 3 years ago

Yes, publishing an existing video file or live RTP from a camera. Does it already have this capability? Do you have any relevant documentation or examples? Thanks!

davidzhao commented 3 years ago

We don't have great examples here. But here's our load tester publishing a test track.

You can replace the sample provider with your own implementation that supplies samples.
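
To make that concrete, here is a minimal sketch (not taken from the load tester, just reusing the same calls that appear later in this thread) of publishing pre-encoded frames. The connection, track names, and the frames channel are placeholders you would supply yourself:

// A minimal sketch; assumes the same imports used elsewhere in this thread:
//   lksdk "github.com/livekit/server-sdk-go"
//   "github.com/pion/webrtc/v3"
//   "github.com/pion/webrtc/v3/pkg/media"
//   "time"
// room is an already-connected *lksdk.Room; frames is a placeholder channel
// of pre-encoded H264 access units.
func publishEncoded(room *lksdk.Room, frames <-chan []byte) error {
    // A sample track lets Pion packetize whole encoded frames into RTP.
    track, err := webrtc.NewTrackLocalStaticSample(
        webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeH264}, "video", "example")
    if err != nil {
        return err
    }
    if _, err := room.LocalParticipant.PublishTrack(track, "example track"); err != nil {
        return err
    }
    for data := range frames {
        // One sample per frame; at 30 fps each sample lasts 1/30 of a second.
        if err := track.WriteSample(media.Sample{Data: data, Duration: time.Second / 30}); err != nil {
            return err
        }
    }
    return nil
}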

bregydoc commented 3 years ago

Hi @davidzhao. Thanks for the awesome work with livekit.

Actually, I'm trying to send a stream of images (native image.Image) to my livekit room.

My first approach was to encode the images into H264 frames using this package (github.com/gen2brain/x264-go), but when I start publishing the track I get an unable to start track, codec is not supported by remote error, and I don't know whether that depends on the other clients or on something else. I'm testing this approach with your React client, but I don't know how to make it accept the H264 codec (if I'm understanding the error correctly).

After some hours with that approach, I decided to switch to another codec, VP8, and the only maintained package I could find is a wrapper around libvpx, which is more low-level, I think. So I'm currently reading up on how to encode my image.Image frames with that codec, but I have doubts about my approach.

If anyone has ideas for another way to solve this problem, I'd really appreciate it.

bregydoc commented 3 years ago

Here is my (decontextualized) first-approach implementation:

package main

import (
    "bytes"
    "fmt"
    "time"

    "github.com/gen2brain/x264-go"
    lksdk "github.com/livekit/server-sdk-go"
    "github.com/minskylab/calab"
    "github.com/minskylab/calab/experiments"
    "github.com/minskylab/calab/experiments/petridish"
    "github.com/minskylab/calab/spaces/board"
    "github.com/minskylab/calab/systems/lifelike"
    "github.com/pion/webrtc/v3"
    "github.com/pion/webrtc/v3/pkg/media"
)

func basicLifeLike(w, h int, lifeRule *lifelike.Rule) *petridish.PetriDish {
    dynamic := lifelike.MustNew(lifeRule, lifelike.ToroidBounded, lifelike.MooreNeighborhood(1, false))
    space := board.MustNew(w, h).Fill(board.UniformNoise, dynamic)

    return petridish.NewDefault(calab.BulkDynamicalSystem(space, dynamic))
}

func main() {
    gameOfLife := basicLifeLike(256, 256, lifelike.GameOfLifeRule)

    experiment := experiments.New()

    experiment.AddPetriDish(gameOfLife)

    fmt.Printf("gameOfLife id: %s\n", gameOfLife.ID)

    frames, err := experiment.Observe(gameOfLife.ID)
    if err != nil {
        panic(err)
    }

    go gameOfLife.Run(30 * time.Minute)

    host := "ws://ip:7880"
    apiKey := "--"
    apiSecret := "---"
    roomName := "myroom"
    identity := "botuser"

    room, err := lksdk.ConnectToRoom(host, lksdk.ConnectInfo{
        APIKey:              apiKey,
        APISecret:           apiSecret,
        RoomName:            roomName,
        ParticipantIdentity: identity,
    })
    if err != nil {
        panic(err)
    }

    time.Sleep(5 * time.Second)

    buf := bytes.NewBuffer([]byte{})

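    // x264 options: 256x256 at 30 fps, baseline profile with zero-latency
    // tuning; the encoder writes Annex-B H264 into buf.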
    opts := &x264.Options{
        Width:     256,
        Height:    256,
        FrameRate: 30,
        Tune:      "zerolatency",
        Preset:    "veryfast",
        Profile:   "baseline",
        // LogLevel:  x264.LogDebug,
    }

    enc, err := x264.NewEncoder(buf, opts)
    if err != nil {
        panic(err)
    }

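    // A sample track: WriteSample takes whole encoded frames and Pion handles
    // the RTP packetization.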
    track, err := webrtc.NewTrackLocalStaticSample(webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeH264}, "video", "pion")
    if err != nil {
        panic(err)
    }

    local, err := room.LocalParticipant.PublishTrack(track, "track test")
    if err != nil {
        panic(err)
    }

    local.SetMuted(true)

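    // Encode each simulation frame, then publish the encoder's output for
    // that frame as one sample.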
    for frame := range frames {

        if err := enc.Encode(frame); err != nil {
            panic(err)
        }

        if err := track.WriteSample(media.Sample{
            Data:     buf.Bytes(),
            Duration: time.Second,
        }); err != nil {
            panic(err)
        }

        buf.Reset()
        fmt.Printf("mean tps: %.2f\n", gameOfLife.GetMeanTPS())
    }

    enc.Flush()
}

davidzhao commented 3 years ago

Hey @bregydoc, when you set the track to muted, the server stops forwarding all of its data to subscribers. Is there a reason you're doing that?

By default, LiveKit is configured to use VP8/Opus to maximize compatibility with a wide range of clients; you can change this in the server config.
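
For reference, enabling H264 on the server looks roughly like this in the livekit-server config (an illustrative snippet rather than a copy from the docs, so double-check the keys against the current config reference):

room:
  enabled_codecs:
    - mime: audio/opus
    - mime: video/vp8
    - mime: video/h264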

Check out this example in Pion for writing a VP8 track from disk; H264 is very similar, except that you don't need to parse a container format. Another thing to do for debugging is to write the encoded data to disk as a .h264 file, then make sure with VLC that you can play it back.
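
For that debugging step, a minimal sketch could look like the following; it assumes the same x264-go options used above and just encodes synthetic gray frames so the raw bitstream can be checked in VLC:

package main

import (
    "image"
    "image/color"
    "image/draw"
    "log"
    "os"

    "github.com/gen2brain/x264-go"
)

func main() {
    // Write the raw Annex-B bitstream straight to disk instead of a room.
    file, err := os.Create("out.h264")
    if err != nil {
        log.Fatal(err)
    }
    defer file.Close()

    enc, err := x264.NewEncoder(file, &x264.Options{
        Width:     256,
        Height:    256,
        FrameRate: 30,
        Tune:      "zerolatency",
        Preset:    "veryfast",
        Profile:   "baseline",
    })
    if err != nil {
        log.Fatal(err)
    }

    // Encode three seconds of synthetic frames (a gray ramp) and then flush,
    // so the resulting out.h264 can be sanity-checked with VLC.
    frame := image.NewRGBA(image.Rect(0, 0, 256, 256))
    for i := 0; i < 90; i++ {
        gray := uint8(i * 255 / 90)
        draw.Draw(frame, frame.Bounds(), &image.Uniform{color.RGBA{gray, gray, gray, 255}}, image.Point{}, draw.Src)
        if err := enc.Encode(frame); err != nil {
            log.Fatal(err)
        }
    }

    if err := enc.Flush(); err != nil {
        log.Fatal(err)
    }
}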

bregydoc commented 3 years ago

Hi @davidzhao, thanks for your ideas,

I enabled the H264 codec in the config file, and I no longer get the unable to start track, codec is not supported by remote error.

Also, I created a .h264 file based on this example from the x264-go library. The generated video file plays fine in VLC. Then, building on that example and adding my earlier TrackLocalStaticSample implementation, I tried sending samples to livekit, but the client only receives a blank image; the image color changes from gray to black, and nothing else. Do you have any idea?

PS: I can paste my test code, but it's ~100 lines and I don't want to derail this issue. Would you prefer that I open a separate issue for this, or should we keep it in this thread?

davidzhao commented 3 years ago

Looks like there are some issues with decoding. How often do you have keyframes in your video? It's possible that the client missed the initial keyframe from the server.

If bandwidth isn't a concern, you could make each frame a keyframe.
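
If it helps to verify how often keyframes actually occur, here is a rough helper of my own (not part of any SDK) that scans an Annex-B H264 buffer for an IDR NAL unit, which is what a decoder needs before it can start rendering:

// containsKeyframe reports whether an Annex-B H264 buffer contains an IDR
// (keyframe) NAL unit. It looks for 0x000001 / 0x00000001 start codes and
// checks the NAL unit type in the low 5 bits of the byte that follows.
func containsKeyframe(annexB []byte) bool {
    for i := 0; i+3 < len(annexB); i++ {
        if annexB[i] != 0 || annexB[i+1] != 0 {
            continue
        }
        var nalHeader byte
        switch {
        case annexB[i+2] == 1:
            nalHeader = annexB[i+3]
        case annexB[i+2] == 0 && i+4 < len(annexB) && annexB[i+3] == 1:
            nalHeader = annexB[i+4]
        default:
            continue
        }
        // NAL unit type 5 is an IDR (instantaneous decoder refresh) slice.
        if nalHeader&0x1F == 5 {
            return true
        }
    }
    return false
}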

zhuwei commented 3 years ago

@bregydoc Doesn't buf have to be converted into RTP packets?

davidzhao commented 3 years ago

Take a look at this Pion example.
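
To spell out the distinction zhuwei is asking about: with the sample-level track used above, Pion does the RTP packetization for you; you only handle RTP packets yourself if you opt into the RTP-level track. A rough sketch of the two options, where encodedFrame and pkt stand in for data you would produce yourself:

// Illustration only; assumes these imports:
//   "time"
//   "github.com/pion/rtp"
//   "github.com/pion/webrtc/v3"
//   "github.com/pion/webrtc/v3/pkg/media"
func illustrateTrackKinds(encodedFrame []byte, pkt *rtp.Packet) error {
    // Sample-level track: WriteSample takes a whole encoded frame and Pion
    // packetizes it into RTP internally, so no manual RTP handling is needed.
    sampleTrack, err := webrtc.NewTrackLocalStaticSample(
        webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeH264}, "video", "pion")
    if err != nil {
        return err
    }
    if err := sampleTrack.WriteSample(media.Sample{Data: encodedFrame, Duration: time.Second / 30}); err != nil {
        return err
    }

    // RTP-level track: you hand Pion ready-made rtp.Packet values yourself,
    // e.g. when relaying an existing RTP stream from a camera.
    rtpTrack, err := webrtc.NewTrackLocalStaticRTP(
        webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeH264}, "video", "pion")
    if err != nil {
        return err
    }
    return rtpTrack.WriteRTP(pkt)
}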

davidzhao commented 3 years ago

Hey everyone, I've updated the examples here to show how to publish existing video files using this SDK. Check out the docs here. Closing this issue.
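
For anyone landing here later, one way to feed an existing Annex-B .h264 file into a published sample track looks roughly like this; it is a sketch along the lines of Pion's play-from-disk example, not necessarily identical to the linked SDK examples, and the 30 fps pacing is an assumption:

// A sketch only; assumes these imports:
//   "os"
//   "time"
//   "github.com/pion/webrtc/v3"
//   "github.com/pion/webrtc/v3/pkg/media"
//   "github.com/pion/webrtc/v3/pkg/media/h264reader"
// track is a *webrtc.TrackLocalStaticSample that has already been published.
func publishH264File(track *webrtc.TrackLocalStaticSample, path string) error {
    file, err := os.Open(path)
    if err != nil {
        return err
    }
    defer file.Close()

    reader, err := h264reader.NewReader(file)
    if err != nil {
        return err
    }

    // Pace the writes at roughly 30 fps and send each NAL unit, prefixed with
    // an Annex-B start code, as one sample.
    ticker := time.NewTicker(time.Second / 30)
    defer ticker.Stop()
    for range ticker.C {
        nal, err := reader.NextNAL()
        if err != nil {
            return err // io.EOF once the file has been fully read
        }
        sample := media.Sample{
            Data:     append([]byte{0x00, 0x00, 0x00, 0x01}, nal.Data...),
            Duration: time.Second / 30,
        }
        if err := track.WriteSample(sample); err != nil {
            return err
        }
    }
    return nil
}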