Another question for my friends: Do you know what *exactly* makes an audio stream sync up with a video stream in libwebrtc?

Is it the `a=msid:<stream-id> <track-id>` grouping on the sender side?

Is it the `a=msid:<stream-id> <track-id>` grouping on the receiver side?
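
For clarity, this is the kind of grouping I mean; the ids here are made up, the point is just that the audio and video m= sections carry the same `<stream-id>`:

```
m=audio 9 UDP/TLS/RTP/SAVPF 111
a=msid:stream-1 audio-track-1

m=video 9 UDP/TLS/RTP/SAVPF 96
a=msid:stream-1 video-track-1
```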

Is it both? Or is it something else entirely? For example, do we need to create a `MediaStream` from both tracks on the sender or the receiver side? Or do we need to attach both the audio and video tracks to the same `<video>` element?
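
To make those last two options concrete, here is a rough sketch using the browser WebRTC API (not libwebrtc's C++ API directly); the variable names and the `remoteVideo` element id are just placeholders:

```ts
// Sender side: put both tracks into one MediaStream before adding them.
async function startSending(pc: RTCPeerConnection): Promise<void> {
  const localStream = await navigator.mediaDevices.getUserMedia({
    audio: true,
    video: true,
  });
  for (const track of localStream.getTracks()) {
    // The second argument associates the track with the stream id that
    // ends up in the a=msid line of the corresponding m= section.
    pc.addTrack(track, localStream);
  }
}

// Receiver side: collect both incoming tracks into one MediaStream and
// attach it to a single <video> element.
function startReceiving(pc: RTCPeerConnection): void {
  const remoteStream = new MediaStream();
  pc.ontrack = (event) => {
    remoteStream.addTrack(event.track);
    // event.streams[0] reflects the sender's msid grouping, when present.
  };
  const videoEl = document.getElementById("remoteVideo") as HTMLVideoElement;
  videoEl.srcObject = remoteStream;
}
```

As far as I understand it, passing the stream as the second argument to `addTrack()` is what puts the shared stream id into the `a=msid` lines of the offer; what I'm unsure about is which of these pieces actually drives the lip-sync.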

@steely_glint (Tim Panton) @lminiero (Lorenzo Miniero) @danjenkins @s (Saúl Ibarra Corretgé)

