Another question for my #WebRTC friends: Do you know what *exactly* makes an audio stream sync up with a video stream in libwebrtc?
Is it the a=msid:<stream-id> <track-id> grouping on the sender side?
Is it the a=msid:<stream-id> <track-id> grouping on the receiver side?
Is it both? Or is it something else entirely? E.g., do we need to create a MediaStream from both tracks on the sender or the receiver side? Or do we need to attach both audio and video tracks to the same <video> element?
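For concreteness, here's a minimal browser-side sketch of the grouping mechanisms the question refers to, written in TypeScript against the standard WebRTC API. The publish() helper, the "video#remote" element id, and the signaling wiring (not shown) are illustrative assumptions, not from any particular codebase:

```ts
// A minimal sketch, assuming a browser environment and an already-wired
// signaling channel (not shown). Names here are illustrative.

const pc = new RTCPeerConnection();

// Sender side: passing the same MediaStream to addTrack() for both tracks is
// what stamps the same <stream-id> into each m-section's a=msid line.
async function publish(): Promise<void> {
  const local = await navigator.mediaDevices.getUserMedia({
    audio: true,
    video: true,
  });
  for (const track of local.getTracks()) {
    pc.addTrack(track, local); // second argument becomes the msid <stream-id>
  }
}

// Receiver side: event.streams is reconstructed from that msid <stream-id>,
// so audio and video tracks sharing a stream id arrive grouped together.
pc.ontrack = (event: RTCTrackEvent) => {
  const [stream] = event.streams;
  if (!stream) return; // sender called addTrack() without a stream: no msid group

  // Attaching the grouped stream to one element. Whether lip sync hinges on
  // this step, on the msid grouping itself, or on something else underneath
  // is exactly what the question above is asking.
  const video = document.querySelector<HTMLVideoElement>("video#remote");
  if (video && video.srcObject !== stream) {
    video.srcObject = stream;
  }
};

publish().catch(console.error);
```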
@steely_glint (Tim Panton)
@lminiero (Lorenzo Miniero)
@danjenkins
@saghul (Saúl Ibarra Corretgé)