
Cullen (I think) has changed the permissions on the list, so that now everyone can subscribe and the archives are open. If you know of people you think should be on the list, please ask them to subscribe!

The two ways to subscribe:
- Send "subscribe" to rtc-web-request@alvestrand.no, and do what the response says
- Go to http://www.alvestrand.no/mailman/listinfo/rtc-web and follow the instructions

Let the discussions begin!

Harald

Cool, thanks Harald.

As I said during the day, I'd like to separate (as much as possible) "why is real-time communications on the internet hard?" (which is true, but a subject the IETF, the ITU, and others are also grappling with) from "what is interesting/challenging about real-time communications *in the web*?" -- which I take to mean in pages shown by a browser.

On Oct 8, 2010, at 4:24, Harald Alvestrand wrote:
_______________________________________________ RTC-Web mailing list RTC-Web@alvestrand.no http://www.alvestrand.no/mailman/listinfo/rtc-web
David Singer Multimedia and Software Standards, Apple Inc.

Thanks for the general invite!

I wonder: has the HTML5 device element been looked at (http://dev.w3.org/html5/html-device/), and what are the problems with that solution?

Cheers, Silvia.

On Sat, Oct 9, 2010 at 5:28 PM, David Singer <singer@apple.com> wrote:

On 10/09/10 03:17, Silvia Pfeiffer wrote:
> Thanks for the general invite!
> I wonder: has the HTML5 device element been looked at (http://dev.w3.org/html5/html-device/) and what are the problems with that solution?

We're the ones who have to look - and some on the list have been closely involved with writing the <device> spec. It would surprise me if it were not part of the solution - but only part.

As far as I know, it's still not clear how to tie a <device> to a media stream - given that media streams aren't defined yet, this is not very surprising :-)

Silvia,

you might be interested in some experimenting we've done with media streams and <device>: https://labs.ericsson.com/developer-community/blog/beyond-html5-conversation... (you can track back to earlier posts).

BR, Stefan

From: rtc-web-bounces@alvestrand.no [mailto:rtc-web-bounces@alvestrand.no] On Behalf Of Harald Alvestrand
Sent: 9 October 2010 10:12
To: Silvia Pfeiffer
Cc: rtc-web@alvestrand.no; David Singer
Subject: Re: [RTW] List is now open

Hi Stefan,

I have seen those, thanks. That's actually the reason why I asked: because I have already seen it work with the <device> element, and I wondered what the remaining challenges were. It seems there is lots of discussion about protocols and codecs.

Cheers, Silvia.

On Mon, Oct 11, 2010 at 9:49 AM, Stefan Håkansson LK <stefan.lk.hakansson@ericsson.com> wrote:

That's right, a lot of things remain regarding protocols and other matters. But IMHO <device>, the Stream APIs, and <audio> and <video> should be part of the puzzle!

--Stefan

From: Silvia Pfeiffer [mailto:silviapfeiffer1@gmail.com]
Sent: 11 October 2010 01:23
To: Stefan Håkansson LK
Cc: Harald Alvestrand; rtc-web@alvestrand.no; David Singer
Subject: Re: [RTW] List is now open

My idea is that we will tie these various pieces together like a filter chain, i.e.

<device> -> [encoder] -> [transport] -> [decoder] -> <video>

where the connections between pieces are made by passing URLs around (i.e. opening a device yields a URL for a stream, which is supplied to the encoder; at the other end, streams coming from the decoder are identified by a URL, which can then be passed directly to a <video> tag or a WebGL texture).

Other combinations are of course possible, such as direct access to [transport] in the case of a web real-time game, or combining <device> -> encoder -> websocket for doing live (non-real-time) broadcasts.

We're still figuring out the right interfaces for encoder/decoder; for transport, hopefully the draft I proposed (<https://docs.google.com/viewer?url=http://rtc-web.alvestrand.com/papers/juberti-p2ptransport-api.pdf%3Fattredirects%3D0>) can serve as a reasonable starting point.

--justin

On Sun, Oct 10, 2010 at 4:26 PM, Stefan Håkansson LK <stefan.lk.hakansson@ericsson.com> wrote:
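The filter-chain idea, where stages hand each other opaque stream URLs rather than objects, can be sketched roughly as below. This is a hypothetical in-memory illustration only: `publish`, `resolve`, `openDevice`, `encode`, and `decode` are invented names, not real browser APIs, and the transport stage is omitted.

```javascript
// Each stage publishes its output under a generated stream: URL, and the
// next stage is handed that URL rather than the underlying object.
const registry = new Map();
let counter = 0;

function publish(payload) {
  const url = `stream:${++counter}`; // identifier internal to the "browser"
  registry.set(url, payload);
  return url;
}

function resolve(url) {
  return registry.get(url);
}

// <device> -> [encoder] -> [transport] -> [decoder] -> <video>
function openDevice() {      // stand-in for <device> capture
  return publish({ kind: 'raw', frames: ['f1', 'f2'] });
}
function encode(inputUrl) {  // stand-in for [encoder]
  const raw = resolve(inputUrl);
  return publish({ kind: 'encoded', frames: raw.frames.map(f => 'enc(' + f + ')') });
}
function decode(inputUrl) {  // stand-in for [decoder]
  const enc = resolve(inputUrl);
  return publish({ kind: 'raw', frames: enc.frames.map(f => f.slice(4, -1)) });
}

const deviceUrl = openDevice();
const encodedUrl = encode(deviceUrl); // [transport] omitted in this sketch
const decodedUrl = decode(encodedUrl);
// decodedUrl could now be handed to a <video> element or a WebGL texture
console.log(decodedUrl, resolve(decodedUrl).frames);
```

The point of the sketch is that every arrow in the chain is just a URL handoff, so stages can be recombined freely (e.g. device -> encoder -> websocket).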

It seems to me that works nicely for 1-to-1 RTC. It wouldn't even require the introduction of RTP/RTSP, but could simply be pushing encoded a/v packets over the network as fast as possible on the given URL - which would then be a UDP URL?

I guess for multi-peer RTC a different approach on the network would be required - or would it be a full mesh?

Silvia.

On Mon, Oct 11, 2010 at 10:43 AM, Justin Uberti <juberti@google.com> wrote:

On Sun, Oct 10, 2010 at 5:44 PM, Silvia Pfeiffer <silviapfeiffer1@gmail.com> wrote:
> It seems to me that works nicely for 1-to-1 RTC. It wouldn't even require introduction of RTP/RTSP but can simply be pushing encoded a/v packets over the network as fast as possible on the given URL, which would then be a UDP URL?
The URLs I refer to are really just identifiers for resources internal to the browser - for network connectivity, the plan is to use ICE to interactively determine how packets should be routed to the desired endpoint in the presence of NATs and firewalls. Regarding RTP: we'll want to use RTP as the on-the-wire protocol, so that we get all the goodness of RTP - multiplexing of multiple streams and/or formats, detection of losses, lipsync, etc.
> I guess for multi-peer RTC a different approach on the network would be required or would it be a full mesh?
Conferencing is a key use case, and I think this design can support either centralized or distributed conferencing models. The nodes will connect to one another using the same wire protocol, but the overall topology of the nodes will be up to the application.
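To see why the topology choice is left to the application, a hypothetical back-of-the-envelope comparison of connection counts (not part of any proposed API): a full mesh needs a connection between every pair of browsers, while a centralized (star) conference needs one connection per participant.

```javascript
// Pairwise connections in a full mesh of n peers: n*(n-1)/2.
function meshLinks(n) { return n * (n - 1) / 2; }

// Connections in a centralized conference: one per participant.
function starLinks(n) { return n; }

for (const n of [3, 5, 10]) {
  console.log(`${n} peers: mesh=${meshLinks(n)}, star=${starLinks(n)}`);
}
```

The mesh's quadratic growth in uplink bandwidth is one reason larger conferences tend toward a centralized mixer or relay, while small calls can stay peer-to-peer.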

The issue of peer-to-peer multicast is more complex than one simply of the topology of nodes. For various reasons, one really wants to be able to send different parts of the same stream via different paths in order to accomplish this (made even more complicated if you believe that FEC is appropriate for this... which I don't, but that's a different argument altogether). Then it is no longer as simple as wiring together RTP In to RTP Out. (And, as I mentioned in a previous email, there might be good reasons for not even allowing such a connection, to prevent silent relaying from using up bandwidth at unsuspecting browsers.)

Matthew Kaufman

From: "Justin Uberti" <juberti@google.com>
> Conferencing is a key use case, and I think this design can support either centralized or distributed conferencing models. The nodes will connect to one another using the same wire protocol, but the overall topology of the nodes will be up to the application.

There needs to be:

1) A way of ensuring that media packets being sent to a "UDP URL" are being sent to a cooperating device, which means there must be some sort of handshake that relies on an out-of-band exchange as well. STUN connectivity-check probes and responses meet this requirement, if it is enforced that this handshake must complete prior to sending media traffic.

2) A way of describing how these media packets will be encapsulated, especially if audio and video will be on the same UDP port. RTP meets this requirement (if you're OK with using the SSRC for multiplexing, in the latter case).

You do raise the good question as to whether or not peer-to-peer application-level multicast overlays are in scope or not.

Matthew Kaufman

From: "Silvia Pfeiffer" <silviapfeiffer1@gmail.com>
To: "Justin Uberti" <juberti@google.com>
Cc: "Stefan Håkansson LK" <stefan.lk.hakansson@ericsson.com>, rtc-web@alvestrand.no, "David Singer" <singer@apple.com>, "Harald Alvestrand" <harald@alvestrand.no>
Sent: Sunday, October 10, 2010 5:44:02 PM
Subject: Re: [RTW] List is now open
Other combinations are of course possible, such as direct access to [transport], in the case of a web real-time game, or combining <device> -> encoder -> websocket, for doing live (non-realtime) broadcasts. We're still figuring out the right interfaces for encoder/decoder; for transport, hopefully the draft I proposed can serve as a reasonable starting point. --justin On Sun, Oct 10, 2010 at 4:26 PM, Stefan Håkansson LK < stefan.lk.hakansson@ericsson.com > wrote: That's right, a lot of things remain regarding protocols and other stuff. But IMHO <device>, StreamAPIs and <audio> and <video> should be part of the puzzle! --Stefan From: Silvia Pfeiffer [mailto: silviapfeiffer1@gmail.com ] Sent: den 11 oktober 2010 01:23 To: Stefan Håkansson LK Cc: Harald Alvestrand; rtc-web@alvestrand.no ; David Singer Subject: Re: [RTW] List is now open Hi Stefan, I have seen those, thanks. That's actually the reason why I asked: because I have already seen it work with the <device> element and I wondered what the remaining challenges were. It seems there is lots of discussion about protocols and codecs. Cheers, Silvia. On Mon, Oct 11, 2010 at 9:49 AM, Stefan Håkansson LK < stefan.lk.hakansson@ericsson.com > wrote: Silvia, you might be interested in some experimenting we've done with media streams and <device>: https://labs.ericsson.com/developer-community/blog/beyond-html5-conversation... (you can track back to earlier posts). BR, Stefan From: rtc-web-bounces@alvestrand.no [mailto: rtc-web-bounces@alvestrand.no ] On Behalf Of Harald Alvestrand Sent: den 9 oktober 2010 10:12 To: Silvia Pfeiffer Cc: rtc-web@alvestrand.no ; David Singer Subject: Re: [RTW] List is now open On 10/09/10 03:17, Silvia Pfeiffer wrote: Thanks for the general invite! I wonder: has the HTML5 device element been looked at ( http://dev.w3.org/html5/html-device/ ) and what are the problems with that solution? 
We're the ones who have to look - and some on the list have been closely involved with writing the <device> spec. It would be surprising to me if they are not part of the solution - but just part. As far as I know, it's still not clear how to tie a <device> to a media stream - given that media streams aren't defined yet, this is not very surprising :-) Cheers, Silvia. On Sat, Oct 9, 2010 at 5:28 PM, David Singer < singer@apple.com > wrote: Cool, thanks Harald as I said during the day, I'd like to separate (as much as possible) "why is real-time communications on the internet hard?" (which is true, but a subject the IETF, the ITU, and others are also grappling with) from "what is interesting/challenging about real-time communications *in the web*?" -- which I take to mean in pages shown by a browser. On Oct 8, 2010, at 4:24 , Harald Alvestrand wrote:
Cullen (I think) has changed the permissions on the list, so that now everyone can subscribe, and the archives are open.
If you know of people you think should be on the list, please ask them to subscribe!
The two ways to subscribe:
- Send "subscribe" to rtc-web-request@alvestrand.no, and do what the response says - Go to http://www.alvestrand.no/mailman/listinfo/rtc-web and follow instructions
Let the discussions begin!
Harald
_______________________________________________ RTC-Web mailing list RTC-Web@alvestrand.no http://www.alvestrand.no/mailman/listinfo/rtc-web
David Singer Multimedia and Software Standards, Apple Inc.
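Kaufman's first requirement above (no media until a handshake relying on an out-of-band exchange has completed) can be sketched as a toy state machine. Everything here is invented for illustration; the real mechanism he names is STUN connectivity checks:

```javascript
// Toy model of the rule: media to a peer address is refused until a
// STUN-style connectivity check, carrying a password learned out of
// band (e.g. via the web server), has succeeded.
class MediaGate {
  constructor(outOfBandPassword) {
    this.password = outOfBandPassword; // exchanged outside the media path
    this.verified = false;
    this.sentPackets = [];
  }
  // The peer answers our connectivity check; only the correct
  // password opens the gate.
  onCheckResponse(password) {
    if (password === this.password) this.verified = true;
  }
  sendMedia(packet) {
    if (!this.verified) return false; // refuse: handshake not complete
    this.sentPackets.push(packet);
    return true;
  }
}
```

The point is the ordering, not the exact messages: sendMedia() must fail before the check succeeds, which is what stops a page from spraying UDP at arbitrary addresses.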

You do raise the good question as to whether or not peer-to-peer application-level multicast overlays are in scope or not.
I would consider it a victory if we could get unicast working. Multicast seems like a second priority, at best. --Richard
Matthew Kaufman
From: "Silvia Pfeiffer" <silviapfeiffer1@gmail.com> To: "Justin Uberti" <juberti@google.com> Cc: "Stefan Håkansson LK" <stefan.lk.hakansson@ericsson.com>, rtc-web@alvestrand.no, "David Singer" <singer@apple.com>, "Harald Alvestrand" <harald@alvestrand.no>
Sent: Sunday, October 10, 2010 5:44:02 PM Subject: Re: [RTW] List is now open
It seems to me that works nicely for 1-to-1 RTC. It wouldn't even require introduction of RTP/RTSP but can simply be pushing encoded a/v packets over the network as fast as possible on the given URL, which would then be a UDP URL?
I guess for multi-peer RTC a different approach on the network would be required or would it be a full mesh?
Silvia.
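On the full-mesh question: a mesh among n peers needs one link per pair, n(n-1)/2 in total, which is why it stops scaling quickly. A sketch of the arithmetic (the numbers and the star alternative are not from the thread):

```javascript
// Links needed for a full mesh of n peers: one per pair.
function fullMeshLinks(n) {
  return (n * (n - 1)) / 2;
}
// A star through a central mixer/relay needs only one link per peer.
function starLinks(n) {
  return n;
}
```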
On Mon, Oct 11, 2010 at 10:43 AM, Justin Uberti <juberti@google.com> wrote: My idea is that we will tie these various pieces together like a filter chain, i.e.
<device> -> [encoder] -> [transport] -> [decoder] -> <video>
where the connections between pieces are made by passing URLs around. (i.e. opening a device yields a URL for a stream, which is supplied to the encoder; at the other end, streams coming from the decoder are identified by a URL, which can then be passed directly to a <video> tag or WebGL texture.)
Other combinations are of course possible, such as direct access to [transport], in the case of a web real-time game, or combining <device> -> encoder -> websocket, for doing live (non-realtime) broadcasts.
We're still figuring out the right interfaces for encoder/decoder; for transport, hopefully the draft I proposed can serve as a reasonable starting point.
--justin
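Uberti's URL-passing idea can be mocked up with plain objects, each stage exporting a URL that the next stage consumes. The stage objects and the "stream:" scheme below are invented for illustration:

```javascript
// Each stage in the chain is identified by a URL; wiring the chain
// means handing one stage's output URL to the next stage's input.
let nextId = 0;
function makeStage(kind, inputUrl) {
  return { kind, inputUrl, outputUrl: "stream:" + kind + "/" + nextId++ };
}

const device = makeStage("device", null);               // camera/mic source
const encoder = makeStage("encoder", device.outputUrl);
const transport = makeStage("transport", encoder.outputUrl);
const decoder = makeStage("decoder", transport.outputUrl);
const video = makeStage("video", decoder.outputUrl);    // <video> sink

const chain = [device, encoder, transport, decoder, video];
```

Because every junction is a URL, any stage's output could equally be handed to a <video> tag, a WebGL texture, or a websocket, which is the flexibility Uberti is after.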
On Sun, Oct 10, 2010 at 4:26 PM, Stefan Håkansson LK <stefan.lk.hakansson@ericsson.com> wrote: That's right, a lot of things remain regarding protocols and other stuff. But IMHO <device>, StreamAPIs and <audio> and <video> should be part of the puzzle!
--Stefan
From: Silvia Pfeiffer [mailto:silviapfeiffer1@gmail.com] Sent: den 11 oktober 2010 01:23 To: Stefan Håkansson LK Cc: Harald Alvestrand; rtc-web@alvestrand.no; David Singer
Subject: Re: [RTW] List is now open
Hi Stefan,
I have seen those, thanks. That's actually the reason why I asked: because I have already seen it work with the <device> element and I wondered what the remaining challenges were. It seems there is lots of discussion about protocols and codecs.
Cheers, Silvia.
On Mon, Oct 11, 2010 at 9:49 AM, Stefan Håkansson LK <stefan.lk.hakansson@ericsson.com> wrote: Silvia,
you might be interested in some experimenting we've done with media streams and <device>: https://labs.ericsson.com/developer-community/blog/beyond-html5-conversation... (you can track back to earlier posts).
BR, Stefan
From: rtc-web-bounces@alvestrand.no [mailto:rtc-web-bounces@alvestrand.no] On Behalf Of Harald Alvestrand Sent: den 9 oktober 2010 10:12 To: Silvia Pfeiffer Cc: rtc-web@alvestrand.no; David Singer Subject: Re: [RTW] List is now open
On 10/09/10 03:17, Silvia Pfeiffer wrote: Thanks for the general invite!
I wonder: has the HTML5 device element been looked at (http://dev.w3.org/html5/html-device/) and what are the problems with that solution? We're the ones who have to look - and some on the list have been closely involved with writing the <device> spec. It would be surprising to me if they are not part of the solution - but just part.
As far as I know, it's still not clear how to tie a <device> to a media stream - given that media streams aren't defined yet, this is not very surprising :-)
Cheers, Silvia.
On Sat, Oct 9, 2010 at 5:28 PM, David Singer <singer@apple.com> wrote: Cool, thanks Harald
as I said during the day, I'd like to separate (as much as possible) "why is real-time communications on the internet hard?" (which is true, but a subject the IETF, the ITU, and others are also grappling with) from "what is interesting/challenging about real-time communications *in the web*?" -- which I take to mean in pages shown by a browser.
On Oct 8, 2010, at 4:24 , Harald Alvestrand wrote:
Cullen (I think) has changed the permissions on the list, so that now everyone can subscribe, and the archives are open.
If you know of people you think should be on the list, please ask them to subscribe!
The two ways to subscribe:
- Send "subscribe" to rtc-web-request@alvestrand.no, and do what the response says - Go to http://www.alvestrand.no/mailman/listinfo/rtc-web and follow instructions
Let the discussions begin!
Harald
David Singer Multimedia and Software Standards, Apple Inc.

I agree that it makes sense to be a filter chain where parts can be plugged together somewhat arbitrarily (though we probably want to disallow receiving a stream from somewhere and then sending it on to somewhere else without ever playing it back, to prevent banner ads from becoming third-party relays). But I disagree with passing URLs around as the mechanism. These are all objects that can be created through Javascript. Objects can be pointed at other objects (setInput, setOutput as an example of what you might name the APIs), even if the behind-the-scenes data flow is (by necessity of it being real-time constrained) not passing through Javascript. The only case where URLs might make sense, in my mind, is the actual device access ("camera:front"), but even there I'm not sure. Matthew Kaufman From: "Justin Uberti" <juberti@google.com> To: "Stefan Håkansson LK" <stefan.lk.hakansson@ericsson.com> Cc: "Silvia Pfeiffer" <silviapfeiffer1@gmail.com>, "Harald Alvestrand" <harald@alvestrand.no>, rtc-web@alvestrand.no, "David Singer" <singer@apple.com> Sent: Sunday, October 10, 2010 4:43:47 PM Subject: Re: [RTW] List is now open My idea is that we will tie these various pieces together like a filter chain, i.e. <device> -> [encoder] -> [transport] -> [decoder] -> <video> where the connections between pieces are made by passing URLs around. (i.e. opening a device yields a URL for a stream, which is supplied to the encoder; at the other end, streams coming from the decoder are identified by a URL, which can then be passed directly to a <video> tag or WebGL texture.) Other combinations are of course possible, such as direct access to [transport], in the case of a web real-time game, or combining <device> -> encoder -> websocket, for doing live (non-realtime) broadcasts. We're still figuring out the right interfaces for encoder/decoder; for transport, hopefully the draft I proposed can serve as a reasonable starting point. 
--justin On Sun, Oct 10, 2010 at 4:26 PM, Stefan Håkansson LK < stefan.lk.hakansson@ericsson.com > wrote: That's right, a lot of things remain regarding protocols and other stuff. But IMHO <device>, StreamAPIs and <audio> and <video> should be part of the puzzle! --Stefan From: Silvia Pfeiffer [mailto: silviapfeiffer1@gmail.com ] Sent: den 11 oktober 2010 01:23 To: Stefan Håkansson LK Cc: Harald Alvestrand; rtc-web@alvestrand.no ; David Singer Subject: Re: [RTW] List is now open Hi Stefan, I have seen those, thanks. That's actually the reason why I asked: because I have already seen it work with the <device> element and I wondered what the remaining challenges were. It seems there is lots of discussion about protocols and codecs. Cheers, Silvia. On Mon, Oct 11, 2010 at 9:49 AM, Stefan Håkansson LK < stefan.lk.hakansson@ericsson.com > wrote: Silvia, you might be interested in some experimenting we've done with media streams and <device>: https://labs.ericsson.com/developer-community/blog/beyond-html5-conversation... (you can track back to earlier posts). BR, Stefan From: rtc-web-bounces@alvestrand.no [mailto: rtc-web-bounces@alvestrand.no ] On Behalf Of Harald Alvestrand Sent: den 9 oktober 2010 10:12 To: Silvia Pfeiffer Cc: rtc-web@alvestrand.no ; David Singer Subject: Re: [RTW] List is now open On 10/09/10 03:17, Silvia Pfeiffer wrote: Thanks for the general invite! I wonder: has the HTML5 device element been looked at ( http://dev.w3.org/html5/html-device/ ) and what are the problems with that solution? We're the ones who have to look - and some on the list have been closely involved with writing the <device> spec. It would be surprising to me if they are not part of the solution - but just part. As far as I know, it's still not clear how to tie a <device> to a media stream - given that media streams aren't defined yet, this is not very surprising :-) Cheers, Silvia. 
On Sat, Oct 9, 2010 at 5:28 PM, David Singer < singer@apple.com > wrote: Cool, thanks Harald as I said during the day, I'd like to separate (as much as possible) "why is real-time communications on the internet hard?" (which is true, but a subject the IETF, the ITU, and others are also grappling with) from "what is interesting/challenging about real-time communications *in the web*?" -- which I take to mean in pages shown by a browser. On Oct 8, 2010, at 4:24 , Harald Alvestrand wrote:
Cullen (I think) has changed the permissions on the list, so that now everyone can subscribe, and the archives are open.
If you know of people you think should be on the list, please ask them to subscribe!
The two ways to subscribe:
- Send "subscribe" to rtc-web-request@alvestrand.no, and do what the response says - Go to http://www.alvestrand.no/mailman/listinfo/rtc-web and follow instructions
Let the discussions begin!
Harald
David Singer Multimedia and Software Standards, Apple Inc.
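Kaufman's counter-proposal, objects pointed at other objects rather than URLs passed around, might look like the sketch below. setInput/setOutput are the names he floats as examples; everything else is invented:

```javascript
// Object-graph wiring: each node holds a direct reference to the
// next, so the browser can keep the real-time data path entirely out
// of JavaScript while the script only describes the topology.
class Node {
  constructor(kind) {
    this.kind = kind;
    this.output = null;
  }
  setOutput(node) {
    this.output = node;
    return node; // return the target so chains read left to right
  }
  // Walk the graph to show the resulting pipeline.
  pipeline() {
    const kinds = [];
    for (let n = this; n !== null; n = n.output) kinds.push(n.kind);
    return kinds.join(" -> ");
  }
}

const device = new Node("device");
device.setOutput(new Node("encoder"))
      .setOutput(new Node("transport"))
      .setOutput(new Node("decoder"))
      .setOutput(new Node("video"));
```

The design difference from URL-passing is that an object reference cannot be forged or leaked as a string, which lines up with Kaufman's worry about pages relaying streams they should not have.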

On Sun, 10 Oct 2010, Justin Uberti wrote:
My idea is that we will tie these various pieces together like a filter chain, i.e.
<device> -> [encoder] -> [transport] -> [decoder] -> <video>
This is basically what's in the HTML spec today:

On the sending side:

<device> is <device> [encoder] is a Stream object [transport] is a PeerConnection object

On the receiving side:

[transport] is a PeerConnection object [decoder] is a Stream object's .url attribute <video> is a <video>

The details are abstracted out a lot at this point; the idea is that future versions of the API can add specific knobs and dials to allow authors to control what they want to change; we can decide what to expose based on what people most want. This dramatically limits the complexity of what we have to implement, and makes it more likely we'll get interop sooner. However, I need to do some work on this spec to merge in the ideas from some of Justin's work, so this may change a bit. What will certainly change is the way the low-bandwidth channel maintained by the script is interfaced with on the PeerConnection object; I also need to move some of the Session stuff into the spec, e.g. sending of DTMF tones.

Work still has to happen to define how information is conveyed to the browser, in particular, what format the browser should expect server configuration to be in. We also need a spec defining how browsers use ICE/STUN/etc, e.g. how they identify which PeerConnection object an incoming connection is related to -- imagine the situation of a script trying to open two simultaneous connections between the same two browsers.

One thing that might make sense is for me to write a skeleton of what I imagine the "next layer" would look like (the spec layer that defines the format of the data that the browsers use to talk over the low-latency channel, defines how ICE/STUN/relays/etc are used, defines what the relays are, defines how to do video codec negotiation, defines how to identify which packets go with which PeerConnection, defines how to respond to specific API calls in terms of data on the wire, etc), and to then hand that off to someone who actually knows how that is all to be defined.
Would that make sense? -- Ian Hickson U+1047E )\._.,--....,'``. fL http://ln.hixie.ch/ U+263A /, _.. \ _\ ;`._ ,. Things that are impossible just take longer. `._.-(,_..'--(,_..'`-.;.'
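Hickson's "which PeerConnection does this incoming connection belong to" question is at bottom a demultiplexing problem. One conceivable shape (the tag field and all names are invented, not from any spec) is a per-connection identifier carried in the handshake:

```javascript
// Toy demultiplexer: each PeerConnection registers a locally unique
// tag, and incoming handshake packets carry the tag of the connection
// they are meant for. That keeps two simultaneous connections between
// the same pair of browsers distinct.
const connectionsByTag = new Map();

function registerConnection(tag, conn) {
  if (connectionsByTag.has(tag)) throw new Error("tag collision");
  connectionsByTag.set(tag, conn);
}

function routeIncoming(packet) {
  return connectionsByTag.get(packet.tag) || null; // null: unknown, drop
}

// A script opening two calls to the same remote browser:
registerConnection("call-a", { name: "first call" });
registerConnection("call-b", { name: "second call" });
```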

On Mon, Oct 11, 2010 at 12:11 PM, Ian Hickson <ian@hixie.ch> wrote:
On Sun, 10 Oct 2010, Justin Uberti wrote:
My idea is that we will tie these various pieces together like a filter chain, i.e.
<device> -> [encoder] -> [transport] -> [decoder] -> <video>
This is basically what's in the HTML spec today:
On the sending side:
<device> is <device> [encoder] is a Stream object [transport] is a PeerConnection object
On the receiving side:
[transport] is a PeerConnection object [decoder] is a Stream object's .url attribute <video> is a <video>
The details are abstracted out a lot at this point; the idea is that future versions of the API can add specific knobs and dials to allow authors to control what they want to change; we can decide what to expose based on what people most want. This dramatically limits the complexity of what we have to implement, and makes it more likely we'll get interop sooner. However, I need to do some work on this spec to merge in the ideas from some of Justin's work, so this may change a bit. What will certainly change is the way the low-bandwidth channel maintained by the script is interfaced with on the PeerConnection object; I also need to move some of the Session stuff into the spec, e.g. sending of DTMF tones.
Work still has to happen to define how information is conveyed to the browser, in particular, what format the browser should expect server configuration to be in. We also need a spec defining how browsers use ICE/STUN/etc, e.g. how they identify which PeerConnection object an incoming connection is related to -- imagine the situation of a script trying to open two simultaneous connections between the same two browsers.
One thing that might make sense is for me to write a skeleton of what I imagine the "next layer" would look like (the spec layer that defines the format of the data that the browsers use to talk over the low-latency channel, defines how ICE/STUN/relays/etc are used, defines what the relays are, defines how to do video codec negotiation, defines how to identify which packets go with which PeerConnection, defines how to respond to specific API calls in terms of data on the wire, etc), and to then hand that off to someone who actually knows how that is all to be defined. Would that make sense?
Sure. As you mention, our previous discussions on the Session API are probably a good start for this. I'd be happy to work with you on the details.
-- Ian Hickson U+1047E )\._.,--....,'``. fL http://ln.hixie.ch/ U+263A /, _.. \ _\ ;`._ ,. Things that are impossible just take longer. `._.-(,_..'--(,_..'`-.;.'

On Wed, 13 Oct 2010, Justin Uberti wrote:
One thing that might make sense is for me to write a skeleton of what I imagine the "next layer" would look like (the spec layer that defines the format of the data that the browsers use to talk over the low-latency channel, defines how ICE/STUN/relays/etc are used, defines what the relays are, defines how to do video codec negotiation, defines how to identify which packets go with which PeerConnection, defines how to respond to specific API calls in terms of data on the wire, etc), and to then hand that off to someone who actually knows how that is all to be defined. Would that make sense?
Sure. As you mention, our previous discussions on the Session API are probably a good start for this. I'd be happy to work with you on the details.
Ok, here's a very early first draft at such a skeleton: http://hixie.ch/specs/rtc-skeleton/ The bits marked in asterisks *like this* are the bits that I would then use in the HTML spec to glue the two specs together. (I haven't yet updated the HTML spec to use these terms, but it won't take long.) -- Ian Hickson U+1047E )\._.,--....,'``. fL http://ln.hixie.ch/ U+263A /, _.. \ _\ ;`._ ,. Things that are impossible just take longer. `._.-(,_..'--(,_..'`-.;.'

On Mon, 11 Oct 2010, Ian Hickson wrote:
On Sun, 10 Oct 2010, Justin Uberti wrote:
My idea is that we will tie these various pieces together like a filter chain, i.e.
<device> -> [encoder] -> [transport] -> [decoder] -> <video>
This is basically what's in the HTML spec today:
On the sending side:
<device> is <device> [encoder] is a Stream object [transport] is a PeerConnection object
On the receiving side:
[transport] is a PeerConnection object [decoder] is a Stream object's .url attribute <video> is a <video>
We see Stream simply as the glue between the device element and the transport endpoint, and between the remote transport endpoint and the video element. Thus encoding and decoding is handled by the ConnectionPeer object. <device> ==> [encoder] -> [transport] -> [decoder] ==> <video> On the sending side: <device> is <device> ==> is a Stream object [encoder] -> [transport] is a ConnectionPeer object On the receiving side: [transport] -> [decoder] is a ConnectionPeer object ==> is a Stream object <video> is a <video> In the local self-view case there's no encoding or decoding. <device> ==> <video> In the recording case, the encoding is handled by the StreamRecorder. <device> ==> [recorder] -> [encoder] -> [file] <device> is <device> ==> is a Stream object [recorder] -> [encoder] is a StreamRecorder [file] is a File
The details are abstracted out a lot at this point; the idea is that future versions of the API can add specific knobs and dials to allow authors to control what they want to change; we can decide what to expose based on what people most want. This dramatically limits the complexity of what we have to implement, and makes it more likely we'll get interop sooner. However, I need to do some work on this spec to merge in the ideas from some of Justin's work, so this may change a bit. What will certainly change is the way the low-bandwidth channel maintained by the script is interfaced with on the PeerConnection object; I also need to move some of the Session stuff into the spec, e.g. sending of DTMF tones.
Rather than specific functionality for sending DTMF tones, it would be more useful to have a generic way to include pre-recorded media in a Stream. BR Adam
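Bergkvist's point, that a generic "source a Stream from canned media" facility subsumes a DTMF-only API, is easy to see with DTMF itself: a tone is just a short pre-computed clip. The frequencies below are the standard DTMF pair for digit "1"; the function is an invented sketch:

```javascript
// DTMF digit "1" is the sum of a 697 Hz and a 1209 Hz sine. Any API
// that can inject a sample buffer into a Stream can send it; no
// tone-specific API surface is needed.
function dtmfClip(lowHz, highHz, seconds, sampleRate) {
  const n = Math.round(seconds * sampleRate);
  const samples = new Float64Array(n);
  for (let i = 0; i < n; i++) {
    const t = i / sampleRate;
    samples[i] = 0.5 * Math.sin(2 * Math.PI * lowHz * t)
               + 0.5 * Math.sin(2 * Math.PI * highHz * t);
  }
  return samples;
}

const digitOne = dtmfClip(697, 1209, 0.2, 8000); // 200 ms at 8 kHz
```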

Another advantage of denoting Stream as something that is uncoded is that it makes it simpler to add audio processing. You could have something like <device> ==> [encoder] -> [transport] -> [decoder] ==> [processing] ==> <audio> where processing could be something based on for example the rendered scene in a game (e.g. panning). --Stefan -----Original Message----- From: Adam Bergkvist Sent: den 25 oktober 2010 14:29 To: Ian Hickson; Justin Uberti Cc: Stefan Håkansson LK; Silvia Pfeiffer; Harald Alvestrand; rtc-web@alvestrand.no; David Singer Subject: RE: [RTW] List is now open On Mon, 11 Oct 2010, Ian Hickson wrote:
On Sun, 10 Oct 2010, Justin Uberti wrote:
My idea is that we will tie these various pieces together like a filter chain, i.e.
<device> -> [encoder] -> [transport] -> [decoder] -> <video>
This is basically what's in the HTML spec today:
On the sending side:
<device> is <device> [encoder] is a Stream object [transport] is a PeerConnection object
On the receiving side:
[transport] is a PeerConnection object [decoder] is a Stream object's .url attribute <video> is a <video>
We see Stream simply as the glue between the device element and the transport endpoint, and between the remote transport endpoint and the video element. Thus encoding and decoding is handled by the ConnectionPeer object. <device> ==> [encoder] -> [transport] -> [decoder] ==> <video> On the sending side: <device> is <device> ==> is a Stream object [encoder] -> [transport] is a ConnectionPeer object On the receiving side: [transport] -> [decoder] is a ConnectionPeer object ==> is a Stream object <video> is a <video> In the local self-view case there's no encoding or decoding. <device> ==> <video> In the recording case, the encoding is handled by the StreamRecorder. <device> ==> [recorder] -> [encoder] -> [file] <device> is <device> ==> is a Stream object [recorder] -> [encoder] is a StreamRecorder [file] is a File
The details are abstracted out a lot at this point; the idea is that future versions of the API can add specific knobs and dials to allow authors to control what they want to change; we can decide what to expose based on what people most want. This dramatically limits the complexity of what we have to implement, and makes it more likely we'll get interop sooner. However, I need to do some work on this spec to merge in the ideas from some of Justin's work, so this may change a bit. What will certainly change is the way the low-bandwidth channel maintained by the script is interfaced with on the PeerConnection object; I also need to move some of the Session stuff into the spec, e.g. sending of DTMF tones.
Rather than specific functionality for sending DTMF tones, it would be more useful to have a generic way to include pre-recorded media in a Stream. BR Adam
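Håkansson's post-decoder [processing] stage for games could be as small as a pan law. Here is an equal-power pan sketch; the function and its -1..+1 position convention are invented for illustration:

```javascript
// Equal-power stereo pan: pos = -1 (hard left), 0 (center), +1 (hard
// right). A game would derive pos from where the remote speaker sits
// in the rendered scene.
function pan(sample, pos) {
  const angle = (pos + 1) * Math.PI / 4; // maps -1..+1 to 0..pi/2
  return [sample * Math.cos(angle), sample * Math.sin(angle)];
}
```

"Equal-power" means the left/right gains are cos/sin of the same angle, so perceived loudness stays roughly constant as the source moves across the stereo field.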

Changing subject lines.... one thing to remember in the video case is that there's no "uncoded" format - in limited-hardware cases like phones, it's extremely important to only perform the transformations one absolutely has to, so it's "figure out which format that is supported by the camera and can be easily displayed on recipient's screen with the most lightweight conversion"; the figuring-out is probably out of scope for standardization, but the enumeration and choosing of possible formats for each element is probably critical. On 10/25/10 15:18, Stefan Håkansson LK wrote:
Another advantage of denoting Stream as something that is uncoded is that it makes it simpler to add audio processing.
You could have something like
<device> ==> [encoder] -> [transport] -> [decoder] ==> [processing] ==> <audio>
where processing could be something based on for example the rendered scene in a game (e.g. panning).
--Stefan
-----Original Message----- From: Adam Bergkvist Sent: den 25 oktober 2010 14:29 To: Ian Hickson; Justin Uberti Cc: Stefan Håkansson LK; Silvia Pfeiffer; Harald Alvestrand; rtc-web@alvestrand.no; David Singer Subject: RE: [RTW] List is now open
On Mon, 11 Oct 2010, Ian Hickson wrote:
On Sun, 10 Oct 2010, Justin Uberti wrote:
My idea is that we will tie these various pieces together like a filter chain, i.e.
<device> -> [encoder] -> [transport] -> [decoder] -> <video>
This is basically what's in the HTML spec today:
On the sending side:
<device> is <device> [encoder] is a Stream object [transport] is a PeerConnection object
On the receiving side:
[transport] is a PeerConnection object [decoder] is a Stream object's .url attribute <video> is a <video>
We see Stream simply as the glue between the device element and the transport endpoint, and between the remote transport endpoint and the video element. Thus encoding and decoding is handled by the ConnectionPeer object.
<device> ==> [encoder] -> [transport] -> [decoder] ==> <video>
On the sending side:
<device> is <device> ==> is a Stream object [encoder] -> [transport] is a ConnectionPeer object
On the receiving side:
[transport] -> [decoder] is a ConnectionPeer object ==> is a Stream object <video> is a <video>
In the local self-view case there's no encoding or decoding.
<device> ==> <video>
In the recording case, the encoding is handled by the StreamRecorder.
<device> ==> [recorder] -> [encoder] -> [file]
<device> is <device> ==> is a Stream object [recorder] -> [encoder] is a StreamRecorder [file] is a File
The details are abstracted out a lot at this point; the idea is that future versions of the API can add specific knobs and dials to allow authors to control what they want to change; we can decide what to expose based on what people most want. This dramatically limits the complexity of what we have to implement, and makes it more likely we'll get interop sooner. However, I need to do some work on this spec to merge in the ideas from some of Justin's work, so this may change a bit. What will certainly change is the way the low-bandwidth channel maintained by the script is interfaced with on the PeerConnection object; I also need to move some of the Session stuff into the spec, e.g. sending of DTMF tones.
Rather than specific functionality for sending DTMF tones, it would be more useful to have a generic way to include pre-recorded media in a Stream.
BR Adam
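Alvestrand's "most lightweight conversion" point amounts to intersecting the two ends' capability lists and ranking the survivors. A toy negotiation (the format names and cost table are invented assumptions, not anything from the thread):

```javascript
// Pick a format both ends support, preferring the one with the
// lowest assumed conversion cost (lower = lighter-weight on the CPU).
function negotiate(cameraFormats, displayFormats, cost) {
  const common = cameraFormats.filter(f => displayFormats.includes(f));
  common.sort((a, b) => (cost[a] ?? Infinity) - (cost[b] ?? Infinity));
  return common.length > 0 ? common[0] : null;
}

const cost = { "yuv420": 1, "rgb24": 2, "mjpeg": 5 };
const chosen = negotiate(["mjpeg", "yuv420"], ["yuv420", "rgb24"], cost);
```

As Alvestrand says, how a browser weighs the costs is probably out of scope for standardization; what would need standardizing is the enumeration and exchange of the candidate lists.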

Good move to change subject lines! I fully agree that no unnecessary conversions should be carried out (one reason for my earlier input that supported media formats should be exchanged). There is however one more factor for "phones": often the network capacity is limited as well, which calls for efficient video compression (which these devices usually have HW acceleration support for). --Stefan -----Original Message----- From: rtc-web-bounces@alvestrand.no [mailto:rtc-web-bounces@alvestrand.no] On Behalf Of Harald Alvestrand Sent: den 25 oktober 2010 15:24 To: Stefan Håkansson LK Cc: Silvia Pfeiffer; Justin Uberti; rtc-web@alvestrand.no; Adam Bergkvist; Ian Hickson; David Singer Subject: [RTW] Architecture of module linkages (Re: List is now open) Changing subject lines.... one thing to remember in the video case is that there's no "uncoded" format - in limited-hardware cases like phones, it's extremely important to only perform the transformations one absolutely has to, so it's "figure out which format that is supported by the camera and can be easily displayed on recipient's screen with the most lightweight conversion"; the figuring-out is probably out of scope for standardization, but the enumeration and choosing of possible formats for each element is probably critical. On 10/25/10 15:18, Stefan Håkansson LK wrote:
Another advantage of denoting Stream as something that is uncoded is that it makes it simpler to add audio processing.
You could have something like
<device> ==> [encoder] -> [transport] -> [decoder] ==> [processing] ==> <audio>
where processing could be something based on for example the rendered scene in a game (e.g. panning).
--Stefan
-----Original Message----- From: Adam Bergkvist Sent: den 25 oktober 2010 14:29 To: Ian Hickson; Justin Uberti Cc: Stefan Håkansson LK; Silvia Pfeiffer; Harald Alvestrand; rtc-web@alvestrand.no; David Singer Subject: RE: [RTW] List is now open
On Mon, 11 Oct 2010, Ian Hickson wrote:
On Sun, 10 Oct 2010, Justin Uberti wrote:
My idea is that we will tie these various pieces together like a filter chain, i.e.
<device> -> [encoder] -> [transport] -> [decoder] -> <video>
This is basically what's in the HTML spec today:
On the sending side:
<device> is <device> [encoder] is a Stream object [transport] is a PeerConnection object
On the receiving side:
[transport] is a PeerConnection object [decoder] is a Stream object's .url attribute <video> is a <video>
We see Stream simply as the glue between the device element and the transport endpoint, and between the remote transport endpoint and the video element. Thus encoding and decoding is handled by the ConnectionPeer object.
<device> ==> [encoder] -> [transport] -> [decoder] ==> <video>
On the sending side:
<device> is <device> ==> is a Stream object [encoder] -> [transport] is a ConnectionPeer object
On the receiving side:
[transport] -> [decoder] is a ConnectionPeer object ==> is a Stream object <video> is a <video>
In the local self-view case there's no encoding or decoding.
<device> ==> <video>
In the recording case, the encoding is handled by the StreamRecorder.
<device> ==> [recorder] -> [encoder] -> [file]
<device> is <device> ==> is a Stream object [recorder] -> [encoder] is a StreamRecorder [file] is a File
The details are abstracted out a lot at this point; the idea is that future versions of the API can add specific knobs and dials to allow authors to control what they want to change; we can decide what to expose based on what people most want. This dramatically limits the complexity of what we have to implement, and makes it more likely we'll get interop sooner. However, I need to do some work on this spec to merge in the ideas from some of Justin's work, so this may change a bit. What will certainly change is the way the low-bandwidth channel maintained by the script is interfaced with on the PeerConnection object; I also need to move some of the Session stuff into the spec, e.g. sending of DTMF tones.
Rather than specific functionality for sending DTMF tones, it would be more useful to have a generic way to include pre-recorded media in a Stream.
BR Adam

On Mon, 25 Oct 2010, Adam Bergkvist wrote:
Rather than specific functionality for sending DTMF tones, it would be more useful to have a generic way to include pre-recorded media in a Stream.
That's not a bad idea. Basically being able to source a Stream from a specific media resource rather than from a <device>. -- Ian Hickson U+1047E )\._.,--....,'``. fL http://ln.hixie.ch/ U+263A /, _.. \ _\ ;`._ ,. Things that are impossible just take longer. `._.-(,_..'--(,_..'`-.;.'

On 10/10/2010 4:26 PM, Stefan Håkansson LK wrote:
That's right, a lot of things remain regarding protocols and other stuff. But IMHO <device>, StreamAPIs and <audio> and <video> should be part of the puzzle!
At Mozilla we've experimented with taking the input from a webcam for video recording and reflecting it to a canvas for display. This makes it easy to manipulate from script for building applications, taking screenshots, etc. --Chris

On 10/9/2010 12:17 AM, Silvia Pfeiffer wrote:
I wonder: has the HTML5 device element been looked at (http://dev.w3.org/html5/html-device/) and what are the problems with that solution?
The device element is only a small part of the picture (at least so far.) Most of the people in the room were protocol & codec folks so we spent most of our time talking about the underlying elements. --Chris

On Oct 10, 2010, at 5:04 , Christopher Blizzard wrote:
The device element is only a small part of the picture (at least so far.) Most of the people in the room were protocol & codec folks so we spent most of our time talking about the underlying elements.
Speaking completely off the top of my head, I imagine something a little higher level than device.

At the workshop we discussed ever so briefly using the 'video' element for the display of the remote end. You'd need to give it a suitable URL that identifies a protocol and address from which to get the a/v, of course. I wondered (and still do) if we can split 'discovery' off. This might be multi-level:

Address-book-like: "I need to talk to Chris Blizzard" --> Chris has the connectivity phoneto:14155551212, wondrous:snowblizzard@example.com, awesome:magic@excellentphone.org

Discovery: my UA knows about the wondrous phone system, and it says "find me snowblizzard@example.com" --> he has the address sip:192.168.34.45

so now I know how to set up a SIP/RTP call using IP addresses. This is something I pass to the video element.

Similarly, I wonder about a "capture" element which can capture audio and video and reflect them on to the browser display. I guess they have a lot of attributes/DOM interfaces to set things like the bitrate, screen size to send, screen size to show locally, and so on. I pass the same sip:192.168.34.45 to the "capture" element, and set the right attributes, and it provides the other direction.

I'd like to think we can make the system even more modular. Certainly we might like to see all the non-real-time stuff mediated through scripts and so on. We need to remember that we have the local UA at each end, the sites that served the 'integration' web page for each end, and the servers that provide the back-ends for the discovery protocols and possibly the real-time communications, though for the last we all seem to prefer that it be *possible* to talk end-to-end directly, as intermediate servers add delay (probably). But the details of how protocols work is, of course, a matter for those protocols; at the UA/browser level, we're identifying protocols by URL type.

Now, we might want to recommend/mandate certain protocol(s), and, within them, certain codecs, encryption, and so on, to provide a baseline of interoperability. We don't have the luxury here that the video and audio elements have, where a clear common use case is using HTTP to load a file to play.

David Singer
Multimedia and Software Standards, Apple Inc.
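The two-level discovery David describes can be sketched as a toy lookup. The schemes and addresses are taken from his example; everything else (the table shapes, function names, the pretend directory) is invented for illustration:

```javascript
// Level 1: address-book entries mapping a person to connectivity URIs.
const addressBook = {
  'Chris Blizzard': [
    'phoneto:14155551212',
    'wondrous:snowblizzard@example.com',
    'awesome:magic@excellentphone.org',
  ],
};

// Level 2: the UA only understands some phone systems; each resolver
// turns an identity into a transport-level address that could then be
// handed to the video element.
const resolvers = {
  wondrous: (id) => 'sip:192.168.34.45',  // pretend directory lookup
};

function resolve(name) {
  for (const uri of (addressBook[name] || [])) {
    const scheme = uri.slice(0, uri.indexOf(':'));
    if (resolvers[scheme]) {
      return resolvers[scheme](uri.slice(scheme.length + 1));
    }
  }
  return null;  // no scheme this UA understands
}

console.log(resolve('Chris Blizzard'));   // → sip:192.168.34.45
```

Note how the split keeps the UA ignorant of schemes it doesn't support (phoneto: is skipped here), which is exactly the modularity argument: discovery protocols come and go without changing the video/capture plumbing.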

On Sun, Oct 10, 2010 at 11:45 AM, David Singer <singer@apple.com> wrote:
On Oct 10, 2010, at 5:04 , Christopher Blizzard wrote:
The device element is only a small part of the picture (at least so far.)
Most of the people in the room were protocol & codec folks so we spent most of our time talking about the underlying elements.
Speaking completely off the top of my head, I imagine something a little higher level than device.
At the workshop we discussed ever so briefly using the 'video' element for the display of the remote end. You'd need to give it a suitable URL that identifies a protocol and address from which to get the a/v, of course. I wondered (and still do) if we can split 'discovery' off. This might be multi-level:
Address-book-like: "I need to talk to Chris Blizzard" --> Chris has the connectivity phoneto:14155551212, wondrous:snowblizzard@example.com, awesome:magic@excellentphone.org
Discovery: my UA knows about the wondrous phone system, and it says "find me snowblizzard@example.com" --> he has the address sip:192.168.34.45
so now I know how to set up a SIP/RTP call using IP addresses. This is something I pass to the video element.
Similarly, I wonder about a "capture" element which can capture audio and video and reflect them on to the browser display. I guess they have a lot of attributes/DOM interfaces to set things like the bitrate, screen size to send, screen size to show locally, and so on. I pass the same sip:192.168.34.45 to the "capture" element, and set the right attributes, and it provides the other direction.
I'd like to think we can make the system even more modular. Certainly we might like to see all the non-real-time stuff mediated through scripts and so on. We need to remember that we have the local UA at each end, the sites that served the 'integration' web page for each end, and the servers that provide the back-ends for the discovery protocols and possibly the real-time communications, though for the last we all seem to prefer that it be *possible* to talk end-to-end directly, as intermediate servers add delay (probably). But the details of how protocols work is, of course, a matter for those protocols; at the UA/browser level, we're identifying protocols by URL type.
Now, we might want to recommend/mandate certain protocol(s), and, within them, certain codecs, encryption, and so on, to provide a baseline of interoperability. We don't have the luxury here that the video and audio elements have, where a clear common use case is using HTTP to load a file to play.
Dave, I wonder: has Apple's HTTP live streaming been used for RTC at all? even been considered? Cheers, Silvia.

On Oct 10, 2010, at 12:19 , Silvia Pfeiffer wrote:
Dave, I wonder: has Apple's HTTP live streaming been used for RTC at all? even been considered?
wow, I wonder what I said that inspired that question! The answer is no, as the streaming solution is optimized in almost completely the wrong direction - it cheerfully introduces latency in order to meet other needs.

David Singer
Multimedia and Software Standards, Apple Inc.

On Sun, Oct 10, 2010 at 7:27 PM, David Singer <singer@apple.com> wrote:
On Oct 10, 2010, at 12:19 , Silvia Pfeiffer wrote:
Dave, I wonder: has Apple's HTTP live streaming been used for RTC at all?
even been considered?
wow, I wonder what I said that inspired that question! The answer is no, as the streaming solution is optimized in almost completely the wrong direction - it cheerfully introduces latency in order to meet other needs.
Nothing at all. I was just wondering. I didn't think it was suitable, but wanted to check. :-) It is, however, suitable for streaming live events, just not for any live interaction, I would think. Cheers, Silvia.

On 10/9/2010 5:45 PM, David Singer wrote:
--> Chris has the connectivity phoneto:14155551212, wondrous:snowblizzard@example.com, awesome:magic@excellentphone.org
I am a fan of these schemes. Just saying. --Chris
participants (10)

- Adam Bergkvist
- Christopher Blizzard
- David Singer
- Harald Alvestrand
- Ian Hickson
- Justin Uberti
- Matthew Kaufman
- Richard L. Barnes
- Silvia Pfeiffer
- Stefan Håkansson LK