
Hi Randell,

Just to make sure I understand your "bonus" reduction: it is because the "fishy" pattern confirms a congestion loss rather than a possibly stochastic loss, right? Or is there another reason to apply the bonus that I missed, like some sort of mild-congestion vs. severe-congestion inference (that can't be obtained from the loss signal alone)?

Thanks,
Mo

-----Original Message-----
From: rtp-congestion-bounces@alvestrand.no [mailto:rtp-congestion-bounces@alvestrand.no] On Behalf Of Randell Jesup
Sent: Wednesday, August 08, 2012 1:46 AM
To: rtp-congestion@alvestrand.no
Subject: Re: [R-C] RRTCC issues: loss, decrease rate

On 8/8/2012 1:04 AM, Mo Zanaty (mzanaty) wrote:
> In the case of loss due to congestion (a full queue or AQM action), the loss itself seems like the right signal to process. Why wait to infer congestion from the subsequent delay pattern, which can be speculative and unreliable, rather than acting on the loss itself?
I totally agree: one should always assume loss is some type of congestion (though very low levels of loss might be ignored). This is an area where the currently proposed algorithm can be improved.
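For concreteness, here is a minimal sketch of a loss-based update along those lines. The 2%/10% thresholds mirror the loss-based controller in the RRTCC draft as I read it; the function name, the exact update formulas, and the 5% probe step are simplifying assumptions for illustration, not the draft's normative text.

#include <stdint.h>

#define LOW_LOSS_THRESHOLD  0.02  /* below this, treat loss as noise */
#define HIGH_LOSS_THRESHOLD 0.10  /* above this, assume congestion */

/* Update the sender-side rate estimate (bps) from the RTCP loss fraction. */
static uint32_t update_rate_on_loss(uint32_t rate_bps, double loss_fraction)
{
    if (loss_fraction > HIGH_LOSS_THRESHOLD) {
        /* Congestion: multiplicative decrease, scaled by how bad the loss is. */
        return (uint32_t)(rate_bps * (1.0 - 0.5 * loss_fraction));
    } else if (loss_fraction < LOW_LOSS_THRESHOLD) {
        /* Very low loss is ignored as noise; keep probing gently upward. */
        return (uint32_t)(rate_bps * 1.05);
    }
    /* Between the thresholds: hold the rate steady. */
    return rate_bps;
}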
> If the goal is to distinguish congestion loss from stochastic loss, that is a general problem that probably needs more thought than the RRTCC 3-sigma outlier filter, the Kalman filter (which is designed to filter stochastic jitter, not losses), or Randell's "fishy" filter. There should be ample research available on this topic from many years of work on TCP over wireless links.
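As a strawman for that classification problem, here is a hedged sketch of a 3-sigma style outlier test on delay samples: a loss that coincides with delay well above the running mean looks like queue build-up (congestion), while a loss with unremarkable delay may be stochastic. The EWMA constant, the one-sided test, and the omitted warm-up handling are all assumptions, not the RRTCC filter itself.

#include <math.h>
#include <stdbool.h>

typedef struct {
    double mean;  /* EWMA of delay samples (ms) */
    double var;   /* EWMA of squared deviation from the mean */
} DelayStats;

/* Returns true if this delay sample is a >3-sigma outlier on the high
 * side, i.e. the queue looks like it was building when the loss hit.
 * Warm-up (var == 0 on the first samples) is deliberately ignored here. */
static bool delay_is_outlier(DelayStats *s, double delay_ms)
{
    const double alpha = 0.05;               /* smoothing factor */
    double dev = delay_ms - s->mean;
    bool outlier = dev > 3.0 * sqrt(s->var); /* one-sided 3-sigma test */
    s->mean += alpha * dev;
    s->var  += alpha * (dev * dev - s->var);
    return outlier;
}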
Agreed. The way I used it was to give a "bonus" reduction in bandwidth if the losses appeared 'fishy'. Per the earlier emails, this would mostly happen on otherwise-mostly-idle access links, or maybe during bursts of cross-traffic.

--
Randell Jesup
randell-ietf@jesup.org
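To make the "bonus" reduction concrete, a speculative sketch of its shape follows. The exact rule is not spelled out in the thread, so the BONUS_FACTOR value and the trigger condition here are guesses for illustration only; the "fishy" signal could be something like the delay_is_outlier() test sketched above.

#include <stdbool.h>
#include <stdint.h>

#define BONUS_FACTOR 0.9  /* assumed extra 10% cut on confirmed congestion */

/* On a loss report, apply an extra multiplicative reduction beyond the
 * normal loss response when the delay pattern also looked fishy, i.e.
 * the loss is very likely congestion rather than stochastic. */
static uint32_t apply_fishy_bonus(uint32_t rate_bps, bool loss_seen,
                                  bool delay_looks_fishy)
{
    if (loss_seen && delay_looks_fishy)
        return (uint32_t)(rate_bps * BONUS_FACTOR);
    return rate_bps;
}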