The authors discuss the idealistic case of knowing which sources deserve
more bandwidth than others in the allocation-per-source scheme of fairness.
This is idealistic only because no one has figured out how to determine that.
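To underline the point, here is a minimal sketch (the function name and shape are mine, not the paper's): if we somehow knew how much each source "deserved", dividing the link would be trivial; the idealism is entirely in obtaining the weights, not in using them.

```python
def allocate(capacity, weights):
    """Hypothetical per-source allocation.

    Given link capacity and a weight per source (the piece of
    information nobody knows how to obtain), splitting bandwidth
    is just a proportional share. All the difficulty hides in
    producing the `weights` dict honestly.
    """
    total = sum(weights.values())
    return {src: capacity * w / total for src, w in weights.items()}
```

For example, `allocate(100, {"a": 3, "b": 1})` splits the link 75/25, and a malicious source only needs to misreport its weight to claim more.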
Sometimes the most direct answer is the best. (Not that I am advocating
this: for every piece of information built into the scheme, there are just
as many ways of deviating from it that malicious users could exploit.)
I don't believe that allocation per source, per destination, or per process
yields fruit. In all of these cases the allocation, and thus the flow
control responsibility, rests on the ENDs. The analogy is the one used in
class where the car drives down the wrong lane to merge into the exit at
the last minute.
This is nothing more than another example of human nature and the desire
to resolve problems in one's own favor. The only "fair" form of control is
one handled someplace other than the ENDs.
The discussion of bit-by-bit round robin (BR) is also idealistic and
impractical. The authors describe the behavior of the packetized round
robin and suggest that it asymptotically approaches BR eventually. The
curious thing here is the "eventually" part. The scheme is described as
having a pre-emptive transmission capability that favors small packets
(smaller F_α). Smaller packets usually imply shorter total conversations,
so does the "sufficiently long conversation" assumption behind the
asymptotic behavior still apply?
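The small-packet preference the question turns on can be made concrete. A minimal sketch of the packetized scheme, under my own simplifications (in particular, the round number R(t) is advanced crudely rather than at the paper's rate of 1/N_active): each arriving packet is stamped with a finish number F = max(last finish for its flow, current round) + packet length, and the packet with the smallest F is sent next. A short packet gets a small F, so it departs ahead of a long one that arrived earlier.

```python
import heapq

class FairQueueSketch:
    """Sketch of packetized fair queueing (class and field names are
    mine, not the paper's).

    Packets are ordered by finish number F; smaller packets receive
    smaller F, which is the small-packet preference discussed above.
    """

    def __init__(self):
        self.round = 0         # crude stand-in for the BR round number R(t)
        self.last_finish = {}  # per-flow finish number of the last packet
        self.heap = []         # (finish, arrival_seq, flow, length)
        self.seq = 0           # arrival order, used only as a tie-breaker

    def arrive(self, flow, length):
        # F = max(previous finish for this flow, round number) + length
        start = max(self.last_finish.get(flow, 0), self.round)
        finish = start + length
        self.last_finish[flow] = finish
        heapq.heappush(self.heap, (finish, self.seq, flow, length))
        self.seq += 1
        return finish

    def send(self):
        # Transmit the packet with the smallest finish number.
        finish, _, flow, length = heapq.heappop(self.heap)
        # Simplification: jump the round number to the departed packet's
        # finish; the real algorithm advances R(t) continuously.
        self.round = finish
        return flow, length
```

With a 1000-unit packet queued for flow A and a 100-unit packet for flow B, `send()` returns B's packet first despite A arriving earlier, which is exactly why one might doubt that "sufficiently long conversation" asymptotics say much about the short conversations this preference rewards.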
The trouble with doing something right the first time is that no one
appreciates how difficult it was.