Well, bandwidth doesn't get allocated so much as used, and DCF is used to determine whether a host can get dedicated access to a shared medium (in this case the channel) long enough to send its packet. We often use the term 'bandwidth' incorrectly: bandwidth is really a constant determined by the network media. What you are really interested in is transfer rate, which can loosely be defined as the channel's bandwidth multiplied by the fraction of time the channel is yours to use exclusively.
Remember, at any given nanosecond, at most one host can be sending. There can never be two hosts transmitting at exactly the same instant, or a collision will occur and the hosts need to back off.
Just like with 802.3 (CSMA/CD) networks, when the medium is in use, its full bandwidth is being used (it is not possible to use less than 100%), and the different packets are time-sliced: each sender waits its turn in line for access to the medium, so that other terminals can get their turn too.
From that perspective, the transfer rate you are referring to is the product of the actual bandwidth and the amount of time it's used: using a 10 Mbps pipe/channel for 1 second moves 10 Mb of data. The real question is, does it take 5 seconds of wall time to get 1 second of access to the wire? If that is the case, the channel has used 100% of its capacity (10 Mbps) for all 5 seconds, but your terminal got only one second of that time, so its effective rate is 1/5 × 10 Mbps = 2 Mbps.
In summation, these algorithms are NOT about allocating bandwidth. They are exclusively concerned with whether the channel is free at this instant, so your terminal can use it without a collision.
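If it helps, here is a toy sketch of that "is the channel free?" loop. It is not the real 802.11 DCF state machine (which involves DIFS waits and slot-timed contention windows); it just illustrates sense-before-send with a doubling backoff window, and channel_is_busy() plus the 20-microsecond slot time are stand-in assumptions of mine:

```python
import random
import time

def channel_is_busy():
    """Hypothetical carrier-sense check; a real NIC does this in hardware."""
    return random.random() < 0.5  # pretend the medium is busy half the time

def send_frame(frame, max_attempts=7):
    for attempt in range(max_attempts):
        if not channel_is_busy():
            print(f"sending {frame!r}")  # medium idle: transmit, using 100% of the channel
            return True
        # Medium busy: wait a random number of slots before sensing again.
        slots = random.randint(0, 2 ** (attempt + 1) - 1)  # window doubles each retry
        time.sleep(slots * 20e-6)  # assumed 20-microsecond slot time
    return False  # gave up; the medium stayed busy too long

send_frame("hello")
```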
Hope that helps clarify
Thank you, Frank. Your answer was great, but as far as I know, when a user gets access to the medium, he/she is allowed to send the whole packet regardless of how long it takes. Is that right? In addition, could you please clarify why 1 second of access would take 5 seconds of real time? Thanks a lot. – Alberto – 2015-02-27T10:34:27.133
CSMA/CD is all about how multiple potential senders determine how to share the media. When one is transmitting, the others must back off until the media is idle. The delay I'm speaking of comes from sensing the carrier to determine whether it is safe to transmit, and from recovering when you do transmit and a collision occurs anyway. – Frank Thomas – 2015-02-27T13:54:27.687
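To put a rough number on the recovery delay Frank mentions: in classic 10 Mbps Ethernet the backoff unit is a 512-bit slot time (51.2 microseconds), and the waiting window doubles with each consecutive collision, capped at an exponent of 10. A quick sketch of that calculation, just to illustrate:

```python
import random

SLOT_TIME_US = 51.2  # one slot = 512 bit times on 10 Mbps Ethernet

def backoff_delay_us(collisions):
    """Random CSMA/CD backoff after the given number of consecutive collisions."""
    k = min(collisions, 10)  # 802.3 caps the exponent at 10
    return random.randint(0, 2 ** k - 1) * SLOT_TIME_US

for c in range(1, 5):
    worst_case = (2 ** min(c, 10) - 1) * SLOT_TIME_US
    print(f"after collision {c}: wait up to {worst_case} us (this draw: {backoff_delay_us(c)} us)")
```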