Ethernet minimum frame size

2

I heard that the minimum frame size of an Ethernet packet is determined by how long it takes for a collision to occur. It has to be large enough so that if a collision occurs it will be detected before the transmission is completed.

However, consider this case: what if the transmission of a packet is almost complete? The sender is about to send the last byte when a collision occurs all the way at the other end of the network. Will the collision be detected or not? Making the Ethernet frame longer doesn't seem to make any difference.

tony_sid

Posted 2011-05-19T14:47:35.360

Reputation: 11 651

Here's a nice writeup about how the minimum frame size is calculated: http://bharathi.posterous.com/minimum-ethernet-framepacket-size

– freethinker – 2011-05-19T15:27:53.527

Answers

6

First of all, this question may be only of historical interest, since the newer Ethernet standards are full-duplex and use switches instead of hubs, so collisions can't occur.

The worst case in a CSMA/CD network (Carrier-Sense Multiple Access with Collision Detection) is when two nodes at the maximum allowed distance in the network begin sending at nearly the same time: both hear a quiet medium and start transmitting. The frame needs to be long enough that the sender is still transmitting when the collision propagates back to it, i.e., sending the frame must take longer than the round-trip propagation time across the network. That way, both nodes will hear the other transmission and detect the collision.

Jamie Cox

Posted 2011-05-19T14:47:35.360

Reputation: 639

I agree with Spiff's good clarification, below. – Jamie Cox – 2011-05-19T19:31:31.117

4

The minimum frame length isn't just about the time it takes for a collision to occur on a maximum-width network, it's all that plus the time it takes for the other transmitting hosts to notice the collision, plus the time it takes for the collision notification (the "jam" signal) to make it back across a maximum-width network before the first host finishes its minimum-length transmission and leaves the medium. It guarantees the first host was still using the medium when it gets the jam signal, so it knows it's been collided with and can do the right thing.
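The timing budget described above can be sketched numerically. The figures below are illustrative assumptions for classic 10 Mbps coax Ethernet (the network diameter and the repeater allowance in particular are ballpark values, not quotes from the standard); the point is that the round trip plus the jam signal and equipment delays must fit inside the minimum frame's transmission time, and the standard's 512 bit times (64 bytes) leaves room to spare:

```python
# Back-of-the-envelope check (assumed, illustrative numbers for 10 Mbps Ethernet):
# the minimum frame must keep a sender transmitting for at least one round trip
# across the maximum-diameter network, plus the jam signal and repeater delays.

BIT_RATE = 10e6            # bits/s, classic 10 Mbps Ethernet
MAX_DIAMETER = 2500.0      # metres, assumed maximum network span incl. repeaters
PROP_SPEED = 2e8           # m/s, approx. signal speed in coax (~0.66 c)
JAM_BITS = 32              # length of the jam signal, in bits
REPEATER_DELAY_BITS = 100  # assumed budget for repeater/transceiver delays, in bit times

round_trip_s = 2 * MAX_DIAMETER / PROP_SPEED       # there and back again
round_trip_bits = round_trip_s * BIT_RATE          # same delay expressed in bit times
required_bits = round_trip_bits + JAM_BITS + REPEATER_DELAY_BITS

print(f"round trip:          {round_trip_bits:.0f} bit times")
print(f"required minimum:    {required_bits:.0f} bit times")
print("standard slot time:  512 bit times = 64 bytes")
```

With these assumptions the required minimum comes out well under the 512 bit times (64 bytes) that the standard actually specifies, which is consistent with the standard rounding the budget up to a safe slot time.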

In your example, if the transmission was almost completed, the second machine would have heard the beginning of the frame by then so it would realize the medium wasn't free and wouldn't start its transmission.

Spiff

Posted 2011-05-19T14:47:35.360

Reputation: 84 656

So even though the sender doesn't hear the collision, the receiver does, and that takes care of the problem? – tony_sid – 2011-05-19T20:41:32.367

@OSX Jedi All of the transmitting hosts that collided are supposed to notice the collision and send the jam signal to ensure that all the other transmitting hosts involved in the collision detected it as well.

Then they all follow the procedures for what to do after detecting a collision.

You said "receiver" but there isn't necessarily a "receiver" in this situation. It doesn't matter who was or wasn't the intended recipient of any of the transmissions that collided. It's up to the transmitting hosts to detect that they collided with other transmitting hosts and do the right thing. – Spiff – 2011-05-19T21:47:41.327

0

It is the job of the carrier-sense part of CSMA/CD to detect that the shared medium is not idle; collision detection handles the case where two senders start at nearly the same time anyway. During the first 64 bytes (the minimum frame size), a collision may still occur and must be detectable.

sblair

Posted 2011-05-19T14:47:35.360

Reputation: 12 231