We have a LAN with Cisco switches, redundant cabling, and spanning tree. If I understand it correctly, when I pull out a redundant cable (one that is currently "used" by the spanning tree), it takes several seconds for the spanning tree to converge in response. How can I prevent this packet loss (assuming, of course, that I know beforehand the cable will be pulled)? That is, how can I make the spanning tree adapt "proactively"?
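For reference, this is roughly what the relevant state looks like on one of the switches (output trimmed, interface names and VLAN are placeholders for our actual uplinks); the cable I want to pull is the one on the port currently forwarding (FWD):

```
SW1# show spanning-tree vlan 10

VLAN0010
  Spanning tree enabled protocol ieee

Interface           Role Sts Cost      Prio.Nbr Type
------------------- ---- --- --------- -------- ----
Gi0/1               Root FWD 4         128.1    P2p
Gi0/2               Altn BLK 4         128.2    P2p
```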
I would have guessed that an interface shutdown plus waiting a couple of seconds should suffice, but I did not dare to try that out yet. Actually, I am afraid an interface shutdown would cause the same interruption during convergence, because I suffered from such an interruption yesterday when making a supposedly harmless configuration change on some interfaces. (Edit: I just confirmed this experimentally; as expected, there was some 20 seconds of interruption after the interface shutdown. Note that I am looking for a "lossless" solution, not just "less loss".)
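In case it matters, this is essentially what the experiment looked like (interface name is a placeholder for the redundant uplink the tree was using):

```
SW1# configure terminal
SW1(config)# interface GigabitEthernet0/1
SW1(config-if)# shutdown
! the shutdown itself triggered roughly 20 seconds of interruption
! while the spanning tree reconverged, even before any cable was pulled
SW1(config-if)# end
```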