INTRODUCTION TO QUEUING
“Queuing theory is the theoretical study of waiting lines, expressed in mathematical terms, including components such as number of waiting lines, number of servers, average wait time, number of queues or lines, and probabilities of queue times either increasing or decreasing.”[i]
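To make those quantities concrete, the short sketch below uses the standard single-server (M/M/1) result W = 1/(μ − λ) from queuing theory; this is a textbook formula rather than anything specific to the cited definition, and the 100-packets-per-second service rate is an assumed figure chosen only for illustration.

    # Illustrative only: the classic single-server (M/M/1) queuing model.
    # It shows how the average wait grows as the arrival rate approaches
    # the service rate, i.e. as the probability of the queue growing increases.
    def mm1_average_wait(arrival_rate, service_rate):
        """Average time spent waiting plus being served: W = 1 / (mu - lambda)."""
        if arrival_rate >= service_rate:
            raise ValueError("Unstable queue: arrivals must be slower than service.")
        return 1.0 / (service_rate - arrival_rate)

    SERVICE_RATE = 100.0  # assumed: 100 packets per second
    for load in (0.50, 0.80, 0.95):
        wait_ms = mm1_average_wait(load * SERVICE_RATE, SERVICE_RATE) * 1000
        print(f"utilization {load:.0%}: average time in system {wait_ms:.0f} ms")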
In terms of routing, queuing is the process of sequencing a backlog of packets waiting to be forwarded over a router interface.
Normally, packets leave the router in the order in which they arrived; this is known as First-In, First-Out (FIFO) processing. FIFO does not give any preference to voice or mission-critical data.
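As a rough illustration of that FIFO behavior, the sketch below models the interface queue with Python's collections.deque; the packet fields are invented for the example and do not correspond to any real router data structure. A voice packet that arrives behind a bulk-data packet simply waits its turn.

    from collections import deque

    # Minimal FIFO sketch: departures strictly follow arrival order.
    fifo_queue = deque()

    def fifo_enqueue(packet):
        fifo_queue.append(packet)        # new arrivals join the back of the line

    def fifo_dequeue():
        return fifo_queue.popleft()      # the oldest packet is forwarded first

    fifo_enqueue({"traffic_class": "bulk-data"})
    fifo_enqueue({"traffic_class": "voice"})
    print(fifo_dequeue()["traffic_class"])   # prints "bulk-data"; voice gets no preference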
Queuing is the process of sorting packets in order of priority. It is only necessary when traffic flow is congesting the router interface or when packet drops are occurring.
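A hedged sketch of that priority-based sorting is shown below, built on Python's heapq module; the priority numbers assigned to each traffic class are arbitrary assumptions for the example, not values taken from any queuing standard.

    import heapq
    import itertools

    # Lower number = higher priority; the arrival counter keeps equal-priority
    # packets in FIFO order.
    priority_queue = []
    arrival_counter = itertools.count()

    def pq_enqueue(priority, packet):
        heapq.heappush(priority_queue, (priority, next(arrival_counter), packet))

    def pq_dequeue():
        return heapq.heappop(priority_queue)[2]

    pq_enqueue(2, "bulk transfer")       # arrives first, low priority
    pq_enqueue(0, "voice frame")         # arrives second, high priority
    print(pq_dequeue())                  # prints "voice frame": priority overrides arrival order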
Because queuing introduces some amount of latency, it should not be enabled as a routine part of a router configuration.
Queuing is best applied on WAN links, where bursty traffic and low data rates can combine to create congestion. Depending on the maximum transmission unit (MTU) of the media in place, queuing is most successful when applied to links running at T1 (1.544 Mbps) or E1 (2.048 Mbps) speeds and below.
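The reason slower links benefit most can be seen with a back-of-the-envelope serialization calculation. The sketch below assumes a 1500-byte MTU; the figures are illustrative arithmetic, not measurements from any particular link.

    # Serialization delay: how long one full-size packet occupies the link.
    MTU_BITS = 1500 * 8   # assumed 1500-byte MTU

    for name, bps in (("T1", 1_544_000), ("E1", 2_048_000), ("100 Mbps Ethernet", 100_000_000)):
        delay_ms = MTU_BITS / bps * 1000
        print(f"{name}: {delay_ms:.2f} ms per full-size packet")

    # On a T1, a single 1500-byte packet ties up the link for roughly 7.8 ms,
    # so the order in which queued packets are sent matters; at 100 Mbps the
    # same packet takes about 0.12 ms, and queuing gains far less.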
When traffic congestion is only temporary, queuing can be an effective means of reducing it. However, if congestion is a regular concern, it might be better to consider a bandwidth upgrade, and perhaps a new router.
A network administrator should also consider establishing a queuing policy to help manage the different traffic types and preserve the stability of the network.
Queuing itself does not actually increase the bandwidth of the network; it only sorts the order of the packets that arrive at a router interface, following the parameters entered by the network administrator.
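As a final illustration of that point, the sketch below sorts arriving packets into per-class queues according to administrator-defined parameters; the parameter names and priority values are assumptions made for the example, and the total output rate of the link is unchanged, only the departure order.

    from collections import deque

    # Hypothetical administrator-defined parameters: traffic class -> priority.
    admin_policy = {"voice": 0, "critical": 1, "default": 2}
    queues = {priority: deque() for priority in admin_policy.values()}

    def classify(packet):
        return admin_policy.get(packet.get("traffic_class"), admin_policy["default"])

    def enqueue(packet):
        queues[classify(packet)].append(packet)

    def dequeue():
        # Serve queues in priority order; bandwidth is not increased,
        # only the order of departure changes.
        for priority in sorted(queues):
            if queues[priority]:
                return queues[priority].popleft()
        return None

    enqueue({"traffic_class": "default", "data": "backup job"})
    enqueue({"traffic_class": "voice", "data": "rtp stream"})
    print(dequeue()["traffic_class"])    # prints "voice"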