© 2003 OPNET Technologies, Bethesda, MD 20814. OPNET is a registered trademark of OPNET Technologies.
University: University of
Name of Sponsoring Professor: Professor Kavitha Chandra
Department: Center for Advanced Computation &
Department of Electrical & Computer Engineering
TCP Performance in Wireless
TCP is a reliable transport-layer protocol that is well tuned to respond to congestion-related packet losses in wired networks. Implementing TCP over wireless channels is challenging because the dominant cause of packet loss is bit errors that occur when the channel fades. The channel fading process is generally correlated, and this feature may be exploited in error control schemes. In this work, we present models for packet and bit error distributions in a Rayleigh fading channel. The number of bit errors per packet is represented by an Erlang probability distribution, treated as independent and identically distributed (IID) from packet to packet. The parameters of the model are obtained by matching the first- and second-order moments of the per-packet bit errors. The relationship of the model parameters to the Doppler frequency and signal-to-noise ratio is presented.
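The moment-matching step described above can be sketched as follows. For an Erlang distribution with integer shape k and rate lam, the mean is k/lam and the variance is k/lam², so the two parameters follow directly from the measured mean and variance of the per-packet error counts. The function names and the sample moments below are illustrative, not taken from the paper:

```python
import math

def fit_erlang(mean, var):
    """Match the first two moments of the per-packet bit-error count
    to an Erlang(k, lam) distribution (hypothetical helper).
    mean = k/lam, var = k/lam**2  =>  k = mean**2/var, lam = k/mean."""
    k = max(1, round(mean ** 2 / var))  # shape must be a positive integer
    lam = k / mean                      # rate
    return k, lam

def erlang_pdf(x, k, lam):
    """Density of Erlang(k, lam) at x >= 0."""
    return lam ** k * x ** (k - 1) * math.exp(-lam * x) / math.factorial(k - 1)

# Illustrative numbers: suppose the measured per-packet error count
# has mean 4.0 and variance 8.0.
k, lam = fit_erlang(4.0, 8.0)
# k = 2, lam = 0.5; check: mean = k/lam = 4.0, var = k/lam**2 = 8.0
```

Rounding k to an integer is what distinguishes the Erlang fit from a general gamma fit; the slight moment mismatch introduced by rounding is the price of the integer-shape constraint.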
The effects of temporal correlation in the fading at the bit level are captured using a first-order discrete-time Markov chain (DTMC) as a model. Both the packet error distribution and the Markov chain model are applied as channel error models in an
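A two-state first-order DTMC of this kind can be sketched as below: bits generated while the chain is in the fade ("bad") state are errored, and the transition probabilities control both the stationary error rate and the average fade duration. The function name, parameter names, and numeric values are illustrative assumptions, not taken from the paper:

```python
import random

def simulate_bit_errors(n_bits, p_gb, p_bg, seed=1):
    """Bit-level two-state DTMC channel sketch.
    p_gb: P(good -> bad), p_bg: P(bad -> good).
    Returns a list of 0/1 error indicators, one per bit.
    Stationary P(bad) = p_gb / (p_gb + p_bg);
    mean fade duration = 1 / p_bg bits."""
    rng = random.Random(seed)
    state = 0  # 0 = good, 1 = bad (fade)
    errors = []
    for _ in range(n_bits):
        if state == 0 and rng.random() < p_gb:
            state = 1
        elif state == 1 and rng.random() < p_bg:
            state = 0
        errors.append(state)
    return errors

# Slow fading corresponds to a small p_bg (long fades); fast fading to a
# large p_bg, which is the memory the IID packet-error model cannot express.
errs = simulate_bit_errors(200_000, p_gb=0.01, p_bg=0.09)
```

With p_gb = 0.01 and p_bg = 0.09 the stationary error rate is 0.01/0.10 = 0.1, while fades last 1/0.09 ≈ 11 bits on average; scaling both probabilities down by the same factor keeps the error rate fixed but lengthens the fades, which is exactly the slow- versus fast-fading distinction the abstract draws.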
OPNET simulation of TCP flows between a server and client. The TCP performance in terms of average throughput and retransmission statistics is characterized relative to the parameters of the channel models.
A comparative evaluation of TCP performance using the two models indicates that the IID representation of packet errors tends to predict higher throughput than will actually be observed under slow fading. For fast fading channels, the same model underestimates the throughput, since it does not capture the rapid bit-level transitions out of the fade state. In contrast, the DTMC, by capturing memory at the bit level, models the average fade duration accurately and so predicts the TCP performance degradation in slow fading channels. The DTMC also reproduces the higher throughput attainable in fast fading channels, since it captures the bit-level transitions out of the fade state. Finally, examining the effect of error correction at different coding levels on TCP throughput shows that, for both fast and slow fading, there is a breaking point beyond which additional coding is ineffective.
See Details of Work at: