Abstract
We model the goodput of a single TCP source on a wireless link experiencing sudden increases in Round Trip Time (RTT), i.e. delay spikes. Such spikes trigger spurious timeouts that reduce TCP goodput. Renewal reward theory is used to derive a straightforward expression for TCP goodput that takes into account a limited sending rate (limited window size), packet losses due to congestion, and delay-spike properties such as the average spike duration and the distribution of spike intervals. The basic model assumes independent and identically distributed (i.i.d.) spike intervals; correlated spike intervals are modelled by a modulating background Markov chain. Validation by ns-2 simulations shows excellent agreement for lossless scenarios and good accuracy for moderate-loss scenarios (packet loss probabilities below 5%). Numerical studies have also been performed to assess the impact of different spike interval distributions on TCP performance.
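The renewal-reward framing in the abstract can be illustrated with a short Monte Carlo sketch: treat each spike interval as one renewal cycle, count packets delivered as the reward, and estimate long-run goodput as expected reward per cycle divided by expected cycle length. The distributions, parameters, and the fixed timeout penalty below are illustrative assumptions, not the paper's actual model or expression.

```python
import random

def renewal_reward_goodput(rate_pkts_per_s, mean_interval_s, mean_spike_s,
                           timeout_penalty_s, n_cycles=100_000, seed=1):
    """Monte Carlo estimate of long-run goodput via renewal reward:
    goodput = E[reward per cycle] / E[cycle length].

    One cycle = one spike interval plus the spike itself. The link
    stalls for the spike duration, and each spurious timeout costs an
    additional recovery penalty during which nothing useful is sent.
    All distributions and parameters are illustrative placeholders.
    """
    rng = random.Random(seed)
    total_reward = 0.0
    total_time = 0.0
    for _ in range(n_cycles):
        interval = rng.expovariate(1.0 / mean_interval_s)  # i.i.d. spike interval
        spike = rng.expovariate(1.0 / mean_spike_s)        # spike duration
        useful = max(interval - timeout_penalty_s, 0.0)    # time actually sending
        total_reward += rate_pkts_per_s * useful           # packets delivered
        total_time += interval + spike                     # full cycle length
    return total_reward / total_time

# Example: a 10 pkt/s link with spikes every 30 s on average, lasting
# 3 s each, and a ~2 s spurious-timeout recovery penalty per spike.
print(renewal_reward_goodput(10.0, 30.0, 3.0, 2.0))
```

With i.i.d. exponential intervals this converges, by the renewal reward theorem, to the ratio of the per-cycle expectations; correlated intervals (as in the paper's Markov-modulated extension) would require drawing the interval distribution according to a background chain state instead.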
| Original language | English |
|---|---|
| Pages (from-to) | 653-667 |
| Number of pages | 15 |
| Journal | European Transactions on Telecommunications |
| Volume | 19 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - Oct 2008 |
| MoE publication type | A1 Journal article-refereed |
Title: Performance of TCP on low-bandwidth wireless links with delay spikes