How I Did a Streaming Test

22:53 yalazi


I had almost promised to write about "the syn that's not replied" and "the port can not be opened", but I have something better to write about now. In fact, I've already written most of it ;)

This is about a streaming load test I performed for the GSM operator I'm consulting for in Thailand. Have you ever had to load test a streaming server? Well, I have in the last few weeks. It was a success, apart from the bandwidth bottlenecks of my VPS provider [1], Vpshispeed of Thailand. I may retry these tests later with more Thailand bandwidth, to reach my test targets. Any suggestions for high-bandwidth VPS or cloud providers in Thailand?

I'm bound to thank Diederik dee Gee for all the help he has provided. Thank you, Diederik. It's been a real pleasure to work and talk with you! I should also say that the bottleneck was in no way Diederik's fault: I had simply consumed all the bandwidth available to his service, and I was even able to use more than he had promised. His service exceeds his promises ;) I strongly recommend his services to anyone in need of a VPS in Thailand.

To return to the foundation of my load test: as you already know, load testing is a hard process, and there are virtually no tools for load testing a streaming server. So I had to improvise.

My concerns for a streaming load test were as follows:
  • No tools -
    Build in-house scripts or tools.
  • Bandwidth -
    Streaming is bandwidth intensive, so you need huge amounts of it.
  • What to monitor, and how to analyse the data? -
    For a web server load test we have many tools and patterns to use, like ab, jmeter and many more, and an easy structure for the test: "Fire requests and receive responses." But how do you load test a streaming server? It's not HTTP.
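The bandwidth concern above is easy to put in numbers. Here is a back-of-the-envelope calculation; the client count and stream bitrate are illustrative values, not figures from my actual test:

```python
# Rough server-side bandwidth needed for a streaming load test.
# The numbers used below are hypothetical examples.

def required_bandwidth_mbps(clients: int, stream_kbps: int) -> float:
    """Total bandwidth (Mbit/s) needed to feed `clients` concurrent streams."""
    return clients * stream_kbps / 1000.0

# e.g. 1000 concurrent viewers of a 512 kbit/s stream:
print(required_bandwidth_mbps(1000, 512))  # 512.0 Mbit/s
```

Even a modest test like this already saturates most single-server uplinks, which is exactly the bottleneck I hit with my VPS.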
So I prepared a foundation on which to design my tests. My foundation is as follows:
  1. Receive Assumption: Users have enough bandwidth to saturate themselves for maximum performance and to keep the server as busy as possible.
  2. Receive Assumption: Users can consume all the data they receive, or have enough buffer for constant data reception.
  3. Receive Assumption: Users have enough time and patience to consume streams from beginning to end, without exceptions.
  4. Consume Assumption: Users have a limited buffer, but data that overflows it is carried over and added to the next burst received.
  5. Consume Assumption: Users consume data at a constant bitrate relative to the bitrate of the stream.
  6. Server Assumption: Servers are more bandwidth constrained than memory or CPU constrained.
  7. Server Assumption: Different streaming protocols have only a minimal effect on CPU and memory usage on the server side.
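The consume-side assumptions (4 and 5) can be sketched as a tiny simulation. This is my own illustration, not code from the test: a client with a fixed-size buffer drains it at the stream bitrate, and bytes that overflow the buffer are carried over into the next burst. All numbers are hypothetical.

```python
# Sketch of assumptions 4 and 5: limited buffer, overflow carried to the next
# burst, constant-bitrate consumption. Values are illustrative only.

def simulate_client(bursts, buffer_size, bitrate_bps, interval_s=1.0):
    """Return the buffer fill level after each burst interval."""
    level = 0.0   # bytes currently buffered
    carry = 0.0   # overflow waiting for the next burst (assumption 4)
    levels = []
    for burst in bursts:
        incoming = burst + carry
        free = buffer_size - level
        accepted = min(incoming, free)
        carry = incoming - accepted
        level += accepted
        # constant-rate drain (assumption 5)
        level = max(0.0, level - bitrate_bps * interval_s)
        levels.append(level)
    return levels

# 64 kB/s stream, 128 kB buffer, bursty arrivals:
print(simulate_client([96000, 0, 96000, 32000], 128000, 64000))
# → [32000.0, 0.0, 32000.0, 0.0]
```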

After creating my foundation, I started searching for tools to use. Because of assumptions 6 and 7, I decided to focus on a single protocol, and with the openRTSP tool [2] (from the LIVE555 Streaming Media library [3]) available, RTSP seemed a suitable choice.
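To give an idea of the shape of such a test, here is a hypothetical driver that spawns many openRTSP clients in parallel. It assumes openRTSP is on your PATH; the URL, client count, and use of the `-d` (duration) flag are illustrative, not taken from my actual scripts:

```python
# Hypothetical load driver: spawn N openRTSP clients against one stream.
# Assumes the openRTSP binary (LIVE555 toolkit) is installed and on PATH.
import subprocess

def build_cmd(url: str, duration_s: int) -> list[str]:
    # -d limits how long each client stays connected
    return ["openRTSP", "-d", str(duration_s), url]

def run_clients(url: str, clients: int, duration_s: int) -> None:
    procs = [subprocess.Popen(build_cmd(url, duration_s),
                              stdout=subprocess.DEVNULL,
                              stderr=subprocess.DEVNULL)
             for _ in range(clients)]
    for p in procs:
        p.wait()

# Example (hypothetical URL):
# run_clients("rtsp://example.com/test.sdp", clients=50, duration_s=60)
```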

But as is, openRTSP would not give me any data to analyse. So I quickly hacked one of its samples into reporting data about stream bursts while discarding the received payload. As this only produced raw data about network usage, I built a simple Python tool to analyse it in a simplistic but informative way. The tool produces a gnuplot [4] data file describing the data bursts, which can then be used to graph the bursts visually.
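To illustrate the idea, here is an analyser in the same spirit. The input format ("&lt;timestamp&gt; &lt;bytes&gt;" per line) is my assumption for this sketch, not the actual format used in the repository:

```python
# Sketch of a burst-log analyser: aggregates per-second throughput from a raw
# burst log and emits a gnuplot-ready data file. The log format is assumed.
from collections import defaultdict

def bursts_to_gnuplot(lines):
    """lines: iterable of '<unix_timestamp> <bytes>' strings."""
    per_second = defaultdict(int)
    for line in lines:
        ts, nbytes = line.split()
        per_second[int(float(ts))] += int(nbytes)
    # gnuplot data file: one "x y" pair per line; '#' starts a comment
    out = ["# second bytes"]
    for sec in sorted(per_second):
        out.append(f"{sec} {per_second[sec]}")
    return "\n".join(out)

log = ["1000.1 512", "1000.7 1024", "1001.2 2048"]
print(bursts_to_gnuplot(log))
# → # second bytes
#   1000 1536
#   1001 2048
```

The resulting file can be plotted directly, e.g. `plot "bursts.dat" using 1:2 with lines` in gnuplot.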

I've put this foundation on GitHub; it can be pulled from [5]. There is more info in the README file in the repository.

All pull requests are welcome ;) Please don't be harsh, and please try to be constructive with your comments. These are hacks, as they stand right now.

