Thanks for the posts Steve, I was totally forgetting the FPS / cl_maxpackets relation when I advised using vsync whenever you can. I did some further research about it, which is cool because I finally found the explanation for the 125 FPS bug in Q3, 8 years after I first heard about that value o/ (
http://ucguides.savagehelp.com/Quake3/FAQFPSJumps.html - oops, a post from 2001, I was on 56K back then...)
If I may add something about vsync: most first-person shooters (I didn't use "FPS" there...) use two graphics buffers to display frames on screen: one where the image is actually being rendered (the back buffer), the other being displayed on the screen (the front buffer).
When vsync is on, the graphics card "waits" for the next frame to be completely rendered into the back buffer, then waits for the right moment (according to the monitor's refresh rate) before swapping the two buffers: the screen displays the buffer containing the new frame while the engine renders the following frame into the other one.
When vsync is off, the GC doesn't wait before swapping: it swaps the buffers as soon as the next frame is complete, even in the middle of a refresh, so the screen shows part of the old frame and part of the new one at the same time. That's the tearing effect.
http://en.wikipedia.org/wiki/Page_tearing or, better, this video:
http://www.youtube.com/watch?v=aQXppnkj2qY - it's an advert, but it's exactly what I see when I have 100 FPS on my 60Hz monitor =).
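To make the buffer-swap idea a bit more concrete, here is a minimal sketch of a double-buffered render loop. The function names (render_into, wait_for_vblank, display) are made up for illustration, they are not the real engine or driver calls:

    #include <stdio.h>

    /* Hypothetical stand-ins for the engine/driver calls (not a real API). */
    static void render_into(int buf)   { /* engine draws the next frame into this buffer */ }
    static void wait_for_vblank(void)  { /* block until the monitor's next refresh boundary */ }
    static void display(int buf)       { /* the monitor now scans this buffer out */ }

    static void render_loop(int vsync_on)
    {
        int back = 0, front = 1;                /* draw into one buffer, show the other */

        for (int frame = 0; frame < 3; frame++) {
            render_into(back);                  /* build the next frame in the back buffer */

            if (vsync_on)
                wait_for_vblank();              /* swap only on a refresh boundary */
            /* vsync off: swap immediately, possibly mid-refresh, so the screen shows
               the top of the old frame and the bottom of the new one -> tearing */

            int tmp = back; back = front; front = tmp;   /* swap the two buffers' roles */
            display(front);
            printf("frame %d now on screen from buffer %d\n", frame, front);
        }
    }

    int main(void) { render_loop(1); return 0; }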
Main drawback with vsync: if your FPS drops below the refresh rate, the frame isn't ready in time, so the GC has to wait one more full refresh cycle before it can swap, effectively skipping a display slot and giving an even choppier game. That's why people who often drop below their refresh rate shouldn't use vsync. Hmm, I often drop below the RR and I use vsync; it seemed more obvious when I played Q3, maybe because I was on a CRT then...
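Here is the rough arithmetic behind that, as I understand it (my own simplification, ignoring triple buffering and driver tricks): with vsync a finished frame can only be shown on the next refresh boundary, so as soon as rendering takes a bit more than one refresh interval, the displayed rate drops to refresh/2, then refresh/3, and so on.

    #include <stdio.h>
    #include <math.h>

    /* Displayed frame rate under vsync with plain double buffering:
       each rendered frame occupies a whole number of refresh cycles,
       so the effective rate is refresh_hz / ceil(refresh_hz / render_fps). */
    static double vsync_fps(double refresh_hz, double render_fps)
    {
        return refresh_hz / ceil(refresh_hz / render_fps);
    }

    int main(void)
    {
        printf("60 Hz, rendering at 70 FPS -> %.0f FPS displayed\n", vsync_fps(60, 70)); /* 60 */
        printf("60 Hz, rendering at 55 FPS -> %.0f FPS displayed\n", vsync_fps(60, 55)); /* 30 */
        printf("60 Hz, rendering at 25 FPS -> %.0f FPS displayed\n", vsync_fps(60, 25)); /* 20 */
        return 0;
    }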
About the network, I found an Excel file that calculates the actual number of packets sent, taking into account the current FPS and the cl_maxpackets value (
http://quaddenied.free.fr/insideq3a.htm - /!\ warning, some text in French on that page). I guess the formula is not 100% accurate (the 0.1 value seems arbitrarily chosen), but I used it in Excel to see the relation between the actual framerate and the cl_maxpackets value (attached gif: cl_maxpackets on X, FPS on Y, outgoing packets actually sent as the result). Have a look at the result: imagine you have cl_maxpackets 42 and com_maxfps 125; if a framedrop lowers your framerate to 90 FPS, which is still quite good, you get a 28% decrease in packets sent compared to the number that *should* be sent oO!
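For what it's worth, here is the spreadsheet's formula as I read it, in code form (the 0.1 is the arbitrary-looking correction I mentioned, so treat this as an approximation of what the client actually does): the client sends at most one packet per rendered frame, and cl_maxpackets forces it to skip frames, so packets go out every ceil(fps / cl_maxpackets - 0.1) frames.

    #include <stdio.h>
    #include <math.h>

    /* Approximate outgoing packet rate, following the spreadsheet's formula:
       at most one packet per rendered frame, sent every
       ceil(fps / cl_maxpackets - 0.1) frames. */
    static double packets_per_second(double fps, double cl_maxpackets)
    {
        double frames_per_packet = ceil(fps / cl_maxpackets - 0.1);
        if (frames_per_packet < 1.0)
            frames_per_packet = 1.0;        /* can't send more than one packet per frame */
        return fps / frames_per_packet;
    }

    int main(void)
    {
        /* the example above: cl_maxpackets 42, com_maxfps 125, framedrop to 90 FPS */
        double at_125 = packets_per_second(125, 42);   /* ~41.7 packets/s */
        double at_90  = packets_per_second(90, 42);    /*  30   packets/s */
        printf("125 FPS -> %.1f packets/s\n", at_125);
        printf(" 90 FPS -> %.1f packets/s (%.0f%% fewer)\n",
               at_90, 100.0 * (1.0 - at_90 / at_125));
        return 0;
    }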
About the cfg: I use vsync, so 63 FPS max, with cl_maxpackets 42. So I shouldn't send more than ~32 packets/s... and it totally matches the result I got when I sniffed the outgoing traffic from my UrT :/ Fortunately, when I get framedrops to 45 FPS, I send the full number of packets, which compensates for the bad framerate =)
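(Plugging my numbers into the packets_per_second sketch above, still assuming the spreadsheet's formula is right: ceil(63/42 - 0.1) = 2, so 63/2 = 31.5 packets/s, which matches the ~32 I sniffed; and ceil(45/42 - 0.1) = 1, so at 45 FPS every single frame gets its packet.)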
To Ana, if you read this: you told me you've been unhitting quite a lot these days; actually it seems to correspond to the time I sent you my cfg, so that *may* be the explanation... sooooorrryyyyyyyy
_________________
ut4_he_tennis_v0.1-------------------------
"We are talking about computers here, compared to those I can read women like a book ;P"
Unclefragger