Nov 13 2011, 10:11 AM
THIS WAS ORIGINALLY A RESPONSE TO ANOTHER POST, BUT I HAVE DECIDED TO MAKE IT A NEW THREAD. Please read this. If you're still unsure about anything to do with net_graph, then read this thread here, where I explain to Knowpain what each part of the net_graph actually represents. I've now attached the relevant information to this post, so you can read that thread for context, or you can just read here for the raw information.
ABOUT NET_GRAPH
The way it works in CS:S is that you're constantly sending updates to the server. These updates (actually called "packets") contain information such as:
- Where you are on the map
- What you can see (or what you should be able to see)
- Whether you're firing or not (or any action, for that matter)
- Whether you're moving or not
Basically, it's all the information about YOU that the server needs in order to place your character in the game and let your character interact inside the map.
Now at the same time, you're receiving a whole bunch of these "packets" from the server. These include the positions, actions and whereabouts of everyone else on the server. Obviously you're not accessing all this information all the time; the server sends it to you when it's necessary (for example, if you see someone else in the game, or if someone is firing at you and hits you).
So you have "traffic" coming in and out of your computer. If there is more "traffic" than your connection (or your rate settings) can handle, you experience "choke": your connection is being flooded with more data than it can process. On the other side of that, if "packets" are being lost on the way between you and the server (in either direction), you will experience "loss".
The client-side (that's your side) cvars (commands that you put in console) mostly control how much data you send to and receive from the server. The server does have limits on how much it can send/receive at one time, but as long as the client-side values are less than or equal to the server-side values, you will be fine.
But sometimes people don't set their rates properly. Usually I see the main problem coming from people's internet connections not being able to send/receive the data quickly enough.
What you should have, at least, is your cl_updaterate and cl_cmdrate set to 66. You can do this by opening console and typing:
Code:
cl_updaterate 66
cl_cmdrate 66
This will mean that you're matching the rate at which the server sends and receives data. Or at least, you should be, presuming you have a decent internet connection.
Now I'll go into what net_graph is actually showing you.
net_graph is actually read horizontally. It's something people don't realise, because they read everything in grids. The main two lines you need to concentrate on are the 2nd and 3rd. You see they start with "IN" and "OUT"? Well, they read horizontally across.
That bit that says "IN" is what is being received by you from the server. The bit that says "OUT" is what you're sending to the server. From left to right:
- The size of the "packets" being sent by the server and received by you. This is in bytes (not kilobytes). It will change throughout the game, sometimes quite rapidly. That's fine. The one next to "OUT" is the size of the "packets" that you are sending to the server. This will also change.
- The average number of kilobytes per second being sent from the server, and being sent from you.
- The average number of "packets" per second being sent from the server, and being sent from you.
Just an example, in this screenshot:
- This person is sending 14.8 packets per second.
- Each packet is 101 bytes.
- They're going out at a speed of 1.64 kilobytes per second.
Now granted, the last value (14.8/s) can occasionally be misreported (and when it is, it's usually showing a rate higher than is actually possible). But the fact of the matter is, they're sending packets to the server far too slowly: with cl_cmdrate 66 it should be close to 66 per second.
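As a quick sanity check on those numbers (a sketch using the values read off the screenshot above), multiplying packet size by packets per second should land near the kB/s figure net_graph reports. It also shows why the packet sizes must be bytes, not kilobytes: 101 kilobytes at 14.8 packets per second would be roughly 1.5 megabytes per second, nothing like the 1.64 k/s shown.

```python
# Cross-checking the screenshot's OUT line: packet size times packets
# per second should roughly match the kB/s figure net_graph reports.
pkt_size_bytes = 101   # size of each outgoing packet, in bytes
pkts_per_sec = 14.8    # outgoing packets per second

kbytes_per_sec = pkt_size_bytes * pkts_per_sec / 1024
print(round(kbytes_per_sec, 2))  # ≈ 1.46
```

That's in the same ballpark as the 1.64 k/s shown; I'd assume the small gap is per-packet overhead that net_graph counts and this back-of-the-envelope check doesn't.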
What you will notice is that his lerp is white (and stable as well), sitting at a nice 30.3ms. So we know for a fact that this guy's poor hit registration/lag problems have nothing to do with the way his packets interpolate; it's the fact that he's uploading packets too slowly! It's tools like net_graph that make it easy to narrow down network problems like this. So, what is lerp?
LERP
You want your lerp to be as low as possible while remaining white. This doesn't necessarily mean that it has to be 15ms, as I will show you now. Quick reference: "lerp" stands for "linear interpolation", and the value shown is determined by several client-side and server-side (cl_ and sv_) command variables (cvars), for example "cl_interp" and "cl_interp_ratio".
The thing about lerp is that it is net_graph's indicator of packet interpolation. The way I understand this term is that it's how well all the data being sent between you and the server gets put together, eventually culminating in what you experience in the game (events such as moving, firing etc).
If you have a HIGHER lerp (for example, 100ms) then there will be a slight delay between you pushing buttons and the action being completed in the game (clicking +mouse1 and the bullet leaving the gun, for example). You may not notice the difference, but that's because it's on a split-second timeframe. Trust me, it affects the data being sent in the packets to the server, and subsequently the packets being received from the server.
This is how people get shot behind walls. It's a delay between them sending the packet and you receiving the packet. The best way I have seen it explained is here. Just read the section on Entity Interpolation, and look at the graph that has been drawn. It will make sense if you read the section.
The lower your lerp whilst remaining WHITE, the better the response time between you and the server (and vice versa).
The colours of the lerp are something that has confused a lot of people. If you have a constant white lerp, then it generally means that you're having no issues with packet interpolation. If lerp becomes YELLOW, then it's an indicator that the SERVER'S FRAMERATE is dropping below your interp value. For example, if you have "cl_interp 0.033", then the server would need to be running at at least 30fps (one frame every 33ms). If the server drops below this, then your lerp will become yellow.
To stop this happening, it's best not to set a fixed value for your "cl_interp". If you set "cl_interp 0", then you should technically never experience yellow lerp: 0 is the default, and it tells the client to derive the interp from your rate settings rather than pinning it to a fixed number. Some servers set their FPS to low limits (for example "fps_max 60"). This is purely to conserve CPU usage. I'm not sure what War Lords is set to, but I doubt it would be lower than 75.
If you're experiencing ORANGE lerp, then it means that you're experiencing packet loss, which will ultimately show up as choke/loss in net_graph. It means there aren't enough snapshots buffered for interpolation. This generally happens when you have "cl_interp_ratio" set to "1", as you're only keeping one packet's worth of data to interpolate with. If you set "cl_interp_ratio" to "2", there will be an extra packet to interpolate with, meaning that if one packet is lost along the way, there is still another packet to interpolate with.
NOTE: This will only work if you have your "cl_interp" set to "0".
The ratio works like this. If your "cl_updaterate" is set to "66" (which it should be) and you have "cl_interp_ratio" set to "1", then your lerp will be 15ms, because your effective interp becomes 1/66 ≈ 0.0151s. This is fine, as long as the lerp remains WHITE.
If not, then your connection can't handle being at such a low value, and you need more than one packet to fall back on. If you then set your "cl_interp_ratio" to "2", your lerp will be 30ms, because your effective interp becomes 2/66 ≈ 0.03s. You will also have that extra packet to interpolate with.
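The arithmetic above can be written out as a small sketch. This assumes (as I understand it) that the engine takes whichever is larger of cl_interp and cl_interp_ratio divided by cl_updaterate, which is why "cl_interp 0" lets the ratio do the work:

```python
def effective_lerp_ms(cl_interp, cl_interp_ratio, cl_updaterate):
    # Effective interpolation delay: cl_interp acts as a floor, and
    # cl_interp_ratio / cl_updaterate is the rate-derived value.
    return max(cl_interp, cl_interp_ratio / cl_updaterate) * 1000.0

print(round(effective_lerp_ms(0, 1, 66), 1))  # 15.2 ms, the "15ms" case above
print(round(effective_lerp_ms(0, 2, 66), 1))  # 30.3 ms, the "30ms" case above
```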
Here is a quick example. A while back, I was experiencing some issues with my lerp becoming yellow/orange. I then realised that I had not configured my rates to compensate for the new connection I'm using (connecting through ethernet plugs in my wall). So for a while, I was experiencing this :
You can see here that whilst my lerp is exceptionally low, I am experiencing packet loss, indicated by my lerp being ORANGE. Then I set my rates to:
Code:
cl_interp "0"
cl_interp_ratio "2"
So that I had an extra packet to interpolate with. Now I have this:
As you can see, my lerp is WHITE (and constant). The size of the packets and the speed at which they're being sent/received have also gone up, but the number of packets per second remains (roughly) the same. It's just that I'm no longer getting packet loss, because I have that extra packet being sent.
I know this was a very long post, but it does actually explain a lot. Please read it.