So I was just talking to DNA about server lag, and it occurred to me that blaming the server is only really valid if your rates are set correctly. If you look at your net_graph, on the "in" line, k/s tells you how much data the server is sending you. From what I've seen, this number can only go to about 4 k/s per 10000 rate ("rate" in console) before you start losing updates. On a full 24-player server, with cl_updaterate at 66, this number will frequently touch 20 k/s (or higher, depending on the map). That means that if you want to receive all the updates from the server, you need to set your rate to at least 50000, contrary to the typical recommendations of 25000 or 30000.
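To make the arithmetic explicit, here's a quick sketch of the rule of thumb above. The 4 k/s per 10000 rate figure is just my observation from net_graph, not an official number, and `required_rate` is a name I made up for illustration:

```python
# Empirical observation from watching net_graph: the server sends
# roughly 4 k/s of data per 10000 units of the "rate" cvar.
KPS_PER_10000_RATE = 4

def required_rate(peak_kps):
    """Console 'rate' value needed to sustain peak_kps on the 'in' line."""
    return int(peak_kps / KPS_PER_10000_RATE * 10000)

# A full 24-player server at cl_updaterate 66 often touches 20 k/s:
print(required_rate(20))  # prints 50000
```

Plug in whatever peak k/s you actually see on your server and it tells you the minimum rate that won't throttle you.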
If you look at the last number on the "in" line, that is the number of updates per second you're receiving from the server. This can drop for a few reasons, commonly: (1) the server is not sending you all the updates, because it is respecting your rate setting, (2) packets are getting lost somewhere between you and the server, (3) the server is not able to maintain its tickrate.
If your rate is set to a typical 30000, the server will cap what it sends you at around 12 k/s. So a drop in the number of updates you're receiving is only likely to indicate a server problem if your k/s in is staying comfortably below 12; if it's pinned at 12, the server is just throttling to your rate setting.
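The diagnostic logic in the last two paragraphs can be sketched as a small helper. This is purely illustrative: `likely_cause` and its thresholds are my own (the 95%-of-cap check for "pinned at the cap" is a guess, and the 4 k/s per 10000 rate figure is the same observation as above):

```python
KPS_PER_10000_RATE = 4  # observed k/s of server data per 10000 units of "rate"

def likely_cause(rate, observed_kps, updates_per_sec, cl_updaterate=66):
    """Rough guess at why the updates/sec on net_graph's 'in' line is low."""
    # k/s ceiling implied by your rate setting
    cap_kps = rate / 10000 * KPS_PER_10000_RATE
    if updates_per_sec >= cl_updaterate:
        return "no problem"
    if observed_kps >= cap_kps * 0.95:
        # sitting at the cap: the server is respecting your rate setting
        return "your rate setting is throttling updates"
    # well under the cap but still missing updates: look elsewhere
    return "packet loss or server can't hold tickrate"

# e.g. rate 30000, pinned at 12 k/s, only 50 updates/sec coming in:
print(likely_cause(30000, 12, 50))  # prints "your rate setting is throttling updates"
```

In other words: rule out your own rate setting first, and only then start blaming the server or the route to it.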