At our company we've been having problems with our bandwidth. It seems like every day we use up all of our bandwidth and the internet connection is slow. We have 4 Mbps down and 2 Mbps up.
Dell told us to look at the RMON Statistics Group on our switch, specifically the Dropped Packets, Received Bytes (Octets), Received Packets, Broadcast Packets Received, and Multicast Packets Received counters. This section shows numbers that supposedly explain why our internet connection is slow. I'm guessing the higher the numbers, the more bandwidth we're using?
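From what I understand, these counters just keep counting up from whenever the switch was last reset, so a single snapshot doesn't really tell you how fast a port is going right now - you'd have to take two readings and look at the difference over a known interval. Here's a rough sketch of what I mean (read_received_octets is just a placeholder for however we'd actually pull the counter, and the 10-second interval is made up):

```python
import time

POLL_SECONDS = 10  # made-up interval; anything consistent works


def read_received_octets(port: str) -> int:
    """Placeholder: return the current Received Bytes (Octets) counter for a port."""
    raise NotImplementedError("fill in with SNMP / CLI / web GUI scrape")


def current_mbps(port: str) -> float:
    """Estimate the current receive rate on a port from two counter readings."""
    first = read_received_octets(port)
    time.sleep(POLL_SECONDS)
    second = read_received_octets(port)
    delta_bytes = second - first               # ignoring counter wrap for simplicity
    return delta_bytes * 8 / POLL_SECONDS / 1_000_000


# e.g. current_mbps("g4") -> roughly how many Mbps port g4 is receiving right now
```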
The picture below is an example of what I'm talking about, and here are two examples from our RMON Statistics:
Example 1:
Port g4
Dropped packets ------------------> 0
Received Bytes (Octets) ---------> 3069860830
Received Packets -----------------> 6119057
Broadcast Packets Received ---> 33639
Multicast Packets Received -----> 11420
Example 2:
Port g6
Dropped packets ------------------> 0
Received Bytes (Octets) ---------> 1410323800
Received Packets -----------------> 41510743
Broadcast Packets Received ---> 11999
Multicast Packets Received -----> 298
We have more ports, but I'm just showing you two. When comparing these two ports, why is one counter so much higher than the other (e.g., Received Bytes (Octets))? If one port has far more Received Bytes than the rest, is that something worth investigating, or is it normal? And why would it be so much higher than the other ports? I know it could be a lot of things, but what are some things I could check?
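For what it's worth, dividing Received Bytes by Received Packets from my two examples gives a rough average frame size per port, and the two ports look very different (I'm assuming these are 32-bit counters that can roll over, so the g6 byte total might have wrapped):

```python
# Rough average frame size per port, using the counter values from the
# two examples above (Received Bytes / Received Packets).
ports = {
    "g4": {"octets": 3_069_860_830, "packets": 6_119_057},
    "g6": {"octets": 1_410_323_800, "packets": 41_510_743},
}

for name, counters in ports.items():
    avg_frame = counters["octets"] / counters["packets"]
    print(f"Port {name}: ~{avg_frame:.0f} bytes per received frame")

# Output:
# Port g4: ~502 bytes per received frame
# Port g6: ~34 bytes per received frame
#
# g4 looks like ordinary traffic, while g6 has a huge packet count with a tiny
# average size - 34 bytes is below the 64-byte Ethernet minimum, so I'm guessing
# the 32-bit byte counter on g6 has wrapped around at least once.
```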