This can be calculated based on the average up/down speed per second per player.

Let's assume you have 100 players, and according to the anecdotal dataset given to us by /u/Troutpiecakes, this utilizes 2mb/sec up/down. Note that I am assuming /u/Troutpiecakes is talking about megaBITS per second, and not megaBYTES (which is what you would use to describe actual bandwidth used).

2 megabits (divide by 8 to get actual bandwidth transfer) equates to 256kb/sec (ignoring line contention, rate-limiting, etc. - perfect conditions). That is to say, the amount of bandwidth used per second for 100 players is (again: anecdotal) 256 kilobytes per second, or 25.6kb/sec per player.

But that doesn't fit with the previous account of 80kb/s with 14 players (and that 80kb/sec sounds like it's in kilobytes, not kilobits, because kilobits would work out to roughly 700 bytes per player per second, which is absurd). Thus the amount of bandwidth transferred per player per second is heavily dependent on how many players are currently online. This is probably because more players equals more structures equals more entities for the server to send to each client. But math is hard, so we're going to stick with the assumption of 100 players.

So, if each player is using roughly 25 kiloBYTES of bandwidth every second, that's 1.5 megabytes per minute per player. Over 100 players, that works out to 150 megabytes per minute. So, for every hour that you have 100 players on your server, you will chew through 9 gigabytes of bandwidth, both up and down (apparently).
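If you want to rerun the arithmetic above with different figures, here is a minimal Python sketch. The 2mb/sec server figure, the 25 kB/s per-player rate, and the 100-player count are the anecdotal assumptions from the discussion, not measured values, and the function and variable names are my own.

```python
# A minimal sketch of the bandwidth arithmetic above. The 2 Mbit/s server
# figure, the 25 kB/s per-player rate, and the 100-player count are the
# post's anecdotal assumptions, not measured values.

def mbit_to_kb_per_sec(mbit_per_sec: float) -> float:
    """Convert a rate in megabits/s to kilobytes/s (x1024 for kilobits, then /8)."""
    return mbit_per_sec * 1024 / 8

def transfer_per_hour_gb(players: int, kb_per_player_per_sec: float) -> float:
    """Total transfer in GB/hour for `players` concurrent players (decimal units)."""
    return players * kb_per_player_per_sec * 3600 / 1_000_000

if __name__ == "__main__":
    print(f"2 Mbit/s -> {mbit_to_kb_per_sec(2):.0f} kB/s")                           # 256 kB/s
    per_player = 25   # kB/s per player (anecdotal)
    players = 100
    mb_per_min_per_player = per_player * 60 / 1000
    print(f"Per player: {mb_per_min_per_player:.1f} MB/min")                          # 1.5 MB/min
    print(f"All {players} players: {mb_per_min_per_player * players:.0f} MB/min")     # 150 MB/min
    print(f"Hourly total: {transfer_per_hour_gb(players, per_player):.0f} GB/hour")   # 9 GB/hour
```

Swap in the 80kb/s-for-14-players figure (roughly 5.7 kB/s per player) to see how sensitive the hourly total is to the per-player rate.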