I asked this same question on Reddit and I got zero engagement, so perhaps Lemmy has people that care more about their hardware.
I recently decided to use some of the tools provided by Mr. Salter (netburn), and I have to ask the community: do you want to see multi-client stress tests (4K streaming, VoIP, web browsing) run against a wireless router, or are single-client iperf tests good enough? Bear in mind that pretty much every publication that still tests its devices (most don’t) relies on the single-client method.
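For reference, the closest I can get to the multi-client idea with free tooling is just launching several rate-limited iperf3 clients at once and looking at each stream individually. A rough sketch of what I mean (the server address, ports, and rates are placeholders, and ideally each client would run from a different wireless device rather than one box, since a single box only exercises one radio):

```python
import json
import subprocess
from concurrent.futures import ThreadPoolExecutor

SERVER = "192.168.1.10"   # placeholder: iperf3 server on the wired side of the AP
CLIENTS = 4               # pretend "4K streams"
RATE = "25M"              # per-client target, roughly one 4K stream
DURATION = 60             # seconds per run

def run_client(port: int) -> float:
    """Run one rate-limited iperf3 client and return its average Mbit/s."""
    out = subprocess.run(
        ["iperf3", "-c", SERVER, "-p", str(port), "-b", RATE,
         "-t", str(DURATION), "-J"],
        capture_output=True, text=True, check=True,
    )
    result = json.loads(out.stdout)
    return result["end"]["sum_sent"]["bits_per_second"] / 1e6

# One iperf3 server instance per port has to be listening on SERVER
# (e.g. ports 5201-5204); each thread here stands in for a "client device".
ports = range(5201, 5201 + CLIENTS)
with ThreadPoolExecutor(max_workers=CLIENTS) as pool:
    for port, mbps in zip(ports, pool.map(run_client, ports)):
        print(f"port {port}: {mbps:.1f} Mbit/s average")
```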
Can your ISP? If so, yes. Because ~25 Mbps * 4 is not a lot of data. And the NAT for four clients mapped to the same firewall/router is pretty trivial. And no, adding “browsing” is not going to be an issue.
Again, NAT is easy. And it happens on every single packet (big ol’ asterisk on this, but this isn’t the venue to get into the specifics), regardless of whether it is one client or two. So what matters is the number of packets per second that can be processed, which these speed tests already cover (albeit somewhat obfuscated, because most people don’t understand the network layers).
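Back-of-the-envelope, with simplified numbers, just to show how little per-packet work four streams actually is (full-size frames assumed; real traffic mixes in smaller packets, so treat this as a floor rather than a benchmark):

```python
# Rough packets-per-second math for four 25 Mbit/s streams.
streams = 4
per_stream_mbps = 25
payload_bytes = 1500  # assume full-size frames

total_bits_per_s = streams * per_stream_mbps * 1_000_000
packets_per_s = total_bits_per_s / (payload_bytes * 8)
print(f"{packets_per_s:,.0f} packets/s")  # ~8,333 pps, trivial for any modern router
```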
And in the enterprise case? That is mostly about whether you can run a mesh network, what signal coverage you have, and the total number of clients (and packets) that need to be processed per second. Which… you are either a complete sicko who wouldn’t be watching reviews online or you are just going to buy a Ubiquiti or Omada setup.
On paper, it is not a lot of data, but adding more clients requesting 25 Mbps continuously, plus some spontaneous but intensive web browsing on top, can lead to latency spikes. And the user no longer gets a good streaming/browsing experience. I’ve even seen it on an expensive (by consumer networking standards) router such as the GT-AX6000.
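What I would actually like to capture is that latency-under-load behaviour rather than just the average throughput. Something like this crude probe, run once on an idle network and once while the streaming-style load is going, is the kind of thing I have in mind (the gateway address and port are placeholders; any TCP port the router listens on works):

```python
import socket
import statistics
import time

GATEWAY = "192.168.1.1"   # placeholder: the router under test
PORT = 80                 # placeholder: its web UI, or any listening TCP port
SAMPLES = 100

def connect_time_ms() -> float:
    """Time a single TCP handshake to the router as a crude latency probe."""
    start = time.perf_counter()
    with socket.create_connection((GATEWAY, PORT), timeout=2):
        pass
    return (time.perf_counter() - start) * 1000

samples = []
for _ in range(SAMPLES):
    try:
        samples.append(connect_time_ms())
    except OSError:
        samples.append(float("inf"))   # count a refused/timed-out probe as a spike
    time.sleep(0.2)

finite = [s for s in samples if s != float("inf")]
print(f"median {statistics.median(finite):.1f} ms, "
      f"p95 {statistics.quantiles(finite, n=20)[-1]:.1f} ms, "
      f"failures {samples.count(float('inf'))}")
```

Comparing the idle numbers against the loaded numbers is what shows the spike, not either run on its own.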
I am just trying to better understand this stuff, so I have to ask: wouldn’t seeing how long it takes a client device to accomplish a certain task be better than just glancing at the average Mbps in a graph? Because that’s what most publications are showing.
Those “lag spikes” are almost always a result of content servers, load balancing, and possibly even client system resources. Or even just a crappy modem.
A quick overview of an average LAN is:
A “Router” generally handles everything up to the Modem. For a consumer/household, this is fine. Because no matter how many streams of Frasier you have running on your desktop, you aren’t actually generating that much traffic. Even a server isn’t going to generate that much traffic (unless you are speccing it out specifically with multiple NICs and so forth). Hell, your OS is more likely to fall over before you make any decent Router break a sweat. Your crappy Netgear can likely handle a LOT more than the god awful piece of crap modem Comcast rents out.
The difference between consumer and enterprise is how many computers are involved. Because yes, if you add enough clients you can stress things. But in that case, we are talking closer to hundreds of clients than three or four kids who are on their phone and their tablet at the same time. Which, again, you are either a network sicko building a bespoke solution or you just pay the Ubiquiti tax.
(And, as an aside, it usually isn’t even the network hardware that falls over in hotels. It is their captive portal. And it very much violates a lot of the terms of staying at a hotel and can be considered “hacking” for legal reasons but… if you know the trick to forcing a reboot you can usually fix the network for the entire hotel for the next couple days).
And if we then consider the WAN (internet), the usual path these days is to then connect to a Content Delivery Network (CDN) that is effectively a bunch of small relatively local servers that mirror other parts of the internet. Cloudflare is probably the most famous. And a lot of those “Prove you are a human” checks are kind of masking the fetching of data (which doubles as a way to protect against DDOS attacks). This is almost definitely where those “lag spikes” came from, not your hardware.
That is an incredibly user specific review. That doesn’t necessarily make it bad, but “how fast can I download an episode of Frasier” provides a subset of the amount of information you get from “what speeds does this router support?”
But it sounds like what you want is a “review” of a full network (hardware) stack. And… that, again, is not something you can get online. That is what you literally pay someone to come over and check out your building for. Because your wifi? That is going to be heavily impacted by where you place the access point, what you have in your walls, etc. Same with your modem (almost always a piece of crap) and even how many times the Comcast tech spliced off your coax before it even gets to the box.
Because, for a router? What matters is a controlled-ish environment and then how many packets it can process per second. And just measuring average speed over a large file transfer is probably the best way to get that as it normalizes all the CDN and stack shenanigans.
Adding “more clients” mostly just lets you find out how your traffic is being routed to said CDN while not providing much more data than just a sustained high speed transfer would.
In case it is not obvious, I am definitely a home networking sicko who decided an enterprise-level solution was the cost-effective way to set up a mesh network for wifi coverage (and… it will come out cheaper when I upgrade my access points in a few years). I’ve never found a need to test “number of clients” because I know that even with my complete mess of a DHCP table, it is nothing. What I do do is local transfers of large files between clients on different setups. So I will connect my laptop to the wifi and download some files from my NAS. Then I’ll do the same with a wired connection to a few different switches. And then I’ll just have Steam download a game or two to make sure my modem isn’t a piece of crap.
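If it helps picture it, the “test” is literally just timing a big pull from the NAS on each link and comparing the rate; something like this, where the URL is a stand-in for whatever multi-gigabyte file is already sitting on the NAS and served over HTTP:

```python
import time
import urllib.request

# Hypothetical: a large file the NAS exposes over HTTP.
URL = "http://192.168.1.20/share/big_test_file.bin"
CHUNK = 1024 * 1024  # read 1 MiB at a time

start = time.perf_counter()
total = 0
with urllib.request.urlopen(URL) as resp:
    while chunk := resp.read(CHUNK):
        total += len(chunk)
elapsed = time.perf_counter() - start

print(f"{total / 1e6:.0f} MB in {elapsed:.1f} s -> "
      f"{total * 8 / elapsed / 1e6:.0f} Mbit/s")
```

Run it once over wifi and once per switch and the differences are obvious without any fancy tooling.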
You’re talking about real-world scenarios, but I am just trying to get a simulation that resembles general real-life conditions. So no CDN, and the modem and even the ISP don’t matter in this particular scenario. You have written a phenomenal response, so I am sorry to ask for more of your time, but please check out this article: smallnetbuilder.com/wireless/wireless-reviews/2x2-ac-access-point-roundup-part-2/ This is pretty much what I am trying to accomplish, and it does seem that the APs can be stressed by fewer client devices than expected. Or maybe, again, there’s something that I am missing.
What you linked to is literally someone doing the kind of survey you pay a professional for (or do yourself). It is multiple clients running literal stress tests. Because yes, those are designed to represent website requests… except they are done near constantly for five minutes. That will never happen in the real world, between caching of resources and people generally wanting to at least look at a web page before loading the next one. And it mostly boils down to “packets per second”, but in a way that provides much less data in terms of what was actually being tested. It is simulating an enterprise network load in a manner that is very prone to quirks of the hardware (they even mention their wifi dongles weren’t properly supported in Linux) while drawing conclusions that are actually pretty suspect (the idea of needing to refresh the page because of errors CAN happen, but it is generally unlikely due to cached resources and the resiliency of codecs for media streaming. Most of the time, those “the page didn’t load right” moments are the CDN).
Same with the roaming tests and the like. Yes, it is nice looking data but mostly it boils down to being INCREDIBLY situational and, honestly, not useful unless you live in that dude’s office.
I don’t know that site very well. But, to me, this looks like a lot of data spam that can be summarized as “If you are dealing with enterprise level traffic, get an enterprise solution” while also having a LOT of affiliate links to buy the hardware.
In a lot of ways, this reminds me of the computer hardware review channels. The better ones just play a suite of games and give you data from that, because that conveys most of the useful information while being a realistic scenario. Gamers Nexus deserves an extra shout-out (as they almost always do) for actually explaining why each game was used and reminding people of things like “Hitman 3 is a good bloatware test” because of the quirks of those games. But then there are the ones who flood the consumer with nonsense data, because it overloads their brains while arriving at the same conclusion and sounding more “authoritative”. It is one of the reasons I actually love that when JayzTwoCents does a stress test, they repeatedly emphasize “This will never happen to your computer in reality. We are doing this to stress test our cooling solution”.
In fact, I would go so far as to say that reviews like this becoming ubiquitous would actually make the product space worse. We have already seen it happen. When people started discovering that mesh networks exist, there was a lot of interest. And many tech channels (I don’t want to JUST call out LTT but… I am gonna call out LTT because they always do this bullshit) reviewed enterprise equipment, particularly Ubiquiti. And that more or less led to the idea that you either buy a shitty Netgear router for your dorm or you buy an enterprise solution. Which means there is no product that is good for 95% of consumers anymore. You either have trash or really expensive overkill (although, I AM a fan of TP-Link’s Omada approach as that is very much built out of consumer grade hardware at the low end). Because nobody needs feature X if they aren’t running a hotel but… are you really going to buy something that scores lower because it doesn’t have it?
I understand perfectly where you’re coming from, and I would love to find some way to objectively test wireless networking hardware that can be easily replicated by pretty much any other site. The Octoscope tools (now assimilated by Spirent) may be the closest, since we get to put the wireless AP/router in a box and then simulate the conditions we want. But for those who don’t have fat wallets, I guess these open-source tools are good enough, even if the results are heavily subjective. I know that people don’t like “good enough”, but at this point in time, even something on the fine edge between realistic and unrealistic is better than nothing.
And funny thing, WiFi 6 is not really that much better than WiFi 5, unless some very specific conditions are met - I’ve seen it in testing. So yes, I know what hype and advertising can do…
Gamers Nexus is my favorite when it comes to PC hardware as well, and I would love to see them take a crack at testing wireless networking hardware. Who knows, maybe they’ll create a standard for testing these non-enterprise wireless routers.
Well, I have a mini PC running a VyOS router, and the only time I’ve seen it even break a sweat is when I have to run OpenVPN or WireGuard with lots of throughput, which is probably because of the encryption involved.
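The way I convinced myself it was the encryption was just watching CPU while pushing traffic through the tunnel; roughly this (needs psutil installed, and it only shows the correlation, not proof that crypto is the bottleneck):

```python
import psutil

# Sample per-core CPU use once a second while an iperf3 run (or just a big
# download) is pushing traffic through the WireGuard/OpenVPN tunnel.
# If one core pins near 100% while the rest idle, single-threaded crypto
# is the likely ceiling rather than the routing itself.
DURATION = 30  # seconds of sampling

for _ in range(DURATION):
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    print(f"busiest core {max(per_core):5.1f}%   all cores {per_core}")
```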
I’m curious how the wireless access point part of the network works. I have no problem saturating my bandwidth on wired connections, but on wireless I do get choking when, say, 3-4 devices try to stream 4K.
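My working theory, with completely hand-wavy numbers just to illustrate the airtime sharing I suspect is at play (the per-client rates below are assumptions, not measurements):

```python
# Wi-Fi is half-duplex: all clients on a band share one pool of airtime, and a
# client syncing at a low effective rate burns far more airtime per bit than a
# fast one. The rates here are made up for illustration only.
demand_mbps = 25                       # one 4K stream per client
clients = {                            # assumed effective (goodput) rate per client
    "tv_near_ap": 300,
    "tablet_same_room": 200,
    "laptop_one_wall": 120,
    "phone_far_corner": 40,
}

# Fraction of total airtime each client needs to sustain 25 Mbit/s at its rate.
airtime = {name: demand_mbps / rate for name, rate in clients.items()}
total = sum(airtime.values())

for name, share in airtime.items():
    print(f"{name:18s} needs {share:.0%} of airtime")
print(f"total airtime needed: {total:.0%}  "
      f"({'OK' if total < 1 else 'over 100% -> everyone stutters'})")
```

If that model is roughly right, the one device in the far corner eats most of the airtime and everybody stutters, even though the raw demand looks small on paper.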