PDF Report
Objective
The Candela WiFi Capacity test measures the performance of an Access
Point while it handles increasing numbers of WiFi stations. The test
lets the user grow the station count in user-defined steps for each
iteration and measures per-station and overall throughput for each
trial. In addition to throughput, it records client connection times,
fairness, % packet loss, DHCP times, and more. The AP is expected to
handle several stations (within the limits of its specifications) while
giving all stations a fair share of airtime in both the upstream and
downstream directions. An AP that scales well will not show a
significant overall throughput decrease as more stations are added.
The Realtime Graph shows the summary download and upload RX goodput rate of the
connections created by this test. Goodput does not include Ethernet, IP, or UDP/TCP header overhead.
CSV Data for Realtime Throughput
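Because goodput excludes header overhead, it is always somewhat below the on-the-wire rate. A minimal sketch of that relationship for UDP/IPv4 over Ethernet (standard header sizes; the 1472-byte payload is an illustrative full-MTU choice, not a value from this report):

```python
# Sketch: fraction of the Ethernet wire rate that counts as goodput
# for a UDP/IPv4 flow. Header sizes are the standard ones.
ETH_HDR_FCS = 18   # 14-byte Ethernet header + 4-byte FCS
IP_HDR = 20        # IPv4 header, no options
UDP_HDR = 8        # UDP header

def goodput_fraction(payload_bytes: int) -> float:
    """Payload bytes divided by total frame bytes (headers excluded from goodput)."""
    frame = payload_bytes + UDP_HDR + IP_HDR + ETH_HDR_FCS
    return payload_bytes / frame

# A full-size UDP payload on a 1500-byte MTU link:
print(round(goodput_fraction(1472), 4))  # → 0.9697
```

Smaller payloads pay proportionally more overhead, which is one reason the AUTO payload size used in this test matters for the observed rates.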
Total megabits per second transferred. This counts only the protocol payload, so it does not count the Ethernet, IP, UDP, TCP, or other header overhead. A well-behaving system will show about the same rate as stations increase. If the rate decreases significantly as stations increase, then it is not scaling well.
CSV Data for Total Mbps Received vs Number of Stations Active
Text Data for Mbps Upload/Download
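The "does not scale well" criterion above can be checked mechanically against the per-iteration totals. A hedged sketch (the 10% tolerance is an arbitrary choice, and the input pairs are the `All Cx` totals from the first four iterations of this report):

```python
# Sketch: flag poor scaling from (station_count, total_mbps) samples,
# e.g. parsed from the per-iteration CSV. The 10% tolerance is arbitrary.
def scales_well(samples: list[tuple[int, float]], tolerance: float = 0.10) -> bool:
    """True if total throughput never drops more than `tolerance`
    below the best total seen at a lower station count."""
    best = 0.0
    for _, mbps in sorted(samples):
        if best and mbps < best * (1 - tolerance):
            return False
        best = max(best, mbps)
    return True

# All-Cx totals from this report's first iterations (Mbps):
report = [(1, 936.884), (2, 895.429), (3, 880.012), (4, 809.855)]
print(scales_well(report))  # → False (the 4-station total is >10% below the 1-station peak)
```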
Protocol Data Units (PDUs) received. For TCP this does not mean much, but for UDP connections it correlates to packet size. If the PDU size is larger than what fits into a single frame, the network stack will segment it accordingly. A well-behaving system will show about the same rate as stations increase. If the rate decreases significantly as stations increase, then it is not scaling well.
CSV Data for Total PDU/s Received vs Number of Stations Active
Text Data for Pps Upload/Download
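For UDP, the PDU/s figure is directly tied to the offered bit rate and the PDU size. A small sketch of that relationship (the 1472-byte PDU size is an illustrative assumption; this test ran with AUTO payload sizes):

```python
# Sketch: expected UDP PDUs per second for a given goodput and PDU size.
def expected_pps(rate_bps: float, pdu_bytes: int) -> float:
    """PDUs per second = bits per second / bits per PDU."""
    return rate_bps / (pdu_bytes * 8)

# 936.884 Mbps of goodput at a hypothetical 1472-byte PDU size:
print(round(expected_pps(936.884e6, 1472)))  # → 79559
```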
Station disconnect stats. These are reported only for the last iteration.
If the 'Clear Reset Counters' option is selected,
the stats are cleared after the initial association, so any re-connects reported
indicate a potential stability issue.
This can be used for long-term stability testing in cases where you bring up all
stations in one iteration and then run the test for a longer duration.
CSV Data for Port Reset Totals
Station connect time is calculated from the initial Authenticate message through the completion of Open or RSN association/authentication.
CSV Data for Station Connect Times
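The connect-time window described above runs from the first Authenticate frame through the end of association/authentication. A sketch with hypothetical event timestamps (the event names and values are illustrative, not from this report):

```python
# Sketch: station connect time from event timestamps (hypothetical values).
# LANforge measures from the initial Authenticate frame through the end of
# Open or RSN association/authentication.
events = {
    "authenticate": 0.000,   # first Authenticate frame (seconds)
    "associated":   0.012,   # association response accepted
    "rsn_complete": 0.047,   # 4-way handshake done (RSN networks)
}
connect_time_ms = (events["rsn_complete"] - events["authenticate"]) * 1000
print(f"{connect_time_ms:.0f} ms")  # → 47 ms
```

For Open networks the window would end at `associated` instead of `rsn_complete`.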
Wifi-Capacity Test requested values
Station Increment:
|
1,2
|
Loop Iterations:
|
Single (1)
|
Duration:
|
1 min (1 m)
|
Layer 4-7 Endpoint:
|
NONE
|
MSS
|
AUTO
|
Total Download Rate:
|
OC48 (2.488 Gbps)
|
Total Upload Rate:
|
Zero (0 bps)
|
Protocol:
|
UDP-IPv4
|
Payload Size:
|
AUTO
|
Socket buffer size:
|
OS Default
|
IP ToS:
|
Best Effort (0)
|
Multi-Conn:
|
AUTO
|
Set Bursty Minimum Speed:
|
Burst Mode Disabled (-1)
|
Percentage TCP Rate:
|
10% (10%)
|
Randomize Rates
|
true
|
Leave Ports Up
|
false
|
Settle Time:
|
5 sec (5 s)
|
Rpt Timer:
|
fast (1 s)
|
Show-Per-Iteration-Charts
|
true
|
Show-Per-Loop-Totals
|
true
|
Hunt-Lower-Rates
|
false
|
Show Events
|
true
|
Clear Reset Counters
|
false
|
CSV Reporting Dir
|
- not selected -
|
Build Date
|
Mon Oct 31 08:38:27 PDT 2022
|
Build Version
|
5.4.6
|
Git Version
|
cc4e25f4013da941141702ada4944bd17adb5c5c
|
Ports
|
1.1.eth1 1.1.sta1400 1.1.sta14009 1.1.sta1401 1.1.sta14010
1.1.sta14011 1.1.sta14012 1.1.sta14013 1.1.sta14014 1.1.sta14015
1.1.sta14016 1.1.sta14017 1.1.sta1402 1.1.sta1403 1.1.sta1404
1.1.sta1405 1.1.sta1406 1.1.sta1407 1.1.sta1408 1.1.wlan14
|
Firmware
|
0.6-1
|
Machines
|
ct523c-0b29
|
Requested Parameters:
Download Rate:
|
Per station:
|
2488000000 (2.488 Gbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
1
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Rate:
Download Rate:
|
Cx Min:
|
936.884 Mbps
|
Cx Ave:
|
936.884 Mbps
|
Cx Max:
|
936.884 Mbps
|
All Cx:
|
936.884 Mbps
|
Upload Rate:
|
Cx Min:
|
0 bps
|
Cx Ave:
|
0 bps
|
Cx Max:
|
0 bps
|
All Cx:
|
0 bps
|
Total:
|
936.884 Mbps
|
Aggregated Rate:
|
Min:
|
936.884 Mbps
|
Avg:
|
936.884 Mbps
|
Max:
|
936.884 Mbps
|
This graph shows fairness. On a fair system, each station should get about the same throughput. In the download direction, it is mostly the device-under-test that is responsible for this behavior, but in the upload direction, LANforge itself would be the source of most fairness issues unless the device-under-test takes specific actions to ensure fairness.
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
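Fairness can also be quantified with Jain's fairness index, which is 1.0 when every station gets an identical rate and falls toward 1/n as the shares diverge. This index is not part of the report itself; the sketch below applies it to the two-station per-connection rates observed later in this run:

```python
# Sketch: Jain's fairness index over per-station rates; 1.0 is perfectly fair.
def jain_index(rates: list[float]) -> float:
    n = len(rates)
    return sum(rates) ** 2 / (n * sum(r * r for r in rates))

# Cx Min / Cx Max from this report's two-station iteration (Mbps):
print(round(jain_index([436.189, 459.24]), 4))  # → 0.9993
```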
Requested Parameters:
Download Rate:
|
Per station:
|
2488000000 (2.488 Gbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
1
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Amount:
Download Amount:
|
Cx Min:
|
6.588 GB
|
Cx Ave:
|
6.588 GB
|
Cx Max:
|
6.588 GB
|
All Cx:
|
6.588 GB
|
Upload Amount:
|
Cx Min:
|
0 B
|
Cx Ave:
|
0 B
|
Cx Max:
|
0 B
|
All Cx:
|
0 B
|
Total:
|
6.588 GB
|
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
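In the following iterations, the requested per-station download rate is the total OC48 rate divided by the station count; integer division appears to reproduce the report's values exactly (an inference from the numbers shown, not documented behavior):

```python
# Sketch: per-station requested rate = total requested rate / station count.
# Integer division matches the per-station values printed in this report.
TOTAL_BPS = 2_488_000_000  # OC48 (2.488 Gbps)

for n in (2, 3, 7):
    print(n, TOTAL_BPS // n)
# → 2 1244000000
#   3 829333333
#   7 355428571
```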
Requested Parameters:
Download Rate:
|
Per station:
|
1244000000 (1.244 Gbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
2
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Rate:
Download Rate:
|
Cx Min:
|
436.189 Mbps
|
Cx Ave:
|
447.714 Mbps
|
Cx Max:
|
459.24 Mbps
|
All Cx:
|
895.429 Mbps
|
Upload Rate:
|
Cx Min:
|
0 bps
|
Cx Ave:
|
0 bps
|
Cx Max:
|
0 bps
|
All Cx:
|
0 bps
|
Total:
|
895.429 Mbps
|
Aggregated Rate:
|
Min:
|
436.189 Mbps
|
Avg:
|
447.714 Mbps
|
Max:
|
459.24 Mbps
|
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
1244000000 (1.244 Gbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
2
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Amount:
Download Amount:
|
Cx Min:
|
3.074 GB
|
Cx Ave:
|
3.155 GB
|
Cx Max:
|
3.236 GB
|
All Cx:
|
6.31 GB
|
Upload Amount:
|
Cx Min:
|
0 B
|
Cx Ave:
|
0 B
|
Cx Max:
|
0 B
|
All Cx:
|
0 B
|
Total:
|
6.31 GB
|
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
829333333 (829.333 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
3
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Rate:
Download Rate:
|
Cx Min:
|
287.517 Mbps
|
Cx Ave:
|
293.337 Mbps
|
Cx Max:
|
303.244 Mbps
|
All Cx:
|
880.012 Mbps
|
Upload Rate:
|
Cx Min:
|
0 bps
|
Cx Ave:
|
0 bps
|
Cx Max:
|
0 bps
|
All Cx:
|
0 bps
|
Total:
|
880.012 Mbps
|
Aggregated Rate:
|
Min:
|
287.517 Mbps
|
Avg:
|
293.337 Mbps
|
Max:
|
303.244 Mbps
|
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
829333333 (829.333 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
3
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Amount:
Download Amount:
|
Cx Min:
|
2.029 GB
|
Cx Ave:
|
2.068 GB
|
Cx Max:
|
2.136 GB
|
All Cx:
|
6.205 GB
|
Upload Amount:
|
Cx Min:
|
0 B
|
Cx Ave:
|
0 B
|
Cx Max:
|
0 B
|
All Cx:
|
0 B
|
Total:
|
6.205 GB
|
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
622000000 ( 622 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
4
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Rate:
Download Rate:
|
Cx Min:
|
157.035 Mbps
|
Cx Ave:
|
202.464 Mbps
|
Cx Max:
|
334.193 Mbps
|
All Cx:
|
809.855 Mbps
|
Upload Rate:
|
Cx Min:
|
0 bps
|
Cx Ave:
|
0 bps
|
Cx Max:
|
0 bps
|
All Cx:
|
0 bps
|
Total:
|
809.855 Mbps
|
Aggregated Rate:
|
Min:
|
157.035 Mbps
|
Avg:
|
202.464 Mbps
|
Max:
|
334.193 Mbps
|
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
622000000 ( 622 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
4
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Amount:
Download Amount:
|
Cx Min:
|
1.105 GB
|
Cx Ave:
|
1.428 GB
|
Cx Max:
|
2.36 GB
|
All Cx:
|
5.71 GB
|
Upload Amount:
|
Cx Min:
|
0 B
|
Cx Ave:
|
0 B
|
Cx Max:
|
0 B
|
All Cx:
|
0 B
|
Total:
|
5.71 GB
|
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
497600000 (497.6 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
5
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Rate:
Download Rate:
|
Cx Min:
|
156.46 Mbps
|
Cx Ave:
|
172.437 Mbps
|
Cx Max:
|
198.364 Mbps
|
All Cx:
|
862.183 Mbps
|
Upload Rate:
|
Cx Min:
|
0 bps
|
Cx Ave:
|
0 bps
|
Cx Max:
|
0 bps
|
All Cx:
|
0 bps
|
Total:
|
862.183 Mbps
|
Aggregated Rate:
|
Min:
|
156.46 Mbps
|
Avg:
|
172.437 Mbps
|
Max:
|
198.364 Mbps
|
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
497600000 (497.6 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
5
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Amount:
Download Amount:
|
Cx Min:
|
1.102 GB
|
Cx Ave:
|
1.214 GB
|
Cx Max:
|
1.394 GB
|
All Cx:
|
6.07 GB
|
Upload Amount:
|
Cx Min:
|
0 B
|
Cx Ave:
|
0 B
|
Cx Max:
|
0 B
|
All Cx:
|
0 B
|
Total:
|
6.07 GB
|
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
414666666 (414.667 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
6
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Rate:
Download Rate:
|
Cx Min:
|
114.841 Mbps
|
Cx Ave:
|
128.258 Mbps
|
Cx Max:
|
153.018 Mbps
|
All Cx:
|
769.548 Mbps
|
Upload Rate:
|
Cx Min:
|
0 bps
|
Cx Ave:
|
0 bps
|
Cx Max:
|
0 bps
|
All Cx:
|
0 bps
|
Total:
|
769.548 Mbps
|
Aggregated Rate:
|
Min:
|
114.841 Mbps
|
Avg:
|
128.258 Mbps
|
Max:
|
153.018 Mbps
|
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
414666666 (414.667 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
6
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Amount:
Download Amount:
|
Cx Min:
|
828.044 MB
|
Cx Ave:
|
924.57 MB
|
Cx Max:
|
1.081 GB
|
All Cx:
|
5.417 GB
|
Upload Amount:
|
Cx Min:
|
0 B
|
Cx Ave:
|
0 B
|
Cx Max:
|
0 B
|
All Cx:
|
0 B
|
Total:
|
5.417 GB
|
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
355428571 (355.429 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
7
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Rate:
Download Rate:
|
Cx Min:
|
108.975 Mbps
|
Cx Ave:
|
113.371 Mbps
|
Cx Max:
|
120.005 Mbps
|
All Cx:
|
793.596 Mbps
|
Upload Rate:
|
Cx Min:
|
0 bps
|
Cx Ave:
|
0 bps
|
Cx Max:
|
0 bps
|
All Cx:
|
0 bps
|
Total:
|
793.596 Mbps
|
Aggregated Rate:
|
Min:
|
108.975 Mbps
|
Avg:
|
113.371 Mbps
|
Max:
|
120.005 Mbps
|
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
355428571 (355.429 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
7
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Amount:
Download Amount:
|
Cx Min:
|
785.919 MB
|
Cx Ave:
|
818.079 MB
|
Cx Max:
|
867.53 MB
|
All Cx:
|
5.592 GB
|
Upload Amount:
|
Cx Min:
|
0 B
|
Cx Ave:
|
0 B
|
Cx Max:
|
0 B
|
All Cx:
|
0 B
|
Total:
|
5.592 GB
|
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
311000000 ( 311 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
8
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Rate:
Download Rate:
|
Cx Min:
|
107.448 Mbps
|
Cx Ave:
|
112.379 Mbps
|
Cx Max:
|
116.559 Mbps
|
All Cx:
|
899.028 Mbps
|
Upload Rate:
|
Cx Min:
|
0 bps
|
Cx Ave:
|
0 bps
|
Cx Max:
|
0 bps
|
All Cx:
|
0 bps
|
Total:
|
899.028 Mbps
|
Aggregated Rate:
|
Min:
|
107.448 Mbps
|
Avg:
|
112.379 Mbps
|
Max:
|
116.559 Mbps
|
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
311000000 ( 311 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
8
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Amount:
Download Amount:
|
Cx Min:
|
771.126 MB
|
Cx Ave:
|
807.544 MB
|
Cx Max:
|
839.235 MB
|
All Cx:
|
6.309 GB
|
Upload Amount:
|
Cx Min:
|
0 B
|
Cx Ave:
|
0 B
|
Cx Max:
|
0 B
|
All Cx:
|
0 B
|
Total:
|
6.309 GB
|
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
276444444 (276.444 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
9
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Rate:
Download Rate:
|
Cx Min:
|
92.062 Mbps
|
Cx Ave:
|
97.72 Mbps
|
Cx Max:
|
114.357 Mbps
|
All Cx:
|
879.477 Mbps
|
Upload Rate:
|
Cx Min:
|
0 bps
|
Cx Ave:
|
0 bps
|
Cx Max:
|
0 bps
|
All Cx:
|
0 bps
|
Total:
|
879.477 Mbps
|
Aggregated Rate:
|
Min:
|
92.062 Mbps
|
Avg:
|
97.72 Mbps
|
Max:
|
114.357 Mbps
|
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
276444444 (276.444 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
9
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Amount:
Download Amount:
|
Cx Min:
|
665.08 MB
|
Cx Ave:
|
704.702 MB
|
Cx Max:
|
825.897 MB
|
All Cx:
|
6.194 GB
|
Upload Amount:
|
Cx Min:
|
0 B
|
Cx Ave:
|
0 B
|
Cx Max:
|
0 B
|
All Cx:
|
0 B
|
Total:
|
6.194 GB
|
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
248800000 (248.8 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
10
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Rate:
Download Rate:
|
Cx Min:
|
73.511 Mbps
|
Cx Ave:
|
78.266 Mbps
|
Cx Max:
|
84.479 Mbps
|
All Cx:
|
782.661 Mbps
|
Upload Rate:
|
Cx Min:
|
0 bps
|
Cx Ave:
|
0 bps
|
Cx Max:
|
0 bps
|
All Cx:
|
0 bps
|
Total:
|
782.661 Mbps
|
Aggregated Rate:
|
Min:
|
73.511 Mbps
|
Avg:
|
78.266 Mbps
|
Max:
|
84.479 Mbps
|
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
248800000 (248.8 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
10
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Amount:
Download Amount:
|
Cx Min:
|
527.769 MB
|
Cx Ave:
|
564.828 MB
|
Cx Max:
|
608.262 MB
|
All Cx:
|
5.516 GB
|
Upload Amount:
|
Cx Min:
|
0 B
|
Cx Ave:
|
0 B
|
Cx Max:
|
0 B
|
All Cx:
|
0 B
|
Total:
|
5.516 GB
|
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
226181818 (226.182 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
11
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Rate:
Download Rate:
|
Cx Min:
|
71.789 Mbps
|
Cx Ave:
|
77.699 Mbps
|
Cx Max:
|
83.533 Mbps
|
All Cx:
|
854.694 Mbps
|
Upload Rate:
|
Cx Min:
|
0 bps
|
Cx Ave:
|
0 bps
|
Cx Max:
|
0 bps
|
All Cx:
|
0 bps
|
Total:
|
854.694 Mbps
|
Aggregated Rate:
|
Min:
|
71.789 Mbps
|
Avg:
|
77.699 Mbps
|
Max:
|
83.533 Mbps
|
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
226181818 (226.182 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
11
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Amount:
Download Amount:
|
Cx Min:
|
514.75 MB
|
Cx Ave:
|
560.208 MB
|
Cx Max:
|
601.378 MB
|
All Cx:
|
6.018 GB
|
Upload Amount:
|
Cx Min:
|
0 B
|
Cx Ave:
|
0 B
|
Cx Max:
|
0 B
|
All Cx:
|
0 B
|
Total:
|
6.018 GB
|
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
207333333 (207.333 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
12
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Rate:
Download Rate:
|
Cx Min:
|
72.87 Mbps
|
Cx Ave:
|
76.546 Mbps
|
Cx Max:
|
78.469 Mbps
|
All Cx:
|
918.548 Mbps
|
Upload Rate:
|
Cx Min:
|
0 bps
|
Cx Ave:
|
0 bps
|
Cx Max:
|
0 bps
|
All Cx:
|
0 bps
|
Total:
|
918.548 Mbps
|
Aggregated Rate:
|
Min:
|
72.87 Mbps
|
Avg:
|
76.546 Mbps
|
Max:
|
78.469 Mbps
|
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
207333333 (207.333 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
12
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Amount:
Download Amount:
|
Cx Min:
|
523.889 MB
|
Cx Ave:
|
550.952 MB
|
Cx Max:
|
564.735 MB
|
All Cx:
|
6.456 GB
|
Upload Amount:
|
Cx Min:
|
0 B
|
Cx Ave:
|
0 B
|
Cx Max:
|
0 B
|
All Cx:
|
0 B
|
Total:
|
6.456 GB
|
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
191384615 (191.385 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
13
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Rate:
Download Rate:
|
Cx Min:
|
41.947 Mbps
|
Cx Ave:
|
53.268 Mbps
|
Cx Max:
|
79.054 Mbps
|
All Cx:
|
692.489 Mbps
|
Upload Rate:
|
Cx Min:
|
0 bps
|
Cx Ave:
|
0 bps
|
Cx Max:
|
0 bps
|
All Cx:
|
0 bps
|
Total:
|
692.489 Mbps
|
Aggregated Rate:
|
Min:
|
41.947 Mbps
|
Avg:
|
53.268 Mbps
|
Max:
|
79.054 Mbps
|
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
191384615 (191.385 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
13
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Amount:
Download Amount:
|
Cx Min:
|
302.978 MB
|
Cx Ave:
|
384.132 MB
|
Cx Max:
|
570.207 MB
|
All Cx:
|
4.877 GB
|
Upload Amount:
|
Cx Min:
|
0 B
|
Cx Ave:
|
0 B
|
Cx Max:
|
0 B
|
All Cx:
|
0 B
|
Total:
|
4.877 GB
|
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
177714285 (177.714 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
14
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Rate:
Download Rate:
|
Cx Min:
|
39.929 Mbps
|
Cx Ave:
|
60.006 Mbps
|
Cx Max:
|
71.534 Mbps
|
All Cx:
|
840.086 Mbps
|
Upload Rate:
|
Cx Min:
|
0 bps
|
Cx Ave:
|
0 bps
|
Cx Max:
|
0 bps
|
All Cx:
|
0 bps
|
Total:
|
840.086 Mbps
|
Aggregated Rate:
|
Min:
|
39.929 Mbps
|
Avg:
|
60.006 Mbps
|
Max:
|
71.534 Mbps
|
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:
|
Per station:
|
177714285 (177.714 Mbps)
|
All:
|
2488000000 (2.488 Gbps)
|
Upload Rate:
|
Per station:
|
0 ( 0 bps)
|
All:
|
0 ( 0 bps)
|
Total:
|
2488000000 (2.488 Gbps)
|
Station count:
|
14
|
Connections per station:
|
1
|
Payload (PDU) sizes:
|
AUTO (AUTO)
|
Observed Amount:
Download Amount:
|
Cx Min:
|
288.209 MB
|
Cx Ave:
|
432.799 MB
|
Cx Max:
|
516.646 MB
|
All Cx:
|
5.917 GB
|
Upload Amount:
|
Cx Min:
|
0 B
|
Cx Ave:
|
0 B
|
Cx Max:
|
0 B
|
All Cx:
|
0 B
|
Total:
|
5.917 GB
|
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:   Per station: 165866666 (165.867 Mbps) | All: 2488000000 (2.488 Gbps)
Upload Rate:     Per station: 0 (0 bps) | All: 0 (0 bps)
Total:           2488000000 (2.488 Gbps)
Station count:   15
Connections per station: 1
Payload (PDU) sizes: AUTO (AUTO)
Observed Rate:
Download Rate:   Cx Min: 35.914 Mbps | Cx Ave: 55.215 Mbps | Cx Max: 63.767 Mbps | All Cx: 828.22 Mbps
Upload Rate:     Cx Min: 0 bps | Cx Ave: 0 bps | Cx Max: 0 bps | All Cx: 0 bps
Total:           828.22 Mbps
Aggregated Rate: Min: 35.914 Mbps | Avg: 55.215 Mbps | Max: 63.767 Mbps
This graph shows fairness. On a fair system, each station should get about the same throughput. In the download direction, the device-under-test is mostly responsible for this behavior; in the upload direction, LANforge itself would be the source of most fairness issues unless the device-under-test takes specific actions to ensure fairness.
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:   Per station: 165866666 (165.867 Mbps) | All: 2488000000 (2.488 Gbps)
Upload Rate:     Per station: 0 (0 bps) | All: 0 (0 bps)
Total:           2488000000 (2.488 Gbps)
Station count:   15
Connections per station: 1
Payload (PDU) sizes: AUTO (AUTO)
Observed Amount:
Download Amount: Cx Min: 258.377 MB | Cx Ave: 397.814 MB | Cx Max: 458.693 MB | All Cx: 5.827 GB
Upload Amount:   Cx Min: 0 B | Cx Ave: 0 B | Cx Max: 0 B | All Cx: 0 B
Total:           5.827 GB
This graph shows fairness. On a fair system, each station should get about the same throughput. In the download direction, the device-under-test is mostly responsible for this behavior; in the upload direction, LANforge itself would be the source of most fairness issues unless the device-under-test takes specific actions to ensure fairness.
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:   Per station: 155500000 (155.5 Mbps) | All: 2488000000 (2.488 Gbps)
Upload Rate:     Per station: 0 (0 bps) | All: 0 (0 bps)
Total:           2488000000 (2.488 Gbps)
Station count:   16
Connections per station: 1
Payload (PDU) sizes: AUTO (AUTO)
Observed Rate:
Download Rate:   Cx Min: 34.307 Mbps | Cx Ave: 52.387 Mbps | Cx Max: 60.459 Mbps | All Cx: 838.193 Mbps
Upload Rate:     Cx Min: 0 bps | Cx Ave: 0 bps | Cx Max: 0 bps | All Cx: 0 bps
Total:           838.193 Mbps
Aggregated Rate: Min: 34.307 Mbps | Avg: 52.387 Mbps | Max: 60.459 Mbps
This graph shows fairness. On a fair system, each station should get about the same throughput. In the download direction, the device-under-test is mostly responsible for this behavior; in the upload direction, LANforge itself would be the source of most fairness issues unless the device-under-test takes specific actions to ensure fairness.
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:   Per station: 155500000 (155.5 Mbps) | All: 2488000000 (2.488 Gbps)
Upload Rate:     Per station: 0 (0 bps) | All: 0 (0 bps)
Total:           2488000000 (2.488 Gbps)
Station count:   16
Connections per station: 1
Payload (PDU) sizes: AUTO (AUTO)
Observed Amount:
Download Amount: Cx Min: 247.149 MB | Cx Ave: 377.403 MB | Cx Max: 436.471 MB | All Cx: 5.897 GB
Upload Amount:   Cx Min: 0 B | Cx Ave: 0 B | Cx Max: 0 B | All Cx: 0 B
Total:           5.897 GB
This graph shows fairness. On a fair system, each station should get about the same throughput. In the download direction, the device-under-test is mostly responsible for this behavior; in the upload direction, LANforge itself would be the source of most fairness issues unless the device-under-test takes specific actions to ensure fairness.
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:   Per station: 146352941 (146.353 Mbps) | All: 2488000000 (2.488 Gbps)
Upload Rate:     Per station: 0 (0 bps) | All: 0 (0 bps)
Total:           2488000000 (2.488 Gbps)
Station count:   17
Connections per station: 1
Payload (PDU) sizes: AUTO (AUTO)
Observed Rate:
Download Rate:   Cx Min: 31.845 Mbps | Cx Ave: 48.752 Mbps | Cx Max: 57.209 Mbps | All Cx: 828.787 Mbps
Upload Rate:     Cx Min: 0 bps | Cx Ave: 0 bps | Cx Max: 0 bps | All Cx: 0 bps
Total:           828.787 Mbps
Aggregated Rate: Min: 31.845 Mbps | Avg: 48.752 Mbps | Max: 57.209 Mbps
This graph shows fairness. On a fair system, each station should get about the same throughput. In the download direction, the device-under-test is mostly responsible for this behavior; in the upload direction, LANforge itself would be the source of most fairness issues unless the device-under-test takes specific actions to ensure fairness.
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:   Per station: 146352941 (146.353 Mbps) | All: 2488000000 (2.488 Gbps)
Upload Rate:     Per station: 0 (0 bps) | All: 0 (0 bps)
Total:           2488000000 (2.488 Gbps)
Station count:   17
Connections per station: 1
Payload (PDU) sizes: AUTO (AUTO)
Observed Amount:
Download Amount: Cx Min: 229.594 MB | Cx Ave: 351.387 MB | Cx Max: 411.743 MB | All Cx: 5.834 GB
Upload Amount:   Cx Min: 0 B | Cx Ave: 0 B | Cx Max: 0 B | All Cx: 0 B
Total:           5.834 GB
This graph shows fairness. On a fair system, each station should get about the same throughput. In the download direction, the device-under-test is mostly responsible for this behavior; in the upload direction, LANforge itself would be the source of most fairness issues unless the device-under-test takes specific actions to ensure fairness.
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:   Per station: 138222222 (138.222 Mbps) | All: 2488000000 (2.488 Gbps)
Upload Rate:     Per station: 0 (0 bps) | All: 0 (0 bps)
Total:           2488000000 (2.488 Gbps)
Station count:   18
Connections per station: 1
Payload (PDU) sizes: AUTO (AUTO)
Observed Rate:
Download Rate:   Cx Min: 29.748 Mbps | Cx Ave: 48.423 Mbps | Cx Max: 52.501 Mbps | All Cx: 871.61 Mbps
Upload Rate:     Cx Min: 0 bps | Cx Ave: 0 bps | Cx Max: 0 bps | All Cx: 0 bps
Total:           871.61 Mbps
Aggregated Rate: Min: 29.748 Mbps | Avg: 48.423 Mbps | Max: 52.501 Mbps
This graph shows fairness. On a fair system, each station should get about the same throughput. In the download direction, the device-under-test is mostly responsible for this behavior; in the upload direction, LANforge itself would be the source of most fairness issues unless the device-under-test takes specific actions to ensure fairness.
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:   Per station: 138222222 (138.222 Mbps) | All: 2488000000 (2.488 Gbps)
Upload Rate:     Per station: 0 (0 bps) | All: 0 (0 bps)
Total:           2488000000 (2.488 Gbps)
Station count:   18
Connections per station: 1
Payload (PDU) sizes: AUTO (AUTO)
Observed Amount:
Download Amount: Cx Min: 214.756 MB | Cx Ave: 349.356 MB | Cx Max: 378.495 MB | All Cx: 6.141 GB
Upload Amount:   Cx Min: 0 B | Cx Ave: 0 B | Cx Max: 0 B | All Cx: 0 B
Total:           6.141 GB
This graph shows fairness. On a fair system, each station should get about the same throughput. In the download direction, the device-under-test is mostly responsible for this behavior; in the upload direction, LANforge itself would be the source of most fairness issues unless the device-under-test takes specific actions to ensure fairness.
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Requested Parameters:
Download Rate:   Per station: 130947368 (130.947 Mbps) | All: 2488000000 (2.488 Gbps)
Upload Rate:     Per station: 0 (0 bps) | All: 0 (0 bps)
Total:           2488000000 (2.488 Gbps)
Station count:   19
Connections per station: 1
Payload (PDU) sizes: AUTO (AUTO)
Observed Rate:
Download Rate:   Cx Min: 18.279 Mbps | Cx Ave: 36.241 Mbps | Cx Max: 51.348 Mbps | All Cx: 688.58 Mbps
Upload Rate:     Cx Min: 0 bps | Cx Ave: 0 bps | Cx Max: 0 bps | All Cx: 0 bps
Total:           688.58 Mbps
Aggregated Rate: Min: 18.279 Mbps | Avg: 36.241 Mbps | Max: 51.348 Mbps
This graph shows fairness. On a fair system, each station should get about the same throughput. In the download direction, the device-under-test is mostly responsible for this behavior; in the upload direction, LANforge itself would be the source of most fairness issues unless the device-under-test takes specific actions to ensure fairness.
CSV Data for Combined Mbps, 60 second running average
Text Data for Graph
Requested Parameters:
Download Rate:   Per station: 130947368 (130.947 Mbps) | All: 2488000000 (2.488 Gbps)
Upload Rate:     Per station: 0 (0 bps) | All: 0 (0 bps)
Total:           2488000000 (2.488 Gbps)
Station count:   19
Connections per station: 1
Payload (PDU) sizes: AUTO (AUTO)
Observed Amount:
Download Amount: Cx Min: 131.811 MB | Cx Ave: 261.74 MB | Cx Max: 371.442 MB | All Cx: 4.857 GB
Upload Amount:   Cx Min: 0 B | Cx Ave: 0 B | Cx Max: 0 B | All Cx: 0 B
Total:           4.857 GB
This graph shows fairness. On a fair system, each station should get about the same throughput. In the download direction, the device-under-test is mostly responsible for this behavior; in the upload direction, LANforge itself would be the source of most fairness issues unless the device-under-test takes specific actions to ensure fairness.
CSV Data for Combined Received Megabytes, for entire 1 m run
Text Data for Graph
Maximum Stations Connected: 19
Stations NOT connected at this time: 0
Maximum Stations with IP Address: 19
Stations without IP at this time: 0
CSV Data for Station Maximums
RF stats give an indication of how congested the RF environment is. Channel activity is what the WiFi radio reports as the busy-time of the RF environment. This is expected to be near 100% when LANforge is running at maximum speed; at lower speeds it should be a lower percentage, unless the RF environment is busy with other systems.
CSV Data for RF Stats for Stations
RX-Signal and Activity Data
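On Linux, channel busy-time of this kind is typically read from the radio's survey counters (for example, `iw dev <dev> survey dump` reports "channel active time" and "channel busy time" in milliseconds). A hypothetical helper, not part of the report tooling, that turns those two counters into the percentage discussed above:

```python
def channel_busy_percent(active_ms, busy_ms):
    """Percent of observed airtime the radio reported the channel as busy.

    active_ms / busy_ms are cumulative counters such as the
    'channel active time' / 'channel busy time' fields from a
    wireless survey; pass deltas between two samples to get the
    busy fraction over an interval.
    """
    if active_ms <= 0:
        raise ValueError("active_ms must be positive")
    return 100.0 * busy_ms / active_ms
```

With deltas of, say, 2000 ms active and 1900 ms busy, this reports 95% channel activity, consistent with a near-saturated run.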
Link rate stats give an indication of how well rate-control is working. The 'RX' link rate corresponds to what the device-under-test is transmitting. If all of the stations are on the same radio, the TX and RX encoding rates should be similar for all stations. If there is a definite pattern where some stations do not get a good RX rate, the device-under-test likely has rate-control problems. The TX rate is what LANforge is transmitting at.
CSV Data for Link Rate for Stations
TX/RX Link Rate Data
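One way to spot the "some stations do not get a good RX rate" pattern described above is to compare each station's RX link rate against the group median. A hypothetical sketch (the station names and 0.5 threshold are illustrative, not from the report):

```python
from statistics import median

def low_rx_rate_stations(rx_mbps, factor=0.5):
    """Return stations whose RX link rate (what the DUT transmits at)
    falls below `factor` times the median across all stations --
    a possible hint of rate-control trouble on the device-under-test.

    rx_mbps: mapping of station name -> reported RX link rate in Mbps.
    """
    med = median(rx_mbps.values())
    return sorted(sta for sta, rate in rx_mbps.items() if rate < factor * med)
```

Running this over the per-station link-rate CSV would flag outliers worth checking against their RX-signal levels before blaming the DUT's rate control.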
Key Performance Indicators CSV
Scan Results for SSIDs used in this test.
BSS 00:0a:52:38:84:1b(on wlan14)
last seen: 8453.525s [boottime]
TSF: 8467106820 usec (0d, 02:21:07)
freq: 5955
beacon interval: 240 TUs
capability: ESS Privacy (0x0011)
signal: -73.00 dBm
last seen: 9 ms ago
Information elements from Probe Response frame:
SSID: ben-7916-6g
Supported rates: 6.0* 9.0 12.0* 18.0 24.0* 36.0 48.0 54.0
DS Parameter set: channel 1
RSN: * Version: 1
* Group cipher: CCMP
* Pairwise ciphers: CCMP
* Authentication suites: SAE
* Capabilities: 16-PTKSA-RC 1-GTKSA-RC MFP-required MFP-capable (0x00cc)
Supported operating classes:
* current operating class: 134
Extended capabilities:
* Extended Channel Switching
* Multiple BSSID
* SSID List
* Operating Mode Notification
* 6
Transmit Power Envelope:
Transmit Power Envelope:
HE capabilities:
HE MAC Capabilities (0x00051a081044):
+HTC HE Supported
TWT Responder
BSR
OM Control
Maximum A-MPDU Length Exponent: 3
BQR
A-MSDU in A-MPDU
OM Control UL MU Data Disable RX
HE PHY Capabilities: (0x0c20ce126f09afc8000c00):
HE40/HE80/5GHz
HE160/5GHz
LDPC Coding in Payload
NDP with 4x HE-LTF and 3.2us GI
STBC Tx <= 80MHz
STBC Rx <= 80MHz
Full Bandwidth UL MU-MIMO
Partial Bandwidth UL MU-MIMO
DCM Max Constellation: 2
DCM Max Constellation Rx: 2
SU Beamformee
MU Beamformer
Beamformee STS <= 80Mhz: 3
Beamformee STS > 80Mhz: 3
Sounding Dimensions <= 80Mhz: 1
Sounding Dimensions > 80Mhz: 1
Codebook Size SU Feedback
Codebook Size MU Feedback
Triggered SU Beamforming Feedback
Triggered MU Beamforming Feedback
Partial Bandwidth Extended Range
PPE Threshold Present
Max NC: 1
STBC Tx > 80MHz
STBC Rx > 80MHz
TX 1024-QAM
RX 1024-QAM
HE RX MCS and NSS set <= 80 MHz
1 streams: MCS 0-11
2 streams: MCS 0-11
3 streams: not supported
4 streams: not supported
5 streams: not supported
6 streams: not supported
7 streams: not supported
8 streams: not supported
HE TX MCS and NSS set <= 80 MHz
1 streams: MCS 0-11
2 streams: MCS 0-11
3 streams: not supported
4 streams: not supported
5 streams: not supported
6 streams: not supported
7 streams: not supported
8 streams: not supported
HE RX MCS and NSS set 160 MHz
1 streams: MCS 0-11
2 streams: MCS 0-11
3 streams: not supported
4 streams: not supported
5 streams: not supported
6 streams: not supported
7 streams: not supported
8 streams: not supported
HE TX MCS and NSS set 160 MHz
1 streams: MCS 0-11
2 streams: MCS 0-11
3 streams: not supported
4 streams: not supported
5 streams: not supported
6 streams: not supported
7 streams: not supported
8 streams: not supported
PPE Threshold 0x39 0x1c 0xc7 0x71 0x1c 0x07
WMM: * Parameter version 1
* BE: CW 15-1023, AIFSN 3
* BK: CW 15-1023, AIFSN 7
* VI: CW 7-15, AIFSN 2, TXOP 3008 usec
* VO: CW 3-7, AIFSN 2, TXOP 1504 usec
META Information for Report for: Wifi Capacity Test