
January/2006
doc.: IEEE 802.11-06/0026r0
Over the Air Testing - Comparing Systems with Different Antennas
Date: 2006-1-15
Authors:
Pertti Visuri, Airgain, Inc., 5355 Ave Encinas, Carlsbad, CA 92008, (760) 597 0200, [email protected]
Oleg Abramov, Airgain, Inc., 5355 Ave Encinas, Carlsbad, CA 92008, (760) 597 0200, [email protected]
Allen Huotari, Linksys, a division of Cisco Systems, Inc., 121 Theory Drive, Irvine, CA 92617, (949) 261-1288, [email protected]
Hooman Kashef, Conexant, Inc., 2401 Palm Bay Road NE, Palm Bay, FL 32905, (321) 327-6300, [email protected]
Steve Hawkins, Westell, Inc., 750 N. Commons Drive, Aurora, IL 60504, (630) 898-2500, [email protected]
Brian Bella, Westell, Inc., 750 N. Commons Drive, Aurora, IL 60504, (630) 898-2500, [email protected]
Notice: This document has been prepared to assist IEEE 802.11. It is offered as a basis for discussion and is not binding on the contributing individual(s) or organization(s). The material in
this document is subject to change in form and content after further study. The contributor(s) reserve(s) the right to add, amend or withdraw material contained herein.
Release: The contributor grants a free, irrevocable license to the IEEE to incorporate material contained in this contribution, and any modifications thereof, in the creation of an IEEE
Standards publication; to copyright in the IEEE’s name any IEEE Standards publication even though it may include portions of this contribution; and at the IEEE’s sole discretion to permit
others to reproduce in whole or in part the resulting IEEE Standards publication. The contributor also acknowledges and accepts that this contribution may be made public by IEEE 802.11.
Patent Policy and Procedures: The contributor is familiar with the IEEE 802 Patent Policy and Procedures including the statement "IEEE standards may include the known use of
patent(s), including patent applications, provided the IEEE receives assurance from the patent holder or applicant with respect to patents essential for compliance with both mandatory and
optional portions of the standard." Early disclosure to the Working Group of patent information that might be relevant to the standard is essential to reduce the possibility for delays in the
development process and increase the likelihood that the draft publication will be approved for publication. Please notify the Chair <[email protected]> as early as possible, in written
or electronic form, if patented technology (or technology under patent application) might be incorporated into a draft standard being developed within the IEEE 802.11 Working Group. If you
have questions, contact the IEEE Patent Committee Administrator at <[email protected]>.
Abstract
This paper is a follow-up to the paper Over the Air Field Testing of
802.11 Systems (IEEE 802.11-05/1259r0), which introduced the topic and was briefly
discussed in the December 19, 2005 conference call of Task Group T. The original
paper was written without access to the draft document and did not discuss the work
already done in the field. This version covers all of the material in the previous one,
but puts it better into context with the existing standard draft. Some additional
measurement results are presented. Contributions from several cooperating parties are
included and recognized as well.
Local variations are caused by multipath fading and can result in 15 dB signal
strength variations across a few centimeters of displacement of the antenna at either end
of a wireless link. These variations cannot be eliminated by maintaining fixed
locations if the systems in the test use different types of antennas.
Statistical methods can be used to obtain accurate results in field tests even when
comparing different antennas and to calculate confidence limits for the results. The
nature of variations in over the air testing is examined and examples of statistically
correct tests are presented.
An automated method using a turn table for collecting data to obtain statistically
significant results is presented for both signal strength and for throughput tests. The
bias effect of continuous motion on throughput tests is explained and results from an
automated stop-motion turntable test system that eliminates the effect are presented.
A practical methodology for evaluating over the air performance and obtaining
reliable and repeatable results is presented.
Presentation Outline
• Effect of multipath fading on wireless tests
– Identical antennas: keep environment and test unit locations fixed
– Different antennas: perform several measurements with small
variations in location and use statistical methods to calculate
averages and confidence limits
• Practical Aspects of Statistical Testing
– Calculating confidence limits
– Number of data points and accuracy of results
– Examples of signal strength tests
– Automating test data collection
• Throughput testing
– Examples of repeatable throughput comparison tests
• Proposal for a new Test Environment and procedure
Tests Are Usually Done in Controlled Environments
• Testing of wireless components is usually done in controlled environments where the effect of a single component can be seen.
– Wireless chipsets and system functions are tested in conductive environments, with the signal contained in a coaxial cable, and in test systems using sophisticated simulations of the real world environment
– Antennas are tested in anechoic chambers that eliminate the effects of reflections and multipath variations
• The Task Group T draft includes specified Over The Air (OTA) test environments for testing device performance at the system level
• In OTA tests it is necessary to deal with variations in signal strength caused by reflections and multipath fading
[Figures: anechoic chamber, conductive test system, OTA test environment]
Multipath Fading
• Wireless signals reflect to varying degree from all
surfaces that they reach.
• Reflected signals arrive at the receiving antenna in
different phases depending on the distance they travel
• The electric field vectors add or subtract from one
another depending on their phase and polarization
• Moving either of the antennas or any
of the reflecting surfaces will result in
a change in signal strength
[Figure: reflected signals from the transmitting antenna combine at the receiving antenna, producing time-varying signal strength]
Signal Variation with Antenna Location
• To measure the effect of multipath fading an access point with a standard
dipole antenna was moved over a grid of 100 locations and the signal
strength was measured in each location
• The client station connected
to the access point was
about 40m (120 feet) away
in a non-line of sight
location
• The client station was not
moved at all during the test
• The signal strength was
measured using the RSSI
reporting feature of an
802.11 radio card and
averaged over hundreds of
samples during a few
minutes to even out the
effect of small changes in
environment during the test
Signal Strength Variation with Antenna Location
[Figure: dipole signal strength over the 100-location grid, showing variations of about 15 dB]
• Differences in signal strength of about 15 dB were observed between locations within 5 cm (2 inches) of one another
• The variation pattern
illustrated here was
measured using a regular
dipole antenna
Need to Specify Antenna Position and Orientation for OTA tests
• The local multipath variation is the
reason all of the currently proposed
OTA test environments emphasize the
need to keep the antenna location of
both ends of the link the same in all
tests (within 1.5cm for 2.4GHz band)
and use the same antenna orientation
for all tests.
• However, when comparing the performance of systems that have antennas with different gain patterns it is not possible to specify the same location and orientation
Effect of Antenna Gain Patterns on Multipath Fading
• If the gain pattern of either the receiving or transmitting antenna is different from the gain patterns in a reference system, the resulting multipath fading will be different
• Reflected signals from all directions
are included in the net signal strength
and their contributions are affected by
the antenna gain in each direction
• This results in a different signal
strength (in each location and for each
orientation of the antennas) if the gain
patterns of both the transmitting and
receiving antenna in both of the
compared systems are not identical
[Figure: reflected signal contributions, weighted by the antenna gain in each direction, combine at the receiving antenna]
Example: Signal Variation for a Smart Antenna
• The access point dipole was replaced by a smart antenna that provides a benefit
of 3 to 4dB in an anechoic chamber test. Nothing else was changed.
• The local signal strength variation pattern is different:
– The level is about 4dB higher
– The peaks and valleys are in different locations
– The range of variation is 3dB smaller
[Figure: smart antenna signal strength over the same grid; range of variation about 12 dB, with the expected smart antenna benefit indicated]
Multipath Signal Variation for Different Antennas
• Since the local variation
patterns are different for
different antenna designs, it
is not possible to eliminate
the effect of local variations
by placing the antenna in exactly the same location for comparison tests.
• In fact, there is no such
concept as the “exactly same
location” (or orientation)
when the antenna designs
are fundamentally different
Difference in the Antennas in the Example Tests
[Figures: Smart Antenna with 10 automatically switched beams; Smart Antenna individual beam patterns and aggregate patterns (horizontal cut, -10 to +5 dBi); comparison of the gain patterns of the two antennas]
• The two antennas used in the
above described tests of local
variations have quite different
gain patterns
• In addition the smart antenna is
automatically optimizing the
signal strength by switching to
the best antenna gain pattern
out of ten possible patterns
• The optimizing software for
the smart antenna runs on the
processor of the host device
and compares measured signal
strengths of each possible
pattern frequently to select the
best pattern for each associated
client station
[Figure: dipole gain pattern; horizontal cut -10 to +5 dBi, vertical cut -10 to +7 dBi]
Moving Client Station Instead of Access Point
• Similar differences in local signal
strength variation patterns are
observed when the access point (in
this test the DUT) is kept stationary
and the client station (in this test the
WLCP) is moved over a 50 by 50cm
(20 by 20 inch) grid
• Clearly the signal strength depends
on the exact location of both ends of
a wireless link
[Figure: test arrangement with the AP stationary and the client moved over the measurement grid]
Comparing Devices with Different Antennas
• Comparing the signal strength of one antenna to another in each location in the test
shown on the previous slide gives values for the difference ranging from 0dB to 20dB
• Therefore using just one, arbitrarily chosen location for the Device Under Test (DUT) or the Wireless Counterpart (WLCP) will result in very large arbitrary variations in test results.
• The results will depend on the chosen
location for each antenna.
• A different approach is necessary for
comparing systems with different antennas
[Figure: signal strength difference between the Smart Antenna and a dipole antenna at each grid location: (Smart Antenna signal) minus (Dipole signal)]
Multipath Variation for Different Antenna Systems
[Figure: throughput variation grid for a single-radio, two-antenna diversity unit]
• The multipath fading effects
influence all systems, including
MIMO systems
• In this test the throughput of four different systems was compared using the physical test arrangement shown on slide 6
– Throughput was measured using a standard Chariot test in 100 locations for each of the tested systems
– Actual throughput levels are influenced by the units; the relevant information from this test is the range of variation and the local differences
Multipath Variation for Different Antenna Systems (continued)
[Figure: throughput variation grid for a single-radio, two-antenna diversity unit, 2D view]
• The multipath fading effects
influence all systems, including
MIMO systems
• In this test the throughput of four different systems was compared using the physical test arrangement shown on slide 6
– Throughput was measured
using a standard Chariot test in
100 locations for each of the
tested systems
– The differences in the local
throughput patterns are evident
in this 2D view of the test
results
Nature of the local variations
• Testing in several different kinds of environments reveals that the local signal
strength variations are always present. They appear smaller outdoors and seem to
be larger and more closely spaced in highly reflective indoor settings.
• The full scale of variation seems to take place within about 50 cm (20 inches) for
the 2.4GHz frequency, which corresponds to about 4 wavelengths
[Figures: local variation patterns measured with the dipole and the smart antenna in several different environments]
Handling Variations to Obtain Accurate Results
• Obtaining accurate and meaningful test results when there are variations caused by uncontrolled variables is a well established discipline
• Many industries, for example the entire pharmaceutical industry, depend on decisions based on tests where conclusions are drawn from calculated confidence limits
• The same tools are also commonly used in radio engineering and telecommunications
• The distribution of variations in any set of measurements that is affected by a number of independent, uncontrolled factors is likely to be close to a normal distribution
• The probability of occurrence of a certain result can be estimated for normally distributed values using their average and the standard deviation estimated from the set of values
• The standard deviation can be estimated for any set of values by calculating the root mean square deviation of the actual values from their average
• Based on this it is possible to calculate confidence limits for any set of measurements

[Figure: in a normal distribution 68% of samples fall within one standard deviation of the average, and over 95% fall within two standard deviations]
Confidence Limits for Field Testing
• Confidence limits establish the boundaries within which the average calculated from a set of measurements lies relative to the actual average (the average of an infinite set of tests).
• The values of the confidence limits for a set of tests depend on
– the standard deviation of the results,
– the number of measurements taken and
– the selected level of confidence (the probability that the actual average is between the calculated limits)
• The formula for calculating the limits is:

Δ = ± t(M,C) · σ · √(2/M),

where σ is the standard deviation, M is the number of measurements, C is the desired confidence level and t(M,C) is the so-called "Student's coefficient".
– If the standard deviation is known or if the test includes more than 15 measurements, the value of the coefficient t(M>15, @95%) is approximately 2.0
– There is a table of values for the Student's coefficient in Appendix 1
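To make the calculation concrete, here is a minimal Python sketch of the confidence-limit formula above, using t ≈ 2.0 for more than 15 measurements as stated on this slide; the per-location dB differences in the example are made up for illustration, not taken from the deck's measurements.

```python
import math
from statistics import mean, stdev

def confidence_limit(samples, t_coeff=2.0):
    """Confidence limit per the slide's formula: delta = +/- t(M,C) * sigma * sqrt(2/M).
    t_coeff ~ 2.0 corresponds to 95% confidence when M > 15 (see Appendix 1)."""
    m = len(samples)
    sigma = stdev(samples)            # RMS deviation of the values from their average
    return t_coeff * sigma * math.sqrt(2.0 / m)

# Hypothetical per-location dB differences (smart antenna minus dipole)
diffs_db = [4.1, 2.5, 7.0, 3.2, 5.8, 1.9, 6.4, 4.7, 3.8, 5.1,
            2.2, 6.9, 4.4, 3.0, 5.5, 4.9, 2.8, 6.1, 3.6, 4.3]
print(f"average difference: {mean(diffs_db):.1f} dB "
      f"+/- {confidence_limit(diffs_db):.1f} dB at 95% confidence")
```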
[Figure: distribution of the dB difference in signal strength between a smart antenna and a dipole antenna in 500 comparison tests, compared to the shape of a normal distribution curve; average 4.0 dB, standard deviation 4.6 dB]
Confidence Limits for Field Testing
• As pointed out earlier, the nature of the local signal strength variation depends on the physical environment and on the antenna system. Typical values for the standard deviation in indoor tests range from 3 dB to 5.5 dB for a dipole antenna and from 2 dB to 3.2 dB for a smart antenna
• To illustrate how many measurements are necessary for various levels of accuracy and percentage confidence we will assume a standard deviation of 3.6 dB
– To obtain a 95% confidence that the actual average value is within +/- 1.2 dB of the measured average it will be necessary to make 36 measurements.
– However, if an 80% confidence for the same +/- 1.2 dB limits is sufficient, then only 15 measurements are needed.
– To obtain an accuracy of +/- 0.6 dB at 90% confidence about 100 measurements would be needed.
• These levels of accuracy are similar to the levels generally quoted for standard anechoic chamber testing
[Figure: confidence limits in dB for 95%, 90% and 80% confidence levels versus the number of measurements (500, 100, 36, 25, 15) in an indoor test of signal strength, standard deviation 3.6 dB]
Confidence Limits for Only a Few Measurements
• The same calculation of confidence limits can be applied to examine the level of confidence in conclusions drawn from tests that include measurements at only a very few locations of the antenna under test. First let us assume that we know the standard deviation of the test results in the environment based on other test results
– If only one measurement is taken, the level of accuracy at the 95% confidence level is only +/- 7 dB
– Increasing the number of locations in which the measurements are taken to five improves the accuracy to +/- 3.2 dB
• In a new environment, where the standard deviation is not known, the confidence limits for a few measurements are even wider, since the standard deviation needs to be estimated from the same test results.
– In these cases the actual value of the Student's coefficient needs to be applied to estimate the confidence limits. For example, for two measurements the limits are +/- 30 dB, for five measurements they would be +/- 5.5 dB and for ten measurements +/- 2.6 dB
– If the standard deviation is not known it is not possible to estimate confidence limits for a single measurement
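The small-sample case can be handled the same way by looking up the Student's coefficient instead of using t ≈ 2. A minimal sketch, with the 95% coefficients copied from the table in Appendix 1 and made-up measurement values:

```python
import math
from statistics import mean, stdev

# Student's coefficients t(N, 95%) from Appendix 1 (N = number of measurements)
T_95 = {2: 12.706, 3: 4.303, 4: 3.182, 5: 2.776, 6: 2.571, 7: 2.441, 8: 2.365,
        9: 2.306, 10: 2.262, 11: 2.228, 12: 2.201, 13: 2.179, 14: 2.160}

def small_sample_limit(samples):
    """95% confidence limit when sigma must be estimated from the same few samples,
    using the deck's formula delta = +/- t(M,C) * sigma * sqrt(2/M)."""
    m = len(samples)
    t = T_95.get(m, 2.0)              # t is roughly 2.0 once M > 15
    return t * stdev(samples) * math.sqrt(2.0 / m)

few = [3.1, 6.2, 1.8, 5.4, 4.0]       # hypothetical dB differences at only five locations
print(f"{mean(few):.1f} dB +/- {small_sample_limit(few):.1f} dB at 95% confidence")
```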
[Figure: confidence limits in dB for 95%, 90% and 80% confidence versus the number of samples (500, 100, 36, 25, 15, 10, 5, 2, 1), standard deviation 3.6 dB, including the case where the standard deviation is not known]
Example of Testing in Different Environments
• Comparison tests of signal strength were performed in five different indoor environments: two residential, two different office buildings and one laboratory setting.
• 100 measurements in a 50x50 cm (20x20 inch) grid were taken at each location for each of the antennas. The test arrangement is illustrated on slide number 6.
• This graph shows the average signal strengths in each environment and the associated 95% confidence limits for each result, calculated according to the formula on slide 17.
• Since the standard deviation is different in each environment the confidence limits are also different for each data point
• It appears that the signal strength difference between the two antennas depends on the local environment as well as the specific test location.
• Therefore, for drawing general conclusions about over the air performance of wireless systems it is necessary to perform several measurements in many different environments
[Figure: average smart antenna and dipole signal strengths with 95% confidence limits in the Residential 1, Small office, Lab space, Residential 2 and Large office environments]
Data Distributions and Confidence Limits
• This slide displays the actual distribution
of measurement results around the
average of the dipole antenna results in
each environment.
• The 95% confidence limits for the
average of each data set are also shown.
[Figure: distributions of measurement results around the dipole average in each environment, with 95% confidence limits for each data set: Residential 1: 5.8 +/- 0.5 dB; Residential 2: 4.3 +/- 0.4 dB; Small office: 5.5 +/- 0.6 dB; Large office: 2.1 +/- 0.5 dB; Lab space: 2.3 +/- 0.6 dB; the dipole data sets show limits of about +/- 0.6 to +/- 1.1 dB]
Variation between Local Environments
• It appears from the results on slides 18 and 19 that, in addition to the local variations within a few wavelengths, there is a second factor that causes variation in the performance of different wireless systems:
– There seem to be differences in performance in different building environments. These could be caused, for example, by the nature of multipath effects in different environments and the responses of the wireless systems to these effects
– In this case the residential and small office environments show a higher difference between the antennas than the large office and lab space. Both of the latter have several metal partitions and furniture, whereas the residential and small office have plasterboard walls and ordinary furniture.
– The 95% confidence limits based on 100 measurements in each environment are shown.
– However, tests in five different environments are not sufficient to prove the effect of the environment. The standard deviation for variation between different environments can be estimated from the five values. It is 1.7 dB.
– Based on this the 95% confidence limits for the overall performance in different environments are +/- 2.2 dB
– More tests would be needed to prove the effect
[Figure: difference in signal strength (dB) between a smart antenna and a dipole in the different environments (Residential 1, Small office, Lab space, Residential 2, Large office)]
Time Dependent Signal Variations in Field Tests
• In most field tests there is a fair amount of rapid time fluctuation in the signal
strength. A small part of the variation is caused by thermal noise or other sources
from the electronic equipment, but mostly the variations are caused by small
changes in the environment. (People or other objects moving within the space that
can be reached by the signal, interference from sources like cordless phones or
microwave ovens)
• The fast time fluctuations can be compensated for relatively easily by performing
the test for a suitable period with repeated sampling and averaging the results.
• Slow variations over time can be best eliminated by testing the DUT’s to be
compared immediately one after the other in each location
[Figure: two examples of wireless 802.11g signal strength variations over a 30 second time interval, with ranges of about 9 dB and 6 dB]
Confidence Limits for Time Fluctuations
• The rapid time variations are usually smaller than the variations caused by location. Their standard deviation is typically 1.2 dB. Therefore their contribution to test result uncertainty is also smaller
• The number of necessary samples depends on the desired accuracy
• With a 200 ms sampling interval a 20 second test at each location will provide one hundred test points. From this the 95% confidence limits for time fluctuations are about 0.25 dB. Therefore usually even shorter tests would be enough.
• Averaging the signal strength further across several locations increases the total number of samples and further reduces the time-related confidence limits.
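A sketch of the sampling-and-averaging idea described above; the RSSI read is a placeholder, since the actual call depends on the radio card and driver, and the simulated 1.2 dB spread is simply the typical value quoted on this slide.

```python
import random
import time
from statistics import mean, stdev

def read_rssi_dbm():
    """Placeholder for the RSSI reporting feature of an 802.11 card; the real call
    is driver-specific. Here we just return a simulated value."""
    return -55.0 + random.gauss(0.0, 1.2)   # ~1.2 dB fast time fluctuation

def average_rssi(duration_s=20.0, interval_s=0.2):
    """Sample for 20 s at 200 ms intervals (about 100 samples) and average, which
    suppresses the fast time fluctuations described on this slide."""
    samples = []
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        samples.append(read_rssi_dbm())
        time.sleep(interval_s)
    return mean(samples), stdev(samples), len(samples)

avg, sd, n = average_rssi()
print(f"{n} samples: average {avg:.1f} dBm, std dev {sd:.1f} dB")
```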
[Figures: an example of 802.11g signal strength variation (about 6 dB) over a 30 second interval; confidence limits in dB for 95%, 90% and 80% confidence versus the number of samples (10,000 down to 15), standard deviation 1.2 dB]
Using a Turn-table to Average Over Local variations
• To develop an automated method for collecting signal strength measurement results in different locations, tests were performed using a rotating turn-table at one end of the link and averaging the measured values
• The turn-table motion converts the local variation into the time domain and makes it more convenient to calculate the average of the signal strength
• The turntable also changes the orientation of the antenna at one end of the link and thereby further helps to provide a meaningful average over local signal strength variations
• The observed signal strength variations over a full rotation reflect the 15 to 20 dB local variations that were observed in the earlier tests
• The expectation is that averaging over local variations at one end of the link will reduce the local variations at the other end
[Figure: signal strength in dB over one full rotation of the turn-table, showing the 15 to 20 dB local variations]
The Effect of Turn Table in One End of the Link
• Placing the client station (WLCP in this test) on a turn table and averaging over at least one rotation reduced the range of variation caused by small changes in access point (DUT in this test) location by 50% (from 15 dB to about 8 dB)
• The standard deviation of signal strength across the different locations was reduced from 3.15 dB to 1.50 dB. This corresponds to a similar reduction of the confidence limits
• However, it is important to note that having a turntable at one end of a test is not enough to eliminate the variation. Several test locations for the other end are needed.
– Even with the turn table averaging the results at one end of the link (WLCP), the result for only one location of the access point (DUT) will still have confidence limits of about +/- 3 dB
– Ten different locations of the DUT need to be measured with the WLCP on a turn table to reduce the 95% confidence limits to about +/- 1 dB
– All different relative orientations of the DUT and WLCP need to be represented in the placements of the DUT
– If the differences in different building environments are to be studied it would be necessary to perform tests at ten different locations, with a turntable at one end of the link, in each of the building environments where the performance is to be characterized.

[Figure: signal strength as a function of DUT location when the WLCP is stationary and when the WLCP is on a turn table]
Effect of Motion in Signal Strength Tests
• When one or both ends of the wireless link are physically in motion that would not occur in normal use it is important to consider the effect of the motion on the test result
– Each signal strength test that uses the built-in function of a wireless LAN card takes place during the packet preamble and takes less than a millisecond.
– If the turn table turns at 2 rpm the client moves less than 0.05 mm during the test. This would not have any effect on the value, and the measurements can easily follow the variations in signal strength, as can be seen from the test graphs below
– However, if there are other dynamic control functions involved it is necessary to consider their time constants as well. For example, a smart antenna system may be affected by the rapid, exaggerated signal strength changes caused by the turn table motion
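As a quick sanity check of the 0.05 mm figure above, assume the client antenna sits roughly 25 cm from the turntable axis (the radius is not stated in the deck); at 2 rpm the antenna then moves on the order of 0.05 mm during a sub-millisecond RSSI measurement.

```python
from math import pi

radius_m = 0.25                       # assumed antenna distance from the turntable axis
rpm = 2.0
speed_mm_per_s = 2.0 * pi * radius_m * rpm / 60.0 * 1000.0   # ~52 mm/s at the antenna
travel_mm = speed_mm_per_s * 0.001                           # measurement lasts < 1 ms
print(f"antenna speed {speed_mm_per_s:.0f} mm/s, "
      f"travel during a 1 ms measurement {travel_mm:.3f} mm")
```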
[Figures: signal strength in dB over one full rotation with the client stationary and with the client rotating at 2 rpm; about 30 seconds of data in each case]
Over the Air Throughput Tests
• Since 802.11 systems automatically adjust the data encoding complexity to adapt to different link quality levels, the throughput of a particular link has a strong correlation with the signal quality of the link
– This results in a similar shape of the local multipath variations for both variables
• When the radio system works correctly the observed relationship between signal strength and throughput is an S-curve
[Figures: throughput at different encoding levels; correlation of measured throughput (0 to 20 Mbits/s) with signal strength (15 to 55), showing an S-shaped relationship]
Over the Air Throughput Tests
• Because the relationship between signal strength and throughput is not linear, a constant signal strength improvement will result in different absolute and relative throughput improvements at different ranges of throughput
• This needs to be taken into account when averaging throughput test results

[Figures: theoretical throughput increase in Mbits/s and theoretical relative throughput increase in % for a constant signal strength increase, shown for Low, Medium and High throughput ranges (Mbits/s)]
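To see why this happens, the sketch below runs a purely illustrative saturating throughput-versus-signal model (a logistic curve with invented parameters, not the deck's measured data) and applies the same 3 dB signal improvement at low, medium and high operating points; the absolute gain peaks in the middle of the S-curve while the relative gain is largest at low throughput.

```python
from math import exp

def throughput_mbps(signal, max_tput=25.0, midpoint=30.0, slope=0.4):
    """Illustrative S-curve only; parameters are invented, not fitted to the deck's data."""
    return max_tput / (1.0 + exp(-slope * (signal - midpoint)))

for label, sig in (("low", 24.0), ("medium", 30.0), ("high", 40.0)):
    before = throughput_mbps(sig)
    after = throughput_mbps(sig + 3.0)        # the same 3 dB signal strength improvement
    print(f"{label:6s}: {before:5.1f} -> {after:5.1f} Mbits/s "
          f"(+{after - before:.1f} Mbits/s, +{100 * (after - before) / before:.0f}%)")
```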
Effect of Client Motion in Throughput Tests
• Even though motion may not affect signal strength tests, it can have a significant effect in throughput testing, depending on the chipset implementation and the speed of the motion
– The transmission data rate algorithms in many WLAN chipsets use packet error rate (PER) as the main input parameter for rate setting. The PER is measured over a period of time
– The rate setting may be affected by the variations caused by the motion of a rotating turn table or other device that would not be representative of real use situations.
– At least some chipsets are greatly affected if the speed of the antenna is higher than 50 cm per minute
• It is therefore important to verify that the speed does not have an effect on the tests if a continuous motion turntable is used in the tests.
– One way of avoiding the motion effect in throughput data is to use a stop-motion turn table that moves the unit through several locations, but keeps it stationary during the actual tests
[Figures: throughput (Mbits/s) for the dipole and smart antenna with the client rotating at 1 to 2 rpm, comparing a continuous-motion turntable with a stop-motion turntable]
Using a Stop Motion Turn Table
• To study the distribution of throughput performance across small location variations, a test was performed using an automated turn table that turned 10 degrees, stopped, ran a standard 30 second Chariot throughput test, and then proceeded to the next 10 degrees and stopped for the next test, etc.
• In these tests the standard deviation for the dipole antenna was 2.6 Mbits/s and for the smart antenna 1.5 Mbits/s
• This test was performed at close to maximum throughput. However, even at this level the better signal strength results in higher throughput.

[Figure: distributions of test results (number of test results at each throughput level, Mbits/s) in the stop-motion turntable throughput test for the two antennas; smart antenna 26.4 dB +/- 0.35, dipole 24.9 dB +/- 0.6]
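A skeleton of the stop-motion sequence described above, in Python; the turntable and Chariot calls are placeholders (their real interfaces are equipment-specific), and the simulated throughput numbers are only there to make the sketch runnable.

```python
import random
import time
from statistics import mean, stdev

# Placeholder helpers: a real setup would drive the turntable controller and the
# IxChariot console. These names and bodies are stand-ins, not actual APIs.
def rotate_turntable(degrees):
    time.sleep(degrees / 12.0)              # pretend the table turns at 2 rpm (12 deg/s)

def run_chariot_test(seconds=30):
    return random.uniform(20.0, 28.0)       # stand-in for a measured throughput, Mbits/s

def stop_motion_sweep(step_deg=10, test_seconds=30):
    """One full rotation in stop-motion: rotate, then test only while stationary."""
    results = []
    for _ in range(360 // step_deg):
        rotate_turntable(step_deg)
        results.append(run_chariot_test(test_seconds))
    return results

throughputs = stop_motion_sweep()
print(f"mean {mean(throughputs):.1f} Mbits/s, std dev {stdev(throughputs):.1f} Mbits/s")
```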
Throughput tests on Stop motion turntable
• To develop a method for throughput testing, a set of measurements was taken for nine different wireless access point and client locations in an office building. The stop-motion turntable with 18 stops/measurements was used in each location.
• The results appear meaningful, but cannot be used for quantitative analysis, as each test point represents only one antenna location and the variations between locations are large
– The average throughput and the 95% confidence limits are shown for each antenna
– As was demonstrated on slide 25, it is necessary to average test results from more than one location of both the DUT and WLCP to obtain representative results and more definitive conclusions

[Figure: average throughput with 95% confidence limits for the dipole and smart antenna at nine numbered test locations; the displayed floor plan is similar to, but not, the actual site (for security reasons)]
Averaging across More Than One DUT Location
• Grouping the results into three zones based on throughput level allows meaningful averaging, as sketched below
• This way at least two different gateway locations are used for each average. This is fewer than recommended, but can be used here to illustrate the basic method
– The results were combined into three groups based on the throughput level of the dipole
– The average of all measured throughput differences between the two antennas was calculated for each of the three zones of throughput level
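A minimal sketch of the zone-averaging step, grouping hypothetical (dipole, smart antenna) throughput pairs by the dipole's throughput level and reporting the average difference with a 95% limit per the deck's formula:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical per-location results (Mbits/s): (dipole, smart antenna) pairs.
results = [(4.2, 5.0), (8.1, 10.5), (12.3, 16.0), (15.8, 20.2), (18.9, 24.7),
           (22.5, 28.1), (24.0, 30.9), (6.7, 8.9), (14.2, 19.5)]

zones = {"low (0-10)": [], "medium (10-20)": [], "high (20-30)": []}
for dipole, smart in results:
    diff = smart - dipole
    if dipole < 10:
        zones["low (0-10)"].append(diff)
    elif dipole < 20:
        zones["medium (10-20)"].append(diff)
    else:
        zones["high (20-30)"].append(diff)

for zone, diffs in zones.items():
    m = len(diffs)
    limit = 2.0 * stdev(diffs) * sqrt(2.0 / m) if m > 1 else float("nan")
    print(f"{zone}: {mean(diffs):.1f} +/- {limit:.1f} Mbits/s ({m} locations)")
```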
Smart
Smart
Antenna
Antenna
Medium`
Low
3
3
8
8
65
66
4
Displayed floor plan is similar but not the
actual site (for security reasons)
Submission
56
2
55
1
9
4
7
9
7
2
1
Pertti Visuri, Airgain, Inc.
January/2006
doc.: IEEE 802.11-06/0026r0
Averaged Throughput Results
• By combining the throughput comparison results into three groups, enough measurements are available in each group to reach definitive conclusions at the 95% confidence level
• The results are as expected based on the signal strength differences on slides 20 to 21 and the theoretical expected results shown on slide 25
• Increasing the number of test locations in each zone would narrow the confidence limits further. Ten locations in each zone would narrow the limits to approximately half of those shown here

Actual difference in throughput, Smart Antenna over dipole, by throughput range:
High (20 to 30 Mbits/s): 1.1 +/- 0.7 Mbits/s
Medium (10 to 20 Mbits/s): 6.7 +/- 2.1 Mbits/s
Low (0 to 10 Mbits/s): 2.3 +/- 1.2 Mbits/s

Percentage difference in throughput, Smart Antenna over dipole, by throughput range:
High (20 to 30 Mbits/s): 4.5 +/- 2.6%
Medium (10 to 20 Mbits/s): 35 +/- 11%
Low (0 to 10 Mbits/s): 92 +/- 46%
Repeatability of Tests
• One important criterion for defining a usable test method is to verify that the method provides repeatable results within the estimated confidence limits when applied in different circumstances
• For this purpose the throughput comparison test was performed in a residential environment using the same method as described above
• In the second set of tests both downlink and uplink were tested and the number of AP (DUT) locations in the building was increased to 15
• Each link was measured using a stop-motion turn table for the client device (WLCP). The table rotated 20 degrees and stopped to run a 30 s throughput test using a standard Chariot script (18 stops at each location)
[Figure: the two test environments and typical multipath variation patterns in each]
Test Results in a Two Story Residential Building
• The AP (DUT) and client station (WLCP) locations for the measurements were selected to provide test results at various levels of throughput
• The throughput improvement provided by the Smart Antenna varied from a few percent to 250%
• The confidence limits for individual DUT locations are still fairly wide
[Figures: per-link throughput comparison (downlink, AP to client station, and uplink, client station to AP) for the 15 measured links, smart antenna versus standard dipole; floor plan of the two-story residence with links between floors shown as dashed lines]
Averaging across More than one DUT Location
• Just as in the first test, the results were averaged in three throughput zones based on the downlink throughput of the dipole antenna, and the confidence limits were calculated
• Since this time there were more test results in each zone, the confidence limits are smaller

Throughput comparison, smart antenna over standard dipole, by throughput range:
High (20 to 30 Mbits/s): downlink 6 +/- 3%, uplink 8 +/- 4%
Medium (10 to 20 Mbits/s): downlink 32 +/- 7%, uplink 17 +/- 5%
Low (0 to 10 Mbits/s): downlink 66 +/- 19%, uplink 58 +/- 20%

[Figure: floor plan with the measured links (links between floors shown with dashed lines)]
Comparable Results of the Two different Tests
• The test results are consistent within the calculated confidence limits.
– In the first test all Low Range measurements were done below 5 Mbits/s, while in the second test three out of five were between 5 and 10 Mbits/s.
– Since the percentage difference grows very rapidly as throughput goes down, the proper comparison for test method consistency should be done using results in the same throughput ranges. The comparison below also displays the results of the second test with the lowest range divided into two parts
Percentage throughput difference (downlink), smart antenna over dipole, by throughput range:
First test: High (20 to 30) 4.5 +/- 2.6%, Medium (10 to 20) 35 +/- 11%, Low (0 to 10) 92 +/- 46%
Second test: High (20 to 30) 6 +/- 3%, Medium (10 to 20) 32 +/- 7%, Low (0 to 10) 66 +/- 19%
Second test with the Low range split: 5 to 10 Mbits/s 42 +/- 38%, 0 to 5 Mbits/s 114 +/- 33%
Conclusions
Local signal strength variations have a major effect on over the air
performance measurements of 802.11 systems. They cause single tests to
provide results that can vary arbitrarily up to 15dB in signal strength or over
10 Mbits/s in throughput in apparently identical tests. This effect cannot be
avoided by controlling the placement of the antennas in the test system if the
units under test do not have identical antennas.
Statistical methods, which are well known and used in several other
industries, can be used to obtain accurate and repeatable results in field tests
and to calculate confidence limits for the results.
An automated method can be used to conveniently obtain enough data
for accurate and reliable results from over the air performance testing. Both
ends of a wireless link need to be placed in several different locations and the
results averaged in suitable groups to get representative results. Testing can
be partially automated by using a turn table. However, in certain situations
continuous motion may introduce a bias effect. This can be eliminated by a
stop-motion test system that is stationary during the actual test.
Appendix 1 – Student's Coefficients
Values for Student's coefficients to calculate confidence intervals when the value of
the standard deviation needs to be estimated from the measurements.
Confidence probability values (C)

N      0.5     0.6     0.7     0.8     0.9     0.95     0.98     0.99     0.999
2      1.000   1.376   1.963   3.078   6.314   12.706   31.821   63.657   636.619
3      0.816   1.061   1.336   1.886   2.920   4.303    6.965    9.925    31.598
4      0.765   0.978   1.250   1.638   2.353   3.182    4.541    5.841    12.941
5      0.741   0.941   1.190   1.533   2.132   2.776    3.747    4.604    8.610
6      0.727   0.920   1.156   1.476   2.015   2.571    3.365    4.032    6.859
7      0.718   0.906   1.134   1.440   1.943   2.441    3.143    3.707    5.959
8      0.711   0.896   1.119   1.415   1.895   2.365    2.998    3.499    5.405
9      0.706   0.889   1.108   1.397   1.860   2.306    2.896    3.355    5.041
10     0.703   0.883   1.100   1.383   1.833   2.262    2.821    3.250    4.781
11     0.700   0.879   1.093   1.372   1.812   2.228    2.674    3.169    4.587
12     0.697   0.876   1.088   1.363   1.796   2.201    2.718    3.106    4.487
13     0.695   0.873   1.083   1.356   1.782   2.179    2.681    3.055    4.318
14     0.694   0.870   1.079   1.350   1.771   2.160    2.650    3.012    4.221

N = number of measurements
Appendix 2 – Recommended Testing Procedure
Introduction to the Test Method
• As has been demonstrated in the main presentation, local signal strength variations caused by multipath fading can cause random variations in any over the air testing of wireless systems.
• The only way to overcome the uncontrolled variations when comparing the performance of two devices with different antennas is to perform several measurements at different locations of both ends of the wireless link (the device under test, DUT, and the wireless counterpart, WLCP) and to average the results. Even keeping the test environment completely unchanged will not help if the systems to be tested have different kinds of antennas. There is no such concept as "the same location" for devices with different antennas.
• This appendix presents a recommended practical method for obtaining reliable and repeatable test results. The recommendations include the numbers of measurements to take and methods to average the results and to calculate confidence limits for them
Appendix 2 – Recommended Testing Procedure
Taking Measurements at Several Locations
• The key to all testing is to take measurements at several locations. Both ends of the wireless link must be tested in many locations. It is sufficient to move the devices only a few centimeters to average over the multipath fading variation.
• However, to obtain results that are representative of the overall performance in a particular building environment, it is better to include measurements made at different parts of the building.
• The measurements should be grouped so that meaningful averages can be calculated.
– For example, to evaluate signal strength differences all results in a particular building environment can be averaged together, since the signal strength difference is not expected to depend on the overall level of the signal
– For evaluating throughput differences caused by better signal, the results need to be grouped by the level of the throughput, since the expected improvement depends on the overall throughput level

[Figure: example floor plan with numbered measurement locations]
Appendix 2 – Recommended Testing Procedure
Rule of Thumb for How Many Measurements are Necessary
• The range of multipath variations in signal strength of 802.11b/g systems tends to be about 15 dB. Typical values for the standard deviation appear to be between 2 and 6 dB. The corresponding numbers for throughput variation depend on the actual level of throughput, but are similar to these numbers in the mid throughput range.
• Based on this general information we can roughly estimate the number of tests needed for various levels of accuracy and confidence. The sample calculation from slide 17 is repeated below
– For example, to evaluate signal strength differences with better than +/- 1 dB accuracy at 95% confidence, 50 to 150 individual measurements will be needed for each device to be compared
– These measurements should include at least ten different locations for both ends of the wireless link that is to be tested
– A practical way to obtain such data is to set one end of the link on a stop-motion turn table and program it to collect the data at 10 to 20 locations along the full turn. This automated test setup should then be operated in ten locations in the building and the results averaged together
[Figure: confidence limits for different % confidence levels and numbers of measurements taken in an indoor test of signal strength (standard deviation 3.6 dB); repeated from slide 17]
Appendix 2 – Recommended Testing Procedure
Using Zone-Averaging for Throughput Tests
• If the nature of the difference in performance that is to be evaluated is dependent on another known quantity, the measurements should be grouped so that only test results that are similar in nature are averaged together
• For example, throughput improvements from better signal strength are dependent on the original throughput. Hence the test results for throughput improvement need to be averaged together in at least three different zones of original throughput.
– In a practical test plan, selecting the respective locations for the access point and for the stop-motion turn table in the various tests should be done so that all different throughput levels are represented.
– As an example, ten locations with 18 stops for measurements at each location will achieve an approximate accuracy of +/- 20% on the improvement at the 95% confidence level.
– Only four locations in each zone would provide an approximate accuracy level of +/- 50% of the resulting average values
[Figure: example grouping of throughput results into High, Medium and Low zones]
Appendix 2 – Recommended Testing Procedure
Calculating Confidence Limits for Tests
• The confidence limits on slides 41 and 42 are just examples to indicate typical ranges. The actual confidence limits for the average of any set of measurement results can easily be calculated from the results using the two formulas given below
• For practical purposes the Microsoft Excel spreadsheet has convenient pre-defined functions for calculating both standard deviations and confidence limits for sets that include at least 10 results or where the standard deviation is known. The names and formats of these functions are =STDEV(<range of numbers>) and =CONFIDENCE(<1 - confidence level>, <standard deviation>, <number of measurements>)

Δ = ± t(M,C) · σ · √(2/M),

where Δ is the confidence limit, σ is the standard deviation, M is the number of measurements, C is the desired confidence level and t(M,C) is the so-called "Student's coefficient".
– For more than 15 measurements and for the confidence limit of 95% the value of t(M>15, @95%) is approximately 2
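For readers working outside Excel, a rough Python equivalent is sketched below; note that Excel's CONFIDENCE returns a normal-distribution half-width of σ/√n, while the comparison formula above carries an extra √2 factor, so the two numbers are not identical. The data values are hypothetical.

```python
from math import sqrt
from statistics import NormalDist, stdev

def excel_confidence(alpha, sigma, n):
    """Rough equivalent of Excel's =CONFIDENCE(alpha, sigma, n): half-width of a
    normal-distribution confidence interval for the mean of n samples."""
    z = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    return z * sigma / sqrt(n)

def comparison_limit(sigma, m, t_coeff=2.0):
    """The deck's formula for comparing two devices: delta = t(M,C)*sigma*sqrt(2/M)."""
    return t_coeff * sigma * sqrt(2.0 / m)

data = [4.1, 2.5, 7.0, 3.2, 5.8, 1.9, 6.4, 4.7, 3.8, 5.1, 2.2, 6.9]  # hypothetical dB values
s = stdev(data)                              # Excel's =STDEV(range)
print(f"STDEV = {s:.2f} dB")
print(f"CONFIDENCE(0.05, ...) = {excel_confidence(0.05, s, len(data)):.2f} dB")
print(f"comparison limit      = {comparison_limit(s, len(data)):.2f} dB")
```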
Appendix 2 – Recommended Testing Procedure
Calculating Confidence Limits when Standard Deviation is not Known
• In cases where the number of measurements is smaller than 15 and the standard deviation is not known from other sources (for example from other relevant tests in the same building environment), it will be necessary to apply the Student's coefficient from the table in Appendix 1 and the actual formula for the confidence limits
• As can be seen from the graph, the confidence limits for only a few measurements are quite wide when the standard deviation (3.6 dB) is estimated from the measurements in the sample

Δ = ± t(M,C) · σ · √(2/M),

where Δ is the confidence limit, σ is the standard deviation, M is the number of measurements, C is the desired confidence level and t(M,C) is the so-called "Student's coefficient" (given in Appendix 1).
[Figure: confidence limits for various sample sizes (1000 down to 2), comparing the case where the standard deviation is known with the case where it is estimated from the sample]
Block diagram of the Proposed Test Procedure
1. Identify the interference-free channel in the
test facility
2. Set up the Reference Device and the WLCP (the WLCP is on a turn table, stop-motion if needed)
3. Start the turn-table
4. Perform measurements (e.g. signal strength
or throughput test)
5. Immediately replace the Reference Device
with a DUT and set it up
6. Move the DUT to a new location and
orientation (this may be in a different
environment if average performance in
different environments is to be characterized)
7. Process the obtained data to calculate the
relevant averages and the confidence limits
Repeat enough times to achieve low enough confidence limits
• It is always important to minimize interference and other uncontrolled factors, even though statistical methods reduce their impact
• Using the turn-table mitigates the local variations in the signal strength at the WLCP end
• Capturing and averaging the data over a long (up to 1-2 minutes) time interval mitigates the fast time variations of the signal strength
• Quick switching between the Reference Device and the DUT mitigates the slow time variations in the signal strength
• Randomly orienting the devices in different environments reduces the influence of the fluctuations of the antenna gain in different directions
• A higher number of tests and DUT locations reduces the confidence limits
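A hypothetical skeleton of the loop in the block diagram; the measurement call is a placeholder for whatever signal strength or throughput test is actually run (steps 2-5), and the simulated values are only there so the sketch runs.

```python
import random
from math import sqrt
from statistics import mean, stdev

def measure_average_signal(device):
    """Stand-in for steps 3-4: start the turn-table and average 1-2 minutes of
    signal strength (or throughput) samples for the given device."""
    return random.gauss(-50.0 if device == "DUT" else -54.0, 3.6)

def run_procedure(n_locations=10):
    diffs = []
    for _ in range(n_locations):                  # step 6: new DUT location/orientation
        ref = measure_average_signal("Reference") # steps 2-4 with the Reference Device
        dut = measure_average_signal("DUT")       # step 5: swap in the DUT immediately
        diffs.append(dut - ref)
    limit = 2.0 * stdev(diffs) * sqrt(2.0 / len(diffs))   # step 7, per the deck's formula
    return mean(diffs), limit

avg, lim = run_procedure()
print(f"DUT vs Reference: {avg:.1f} +/- {lim:.1f} dB (95% confidence)")
```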
Summary of Reducing Variations in OTA tests
Factors contributing to variations in the signal strength, experimental estimates of the standard deviation, and proposed ways to mitigate the variations:

• Local spatial variations (within several inches): σL ≈ 3 - 5 dB. Mitigation 4: perform several measurements varying the exact location of both ends of the link (DUT and WLCP)
• Variations of the antenna gain in different directions: σG ≈ 0.5 - 1.5 dB. Mitigation 1: measure and average in several randomly chosen orientations of the antenna
• Environmental variations (e.g. between buildings): σE ≈ 2 - 5 dB. Mitigation 5: conduct measurements in multiple environments
• Fast (within milliseconds) time variations of the signal strength: σST ≈ 3 - 4 dB. Mitigation 2: measure and average more than 20 values of the signal strength within a few seconds
• Slow (within minutes and hours) time variations: σLT ≈ 3 - 4 dB. Mitigation 3: perform the measurements on the DUTs to be compared immediately one after the other

Resulting confidence interval without mitigation: 2.8 · √(σL² + σG² + σE² + σST² + σLT²) ≈ (16 - 26) dB
A confidence interval of (1 - 2) dB is achievable if about 100 or more measurements are used with enough DUT locations and environments
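A quick check of the quoted (16 - 26) dB range: the per-factor standard deviations in the table combine in quadrature, and the 2.8 multiplier appears to correspond to the deck's 95% comparison limit (2·√2 ≈ 2.8); both of these readings are inferred from the table, not stated explicitly.

```python
from math import sqrt

# Per-factor standard deviations (dB) at the low and high ends of the ranges in the table.
low  = {"local": 3.0, "gain": 0.5, "environment": 2.0, "fast_time": 3.0, "slow_time": 3.0}
high = {"local": 5.0, "gain": 1.5, "environment": 5.0, "fast_time": 4.0, "slow_time": 4.0}

for name, sigmas in (("low end", low), ("high end", high)):
    combined = sqrt(sum(s * s for s in sigmas.values()))   # root-sum-square combination
    print(f"{name}: combined sigma {combined:.1f} dB -> 2.8 * sigma = {2.8 * combined:.0f} dB")
# Prints roughly 16 dB and 26 dB, matching the (16 - 26) dB range in the table.
```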