
Network World Lab Test: VoIP Over Wireless LANs

Scheduled for Publication in Fall 2004

Draft Test Methodology

 

v. 1.10. Copyright 2003-2004 by Network Test Inc. Network Test welcomes comments on this document and any other aspect of test methodology. Network Test reserves the right to change the parameters of this test at any time.

 

This document’s URL is http://networktest.com/wlan04/wlan04meth.html. A copy of this document is available in Word format at ftp://public.networktest.com/pub/wlan04/wlan04meth_1.10.zip.

1         Executive summary

This document describes the procedures used to assess the voice-over-IP handling capabilities of wireless LAN switches. The primary focus of the tests described here is on VoIP in enterprise settings.

 

Network World commissioned this project, and plans to publish test results in fall 2004.

 

We evaluate products using five criteria: audio quality, QOS enforcement, client roaming, features, and price.

For the features and price areas, we ask vendors to complete a features questionnaire and “mini-RFP.” All other tests involve performance measurements, as described in this document.

 

For the provisioning and management tests, we ask vendors to complete a “mini-RFP.” Written responses and a walk-through of system features will help determine scoring in this area.

 

1.1      Organization of this document

This document is organized as follows. This section introduces the project. Section 2 describes product requirements and the test bed infrastructure. Section 3 describes test procedures. Section 4 logs changes to this document.

 

2         Test bed requirements

This section describes product requirements and test bed infrastructure.

 

2.1      Product requirements

Participating vendors must supply the following equipment:

 

One wireless LAN switch with at least one copper 100- or 1000-Mbit/s Ethernet LAN interface (two switches if access points require direct attachment to switch)

At least two access points with 802.11b and 802.11g capabilities

Provisioning and management software as needed

 

2.2      Test bed infrastructure

This section describes the infrastructure used in testing products.

2.2.1      VeriWave WaveTest

The primary test instrument for this project is the WaveTest system from VeriWave.

 

The WaveTest system comprises a series of hardware TestPoints – up to 16 can be daisy-chained – and measurement software. The TestPoints use dedicated hardware to provide extremely precise measurements of delay and jitter on wireless networks.

 

VeriWave has developed a VoIP over WLAN test suite especially for this project. The VoIP suite measures R-value, delay, jitter, frame loss and other metrics critical to voice and data quality.
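
VeriWave does not publish the exact formulas its measurement software uses. For reference, a common definition of interarrival jitter is the running estimate in RFC 3550; the Python sketch below illustrates that calculation from per-packet send and receive timestamps. It is a generic reference, not necessarily the WaveTest's implementation.

    # Minimal sketch of RFC 3550-style interarrival jitter, computed from
    # per-packet send and receive timestamps in seconds. This is a generic
    # reference formula, not necessarily the exact method the WaveTest uses.

    def interarrival_jitter(send_times, recv_times):
        """Return the final running jitter estimate, in seconds."""
        jitter = 0.0
        prev_transit = None
        for sent, received in zip(send_times, recv_times):
            transit = received - sent          # one-way transit time
            if prev_transit is not None:
                d = abs(transit - prev_transit)
                jitter += (d - jitter) / 16.0  # RFC 3550 smoothing gain
            prev_transit = transit
        return jitter

    # Example: packets sent every 20 ms, received with small timing variation.
    sends = [i * 0.020 for i in range(5)]
    recvs = [0.030, 0.051, 0.070, 0.092, 0.110]
    print(round(interarrival_jitter(sends, recvs) * 1000, 3), "ms")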

 

R-value, described in ITU standard G.107, is a measure of voice quality derived from measurements of packet loss, jitter, and delay. The WaveTest’s VoIP suite computes R-value from direct measurements of these other metrics.
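
For reference, the G.107 E-model expresses R as a sum of impairment terms. In outline (this is the standard's general form, not the WaveTest suite's specific implementation):

    R = R_o - I_s - I_d - I_e + A

where R_o is the basic signal-to-noise term, I_s covers impairments occurring simultaneously with the voice signal, I_d covers delay-related impairments, I_e covers equipment impairments such as codec distortion and packet loss, and A is an advantage (expectation) factor.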

 

Experience suggests there is a very strong correlation between R-values and subjective scoring methods such as the Mean Opinion Score (MOS, ITU-T P.800). The following table shows the relationship between R-value and MOS measurements:

 

User Satisfaction                R-value    MOS
Very Satisfied                   90         4.3
Satisfied                        80         4.0
Some Users Dissatisfied          70         3.6
Many Users Dissatisfied          60         3.1
Nearly All Users Dissatisfied    50         2.6
Not Recommended                  0          1.0
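
The MOS column follows from the standard R-to-MOS mapping given in G.107. The Python sketch below reproduces that mapping; evaluating it at each R-value threshold yields figures that round to the MOS column above.

    # Standard ITU-T G.107 mapping from R-value to estimated MOS.
    # Evaluating it at the table's R-value thresholds reproduces the MOS column.

    def r_to_mos(r):
        if r < 0:
            return 1.0
        if r > 100:
            return 4.5
        return 1.0 + 0.035 * r + 7.0e-6 * r * (r - 60.0) * (100.0 - r)

    for r in (90, 80, 70, 60, 50, 0):
        print(f"R = {r:3d}  ->  MOS = {r_to_mos(r):.2f}")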

 

The WaveTest system also generates test traffic, including traffic with deliberately introduced errors, and it can capture and replay traffic from other sources.

 

More information about the WaveTest system is available at http://veriwave.com/pdf/waveTestDatasheet.pdf.

 

An introduction to VoIP traffic analysis is available at http://www.ittc.ukans.edu/research/thesis/documents/michael_todd_gardner.pdf

2.2.2      SpectraLink handsets

To generate voice calls, we use up to 14 e340 and i640 handsets from SpectraLink, a leading provider of wireless VoIP handsets. To handle call setup and routing, we use the SVP Server from SpectraLink, an H.323 call server.

 

SpectraLink recommends that no more than 12 handsets associate with a given access point. We use up to 14 handsets in some overload tests.

 

The handsets use G.711 codecs. Measurements with the VeriWave WaveTest system show that each call uses approximately 67 kbit/s of bandwidth in each direction at all times. Because G.711 is a constant-bit-rate codec, call volume, language, and other audio characteristics have no effect on bandwidth consumption: a call uses the same amount of bandwidth to transmit silence as it does a very loud signal.
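
The ~67 kbit/s figure includes protocol overhead on top of G.711's 64-kbit/s payload; the exact number depends on the handsets' packetization interval and on which headers the measurement counts, neither of which is specified here. The Python sketch below is a generic IP-layer calculator for illustration; the framing intervals it evaluates are assumptions, not SpectraLink's actual settings.

    # Generic per-call, per-direction IP-layer bandwidth for G.711 (64 kbit/s
    # payload) plus RTP/UDP/IP headers. Illustration only: the framing
    # intervals are assumptions, and 802.11/SVP overhead is not counted.

    G711_RATE_BPS = 64_000               # codec payload rate, bits per second
    RTP_UDP_IP_HEADER_BYTES = 12 + 8 + 20

    def call_bandwidth_kbps(framing_ms):
        payload_bytes = G711_RATE_BPS / 8 * framing_ms / 1000
        packet_bytes = payload_bytes + RTP_UDP_IP_HEADER_BYTES
        packets_per_second = 1000 / framing_ms
        return packet_bytes * packets_per_second * 8 / 1000

    for framing_ms in (20, 30, 50):      # hypothetical framing intervals
        print(f"{framing_ms} ms framing: {call_bandwidth_kbps(framing_ms):.1f} kbit/s")

Longer framing intervals spread the fixed header cost over more payload, which pushes the per-call figure closer to the raw 64-kbit/s codec rate.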

 

The SpectraLink handsets use a proprietary protocol called SVP to prioritize voice traffic during periods of congestion. SVP is widely supported in other vendors’ equipment. SpectraLink and other vendors have indicated they are working to add support for the 802.11e standard in WME mode in fall 2004. We plan to test with both SVP and 802.11e if support is available at test time.

 

More information about SpectraLink VoIP products is available at http://www.spectralink.com/products/netlink.html.

 

2.2.3      GENIE

To emulate the conditions introduced by WAN circuits separating WLAN switches at two sites, we use the GENIE (gigabit Ethernet network impairment emulator) application for the AX/4000 test instrument from Spirent Communications. GENIE introduces delay, bit errors, and packet loss in deterministic amounts. We configure GENIE to simulate conditions on calls between the US East and West coasts.
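
GENIE itself is configured through the AX/4000, and the exact delay and loss settings we use are not listed in this document. As a conceptual illustration only, the Python sketch below applies the same kind of deterministic impairment (a fixed one-way delay plus a repeatable loss pattern) to a list of packets; the values shown are placeholders, not the GENIE settings used in testing.

    # Conceptual model of deterministic WAN impairment (the function GENIE
    # performs in hardware): fixed one-way delay plus a repeatable loss
    # pattern. Delay and loss values are placeholders, not the test settings.
    import random

    ONE_WAY_DELAY_S = 0.035    # assumed coast-to-coast one-way delay, seconds
    LOSS_RATE = 0.001          # assumed packet loss rate

    def impair(packets, seed=1):
        """packets: list of (timestamp, payload) tuples; returns impaired copy."""
        rng = random.Random(seed)          # fixed seed keeps the pattern repeatable
        survivors = []
        for timestamp, payload in packets:
            if rng.random() < LOSS_RATE:
                continue                   # packet dropped by the emulated WAN
            survivors.append((timestamp + ONE_WAY_DELAY_S, payload))
        return survivors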

 

More information about GENIE is available at http://www.spirentcom.com/documents/386.pdf?wt=2&az-c=dc.

 

3         Test procedures

This section describes the procedures we use in each test event. For each test, we describe the test’s objective, test bed configuration, procedure, metrics, and reporting requirements.

 

3.1      Audio quality

3.1.1      Objective

To determine the audio quality of VoIP traffic through the system under test (SUT) under lightly loaded and heavily loaded network conditions

 

3.1.2      Test bed configuration

One or more pairs of handsets establish calls through the SpectraLink SVP Server. Our voice test traffic is a .wav file playing a continuous combined tone at 440 and 880 Hz; we play it through a sound card connected to the input jack of a SpectraLink handset.
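
The test tone itself is easy to reproduce. The Python sketch below generates an equivalent continuous combined 440/880-Hz tone as a .wav file using only the standard library; the file name, duration, and sample rate are arbitrary choices, not the exact file we use.

    # Generate a 30-second mono .wav file containing a combined 440 Hz + 880 Hz
    # tone, comparable to the voice test signal described above. File name,
    # duration, and sample rate are arbitrary choices for this sketch.
    import math
    import struct
    import wave

    SAMPLE_RATE = 8000       # Hz, telephony-style sampling rate
    DURATION_S = 30
    AMPLITUDE = 12000        # well below 16-bit full scale to avoid clipping

    with wave.open("tone_440_880.wav", "wb") as wav:
        wav.setnchannels(1)              # mono
        wav.setsampwidth(2)              # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        frames = bytearray()
        for n in range(SAMPLE_RATE * DURATION_S):
            t = n / SAMPLE_RATE
            sample = 0.5 * math.sin(2 * math.pi * 440 * t) \
                   + 0.5 * math.sin(2 * math.pi * 880 * t)
            frames += struct.pack("<h", int(AMPLITUDE * sample))
        wav.writeframes(bytes(frames))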

 

The VeriWave WaveTest system generates UDP/IP data traffic in some tests to fully load the network.

 

We begin with tests involving one access point and one WLAN switch. We then repeat tests using two access points, with an emulated WAN between the APs.

 

Vendors MUST NOT configure any QOS enforcement features on their devices for this test; in particular, support for SVP must be disabled.

 

Vendors should use the following configuration parameters:

 

Parameter                                          Value
ESSID, all APs                                     nww
AP1 IP address                                     172.16.1.4
AP1 subnet mask                                    255.255.255.0
AP1 gateway                                        172.16.1.1
AP1 VLAN ID (if needed)                            101, untagged
WLAN switch IP address                             172.16.1.254
WLAN switch subnet mask                            255.255.255.0
WLAN switch gateway                                172.16.1.1
AP2 IP address                                     172.16.2.4
AP2 subnet mask                                    255.255.255.0
AP2 gateway                                        172.16.2.1
AP2 VLAN ID (if needed)                            102, untagged
L2 802.1p/Q, Diff-serv, queuing, 802.11e, SVP      All disabled

 

3.1.3      Procedure

  1. Two handsets associate with a single access point. We designate this access point “AP1.”
  2. We establish a voice call between handsets.
  3. For at least 30 seconds, we play a combined 440/880-Hz tone through the input jack of one handset. We send no other test traffic. (We take measurements during this step.)
  4. An additional 12 handsets associate with AP1.
  5. We establish an additional six voice calls between handsets. There are now seven calls active on 14 handsets, all associated with AP1.
  6. The test instruments generate 1500-byte UDP/IP frames between emulated clients as background traffic.
  7. For at least 30 seconds, we play a combined 440/880-Hz tone through the input jack of one handset. (We take measurements during this step.)
  8. We disassociate all handsets from AP1.
  9. Two handsets associate with two access points. We designate these access points “AP1” and “AP2.” The access points use different 802.11b channels and VLAN IDs and we force the handsets to associate with separate APs.
  10. For at least 30 seconds, we play a combined 440/880-Hz tone through the input jack of one handset. We send no other test traffic. (We take measurements during this step.)
  11. An additional six handsets associate with AP1 and AP2 using the forced parameters given in step 9.
  12. The test instruments generate 1500-byte UDP/IP frames between emulated clients as background traffic.
  13. For at least 30 seconds, we play a combined 440/880-Hz tone through the input jack of one handset. (We take measurements during this step.)

3.1.4      Metrics

For each of four tests (1 call, 1 AP; max calls, 1 AP; 1 call, two APs; max calls, 2 APs):

R-value

Average delay

Maximum delay

Jitter

3.1.5      Reporting requirements

Test results

Test instrument configuration

SUT configuration

 

All configurations will be saved to a TFTP server.

 

3.2      QOS enforcement

3.2.1      Objective

To determine the ability of the system under test (SUT) to prioritize voice traffic during periods of heavy congestion

3.2.2      Test bed configuration

The test bed configuration for this event is identical to that for the audio quality tests, with one key exception: Vendors should configure the WLAN switch and/or AP to prioritize VoIP traffic.

Vendors MUST NOT reserve, or “nail up,” bandwidth for the exclusive use of VoIP traffic during this test. Bandwidth must be available to data traffic when there is no voice traffic – something we will verify with a data-only test.

 

We will test prioritization using SpectraLink’s SVP and optionally 802.11e WME if it is supported at test time.

 

See section 3.1.2 for details on the test bed configuration.

3.2.3      Procedure

  1. Fourteen handsets associate with a single access point. We refer to this access point as AP1.
  2. We establish seven voice calls between handsets.
  3. To create bandwidth contention, the test instruments generate 1500-byte UDP/IP frames between emulated clients as background traffic. The emulated clients associate with AP1, the same access point as the handsets. The transmit rate is approximately 11 Mbit/s, creating an overload. (A rough software stand-in for this background load is sketched after this list.)
  4. For at least 30 seconds, we play a combined 440/880-Hz tone through the input jack of one handset. (We take measurements during this step.)
  5. We disassociate all handsets from AP1.
  6. Seven handsets associate with each of two access points, AP1 and AP2. The access points use different 802.11b channels and VLAN IDs and we force the handsets to associate with separate APs.
  7. To create bandwidth contention, the test instruments generate 1500-byte UDP/IP frames between emulated clients as background traffic. The emulated clients will associate with both AP1 and AP2. The transmit rate through both APs will be approximately 11 Mbit/s, creating an overload.
  8. For at least 30 seconds, we play a combined 440/880-Hz tone through the input jack of one handset. (We take measurements during this step.)
  9. We disassociate all handsets from AP1 and AP2.
  10. The test instruments generate 1500-byte UDP/IP frames between emulated clients at a rate of 5.5 Mbit/s in each direction. This extra “sanity check” verifies that bandwidth is available to data when no voice traffic is present. (We take measurements during this step.)
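
In testing, the WaveTest system generates this background load. For readers who want a rough software stand-in, the Python sketch below sends 1500-byte UDP/IP packets (interpreting the 1500-byte figure as the IP packet size) toward a sink host at an approximate target rate; the destination address, port, and exact rate are placeholders, and a general-purpose host will not reproduce WaveTest's timing precision.

    # Rough stand-in for the background data load: 1500-byte UDP/IP packets
    # sent at roughly 11 Mbit/s. Destination address and port are placeholders;
    # the actual tests use the WaveTest system, not this script.
    import socket
    import time

    DEST = ("172.16.1.250", 5001)      # hypothetical traffic sink on the test bed
    PAYLOAD = b"\x00" * 1472           # 1472-byte payload -> 1500-byte IP packet
    TARGET_BITS_PER_SEC = 11_000_000
    PACKETS_PER_SEC = TARGET_BITS_PER_SEC / (1500 * 8)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    interval = 1.0 / PACKETS_PER_SEC
    next_send = time.monotonic()
    while True:
        sock.sendto(PAYLOAD, DEST)
        next_send += interval
        time.sleep(max(0.0, next_send - time.monotonic()))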

3.2.4      Metrics

For each of two tests (max calls plus data, 1 AP; max calls plus data, 2 APs):

R-value

Average delay

Maximum delay

Jitter

For one test (data only):

Forwarding rate

3.2.5      Reporting requirements

Test results

Test instrument configuration

SUT configuration

 

All configurations will be saved to a TFTP server.

 

3.3      Client Roaming

3.3.1      Objective

To determine audio quality and failover time for VoIP clients migrating from one access point to another

3.3.2      Test bed configuration

The test bed configuration for this event is similar to that for the audio quality tests, with two exceptions: First, tests will cover both single-subnet and dual-subnet roaming cases. Second, we disable the GENIE impairment emulator for this test.

 

Vendors should use the following configuration parameters:

 

Parameter                                          Value
ESSID, all APs                                     nww
AP1 IP address                                     172.16.1.4
AP1 subnet mask                                    255.255.255.0
AP1 gateway                                        172.16.1.1
AP1 VLAN ID (if needed)                            101, untagged
WLAN switch IP address                             172.16.1.254
WLAN switch subnet mask                            255.255.255.0
WLAN switch gateway                                172.16.1.1
AP2 IP address (single subnet case)                172.16.1.5
AP2 subnet mask (single subnet case)               255.255.255.0
AP2 gateway (single subnet case)                   172.16.1.1
AP2 VLAN ID (if needed) (single subnet case)       101, untagged
AP2 IP address (dual subnet case)                  172.16.2.4
AP2 subnet mask (dual subnet case)                 255.255.255.0
AP2 gateway (dual subnet case)                     172.16.2.1
AP2 VLAN ID (if needed) (dual subnet case)         102, untagged

 

See section 3.1.2 for other details on the test bed configuration.

3.3.3      Procedure

  1. Two handsets associate with a single access point. We refer to this access point as AP1.
  2. We bring up power on a second access point configured to use the same IP subnet and VLAN. We refer to this access point as AP2.
  3. We establish a voice call between handsets.
  4. For at least 60 seconds, we play a combined 440/880-Hz tone through the input jack of one handset. We send no other test traffic.
  5. At least 10 seconds after starting the tone, we disconnect power from AP1, forcing the phones to roam to AP2. (We take measurements during this step.)
  6. An additional 12 handsets associate with AP2.
  7. We restore power to AP1.
  8. We establish an additional six voice calls between handsets. There are now seven calls active on 14 handsets, all associated with AP2.
  9. The test instruments generate 1500-byte UDP/IP frames between emulated clients as background traffic. Emulated clients will associate with the same AP as the handsets and contend for bandwidth on the same channel.
  10. For at least 60 seconds, we play a combined 440/880-Hz tone through the input jack of one handset.
  11. At least 10 seconds after starting the tone, we disconnect power from AP2, forcing the phones to roam back to AP1. (We take measurements during this step; a sketch of the failover-time calculation follows this list.)
  12. We disassociate all handsets from AP1.
  13. To determine the effect (if any) of crossing subnet boundaries, we repeat steps 1-12 with AP1 and AP2 in different IP subnets and VLANs. (We take measurements during this step.)
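
The WaveTest system reports failover time directly. As an illustration of the underlying idea, the Python sketch below estimates the roam-related interruption as the largest gap between consecutive received voice packets, less the nominal packet spacing; the 20-ms spacing is an assumption for the example, not a measured value.

    # Illustrative failover-time estimate: the largest gap between consecutive
    # received voice packets, minus the nominal packet spacing. The WaveTest
    # system measures failover directly; this only sketches the idea.

    NOMINAL_SPACING_S = 0.020   # assumed voice packet spacing, seconds

    def failover_time(recv_times):
        """recv_times: sorted receive timestamps for one call's packets."""
        largest_gap = max(b - a for a, b in zip(recv_times, recv_times[1:]))
        return max(0.0, largest_gap - NOMINAL_SPACING_S)

    # Example: packets stop at 1.000 s and resume at 1.180 s during the roam.
    times = [0.940, 0.960, 0.980, 1.000, 1.180, 1.200, 1.220]
    print(round(failover_time(times) * 1000), "ms gap beyond normal spacing")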

 

 

3.3.4      Metrics

For each roaming test (1 call and max calls plus data, in both the single-subnet and dual-subnet cases):

R-value

Average delay

Maximum delay

Jitter

For one test (data only):

Forwarding rate

3.3.5      Reporting requirements

Test results

Test instrument configuration

SUT configuration

 

All configurations will be saved to a TFTP server.

 

3.4      Features comparison

3.4.1      Objective

To collect data for a table comparing WLAN switch features of each test participant

3.4.2      Test bed configuration

Not applicable

3.4.3      Procedure

We ask participating vendors to complete a detailed features questionnaire comparing products in a wide variety of areas, including RF management; rogue detection; inter-AP communications methods (if any); interswitch communications methods (if any); interface options; network management; user authentication methods; data security mechanisms; and many others.

 

We plan to publish results of the questionnaire in a features table accompanying the test.

 

We will distribute the features questionnaire to vendors after they elect to participate.

3.4.4      Metrics

Not applicable

3.4.5      Reporting requirements

Responses to features questionnaire

 

3.5      Pricing

3.5.1      Objective

To collect pricing data on systems under test

3.5.2      Test bed configuration

Not applicable

3.5.3      Procedure

We ask participating vendors to complete a mini-RFP for a fictional company’s wireless LAN switch deployment in conjunction with a wireless VoIP rollout. The RFP will seek quotes only for WLAN switches, access points, and any required management systems; VoIP systems should not be included in the quote.

 

The mini-RFP will also ask for pricing on the systems exactly as supplied for testing.

 

We plan to publish results of the mini-RFP along with test results.

 

We will distribute the mini-RFP to vendors after they elect to participate.

3.5.4      Metrics

Not applicable

3.5.5      Reporting requirements

Responses to mini-RFP

 

4         Change history

Version 1.10

19 August 2004

Section 3.1.3: Added labels for AP1 and AP2; noted which steps require taking of measurements

 

Version 1.0

18 August 2004

Initial public release

 

 

                  
