Central London Benchmarking

Use Case

Document V 1.2, August 2013. Copyright AWTG Ltd.

AWTG Limited, 7th Floor, Westgate House, Westgate Road, London, W5 1YY.
Tel. 0208 799 0368, Fax. 0208 728 9610
www.awtg.co.uk


Contents

Executive Summary
Introduction
    Why is QoE important?
    How does QoE help?
Test Case
    Information about test tools
    Test Area
    Methodology
Results
    3G Radio Summary Report
    Bus Test
    LTE vs. Wi-Fi
Conclusion


Central London Benchmarking Use Case

Executive Summary

AWTG has recently carried out a pilot mobile-service benchmarking trial. The benchmarking was aimed at multi-dimensional and multi-level analysis.

The project involved benchmarking four major Mobile Network Operators during the Christmas (2013) period. The main features of the project included:

• Benchmarking of 3G voice and data services
• Testing and benchmarking the recently launched and much-hyped EE LTE service against carrier-grade outdoor Wi-Fi services
• Testing and benchmarking mobile services on the new ‘Boris’ buses.

Introduction

End-user Experience, or Quality of Experience (QoE), measurement is the process of understanding the performance of a service as compared with customer expectations and requirements.

The aim is to provide an objective measure by using all of the company's testing tools and techniques - in-house QoE and SwissQual tools, static service testing, walk testing, even testing on a variety of London buses - to synthesize an overall real-world user experience score for mobile networks already up and running.

Three main scenarios were covered in this project - Outdoor, Indoor and In-bus. The major technologies evaluated in the project were 2G, 3G, LTE and Wi-Fi.

Why is QoE important?

Most market analysts agree that the two most visible trends driving market changes are:

• Rising operating costs and
• Stagnating subscription revenue growth.

Given this environment, retaining the customer base and raising Average Revenue Per User (ARPU) become the main focus of most service providers.

An Accenture survey showed that 29 out of 30 unhappy customers never call to complain, and 90% of customers will not complain before defecting. Further, studies by EPSI (the Extended Performance Satisfaction Index), which covers over 20 European and Asian countries, indicate that subscribers are growing slightly more dissatisfied with mobile and broadband services.

How does QoE help?

Traditionally, service providers have invested significant resources in measuring and managing Quality of Service (QoS).

Quality of Service (QoS) considers the network elements up to, but not including, the subscriber. It helps answer the question “How does the network perform?”, i.e. system-wide performance measurements and third-party services such as DNS resolution and email. What it fails to provide is an understanding of “How well did the service meet customer expectations?”, i.e. what the customer experience was, irrespective of how well the core network handled the customer traffic.

The benefits of effective QoE measurement include:

• Customer retention through increased satisfaction with services
• Increased consumer take-up of new discretionary-spend services such as pay-per-media
• Reduced operational costs through reduced customer complaints
• Efficiencies in capacity planning through increased knowledge

While QoS management remains necessary, it stops short of capturing the actual customer experience, leaving operators without critical knowledge. Quality of Experience measurements are made at the point of delivery - directly from the subscriber's smartphone or PC.


Test Case

Information about test tools

AWTG used a SwissQual benchmarking tool called “Ranger” for this project. Data collection was carried out using SwissQual NQView, and post-processing using NQDI and AWTG in-house tools.

Figure 1: SwissQual Test Equipment

The AWTG QoE and Broadband tester was used to capture detailed log files, especially for the Wi-Fi benchmarking; in addition, AWTG's spectrum interference analyzer tool was used for more detailed analysis.

Test Area

Arguably the busiest half-square-kilometre cluster in the UK was chosen for the test. The idea was to choose a high-footfall area where operators aim to deliver their best performance, yet where the sheer crowds still push the network to its limits. There was no better option than the ‘golden square’ comprising Oxford Circus, Tottenham Court Road, Piccadilly Circus and Leicester Square, covering approximately 0.57 km², during the busiest Christmas period.

Figure 2: Test Area

Methodology

The three main scenarios covered in this project were Outdoor, Indoor and In-bus. The walk test was conducted along an approximately 10-12 km outdoor pedestrian path, as shown in Figure 3. The major KPIs collected, processed and reported during the project were speech quality (MOS), FTP uplink/downlink throughput, HTTP throughput and latency.

Figure 3: Outdoor Test Route
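To make the KPI set above concrete, the following minimal Python sketch shows one way per-operator averages of these KPIs could be computed from walk-test samples. It is illustrative only; the record fields and function names are assumptions and do not reflect the SwissQual NQView/NQDI log format.

```python
# Illustrative sketch only: the record fields below are assumptions and do
# not reflect the SwissQual NQView/NQDI log format.
from dataclasses import dataclass
from statistics import mean

@dataclass
class TestSample:
    operator: str        # anonymised operator label, e.g. "OP1".."OP4"
    mos: float           # speech quality (MOS, 1-5)
    ftp_dl_mbps: float   # FTP downlink throughput
    ftp_ul_mbps: float   # FTP uplink throughput
    http_mbps: float     # HTTP throughput
    latency_ms: float    # ping round-trip time

def summarise(samples):
    """Average each KPI per operator across all collected samples."""
    per_op = {}
    for s in samples:
        per_op.setdefault(s.operator, []).append(s)
    return {
        op: {
            "MOS": mean(s.mos for s in group),
            "FTP DL (Mbps)": mean(s.ftp_dl_mbps for s in group),
            "FTP UL (Mbps)": mean(s.ftp_ul_mbps for s in group),
            "HTTP (Mbps)": mean(s.http_mbps for s in group),
            "Latency (ms)": mean(s.latency_ms for s in group),
        }
        for op, group in per_op.items()
    }
```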

A 2.4 km stretch of the route served by both the Boris bus and the traditional London double-decker bus was tested, as shown in Figure 4. The purpose was to establish whether there is any mobile signal degradation in the new Boris buses compared with existing buses. A walk test was also conducted along the bus route to compare signal levels on the bus with those while walking.

Figure 4: Bus Route

Static data tests and voice tests were conducted inside the Bentalls shopping centre. All the tests described for the outdoor scenario were also conducted here, using the same test specifications.



Data tests were carried out every 10 meters on the first floor. Voice tests were carried out while walking slowly on the ground floor in order to collect enough MOS samples.

The figures below show the test area inside the Bentalls shopping centre.

Figure 5: Indoor Test Area

For the outdoor tests, the defined test area was divided into five clusters and a time plan was devised for conducting tests in each location: Cluster 1 from 9:00 to 10:30, Cluster 2 from 10:30 to 12:30, Cluster 3 from 12:30 to 14:30, Cluster 4 from 14:30 to 16:00 and Cluster 5 from 16:30 to 18:00.

The same test specifications were configured for each operator. For voice tests, a call duration of 120 seconds and a pause duration of 30 seconds were selected. For FTP tests, the same server location was chosen for all operators in order for the benchmarking to be fair; the FTP upload file size was 100 MB and the FTP download file size was 250 MB. For HTTP tests, the most-visited websites, such as Google, MSN, Yahoo, BBC and Amazon, were chosen. A ping test was performed before and after each data test in order to measure latency.
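For clarity, the test specification above can be restated as a simple configuration structure. The sketch below (Python) is purely illustrative; the key names and layout are assumptions, not an AWTG or SwissQual format.

```python
# Illustrative restatement of the test specification described above.
# Key names and structure are assumptions, not an AWTG/SwissQual format.
TEST_SPEC = {
    "voice": {
        "call_duration_s": 120,     # per test call
        "pause_duration_s": 30,     # pause between calls
    },
    "ftp": {
        "server": "same location for all operators",  # for fair benchmarking
        "upload_file_mb": 100,
        "download_file_mb": 250,
    },
    "http": {
        "sites": ["Google", "MSN", "Yahoo", "BBC", "Amazon"],
    },
    "ping": {
        # latency measured immediately before and after each data test
        "before_and_after_each_data_test": True,
    },
}
```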

An elaborate test plan was put in place in order to finish all tests according to the schedule. On the first day, FTP upload tests were conducted along with voice tests. On the second day, FTP download tests were conducted along with voice tests. HTTP tests were conducted on days 3 and 4. On day 5, indoor tests were conducted: all voice and data tests were repeated inside the busy shopping mall. On day 6, all voice and data tests were conducted on the Boris bus. From days 7 to 11, the LTE service was benchmarked against 3G and Wi-Fi services.

Results

3G Radio Summary Report

All operators had good coverage in the area; OP2 had the best coverage.

Figure 6: Radio Summary



The statistics in Figure 7 below show that acceptable call success rates and throughput values were noted for all operators. For HTTP tests, OP3 was by far the best performer.

Figure 7: Call Statistics

Figure 8: FTP

Figure 9: HTTP Throughput

Accessibility and MOS statistics in Figures 10, 11 and 12 reveal very interesting facts. Operators sometimes choose to compromise quality to achieve higher network accessibility. The MOS is expressed as a single number in the range 1 to 5, where 1 is the lowest perceived audio quality and 5 is the highest. In terms of voice quality: 1 Bad (very annoying impairment), 2 Poor (annoying impairment), 3 Fair (slightly annoying impairment), 4 Good (perceptible but not annoying impairment) and 5 Excellent (imperceptible impairment). As shown in Figure 11, OP3 shows the best MOS but lower accessibility compared with the other operators.
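As a small worked example of the scale above, the helper below (hypothetical, for illustration only) maps a measured MOS value to its verbal category; rounding to the nearest integer category is an assumption, not part of the report's methodology.

```python
# Maps a measured MOS value (1-5) to the verbal categories listed above.
# Rounding to the nearest category is an illustrative assumption.
MOS_CATEGORIES = {
    1: "Bad (very annoying impairment)",
    2: "Poor (annoying impairment)",
    3: "Fair (slightly annoying impairment)",
    4: "Good (perceptible but not annoying impairment)",
    5: "Excellent (imperceptible impairment)",
}

def mos_category(score: float) -> str:
    """Return the verbal category for a MOS score in the range 1-5."""
    if not 1.0 <= score <= 5.0:
        raise ValueError("MOS must be between 1 and 5")
    return MOS_CATEGORIES[round(score)]

# Example: an average MOS of 3.6 falls in the "Good" category.
print(mos_category(3.6))
```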

Figure 10: Accessibility-Voice

Figure 11: Average MOS for all operators

Figure 12: MOS

Sample MOS Heat Map


Bus Test

Some results from the tests conducted on the Boris bus route are shown in the figures below. RSCP levels in the conventional bus and the Boris bus were almost the same. As expected, there was some degradation compared with the RSCP levels while walking. Good MOS was noted in all scenarios.

Figure 13: Average RSCP

Figure 14: Average MOS

Figure 15: MOS results whilst Walking

Figure 16: MOS results whilst in a Normal Bus

Figure 17: MOS results whilst in a Boris Bus

(Figures 15, 16 and 17 show 3G MOS results for OP4.)


LTE vs. Wi-Fi

LTE coverage was generally good in the area, but one pocket of poor coverage was noticed during post-processing. The poor coverage resulted in failed ping tests and lower FTP throughput. In-depth analysis revealed that there was no LTE site close to that area.

Figure 18: LTE Coverage

Data throughputs achieved by 3G networks do not provide a meaningful comparison with LTE. LTE was therefore benchmarked against a carrier-grade outdoor Wi-Fi network and showed comparable performance.

As indicated in Figure 19, FTP downlink performance with LTE is lower than with Wi-Fi but more consistent, while the upload performance of LTE is much higher.

Packet loss on LTE is much lower than on Wi-Fi. Delay values are comparable for LTE and Wi-Fi, but LTE again shows more consistency.

Figure 19: LTE and Wi-Fi Statistics
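The report does not state how “consistency” was quantified. One simple, commonly used measure is the coefficient of variation (standard deviation divided by the mean) of repeated throughput or delay measurements; the sketch below uses made-up sample values purely for illustration.

```python
# Sketch of one way to quantify "consistency": a lower coefficient of
# variation (standard deviation / mean) indicates more consistent results.
# The sample values below are invented for illustration only.
from statistics import mean, stdev

def coefficient_of_variation(samples):
    return stdev(samples) / mean(samples)

lte_ftp_dl_mbps  = [22.1, 23.4, 21.8, 22.9, 23.0]   # hypothetical repeated runs
wifi_ftp_dl_mbps = [35.0, 12.4, 41.7, 18.2, 30.5]   # hypothetical repeated runs

print(f"LTE   CV: {coefficient_of_variation(lte_ftp_dl_mbps):.2f}")
print(f"Wi-Fi CV: {coefficient_of_variation(wifi_ftp_dl_mbps):.2f}")
```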

(The LTE coverage and statistics figures show the LTE operator's RSRP, ping test results and FTP downlink throughput.)


Web throughput performance of LTE and Wi-Fi is comparable in both locations, while the overall “time to load” for LTE is better than for Wi-Fi.

Figure 20: LTE and Wi-Fi Web Performance

Conclusion

The aim was to measure the quality of the end-user experience: to use all of the company's testing tools and techniques (in-house QoE and SwissQual tools, static service testing, walk testing, and testing on a variety of London buses) to synthesize an overall real-world user experience score for live mobile networks.

One of the most surprising results was the sharp difference in performance between the networks on specific voice and data measures. Even networks that were technologically very similar showed varied performance: voice MOS values and call setup times, for example, differed markedly across all networks.

But the real surprise came when the testing compared the overall performance of public Wi-Fi against LTE: here the differences were significant, and most were in the Wi-Fi networks' favour. The results illustrate not only that Wi-Fi (as the current instantiation of the small cell) is playing an increasingly important role in mobile device access, but also that mesh Wi-Fi can already beat LTE at its own game.

The quality of experience for static web browsing showed rough parity between Wi-Fi and LTE. AWTG also undertook a static FTP uplink/downlink test to measure throughput performance: Wi-Fi beat LTE on the critical downlink but lost out to LTE on the uplink. With YouTube, a critical mobile broadband use case, Wi-Fi overall came out better than LTE.

AWTG believes that these scores should be the basis for consumer comparison of networks. Ideally, the scoring, and its dissemination as a consumer-friendly network comparison tool, would become the responsibility of the regulator. For regulators the message is clear: they need to focus on the quality of the end-user experience to reflect the true state of mobile broadband networks.

The overall message to the networks, though, is that LTE will not be a panacea if it is simply deployed using conventional macro- and micro-cellular models. The only way they can meet future consumer needs, in terms of coverage and capacity in dense urban areas, is to invest heavily in small cells and operator-managed Wi-Fi.

