8/12/2019 RANOP Part2 Assessment v1.2
1/42
Soc Classification level
1 Nokia Siemens Networks Presentation / Author / Date
RANOP - Radio Network Optimization: Assessment
Objectives
At the end of Chapter 2, participants will be able to:
  know what network assessment is
  know why network assessment is needed
  know how to collect data for assessment
  know which data to collect from the OSS
  know which tools can be used for assessment
Contents of RANOP
Introduction to optimization
  What is network optimization?
  What should be taken into account when starting network optimization?
Assessment
  Situation at the moment: KPIs and measurements
  Measurement tables + KPIs
Solution finding (optimization) and verification
  Maximum gain in limited time
  Bottlenecks
Features to be considered
  NSN recommended features to be used in optimization
Assessment - table of contents
Configuration Assessment
  Introduction / dependency table example
  Area assessment
  Network assessment
Parameter Assessment
  Parameter values
  Default and specific values
  Parameter checks (default and specific parameters)
  Consistency checks & delta values
  Frequencies (planning tool vs. real network values)
  Parameter discrepancies
  Feature assessment
Performance Assessment
  Benchmarking KPIs used
  Performance data analysis
  Multi-vendor KPIs
  Field tests
  Alarms
Tools to be used
Assessment - Introduction

[Process flow diagram] Project starts → Inputs → Assessments & Analysis → Solutions → Prioritization → Work orders → Solution verification → Implementations → "Are criteria fulfilled?" If No: monitoring / data collection, analysis and solutions continue, with more detailed inputs (if necessary) based on the assessment, and the loop repeats. If Yes: project report → project ends.

The Assessment step is driven by performance problems and consists of:
  Configuration assessment
  Parameter assessment
  Performance assessment
Configuration Assessment - Introduction

Why is configuration assessment needed?
  It supports optimization decisions
  It shows which features can be implemented
    If there are limitations (see the dependency table), these should be known
    Example: an AMR project with lots of Talk Family BTSs
  It shows how much capacity can be increased
    If there are limitations: can the same BTS cabinet be used if an additional TRX is added?
    Do the TRXs support EDGE?
    Can HR be used?
  It specifies targets and inputs
  It shows whether special resources / knowledge will be needed
Configuration Assessment - Dependency table example

Example: DTM (Dual Transfer Mode) in the BSC

BSS / BTS dependencies (BSS release BSS12 S12):
  Flexi EDGE BTS:  no dependency
  UltraSite BTS:   no dependency
  MetroSite BTS:   no dependency
  TalkFamily BTS:  no dependency
  PrimeSite BTS:   not supported
  InSite BTS:      not supported
  2nd Gen BTS:     not supported

Core / OSS dependencies:
  SGSN:   SG5 or later
  MSC:    M11 or later
  BSC:    BSC2i or BSC3i with PCU2
  NetAct: OSS4.1
Configuration Assessment - Feature-to-feature dependency table example

Some examples of feature-to-feature dependencies:

  DTM: GPRS/EDGE must be implemented before DTM can be used; requires BSC2i or BSC3i / PCU2
  HMC: 3GPP Release 4 or earlier MSs are limited to a combined DL/UL TSL sum of 5
  DFCA: requires BSS synchronization, assuming that a Location Measurement Unit (LMU) is installed in every BTS site with DFCA
Configuration Assessment - Network element assessment

Sites / cells (site database)
  Site and cell names with IDs
  BTS configurations
  Antenna types, azimuths, heights, feeder lengths
  Location: coordinates and addresses
  MSC, BSC, LAC, RAC parameters, TX powers
  E1/T1s
BTS HW assessment
  Which HW versions are in use at a given site
BSC
  BSC name and type
  BSC utilization and maximum capacity
Parameter assessment
Parameter assessment - Parameter values

Generic process of parameter collection:
  1. Upload the actual BSC configuration
  2. Take a BSC XML dump
  3. Define an audit template
  4. Run the audit script
  5. Generate the XML output
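The collection-and-audit steps above can be sketched as a small script. This is a minimal illustration only: the XML layout (`managedObject` / `p` elements) and the parameter names are assumptions made for the example, not the real NetAct/BSC dump schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified dump format; the real BSC XML export differs,
# but the audit principle (dump -> template -> deviations) is the same.
SAMPLE_DUMP = """
<cmData>
  <managedObject class="BTS" distName="BSC-1/BCF-1/BTS-1">
    <p name="hoMarginPbgt">6</p>
    <p name="rxLevMinCell">-104</p>
  </managedObject>
  <managedObject class="BTS" distName="BSC-1/BCF-2/BTS-7">
    <p name="hoMarginPbgt">10</p>
    <p name="rxLevMinCell">-104</p>
  </managedObject>
</cmData>
"""

def load_dump(xml_text):
    """Parse a dump into {distName: {parameter: value}}."""
    root = ET.fromstring(xml_text)
    return {mo.get("distName"): {p.get("name"): p.text for p in mo.iter("p")}
            for mo in root.iter("managedObject")}

def run_audit(params, template):
    """Report every (object, parameter, actual, expected) deviation."""
    findings = []
    for dist_name, values in params.items():
        for name, expected in template.items():
            if values.get(name) != expected:
                findings.append((dist_name, name, values.get(name), expected))
    return findings

dump = load_dump(SAMPLE_DUMP)
audit_template = {"hoMarginPbgt": "6", "rxLevMinCell": "-104"}
print(run_audit(dump, audit_template))
# [('BSC-1/BCF-2/BTS-7', 'hoMarginPbgt', '10', '6')]
```

In a real project the audit output would in turn be written back out as XML, matching step 5 of the process.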
Parameter assessment - Parameter values

The following parameter objects should be collected to get a full parameter assessment:

  ACP, ADJC/ADCE, ANE, BA/BAL, BCF, BSC, BTS, DAP, GPC, HOC, IO_TEXT/TID, LAC to SPC, LCSE, LMUA, MA/MAL, NS_VC, NSE, PCU, POC, RA, RTSL, SG, SMLC, TRK_TBL/TRKTB, TRX, UADJC/ADJW

In addition:
  PRFILE and FIFILE
  SS7 signalling network parameters
  Parameters of X.25
  Parameters of CLNS
  PAFILE
Parameter assessment - Default and specific parameter values

Default parameter values:
  Almost all parameters are set to default values
  Default values are defined based on studies

Specific parameter values:
  Special solutions for certain areas
  Used where the default values do not work well
  Changes might be quite dangerous, as they may have a critical impact on the network
  It is not recommended to fix side effects of special values by setting further special values
Parameter assessment - Default parameter check

Default parameter check:
  Export the actual default parameters from the OSS, e.g. through an XML file
  Always use the latest default values (based on the latest studies)
  Import the XML file into a database, e.g. using CM Plan Editor
  Compare the planned default sets with the actual default parameters in the network
  Mark exceptions for each default parameter
  Show the differences
  Select the changes to be made
  Create a .dat or XML file for the correction

Note! It is very important to verify whether any changes to default parameters have been made before any optimization, as this may result in undesired and unexpected impacts on the network.
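The check steps above (compare, mark exceptions, show differences, select changes) can be sketched as follows. Object names, parameter names and values are illustrative, not real NSN defaults.

```python
def default_check(planned_defaults, actual_objects, exceptions=()):
    """Split deviations from the planned defaults into documented exceptions
    (intentional specific values) and corrections still to be made."""
    documented, corrections = [], []
    for obj, actual in sorted(actual_objects.items()):
        for param, planned in sorted(planned_defaults.items()):
            if actual.get(param) == planned:
                continue
            deviation = (obj, param, actual.get(param), planned)
            # A deviation on the exception list is an accepted specific value;
            # anything else goes into the correction plan.
            (documented if (obj, param) in exceptions else corrections).append(deviation)
    return documented, corrections

planned = {"hoMarginPbgt": 6, "msTxPwrMax": 33}        # illustrative planned defaults
actual = {
    "BTS-1": {"hoMarginPbgt": 6, "msTxPwrMax": 33},    # matches the defaults
    "BTS-2": {"hoMarginPbgt": 12, "msTxPwrMax": 33},   # documented specific value
    "BTS-3": {"hoMarginPbgt": 6, "msTxPwrMax": 30},    # undocumented -> correct it
}
documented, corrections = default_check(planned, actual,
                                        exceptions={("BTS-2", "hoMarginPbgt")})
print("documented exceptions:", documented)
print("corrections to plan:", corrections)
```

The correction list is what would then be serialized into the .dat or XML correction file.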
Parameter assessment - Specific parameter check

[Diagram: several overlapping "specific parameter areas" in the network]

If there are lots of specific parameters in the network, what kind of problems might exist?
"Hmm! Do I really handle all these specific relationships?"
Parameter assessment - Consistency check

[Diagram] The OSS CM database holds the actual default and actual specific parameter values; the master configuration database holds the planned default values; the NetAct Planner planning database holds the plan. Parameter files are exported and imported between these databases, and consistency checks are run between them, as well as against MapInfo via ODBC.
Parameter assessment - Consistency check

Delta values: planning tool values vs. network values (next slide)
  All values should be as planned
  For example frequencies, BSICs, neighbours
Delta values: network values vs. default/recommended values
  If no specific values are intended, parameter values should be set to the default value
  BSC, BTS, GPRS, POC, HOC
Discrepancies: neighbour parameters
  Non-symmetrical neighbours (see slide)
  BTS parameters vs. adjacency parameters
Parameter assessment - Discrepancies: non-symmetrical adjacencies

Find all non-symmetrical adjacencies (Cell A / Cell B):
  Cell B is a neighbour of Cell A
  Cell A is not a neighbour of Cell B
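A minimal sketch of this check, assuming the adjacency definitions have already been extracted as directed (source cell, neighbour cell) pairs:

```python
def non_symmetrical(adjacencies):
    """Given a set of directed (source_cell, neighbour_cell) adjacencies,
    return every pair whose reverse definition is missing."""
    return sorted((a, b) for (a, b) in adjacencies if (b, a) not in adjacencies)

# A -> B is defined, but B -> A is missing; B <-> C is symmetrical
adj = {("A", "B"), ("B", "C"), ("C", "B")}
print(non_symmetrical(adj))  # [('A', 'B')]
```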
Parameter assessment - Discrepancies: N-BCCH in the ADCE table against FREQ in the TRX table

TRX table:  BSC id | BTS id | TRX id | FREQ (fA)
ADCE table: BSC id | BTS id | N-LAC | N-Cell id | N-BCCH (fAB)

For the adjacency Cell A → Cell B, the neighbour BCCH frequency fAB in Cell A's ADCE entry must match the BCCH frequency actually configured in Cell B's TRX table.

This applies also to other cell parameters!
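A sketch of this cross-table check, assuming the two tables have already been reduced to the relevant fields (cell → BCCH frequency, plus the ADCE adjacency rows); the cell names and frequencies are made up for the example:

```python
def bcch_discrepancies(trx_table, adce_table):
    """For each adjacency, compare the N-BCCH frequency stored in the ADCE
    entry with the actual BCCH TRX frequency of the neighbour cell."""
    findings = []
    for (cell, n_cell, n_bcch) in adce_table:
        actual = trx_table.get(n_cell)
        if actual is not None and actual != n_bcch:
            findings.append((cell, n_cell, n_bcch, actual))
    return findings

# BCCH TRX frequency per cell (from the TRX table)
trx = {"CellA": 37, "CellB": 62}
# (source cell, neighbour cell, N-BCCH) rows from the ADCE table;
# CellA's entry still carries CellB's old BCCH frequency 58
adce = [("CellA", "CellB", 58), ("CellB", "CellA", 37)]
print(bcch_discrepancies(trx, adce))  # [('CellA', 'CellB', 58, 62)]
```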
Parameter Assessment - BSS feature assessment

The following features must be checked, and interworking issues among them must be emphasized:
  Idle mode settings and Directed Retry
  BB and RF hopping
  MultiBCF and Common BCCH
  Umbrella
  FR / HR / DR / AMR
  HSCSD
  Extended Cell
  Intelligent Underlay-Overlay
  Priority-based QoS
  NMO1
  NCCR
  CS1-4
  NACC
  DFCA
  DTM
Performance Assessment - Introduction

Why is performance assessment needed?
  Performance assessment is needed to get a clear picture of network functionality and to find the possibilities for increasing performance.

Challenges:
  Missing data
    If some hourly data is missing, it is difficult to find out without hourly analysis (hint: a sharp drop in traffic)
  Not all tables are activated
  Not all tables can be found in the OSS
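The "sharp drop in traffic" hint can be automated with a simple screening pass over the hourly traffic series. The 20% threshold below is an arbitrary choice for the example, not a recommended value:

```python
def suspect_hours(hourly_traffic, drop_ratio=0.2):
    """Flag hours whose traffic collapses to below drop_ratio of the average
    of the surrounding hours - a typical sign of missing measurement data."""
    flagged = []
    for i in range(1, len(hourly_traffic) - 1):
        surrounding = (hourly_traffic[i - 1] + hourly_traffic[i + 1]) / 2
        if surrounding > 0 and hourly_traffic[i] < drop_ratio * surrounding:
            flagged.append(i)
    return flagged

# Erlangs per hour; hour 3 drops sharply -> probably missing data
traffic = [40.0, 42.0, 41.0, 2.0, 43.0, 44.0]
print(suspect_hours(traffic))  # [3]
```

A real check would also distinguish genuine low-traffic hours (e.g. at night) from measurement gaps, for example by comparing against the same hour on previous days.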
Performance Assessment - Performance data analysis

Performance data measurements and analysis are usually separated into:
  OSS performance data statistics
  Benchmarking KPIs
  Multi-vendor KPIs
  Alarms
  Customer complaints
  Drive test analysis
  Planning tools
  Executive summary & overview
Performance Assessment - Statistics

Strong points (+):
  Allows centralized data collection
  A cost-efficient way to monitor network quality
  Pro-active
  Selective performance monitoring (e.g. after new feature activation)
  Permanent information flow
  Useful for monitoring trends and locating problems on a per-cell level

Weak points (-):
  Needs a statistically relevant traffic volume to provide reliable results
  Only limited geographical localization of problems is possible
  The meaning of the counters can be difficult to understand
Performance Assessment - Benchmarking KPIs used

Benchmarking KPIs:
  Recommended for reporting
  Release specific
Customer KPIs:
  Almost all operators have operator-specific KPIs
  Important KPIs, because optimization is done for the customer
Limitations:
  SW release limitations: there might be errors in the results if the wrong KPIs are used
  Feature limitations: there might be errors in the results if the wrong KPIs are used
    For example, if HR is not used
Performance Assessment - Benchmarking KPIs: CS

OSS counter and CS KPI analysis gives an exact picture of network performance. The analysis of CS KPIs can be based on the following list:

  Accessibility: SDCCH access KPIs; TCH and SDCCH blocking KPIs; HO failure due to blocking
  Retainability: TCH and SDCCH drop KPIs; total HO failure, HO drops
  Quality: UL/DL quality KPIs
  Traffic share: TCH and SDCCH traffic sums; AMR traffic share

The S13 EDGE, GPRS and GSM KPIs can be seen here.
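As an illustration of how such KPIs are formed from raw counters, here are two toy formulas. The counter names and the exact formulas are simplified assumptions for the example, not the release-specific NSN KPI definitions:

```python
def tch_drop_ratio(tch_drops, tch_seizures):
    """Retainability: dropped TCH calls as a share of successful seizures, in %."""
    return 100.0 * tch_drops / tch_seizures if tch_seizures else 0.0

def sdcch_blocking(sdcch_busy_attempts, sdcch_attempts):
    """Accessibility: SDCCH attempts rejected because all SDCCHs were busy, in %."""
    return 100.0 * sdcch_busy_attempts / sdcch_attempts if sdcch_attempts else 0.0

# Illustrative counter values for one cell over a measurement period
print(round(tch_drop_ratio(tch_drops=18, tch_seizures=1500), 2))             # 1.2
print(round(sdcch_blocking(sdcch_busy_attempts=5, sdcch_attempts=2000), 2))  # 0.25
```

This is also where the release-specific limitation bites: if a counter is not incremented in the deployed SW release, the formula silently produces a wrong KPI.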
Performance Assessment - Benchmarking KPIs: PS

OSS counter and PS KPI analysis gives an exact picture of network performance. The analysis of PS KPIs can be based on the following list:

  Network usage: UL/DL PS traffic, UL/DL payloads
  PDTCH congestion: DL hard/soft blocking
  Abis/PCU congestion: inadequate EDAP resources in DL; DL MCS selection limited by the PCU
  PDTCH quality: UL/DL (E)GPRS RLC throughput, TBF success ratio
  Mobility: downlink flushes per minute
  Availability: data service availability ratio
  User experience: LLC throughput
Performance Assessment - Benchmarking KPIs: PS (continued)

  QoS: TBF establishment/success ratio; connectivity capacity
  DL MCS selection KPIs
  Mobility: DL flush
  E2E data rate: LLC throughput; volume-weighted LLC throughput
Performance Assessment - Multi-vendor KPIs

Multi-vendor KPI comparison is like comparing apples and kiwis.

Strong points (+):
  Performance can be compared using statistics

Weak points (-):
  Counter trigger points might be different
  Calculation methods might be different: are we counting failures, or (all attempts - successful attempts)?
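The difference between the two calculation methods matters whenever some attempts end without triggering either a success or an explicit failure counter (for example, attempts abandoned by the user). A toy illustration with made-up counter values:

```python
def failure_ratio_from_fails(fails, attempts):
    """Vendor 1: counts only explicit failure events, in %."""
    return 100.0 * fails / attempts

def failure_ratio_from_successes(successes, attempts):
    """Vendor 2: everything that did not succeed counts as a failure, in %."""
    return 100.0 * (attempts - successes) / attempts

# 1000 attempts: 950 succeed, 30 fail explicitly, and 20 are abandoned,
# triggering neither a success nor a failure counter.
attempts, successes, fails = 1000, 950, 30
print(failure_ratio_from_fails(fails, attempts))          # 3.0
print(failure_ratio_from_successes(successes, attempts))  # 5.0
```

Both vendors report a "failure ratio", yet the same network produces 3.0% on one and 5.0% on the other: exactly the apples-and-kiwis problem above.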
Performance Assessment - Alarms

In the assessment report, all main active alarms can be mentioned:
  Note! Also check whether any alarms are blocked / filtered
  If there are lots of active alarms, the collected data is not valid
    Performance data is not reliable
    Optimization targets would be based on wrong inputs
  Optimization can be started when (most of) the critical active alarms have been resolved

  * low priority, ** medium priority, *** high priority
Performance Assessment - Customer complaints

Strong points (+):
  Good co-operation between network optimization and the customer service department is necessary
  Helps to find unknown problems
  Helps to confirm known problems
  Might pinpoint the location of a problem
  Reports indoor coverage holes
  May trigger fast decisions from management if heavy users are involved
  Reveals problems with specific mobiles
  Reveals problems with mobile configuration

Weak points (-):
  Reacting to customer complaints is reacting too late
  Not efficient for optimizing a whole network
  Difficult to distinguish between MSS and BSS problems

E.g. errors in the numbering plan can only be detected from customer complaints!
Performance Assessment - Field tests

Another way of measuring the target network's performance; necessary in different steps of optimisation.

The drive test process consists of:
  1. Doing the test and collecting the logs
  2. Post-processing the logs
  3. Analysing the drive test

[Diagram: drive test setup - GPS, outdoor antenna, indoor antenna, drive test software, pre-defined route, network]
Performance Assessment - Field tests

Strong points (+):
  Helps to adjust propagation models
  Troubleshooting: detects problems that can be masked by statistics
  Graphical reports on maps that make it easy to identify hot areas
  Monitoring and validation of new functionality or changes made in the network
  Identifying coverage problems, meaning both coverage holes and cells covering less or more than wanted; used in antenna tilt optimisation
  Finding interference (difficult to measure) and bad quality
  Detecting adjacency problems:
    Missing neighbours
    Unnecessary or unwanted neighbours
    Incorrectly defined adjacencies
  Finding hardware problems:
    Crossed sectors
    Mixed antenna lines
    Faulty units (TRXs, BBUs, etc.)
    Imbalance problems (e.g. due to ROE in cables or jumpers)
Performance Assessment - Field tests

Weak points (-):
  Very resource- and time-consuming, making it expensive to monitor wide areas; this sometimes restricts drive tests to specific areas
  QoS monitoring from a moving subscriber's point of view gives a feeling of the customer view, although slow-moving, indoor and high-floor MSs are not taken into consideration
  Supplies in most cases only downlink information
  A snapshot in time
  Statistics can be influenced by the speed of the drive through good and bad areas
Performance Assessment - Field test examples
CS:
  Coverage
  Quality / interference
  HO measurements
  CSSR with short call duration
  Traffic handling measurements
PS:
  PSW accessibility analysis: GPRS attach / PDP context activation / TBF establishment
  Throughput analysis, stationary: average throughput (RLC/MAC and application), MCS distribution, BLER, C/I ratio
  Throughput analysis, mobility (intra/inter PCU and RAU cell reselection): LLC, RLC/MAC; retransmissions caused by cell reselection
Performance Assessment - Optimizer

Problem cells based on the interference matrix from Optimizer.
Performance Assessment - Executive summary and overview

The following items should be mentioned in the performance assessment report:
  Executive summary
    Strengths
    Weaknesses
  Overview
    All KPIs with values should be listed (network level & BSC level)
    The main causes of problems should be mentioned
Tools to be used
Tools to be used

Examples of tools to be used in the assessment phase:

  Configuration data: NetAct tools (Plan Editor, CM Plan Manager, etc.); planning tools (NetAct Planner, Asset, etc.)
  Performance data: statistics (Reporting Suite, Metrica, Report Builder, etc.); drive tests (Nemo Outdoor, TEMS, Actix, etc.)
  Parameter assessment: NetAct tools (Plan Editor, Reporting Suite, etc.); planning tools (NetAct Planner, Asset, etc.)