Sample Core Technical Audit Report



    Core Network Audit Report for XXXX

Aircom International 2011 – Commercial in Confidence

    Table of Contents

    1  INTRODUCTION .............................................................................................................................................. 9 

    2  NETWORK OVERVIEW ..................................................................................................................................... 9 

2.1 Core Hardware Location per City ................................................................................................... 9

2.2 BSS Hardware Location per City ................................................................................................... 10

3  CAPACITY ANALYSIS ..................................................................................................................... 11

3.1 VLR Subscriber Register Capacity ................................................................................................. 11

3.2 VLR Subscriber Capacity Currently in Use .................................................................................... 12

3.3 VLR Subscriber Capacity Utilization .............................................................................................. 12

3.4 HLR Subscriber Capacity and Utilization ....................................................................................... 13

3.5 MGW Capacity and Utilization ...................................................................................................... 13

    4  PERFORMANCE INDICATORS ........................................................................................................................ 14 

4.1 Introduction ................................................................................................................................. 14

4.2 Concepts ...................................................................................................................................... 14

4.3 Availability ................................................................................................................................... 18

4.3.1 System Downtime ..................................................................................................................... 18

4.3.2 Signaling Performance, SS7 Link Availability, ETSI .................................................................... 18

4.4 Accessibility ................................................................................................................................. 19

    4.4.1   Authentication ........................................................................................................................... 19

    4.4.2  Ciphering, GSM  ......................................................................................................................... 19

    4.4.3  CP Processor Load .................................................................................................................... 20

    4.4.4  Location Update........................................................................................................................ 21

4.4.5 Mobile IN Calls .......................................................................................................................... 23

    4.4.6  Channel Assignment ................................................................................................................. 23

    4.4.7  Short Messages Service (SMS), ORG ...................................................................................... 24

    4.4.8  Short Messages Service (SMS), TERM .................................................................................... 25

    4.4.9  Successful SMS Delivery Terminating SMS............................................................................. 26

4.4.10 Signaling Performance, SS7 Link Congestion .......................................................................... 27


6.4 MSC SIGTRAN M3UA Routing Performance ................................................................................. 50

    7  M-MGW KPI .................................................................................................................................................. 51 

7.1 Scope ........................................................................................................................................... 52

7.2 Introduction ................................................................................................................................. 52

7.3 Key Performance Indicators for Internal Accessibility ................................................................... 55

7.4 Key Performance Indicators for External Accessibility .................................................................. 56

7.5 AAL2 Termination Seizure Success Rate ...................................................................................... 56

7.6 TDM Termination Reservation Success Rate ................................................................................ 57

7.7 IP Termination Seizure Success Rate ........................................................................................... 58

7.8 Originating Nb Connection Initialization Success Rate ................................................................. 58

7.9 Software Licensing, Media Stream Channel Seizure Success Rate ............................................... 59

7.10 Interactive Messaging, Basic Message Success Rate ................................................................. 59

7.11 Interactive Messaging, Message Composition Success Rate ...................................................... 60

7.12 Outgoing AAL2 Connection Reservation Success Rate ............................................................... 60

7.13 Retainability ............................................................................................................................... 61

7.14 Integrity ..................................................................................................................................... 62

    7.14.1  SS7 over ATM QoS  ................................................................................................................... 63

    7.14.2  SS7 over TDM QoS  ................................................................................................................... 63

7.14.3 Signaling over IP Discard Ratio (Gigabit Ethernet interface) .................................................... 64

    7.14.4  IP Bearer success rate (HOST)................................................................................................ 65

    7.14.5   Aal2 Bearer establish success rate .......................................................................................... 65

    7.14.6  SCTP  .......................................................................................................................................... 65

    7.14.7  Sigtran Retransmission  ............................................................................................................ 66

7.14.8 M3UA ...................................................................................................................................... 67

7.15 Key Performance Indicators for Traffic and Load ........................................................................ 69

    7.15.1  Usage Rate of Received and Transmitted ATM Cells on a VC Link  ...................................... 72

    7.15.2  TDM termination success rate ................................................................................................. 74

    7.15.3  Media stream resource reservation rate ................................................................................. 74

    7.15.4  GCP message Statistics ............................................................................................................ 75


    7.15.5  MTP2 Link usage  ...................................................................................................................... 75

    7.15.6  Mtp3b Link usage  ..................................................................................................................... 77

    7.15.7  TDM utilization Rate  ................................................................................................................. 78

    7.15.8  Device pool utilization rate  ...................................................................................................... 79

    7.15.9  Device pool reservation success rate ...................................................................................... 79

    7.15.10  Processor Load...................................................................................................................... 79

    7.15.11  Current Traffic Load  ............................................................................................................. 81

7.15.12 Software Licensing, Media Stream Channel Utilization Rate .................................................. 81

8  BSS AUDIT AND TROUBLESHOOTING ........................................................................................... 83

8.1 TRH Overload ............................................................................................................................... 83

8.2 TRA Pool Supervision Definition ................................................................................................... 84

    9  DOCUMENTATION AND PROCEDURES .......................................................................................................... 87 

9.1 Documentation ............................................................................................................................. 87

9.2 Procedures ................................................................................................................................... 88

9.3 Recommendations ........................................................................................................................ 88

    10  CONCLUSION ................................................................................................................................................ 89 

    11  APPENDIX ..................................................................................................................................................... 90 


    List of Figures

    Figure 1: VLR Subscriber Capacity Currently in Use..................................................................................... 12

    Figure 2: VLR Subscriber Capacity Utilization  ............................................................................................... 12

    Figure 3: HLR Subscriber Capacity and Utilization  ....................................................................................... 13

    Figure 4: MGW Capacity  ................................................................................................................................. 13

    Figure 5: System Downtime  ........................................................................................................................... 18

    Figure 6: Signaling Performance, SS7 Link availability  ................................................................................ 18

    Figure 7 Authentication ................................................................................................................................... 19

    Figure 8 Ciphering, GSM  ................................................................................................................................. 20

    Figure 9 CP Processor Load  ............................................................................................................................ 20

    Figure 10 Location Update  ............................................................................................................................. 21

    Figure 11 Mobile IN Calls  ................................................................................................................................ 23

    Figure 12 Channel Assignment....................................................................................................................... 23

Figure 13 Short Messages Service (SMS), ORG ............................................................................. 24

    Figure 14 Short Messages Service (SMS), TERM .......................................................................................... 25

    Figure 15 Successful SMS Delivery Terminating SMS  .................................................................................. 26

    Figure 16: Signaling Performance, SS7 Link Congestion Narrowband ....................................................... 27

    Figure 17 Signaling Performance, SS7 Link Congestion High Speed .......................................................... 27

    Figure 18: Trunk-Route Performance, Call statistics .................................................................................... 29

Figure 19: Trunk-Route Utilization, Call statistics ......................................................................... 29

Figure 20 Paging ............................................................................................................................. 30

    Figure 21: Call type measurements ORG  ...................................................................................................... 33

    Figure 22 Call type measurements TE ........................................................................................................... 34

    Figure 23 Inter MSC Handover ....................................................................................................................... 35

    Figure 24: End of Selection Codes-1  ............................................................................................................. 40

    Figure 25 End of Selection Codes-2 ............................................................................................................... 40

    Figure 26 End of Selection Codes-3 ............................................................................................................... 41

    Figure 27 End of Selection Codes-4 ............................................................................................................... 41

    Figure 28 Announcement Data  ...................................................................................................................... 43

    Figure 29 Trunk Route Devices Status  .......................................................................................................... 43

    Figure 30 TRH Overload  ................................................................................................................................. 83

    Figure 31 TRH Failure  ..................................................................................................................................... 84


List of Tables

Table 1 Core Hardware Location per City .......................................................................................... 9

    Table 2 BSS Hardware Location per City ......................................................................................... 10

    Table 3 VLR Subscriber Register Capacity ....................................................................................... 11

    Table 4 Clock Reference in XXXX Network ...................................................................................... 37

    Table 5 HW FAULT MSC ............................................................................................................... 44

    Table 6 HW FAULT BSC ................................................................................................................ 45

    Table 7 Unused Cell ID Definitions ................................................................................................. 46

    Table 8 Software Level Integrity .................................................................................................... 48

    Table 9 SIGTRAN-1 ...................................................................................................................... 49

Table 10 SIGTRAN-2 .................................................................................................................... 49

Table 11 SIGTRAN-3 .................................................................................................................... 50

    Table 12 SIGTRAN-4 .................................................................................................................... 50


    Document Release History

VERSION NO.   RELEASE DATE   PURPOSE

1.0           XXXX           FINAL DRAFT

DISTRIBUTION LIST

NAME          POSITION / DEPARTMENT

APPROVALS

APPROVED BY   SIGNATURE   DATE

XXXX          AIRCOM INTERNATIONAL

XXXX          COUNTRY


    1  INTRODUCTION

Aircom has conducted a Technical Audit of the XXXX network between the dates of XXXX and XXXX. This audit project comprises a combination of data collection, discussions with the XXXX technical teams, desk-based research, detailed interviews, and analysis of documentation and information supplied by XXXX. This NSS audit report has been prepared based on the data provided by the Core Planning and O&M personnel of XXXX.

    2  NETWORK OVERVIEW

The following are the core network entities and sites of XXXX:

    2.1  Core Hardware Location per City

    Cities Location Node Name MSC HLR MGW

    CITY5 XXMSC1 1

    CITY4 XXMSC1 1

    CITY3 XXMSC1 1

    CITY6 XXMSC1 1

CITY2 XXMSC1 1

CITY1 Technical Villa XXMSC3 1

    CITY1 Technical Villa XXMSC4 1

    CITY1 Park plaza XXMSC 1

    CITY1 Park plaza MSCS2 1

    CITY1 HLR1 1

    CITY1 HLR2 1

    CITY1 Technical Villa MGW11 1

    CITY1 Technical Villa MGW12 1

    CITY2 MGW21 1

    CITY1 Park plaza MGW31 1

    Total 9 2 4

    Table 1: Core Hardware Location per City


    2.2  BSS Hardware Location per City

City Location BSC Name BSC Total Nodes in City

    CITY7 BGNRBSC 1 1

    CITY8 GZNRBSC 1 1

    CITY5 HEBSC1 1 1

    CITY4 JABSC1 1 1

    CITY9 JZNRBSC 1 1

    CITY3 KDBSC1 1 1

    CITY10 KHRBSC1 1 1

    CITY6 KUBSC1 1 1

    CITY2 MABSC1 1 2

    CITY2 MABSC2 1

    CITY11 NEBSC1 1 1

    CITY1 Technical Villa KABSC1 1 5

    CITY1 Technical Villa KABSC2 1

    CITY1 Park plaza KABSC3 1

    CITY1 Park plaza KABSC4 1

    CITY1 Technical Villa KABSC5 1

    Total 16 16

    Table 2: BSS Hardware Location per City


    3  CAPACITY ANALYSIS

    This section contains the outcomes of capacity audits.

    3.1   VLR Subscriber Register Capacity

Node Name TOTNSUB Registered VLR Capacity Available Capacity

     XXMSC1 197380 600000 402620

     XXMSC1 283839 600000 316161

     XXMSC1 58407 600000 541593

     XXMSC1 328894 600000 271106

     XXMSC1 437078 1000000 562922

     XXMSC3 120015 1000000 879985

     XXMSC4 536274 1000000 463726

     XXMSC 346476 1000000 653524

    MSCS2 267511 1000000 732489

    Table 3: VLR Subscriber Register Capacity
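To cross-check the utilization figures discussed in the next sections, the usage per node can be recomputed directly from Table 3. A minimal sketch in Python, using the registered-subscriber and capacity values copied from the table above (node names are the anonymized ones used in this report):

```python
# Minimal sketch: recompute VLR utilization per node from Table 3.
# Values (registered subscribers, VLR capacity) are copied from the table above.
vlr = [
    ("XXMSC1", 197380, 600000),
    ("XXMSC1", 283839, 600000),
    ("XXMSC1", 58407, 600000),
    ("XXMSC1", 328894, 600000),
    ("XXMSC1", 437078, 1000000),
    ("XXMSC3", 120015, 1000000),
    ("XXMSC4", 536274, 1000000),
    ("XXMSC",  346476, 1000000),
    ("MSCS2",  267511, 1000000),
]

for node, registered, capacity in vlr:
    utilization = 100.0 * registered / capacity   # percentage of VLR capacity in use
    available = capacity - registered             # headroom, matches the last column
    print(f"{node:8s} {utilization:5.1f}% used, {available} subscribers available")
```

The highest value comes out in the mid-fifties of percent, consistent with the roughly 54% peak quoted in Section 3.3.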


    3.2   VLR Subscriber Capacity Currently in Use

    Figure 1: VLR Subscriber Capacity Currently in Use

    3.3   VLR Subscriber Capacity Utilization

Below is the VLR subscriber capacity utilization. Utilization peaks at around 54%, so no expansion is required at this time. XXXX can add more BSCs/BTSs to improve radio coverage.

    Figure 2: VLR Subscriber Capacity Utilization


    3.4  HLR Subscriber Capacity and Utilization

Below is the HLR subscriber capacity utilization. Utilization has reached 95%; it is highly recommended to perform a reconciliation on a monthly schedule to utilize HSD memory efficiently and avoid its expansion.

    Figure 3: HLR Subscriber Capacity and Utilization

3.5  MGW Capacity and Utilization

License capacity is sufficient for the current traffic load; XXXX can add more equipment without any expansion of the MGW.

    Figure 4: MGW Capacity


    4  PERFORMANCE INDICATORS

    4.1  Introduction

This section defines switching system performance indicators for the MSC and MSC Server. The MSC is the call control handling node in both layered and non-layered architectures. All counter descriptions in this section are for information only; the "Application Information" documents shall be used for the latest and more detailed counter descriptions.

    4.2  Concepts

Performance indicators defined in this section focus on reliability and on how a service is executed in the MSC/VLR Server.

    Figure 3-1: Principle definition of ISP 

The MSC/VLR Server is the call control handling node in the Ericsson Core Network. It contains counters which are stepped, or not stepped, based on information received from other core network elements/nodes. Some counters even reflect end-user and radio network behavior. See Figure 3-2.


    Figure 3-2: MSC/VLR Server in Core Network

    Monitoring and analyzing Performance Indicators provide information for:

    • Benchmarking

    • System Improvements

    • Performance monitoring

    • Node performance fine tuning

    Key Performance Indicator

    Key Performance Indicators (KPI) are defined on network/system level and reflect the end-to-end performance. A Key Performance Indicator consists of one or more Performance indicators (PI); see also figure 3-3.


    Figure 3-3

     Availability

Availability is defined as the ability of an item to be in a state to perform a required function at a given point of time, or at any instant of time within a given time interval, assuming that the external resources, if required, are provided.

Serveability

The ability of a service to be obtained - within specified tolerances and other given conditions - when requested by the user, and to continue to be provided without excessive impairment for the requested duration. Serveability performance is subdivided into service accessibility performance, service retainability performance and service integrity performance.


     Accessibility

The ability of a service to be obtained, within specified tolerances and other given conditions, when requested by the end-user.

Retainability

Retainability reflects the ability of the user to keep a service, once it has been accessed, under given conditions for a requested period of time.

    Integrity

Integrity reflects the ability of a user to receive the requested service at the desired quality. No Integrity PIs are defined for the MSC.


    4.3   Availability

    4.3.1  System Downtime

    Figure 5: System Downtime

Accumulated System Downtime (SDT) for the last 12 months, in seconds. The figures show no major downtime in the network.
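For reference, an accumulated downtime figure can be converted into an availability percentage over the same window. A minimal sketch, assuming SDT is reported in seconds over a 12-month period and using a hypothetical downtime value:

```python
# Minimal sketch: convert accumulated System Downtime (SDT, in seconds)
# over a 12-month window into an availability percentage.
def availability_percent(sdt_seconds: float, window_days: int = 365) -> float:
    window_seconds = window_days * 24 * 3600
    return 100.0 * (1.0 - sdt_seconds / window_seconds)

# Example: 300 s of accumulated downtime in a year (hypothetical value)
print(f"{availability_percent(300):.5f}%")   # -> 99.99905%
```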

    4.3.2  Signaling Performance, SS7 Link availability, ETSI

    Figure 6: Signaling Performance, SS7 Link availability


Link unavailability is due to transmission fluctuation; XXXX should resolve this issue to maintain a healthy KPI.

    4.4   Accessibility

    4.4.1   Authentication

The average successful Authentication results for the complete XXXX network are shown in the figures below.

    Figure 7 Authentication

    Recommendations:

The Authentication Success Rate indicates normal conditions across the whole network; the values are currently around 97%, which is on par with the world average according to the previously mentioned benchmark and above the minimum recommended value of 95%.
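The same threshold check applies to each accessibility KPI in this chapter. A minimal sketch that compares KPI values against the recommended minimums quoted in this report; the Authentication and Ciphering values are the network averages quoted here, while the remaining observed values are illustrative placeholders:

```python
# Minimal sketch: flag KPIs that fall below their recommended minimum values.
# Thresholds are taken from this report; observed values are illustrative
# placeholders except where the report quotes a network-wide average.
recommended_min = {
    "Authentication Success Rate": 95.0,
    "Ciphering Success Rate": 95.0,
    "Channel Assignment Success Rate": 98.0,
    "SMS ORG Success Rate": 80.0,
    "SMS TERM Success Rate": 73.0,
    "Terminating SMS Delivery": 95.0,
}

observed = {
    "Authentication Success Rate": 97.0,      # quoted network average
    "Ciphering Success Rate": 99.0,           # quoted network average
    "Channel Assignment Success Rate": 98.5,  # illustrative
    "SMS ORG Success Rate": 75.0,             # illustrative (reported as low)
    "SMS TERM Success Rate": 70.0,            # illustrative (reported as low)
    "Terminating SMS Delivery": 96.0,         # illustrative
}

for kpi, minimum in recommended_min.items():
    value = observed[kpi]
    status = "OK" if value >= minimum else "BELOW THRESHOLD"
    print(f"{kpi:32s} {value:5.1f}% (min {minimum:4.1f}%)  {status}")
```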

    4.4.2  Ciphering, GSM

    The average Ciphering results are shown in the table and figures below for the complete XXXX network:


    Figure 8 Ciphering, GSM

    Recommendations:

The Ciphering Success Rate indicates normal conditions across the whole network; the values are currently around 99%, which is on par with the world average according to the previously mentioned benchmark and above the minimum recommended value of 95%.

    4.4.3  CP Processor Load

    Figure 9 CP Processor Load


    Recommendations:

The central processor load in all the nodes was considered normal, and the peak load in the busy hour did not reach the maximum recommended limit (75%).

    4.4.4  Location Update

    Figure 10 Location Update

    Recommendations:

The Location Update Success Rate indicates normal conditions in XXMSC3 and XXMSC; the values are currently around 97%, which is on par with the world average according to the previously mentioned benchmark. On the other hand, the Location Update Success Rates in the other MSCs show slightly lower values.

The gathered performance measurements for consecutive days show a significant drop recurring on a daily basis. There are typically several major reasons for Location Update failure: Unknown IMSI in HLR, Timeout, MAP fallback, Network Failure, Congestion, etc. Further investigation is needed to determine the actual reasons.
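When investigating such a drop, a first step is to rank the failure causes by their share of total failures. A minimal sketch using hypothetical per-cause counts for the causes listed above (the mapping to the actual node counters is not shown):

```python
# Minimal sketch: rank Location Update failure causes by share of total failures.
# The counts below are hypothetical; in practice they would come from the
# per-cause failure counters collected for the measurement period.
failures = {
    "Unknown IMSI in HLR": 5200,
    "Timeout": 3100,
    "MAP fallback": 900,
    "Network failure": 600,
    "Congestion": 250,
}

total = sum(failures.values())
for cause, count in sorted(failures.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{cause:22s} {count:6d}  {100.0 * count / total:5.1f}%")
```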

    The following location update signaling flows show how the above mentioned counters are being increased

    accordingly:


    4.4.5  Mobile IN Calls

    Figure 11 Mobile IN Calls

All the MSCs in the XXXX network showed a value of 100% for successful IN calls, so no recommendation is needed for this KPI.

    4.4.6  Channel Assignment

    Figure 12 Channel Assignment


    Recommended KPI Minimum Value: 98% 

The results show normal behavior regarding channel assignment, and no additional recommendations are needed.

    4.4.7  Short Messages Service (SMS), ORG

Figure 13 Short Messages Service (SMS), ORG

    Recommended KPI Minimum Value: 80%

    Recommendations:

According to the above table, we can clearly see that the SMS originating success rate is low for the complete period on all the MSC-Ss. Some known reasons for SMS sending failures are:

      Subscribers being barred from sending SMS due to insufficient credit

      Invalid message center numbers

      Invalid B Party numbers


    4.4.8  Short Messages Service (SMS), TERM

    Figure 14 Short Messages Service (SMS), TERM

    Recommended KPI Minimum Value: 73% 

    Recommendations:

From the above figures, we can see that the performance measurements fall below the required level. Most of the MSC-Ss show an average in line with the world and European averages. Some known causes for low SMS receiving rates are:

    •   Absent Subscriber: The receiving user is either powered off or out of the service area.

    •  Memory Capacity Exceeded: The MS memory of the receiving user is full.

    •  Subscriber Busy for MT-SMS: The allocated MS is receiving another SMS.

•  System Failure: Mostly related to the radio network and the MS, such as SDCCH assignment failure, call drop when receiving SMS, etc.


    4.4.9  Successful SMS Delivery Terminating SMS

    Figure 15 Successful SMS Delivery Terminating SMS

    Recommended KPI Minimum Value: 95% 

The average results are above the recommended KPI minimum value, so no additional recommendation is needed.


    4.4.10  Signaling Performance, SS7 Link Congestion

    Figure 16: Signaling Performance, SS7 Link Congestion Narrowband

    Figure 17 Signaling Performance, SS7 Link Congestion High Speed

    Recommendations:


•  Dimensioning rules allow 30% link load in a non-failure situation and 60% load in a failure situation (see the sketch below).

•  It is very important that load limits are maintained within this range: once an SS7 link reaches a certain load level, the message success rate decreases dramatically.
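For a narrowband link, the load can be estimated from the transmitted octet count per measurement period and compared against the 30%/60% limits. A minimal sketch, assuming a 64 kbit/s signaling link and a 15-minute granularity period with a hypothetical octet count:

```python
# Minimal sketch: estimate narrowband SS7 link load from transmitted octets
# and compare it against the 30% (normal) / 60% (failure) dimensioning limits.
def ss7_link_load(octets_sent: int, period_seconds: int, link_bps: int = 64000) -> float:
    """Return link load as a fraction of link capacity (1.0 = 100%)."""
    return (octets_sent * 8) / (link_bps * period_seconds)

# Example: 6,480,000 octets sent in a 15-minute granularity period (hypothetical)
load = ss7_link_load(6_480_000, 900)
print(f"Link load: {load:.0%}")                       # -> 90%
print("Within normal limit (30%):", load <= 0.30)
print("Within failure limit (60%):", load <= 0.60)
```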


    4.4.11  Trunk route Performance, Call statistics

    Figure 18: Trunk-Route Performance, Call statistics

    Figure 19: Trunk-Route Utilization, Call statistics

The occurrence of EOS codes was observed in XXMSC1, XXMSC3, XXMSC, XXMSC, XXMSC1 and XXMSC1. The reason for the errors is improper CIC assignment, including cross-connections of E1s; as a result, subscribers received wrong (ambiguous) calls and cross-talk. To rectify the issue, it is recommended to check all interconnect routes individually with the TCTDI command to make sure all CICs are integrated properly.


    4.4.12  Loss Route Performance

Remove unnecessary configuration to keep a clean alarm list. Blocked devices on routes are responsible for low ASR, route congestion and call rejection. See the attached file for more detail.

    4.4.13  Paging

    Figure 20 Paging

Succ_GSM_First_Page Recommended KPI Minimum Value: 84%

Succ_GSM_Paging Recommended KPI Minimum Value: 88%

Recommendations: (Succ_GSM_First_Page)

The XXMSC3, XXMSC, XXMSC and XXMSC1 paging results show normal behavior, in accordance with the global values.

On the other hand, for XXMSC1, XXMSC1 and XXMSC1, located outside of CITY1, the values could be improved somewhat with improvements to radio coverage; e.g., an attached mobile out of coverage will not be able to receive or respond to a page.

Check the parameter settings of the network; this can often improve paging performance, especially if coverage is not the main problem.

The time between periodic registrations, the Implicit IMSI Detach function, the number of LAs and the size of the LAs are the key issues. TMSI should be used at least for the first page.


    Recommendations: (Succ_GSM_Paging) 

The XXMSC1, XXMSC1, XXMSC1 and XXMSC1 MSCs show slightly lower average results than the global benchmark (around 88%) mentioned above.

As for the other MSCs, there seem to be problems, as the number of repeated page attempts to a location area over the A-interface is high. The following causes might explain the low paging success rate:

LA dimensioning should be carried out in order to have a proper number of LAs per MSC. If an LA is under-dimensioned, it will affect the paging success rate; on the other hand, if an LA is over-dimensioned, it will increase the LU load and affect the LU success rate.

A low paging success rate could also be explained by coverage problems, by the Implicit IMSI Detach function not being used, or by T3212 being set too high.

Paging performance mainly depends on radio performance, especially radio coverage, radio capacity, cell planning and frequency planning to reduce interference as much as possible.
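Both paging KPIs follow directly from the page-attempt counters. A minimal sketch of how the first-page and overall success rates relate, using hypothetical counter values rather than the actual STS counters:

```python
# Minimal sketch: first-page and overall paging success rates from page counters.
# Counter values are hypothetical; real values come from the MSC paging statistics.
first_page_attempts = 1_000_000      # first page attempts in the period
first_page_successes = 850_000       # answered on the first page
repeat_page_successes = 60_000       # answered on a repeated (second/global) page

succ_gsm_first_page = 100.0 * first_page_successes / first_page_attempts
succ_gsm_paging = 100.0 * (first_page_successes + repeat_page_successes) / first_page_attempts

print(f"Succ_GSM_First_Page: {succ_gsm_first_page:.1f}%  (recommended minimum 84%)")
print(f"Succ_GSM_Paging:     {succ_gsm_paging:.1f}%  (recommended minimum 88%)")
```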

    Figure 1: Paging of a MS


    Figure 2: Paging Strategies via A-interface

    Figure 3: Paging Strategies


Other strategies than those recommended affect the paging load as follows:

  No second page: No second page reduces the paging load in both the BTS and the BSC. The disadvantage is the risk of more unsuccessful MS paging.

  Global second page: Compared to a local second page, a global second page increases the paging load. The advantage is that MSs that, for some reason, have the wrong LA status in the VLR stand a better chance of being successfully paged.

  TMSI for second pages: If the second page is global, IMSI must be used to identify the MS. If the second page is local, either IMSI or TMSI can be used to identify the MS. Using TMSI increases the paging capacity in the BTS. The drawback is that some pages may be unsuccessful if an MS has the wrong TMSI in the VLR, for example immediately after having crossed an LA border.

    4.4.14  MTRAFTYPE, Call type measurements

This performance indicator monitors the number of successful calls compared to the total number of calls, for originating and terminating calls. The counters are defined per main traffic type (ORG, TE, OEX, IEX). For this PI, only the traffic types ORG and TE have been selected.
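A minimal sketch of the ratio this PI reports per traffic type, using hypothetical attempt and success counts (the MTRAFTYPE counter names themselves are not reproduced here):

```python
# Minimal sketch: per-traffic-type call success ratio as monitored by MTRAFTYPE.
# The attempt/success counts below are hypothetical examples.
traffic = {
    "ORG": {"attempts": 2_400_000, "successful": 2_050_000},
    "TE":  {"attempts": 1_900_000, "successful": 1_500_000},
}

for traffic_type, c in traffic.items():
    success_ratio = 100.0 * c["successful"] / c["attempts"]
    print(f"{traffic_type}-SUCC%: {success_ratio:.1f}%")
```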

    Figure 21: Call type measurements ORG

     Analysis Observation & Recommendation

  The major failure in the ORG setup is due to subscriber missed calls or early disconnects, and wrong dialing.

      In XXMSC1 area the wrong dialing ratio is high. Call testing is required to identify the missing routes.


    Figure 22 Call type measurements TE

In this audit it is observed that, in the areas where MT-SUCC% is low, the major cause of degradation is a low paging success rate. The relationship between MT-SUCC% and MT subscriber-unreachable is also presented to give a picture of the radio coverage impact on MT calls.


    4.5  Retain-ability

    4.5.1  Inter MSC Handover /Intra-MSC Handover

This performance indicator reflects the successful incoming and outgoing inter-MSC handover attempts, including subsequent handovers. Events are counted for each neighboring MSC.

It is observed that in many directions the inter-MSS handover (in and out) success rates are low. The external LAC definitions need to be verified with the help of the radio team. In a few cases the intra-MSS handover success rate is also low; this should be checked by the BSS team, because the MSS does not play any role in the intra-MSS handover procedure.

    Figure 23 Inter MSC Handover

Recommendations: The network LAC diagram should be maintained with the help of the radio team. The core network personnel should define the external or adjacent LACs according to the radio geographical boundaries designed by the Radio department.


5  FINDINGS

5.1  Roaming

The ROAMWARE version XXXX is using is only capable of retaining users, i.e., it will only hold users who are already on the XXXX network, or after they have registered for the first time due to better radio coverage. This does not help to attract new incoming roamer registrations in the XXXX network.

In order to capture the maximum number of new incoming roamers with priority to XXXX, a newer version of ROAMWARE should be used in which the capturing feature is available. (See the attached file for more detail.)

    5.2  Network Time Synchronization

    5.2.1  Overview

Network synchronization deals with the distribution of common time and frequency references to all the nodes in a network, in order to align the time and frequency scales of all the clocks employed in the network.

    Time synchronization in particular ensures that all nodes share the same time reference, which is important for

    charging and O&M functions. For example, it may be crucial to know exactly when (in terms of

    day/hour/minute/second/millisecond) a certain event has occurred, so that events from different nodes can be

correlated. Event correlation is of fundamental importance not only for troubleshooting and charging but also for services such as the XXXX Revenue Assurance Solution.

    Time synchronization is achieved through time servers, which provide Time-of-Day (ToD) information and deliver

    it over an IP network to the clients, i.e., the network nodes, by means of the Network Time Protocol (NTP) or its

simplified version, the Simple Network Time Protocol (SNTP). (More details are available in the attached file below.)
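As an illustration of the client side, a node's clock offset against a time server can be checked over NTP. A minimal sketch using the third-party ntplib package; the server address is a placeholder, not the XXXX time server:

```python
# Minimal sketch: query an NTP server and report the local clock offset.
# Requires the third-party "ntplib" package; the server below is a placeholder.
import ntplib

client = ntplib.NTPClient()
response = client.request("ntp.example.com", version=3)

print(f"Stratum: {response.stratum}")
print(f"Clock offset vs. server: {response.offset * 1000:.1f} ms")

# Event correlation across nodes is only meaningful if all offsets stay small,
# e.g. within a few milliseconds of the common reference.
if abs(response.offset) > 0.005:
    print("Warning: offset exceeds 5 ms; timestamps may not correlate reliably.")
```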

    Clock Reference in XXXX Network

    NODE REFERENCE1 REFERENCE2 REFERENCE3 STATE URC1 (NTP)

BGNRBSC 0ETM2,MS-0 0ETM2,MS-1 EX,MBL NOT CONNECTED

GZNRBSC 0ETM2,MS-0 0ETM2,MS-1 EX,SB NOT CONNECTED

HEBSC1 0ETM2,MS-0 0ETM2,MS-1 EX,SB NOT CONNECTED

JABSC1 0ETM2,MS-0 0ETM2,MS-1 ABL,EX NOT CONNECTED

JZNRBSC 0ETM2,MS-0 0ETM2,MS-1 EX,SB NOT CONNECTED

KABSC1 0ETM2,MS-0 9ETM2,MS-0 EX,SB NOT CONNECTED

KABSC2 0ETM2,MS-0 3ETM2,MS-0 EX,SB NOT CONNECTED

KABSC3 0ETM2,MS-0 4ETM2,MS-0 EX,SB NOT CONNECTED

KABSC4 0ETM2,MS-0 4ETM2,MS-0 EX,SB NOT CONNECTED


KABSC5 0ETM2,MS-0 1ETM2,MS-0 EX,SB NOT CONNECTED

NEBSC1 0ETM2,MS-0 0ETM2,MS-1 EX,UPD NOT CONNECTED

KHRBSC1 0ETM2,MS-0 0ETM2,MS-1 EX,SB NOT CONNECTED

KUBSC1 0ETM2,MS-0 0ETM2,MS-1 EX,ABL NOT CONNECTED

MABSC1 1ETM2,MS-0 3ETM2,MS-0 EX,SB NOT CONNECTED

MABSC2 0ETM2,MS-0 4ETM2,MS-0 EX,SB NOT CONNECTED

KDBSC1 1ETM2,MS-0 1ETM2,MS-1 EX,ABL NOT CONNECTED

HLR1 0E1551,MS-0 0E1551,MS-1 SB,EX NOT CONNECTED

HLR2 0E1551,MS-0 0E1551,MS-1 SB,EX NOT CONNECTED

XXMSC3 NOT CONNECTED NOT CONNECTED NOT CONNECTED

XXMSC4 NOT CONNECTED NOT CONNECTED NOT CONNECTED

XXMSC NOT CONNECTED NOT CONNECTED NOT CONNECTED

XXMSC NOT CONNECTED NOT CONNECTED NOT CONNECTED

XXMSC1 NOT CONNECTED NOT CONNECTED NOT CONNECTED

XXMSC1 1E1551,MS-0 1E1551,MS-1 RCM-0 MBL,MBL,EX NOT CONNECTED

XXMSC1 0E1551,MS-0 0E1551,MS-1 RCM-0 MBL,MBL,EX NOT CONNECTED

XXMSC1 1E1551,MS-0 1E1551,MS-1 RCM-0 EX,SB,SB NOT CONNECTED

XXMSC1 0E1551,MS-0 0E1551,MS-1 RCM-0 MBL,MBL,EX NOT CONNECTED

    Table 4: Clock Reference in XXXX Network


    5.3  Size Alteration Event Utilization

The recommended SAE utilization is between 40% and 50% during normal traffic behavior. If utilization is persistently high for more than a week, an individual SAE increment is advisable; use the Ericsson formula in ALEX for the increment (a flagging sketch follows the tables below).

    SDATE NODE BLOCK SAE SAE_Utilization

    4/7/11 7:00 PM HLR HMAPTC 500 60%

    4/7/11 7:00 PM HLR HSUDAP2 500 67%

    4/7/11 7:00 PM HLR HUEXAP2 500 67%

    4/8/11 7:00 PM HLR HMAPTC 500 50%

    4/8/11 7:00 PM HLR HSUDAP2 500 56%

    4/8/11 7:00 PM HLR HUEXAP2 500 56%

    4/9/11 7:00 PM HLR HMAPTC 500 51%

4/9/11 7:00 PM HLR HSUDAP2 500 56%

4/9/11 7:00 PM HLR HUEXAP2 500 56%

    4/10/11 7:00 PM HLR HSUDAP2 500 53%

    4/10/11 7:00 PM HLR HUEXAP2 500 53%

    4/11/11 7:00 PM HLR HMAPTC 500 52%

    4/11/11 7:00 PM HLR HSD 786 86%

    4/11/11 7:00 PM HLR HSUDAP2 500 59%

    4/11/11 7:00 PM HLR HUEXAP2 500 59%

    4/12/11 7:00 PM HLR HMAPTC 500 54%

    4/12/11 7:00 PM HLR HSD 786 96%

    4/12/11 7:00 PM HLR HSUDAP2 500 62%

    4/12/11 7:00 PM HLR HUEXAP2 500 62%


    SDATE NODE BLOCK SAE SAE_Utilization

    4/7/11 7:00 PM JAMSC1 COMAIN 1130   61%

    4/7/11 7:00 PM JAMSC1 MMM 1132   60%

    4/7/11 7:00 PM JAMSC1 MRRM 1139   60%

    4/7/11 7:00 PM JAMSC1 MRRMH 1053   58%

    4/7/11 7:00 PM JAMSC1 MSCCO 500   60%

    4/7/11 7:00 PM JAMSC1 SHMM 604   59%

    4/8/11 7:00 PM JAMSC1 COMAIN 1130   60%

    4/8/11 7:00 PM JAMSC1 MMM 1132   58%

    4/8/11 7:00 PM JAMSC1 MRRM 1139   58%

    4/8/11 7:00 PM JAMSC1 MSCCO 500   58%

    4/9/11 7:00 PM JAMSC1 COMAIN 1130   60%

    4/9/11 7:00 PM JAMSC1 MMM 1132   59%

    4/9/11 7:00 PM JAMSC1 MRRM 1139  59%

    4/9/11 7:00 PM JAMSC1 MSCCO 500   59%

    4/9/11 7:00 PM JAMSC1 SHMM 604   58%

    4/12/11 7:00 PM JAMSC1 COMAIN 1130   59%

    4/12/11 7:00 PM JAMSC1 MMM 1132   58%

    4/12/11 7:00 PM JAMSC1 MRRM 1139   58%

    4/12/11 7:00 PM JAMSC1 MSCCO 500   58%

    SDATE NODE BLOCK SAE SAE_Utilization

    4/7/11 7:00 PM KAMSC1 SHEC 604   72%

    4/8/11 7:00 PM KAMSC1 SHEC 604   69%

    4/9/11 7:00 PM KAMSC1 SHEC 604   70%

4/11/11 7:00 PM KAMSC1 SHEC 604   68%

4/12/11 7:00 PM KAMSC1 SHEC 604   72%

    SDATE NODE BLOCK SAE SAE_Utilization

    4/11/11 7:00 PM KAMSC2 COHW 500   54%

    SDATE NODE BLOCK SAE SAE_Utilization

    4/7/11 7:00 PM KUMSC1 UPPC4S 500   64%

    4/8/11 7:00 PM KUMSC1 UPPC4S 500   60%

    4/9/11 7:00 PM KUMSC1 UPPC4S 500   60%

    4/10/11 7:00 PM KUMSC1 UPPC4S 500  60%

    4/11/11 7:00 PM KUMSC1 UPPC4S 500   61%

    4/12/11 7:00 PM KUMSC1 UPPC4S 500   64%

SDATE NODE BLOCK SAE SAE_Utilization

    4/9/11 7:00 PM MAMSC1 MSMMCAH 287   51%

    4/10/11 7:00 PM MAMSC1 MSMMCAH 287   50%

    4/11/11 7:00 PM MAMSC1 MSMMCAH 287   51%
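A simple way to apply the one-week rule from Section 5.3 is to flag any node/block pair whose measured utilization stays above 50% for every sample in the window. A minimal sketch over records shaped like the rows above, with a few HLR samples copied from the tables as an example:

```python
# Minimal sketch: flag SAE blocks whose utilization stays above 50% for all
# samples in the observation window (the one-week persistence rule).
from collections import defaultdict

# (date, node, block, utilization %) -- a few rows copied from the tables above
samples = [
    ("4/7/11",  "HLR", "HSUDAP2", 67),
    ("4/8/11",  "HLR", "HSUDAP2", 56),
    ("4/9/11",  "HLR", "HSUDAP2", 56),
    ("4/10/11", "HLR", "HSUDAP2", 53),
    ("4/11/11", "HLR", "HSUDAP2", 59),
    ("4/12/11", "HLR", "HSUDAP2", 62),
    ("4/11/11", "HLR", "HSD",     86),
    ("4/12/11", "HLR", "HSD",     96),
]

by_block = defaultdict(list)
for _date, node, block, util in samples:
    by_block[(node, block)].append(util)

for (node, block), utils in by_block.items():
    if all(u > 50 for u in utils):
        print(f"{node}/{block}: persistently above 50% "
              f"({min(utils)}-{max(utils)}%), consider SAE increment")
```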


    5.4  End of Selection Codes

    Figure 24: End of Selection Codes-1

Recommendation: Define the proper Selection Type (ST value) on both sides of the trunk route.

    Figure 25 End of Selection Codes-2

SDATE NODE BLOCK SAE SAE_Utilization

    4/7/11 7:00 PM MSCS1 UPMHS4S 647   56%

4/8/11 7:00 PM MSCS1 UPMHS4S 647   56%

4/9/11 7:00 PM MSCS1 UPMHS4S 647   56%

    4/10/11 7:00 PM MSCS1 UPMHS4S 647   56%

    4/11/11 7:00 PM MSCS1 UPMHS4S 647   56%

    4/12/11 7:00 PM MSCS1 UPMHS4S 647   56%


Observation & Recommendation: Analysis of the alternate routing cases in XXMSC1, XXMSC2 and XXMSC3 shows that some branching is not defined properly for overflow traffic.

The occurrence of EOS codes was observed in XXMSC1, XXMSC1, XXMSC1 and XXMSC1. The reason for the errors is improper CIC assignment, including cross-connections of E1s; as a result, subscribers received wrong (ambiguous) calls and cross-talk. To rectify the issue, it is recommended to check all interconnect routes individually with the TCTDI command to make sure all CICs are integrated properly.

    Figure 26 End of Selection Codes-3

Recommendation: Check for link failure/congestion between the SSF and the SCF.

    Figure 27 End of Selection Codes-4

Recommendation: Set the BTDM/T3212 settings accordingly for implicit detach marking of mobile subscribers. Check radio coverage and link fluctuation.


    5.5  Echo Canceller Setting

Analysis of the routing data leads to the conclusion that the switching equipment congestion (resources not utilized optimally) is due to incorrect Echo Canceller settings. These settings also contribute to increased processor load and excessive use of ECs in other switches, which degrades the performance of the other connected switches.

With the recommended settings below, users will observe improved voice call quality with no delay.

       ALL PLMN Routing ESS=1

       ALL PLMN Routing ESR=1

       ALL PSTN Routing ESS=1

       ALL PSTN Routing ESR=1

    5.6  B Number Table

In the analysis of the B-number tables of all MSCs, all parameters were found to be correctly defined, with the exception of XXMSC3, where there should be no charging case on the announcement route.

The values should be set as below:

     ANBSI:B=99-8,RC=94,L=4;

     ANBSI:B=99-9,RC=95,L=4;

    5.7   Announcement Data

In the analysis, highly congested announcement routes and blocked devices were found in XXMSC1, XXMSC1 and XXMSC1.

In order to reduce congestion, all blocked devices should be fixed and more hardware should be added. This will increase the QoS for the subscribers.


    Figure 28 Announcement Data

Recommendation: Replace the faulty HW or move all announcement routes to the MGW.

    5.8  Trunk Route Devices Status

    Figure 29 Trunk Route Devices Status

Recommendation: There are a lot of devices blocked on trunk routes due to lack of O&M. Preventive maintenance and proper integration are highly recommended; blocked devices on routes are responsible for low ASR, route congestion and call rejection.


    5.9  HW FAULT Drill-down by MSC

    Table 5: HW FAULT MSC

Recommendation: The RPs highlighted in red have high error counts and therefore need to be replaced with higher versions. For this, a CSR to Ericsson should be raised as a priority.


    5.10  HW FAULT Drill Down BSC

    Table 6: HW FAULT BSC

Recommendation: The RPs highlighted in red have high error counts and therefore need to be replaced with higher versions. For this, a CSR to Ericsson should be raised as a priority.

    5.11  Unused Cell ID Definitions

In this section, the comparison of MSC- and BSS-defined cells is presented. The main objective of this exercise was to identify the extra cells defined on the MSC and to remove the junk data, making space available in the cell tables and keeping them organized. Below is the list of cells identified as extra on the MSC when compared with the BSS data.

Notice:

Please do not delete any cell from the MSC side prior to final confirmation from the BSS team. The BSS team should double-check the traffic on these cells. The cell ID deletion should take place with the cooperation of the BSS and NSS teams.


    5.12  System Logs

The system log defined in all MSCs is of fixed size, which eventually results in loss of data after it reaches its maximum limit, because new incoming data is overwritten on top of the previous data. It is therefore recommended to define a transfer queue for direct data transfer to the OSS in order to avoid data loss.

    5.13  Signalling Error Reports Fixing

The signaling error reports from the nodes were analyzed, and it was concluded that the data coming from the nodes is missing some of the information needed to identify and rectify the problems that occur. The missing-information issue has been resolved to allow accurate fault fixing in the future. (See the attached file.)

    5.14   APG Drive Full

Analysis of the alarms on the APG shows that on some nodes the APG drive is almost full. Once it is completely filled, the APG will go down and no statistical data will be forwarded, so no performance reports can be generated for the management of the network. It is therefore recommended to maintain the APG drive properly.

MSC NODE   CELL in MSC not in BSC   BSC NODE

XXMSC1 HRT084A HEBSC1

    HRT084B HEBSC1

    HRT084C HEBSC1

    HRT085A HEBSC1

    HRT085B HEBSC1

    HRT085C HEBSC1

     XXMSC1 KNR009A BSC1JA

    KNR009B BSC1JA

    KNR009C BSC1JA

     XXMSC1 SMN023C MABSC1

XXMSC KBL211X KABSC3

KBL211Y KABSC3

    KBL261C KABSC3

     XXMSC KBL261A KABSC3

    KBL261B KABSC3

KBL261C KABSC3

Table 7: Unused Cell ID Definitions


    5.15  Unused Route

A lot of unused route data is defined in the BSCs as well as in the MSCs. This results in high CP load and increased call setup time. To avoid this situation, this data should be removed and proper size alteration should be done for enhanced CP performance.

    5.16  Naming Standard

There is no standard naming convention followed in the core network. A single network element has different names in different domains; e.g., node X is named "A" in the Exchange Header while it is named "B" in the Signaling Point ID (SPID). These inconsistencies make the handling/troubleshooting process complex and difficult. Standard naming must therefore be followed to improve emergency handling and O&M.

    5.17  Software Level Integrity


    Table 8: Software Level Integrity

After investigating the alarms (software faults) on the nodes, it is concluded that the software running on all the MSCs is defective. In order to avoid events such as a system restart (i.e., an outage in the network), an immediate CSR should be raised to fix the issue.

    MTNA Software Level Integrity (BSS)

    MTN Core Node APZ Type IO Type System IPA Missing Corrections

    BGNRBSC   21230/33   APG40   08B   10

    GZNRBSC   21230/33   APG40   08B   10

    HEBSC1

    JABSC1   21250   APG43   08B   10

    JZNRBSC   21230/33   APG40   08B   10

    KABSC1   21230/33   APG40   08B   10

    KABSC2   21230/33   APG40   08B   10

    KABSC3   21230/33   APG40   08B   10

    KAMSC4   21250   APG43   08B   10

    KABSC5   21250   APG43   08B   10

    KDBSC2   21250   APG43   08B   10

    KHRBSC1   21250   APG43   08B   10

    KUBSC1   21230/33   APG40   08B   10

    MABSC1   21230/33   APG40   08B   10

    MABSC2   21250   APG43   08B   10

    NEBSC1   21230/33   APG40   08B   10

    Software level Discrepancies


    6  SIGTRAN

    6.1  MSC SIGTRAN SCTP “Out of Blue” packets

"Out of Blue" packets are received with correct format and right checksum, but the receiver is not able to identify the association to which the packet belongs. Those packets will be discarded.

Node Name   Out of Blue Packets
XXMMS3      4
XXMSC4      0
XXMSC       0
XXMSC       0
XXMSC1      11

Table 9: SIGTRAN-1

    6.2  MSC SIGTRAN Association Unavailability

Node Name   Asso. Unavail. (number of times)   Asso. Unavail. (sec.)   Object
XXMSC3      12                                 55                      M3_NI2_XXMSC
XXMSC4      0                                  0
XXMSC       0                                  0
XXMSC       0                                  0
XXMSC1      16                                 67                      M3_NI2_XXMGW11

Table 10: SIGTRAN-2

Recommendation: As shown in the table, XXMSC3 and XXMSC1 recorded M3UA interruptions during the period 110311 to 130311. Check the cause of these interruptions on the MPBN side.


    6.3  MSC SIGTRAN Destination Unavailability

Node Name   Dest. Unreach. (number of times)   Dest. Unreach. (sec.)
XXMSC3      3033                               121639
XXMSC4      375                                14007
XXMSC       162                                14575
XXMSC       160                                11606

Table 11: SIGTRAN-3

    Recommendation: Check the Transmission Availability

    6.4  MSC SIGTRAN M3UA routing performance

Node Name   Routing Errors
XXMSC3      4994181
XXMSC4      135894
XXMSC       127
XXMSC       0
XXMSC1      965390

Table 12: SIGTRAN-4

Recommendation: Check the event records carefully; a timeout is occurring somewhere in the network.
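Tables 9-12 can be consolidated into a single per-node health check. The Python sketch below does this on a small extract; the counter labels are descriptive names chosen here, and the values are placeholders loosely based on the tables above rather than a complete export.

    # Minimal sketch: consolidate the per-node SIGTRAN counters behind Tables 9-12
    # and flag anything non-zero for follow-up. The figures are placeholders; in
    # practice they would be read from the counter exports.

    sigtran_counters = {
        "XXMSC3": {"out_of_blue_packets": 4, "assoc_unavail_sec": 55,
                   "dest_unreach_sec": 121639, "m3ua_routing_errors": 4994181},
        "XXMSC4": {"out_of_blue_packets": 0, "assoc_unavail_sec": 0,
                   "dest_unreach_sec": 14007, "m3ua_routing_errors": 135894},
    }

    for node, counters in sigtran_counters.items():
        issues = [name for name, value in counters.items() if value > 0]
        if issues:
            print(f"{node}: non-zero counters -> {', '.join(issues)}")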


    7  M-MGW KPI


    7.1  Scope

This study covers the request of XXXX to list the KPIs needed on the M-MGW. It can be used for:

• the list of needed measurements/KPIs
• the possible reasons for an unhealthy value/measurement.

    7.2  Introduction

XXXX has M-MGW R5 on an ATM backbone, and the KPIs suggested in this study relate to the ATM network and M-MGW R5.

The KPIs normally used in the MGW are:

• Accessibility (ratio of successful connection establishments)
• Retainability (ratio of end-user initiated connection releases)
• Integrity (QoS, the end-user perception of the network)

In addition to the KPIs mentioned above, it is also important to know the traffic/load.


Overview of KPI counter stepping phases

[Figure: counter stepping in the M-MGw per connection attempt. When an AddReq is received from the MSC, the counter 'termReq' is stepped and the licensed capacity is checked; if this check fails, the counters 'rejected due to capacity' and 'termRej' are stepped and AddRsp (NOK) is returned. Internal resources are then reserved; on failure 'termRej' is stepped and AddRsp (NOK) is returned, otherwise AddRsp (OK) is sent. Bearer establishment follows; on failure the 'external accessibility failure' counter is stepped and resources are released. Once the connection is through-connected, the QoS-related counters are stepped. At termination, a failure (e.g. a program or board restart) steps the 'unmature release' counter, while a normal release (SubReq received from the MSC, answered with SubRsp OK) steps the 'normal release' counter; resources are then released. Accessibility (internal) is the ratio of successful termination reservations to termination requests, accessibility (external) covers bearer establishment, and retainability is the ratio of maturely released connections to all connections; integrity covers BER/BLER and similar measures.]

Example KPI calculation (each successive measurement is based on the number of connections that reached that particular phase):
• 1000 connection attempts (AddReqs), of which 999 are successful (AddRsp OK) => accessibility (internal) = 99.9%
• 997 bearer establishments OK => accessibility (external) = 997/999 = 99.8%
• 996 normal releases (SubRsp OK) => retainability = 996/997 = 99.9%
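The example figures can be recomputed with a few lines of arithmetic. The Python sketch below reuses only the counts quoted in the example above.

    # Recompute the example KPIs from the counter-stepping overview above.
    add_req            = 1000   # connection attempts (AddReq received)
    add_rsp_ok         = 999    # AddRsp OK sent
    bearer_established = 997    # external bearer set up successfully
    normal_release     = 996    # SubReq/SubRsp OK (mature release)

    internal_accessibility = add_rsp_ok / add_req                 # 99.9%
    external_accessibility = bearer_established / add_rsp_ok      # ~99.8%
    retainability          = normal_release / bearer_established  # ~99.9%

    for name, value in [("accessibility (internal)", internal_accessibility),
                        ("accessibility (external)", external_accessibility),
                        ("retainability", retainability)]:
        print(f"{name}: {value:.1%}")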

     


Accessibility

Accessibility has been divided into two parts:

• Internal Accessibility measurement
  - Measured from the reception of AddReq to the sending of AddRsp.
  - Considers all internal resources of the MGw except admission control for IP and ATM.

• External Accessibility measurement
  - Measured from the sending of AddRsp (i.e. where internal accessibility ends) to successful bearer establishment. For non-CSD terminations the Nb or Iu has been initialized, for UDI calls the Q.AAL2 connection has been established, and for CSD calls the Iu or Nb has been initialized and the radio and fixed protocols are up.
  - Considers IP and ATM admission control and the external bearer setup protocols.

Retainability

Retainability should be a single KPI that covers the following measurements:

Internal Retainability measurement
• The measurement starts after the external bearer is up, i.e. where external accessibility ends.
• Considers failures of internal resources, e.g. an MSB or ET in the MGw, that lead to the call being disconnected abnormally.

External retainability
• Failures, e.g. Q.AAL2 RES or ICMP DU, that lead to a call being dropped.
• GCP commands that are replied to with an error code due to an external failure.

External retainability can be left at lower priority, as it can be assumed to be covered by the other nodes contributing to network retainability.

Integrity

Integrity is the ability of an external connection to maintain the requested service at the desired quality.

Traffic load

This category provides information about the current status of a node, mainly from a resource usage point of view.

The following items should be considered for daily measurement:


• Current Traffic Load
• Software Licensing, Media Stream Channel Utilization Rate
• Processor Load
• Media Stream Resource Reservation Rate
• STP&SGw, SEP and SRP Signalling Traffic (MSU/s)
• AAL2 Pipe Utilization Rate
• MTP3b Signalling Link Usage

The following KPIs should be considered for checking the traffic during special events (high traffic) or after a network change:

• MTP3b Signalling Link Usage
• Number of Received and Sent M3UA Payload Data Messages
• MTP2 Signalling Link Usage
• Received and Transmitted Bandwidth (bps) on a VC Link
• Usage Rate of Received and Transmitted ATM Cells on a VC Link
• Received and Transmitted Bandwidth (Mbps) in Fast Ethernet Signalling

    7.3  Key Performance Indicators for Internal Accessibility

The internal accessibility is the ability to obtain the requested service from the system between the reception of a GCP Add message and the sending of a GCP AddReply message. This KPI can be used, for example, to monitor the utilization and congestion rate of resources.

MGW Accessibility

MGW11    MGW21    MGW31
99.35%   92.95%   99.57%

Healthy value range: 99.7 - 100% (long-term average)

Possible reasons for falling below the healthy value range:
• Congestion in some M-MGw resources
• Maximum use of licensed software capacity

Possible consequence of falling below the healthy value range:
• Increased traffic rejection rate

Recommended actions when falling below the healthy value range:


• Check if the event 80 % Capacity Limit Met for Media Stream Channels or the event 100 % Capacity Limit Met for Media Stream Channels is issued.
• Check software capacity licenses.
• Analyze the following PIs to see if the problem concerns ATM, IP or TDM traffic: AAL2 Termination Seizure Success Rate, IP Termination Seizure Success Rate and TDM Termination Reservation Success Rate.
• Identify and redimension (if possible) the congested resources in the node.
• Check the status of related resources and devices.
• Check the counter MgwApplication.pmNrOfRejsByStaticAdmCtrl.
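The measured values can be screened against the 99.7% floor automatically. A minimal Python sketch using the per-MGW figures from the table above:

    # Minimal sketch: compare the measured internal accessibility per M-MGw against
    # the 99.7% floor of the healthy value range quoted above.

    HEALTHY_FLOOR_PCT = 99.7   # long-term average

    internal_accessibility = {"MGW11": 99.35, "MGW21": 92.95, "MGW31": 99.57}

    for node, value in internal_accessibility.items():
        if value < HEALTHY_FLOOR_PCT:
            print(f"{node}: {value:.2f}% is below the healthy range - "
                  f"check software capacity licences and congested resources")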

    7.4  Key Performance Indicators for External Accessibility

This chapter specifies the PIs for external accessibility that are supported by the M-MGw. The external accessibility is the ability to obtain the requested service from the system between the sending of a GCP AddReply message and the completion of the bearer setup.

The major KPI to monitor is the "Incoming AAL2 Connection Reservation Success Rate". This measurement is used for calculating the incoming AAL2 connection reservation success rate initiated by the adjacent node. The measurement is made per AAL2 Access Point (Aal2Ap).

Successful Rate in AAL2AP

MGW11   MGW21   MGW31
100%    100%    100%

Healthy value range: 99.7 - 100% (long-term average)

Possible reasons for falling below the healthy value range:
• AAL2 configuration mismatch between this node and the remote node
• Congestion in the remote node

Possible consequence of falling below the healthy value range:
• Increased traffic rejection rate

Recommended actions when falling below the healthy value range:
• Check the AAL2 configuration on the remote node and fix any faulty configuration found.
• Redimension the AAL2 pipe.
• Consider rerouting traffic to other nodes, or network expansion.

7.5   AAL2 Termination Seizure Success Rate


    MGW11

     Aal2 Rejection = 16466

     Aal2 Request = 15917054851

     Aal2 Termination success Rate = 99.98%

    Healthy value range: 99.7 — 100% 

    MGW21

     Aal2 Rejection = 13202

     Aal2 Request = 3245811935

     Aal2 Termination success Rate = 99.95%

    Healthy value range: 99.7 — 100%

    MGW31

     Aal2 Rejection = 52290

     Aal2 Request = 13450554991

     Aal2 Termination success Rate = 99.96%

    Healthy value range: 99.7 — 100%
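Sections 7.5 to 7.12 all report success rates derived from rejection and request (or attempt) counters. A minimal Python sketch of this kind of calculation follows, assuming success rate = (requests - rejections) / requests; the exact vendor counter semantics and measurement windows may differ, so results need not match the report figures to the last digit, and the numbers passed in below are placeholders.

    # Minimal sketch, assuming success rate = (requests - rejections) / requests.

    def seizure_success_rate(rejections: int, requests: int) -> float:
        """Termination seizure success rate as a percentage."""
        if requests == 0:
            return 100.0
        return 100.0 * (requests - rejections) / requests

    print(f"Example: {seizure_success_rate(rejections=120, requests=4_500_000):.2f}%")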

    7.6  TDM Termination Reservation Success Rate

    MGW11

    TDM Rejection = 2247053

    TDM Request = 24816959645

    TDM Termination success Rate = 99.09%

    Healthy value range: 99.7 — 100%

    MGW21

    TDM Rejection = 7057901


    TDM Request = 10919805864

    TDM Termination success Rate = 93.53%

    Healthy value range: 99.7 — 100%

    MGW31

    TDM Rejection = 952605

    TDM Request = 15153868410

    TDM Termination success Rate = 99.37%

    Healthy value range: 99.7 — 100%

    7.7  IP Termination Seizure Success Rate

Not applicable (N/A).

    7.8  Originating Nb Connection Initialization Success Rate

    MGW11

    Nb Init Fault = 0

    Nb Init = 4486564122

    Nb Connection Initialization Success Rate= 100 %

    Healthy value range: 99.7 — 100% 

    MGW21

    Nb Init Fault = 0

    Nb Init = 75132256

    Nb Connection Initialization Success Rate= 100 %

    Healthy value range: 99.7 — 100% 

    MGW31


Nb Init Fault = 0
Nb Init = 5027045203

    Nb Connection Initialization Success Rate= 100 %

    Healthy value range: 99.7 — 100% 

    7.9  Software Licensing, Media Stream Channel Seizure Success Rate

    MGW11

    Stream Channels Rejection= 0

    Stream Channel request = 35791661320

    Channel Seizure Success Rate = 100%

    Healthy value range: 99.7 — 100% 

    MGW21

    Stream Channels Rejection= 0

    Stream Channel request = 14008237011

    Channel Seizure Success Rate = 100%

    Healthy value range: 99.7 — 100%

    MGW31

    Stream Channels Rejection= 0

    Stream Channel request = 24743543905

    Channel Seizure Success Rate = 100%

    Healthy value range: 99.7 — 100% 

    7.10  Interactive Messaging, Basic Message Success Rate

Not evaluated, as no data is available.


    7.11  Interactive Messaging, Message Composition Success Rate

    MGW11

    Call Attempt = 5614692755

    Call Rejection = 0

    Message composition success Rate = 100%

    Healthy value range: 99.7 — 100%

    MGW21

    Call Attempt = 3095522565

    Call Rejection = 0

    Message composition success Rate = 100%

    Healthy value range: 99.7 — 100% 

    MGW31

    Call Attempt = 4019940232

    Call Rejection = 0

    Message composition success Rate = 100%

    Healthy value range: 99.7 — 100% 

    7.12  Outgoing AAL2 Connection Reservation Success Rate

    MGW11

    Succ Out Conns Remote Qos ClassA= 3922201

    UnSucc Out Conns Remote Qos ClassA=2087

     Aal2 connection success rate = 99.94%


    MGW21

    Data Not Available

    MGW31

    Succ Out Conns Remote Qos Class A= 2543085

    UnSucc Out Conns Remote Qos ClassA= 2069

     Aal2 connection success rate = 81.29%

    7.13  Retainability

It shall be possible to measure retainability at the M-MGw node level. In addition, it shall be ensured that external faults and problems, independent of the M-MGw, are excluded from the M-MGw retainability result.

The external part can be left at lower priority, as it can be assumed to be covered by the other nodes contributing to network retainability.

Note: the core-network-level retainability shall be measured in the MSC Server.

The Service Retainability measurement shows the M-MGw's ability to retain services, once obtained, for the desired duration. The measurement is made per physical M-MGw.

Retainability

    MGW11 MGW21 MGW31

    100% 100% 100%


    pmNrOfGcpNotifyCsdFaultAEst

The total number of Circuit Switched Data (CSD) termination faults encountered after bearer establishment (between establishment of the bearer and reception of the Gateway Control Protocol (GCP) Sub) that result in the sending of a GCP Notify message towards the MGC.

Condition: The counter is incremented when a Notify message is sent for CSD calls (both internal and external reasons counted) between establishment of the bearer and GCP Sub (tear-down of the connection).

    pmNrOfGcpNotifySpeechFaultAEst

The total number of speech termination faults encountered after bearer establishment (between establishment of the bearer and reception of the Gateway Control Protocol (GCP) Sub) that result in the sending of a GCP Notify message towards the Media Gateway Controller (MGC).

Condition: The counter is incremented when a Notify message is sent for speech calls (both internal and external reasons counted) between establishment of the bearer and GCP Sub (tear-down of the connection).

Possible reasons for falling below the healthy value range:
• High processor load
• Congestion in a device pool, for example in the AMR pool
• Problems (for example, faults) in some M-MGw resources

Possible consequence of falling below the healthy value range:
• Increased amount of dropped calls
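The report does not state the exact retainability formula, but the two counters above can feed a simple node-level indicator. The Python sketch below assumes retainability = 1 - (fault Notify messages after bearer establishment / established connections); this formula and all values are illustrative assumptions, not the vendor definition.

    # Minimal sketch of a node-level retainability indicator (assumed formula).
    # All values are hypothetical sample figures for the same measurement period.

    pmNrOfGcpNotifyCsdFaultAEst = 3          # CSD fault notifications after bearer establishment
    pmNrOfGcpNotifySpeechFaultAEst = 41      # speech fault notifications after bearer establishment
    established_connections = 1_250_000

    faults = pmNrOfGcpNotifyCsdFaultAEst + pmNrOfGcpNotifySpeechFaultAEst
    retainability = 1 - faults / established_connections
    print(f"Retainability indicator: {retainability:.4%}")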

    7.14  Integrity

Integrity is the ability of an external connection to maintain the requested service at the desired quality. It shall be possible to measure integrity at the M-MGw node level. Even though it may be difficult to get an objective view of what level of integrity (i.e. quality of service) is still normal and acceptable, the M-MGw shall have indicators for data-handling quality.

The possible integrity measurements on a connection type level are:

PI (Integrity)                   Healthy
ATM Transport QoS, Jitter        99.9%
IP Transport QoS, Packet Loss    99.9%
IP Transport QoS, Jitter         99.9%
SS7 over ATM QoS                 99.9%
SS7 over IP QoS                  99.9%
SS7 over TDM QoS                 99.9%


Traffic over ATM, except broadband signalling, is left out, since quality-related measurements on ATM would cause considerably high load on the node. For the same reason, all current ATM quality supervision measurements have to be set 'ON' separately, and their number is limited. Besides, ATM is considered very reliable and robust, so it would not be meaningful to monitor it (except when building up the network or when debugging specific problems).

    7.14.1  SS7 over ATM QoS

The SS7 over ATM QoS measurement is used for calculating the SS7 broadband signalling quality (over ATM). It shows the ratio of successfully handled signalling packets. The measurement is made per physical M-MGw.

SS7 Broadband Signalling Quality

MGW11    MGW21    MGW31
99.99%   99.99%   100%

Healthy value range: 99.999 - 100% (long-term average)

Possible reasons for falling below the healthy value range:
• Protocol errors
• Link congestion

Possible consequence of falling below the healthy value range:
• Decreased capacity for handling ATM-based broadband signalling

Recommended action when falling below the healthy value range:
• Reconfigure the Nni Saal Profile.

7.14.2  SS7 over TDM QoS


The SS7 over TDM QoS measurement is used for calculating the incoming and outgoing SS7 narrowband signalling quality (over TDM). It shows the ratio of successfully handled signalling packets. The measurement is made per physical M-MGw.

Narrowband Signalling Quality

The measurement is made for the following termination point types:
• Mtp2TpItu (when using the ITU standard)
• Mtp2TpAnsi (when using the ANSI standard)
• Mtp2TpChina (when using the MII standard)

MGW11    MGW21    MGW31
99.97%   99.98%   99.98%

Healthy value range: 99.999 - 100% (long-term average)

Possible reasons for falling below the healthy value range:
• Protocol errors
• Link congestion

Possible consequence of falling below the healthy value range:
• Decreased capacity for handling TDM-based narrowband signalling

Recommended action when falling below the healthy value range:
• Reconfigure the Mtp2 Profile.

    7.14.3  Signaling over IP discard Ratio (Giga Bit Ethernet interface)

The Signaling over IP QoS, IP Packet Discard Ratio measurements are used for calculating the IP Packet Discard Ratio (IPDR) of connections in an IP interface, defined for signaling-over-IP traffic, on an ET-MFG board. The measurement is made per IpInterface.

    Discard received IP datagram

    MGW11 MGW21 MGW31

    0 0 0

    Healthy value: At most 10^-5 (long term average, 0 - 0.001%)

Discard send IP datagram


    MGW11 MGW21 MGW31

    0 0 0

    Healthy value: At most 10^-5 (long term average, 0 - 0.001%)

The values are within the healthy range; no action is required.
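A minimal Python sketch of the IPDR check against the 10^-5 limit quoted above; the counter values used are hypothetical placeholders.

    # Minimal sketch: IP Packet Discard Ratio (IPDR) against the 10^-5 healthy limit.

    HEALTHY_LIMIT = 1e-5

    def ip_packet_discard_ratio(discarded: int, total: int) -> float:
        return discarded / total if total else 0.0

    ratio = ip_packet_discard_ratio(discarded=0, total=52_000_000)
    verdict = "within healthy range" if ratio <= HEALTHY_LIMIT else "exceeds healthy range"
    print(f"IPDR = {ratio:.2e} ({verdict})")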

    7.14.4  IP Bearer success rate (HOST)

The Signaling over IP QoS, IP Packet Error Ratio (Host) measurements are used for calculating the received IP Packet Error Ratio (IPER) in an IP host in the M-MGw, for signaling-over-IP related traffic. The measurement is made per IpAccessHostGpb.

    MGW11 MGW21 MGW31

    0 0 0

    Healthy value: At most 10^-5 (long term average, 0 - 0.001%)

The value is within the healthy range; no action is required.

    7.14.5   Aal2 Bearer establish success rate

    The AAL2 Bearer Establishment Success Rate measurement is used to monitor the AAL2 bearer establishment

    success rate. The measurement is made per VMGw.

    MGW11 MGW21 MGW31

    95.97% 100% 99.99%

Healthy value: 99.99%

A slight rejection rate is observed in MGW11. Recommended action when falling below the healthy value range: identify and redimension (if possible) the congested resources in the local node.

    7.14.6  SCTP


Number of SCTP packets received from the peers with an invalid checksum

MGW11 MGW21 MGW31

    0 0 0

    Number of unordered chunks sent to the peers

    MGW11 MGW21 MGW31

    Not Zero 0 Not Zero

    Number of unordered chunks received from the peers.

    MGW11 MGW21 MGW31

    Not Zero 0 Not Zero

    Number of sent chunks dropped, when the sending buffer overflows.

    MGW11 MGW21 MGW31

    0 0 0

The target value for all of these counters is 0. Where a counter is not zero, check the IP backbone for disturbance and fluctuation on the IP paths associated with the relevant SCTP association; the problem lies in the IP backbone.
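A minimal Python sketch that flags any of these SCTP counters deviating from the zero target; the counter labels and values are illustrative placeholders.

    # Minimal sketch: flag SCTP counters that deviate from their target value of 0.

    sctp_counters = {
        "packets_received_with_invalid_checksum": 0,
        "unordered_chunks_sent": 17,
        "unordered_chunks_received": 9,
        "chunks_dropped_on_send_buffer_overflow": 0,
    }

    for counter, value in sctp_counters.items():
        if value != 0:
            print(f"{counter} = {value}: check the IP backbone for the affected association")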

    7.14.7  Sigtran Retransmission

    MGW11 MGW21 MGW31

    0.0328 0.0030 0.0034


Healthy value: 0 - 0.001% (long-term average). The problem may be in the IP backbone; because of the continuous fluctuation in the IP backbone, close monitoring should be performed.
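A minimal Python sketch of the retransmission check, assuming the ratio is retransmitted data chunks divided by transmitted data chunks, expressed as a percentage; the counters used are hypothetical placeholders and the 0.001% limit is the healthy value above.

    # Minimal sketch: SIGTRAN retransmission ratio (assumed formula) vs. healthy limit.

    HEALTHY_LIMIT_PCT = 0.001

    retransmitted_chunks = 1_240
    transmitted_chunks = 3_800_000

    ratio_pct = 100.0 * retransmitted_chunks / transmitted_chunks
    status = "OK" if ratio_pct <= HEALTHY_LIMIT_PCT else "above the healthy limit - keep monitoring the IP backbone"
    print(f"SIGTRAN retransmission ratio: {ratio_pct:.4f}% ({status})")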

    7.14.8  M3UA

    The Number of Received and Sent M3UA Payload Data Messages

MGW     Sent           Received       Congestion
MGW11   420059165.5    381350197.8    1.67
MGW22   207794070.8    202341239.7    0
MGW33   425233858.3    413335803      0

The congestion figure here is not taken from a formula but is calculated as an average. The very slight congestion in MGW11 can be ignored, as it occurs in peak hours only; the recommendation is nevertheless to increase the number of associations. Disturbances in the IP backbone were observed quite often. Listed below are the times when a disturbance was seen on MGW11 and MGW21.

MGW21 and MGW11, M3UA payload data messages per 15-minute interval (sent / received):

From 20110317001500 to 20110318000000, every 15-minute interval on both MGW21 and MGW11 reported 0 messages sent and 0 messages received.
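Such a zero-traffic window can be located automatically from the measurement export. A minimal Python sketch, assuming a simple (timestamp, sent, received) sample layout; the figures outside the window are hypothetical placeholders.

    # Minimal sketch: scan 15-minute M3UA payload message samples and report the
    # window in which no traffic was seen (as in the interval list above).

    samples = [
        ("20110316234500", 41230, 40112),
        ("20110317000000", 39877, 39910),
        ("20110317001500", 0, 0),
        ("20110317003000", 0, 0),
        # ... further 15-minute samples ...
        ("20110318000000", 0, 0),
        ("20110318001500", 42051, 41866),
    ]

    zero_intervals = [ts for ts, sent, received in samples if sent == 0 and received == 0]
    if zero_intervals:
        print(f"No M3UA payload traffic from {zero_intervals[0]} to {zero_intervals[-1]} "
              f"({len(zero_intervals)} intervals)")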

    7.15  Key Performance Indicators for Traffic and Load

This category provides information about the current status of a node, mainly from a resource usage point of view. We suggest monitoring the following KPIs for traffic and load:

KPI (Traffic)                                                           Healthy
Processor Load                                                          0 - 80%
Current Traffic Load                                                    NA
Software Licensing, Media Stream Channel Utilization Rate (M-MGW R5)   0 - 80%

The following KPIs may be monitored in case of problems in a specific area.


PI (Traffic, optional)                                               Healthy
M-MGW Node
STP&SGw, SEP and SRP Signalling Traffic (MSU/s)                      See Ref. (unresolved cross-reference), page 36

User Plane Services
Media Stream Resource Reservation Rate                               0 - 80%
Number of GSM CSD Connections, Analogue (Modem) Services             NA
Number of GSM Fax Connections                                        NA
Number of Non-transparent GSM CSD Connections, Digital Services      NA
Number of Non-transparent WCDMA CSD Connections, Digital Services    NA
Number of Transparent WCDMA CSD Connections, Digital Services        NA
Number of WCDMA CSD Connections, Analogue (Modem) Services           NA

Q.2630
AAL2 Pipe Utilization Rate                                           0 - 80%

GCP
GCP Message Statistics                                               See Ref. (unresolved cross-reference)

SCCP
SCCP Policing                                                        0
SCCP Relay                                                           NA

MTP3/MTP3b/M3UA
MTP3b Signalling Link Usage                                          NA
Number of Received and Sent M3UA Payload Data Messages               NA
