
2012 IX International Symposium on Telecommunications (BIHTEL), October 25-27, 2012, Sarajevo, Bosnia and Herzegovina

978-1-4673-4876-8/12/$31.00 ©2012 IEEE

Basic Telephony SIP End-to-End Performance Metrics

Jasmina Baraković Husić, Himzo Bajrić
BH Telecom, Joint Stock Company, Sarajevo, Bosnia and Herzegovina
{jasmina.barakovic, himzo.bajric}@bhtelecom.ba

Emina Neković
Faculty of Electrical Engineering Sarajevo, Sarajevo, Bosnia and Herzegovina
[email protected]

Sabina Baraković
Ministry of Security of Bosnia and Herzegovina, Sarajevo, Bosnia and Herzegovina
[email protected]

Abstract— The Internet Protocol Multimedia Subsystem (IMS) has been recognized as a common signaling architecture for providing next generation multimedia services. In order to enhance perceived service quality regardless of access network and device, IMS supports Quality of Service (QoS) negotiation and signaling. The IMS procedures used for that purpose are based on the Session Initiation Protocol (SIP). Therefore, SIP signaling performance plays an important role in the overall Quality of Experience (QoE) in next generation networks. Although many standards for evaluating the performance of telephony signaling protocols have been proposed, none of their metrics address SIP. Recent research in this field has resulted in the definition of a standard set of metrics for measuring the performance of end-to-end SIP for basic telephony service. This paper provides insight into the process of measuring and evaluating SIP performance metrics as defined in Internet Engineering Task Force (IETF) Request for Comments (RFC) 6076. An open source IMS platform is used as the test environment for examining the SIP performance metrics under different conditions. The presented results show the impact of SIP signaling performance on QoE for IMS-based telephony service.

Keywords- IMS; IRA; ISA; QoE; QoS; RRD; SCR; SDT; SEER; SER; SIP; SRD

I. INTRODUCTION

The Internet Protocol Multimedia Subsystem (IMS) has been recognized as a generic standard architecture for offering next generation multimedia services. Initially specified by the Third Generation Partnership Project (3GPP) and adopted by other standards bodies such as the European Telecommunications Standards Institute (ETSI), the IMS has been defined as a multimedia session control subsystem based on a horizontally layered architecture for the provision of multimedia services [1]. Unlike best effort services on the Internet, IMS supports end-to-end Quality of Service (QoS) for converged mobile and fixed multimedia services [2]. These services require signaling before the session starts. The signaling is used for session management, including dynamic negotiation between the involved parties in order to agree on a common and feasible set of QoS parameters at session setup, as well as to react to changes during the session [3].

Different protocols can be used to handle this variety of signaling procedures. The IMS has adopted the Session Initiation Protocol (SIP) as the protocol of choice and integrates it into its signaling operations. SIP is a general-purpose, easy-to-implement application layer protocol originally designed for session establishment, modification, and release [4]. These SIP signaling procedures have a direct impact not only on QoS, but also on the overall Quality of Experience (QoE) [5]. Therefore, this paper deals with the process of measuring and evaluating SIP signaling performance.

Although many standards for evaluating the performance of telephony signaling protocols, such as Signaling System No. 7 (SS7), have been proposed, none of their metrics address SIP. Recent research in this field has resulted in the definition of a standard set of metrics for measuring the performance of end-to-end SIP for basic telephony service [6]. However, no numerical objectives or acceptance threshold values have been provided for these metrics.

The Internet Engineering Task Force (IETF) has defined the following SIP performance metrics in Request for Comments (RFC) 6076: Registration Request Delay (RRD), Ineffective Registration Attempts (IRAs), Session Request Delay (SRD), Session Disconnect Delay (SDD), Session Duration Time (SDT), Session Establishment Ratio (SER), Session Establishment Effectiveness Ratio (SEER), Ineffective Session Attempts (ISAs), and Session Completion Ratio (SCR). These performance metrics are intended to be used in production telephony environments as input for Key Performance Indicators (KPIs) and Service Level Agreement (SLA) indications. Additionally, they may be used for testing end-to-end SIP-based service environments. The use of these metrics provides a common viewpoint across all vendors, service providers, and users [6].

The aim of our research is to examine in detail the behavior of the SIP performance metrics defined in RFC 6076 under different conditions. The intention is to perform measurements on the open source IMS platform through a series of different scenarios and to consider the impact of SIP performance metrics on QoE for basic telephony service.

The rest of the paper is organized as follows. Section II describes related work. Section III introduces the measurement methodology and environment used for examining the SIP performance metrics. Measurement results are presented and discussed in Section IV. Section V gives concluding remarks and identifies opportunities for further research.


II. RELATED WORK

A standard set of common metrics for evaluating the performance of end-to-end SIP for telephony service has been defined by the IETF. Similar metrics for circuit-switched services have been specified by the International Telecommunication Union - Telecommunication Standardization Sector (ITU-T). Post-Selection Delay (PSD), proposed in ITU-T Recommendation E.721, is similar to the SRD defined in RFC 6076. Answer Seizure Ratio (ASR) and Network Effectiveness Ratio (NER), defined in ITU-T Recommendation E.411, are similar to SER and SEER, respectively [6].

Besides standardization activities, several studies have been performed, resulting in definitions of different SIP performance metrics. Session Setup Delay (SSD) is defined in [3] as the time interval from the moment a user sends an INVITE request until the user receives an alerting message. This metric is similar to Call Setup Delay (CSD), which is discussed in [2]. CSD consists of Post-Dialing Delay (PDD), Answer Signal Delay (ASD), and Call Release Delay (CRD). PDD is the time between when the caller presses the call button on the terminal and when the caller hears the callee's terminal ringing. ASD is the time between when the callee picks up the phone and when the caller receives an indication of this. CRD is the time between when the releasing party hangs up the phone and when the same party can initiate or receive a new call. The values of these metrics depend on the distance over which the calls are established; for example, a SIP call between Australia and Europe will exhibit significantly larger PDD, ASD, and CRD. The delay occurring during SIP message transmission for different values of Block Error Rate (BLER) is observed in [7], where it is concluded that this delay increases as the message size increases.

In order to describe the performance of a single SIP transaction, a new concept called Quality of Signaling (QoSg) is introduced in [8]. This concept defines the following metrics: User to User Delay (UUD), Processing Delay (PD) and Response Delay (RpD). UUD represents the time interval from sending a request until it arrives at the destination. PD represents the time the User Agent Server (UAS) needs to process the request and to send a response message. RpD is the time a User Agent Client (UAC) waits from sending a request until it receives a response message. The delay introduced by SIP transactions over 802.11b links is also investigated in [9].

The SIP performance metrics proposed in RFC 6076 are not sufficient to capture all the signaling exchanged over the entire duration of a session. In order to accurately assess the negotiation process, Session Negotiation Time (SNT) is defined in [10]. This metric indicates the time interval required for negotiation during a session: it represents the time elapsed from sending the SIP INVITE request that begins the negotiation procedure until the appropriate 200 OK response is received. In addition, a new metric called Session Renegotiation Time (SRNT) is introduced in [10]. It describes the renegotiation process, which is initiated by a SIP UPDATE request, or by an INVITE request carrying updated information, and ends when the corresponding 200 OK response is received.

Performance and modeling of SIP session setup have been studied in [11, 12]. The session setup performance was considered for SIP over different transport protocols: the setup delays of SIP over User Datagram Protocol (UDP) were found to be smaller than those over Transmission Control Protocol (TCP) or Stream Control Transmission Protocol (SCTP). The performed analysis has shown that lognormal and Rayleigh models should be applied for SSD components [11].

The SIP performance metrics were studied in [13] with the aim of evaluating the operation of the IMS core components. In order to benchmark how an IMS test bed performs, the following four metrics have been identified in [14]: Registration Time, Initial Response Time, Initial Ringing Time, and Disconnect Request Time. Although most research activities deal with the estimation of system-specific metrics, our research focuses on a standard set of metrics and their usage for evaluating the performance of end-to-end SIP for telephony service in a testing environment.

III. MEASUREMENT ENVIRONMENT AND METHODOLOGY

A. Measurement Environment

SIP performance metrics were measured on the Open Source IMS Core platform, developed in the laboratory of the Fraunhofer Institute for Open Communication Systems (FOKUS) in Berlin [15]. The simulations were carried out on an ACER ASPIRE 5732Z laptop with a 2.2 GHz Intel Pentium T4400 processor and GMA 4500M graphics. The Linux distribution Ubuntu 12.04 was used as an open source operating system. The measurements were based on the Multimedia Open InterNet Services and Telecommunication EnviRonment (myMONSTER) Telco Communicator Suite (TCS), Software Release 0.9.25 [16].

The Open Source IMS Core is an open source implementation of the Call Session Control Functions (CSCFs) and the Home Subscriber Server (HSS), which together form the core elements of the IMS architecture. The Open IMS CSCFs are built upon the SIP Express Router (SER), which can act as a SIP registrar, proxy, or redirect server and is capable of handling many thousands of calls per second. The Proxy-CSCF (P-CSCF) acts as a firewall at the application level. The Interrogating-CSCF (I-CSCF) has the role of a stateless proxy that queries the HSS and, based on the responses, routes messages to the appropriate Serving-CSCF (S-CSCF), which maintains the registration information. FOKUS developed its own prototype HSS, the FOKUS HSS (FHoSS), which stores the IMS user profiles and provides the location information of the user [17]. Fig. 1 shows an overview of the Open Source IMS Core architecture.

To measure the SIP performance metrics under different conditions, the Open Source IMS Core platform is loaded using the SIPp traffic generator, a free open source test tool for establishing and releasing multiple calls with the INVITE and BYE methods [18]. The Wireshark packet analyzer is used to capture the SIP signaling traffic and to analyze the SIP performance metrics in the different measurement scenarios [19].
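
As an illustration of how per-message timestamps can be extracted from such a capture for the metric calculations in Section IV, the following Python sketch uses the pyshark wrapper around Wireshark's dissectors. The trace file name sip_trace.pcap and the event tuple layout are our assumptions, not part of the original measurement setup:

    import pyshark

    # Open a Wireshark capture of a test run and keep only SIP messages.
    # 'sip_trace.pcap' is a hypothetical file name for such a capture.
    cap = pyshark.FileCapture('sip_trace.pcap', display_filter='sip')

    events = []
    for pkt in cap:
        sip = pkt.sip
        method = getattr(sip, 'method', None)       # set for requests: REGISTER, INVITE, BYE, ...
        status = getattr(sip, 'status_code', None)  # set for responses: 200, 408, 500, 600, ...
        call_id = getattr(sip, 'call_id', None)     # correlates the messages of one transaction
        events.append((float(pkt.sniff_timestamp), call_id, method, status))
    cap.close()

Pairing a request with its final response via the Call-ID then yields the raw timestamps needed for the delay metrics and the counts needed for the ratio metrics.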


Figure 1. Open Source IMS Core architecture overview

B. Measurement Methodology

The SIP performance measurements were performed using two User Agents (UAs): sip:[email protected] and sip:[email protected]. Additional SIP UAs, i.e., sip:[email protected] and sip:[email protected], were configured to generate traffic load in the IMS network. For that purpose, the following command was used: ./add-imscore-user_newdb.sh -u tom -t tel:11 -a.

The SIP end-to-end performance metrics were measured under different conditions. For that purpose, the SIPp traffic generator was used together with the corresponding eXtensible Markup Language (XML) scripts. These scripts simulate the registration and session establishment processes. The registration process is simulated using the following command:

./sipp -sf reg_alice.xml -r 10 -rp 1000 127.0.0.1:4060 -i 127.0.0.1 -p 3089

This command runs a script called reg_alice.xml, which registers the SIP UA Alice from port number 3089 by sending 10 REGISTER requests (-r 10) per time interval of 1000 ms (-rp 1000).

The IMS network is loaded by generating different, arbitrarily chosen numbers of REGISTER requests. This number ranges from 0 to 150 messages (generated at 10 messages per second) when the SIP registration metrics are considered, and from 0 to 100 messages (generated at 5 messages per second) when all other SIP performance metrics are measured. A lower message generation rate is used in the session establishment measurements than in the registration measurements because the S-CSCF becomes overloaded quickly during the registration process.

IV. MEASUREMENT RESULTS AND DISCUSSION

This section presents the measurement results for the SIP performance metrics defined in RFC 6076. The obtained results are not compared with those summarized in Section II, because those measurements were performed under different conditions and in different environments. However, the results are analyzed in the context of QoE.

A. Registration Request Delay (RRD)

The RRD describes the delay in responding to a REGISTER request. It is measured and analyzed only for successful REGISTER requests, and only at the originating UA [6]. This metric is calculated using (1):

\[ \text{RRD} = \text{Time of Final Response} - \text{Time of REGISTER Request} \quad (1) \]

Fig. 2 shows the values of this metric for different loads of the IMS network. Congestion of the S-CSCF occurred when 100 REGISTER requests were generated; since RRD is calculated only for successful registrations, those values are not shown in Fig. 2. With increasing IMS network load, the value of RRD also increases. With no load on the IMS network, RRD equals 95.72 ms. The RRD increases rapidly when the number of REGISTER requests generated by SIPp is in the range from 20 to 70 messages; after that, it increases linearly to about 2.6 s. Although satisfactory performance was achieved, it must be emphasized that the measurement conditions do not match the reference conditions of a production environment, both for this and for all other SIP performance metrics.
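
Given the timestamps of a REGISTER request and its final (non-1xx) response, (1) is a direct subtraction. A small illustrative helper follows (hypothetical names; SRD, SDD, and SDT in the subsections below reduce to the same pattern with different start and end events):

    def rrd_ms(t_register: float, t_final_response: float) -> float:
        """RRD per (1): time of the final response to a successful
        REGISTER minus the time the REGISTER was sent, in milliseconds."""
        return (t_final_response - t_register) * 1000.0

    # No-load example roughly matching the reported 95.72 ms:
    print(rrd_ms(10.00000, 10.09572))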

B. Ineffective Registration Attempts (IRA)

The IRA metric is used for detecting failures or defects that leave the registration server unable to receive a UA's REGISTER request [6]. It is calculated as the percentage of unsuccessful registrations out of the total number of REGISTER requests:

\[ \text{IRA}\,[\%] = \frac{N_{\text{IRA}}}{N_{\text{REGISTER}}} \times 100 \quad (2) \]

A total of 16 measurements were performed, yielding 11 successful and 5 unsuccessful registration attempts. The unsuccessful registration attempts resulted in 600 "Busy Everywhere" responses, indicating that the S-CSCF was overloaded.
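
As a worked check of (2) against these reported counts (function name ours):

    def ira_percent(n_failed_registrations: int, n_register_requests: int) -> float:
        """IRA per (2): unsuccessful registrations as a share of all attempts."""
        return 100.0 * n_failed_registrations / n_register_requests

    print(ira_percent(5, 16))  # -> 31.25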

Figure 2. Graphical representation of RRD


Equation (2) gives an IRA value of 31.25%, which is satisfactory but not representative of a production environment. It can be concluded that with increasing IMS network load, the value of IRA also increases.

C. Session Request Delay (SRD)

The SRD is used for detecting faults or defects that cause delays in responding to an INVITE request [6]. It is calculated for both successful and unsuccessful session setup attempts using (3):

\[ \text{SRD} = \text{Time of Status Indicative Response} - \text{Time of INVITE} \quad (3) \]

A total of 21 measurements were performed, yielding 15 successful and 6 unsuccessful session setup attempts. The obtained results for successful session setup SRD are shown in Fig. 3. It can be concluded that increasing IMS network load results in greater SRD values. Session setup attempts were unsuccessful when the number of generated REGISTER requests was in the range from 75 to 100 messages, resulting in 408 "Request Timeout" and 500 "Server Internal Error" responses. Fig. 4 shows the results for unsuccessful session setup SRD; the average value of this metric is about 29.95 s.

D. Session Disconnect Delay (SDD)

The SDD is defined as the interval between the first bit of the sent session completion message, such as BYE, and the last bit of the subsequently received 2xx response [6]. This metric is calculated using (4):

\[ \text{SDD} = \text{Time of 2xx or Timeout} - \text{Time of Completion Message (BYE)} \quad (4) \]

Since this metric can be measured at either end-point UA involved in the SIP dialog, two measurement scenarios were conducted: the first measured SDD at the originating UA, the second at the terminating UA. The obtained results are shown in Fig. 5. Given the small difference between the SDD values measured at the originating and terminating UA, it is practically irrelevant which UA terminates the established session.

E. Session Duration Time (SDT)

The SDT is defined as the interval between receipt of the first bit of a 200 OK response to an INVITE request and receipt of the last bit of the associated BYE request [6]. This metric is calculated using (5):

\[ \text{SDT} = \text{Time of BYE or Timeout} - \text{Time of 200 OK Response to INVITE} \quad (5) \]

The SDT is measured at both the originating and terminating UA, and the values measured at the two ends differ, as shown in Fig. 6. The nominal session duration between originating and terminating UA is 15 s regardless of IMS network load. However, the measured SDT exceeds this value and increases with the growing amount of generated SIP messages that overload the IMS network. This is expected, since IMS nodes need more time to process the corresponding SIP messages when they are under heavy load; moreover, the measurement environment introduces additional processing delay.

Figure 3. Graphical representation of successful session setup SRD

Figure 4. Graphical representation of unsuccessful session setup SRD

Figure 5. Graphical representation of SDD at originating and terminating UA

Figure 6. Graphical representation of SDT at originating and terminating UA


F. Session Establishment Ratio (SER)

The SER is used to detect the ability of a terminating UA or proxy server to successfully establish sessions. It is defined as the ratio of INVITE requests resulting in a 200 OK response to the total number of INVITE requests less those resulting in 3xx responses [6]. This metric is calculated using (6):

\[ \text{SER}\,[\%] = \frac{N_{\text{INVITE w/ 200}}}{N_{\text{INVITE}} - N_{\text{INVITE w/ 3xx}}} \times 100 \quad (6) \]

A total of 21 measurements were performed, yielding 15 successful and 6 unsuccessful session setup attempts. None of the responses were from the 3xx class, because all measurements were performed within a single domain. Therefore, the value of SER equals 71.43%.

G. Session Establishment Effectiveness Ratio (SEER)

The SEER is complementary to the SER, but is intended to exclude effects attributable to the individual user. It is defined as the ratio of the number of INVITE requests resulting in a 200 OK response or in a 480, 486, 600, or 603 response, to the total number of attempted INVITE requests less those resulting in a 3xx response [6]. This metric is calculated using (7):

\[ \text{SEER}\,[\%] = \frac{N_{\text{INVITE w/ 200, 480, 486, 600, or 603}}}{N_{\text{INVITE}} - N_{\text{INVITE w/ 3xx}}} \times 100 \quad (7) \]

None of the session attempts resulted in 480, 486, 600, or 603 response codes. Thus, the SEER is equivalent to the SER in our measurement study and equals 71.43%.

H. Ineffective Session Attempts (ISA)

An ISA occurs when a SIP entity internally releases a setup request due to a failure or overload condition [6]. This metric is calculated as a percentage of the total number of session setup requests using (8):

\[ \text{ISA}\,[\%] = \frac{N_{\text{ISA}}}{N_{\text{INVITE}}} \times 100 \quad (8) \]

Since 6 of the 21 session attempts in our measurement study were unsuccessful, the value of ISA equals 28.57%.

I. Session Completion Ratio (SCR)

The SCR represents the percentage of successfully completed sessions out of the total number of session attempts [6]. This metric is calculated using (9):

\[ \text{SCR}\,[\%] = \frac{N_{\text{SESSION}}}{N_{\text{INVITE}}} \times 100 \quad (9) \]

Since the measurements were made within a single domain, the SCR and SER are identical; the value of SCR equals 71.43%.
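
As a worked check, the ratio metrics (6)-(9) can be reproduced directly from the counts reported in this section (variable names ours; the single-domain setup means no 3xx and no 480/486/600/603 responses were observed):

    def ratio_percent(numerator: int, denominator: int) -> float:
        return 100.0 * numerator / denominator

    n_invite = 21        # total INVITE-based session attempts
    n_200 = 15           # attempts answered with 200 OK
    n_3xx = 0            # single domain, so no redirection responses
    n_busy = 0           # no 480, 486, 600, or 603 responses observed
    n_failed = 6         # ineffective session attempts
    n_completed = 15     # sessions completed normally

    ser = ratio_percent(n_200, n_invite - n_3xx)            # (6): 71.43%
    seer = ratio_percent(n_200 + n_busy, n_invite - n_3xx)  # (7): 71.43%
    isa = ratio_percent(n_failed, n_invite)                 # (8): 28.57%
    scr = ratio_percent(n_completed, n_invite)              # (9): 71.43%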

It can be noticed that with an increasing number of session attempts, the percentage of ISA also increases, while the SER, SEER, and SCR percentages decrease. Fig. 7 compares these metric values.

Figure 7. Comparison of values of SCR, SER, SEER and ISA

J. Discussion of Measurement Results

Service providers have traditionally focused on determining and managing QoS, not QoE. In today's competitive world of multimedia services, it is imperative for service providers to differentiate themselves in terms of managing QoE. Users have more than one option when choosing a service provider and do so on the basis of their experience and satisfaction: satisfied users recommend the service, whereas dissatisfied users leave it and share their experience with others. Therefore, keeping users loyal to a service provider and preventing churn is a major challenge [20].

Next generation multimedia services have to be delivered with a high QoE. These services require signaling before the session starts, which means that users experience the service quality for the first time during the signaling procedures. For example, the session setup procedure lasts from the moment the initial request is sent by the originating UA until the corresponding response, indicating that the new session can be admitted, is received. If the request fails or is delayed, the user's experience degrades. In this paper, QoE is defined as the measure of how well a system meets the user's expectations. Although this concept is directly related to QoS, the estimation of the user's experience is an entirely different measure than QoS [21]. This paper does not deal with the different methods of eliciting the user's experience, but indicates the importance of the SIP performance metrics as a factor that contributes to QoE.

Decomposing the service session into separate, measurable service elements that contribute to user QoE can be a very effective way to measure changes in QoE. The basic telephony service can be decomposed into the following sequential elements [22]: service start, connection setup time, voice quality, and call disconnect time. Traditional service quality measurements focus on the media-related service elements, such as voice quality, while the signaling-related service elements, such as connection setup time or call disconnect time, are insufficiently analyzed. Both factors matter for the user's perception of service quality: either excessive call setup delay or poor voice quality will result in a poor overall QoE evaluation, and if both occur within the same session, the total effect can be amplified.


Therefore, signaling-related service elements should be assigned high importance, because they can help service providers prevent churn and service failure, and thus survive in the ever growing competition.

V. CONCLUSION AND FUTURE WORK

This paper provides insight into the process of measuring and evaluating the SIP performance metrics defined in RFC 6076. Although signaling performance can be measured and analyzed both at the IP-based transport layer and at the session control layer, this paper considers only the SIP performance metrics at the session control layer. In particular, the behavior of the SIP performance metrics for IMS-based telephony service is examined in detail under different conditions. Measurements are performed on the open source IMS platform through a series of scenarios generated by SIPp. The obtained results show how the SIP performance metrics affect QoE for basic IMS-based telephony service. As expected, an increasing amount of signaling in the IMS network has a negative impact on the SIP performance metrics. Signaling performance is important and has a significant impact on the user experience; therefore, service providers must pay attention to the metrics analyzed in this paper.

Since service providers face rapidly growing signaling volumes, they need to establish methods for handling the explosion of signaling traffic generated by different types of applications. For example, signaling traffic is becoming the main bottleneck as Machine-to-Machine (M2M) applications proliferate. Therefore, service providers need new ways to meet each application's requirements and Service Level Agreements (SLAs) while protecting the network and making more efficient use of available resources. Being linked to various research areas, the identified problems form a starting point for future research activities.

REFERENCES

[1] G. Camarillo and M. A. Garcia–Martin, The 3G IP Multimedia Subsystem (IMS): merging the Internet and the cellular worlds. John Wiley & Sons Ltd, 2008.

[2] J. I. Agbinya, IP Communications and Services for NGN. CRC Press, 2010.

[3] H. Fathi, S. S. Chakraborty, and R. Prasad, Voice over IP in Wireless Heterogeneous Networks: Signalling, Mobility, and Security. Springer Science+Business Media B.V, 2009.

[4] A. B. Johnston, SIP: Understanding the Session Initiation Protocol. Artech House, London, 3rd Edition, 2009.

[5] J. M. Batalla, J. Sliwinski, H. Tarasiuk, and W. Burakowski, “Impact of Signaling System Performance on QoE in Next Generation Networks,” Journal of Telecommunications and Information Technology, 2009, pp. 124-137.

[6] D. Malas and A. Morton, Basic Telephony SIP End-to-End Performance Metrics. RFC 6076, Internet Engineering Task Force, 2011.

[7] V. Y. H. Kueh, R. Tafazolli, and B. G. Evans, “Performance Evaluation of SIP Based Session Establishment over Satellite-UMTS,” In the proceedings of the 57th IEEE Semiannual Vehicular Technology Conference (VTC 2003-Spring), Jeju, Korea, 2003, pp. 1381-1385.

[8] M. Happenhofer, C. Egger, and P. Reichl, “Quality of Signaling: A New Concept for Evaluating the Performance of non-INVITE SIP Transactions,” In the proceedings of the 22nd International Teletraffic Congress (ITC), Amsterdam, Netherlands, 2010, pp. 1-8.

[9] C. Hesselman, H. Eertink, I. Widya, and E. Huizer, “Measurements of SIP Signaling over 802.11b Links,” In the proceedings of the 3rd ACM International Workshop on Wireless Mobile Applications and Services on WLAN Hotspots (WMASH 2005), Cologne, Germany, 2005, pp. 74-83.

[10] A. Brajdić, M. Suznjević, and M. Matijašević, “Measurement of SIP Signaling Performance for Advanced Multimedia Services,” In the proceedings of the 10th International Conference on Telecommunications (ConTEL 2009), Zagreb, Croatia, 2009, pp. 381-388.

[11] A. Ahmed, “Performance and Modeling of SIP Session Setup,” Master thesis. Blekinge Institute of Technology, Karlskrona, Sweden, 2009.

[12] A. Alfallah and H. Alkaabey, “Study the Performance of the Use of SIP for Video Conferencing,” Master thesis. Blekinge Institute of Technology, Karlskrona, Sweden, 2011.

[13] D. Thisen, J. M. Espinosa Carlín, and R. Herpertz, “Evaluating the Performance of an IMS/NGN Deployment,” In the proceedings of the 2nd Workshop on Services, Platforms, Innovations and Research for new Infrastructures in Telecommunications (SPIRIT 2009), Lübeck, Germany, 2009, pp. 2561-2573.

[14] J. Tang, C. Davids, and Y. Cheng, “A Study of an Open Source IP Multimedia Subsystem Test Bed,” In the proceedings of the 5th International ICST Conference on Heterogeneous Networking for Quality, Reliability, Security and Robustness (QShine 2008), Hong Kong, China, 2008.

[15] Fraunhofer FOKUS website. Available online: http://www.fokus.fraunhofer.de/en/fokus/index.html. Accessed on July 23rd, 2012.

[16] The myMONSTER - Telco Communicator Suite website. Available online: http://www.monster-the-client.org/. Accessed on July 28th, 2012.

[17] The Open Source IMS Core Project website. Available online: http://openimscore.org/. Accessed on July 28th, 2012.

[18] The SIPp website. Available online: http://sipp.sourceforge.net/. Accessed on August 14th, 2012.

[19] The Wireshark website. Available online: http://www.wireshark.org/. Accessed on August 14th, 2012.

[20] T. N. Minhas, “Network impact on Quality of Experience of Mobile Video,” Doctoral thesis. Blekinge Institute of Technology, Karlskrona, Sweden, 2012.

[21] S. Baraković, J. Baraković, and H. Bajrić, “QoE Dimensions and QoE Measurement of NGN Services,” In the proceedings of the 18th Telecommunications Forum (TELFOR 2010), Belgrade, Serbia, 2010.

[22] H. Batteram et al., “Delivering Quality of Experience in Multimedia Networks,” Bell Labs Technical Journal, 2010, pp. 175-194.
