
Wireless Pers Commun (2013) 69:1719–1733. DOI 10.1007/s11277-012-0659-6

A Comparative Study of Different Entropies for Spectrum Sensing Techniques

Wanjing Zhu · Jianguo Ma · Oliver Faust

Published online: 19 May 2012. © Springer Science+Business Media, LLC. 2012

Abstract In this paper, we propose entropy-based spectrum sensing schemes to detect the existence of a primary user in Cognitive Radio (CR). To support this proposal, we have studied four types of entropies [Approximate Entropy (ApEn), Bispectral Entropy (BispEn), Sample Entropy (SamEn) and Rényi Entropy (RenyiEn)] and their applications for spectrum sensing. The reason for investigating these entropies comes from the fact that different types of entropies have different characteristics which make them more or less suitable for specific applications. Monte Carlo simulations were executed to find out how suitable these measures are for CR. Averaged value curves, boxplots and Analysis of Variance (ANOVA) tests document the performance of these types of entropies. The results show that BispEn outperformed the other three entropy measures. The ANOVA test shows that BispEn can sense modulated signals when the Signal to Noise Ratio (SNR) is as low as −15 dB. This is at least a 5 dB improvement compared to the other entropies studied in this paper.

Keywords Entropy · Cognitive radio · Spectrum sensing

Abbreviations

AWGN Additive White Gaussian Noise
ANOVA Analysis of Variance

W. Zhu (B)
School of Electronic Engineering, University of Electronic Science and Technology of China, Chengdu, China
e-mail: [email protected]

J. Ma
School of Electronic Information Engineering, Tianjin University, Tianjin, China
e-mail: [email protected]

O. Faust
School of Engineering, Ngee Ann Polytechnic, Singapore, Singapore
e-mail: [email protected]


ApEn Approximate Entropy
ASK Amplitude-shift keying
BispEn Bispectral Entropy
CR Cognitive Radio
EEG Electroencephalography
FCC Federal Communications Commission
HRV Heart Rate Variability
RenyiEn Rényi Entropy
RRC Root-raised Cosine
SamEn Sample Entropy
SNR Signal to Noise Ratio
OFDM Orthogonal Frequency-division Multiplexing

1 Introduction

The spectrum used for wireless communication is a limited resource. Transmitters which use a specific spectrum band need to be licensed by governments. Most of this precious resource is licensed to a single user under a static control scheme. The rapid growth of communication services and prosperous communication markets demands a high level of spectrum efficiency. Haykin has shown that this requirement cannot be satisfied with a static frequency allocation policy [1].

Cognitive Radio (CR) is a possible solution to the problem of spectrum shortage [2], because it uses dynamic spectrum access techniques. The parameters of CR are reconfigurable and they can be set such that they adapt to the surrounding radio scenario [3]; therefore, it is possible to increase the spectrum resource efficiency in both time and space domains. In CR terminology, primary users have higher priority or legacy rights on the usage of a specific spectrum band, while secondary users have lower priority and need to exploit the spectrum in a way which does not interfere with primary users [4]. Hence, secondary users have to sense the spectrum in order to detect whether or not a particular band is occupied by primary users. In order to know the surrounding radio scenario, CR systems have to perform one important task, called spectrum sensing. To be specific, spectrum sensing means the detection of unoccupied bands, the so-called spectrum holes, that can be used for data transmission [5].

One of the challenges in spectrum sensing arises from the hidden primary user problem [6]. Figure 1 shows a scenario where the hidden primary user problem occurs. The CR system cannot detect the presence of a signal which was transmitted by a primary user. The dashed circles indicate the transmission areas. When the primary user receiver is in the intersection, the cognitive radio device cannot detect the existence of the primary user transmitter and will therefore cause interference to the primary user receiver. Secondary users, such as CRs, should have a sensitive detection method to achieve high performance spectrum sensing, one which can sense the modulated signal at low Signal to Noise Ratio (SNR). Having this ability means that the detection area of the CR device is expanded. Hence, there is less possibility for the hidden primary user problem to occur. Nagaraj proposed an entropy-based algorithm for spectrum sensing when the noise and interference level is unknown [7]; it has a better performance than a cyclostationarity based detector. Another entropy-based detector, working in the frequency domain, was introduced in [8,9]. It can detect the primary user at a lower SNR than the scheme proposed in reference [7]. The results presented in these papers show that entropy is an effective feature for spectrum sensing. It is therefore useful to investigate different entropies and their performance in spectrum sensing.


Fig. 1 Hidden primary user scenario

In this paper, we study approximate entropy, sample entropy, Rényi entropy and bispectral entropy and analyze their ability to detect the presence of modulated signals at low SNR. Additive White Gaussian Noise (AWGN) is selected to model background noise for this spectrum sensing application and Amplitude-shift keying (ASK) [10] is chosen as the modulation scheme for the primary users. Analysis of Variance (ANOVA) [11] tests are used to evaluate the performance of the four different entropies. Our results indicate that bispectral entropy outperforms approximate, sample and Rényi entropy. We support this claim with appropriate graphs and tables. The Bispectral Entropy (BispEn) can detect the modulated signal when the SNR is as low as −15 dB.

Section 2 introduces the methods we used in this paper. A MATLAB [12] simulation gives us the data series for different SNRs. The detailed setup of the simulation is also stated in this part. Then we introduce the definitions of the entropies used in this paper. A brief introduction to ANOVA is given in this part as well. The simulation results are presented in Sect. 3 and Sect. 4 discusses these results. This paper concludes with Sect. 5.

2 Methods Used

This section describes the simulation setup in detail. Monte Carlo simulations [13] in MATLAB generate groups of data series for spectrum sensing. A brief introduction of the spectrum sensing model is given. We define the spectrum sensing task as identifying whether or not the received signal contains a modulated signal. After that, the four types of entropies used in this paper are presented together with their mathematical representations. Then we introduce both boxplots and ANOVA, the methods which were used for the entropy data analysis.

The mathematical notation, which describes the signals in this paper, starts with defining the received wireless communication signal. We assume that the received signal is a discrete-time sequence y[n], where n is the discrete time variable which ranges from 1 to N and N is the length of the sequence. This signal is partitioned into individual sample sequences. These sample sequences are modeled as m-dimensional vectors x_i^m:

x_i^m = (y[i], y[i + 1], ..., y[i + m − 1])    (1)

where i = 1, 2, ..., N − m + 1. The sample sequences x_i^m are used for the calculation of Approximate Entropy (ApEn) and Sample Entropy (SamEn).

Fig. 2 Simulation process

2.1 Simulation Setup

Figure 2 shows the simulation procedure. In this simulation, the primary users employed ASK as the modulation scheme to create the transmission signal. Random binary data was chosen as the source information and a Root-raised Cosine (RRC) filter was used to reduce the occupied bandwidth. AWGN channels were used to simulate the transmission environment and different SNR values were set to obtain different groups of received signals. The power of the modulated signal with noise was normalized. All the data series were produced through simulations with MATLAB. Four groups of entropy values, one group for each of the four entropy measures introduced in Sect. 2.3, were calculated from the time series. Then the entropy values were analyzed with boxplots and ANOVA tests.

The frequency values in the simulation were normalized to the symbol rate (f_s):

f̄ = f / f_s    (2)

The carrier frequency of each modulation scheme was 5 and the sample rate was 200. Every stream of the modulation signal contained 200 symbols, which means each data series contains 40,000 samples.
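The authors ran these simulations in MATLAB [12]. The following Python/NumPy sketch shows one way to generate a comparable received sequence y[n]. The paper specifies the modulation scheme, the normalized carrier frequency (5), the samples per symbol (200) and the number of symbols (200); the RRC roll-off factor, the filter span, the on-off amplitude levels and the normalization step are illustrative assumptions.

```python
import numpy as np

def rrc_filter(beta, span, sps):
    """Root-raised cosine impulse response (beta: roll-off, span: symbols, sps: samples/symbol)."""
    t = np.arange(-span * sps / 2, span * sps / 2 + 1) / sps
    h = np.zeros_like(t)
    for k, ti in enumerate(t):
        if np.isclose(ti, 0.0):
            h[k] = 1.0 - beta + 4 * beta / np.pi
        elif np.isclose(abs(ti), 1.0 / (4 * beta)):
            h[k] = (beta / np.sqrt(2)) * ((1 + 2 / np.pi) * np.sin(np.pi / (4 * beta))
                                          + (1 - 2 / np.pi) * np.cos(np.pi / (4 * beta)))
        else:
            h[k] = (np.sin(np.pi * ti * (1 - beta)) + 4 * beta * ti * np.cos(np.pi * ti * (1 + beta))) \
                   / (np.pi * ti * (1 - (4 * beta * ti) ** 2))
    return h / np.sqrt(np.sum(h ** 2))            # unit-energy pulse

def ask_rrc_awgn(snr_db, n_symbols=200, sps=200, fc=5.0, beta=0.35, rng=None):
    """Binary ASK symbols, RRC pulse shaping, carrier at fc (normalized to the symbol rate), AWGN."""
    rng = np.random.default_rng() if rng is None else rng
    bits = rng.integers(0, 2, n_symbols)             # random binary source
    pulses = np.zeros(n_symbols * sps)
    pulses[::sps] = bits.astype(float)               # on-off amplitude levels (binary ASK)
    baseband = np.convolve(pulses, rrc_filter(beta, 8, sps), mode="same")
    n = np.arange(baseband.size)
    s = baseband * np.cos(2 * np.pi * fc * n / sps)  # up-convert: 5 carrier cycles per symbol
    s /= np.sqrt(np.mean(s ** 2))                    # normalize signal power to 1
    noise = rng.normal(scale=np.sqrt(10 ** (-snr_db / 10.0)), size=s.size)
    return s + noise                                 # received sequence y[n], 40,000 samples
```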

2.2 Spectrum Sensing Model

When the primary user is transmitting, the received signal for the CR is modeled as:

y(t) = s(t) + w(t) (3)

In Eq. 3, s(t) is the modulated signal transmitted by the primary user while w(t) is the noise from the channel. The following equation defines the discrete-time representation of this relationship:


y[n] = s[n] + w[n], n = 1, 2, ..., N    (4)

Mathematically, the spectrum sensing problem can be presented as the following two hypotheses:

H0 : y[n] = w[n]
H1 : y[n] = s[n] + w[n]    (5)

The null hypothesis H0 means that there is no signal transmitted by primary users while H1 denotes the presence of the primary signal.
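The paper evaluates the entropy features statistically rather than defining a complete detector, but the hypothesis test above can be sketched as a simple threshold rule: because a modulated signal lowers the entropy relative to noise (Sect. 2.3), H1 is declared when the measured entropy falls below a threshold calibrated on noise-only data. The function names, the quantile-based calibration and the threshold `lam` are hypothetical illustrations, not part of the paper.

```python
import numpy as np

def calibrate_threshold(entropy_fn, noise_trials, quantile=0.01):
    """Hypothetical calibration: use a low quantile of the noise-only entropy values
    as the decision threshold, so the false alarm rate is roughly the chosen quantile."""
    noise_entropies = np.array([entropy_fn(w) for w in noise_trials])
    return np.quantile(noise_entropies, quantile)

def detect(entropy_fn, y, lam):
    """Return True (H1: primary user present) if the entropy of y falls below lam,
    otherwise False (H0: noise only)."""
    return entropy_fn(y) < lam
```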

2.3 Entropies

Entropy is widely used for measuring the randomness, uncertainty and diversity of a signal. There are different types of entropies for time series data. The modulated signal contains predictable structures which allow the receiver to access the transmitted information. Therefore, the entropy values of white Gaussian noise would be larger than those of the modulated signal.

2.3.1 Approximate Entropy (ApEn)

ApEn was proposed by Pincus as a measurement of system complexity [14]. It is a quantitative regularity statistic which measures the predictability of time series fluctuations. We start the mathematical discussion of ApEn by defining:

C_i^m(r) = [ Σ_{j=1}^{N−m+1} Θ(i, j, m, r) ] / (N − m + 1)    (6)

Θ(i, j, m, r) is defined below:

Θ(i, j, m, r) = 1, if ‖x_i^m − x_j^m‖ < r; 0, otherwise    (7)

where {x_i^m} is a vector which represents a sample sequence of the received signal, as shown in Eq. 1. Let us define Φ^m(r) as:

Φ^m(r) = [ Σ_{i=1}^{N−m+1} ln C_i^m(r) ] / (N − m + 1)    (8)

Finally, the definition of ApEn follows as:

ApEn(m, r, N) = Φ^m(r) − Φ^{m+1}(r)    (9)

If the time series is predictable, the past and current values enable us to predict the future value and the ApEn will decrease [15]. Due to the structured patterns introduced by modulation schemes, wireless communication signals are, to some extent, predictable; hence ApEn can be used as a feature for signal detection in spectrum sensing. It has been used in biomedical engineering to identify abnormalities in biomedical signals, such as Electroencephalography (EEG) [16].
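A minimal NumPy sketch of Eqs. (6)–(9) follows. The embedding dimension m = 2 and tolerance r = 0.2 × std(y) are common defaults in the ApEn literature, not values stated in the paper; the loop keeps memory low but is still O(N²) in time, so in practice it would be applied to short segments.

```python
import numpy as np

def approximate_entropy(y, m=2, r=None):
    """ApEn(m, r, N) following Eqs. (6)-(9)."""
    y = np.asarray(y, dtype=float)
    r = 0.2 * np.std(y) if r is None else r

    def phi(mm):
        # Embedding vectors x_i^mm = (y[i], ..., y[i+mm-1]) from Eq. (1)
        x = np.lib.stride_tricks.sliding_window_view(y, mm)
        c = np.empty(len(x))
        for i in range(len(x)):
            dist = np.max(np.abs(x - x[i]), axis=1)   # Chebyshev distance to x_i^mm
            c[i] = np.mean(dist < r)                  # C_i^mm(r), Eqs. (6)-(7)
        return np.mean(np.log(c))                     # Phi^mm(r), Eq. (8)

    return phi(m) - phi(m + 1)                        # Eq. (9)
```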


2.3.2 Sample Entropy (SamEn)

SamEn is the negative natural logarithm of the conditional probability that two sequences which are similar for m points remain similar at the next point [17]. It has the advantage that the prediction accuracy does not depend on the time series length. Similar to the discussion of ApEn, we define preliminary functions before we combine them to form the definition of SamEn. We start with B_i^m(r):

B_i^m(r) = [ 1 / (N − m − 1) ] Σ_{j=1}^{N−m} Θ(i, j, m, r)    (10)

where Θ is defined in Eq. (7). B^m(r) is defined as below [17]:

B^m(r) = [ 1 / (N − m) ] Σ_{i=1}^{N−m} B_i^m(r)    (11)

A^m(r) is defined analogously as the B^m(r) value computed from the (m + 1)-dimensional vectors {x_i^{m+1}}. The equation for SamEn is:

SampEn(m, r, N) = − ln [ A^m(r) / B^m(r) ]    (12)

SamEn is used as a measure of signal complexity. Al-Angari and Sahakian used it to evaluate the behavior of Heart Rate Variability (HRV) in obstructive sleep apnoea syndrome [18]. The SamEn-based scheme had an accuracy of 70.3 % in a minute-by-minute classification and proved to be a useful tool to detect apnea episodes during sleep.
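A corresponding sketch of Eqs. (10)–(12) is given below. Following the standard SamEn definition [17], self-matches (j = i) are excluded from the counts; the defaults m = 2 and r = 0.2 × std(y) are again illustrative assumptions rather than values taken from the paper.

```python
import numpy as np

def sample_entropy(y, m=2, r=None):
    """SampEn(m, r, N) following Eqs. (10)-(12)."""
    y = np.asarray(y, dtype=float)
    r = 0.2 * np.std(y) if r is None else r

    def match_count(length):
        # Both B^m(r) and A^m(r) are built from the first N - m template vectors
        x = np.lib.stride_tricks.sliding_window_view(y, length)[:y.size - m]
        total = 0
        for i in range(len(x)):
            dist = np.max(np.abs(x - x[i]), axis=1)   # Chebyshev distance
            total += np.sum(dist < r) - 1             # count matches, excluding j = i
        return total

    b = match_count(m)        # numerator of B^m(r)
    a = match_count(m + 1)    # numerator of A^m(r)
    return -np.log(a / b)     # Eq. (12): the shared normalization constants cancel
```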

2.3.3 Rényi Entropy (RenyiEn)

RenyiEn is a generalization of Shannon entropy named after Alfréd Rényi [19]. RenyiEn extends Shannon entropy to a family of entropy measures, which makes it flexible in its applications.

The definition of the Rényi entropy is:

RenyiEn_q = [ 1 / (1 − q) ] log2 { Σ_i [p_r(i)]^q }    (13)

where p_r(i) represents the probabilities of {x_i}, q > 0 and q ≠ 1. When q approaches 1, the Rényi entropy converges to the Shannon entropy. In this paper, we choose the q value as 2.5.

RenyiEn values reflect differences between modulation schemes; therefore, Tabatabaei et al. have used these values as one of four features for modulation recognition [20].
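An illustrative sketch of Eq. (13) with q = 2.5 follows. The paper does not state how the probabilities p_r(i) were estimated; a simple amplitude histogram is assumed here, and the bin count is arbitrary.

```python
import numpy as np

def renyi_entropy(y, q=2.5, bins=64):
    """Rényi entropy of order q, Eq. (13), with histogram-estimated probabilities."""
    counts, _ = np.histogram(y, bins=bins)
    p = counts[counts > 0] / counts.sum()        # empirical probabilities p_r(i)
    return np.log2(np.sum(p ** q)) / (1.0 - q)   # (1/(1-q)) * log2( sum_i p_r(i)^q )
```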

2.3.4 Bispectral Entropy (BispEn)

BispEn is a kind of spectral entropy which is derived from the so-called bispectrum. The bispectrum is a statistic used to search for nonlinear interactions.

We start the discussion of BispEn by defining the bispectrum [21]:

B(f1, f2) = E[ X(f1) X(f2) X*(f1 + f2) ]    (14)

where E indicates mathematical expectation.


Fig. 3 Region of Ω


The BispEn is [22]:

BispEn = − Σ_i p_i log p_i    (15)

where p_i is defined as follows:

p_i = |B(f1, f2)| / Σ_Ω |B(f1, f2)|    (16)

Figure 3 shows the Ω region, which is the area where f1 > 0, f2 > 0, f1 > f2 and f1 + f2 < 1.

BispEn has been used as an effective method for acoustic vehicle detection [23]. The bispectral entropy detector can detect vehicles which are more than 1,000 m away, whereas a time domain detector can only do so at about 200 m and a frequency domain detector at about 500 m. The results for the application of BispEn in spectrum sensing will be shown in Sect. 3.
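A sketch of Eqs. (14)–(16) is shown below, estimating the bispectrum by averaging X(f1)X(f2)X*(f1+f2) over non-overlapping signal segments. The segment length and the mapping of the region Ω onto FFT-bin indices (k1 > k2 > 0, k1 + k2 below the Nyquist bin) are assumptions; the paper does not describe its bispectrum estimator.

```python
import numpy as np

def bispectral_entropy(y, seg_len=256):
    """BispEn following Eqs. (14)-(16), with a segment-averaged bispectrum estimate."""
    y = np.asarray(y, dtype=float)
    n_seg = y.size // seg_len
    half = seg_len // 2
    bisp = np.zeros((half, half), dtype=complex)
    for s in range(n_seg):
        x = np.fft.fft(y[s * seg_len:(s + 1) * seg_len])
        for k1 in range(1, half):
            for k2 in range(1, k1):               # Omega: k1 > k2 > 0
                if k1 + k2 < half:                # and k1 + k2 below the Nyquist bin
                    bisp[k1, k2] += x[k1] * x[k2] * np.conj(x[k1 + k2])
    mag = np.abs(bisp) / max(n_seg, 1)            # |B(f1, f2)| averaged over segments, Eq. (14)
    p = mag[mag > 0] / mag.sum()                  # p_i over the region Omega, Eq. (16)
    return -np.sum(p * np.log(p))                 # BispEn, Eq. (15)
```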

2.4 Boxplots

Boxplots are an excellent way to study data characteristics, because they depict the underlying statistical distribution of the data. The method was proposed by Tukey [24]. In a basic boxplot, there are five parameters which visualize the data statistics: the 0.25 quartile, the median, the 0.75 quartile, the minimum value and the maximum value. These parameters characterize the statistical distribution of the data. Figure 4 shows an example boxplot. The boxplots of the different entropies will be shown in Sect. 4.
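For illustration, such a plot can be produced with a single call; the entropy-value groups below are synthetic and only demonstrate the plotting step, not results from the paper.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic entropy-value groups, one per SNR setting, purely for illustration.
rng = np.random.default_rng(0)
snrs = list(range(-10, 1))
groups = [rng.normal(loc=0.88 + 0.001 * k, scale=0.01, size=200) for k, _ in enumerate(snrs)]

fig, ax = plt.subplots()
ax.boxplot(groups, labels=[str(s) for s in snrs])   # one box (median, quartiles, whiskers) per group
ax.set_xlabel("SNR (dB)")
ax.set_ylabel("Normalized entropy value")
plt.show()
```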

2.5 Analysis of Variance (ANOVA)

Analysis of Variance (ANOVA) is a statistical method to test whether or not the means of several groups are all equal. It is a formal analysis of variance proposed by Fisher [25]. The null hypothesis is that the means of all the different testing groups are statistically the same. An ANOVA test gives us a p value which indicates whether or not the difference of the means is statistically significant. When the p value is small, the null hypothesis is rejected, which means that the different testing groups have different mean values.

Fig. 4 Boxplot example (the annotated statistics are the maximum, 0.75 quartile, median, 0.25 quartile and minimum)

In this paper, ANOVA tests have been used to evaluate the effectiveness of the four different entropy measures for spectrum sensing. From the p values, we can identify which type of entropy has the best performance. This will help us to build better spectrum sensing algorithms.
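The authors performed these tests in MATLAB; an equivalent one-way ANOVA between a noise-only group and a signal-plus-noise group can be sketched with SciPy. The SciPy call is an assumption about tooling, and the synthetic group values (loosely based on the BispEn rows of Table 1) are for illustration only.

```python
import numpy as np
from scipy.stats import f_oneway

# Two groups of 200 entropy values each: noise only vs. modulated signal in noise.
# The numbers are drawn synthetically, loosely following Table 1, just to show the test call.
rng = np.random.default_rng(1)
noise_entropy = rng.normal(0.898, 0.004, 200)
signal_entropy = rng.normal(0.884, 0.007, 200)

f_stat, p_value = f_oneway(noise_entropy, signal_entropy)
# A small p value rejects the null hypothesis that both groups share the same mean,
# i.e. the entropy measure separates H0 from H1 at this SNR.
print(f_stat, p_value)
```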

3 Simulation Results

This section presents the performance results of the four different entropy measures. The time series data, which models the received signal for Cognitive Radio (CR), comes from the simulation; all the detailed information about it is stated in Sect. 2.1. The entropy values were calculated from the time series in MATLAB. We present the values of all four types of entropies introduced in Sect. 2.3. To present the results, we have plotted the averaged entropy value curves and boxplots. These plots show the performance of the different entropies for spectrum sensing. The ANOVA test results show the significance of the difference between signals with low SNR and pure white noise.

3.1 Entropy Values

Through Monte Carlo simulations, we extracted the individual entropy values from the signals. Table 1 shows means and standard deviations for each set of 200 raw entropy values. The raw entropy values were used for the following analysis.

3.2 Averaged Entropy Values

In this part, we investigate averaged entropy values to see how sensitive they are to the Signal to Noise Ratio (SNR). Figure 5 shows these values for the four types of entropies. Each result was obtained by averaging 200 Monte Carlo trials. All the values are normalized by the mean entropy value of white Gaussian noise. As the differences between ApEn, SamEn and RenyiEn cannot be seen clearly in Fig. 5a, Fig. 5b shows them in detail.

Figure 5a depicts the excellent performance of BispEn. The BispEn values are more distinct than the other entropies at low SNRs. Figure 5b indicates that SamEn has a better performance than ApEn. Table 1 indicates that the standard deviation of SamEn is larger than that of ApEn.


Table 1 Entropy values

SNR (dB)     ApEn              SamEn             RenyiEn           BispEn
0            2.3179 ± 0.0011   2.1625 ± 0.0025   6.9905 ± 0.0010   0.7150 ± 0.0432
−1           2.3218 ± 0.0010   2.1698 ± 0.0025   6.9913 ± 0.0010   0.7369 ± 0.0486
−2           2.3248 ± 0.0007   2.1745 ± 0.0026   6.9917 ± 0.0011   0.7719 ± 0.0344
−3           2.3269 ± 0.0006   2.1785 ± 0.0025   6.9922 ± 0.0011   0.7930 ± 0.0292
−4           2.3282 ± 0.0006   2.1805 ± 0.0025   6.9923 ± 0.0010   0.8165 ± 0.0262
−5           2.3292 ± 0.0006   2.1822 ± 0.0023   6.9925 ± 0.0010   0.8378 ± 0.0162
−6           2.3297 ± 0.0005   2.1833 ± 0.0023   6.9927 ± 0.0011   0.8519 ± 0.0149
−7           2.3301 ± 0.0005   2.1841 ± 0.0022   6.9927 ± 0.0010   0.8616 ± 0.0138
−8           2.3304 ± 0.0005   2.1843 ± 0.0024   6.9927 ± 0.0010   0.8729 ± 0.0109
−9           2.3306 ± 0.0005   2.1847 ± 0.0021   6.9928 ± 0.0010   0.8792 ± 0.0091
−10          2.3307 ± 0.0004   2.1848 ± 0.0025   6.9927 ± 0.0010   0.8836 ± 0.0069
White noise  2.3309 ± 0.0005   2.1851 ± 0.0024   6.9928 ± 0.0010   0.8981 ± 0.0041

Fig. 5 Averaged entropy values: (a) all four entropies (ApEn, BispEn, SamEn, RenyiEn); (b) detail of ApEn, SamEn and RenyiEn. Both panels plot the normalized entropy value against the SNR (dB)


Fig. 6 Boxplots of entropies: a ApEn, b SamEn, c RenyiEn, d BispEn. Each panel plots the normalized entropy value against the SNR (dB)

3.3 Boxplots

Figure 6 shows the boxplots of the four types of entropies, with each result characterized by 200 trials. Each entropy value, shown in the plots, is normalized to the corresponding averaged entropy value of white Gaussian noise.

Figure 6 indicates that the different entropies have different statistical distributions. As depicted in Fig. 6a, b, ApEn has a smaller variance than SamEn. According to Fig. 6c, RenyiEn values are insensitive to the modulated signal when the SNR is less than −5 dB. The BispEn values are below 1, which means they are smaller than the mean BispEn value of white Gaussian noise, and this indicates its good performance at low SNR.

3.4 ANOVA Results

The ANOVA tests in this part indicate whether or not the entropy measures are able to discriminate between a modulated signal in noise and pure noise. To conduct this test we have formed two groups of data. One group contains 200 sets of entropy values which result from white Gaussian noise. The other group is composed of 200 sets of entropy values which were extracted from modulated signals with SNRs in a range from −16 to −5 dB.


Table 2 Entropy ANOVA test results (p value)

SNR (dB)  ApEn             SamEn            RenyiEn         BispEn
−16       0.9134           0.8558           0.3593          0.0698
−15       0.1473           0.5804           0.1547          1.151 × 10^−6
−14       0.7281           0.8990           0.3891          2.581 × 10^−13
−13       0.6116           0.7256           0.6007          2.431 × 10^−25
−12       0.5629           0.2881           0.8630          1.113 × 10^−33
−11       0.0576           0.4141           0.5786          7.871 × 10^−56
−10       2.108 × 10^−5    0.1228           0.1111          3.353 × 10^−86
−9        1.255 × 10^−10   0.0483           0.6812          4.534 × 10^−91
−8        3.012 × 10^−27   3.192 × 10^−4    0.1537          4.726 × 10^−111
−7        6.732 × 10^−44   4.496 × 10^−6    0.0345          4.252 × 10^−127
−6        2.454 × 10^−78   1.977 × 10^−13   0.1017          2.422 × 10^−149
−5        8.294 × 10^−127  2.144 × 10^−29   1.150 × 10^−4   8.658 × 10^−177

Table 2 presents the ANOVA test results for the entropy values. Two groups are used for each ANOVA test: one contains the entropy values of the noise-only signal from the channel and the other the corresponding values of the signal that contains the modulated signal. Each group contains 200 samples. The p values in the table denote whether the difference between the mean values of the two groups is statistically significant or not. BispEn gives a sharp detection at −15 dB while RenyiEn can only show the difference at −5 dB. The same conclusions can be drawn from Fig. 6.

4 Discussion

Entropy is a quantified measure of the randomness, uncertainty and diversity within a signal. Shannon introduced the concept of entropy into information theory [26] and it has since become an important signal feature. Of all possible signals, white Gaussian noise has the highest entropy value [27]. When a signal contains regular patterns, its entropy decreases. This property was utilized to detect radar signals in the presence of white noise [28]. As the signal transmitted by primary users contains structure, its entropy value should be different from that of white Gaussian noise. Therefore, entropy can be used as a feature for spectrum sensing.

Since Shannon first came up with the idea of information entropy, the science of signal processing has made tremendous progress. Now, there is a whole range of different entropy measures, such as approximate entropy, spectral entropy, state entropy and, not to forget, Shannon entropy [15]. Different entropies have different properties which make them more or less suitable for specific applications. A comparative investigation of these entropies helps us to build better spectrum sensing algorithms.

In this paper, we have conducted a comparative study which investigates the performance of ApEn, SamEn, RenyiEn and BispEn for spectrum sensing. Table 1 gives the entropy values extracted from the Monte Carlo simulations. Figure 5 depicts the averaged entropy values. For the different SNR values, the boxplots of each entropy measure are shown in Fig. 6. Table 2 presents the ANOVA test results. Through all these results, we can identify which type of entropy is more suitable for spectrum sensing.

The information we can extract from Fig. 5 and Table 2 is that BispEn is the most effective measure for spectrum sensing. Related research has shown that entropy measures extracted from frequency domain signals outperform similar measures from time domain signals. Therefore, Zhang et al. [8,9] developed a frequency domain detector which has a better performance than a corresponding time domain system. With a detection probability of 0.9 and a false alarm rate of 0.08, the frequency domain detector has a 5 dB performance improvement over the time domain one proposed in reference [7]. BispEn is a kind of spectral entropy, so it shows this advantage in spectrum sensing. This indicates that the structures of modulation signals can be measured better by entropy in the frequency domain than in the time domain.

RenyiEn has a poor performance when the SNR is low and therefore it is not fit for this application. As mentioned before, it was used as a feature for modulation classification [20]. Although it can classify three types of digital modulation schemes, it is not sensitive enough for spectrum sensing when the SNR is low. Therefore, it is not a good feature for spectrum sensing.

Figure 6 presents the variance within the entropy value groups. Compared to SamEn and BispEn, ApEn has a relatively small variation, which means that it is less sensitive to the information source than the other entropies. This helps if a single entropy value is calculated to do the classification. Figure 6d shows that the variance of BispEn is related to the SNR. This relationship needs further investigation.

Shannon entropy has been used to detect an IEEE 802.11a Orthogonal Frequency-division Multiplexing (OFDM) waveform in noise [29]. The specific waveform features are identifiable when the SNR is −5 dB. Chen and Nagaraj investigated the estimated Shannon entropy for an entropy based detector [30]; those entropy values cannot show the difference between H0 and H1, defined in Eq. (5), when the SNR is as low as 0 dB. A cross entropy based spectrum sensing technique has been introduced which performs better than the Shannon entropy [31]. However, compared to the curves we obtained in Fig. 6, the cross entropy curves show only a smaller difference between the values for white Gaussian noise and a low SNR modulated signal.

BispEn has outperformed the other three types of entropies which were investigated in this paper. It can detect the modulated signal when the SNR is as low as −15 dB, which means it is suitable for spectrum sensing. Detectors for spectrum sensing which are based on this feature promise a good performance.

5 Conclusion

This paper presents a comparative study of four types of entropies. The results show us which type of entropy is more suitable for spectrum sensing. The work in this paper is helpful for building a well performing spectrum sensing algorithm. Monte Carlo methods and ANOVA tests show that BispEn has the best performance among the four types of entropies. According to Table 2, BispEn gives at least a 5 dB improvement over the other three types of entropies. This helps us to understand that BispEn is more suitable for spectrum sensing.

The divergence of the performances motivates us to carry out further investigations on the detector. Based on the results we have obtained in this paper, we can study more spectral entropies and build spectrum sensing schemes with better performance. So far we have only carried out a comparative study on the feature itself. As future work, we need to analyze different classification algorithms together with the promising entropy features.

Acknowledgments We are grateful for the help from U. Rajendra Acharya in designing the entropy calculation programs. Finally, we would like to acknowledge that this work was supported by Tianjin Science and Technology Support Key Project Plan (10ZCKFGX01500) and National Key Special Project (2012ZX03004008).

References

1. Haykin, S. (2005). Cognitive radio: Brain-empowered wireless communications. IEEE Journal on Selected Areas in Communications, 23(2), 201–220.

2. Sridhara, K., Chandra, A., & Tripathi, P. (2008). Spectrum challenges and solutions by cognitive radio: An overview. Wireless Personal Communications, 45, 281–291.

3. Federal Communications Commission. (2005). Notice of proposed rule making and order: Facilitating opportunities for flexible, efficient, and reliable spectrum use employing cognitive radio technologies. ET Docket No. 03-108.

4. Gavrilovska, L., & Atanasovski, V. (2011). Spectrum sensing framework for cognitive radio networks. Wireless Personal Communications, 59, 447–469.

5. Budiarjo, I., Lakshmanan, M., & Nikookar, H. (2008). Cognitive radio dynamic access techniques. Wireless Personal Communications, 45, 293–324.

6. Yucek, T., & Arslan, H. (2009). A survey of spectrum sensing algorithms for cognitive radio applications. IEEE Communications Surveys & Tutorials, 11(1), 116–130.

7. Nagaraj, S. V. (2009). Entropy-based spectrum sensing in cognitive radio. Signal Processing, 89(2), 174–180.

8. Zhang, Y., Zhang, Q., & Wu, S. (2010). Entropy-based robust spectrum sensing in cognitive radio. IET Communications, 4(4), 428–436.

9. Zhang, Y. L., Zhang, Q. Y., & Melodia, T. (2010). A frequency-domain entropy-based detector for robust spectrum sensing in cognitive radio networks. IEEE Communications Letters, 14(6), 533–535.

10. Proakis, J. G., & Salehi, M. (2008). Digital communications (5th ed.). New York: McGraw Hill.

11. Harris, R. J. (1994). ANOVA: An analysis of variance primer. Itasca: F. E. Peacock.

12. The MathWorks Inc. (2007). MATLAB version 7.4.0 (R2007a).

13. Fishman, G. S. (1995). Monte Carlo: Concepts, algorithms, and applications. New York: Springer.

14. Pincus, S. M. (1995). Approximate entropy (ApEn) as a complexity measure. Chaos, 5(1), 110–117.

15. Bein, B. (2006). Entropy. Best Practice and Research Clinical Anaesthesiology, 20(1), 101–109.

16. Faust, O., Acharya, U. R., Molinari, F., Chattopadhyay, S., & Tamura, T. (2012). Linear and non-linear analysis of cardiac health in diabetic subjects. Biomedical Signal Processing and Control, 7(3), 295–302.

17. Richman, J. S., & Moorman, J. R. (2000). Physiological time-series analysis using approximate entropy and sample entropy. American Journal of Physiology. Heart and Circulatory Physiology, 278(6), H2039–H2049.

18. Al-Angari, H. M., & Sahakian, A. V. (2007). Use of sample entropy approach to study heart rate variability in obstructive sleep apnea syndrome. IEEE Transactions on Biomedical Engineering, 54(10), 1900–1904.

19. Rényi, A. (1960). On measures of entropy and information. In Proceedings of the 4th Berkeley symposium on mathematics, statistics and probability (pp. 547–561).

20. Tabatabaei, T. S., Krishnan, S., & Anpalagan, A. (2010). SVM-based classification of digital modulation signals. In 2010 IEEE international conference on systems, man and cybernetics (SMC) (pp. 277–280).

21. Nikias, C. L., & Raghuveer, M. R. (1987). Bispectrum estimation: A digital signal processing framework. Proceedings of the IEEE, 75(7), 869–891.

22. Mohebbi, M., & Ghassemian, H. (2012). Prediction of paroxysmal atrial fibrillation based on non-linear analysis and spectrum and bispectrum features of the heart rate variability signal. Computer Methods and Programs in Biomedicine, 105(1), 40–49.

23. Bao, M., Zheng, C., Li, X., Yang, J., & Tian, J. (2009). Acoustical vehicle detection based on bispectral entropy. IEEE Signal Processing Letters, 16(5), 378–381.

24. Tukey, J. W. (1977). Exploratory data analysis. New York: Addison-Wesley.

25. Fisher, R. A. (1918). The correlation between relatives on the supposition of Mendelian inheritance. Philosophical Transactions of the Royal Society of Edinburgh, 52, 399–433.

26. Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379–423; 623–656.

27. Renevey, P., & Drygajlo, A. (2001). Entropy based voice activity detection in very noisy conditions. In Proceedings of the 7th European conference on speech communication and technology, EUROSPEECH 2001 (pp. 1887–1890).

28. Zhang, Q. T. (1989). An entropy-based receiver for the detection of random signals and its application to radar. Signal Processing, 18(4), 387–396.

29. Rehm, C. R., Temple, M. A., Raines, R. A., & Mills, R. F. (2006). Entropy-based spectral processing on the IEEE 802.11a OFDM waveform. In Military communications conference, MILCOM 2006, IEEE (pp. 1–5).

30. Chen, X., & Nagaraj, S. (2008). Entropy based spectrum sensing in cognitive radio. In Wireless telecommunications symposium, WTS 2008 (pp. 57–61).

31. Gu, J., Liu, W., Jang, S. J., & Kim, J. M. (2010). Cross entropy based spectrum sensing. In 2010 12th IEEE international conference on communication technology (ICCT) (pp. 373–376).

Author Biographies

Wanjing Zhu received his B.Sc. and B.Ec. degrees from University of Electronic Science and Technology of China, Chengdu, China. He is currently a Master student in the same university with Prof. Jianguo Ma. He is now doing his internship as a Research Assistant under the supervision of Dr. Oliver Faust in the Digital Signal Processing Centre, Ngee Ann Polytechnic, Singapore. His research interests are Digital Communication, Cognitive Radio (CR), Modulation Recognition and Digital Signal Processing techniques for them.

Jianguo Ma (M’96, SM’97) received the B.Sc. degree from Lanzhou University, Lanzhou, China, in 1982, and the doctoral degree in engineering from Duisburg University, Duisburg, Germany. He was with the Technical University of Nova Scotia (TUNS), Halifax, NS, Canada from April 1996 to September 1997 as a postdoctoral fellow. He was with Nanyang Technological University (NTU), Singapore, from October 1997 to November 2005 as a faculty member, where he was also the founding director of the Center for Integrated Circuits and Systems, NTU. From December 2005 to October 2009, he was with the University of Electronic Science and Technology of China (UESTC), Chengdu, China. He has concurrently been the Technical Director for the Tianjin IC Design Center since November 2008, and since October 2009 he has served as the Dean of the School of Electronic Information Engineering in Tianjin University. He has served as the founding director of the Center for IC & Computing Systems of Tianjin since May 2010. His research interests are: RFICs and RF integrated systems for wireless, RF device characterization modeling, MMIC, RF/Microwave Circuits & Systems, EMI in wireless, RFID & wireless sensing networks. In these areas, he has published about 269 technical papers (120 are in SCI cited journals), six U.S. patents granted and 15 filed/granted China patents, and two books. Dr. Ma served as the Associate Editor of IEEE Microwave and Wireless Components Letters from January 2004 to December 2005. He was one of the founding members of the IEEE Chengdu Section and served as the External Liaison Chair and Technical Activity Chair for the IEEE Chengdu Section; he founded the IEEE EDS Chengdu Chapter. Dr. Ma is a Changjiang Professor awarded by the Ministry of Education of China, and he is also a Distinguished Young Investigator awarded by the National Natural Science Foundation of China. He is a member of the IEEE University Program ad hoc Committee.

Oliver Faust In 2001, Oliver Faust completed his first degree in Germany at a private university of applied science. Subsequently, he received his PhD from the University of Aberdeen (Scotland, UK) in 2006. After that he worked for 2 years as a postdoctoral researcher at the same institution. Subsequently, he moved to Belgium to work for 1 year in a private company as a hardware researcher. Currently, he is visiting faculty at Ngee Ann Polytechnic in Singapore, where he pursues his research interests of digital communication, nonlinear and cognitive signal processing.
