Measurements and Modeling of Primary User
Activity for Dynamic Spectrum Access
Robert L. Jones III
April 29, 2011
Under the direction of Dr. J. Nicholas Laneman
In fulfillment of the Engineering Honors Program thesis requirement
in the Department of Electrical Engineering
at the University of Notre Dame
Abstract
The current wireless spectrum allocation scheme gives individuals or groups the
exclusive right to transmit at certain frequencies. Historically, this allocation scheme was
appropriate, but many argue that it has become inefficient. Most wireless traffic is
concentrated in a few frequency bands, while sizable portions of the spectrum go largely
unused. A dynamic spectrum access policy in which users are able to transmit over
multiple frequencies at different times has been proposed as a solution to this inefficient
allocation. While this policy could be implemented in several ways, the method most
similar to the current system, and therefore the most realistic candidate, is the assignment
of primary and secondary spectrum usage rights. In this new arrangement, primary users
are practically identical to current spectrum owners in that they have the uninhibited right
to transmit at specified frequencies without interference from other users. The only
difference is that secondary users are free to utilize a wireless channel whenever a
primary user is not transmitting. The development of cognitive software radios with the
ability to analyze channel activity and transmit over a wide range of frequencies has
made this primary ownership system feasible. Much recent research has focused on
realizing this system by developing accurate spectrum sensing algorithms for secondary
users to ensure these users can take advantage of transmission opportunities while
avoiding collisions with primary user transmissions. These algorithms assume a certain
Markov model can appropriately capture primary user behavior, but this model’s
accuracy has not been experimentally verified. In this thesis, a software radio-based tool
for aggregating data on a primary user’s transmission behavior is described. The results
of the application of this tool in monitoring public safety band transmissions are
statistically analyzed, and the appropriateness of the assumed primary user model is
called into question in light of this analysis.
Acknowledgements
Many thanks are due to Dr. J. Nicholas Laneman for his helpful and extremely
patient guidance throughout the process that culminated in the writing of this thesis. His
assistance was instrumental in refining my impossibly broad initial research interests,
finding the proper background information, learning the relevant communications
systems analysis skills, focusing on the achievement of manageable goals, clarifying my
interpretation of the numerical results of my research, and writing and editing this thesis,
among many other things. He always provided insightful answers to any questions I
asked. This project would have been impossible without his direction.
Thanks are also due to Glenn Bradford, whose willingness to share his knowledge
of GNU Radio was instrumental to my learning; Zhanwei Sun, who offered helpful
thoughts on improving the detection scheme; and Neil Dodson, who offered practical
advice about spectrum monitoring as well as the use of several helpful resources.
I would also like to thank Dr. Yih-Fang Huang and Dr. Thomas Pratt for agreeing
to take the time to read my thesis and be on the defense committee.
Finally, I would like to thank Dr. Ken Sauer for his facilitation of the Engineering
Honors Program and all the advice, encouragement, and administrative help he has
offered over my entire time at the University of Notre Dame.
Contents
Abstract ..................................................................................................................... 2
Acknowledgements............................................................................................................. 3
I. Introduction ..................................................................................................................... 5
Outline..................................................................................................................... 5
II. Background .................................................................................................................... 7
The allocation problem ........................................................................................... 7
History of the current scheme ................................................................................. 7
Inefficiency in the system ....................................................................................... 8
Possible solutions.................................................................................................... 9
Reallocation of spectrum access rights ....................................................... 9
General unlicensed spectrum access......................................................... 10
White space utilization.............................................................................. 11
Active sensing by unlicensed secondary users ......................................... 12
Focus of this paper ................................................................................................ 13
III. Spectrum Monitoring Tool ......................................................................................... 15
Description............................................................................................................ 15
Development for broad applicability .................................................................... 16
Data collection and processing ............................................................................. 17
IV. Example of Application.............................................................................................. 18
Task-specific customization.................................................................................. 18
Numerical results .................................................................................................. 18
V. Conclusion ................................................................................................................... 24
Implications for secondary user sensing algorithms............................................. 24
Reliability of data and suggestions for improvement ........................................... 24
Suggestions for further exploration ...................................................................... 26
References ................................................................................................................... 27
Appendices
A – Spectrum monitoring tool: monitor.py........................................................... 28
B – Log file parser: log_reader.py ........................................................................ 31
C – MATLAB commands..................................................................................... 32
D – Log file excerpt .............................................................................................. 33
Attachment: United States Wireless Spectrum Allocation Chart ..................................... 34
I. Introduction
In February of 2010, the FCC Chairman, Julius Genachowski, announced a new
FCC goal of opening another 500 MHz of wireless spectrum for mobile broadband use
over the next decade as part of a plan to "bring our spectrum policies into the 21st
century." [1] He also discussed new approaches to spectrum allocation beyond the
traditional static FCC allocation plan, including flexible licenses that can be transferred to
different users, dynamic allocation databases, unlicensed spectrum bands, and
opportunistic spectrum access. Such measures are seen as necessary because the wireless
community at large feels there is not enough spectrum available to keep up with the
rapidly increasing demand for wireless communications. As Genachowski put it, "The
FCC in recent years has authorized a 3-fold increase in commercial spectrum for mobile
broadband. But that increase will not allow us to keep pace with an estimated 30-fold
increase in traffic." [1]
While the FCC has been making efforts to better allocate the wireless spectrum,
researchers have been developing methods to make more efficient use of available
spectrum. One of the approaches that has received a good deal of attention is the
development of active sensing algorithms intended to detect transmissions made by
primary users of a wireless channel. Such algorithms are potentially useful because they
could permit secondary users to transmit over the same channel when the primary user is
not transmitting. The goal of this thesis is to contribute to these research efforts in several
ways. First, a simple but flexible and broadly applicable software radio tool designed to
detect primary user transmissions on a desired channel is described in detail. The
instructions and code necessary for the replication of this tool are provided. Second, the
application of this tool in monitoring the activity of a primary user is discussed. The data
gathered in the process, consisting of the durations of transmissions and idle states on the
channel, are presented in histograms accompanied by information on channel utilization
and the mean durations of idle and busy periods. Third, the data are analyzed and the
creation of a statistical model for primary user activity from these data is discussed. The
validity of the assumption in active sensing algorithms that primary user activity can be
modeled by a two-state Markov chain with geometrically distributed states is called into
question based on this analysis.
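The two-state model at issue can be made concrete: if the channel alternates between busy and idle and, in each time slot, leaves its current state with a fixed probability, then the durations of busy and idle periods are geometrically distributed with means equal to the reciprocals of those transition probabilities. The following Python sketch (the transition probabilities are illustrative values, not measured ones) simulates such a channel and recovers the mean state durations:

```python
import random

def simulate_channel(p_busy_to_idle, p_idle_to_busy, n_slots, seed=0):
    """Simulate a two-state (busy/idle) Markov channel and return the
    durations, in slots, of each completed busy and idle period."""
    rng = random.Random(seed)
    state = "idle"
    durations = {"busy": [], "idle": []}
    run = 0
    for _ in range(n_slots):
        run += 1
        # Probability of leaving the current state this slot.
        p_leave = p_busy_to_idle if state == "busy" else p_idle_to_busy
        if rng.random() < p_leave:
            durations[state].append(run)
            run = 0
            state = "idle" if state == "busy" else "busy"
    return durations

# Under this model, mean idle duration ~ 1/p_idle_to_busy = 20 slots
# and mean busy duration ~ 1/p_busy_to_idle = 5 slots.
d = simulate_channel(p_busy_to_idle=0.2, p_idle_to_busy=0.05, n_slots=100_000)
mean_idle = sum(d["idle"]) / len(d["idle"])
mean_busy = sum(d["busy"]) / len(d["busy"])
```

Histograms of measured idle and busy durations can then be compared against the geometric shape this model implies.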
Outline
In Section II, background information on spectrum allocation is provided and the
inefficiency present in the current allocation scheme is discussed. Several possible
solutions to this inefficiency are discussed, with a primary-secondary user system
utilizing active sensing eventually being deemed the solution with the greatest potential.
The problem of accurately modeling primary user activity emerges as an important
concern. In Section III, the software radio tool for gathering the data necessary for
modeling primary user activity is described. The flexibility and scope of this tool are also
explored. A general method for gathering and organizing data with this spectrum
monitoring tool is laid out as well. In Section IV, the application of the tool to a specific
primary user, the Notre Dame Security Police dispatch channel, is discussed. The
resulting data are presented and statistically analyzed. Section V begins with the
suggestion that, based on the preceding statistical analysis, the validity of the assumed
primary user model in many active sensing algorithms should be called into question. The
reliability of the data collected and possible sources of error are discussed, and then the
thesis concludes with recommendations for further work in this direction.
II. Background
The allocation problem
The physical properties of the electromagnetic spectrum are studied in detail in
electrical engineering at least in part because it is such a useful resource for
transmission of information. Wireless communications have constantly expanded with
new applications in new markets ever since they were first used. Today they seem
ubiquitous. Cellular telephones, television stations, remote controls, radio stations, public
safety radios, computers, headphones, home entertainment systems, satellites, and
countless other applications simultaneously use electromagnetic wave propagation to
move information from one point to another. To accomplish this task, they all have to
utilize some portion of the electromagnetic spectrum, and they must do so without
interfering with each other.
In the United States, in order to minimize interference among all the different
users of the wireless spectrum, the Federal Communications Commission (FCC)
maintains a fixed allocation scheme that determines which specific users or types of users
can transmit over a given portion of the spectrum. The FCC is also responsible for
updating this scheme and enforcing it. A graphical representation of this allocation
scheme can be seen at <http://www.ntia.doc.gov/osmhome/allochrt.pdf> (a copy is
attached following Appendix D, but the page size limitation makes it difficult to read).
The obvious complexity of this scheme partially reflects the immensely varied
community of wireless spectrum users. The number of potential concurrent transmissions
makes a detailed allocation scheme necessary to a certain extent; even if all frequencies
were equally useful, without some initial resource allocation, multiple users would likely
choose to transmit at the same frequencies and create a great deal of interference, thereby
preventing all concerned parties from obtaining any benefit from their use of the
resource. Since the spectrum is theoretically an infinite continuum, though, the number of
users cannot be the only complicating factor. Wireless transmissions are effectively
limited to a certain range of frequencies by the possible oscillation speeds of current
communications equipment, and signal propagation is also physically more favorable at a
subset of the available frequencies. “Good” spectrum is therefore a scarce resource, and it
is this scarcity coupled with the high demand for wireless spectrum that makes the
allocation problem significant.
History of the current scheme
Allocation of spectrum was not a significant problem when the government first
approached it in the 1920s. [2] The only real problem was minimizing interference, and
one very logical way to prevent interference from occurring was to simply assign
different frequency bands to different users. Those users could then invest in the creation
of equipment that operated in a small range of frequencies with the knowledge that they
would not interfere with or be interfered with by others. This was approximately the plan
that was followed, and it continued to be the logical approach for a good deal of time. As
radio stations, television stations, and satellites started broadcasting, they were allotted
portions of the spectrum, and technologies were developed that worked well for each
application.
Over the last few decades, though, with the advent of cellular communications
and the proliferation of wireless devices, available spectrum for new uses has become
almost nonexistent. It has also become correspondingly more valuable. When television
broadcasts in the United States transitioned from analog to digital (a process completed in June 2009), for example, blocks of spectrum spanning 698 MHz to 806 MHz became open for reallocation. The
FCC auctioned off the access rights to some of this spectrum, and the winning bids
totaled $19.592 billion. [3] Demand for this spectrum was clearly very high, and the
current public fixation with wireless communications and information on-demand
provides good reason to believe the wireless spectrum will continue to become more
valuable as a resource. This poses problems for any users looking to gain initial or
additional spectrum usage rights. It also means that the space for general unlicensed use,
such as the bands exploited by WiFi, Bluetooth, and many other consumer wireless
devices, will likely continue to be minimal despite its heavy utilization, since by its
nature this sort of space has no direct owners willing to commit financial resources to its
expansion.
Inefficiency in the system
It might be assumed that the steady increase in demand for wireless spectrum has
been accompanied by increased efficiency in spectrum utilization. It is far from
clear that this is the case, however. In 2004, for example, the power spectral density of
the wireless spectrum in Berkeley, CA from DC to 6GHz was recorded over a 50µs
period. The results are shown in Figure 1. [4]
Figure 1: Variability of measured wireless transmission power, 300MHz-3GHz
It is clear that some bands are relatively heavily utilized, at least in terms of the
power transmitted, while others appear to sit generally idle. This relationship does not by
itself determine how efficiently the spectrum is used; the heavily used bands might be
capable of carrying a good deal more traffic with a different transmission scheme, as with
code-division multiplexing, for example, while the apparently idle bands might be used
only occasionally because they must be available for very important transmissions, as
might be the case in some military communications. Nonetheless, it seems highly
unlikely that all of the idle spectrum is really necessary for high-priority use and equally
unlikely that heavily used bands are generally inefficiently utilized. A driving force
behind the demand for spectrum is desire to transmit large quantities of data, a goal that
seems to be realized best by heavily utilizing available spectrum. It seems fair, then, to
assume that the small pockets of heavy spectrum use and the relatively widespread idle
bands provide evidence of a certain amount of inefficiency in spectrum utilization.
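Measurements like the one in Figure 1 are generally produced by estimating the power spectral density of sampled baseband data. As a rough illustration (this is not the procedure used in [4]; the sample rate and test signal here are synthetic placeholders), a windowed periodogram estimate in Python:

```python
import numpy as np

def periodogram_db(samples, fs):
    """Estimate the power spectral density (in dB) of complex baseband
    samples via a simple Hann-windowed periodogram."""
    n = len(samples)
    window = np.hanning(n)
    spectrum = np.fft.fftshift(np.fft.fft(samples * window))
    psd = (np.abs(spectrum) ** 2) / (fs * np.sum(window ** 2))
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1.0 / fs))
    return freqs, 10 * np.log10(psd + 1e-20)

# Synthetic example: a tone at 100 kHz in weak noise, sampled at 1 MHz.
fs = 1e6
n = 4096
t = np.arange(n) / fs
rng = np.random.default_rng(0)
noise = 0.01 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
x = np.exp(2j * np.pi * 100e3 * t) + noise

freqs, psd_db = periodogram_db(x, fs)
peak_freq = freqs[np.argmax(psd_db)]  # should land near the 100 kHz tone
```

A sweep of such estimates across center frequencies yields the kind of occupancy picture shown in Figure 1.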
This inefficiency appears to be due, at least in part, to the inability of a centrally
regulated policy to readily adapt to changes in spectrum demand and technology. Since
the spectrum is so strictly allocated, it is easy to see that some wireless applications use
the spectrum more efficiently than others do. The FCC’s allocation scheme, then, must
bear some of the responsibility for the inefficiency in spectrum utilization. The spectrum
would be utilized more heavily and more efficiently if access rights were distributed
using a method that more closely reflected the demand for this resource. Further, the
allocation scheme is not temporally or geographically flexible. Even where it might allow
efficient use during the day or in a city, for example, the allocation scheme might not be
appropriate for the different demands at night or in a rural area. Researchers could
plausibly experiment with new technologies at night using vacant frequencies, while
people in a remote area with no cellular coverage could put those unused bands to some
other use. In general, the FCC’s wireless spectrum allocation scheme promotes
inefficiency, partly because of the great advances in technology since its inception.
Possible solutions
Various solutions could be employed to address the problem of inefficient spectrum
allocation. Each has the potential to increase efficiency in some
way, but they all have drawbacks as well. Four of these solutions – reallocation of
spectrum access rights, general unlicensed spectrum access, white space utilization, and
active sensing by unlicensed secondary users – are discussed in detail in this section.
Reallocation of spectrum access rights: One possibility is a major restructuring
of the current scheme. In this solution, the FCC would redistribute spectrum rights to
reflect the high demand of some users and groups of users while consolidating seldom-used spectrum. More of the spectrum would then be open to those applications that use it
heavily, and this would increase the efficiency of spectrum utilization, at least in the
sense that the plot in Figure 1 would show higher levels of activity over a broader range
of frequencies. Low-use applications would still have enough of the spectrum available to
maintain a low likelihood of interference. This solution could potentially be effected
relatively quickly. No new technologies would be necessary, and the heaviest users
would be able to keep the spectrum rights they already had, so investment in new
equipment would be kept to a minimum and would only be necessary for those users
gaining new spectrum ownership. On the regulatory side, the FCC could use the method
of auctioning newly freed spectrum to the highest bidder in order to reallocate spectrum,
as it did when analog television broadcasts ended. The transition to the new system thus
has the potential to be short, given proper action by both spectrum users and regulators.
This solution is not without its pitfalls, though. The recent transition to digital
television broadcasting showed that spectrum reallocation can take a great deal of time to
take effect in practice, however quick it might be in theory. Further, an auction would
effectively limit spectrum ownership to a small number of companies and industries with
extensive financial resources, and these users might not be the ones that would most
efficiently utilize the spectrum. Amateur radio operators, independent plane pilots,
independent marine operators, and other such groups without powerful organizations
would likely be deprived of a lot of their spectrum access rights by such a process.
Similarly, unlicensed use would have minimal support. Bands that allow unlicensed use
are some of the most heavily utilized due to the proliferation of technologies like WiFi
and Bluetooth, but without financial backing they would probably not be expanded.
Innovation would continue to suffer because no new space would be opened to pursue
new technologies. It would also be difficult for the FCC to determine to what degree low-use spectrum should be consolidated. Military and public safety bands, for example,
might be utilized less than cellular telephone bands, but the danger of allocating too little
spectrum to the former is arguably much greater than that of not expanding the latter as
much as might be desired. The biggest issue, though, is that a simple reallocation would
essentially be a continuation of the current system, and it would therefore be open to
exactly the same problems in the future. After some time, new technologies might reduce
the demand of those applications that are currently the most spectrum-hungry, and the
greatest demand for spectrum might come from applications that have not yet been
conceived. The allocation scheme would once again require restructuring to promote
efficiency.
General unlicensed spectrum access: Another solution with a good deal of
potential is almost the polar opposite of reallocation. Instead of giving more spectrum
ownership to some users, the FCC could remove most or all spectrum access rights
entirely. The new system would consist of unlicensed use that could be negotiated in a
few ways. The most likely candidates would be a peer-to-peer system with all users
determining the appropriate frequencies collaboratively and a more centralized approach
somewhat akin to the WiFi protocol with a dedicated router dynamically assigning
frequencies to clients. This solution is the most appealing in a purely economic sense.
The rights to spectrum access would be entirely open to all users. The interactions
between users would essentially be governed only by demand and the scarce supply of
spectrum. Theoretically, this is exactly the sort of situation that leads to the most efficient
use of a given resource. No class of users would be favored, so all the groups that would
be at a disadvantage in the reallocation solution discussed above would have equal
spectrum access rights. Geographical and temporal concerns would be immaterial
because all users would follow the same procedure to gain spectrum access regardless of
the circumstances. Research and innovation could proceed almost ideally because almost
any desired frequency could be utilized. The nature of this plan would also incentivize
the development of new hardware to open up access to more frequency space and the
further development of communication algorithms to minimize spectrum use.
This solution faces many practical problems, though. The first that should be
considered is the “tragedy of the commons,” a situation in which self-interested users’
actions combine to deplete a commonly held resource, eventually preventing all users
from utilizing the resource to the extent they desire. This is a dangerous but real
possibility in any situation in which there is no ownership of a readily accessible
resource. A closely related problem is that self-interested use in a spectrum commons
would render many current technologies, including any dedicated broadcasts like radio,
television, and satellite, useless. All of these transmissions would be subject to a great
deal of interference unless every single wireless device attempting to broadcast had
access to a detailed database of at least the location and frequency of every dedicated
broadcaster, as well as GPS location capabilities to determine which geographical
information was appropriate. The creation, maintenance, and distribution of such a
database would be difficult, but determining which broadcasters counted as “dedicated”
and could therefore retain their spectrum ownership would be a controversial logistical
barrier as well.
Another problem with this plan would be the necessity of including cognitive
radio capabilities in every wireless device. Cognitive radio refers to the concept of
individual devices actively choosing the best frequency for their transmissions based on
observations of local spectrum usage. Many wireless devices are designed to be small,
mobile, and power efficient, but current cognitive devices are too computation- and
power-intensive to meet these goals. The size and cost of the basic radio hardware
necessary for transmitting and receiving over a broad range of frequencies also limits the
practicality of this solution. Additionally, the spectrum analysis capabilities and
interference minimization algorithms necessary for this system would require a good deal
of time to develop on such a large scale, so the transition would not occur quickly and the
inefficiency problem would thus persist for some time.
Another important roadblock would be the current beneficiaries of dedicated
spectrum access rights. Users who paid billions of dollars for their spectrum access
rights, and those who have traditionally had unlimited access rights even without having
to pay for them, would be quite unwilling to abandon a system so advantageous to them
and in which they invested so much.
White space utilization: Another solution is a much less dramatic version of the
unlicensed spectrum access solution discussed above: unlicensed use in white spaces.
White spaces are the blocks of spectrum allocated for a specific use like television
broadcasting that go unused in a given geographical area. There might be a television
channel 5 in one metropolitan area while the neighboring area has a channel 6 instead; in
this case, broadcasting on channel 6 in the first area and channel 5 in the second area
would cause no interference whatsoever. The allocated frequencies are still respected and
treated as they would be in another setting, however, when a broadcast is present. Instead
of freeing all frequencies from the allocation scheme, this solution is more targeted and
most directly addresses the geographical inefficiency problem. In a sufficiently localized
region where a block of spectrum is allocated for a certain use but is never actually
utilized in the intended fashion, it makes sense to open this spectrum up to other users.
Any areas of the spectrum with white space would then benefit from increased utilization
and therefore increased efficiency.
This solution might be approached in two different ways. In one form, it would
require the sort of database of “dedicated broadcasters” discussed above. The process of
delimiting these dedicated broadcasters would be much simpler and less controversial,
however, than it would be in a situation where that classification would have the effect of
maintaining access rights in an otherwise open market. But the database would still
require a good deal of work to create and maintain, and the problem of ensuring that all
devices had accurate copies or easy access to it would still remain. Again, GPS
capabilities would also be necessary in every device. White space utilization could also
be accomplished by actively sensing spectrum use in order to locate and avoid interfering
with dedicated broadcasts. This solution could eliminate the need for the database and
location sensing, but devices would require the bulk and computing power inherent in
cognitive radio devices as described above. In either case, the solution would not be ideal
for mobile devices, but it would be very manageable for permanent or semi-permanent
devices for which size, computing power, and power consumption are not primary
concerns. The use of white spaces would certainly increase spectrum efficiency either
way. The FCC has recognized this, in fact, and has chosen the geographic database
solution in allowing unlicensed use in TV white spaces. [5]
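In miniature, the geographic database approach amounts to a lookup: a device reports its coordinates, and the database returns the channels with no protected broadcaster within a protection radius. The broadcaster entries, coordinates, and radius below are invented purely for illustration and do not reflect the actual FCC database:

```python
import math

# Hypothetical protected broadcasters:
# (channel, latitude, longitude, protection radius in km)
PROTECTED = [
    (5, 41.70, -86.24, 80.0),
    (6, 41.88, -87.63, 80.0),
]

def km_between(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance via the haversine formula."""
    r = 6371.0  # Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def available_channels(lat, lon, all_channels=range(2, 14)):
    """Return channels with no protected broadcaster whose protection
    radius covers the device's location."""
    busy = {ch for ch, blat, blon, radius in PROTECTED
            if km_between(lat, lon, blat, blon) < radius}
    return [ch for ch in all_channels if ch not in busy]

# A device located at the first broadcaster sees channel 5 excluded,
# while the distant channel 6 broadcaster does not block it.
chans = available_channels(41.70, -86.24)
```

The engineering burden lies not in this lookup but in keeping the database accurate and ensuring every device can locate itself and reach it.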
In the same FCC release that finalized the geographical database method of
unlicensed white space use, the possibility of using active sensing was also discussed. It
was ultimately not adopted, but the plan came with the recommendation that researchers
continue to pursue spectrum sensing technology: “While we are eliminating the sensing
requirement for [white space devices], we are encouraging continued development of this
capability because we believe it holds promise to further improvements in spectrum
efficiency … and will be a vital tool for providing opportunistic access to other spectrum
bands.” The ultimate solution to the spectrum inefficiency problem envisioned by the
FCC thus appears to be one in which unlicensed users may transmit in normally licensed
spectrum bands when the licensed users are not transmitting.
Active sensing by unlicensed secondary users: The active sensing solution has
the greatest potential to increase spectrum efficiency given the current allocation scheme.
In this system, current spectrum owners would keep their access rights essentially as they
currently exist. Whenever these primary users transmit information, they have a spectrum
band designated for that use, and they can expect to transmit without interference from
other users. The new system would be different in that, when primary users are idle,
unlicensed secondary users could opportunistically transmit over the primary users’
frequencies. This approach would address all of the inefficiencies present in the current
system while avoiding many of the logistical issues present in the simple unlicensed use
case. It would allow secondary users to access any idle spectrum – on frequencies that are
seldom used, in geographical areas where the licensed primary users are not broadcasting,
and at times when primary users are not broadcasting. One of the most important benefits
of this system is that it is not affected by the primary allocation scheme. The current
scheme could therefore remain in place. This is important because primary users who
have heavily invested in acquiring spectrum access rights and developing equipment and
technologies specific to their allocated bands of spectrum would retain essentially the
same transmission rights and would lose none of the fruits of their investments; this
would not be the case in any system that reallocated spectrum access rights in a
significant way. Another benefit would be the immediate opening of access to an
enormous amount of spectrum for every kind of user. Research and innovation could
proceed on a scale and at a pace that is entirely impossible in the current system.
Unlicensed consumer products would benefit from access to frequencies with much more
favorable physical propagation, and the move away from the crowded bands in which
many of them currently operate would mean much less interference from similar devices.
Those devices currently operating in the industrial, scientific, and medical (ISM) bands
must also accept any interference they receive, so their transmissions could become more
efficient in new bands with less interference.
There are a few potential issues with the active sensing system as well.
Enforcement could become extremely difficult if primary users attempted to monopolize
their access rights by broadcasting a dummy signal when they are idle. Secondary users
might try to use similar methods as well to keep others from transmitting at a certain
frequency when the primary user is idle, thereby effectively giving themselves de facto
ownership rights. It would also be extremely difficult to ensure that all unlicensed users
were strictly avoiding interference given the sheer number of wireless users and the
number of possible causes of interference. Primary users as well as secondary users
would likely have to be proactive in cataloging interference and reporting it, and the FCC
would need a system to handle complaints and punish violators. Another problem would
be the temptation for licensed primary users to expand into different frequency bands and
participate as secondary users in an attempt to secure more spectrum access. The biggest
issue, though, would likely be the initial development of active sensing itself for
secondary users. Given the great diversity of primary user applications, the problem of
determining whether a primary user is transmitting is difficult enough. Predicting when
transmissions will resume adds another layer of complexity to the problem. Accounting
for the activity of other secondary users makes it even more complicated. And, of course,
the concerns about space, computing, power, and hardware requirements discussed above
are still applicable.
Focus of this paper
Despite all of the possible difficulties associated with it, the wireless community
seems to have adopted dynamic spectrum access with active sensing for unlicensed
secondary users as a promising direction for wireless policy. Dynamic spectrum access
refers to a broad class of approaches to spectrum access in which devices are not bound
to a dedicated frequency band, as opposed to the current static access scheme. Given its
potential for increased efficiency, the minimal logistical problems that would be involved
in the transition to this system, and the FCC’s encouragement of the development of
active sensing technologies, the drawbacks of adopting this system are arguably
outweighed. The active sensing problem therefore requires, and is already receiving,
significant attention from researchers.
Many different approaches to managing secondary user channel activity have
been proposed, implemented, and analyzed [6], [7], [8], [9]. Great progress has been
made towards minimizing interference and maximizing efficiency given an assumed
generic model of primary user transmissions. The most successful studies assume a
primary user model similar to the one described in [10]. The primary user’s activity is
modeled as a Markov chain with two states: “busy” and “idle.” Each state has an
associated random variable that describes the amount of time the primary user remains in
that state. These two random variables are assumed to be distributed geometrically (in
discrete time) or exponentially (in continuous time), and their expected values are related
to the probabilities in the transition matrix that specifies the Markov chain. (For the sake
of brevity, only geometric distributions will be explicitly considered in this thesis, but
with the implication that similar results hold for exponential distributions since they are
analogous.) The model also takes as a parameter the probability that the state sequence
begins in one state or the other. This model seems appropriate for the task; a primary user
should indeed occupy either a “busy” or an “idle” state, and it makes sense that
transmission lengths are random to a degree and could therefore be characterized by a
statistical distribution. The model is general enough to account for different types of
primary users, as their transmissions would simply be characterized by distributions with
different parameters. These parameters are also supposed to be based on observations
made directly by the secondary users, so they should conform relatively well to the
primary user’s actual behavior. Given the presence of noise, though, the Markov process
cannot be directly observed, so it is termed a “hidden Markov model.”
To date, the assumptions made about the Markov process underlying the hidden
model have been accepted as valid with almost no direct verification. One study finds
that, in a certain situation, the Viterbi algorithm can be used to determine the actual states
of a hidden Markov model from noisy observations with a mean success rate of about
80% [11]. This is far from ideal for minimizing interference, though, and the result is
based only on a simulated sequence generated with known parameters rather than on
actual observations of a primary user.
If the secondary user sensing algorithms currently being developed are to be
successfully deployed, it must be shown that active spectrum sensing can be practically
accomplished, and the assumptions underlying the sensing algorithms must be verified.
The remainder of this thesis describes the development of a spectrum monitoring tool that
demonstrates the ease of sensing channel activity, discusses the results of this tool’s
application to a specific primary user’s frequency, and uses these results to challenge
some assumptions typically made in sensing algorithms.
III. Spectrum Monitoring Tool
Description
In order to approach the problem of characterizing primary user activity, a tool for
monitoring spectrum use was needed. A generic version of such a tool was created with
the ability to determine when transmissions are occurring over a given wireless frequency
and record the times and lengths of these transmissions. This monitoring tool consists of
both hardware and software components.
The radio hardware is a Universal Software Radio Peripheral, or USRP. Several
versions of the USRP exist; the USRP1 is used in this application. The USRP1 is a device
with four 12-bit analog-to-digital converters operating at 64 mega-samples per second,
four 14-bit digital-to-analog converters operating at 128 mega-samples per second, four
daughtercard expansion ports, and USB connectivity. For this application, a WBX
daughtercard was installed in the USRP. The WBX board is a transceiver with a
frequency range of 50 MHz to 2.2 GHz [12]. Because of this broad range, the board has
no dedicated antenna but instead a generic SMA antenna connector. The spectrum
monitoring tool utilizes only the receive capabilities of the board. The USRP, with the
WBX board installed, is connected via USB to one of the Intel-based iMac computers
available in the Wireless at Notre Dame (WAND) lab and running Ubuntu 10.04. This
computer runs software specifically designed to drive the USRP hardware and process
the resulting data stream.
The software was created using the GNU Radio toolkit, a freely available set of
software radio blocks and related functionality [13]. A program called GNU Radio
Companion was used to facilitate the creation of a task-appropriate system of GNU Radio
blocks with the aid of a visual flowgraph abstraction. A screenshot of the GNU Radio
Companion flowgraph created for this application can be seen in Figure 2. In this image,
the USRP Source block corresponds to the digital data stream coming from the radio
hardware. The Simple Squelch block is a software implementation of a squelch that
calculates the instantaneous power in the signal and compares it to a threshold. The result
of this comparison is used to determine whether the data will be passed to the next block.
If the power level is over the threshold, the signal is passed through, but if not, no signal
is passed. The final block is a Waterfall Sink, which generates a real-time spectrogram of
the input data in a separate window. This flowgraph is arranged so that the waterfall plot
will show no power at any frequency unless the signal from the USRP exceeds the
specified power threshold, in which case the waterfall plot will simply display a
spectrogram of the input signal. This tool effectively emulates a hardware receiver
equipped with a squelch in that, given an appropriate threshold, the signal is only passed
to the waterfall plot for output when a real transmission is made on the channel.
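The thresholding behavior of the squelch block can be emulated in a few lines of plain Python. This is only an illustrative sketch of the power-comparison logic, not the actual GNU Radio implementation; the averaging window length and threshold value are arbitrary:

```python
def squelch(samples, threshold_db, avg_len=4):
    """Pass samples through only when their short-term average power
    exceeds a dB threshold; otherwise emit zeros (muted)."""
    threshold = 10 ** (threshold_db / 10.0)  # convert dB to linear power
    out = []
    window = []
    for s in samples:
        window.append(abs(s) ** 2)           # instantaneous power |s|^2
        if len(window) > avg_len:
            window.pop(0)
        avg_power = sum(window) / len(window)
        out.append(s if avg_power >= threshold else 0.0)
    return out

# Example: a weak (noise-like) segment followed by a strong burst;
# the noise is muted and the burst is passed through.
sig = [0.01] * 4 + [1.0] * 4
print(squelch(sig, threshold_db=-10))
```

In the real flowgraph this decision happens inside the Simple Squelch block on the complex baseband stream; the sketch simply makes the threshold comparison explicit.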
In general, when working with GNU Radio software tools, the GNU Radio
Companion interface is unnecessary. When a flowgraph is created and executed in GNU
Radio Companion, a Python script is generated. This script instantiates all the blocks
used in the flowgraph, makes the appropriate connections between them in software to
properly process the signals, and then runs all the computation. The GNU Radio
Companion button used to execute the flowgraph is simply a graphical wrapper
for a tool that ensures that the script reflects the most up-to-date version of the flowgraph
and then makes a system call to run that script. The script can therefore be run without
GNU Radio Companion being open or even installed on the computer. It can also be
edited exactly like any other Python script. GNU Radio Companion was used for the
creation and verification of the spectrum monitoring tool, but the graphical interface was
then abandoned. The waterfall plot visualization is unnecessary for simple data
collection, so it was eventually disconnected in the Python script.
Figure 2: GNU Radio Companion flowgraph for the spectrum monitoring tool
Development for broad applicability
The spectrum monitoring tool was designed to have very broad applicability. A
power detection scheme was chosen because it can be used to recognize any kind of
transmission, regardless of modulation format or encoding. Transmitted signals must
have more average power than the background noise of the channel if they are to be
reliably detected by any receiver. The monitoring tool was designed to take advantage of
this fact in order to record all of the information needed to characterize primary user
activity; it does not need to decode messages to determine whether a transmission is
being made. The vulnerability of the power detection scheme is that it will result in false
positive detections when a primary user broadcasts a pilot tone or dummy signal when
idle.
The tool was designed in software radio due to the extreme flexibility of this
format. GNU Radio provides a set of resources that is more than capable of meeting the
challenge of detecting transmissions by comparing their power levels to a threshold. All
of the necessary parameters of the tool are adjustable with a quick text replacement in a
Python script: the operating frequency, receiver gain, sampling rate, and squelch
threshold are all set in the calls to the constructors of the blocks. Once data are actually
generated, given the computer-based environment, they can easily be stored in files that
can be repeatedly accessed, manipulated, and analyzed. The software radio format also
allows for the easy introduction of many additional tools for data verification or more
detailed data collection.
The hardware used is also extremely flexible. The USRP serves as a bridge
between the GNU Radio software and any USRP daughtercard. The daughtercard is the
only component that actually limits the range of frequencies that can be monitored. In the
work done for this paper, the WBX daughtercard was chosen because of its wide
operating range (50 MHz–2.2 GHz). Another daughtercard with a different range could
easily be used in its place, and the card installation process takes no more than a few
minutes. USRP daughtercards are generally manufactured with a standard antenna
connector as well so that antennas appropriate for the desired frequency can be used. Any
desired frequency could theoretically be monitored with this tool, given the appropriate
hardware.
Data collection and processing
As was discussed earlier, the assumed primary user model in need of verification
is a two-state Markov chain. The key parameters are the expected values of the “busy”
and “idle” durations, which were assumed to be geometrically distributed, and the initial
distribution of states. The easiest way to determine these parameters is to simply
aggregate data about when transmissions are detected and how long they last. The results
can then be used to analyze the statistical distributions of both the busy and idle
durations. If these distributions are indeed geometric, as assumed, then the expected
value of each can be determined and translated into the transition matrix.
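For a geometric duration with mean m (in time steps), the per-step probability of leaving the state is 1/m, so the measured mean busy and idle durations translate directly into a transition matrix. A minimal Python sketch of this translation, assuming a one-second time step and a [busy, idle] state ordering (both assumptions of this illustration):

```python
def transition_matrix(mean_busy, mean_idle):
    """Convert mean geometric state durations (in time steps) into the
    2x2 Markov transition matrix
        [[P(busy->busy), P(busy->idle)],
         [P(idle->busy), P(idle->idle)]].
    A geometric duration with mean m implies a per-step exit
    probability of 1/m."""
    p_leave_busy = 1.0 / mean_busy
    p_leave_idle = 1.0 / mean_idle
    return [[1 - p_leave_busy, p_leave_busy],
            [p_leave_idle, 1 - p_leave_idle]]

# Using the means measured later in this thesis (4.26 s busy,
# 46.98 s idle) with a 1 s time step:
print(transition_matrix(4.26, 46.98))
```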
Since the durations of busy and idle times were the desired data, the spectrum
monitoring tool had to be modified to record them in a usable way. These durations, as
well as the relative frequency of busy and idle states, can all be determined if the start and
end times of every transmission are known. To that end, the Python script for the
spectrum monitor was modified to wait for triggers from the squelch block. If the signal
power exceeded the threshold and the block began to pass data (an internal state of
“unmuted”), the time this occurred was recorded in a log file. When the signal power
subsequently dropped below the threshold and the block stopped passing data (an internal
state of “muted”), the time was again recorded. The differences between these times were
also recorded as a convenient way to keep track of busy and idle time durations,
eliminating the need for post-collection calculations to determine these durations. An
example of a log file can be seen in Appendix D.
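The logging logic described above can be sketched as a simple state machine over power readings. This is an illustrative reconstruction, not the actual modified script; the sample format and threshold are assumptions:

```python
def log_state_changes(power_samples, threshold):
    """Turn a sequence of (timestamp, power) pairs into
    (state, start_time, duration) records, mirroring the monitoring
    script's log of squelch mute/unmute events."""
    records = []
    state = "idle"
    state_start = None
    for t, p in power_samples:
        if state_start is None:
            state_start = t
        new_state = "busy" if p >= threshold else "idle"
        if new_state != state:
            # State change: close out the previous state's record
            records.append((state, state_start, t - state_start))
            state, state_start = new_state, t
    return records

# Hypothetical one-sample-per-second power readings: 3 s idle, 2 s busy
samples = [(0, 0.1), (1, 0.1), (2, 0.1), (3, 5.0), (4, 5.0), (5, 0.1)]
print(log_state_changes(samples, threshold=1.0))
```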
Once the desired amount of data has been collected, it must be consolidated and
analyzed. A Python script was created to parse the logs created by the spectrum
monitoring tool (see Appendix B for code). It uses regular expressions to locate and
extract the busy period durations and place them in a list in a separate file, and it also
does this for the idle period durations. These new files lack the metadata compiled in the
logs, but as they consist only of lists of numbers, they can be easily imported into a
mathematical analysis program like MATLAB. Such a program can then be used to
manipulate the data in many ways. Of particular interest in this application is the creation of
histograms to show the distribution of idle and busy durations. These histograms, with
appropriate bin sizes, should each approximately reflect the shape of a geometric
distribution if the assumed model is appropriate for predicting primary user activity. The
mean of each dataset can also be calculated. The mean can be used to generate an
estimated geometric distribution fit for the data. This generated fit can then be plotted
against the histogram for comparison. If the histogram does not match the expected
distribution, the data can also be compared to different distributions in the hopes of
finding a better fit.
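A minimal sketch of the regex-based extraction is given below. The log line format here is invented for illustration; the real format appears in Appendix D, and the actual parsing script is in Appendix B:

```python
import re

# Assumed (hypothetical) log format for this sketch:
LOG = """\
unmuted at 100.0
muted at 104.0 duration 4.0
unmuted at 150.0 idle 46.0
muted at 155.5 duration 5.5
"""

def extract_durations(log_text):
    """Pull busy and idle durations out of the log text with regular
    expressions, returning them as two separate lists of floats."""
    busy = [float(m) for m in re.findall(r"duration ([\d.]+)", log_text)]
    idle = [float(m) for m in re.findall(r"idle ([\d.]+)", log_text)]
    return busy, idle

busy, idle = extract_durations(LOG)
print(busy)  # durations of detected transmissions
print(idle)  # durations of gaps between transmissions
```

The resulting bare lists of numbers can then be written to files and imported into MATLAB or any other analysis environment, as described above.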
IV. Example of Application
Task-specific customization
In order to collect an initial sample of data with the spectrum monitoring tool, an
appropriate frequency had to be chosen for monitoring. The Notre Dame Security Police
(NDSP) dispatch frequency, 460.625 MHz, was chosen for several reasons. Because of
the organization’s local nature, its power levels were likely to be quite high relative to the
background noise, making it easy to detect transmissions. The choice of a primary user
making a simple FM broadcast was also helpful in determining the proper detection
threshold since the detector’s accuracy could quickly be determined by comparing its
output to that of a simple FM receiver. Another reason for this choice was that most
transmissions were likely to be made from different locations around the campus, and the
officers carrying the radios were likely to be moving during at least some of the
transmissions. As a result, the transmissions could be seen as mobile in nature, making
the data collected relevant for more complex situations than those found in TV white
space access with stationary transmitters. Finally, since police are dispatched to handle
unexpected and unscheduled incidents, this frequency choice ensured that the data
collected would have a significant element of randomness. Although this frequency
choice was not expected to produce data representative of or applicable to every possible
primary user, it was seen as a good starting point for the verification of the assumed
primary user model given that a single frequency had to be chosen.
The broad frequency range of the WBX daughtercard installed in the USRP has
already been discussed. It should not be surprising, then, that the selected frequency,
460.625 MHz, falls comfortably within this range of 50 MHz–2.2 GHz. Because the WBX
has no antenna of its own, though, an appropriate antenna was necessary. The best match
available in the lab was a tri-band whip antenna with one center frequency at 440MHz.
Continuous testing over a four-hour period verified that this antenna produced far more
reliable data than the others available. The results of the spectrum monitoring tool, both
in log records and on the waterfall plot, corresponded well to the audio output of an FM
receiver tuned to the same frequency only when the tri-band antenna was used. At exactly
the times the FM receiver began to play audio, log records indicated a transmission had
begun and the waterfall plot also began to display a non-empty spectrogram. A stopwatch
was used to determine the duration of transmissions, and these measured durations
corresponded well to those recorded in the logs. The spectrogram on the waterfall plot
also resumed its zero-power state when transmissions ceased. A sizeable range of receive
gain and threshold settings produced similarly desirable results, so a receive gain of 5 dB
and a threshold of 8 dB, which fell approximately in the middle of that range, were
selected. When other antennas were used, regardless of the choice of gain and threshold,
some transmissions were not properly detected when they commenced, others appeared
to have been interrupted even though they were not, and there were many instances of
false detections. It was thus determined that the tri-band antenna was the most
appropriate for the application.
Numerical results
The spectrum monitoring tool described above was used to collect transmission
data on the NDSP dispatch frequency of 460.625 MHz from March 5, 2011 to April 25,
2011. This collection occurred continuously, with the exception of seven interruptions of
approximately ten minutes each to verify continued proper operation of the tool. The
specific Python script used to run the tool can be seen in Appendix A. Once the data
collection process was completed, the Python log parsing script in Appendix B was used
to extract the relevant busy and idle durations from the log files and save them in lists
that could be easily imported into MATLAB. The MATLAB commands in Appendix C
were used to import the data sets, calculate various metrics, plot them in histograms, and,
where appropriate, superimpose a statistical distribution on the histograms for
comparison. In total, 3,950,389 seconds of idle time and 357,849 seconds of busy time
were recorded, so the data suggest that the channel was in use for about 8.3% of the
observation period. The mean of the recorded idle durations was about 46.98 seconds and
the mean of the recorded busy durations was about 4.26 seconds.
Detailed results for the idle durations can be seen in Figure 3. The histogram
depicts a decaying relationship that suggests the assumption of a geometric distribution
could be appropriate. A geometric distribution is described by Pr(X = k) = p(1 - p)^(k-1),
k = 1, 2, 3, … The expected value of such a distribution is given by 1/p. The maximum
likelihood estimation of p from the data is achieved by equating this value with the
sample mean of the data. This gives 1/p = 46.98201776815765, or p = 0.02128473930036.
In Figure 4, the geometric distribution generated by this value of p is plotted on a log
scale against the histogram data, which has been normalized by the sum of all of the idle
times. It is obvious that this geometric fit does not represent the data well at all. Another
fitting approach is suggested by the form of the geometric distribution. Since Pr(X = 1) =
p(1 - p)^(1-1) = p, the initial values of the data and the fit line can be coordinated by setting p
equal to the normalized value of the histogram data at t = 1, or p = 0.01332197917724. The
resulting fit line can be seen, again on a log scale, in Figure 5. Once again, it is obvious
that this is not an appropriate fit. A gamma distribution might also be considered as a
candidate, especially since it has two parameters instead of one and is therefore a more
flexible distribution. This model is more difficult to generate, though, and it is not the
model assumed in active sensing algorithms, so finding a gamma distribution to fit the
data is beyond the scope of this thesis.
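The maximum likelihood fit described above amounts to inverting the sample mean. A short Python sketch of both the estimate and the resulting probability mass function (the example input is the idle-duration mean reported above):

```python
def geometric_mle(durations):
    """Maximum-likelihood estimate of p for a geometric distribution
    Pr(X = k) = p * (1 - p)**(k - 1), obtained by equating 1/p with
    the sample mean of the observed durations."""
    mean = sum(durations) / len(durations)
    return 1.0 / mean

def geometric_pmf(p, k):
    """Probability that a geometric duration lasts exactly k steps."""
    return p * (1 - p) ** (k - 1)

# With a sample mean of about 46.98 s, the estimate is close to the
# value of p computed in the text:
p = geometric_mle([46.98])
print(p)
print(geometric_pmf(p, 1))  # Pr(X = 1) = p by construction
```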
A more general approach to finding a distribution that fits the data is to fit a
polynomial to the log scale plot, then exponentiate this polynomial, and finally normalize
the result to have a unit integral. A linear fit (ax+b) corresponds to a geometric
distribution, but this choice has already been dismissed, and the data only appear linear in
the region from about 10 seconds to 500 seconds. The next option would be a quadratic
fit (ax^2 + bx + c), which would produce a result similar to a normal distribution. This seems
more appropriate than a linear model for durations less than 10 seconds and between
about 500 and 2000 seconds. The requirement of normalization to a unit integral means
the coefficient “a” must be a negative number, or else the quadratic function must be only
one part of a piecewise fit. The shape of the data corresponds to a positive “a” coefficient,
however, so a piecewise fit would be required; any higher-order polynomials would also
appear to require a piecewise fit of some sort. Every data point for durations longer than
2000 seconds represents a single occurrence, so it seems that defining the rest of a
piecewise fit would be very difficult with the remaining data, and any results obtained
would be unreliable without more data. In any case, this sort of a model is again beyond
the scope of this thesis.
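The fit-exponentiate-normalize procedure can be sketched with NumPy. The normalization here is to unit sum rather than unit integral, a discrete approximation, and the degree and sample data are purely illustrative:

```python
import numpy as np

def fit_log_polynomial(durations, counts, degree):
    """Fit a polynomial of the given degree to the log of the histogram
    counts, exponentiate the fitted polynomial, and normalize the
    result to unit sum."""
    x = np.asarray(durations, dtype=float)
    y = np.log(np.asarray(counts, dtype=float))
    coeffs = np.polyfit(x, y, degree)
    fitted = np.exp(np.polyval(coeffs, x))
    return fitted / fitted.sum()

# A linear fit (degree 1) on exactly geometric counts recovers the
# geometric shape; degree 2 would instead give a Gaussian-like curve.
x = [1, 2, 3, 4, 5]
counts = [0.5 ** k for k in x]  # synthetic geometric-looking decay
print(fit_log_polynomial(x, counts, 1))
```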
The results for the busy times can be seen in Figure 6. This histogram appears
substantially different from that compiled based on the idle durations. For durations of
about four seconds and longer, this distribution appears to decay in what could be
geometric fashion. The data on the whole, though, are very obviously not geometrically
distributed. There are multiple possible explanations for this fact. One is that the
underlying activity can only be described by a degenerate random variable. Another is
that the data can be modeled with a bimodal distribution. This bimodal distribution could
be a combination of two normal distributions, or it might be a typical geometric
distribution added to another time-delayed geometric distribution, for example. In any
case, the assumption that this data set could be modeled by a simple geometric
distribution appears to be invalid.
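As one purely hypothetical illustration of such a shape, busy durations could be drawn from a mixture of a geometric distribution and a time-delayed geometric. Every parameter below is invented for illustration rather than fitted to the data:

```python
import random

def rng_geometric(rng, p):
    """Geometric sample on {1, 2, ...} by counting Bernoulli trials."""
    k = 1
    while rng.random() >= p:
        k += 1
    return k

def sample_bimodal_busy(n, p1=0.6, p2=0.15, delay=4, weight=0.5, seed=0):
    """Draw n busy durations from a mixture of a geometric distribution
    (short-burst mode) and a time-delayed geometric (longer mode).
    All parameters are illustrative, not fitted."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        if rng.random() < weight:
            out.append(rng_geometric(rng, p1))          # short mode
        else:
            out.append(delay + rng_geometric(rng, p2))  # delayed mode
    return out

samples = sample_bimodal_busy(1000)
print(min(samples), max(samples))  # histogram of samples shows two modes
```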
One notable anomaly in the busy duration data occurred on April 16, the day of
Notre Dame’s spring intrasquad “Blue-Gold” football game. The typical concentration of
busy durations of 0.0 seconds was not found in this data set, as can be seen in Figure 7.
The reason for this specific difference is not obvious, but it was expected that the data
recorded on this day might be different from the rest because the unusually high number
of people and corresponding increase in activity on campus were likely to result in more
public safety communications. The distribution of data appeared to lose its bimodal
character, and the apparent mean busy duration increased by almost 1.3 seconds to 5.55
seconds. The data also suggest that utilization increased to 8.9%. The significant
differences here suggest that the appropriate model for this primary user might be time-varying.
Figure 3: Histogram of idle state durations, 100 bins
Figure 4: Idle durations with geometric fit, log scale
Figure 5: Idle durations with alternate geometric fit, log scale
Figure 6: Histogram of busy state durations, 82 bins
Figure 7: Blue-Gold game busy duration histogram, 41 bins
V. Conclusion
Implications for secondary user sensing algorithms
The numerical results obtained cannot be considered representative of all primary
users. It is quite possible that the assumed primary user model is accurate in some
applications. The results do show, however, that the assumed model is not sufficient to
describe the activity of at least one primary user. If there are only two states in the model,
it seems they cannot be described by geometric distributions. At least two different
scenarios seem possible based on these results. One is that the two-state Markov model is
still considered appropriate to characterize primary users, but the assumption of a
geometric distribution should be abandoned. For this to be the case, two distinct
statistical distributions would need to be found to fit the busy duration and idle duration
data. One concern in this scenario is that the more complicated distributions might not be
applicable to other primary users, so the model would be difficult to generalize. Another
is that using these more complicated distributions would be more computationally
intensive, reducing the efficiency of the algorithms. Another scenario is that the
assumption of a geometric distribution of state durations is appropriate, but that the two-state model is invalid. In this case, the remaining work would be in defining the proper
states. A method would also have to be devised for determining the current state since the
simple “busy” and “idle” characterizations would no longer be sufficient. In either case,
active sensing algorithms would have to be revised to incorporate the new model. A new
model also entails a modified decision process that would likely produce different results.
The results show two important things besides the apparent invalidity of the
assumed model for primary user activity. First, it is clear that even a simple and
unsophisticated sensing approach can yield meaningful results. The data collected on a
single frequency using a simple power detection scheme for less than two months were
coherent enough to prompt doubt in a modeling assumption. A more diverse data set
gathered over a longer period of time with more sophisticated techniques should only
serve to improve the utility of the results. Second, the results show that a cognitive
software radio can indeed gather the data that an unlicensed secondary user would need
to acquire in the course of its operation. The implications of this fact are significant. If the
cognitive radios intended for eventual secondary use are equipped with the hardware
necessary for their desired broadcasts, they also already have the hardware necessary for
receiving at those frequencies and the software capabilities for implementing a spectrum
monitor. These radios, then, should be capable of deployment as soon as their sensing
algorithms have been deemed sufficient. Once this has occurred and the appropriate
regulatory policies are in place, a shift to a dynamic spectrum access system and more
efficient spectrum utilization can occur in short order.
Reliability of data and suggestions for improvement
The data collected over the course of this experiment seemed generally reliable.
They were consistent over time as the histograms of the final aggregated data did not
differ much qualitatively from those compiled at several intermediate times over the
course of the experiment, with the exception of the data collected during the Blue-Gold
game, as discussed above. Small samples of the collected data were also verified in the
initial stages of acquisition and at multiple times over the course of the experiment.
Nevertheless, there were many potential sources of error.
The power threshold and gain were set at levels that seemed appropriate at the
times when the data were verified, but these times were small in comparison with the
entirety of the data collection. The background noise level might have increased at times
and corrupted some data. Some transmissions might also have occurred at a great enough
distance that they were not detected by the monitoring tool. In general, setting a static
threshold for months of data collection is a simple solution, but it is not an elegant one. A
dynamic threshold based on accumulated data might be explored to resolve this issue.
The precision of the recorded transmission times and durations was limited by the
precision of the computer’s system clock. Whether it was a limitation imposed by the
iMac hardware or the Ubuntu operating system, system time calls only reported the time
to the nearest second. This did not materially affect most of the data; for any
duration longer than about ten seconds, the maximum possible error would result in
practically negligible changes. Unfortunately, though, many of the recorded durations
were actually 0.0 seconds. Using a more precise clock would result in non-zero data, so
the mean durations would likely increase.
The frequent occurrence of 0.0 second durations also suggests a possible
shortcoming in the detection scheme. Idle times of 0.0 seconds could reflect short pauses
in one user’s transmission or the time between two different users’ transmissions in a
conversation, but they could also reflect momentary drops in signal power that do not
actually signal the end of one transmission. Similarly, busy times of 0.0 seconds could
accurately reflect short transmissions, but they could also indicate false positives when
the noise power level momentarily exceeded the threshold. Correcting this problem of
false state transition detection would change the metrics of both the idle and busy
durations by decreasing the number of recorded interruptions in both states. This would
decrease the number of very short durations and increase the mean duration for both
states, and these changes could increase the geometric character of the idle durations and
reduce the bimodal character of the busy durations. The potential for false triggering due
to transient conditions on the channel cannot be discounted without proper supporting
data. To address this issue, the spectrum monitoring tool could be modified to record the
signal power level in addition to the time whenever the threshold is crossed.
Another issue for data reliability is the sample size. Recording data for a span of
more than a month might be assumed to generate a sufficient sample, but this discounts
several possibilities. A given primary user’s activity might be very seasonal, for example,
and could therefore have completely different characteristics over a period of similar
length at a different time of the year. The period of observation could also have been
anomalous if, for example, the primary user had been making a transition to new
equipment or training new users at that time. Some of the primary user’s equipment could
also have been malfunctioning during this time. Finally, as discussed above, there is a
much smaller set of recorded data for longer durations than for shorter ones, and this
difference is accompanied by much more variation in the data for longer durations. Since
the portion of the data with the lowest variance is the one with the most occurrences, it
should be expected that a lower variance, and therefore more reliable data, would
accompany the accumulation of substantially more data. Observations would ideally
occur over the course of multiple years to account for all of these possibilities, but
waiting that long to analyze the data could render the data collection less relevant.
Suggestions for further exploration
The data gathered were found not to be geometrically distributed as it had been
assumed they would. Other possibilities, like the gamma distribution discussed for the
idle times and the bimodal distribution discussed for the busy times, should be explored
in an attempt to create a rigorous primary user model. A much larger data set should be
amassed in order to refine and verify that model. The possibility of a time-varying model
that was discussed in relation to the Blue-Gold game data should also be explored by
recording and analyzing activity during Notre Dame home football games, when even
larger crowds, and therefore greater channel utilization, are expected.
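As a first step in that exploration, candidate distributions can be fit to the recorded durations by moment matching. The sketch below uses hypothetical helper names and assumes the idle durations have already been parsed into a list of seconds (for example, by the Appendix B script); the geometric estimate mirrors the p = 1/mean calculation in Appendix C, and a gamma shape parameter far from 1 would indicate departure from the memoryless model.

```python
def fit_geometric(durations):
    """Estimate the geometric parameter p from the sample mean
    (p = 1/mean), as in the MATLAB analysis of Appendix C."""
    mean = sum(durations) / float(len(durations))
    return 1.0 / mean

def fit_gamma_moments(durations):
    """Moment-matched gamma parameters: shape k = mean^2/var,
    scale theta = var/mean. A shape k far from 1 suggests the
    durations are not memoryless (geometric/exponential)."""
    n = len(durations)
    mean = sum(durations) / float(n)
    var = sum((d - mean) ** 2 for d in durations) / float(n)
    return mean * mean / var, var / mean  # (shape k, scale theta)
```

Goodness of fit for each candidate could then be compared against the histogram data already produced by the Appendix C commands.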
Many other opportunities beyond the specific concerns of this thesis should be
explored, as well. The spectrum monitoring tool was designed with broad applicability in
mind. Its use should not be limited to monitoring just one or a few frequencies. It can and
should be used to collect data on many different primary users operating in different
regions of the wireless spectrum with different transmitter characteristics. All of the data
collected should be carefully analyzed so that the models assumed to characterize
primary user activity can be closely scrutinized and, when necessary, corrected. At the
same time, the capabilities of the tool should be expanded. Although a simple power
detection scheme with a static threshold might be useful in many applications, more
complicated methods might provide better results, and they can easily be implemented
given the adaptable nature of software radio. Data verification tools can also easily be
incorporated, including features like audio and video playback or packet decoding,
depending on the application. In general, researchers should take advantage of the tool’s
great flexibility. More efficient spectrum use is an achievable goal, but it will likely
depend on the success of active sensing algorithms, and these algorithms can only
achieve their maximum potential if they are based on accurate models of primary user
activity. The spectrum monitoring tool can be a great aid in creating and verifying these
models.
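As one illustration of such a "more complicated method," the static threshold could be replaced by one that floats a fixed margin above a running estimate of the noise floor. The class below is a hypothetical sketch, not part of the Appendix A tool; the parameter names and values are assumptions chosen only to show the idea.

```python
class AdaptiveThresholdDetector:
    """Energy detector whose threshold tracks a slowly varying noise
    floor instead of using a static level. Illustrative sketch only."""

    def __init__(self, initial_floor_db=-90.0, margin_db=8.0, alpha=0.05):
        self.floor_db = initial_floor_db  # running noise-floor estimate
        self.margin_db = margin_db        # detection margin above the floor
        self.alpha = alpha                # smoothing factor for the estimate

    def is_busy(self, power_db):
        busy = power_db > self.floor_db + self.margin_db
        if not busy:
            # Update the noise floor only from idle samples so that
            # primary-user transmissions do not inflate the estimate.
            self.floor_db += self.alpha * (power_db - self.floor_db)
        return busy
```

Because GNU Radio flow graphs are reconfigurable in software, a detector of this kind could be swapped in for the fixed-threshold squelch without hardware changes.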
Appendix A
Spectrum monitoring tool: monitor.py
#!/usr/bin/env python
##################################################
# Gnuradio Python Flow Graph
# Title: Top Block
# Generated: Wed Feb 23 21:35:02 2011
##################################################
from gnuradio import audio
from gnuradio import eng_notation
from gnuradio import gr
from gnuradio import window
from gnuradio.eng_option import eng_option
from gnuradio.gr import firdes
from gnuradio.wxgui import fftsink2
from gnuradio.wxgui import waterfallsink2
from grc_gnuradio import usrp as grc_usrp
from grc_gnuradio import wxgui as grc_wxgui
from numpy import pi
from optparse import OptionParser
import wx
import time

class top_block(grc_wxgui.top_block_gui):

    def __init__(self):
        grc_wxgui.top_block_gui.__init__(self, title="Top Block")

        ##################################################
        # Variables
        ##################################################
        self.samp_rate = samp_rate = 44100

        ##################################################
        # Blocks
        ##################################################
        self.audio_sink_0 = audio.sink(44100, "plughw:0,0", True)
        self.gr_pll_freqdet_cf_0 = gr.pll_freqdet_cf(.200, .080, samp_rate*pi, samp_rate*pi)
        self.gr_simple_squelch_cc_0 = gr.simple_squelch_cc(8, .01)
        self.gr_throttle_1 = gr.throttle(gr.sizeof_gr_complex*1, samp_rate)
        self.usrp_simple_source_x_0 = grc_usrp.simple_source_c(which=0, side="A",
            rx_ant="TX/RX")
        self.usrp_simple_source_x_0.set_decim_rate(256)
        self.usrp_simple_source_x_0.set_frequency(460.625e6, verbose=True)
        self.usrp_simple_source_x_0.set_gain(5)
        self.wxgui_fftsink2_0 = fftsink2.fft_sink_c(
            self.GetWin(),
            baseband_freq=0,
            y_per_div=10,
            y_divs=10,
            ref_level=50,
            ref_scale=2.0,
            sample_rate=samp_rate,
            fft_size=1024,
            fft_rate=30,
            average=False,
            avg_alpha=None,
            title="FFT Plot",
            peak_hold=False,
        )
        self.Add(self.wxgui_fftsink2_0.win)
        self.wxgui_waterfallsink2_0 = waterfallsink2.waterfall_sink_c(
            self.GetWin(),
            baseband_freq=0,
            dynamic_range=100,
            ref_level=40,
            ref_scale=2.0,
            sample_rate=samp_rate,
            fft_size=512,
            fft_rate=15,
            average=False,
            avg_alpha=None,
            title="Waterfall Plot",
        )
        self.Add(self.wxgui_waterfallsink2_0.win)
        self.gr_null_sink_0 = gr.null_sink(gr.sizeof_gr_complex*1)

        ##################################################
        # Connections
        ##################################################
        # Audio playback and GUI display paths (disabled during logging runs)
        #self.connect((self.gr_simple_squelch_cc_0, 0), (self.gr_pll_freqdet_cf_0, 0))
        #self.connect((self.gr_pll_freqdet_cf_0, 0), (self.audio_sink_0, 0))
        #self.connect((self.gr_throttle_1, 0), (self.wxgui_fftsink2_0, 0))
        #self.connect((self.gr_throttle_1, 0), (self.wxgui_waterfallsink2_0, 0))
        self.connect((self.gr_simple_squelch_cc_0, 0), (self.gr_throttle_1, 0))
        self.connect((self.usrp_simple_source_x_0, 0), (self.gr_simple_squelch_cc_0, 0))
        self.connect((self.gr_throttle_1, 0), (self.gr_null_sink_0, 0))

    def set_samp_rate(self, samp_rate):
        self.samp_rate = samp_rate
        self.wxgui_fftsink2_0.set_sample_rate(self.samp_rate)
        self.wxgui_waterfallsink2_0.set_sample_rate(self.samp_rate)

if __name__ == '__main__':
    parser = OptionParser(option_class=eng_option, usage="%prog: [options]")
    (options, args) = parser.parse_args()
    tb = top_block()
    tb.start()
    off_t = time.localtime()
    while True:
        # Busy-wait until the squelch opens (channel becomes busy)
        while not tb.gr_simple_squelch_cc_0.unmuted():
            pass
        print tb.gr_simple_squelch_cc_0.d_iir.d_prev_output()
        on_t = time.localtime()
        # Log the idle duration just ended and the busy start time
        f = open('receive_log','a')
        f.write("off time: ")
        f.write(str(time.mktime(on_t)-time.mktime(off_t)))
        f.write("\non at ")
        f.write(str(on_t))
        f.write("\n")
        f.close()
        # Busy-wait until the squelch closes (channel goes idle)
        while tb.gr_simple_squelch_cc_0.unmuted():
            pass
        off_t = time.localtime()
        # Log the busy duration just ended and the idle start time
        f = open('receive_log','a')
        f.write("on time: ")
        f.write(str(time.mktime(off_t)-time.mktime(on_t)))
        f.write("\noff at ")
        f.write(str(off_t))
        f.write("\n")
        f.close()
    tb.stop()
Appendix B
Log file parser: log_reader.py
#!/usr/bin/env python
import time
import re

f1 = open('receive_log.DATE','r')
f2 = open('on_times','a')
f3 = open('off_times','a')
for line in f1:
    m = re.search('(?<=off time: )[0-9]*\.[0-9]*', line)
    if m:
        f3.write(m.group(0))
        f3.write('\n')
    m = re.search('(?<=on time: )[0-9]*\.[0-9]*', line)
    if m:
        f2.write(m.group(0))
        f2.write('\n')
f1.close()
f2.close()
f3.close()
Appendix C
MATLAB commands
% Import data
load('off_times')
load('on_times')
% Calculate channel use
sum(off_times)
sum(on_times)
sum(on_times)/(sum(on_times)+sum(off_times))
% Plot histograms of data sets
hist(off_times,100)
title('Idle durations')
ylabel('Count')
xlabel('Time (seconds)')
hist(on_times,82)
title('Busy durations')
ylabel('Count')
xlabel('Time (seconds)')
% Calculate mean of data sets
mean(off_times)
mean(on_times)
% Collect histogram data in arrays
[n,x]=hist(off_times,1000);
% Calculate p
p = 1/mean(off_times)    % based on mean
p = n(1)/sum(off_times)  % based on first point of normalized data
% Plot data against geometric distribution generated by p
bar(x,n/sum(off_times))
hold on
plot([1:4500],p*(1-p).^[0:4499],'r')
% plot settings were modified within the GUI to use a log scale and add captions
32
Appendix D
Log file excerpt
off time: 26.0
on at time.struct_time(tm_year=2011, tm_mon=4, tm_mday=15, tm_hour=11,
tm_min=46, tm_sec=37, tm_wday=4, tm_yday=105, tm_isdst=1)
on time: 3.0
off at time.struct_time(tm_year=2011, tm_mon=4, tm_mday=15, tm_hour=11,
tm_min=46, tm_sec=40, tm_wday=4, tm_yday=105, tm_isdst=1)
off time: 2.0
on at time.struct_time(tm_year=2011, tm_mon=4, tm_mday=15, tm_hour=11,
tm_min=46, tm_sec=42, tm_wday=4, tm_yday=105, tm_isdst=1)
on time: 3.0
off at time.struct_time(tm_year=2011, tm_mon=4, tm_mday=15, tm_hour=11,
tm_min=46, tm_sec=45, tm_wday=4, tm_yday=105, tm_isdst=1)
off time: 168.0
on at time.struct_time(tm_year=2011, tm_mon=4, tm_mday=15, tm_hour=11,
tm_min=49, tm_sec=33, tm_wday=4, tm_yday=105, tm_isdst=1)
on time: 4.0
off at time.struct_time(tm_year=2011, tm_mon=4, tm_mday=15, tm_hour=11,
tm_min=49, tm_sec=37, tm_wday=4, tm_yday=105, tm_isdst=1)
off time: 1.0
on at time.struct_time(tm_year=2011, tm_mon=4, tm_mday=15, tm_hour=11,
tm_min=49, tm_sec=38, tm_wday=4, tm_yday=105, tm_isdst=1)
on time: 3.0
off at time.struct_time(tm_year=2011, tm_mon=4, tm_mday=15, tm_hour=11,
tm_min=49, tm_sec=41, tm_wday=4, tm_yday=105, tm_isdst=1)
off time: 469.0
on at time.struct_time(tm_year=2011, tm_mon=4, tm_mday=15, tm_hour=11,
tm_min=57, tm_sec=30, tm_wday=4, tm_yday=105, tm_isdst=1)