• Camillo Gentile
  • Nayef Alsindi
  • Ronald Raulefs
  • Carole Teolis



1.1 Overview

1.1.1 Outdoor Localization

The integration of location services into our day-to-day life will grow significantly over the next decade as technologies mature and accuracy improves. The evolution of localization technologies has occurred independently for different wireless systems/standards. The Global Positioning System (GPS) was the first system to bring to light the benefits of accurate and reliable location information. Consequently, it has been incorporated into many services and applications. Currently, outdoor localization, thanks to GPS, has revolutionized navigation-based applications running on automotive GPS-enabled devices and smartphones. Applications range from location awareness, to point-by-point directions between destinations, to identifying the closest cinema or coffee shop. The basic principle behind the system is to measure the time elapsed for a signal to travel between a number of satellites orbiting the globe and a mobile device. Through a computational technique known as triangulation, the location of the mobile can be calculated from the tracked positions of the satellites and the measured times, each known as a Time-of-Arrival. The success of GPS has been due to the reliability, availability, and practical accuracy that the system can deliver. However, GPS lacks coverage indoors and in urban areas, in particular near buildings where the signal is blocked; even in the best of conditions, the accuracy is on the order of several meters.
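The TOA computation just described can be sketched in a few lines. The snippet below is a simplified illustration rather than an actual GPS solver: it works in 2-D, assumes perfectly synchronized clocks (a real receiver must also estimate its clock bias), and uses hypothetical anchor coordinates; the position is recovered by Gauss-Newton least squares on the range residuals.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def locate_toa(anchors, toas, x0, iters=20):
    """Estimate a 2-D position from Time-of-Arrival measurements by
    Gauss-Newton iteration on the range residuals."""
    x = np.asarray(x0, dtype=float)
    ranges = C * np.asarray(toas)              # convert TOAs to distances
    for _ in range(iters):
        diff = x - anchors                     # shape (n_anchors, 2)
        dists = np.linalg.norm(diff, axis=1)
        J = diff / dists[:, None]              # Jacobian of ||x - a_i||
        r = dists - ranges                     # range residuals
        x -= np.linalg.solve(J.T @ J, J.T @ r)
    return x

# Hypothetical anchor positions and a known true position for checking
anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
true_pos = np.array([30.0, 40.0])
toas = np.linalg.norm(anchors - true_pos, axis=1) / C   # noise-free TOAs
print(locate_toa(anchors, toas, x0=[50.0, 50.0]))       # ≈ [30. 40.]
```

With noisy measurements the same least-squares machinery applies; the residuals simply no longer reach zero, and the solution minimizes their sum of squares.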

As the number of smart devices and mobile users continues to grow without bound, the desire for new location-based services that require enhanced accuracy, including in GPS-denied areas, has emerged. To address this challenge, novel solutions attempt to integrate different wireless technologies with GPS. For example, assisted GPS (A-GPS) was developed to provide better localization information in limited coverage areas by decreasing the time necessary for GPS to obtain a position fix (Richton 2001). Specifically, in A-GPS, cellular networks furnish GPS-equipped mobile devices with satellite constellation information such that they can identify the closest orbiting satellites a priori, providing a faster lock. In addition, A-GPS relieves the CPU-limited mobile device of the burden of the computationally intensive triangulation technique by forwarding the measurements the GPS receiver makes to the base stations (BSs), which then calculate the mobile's position and return the information to the mobile device.

Unsurprisingly, the next logical evolution of localization systems emerged from the cellular domain, where the requirement for localization was spearheaded by the Emergency-911 (E-911) mandate. Before GPS was widely available on mobile phones, cellular operators adopted and deployed varying technologies to locate mobiles within a cell radius. Time-of-Arrival-based computational techniques, which originated from GPS systems, were adapted to achieve similar localization performance for the common cell phone channel sharing (multiplexing) schemes: Time Division Multiple Access (TDMA) and Code Division Multiple Access (CDMA). The use of cellular localization was limited to E-911 due to the difficulty in achieving useful accuracy, especially in urban environments. The poor accuracy stemmed from clusters of buildings in urban and suburban residential areas, which brought about significant signal degradation due to multipath and Non-Line-Of-Sight (NLOS) problems.

At the same time, the popular IEEE 802.11 standard emerged, enabling ubiquitous deployment of Wi-Fi hotspots, which sparked enthusiasm for an alternative to cellular localization. The rapid expansion of Wi-Fi access points (APs) across urban/indoor environments made it possible for researchers to envision alternatives to TOA-based systems. Specifically, Received Signal Strength (RSS) location fingerprinting techniques emerged. One success story for deployment in the urban environment is Skyhook Wireless, a start-up company in the Boston area. Skyhook realized the potential of exploiting Wi-Fi signals emitted from residential homes and offices (available for free!) that are continuously in use, particularly in dense urban areas. The company saw that it could improve localization by building databases of Wi-Fi signatures tied to locations that could be integrated to aid in the localization process. In essence, a survey is conducted by "wardriving" across a city with a Wi-Fi-equipped device and a companion GPS receiver to record location. Wi-Fi RSS values and associated Medium Access Control (MAC) IDs are registered in a database for each location. During a localization query, a mobile device compares the RSSs measured from the registered APs to those in the database using a pattern matching technique. The mobile's location is then determined by the best RSS match. Skyhook Wireless's technology has attracted attention from the major players in the mobile device industry such as Apple and Google (Wortham 2009). The technique is very practical and delivers decent accuracy (tens of meters) for mobile location applications in outdoor urban environments where Wi-Fi APs are plentiful.
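The online lookup at the heart of this approach amounts to nearest-neighbour pattern matching in signal space. The sketch below uses a toy survey of four grid points and three APs, with all RSS values hypothetical; a real deployment would key fingerprints by AP MAC address and hold many thousands of entries.

```python
import numpy as np

# Hypothetical offline survey: RSS (dBm) from three APs at four grid points
fingerprints = {
    (0.0, 0.0): [-40, -70, -80],
    (0.0, 5.0): [-55, -60, -75],
    (5.0, 0.0): [-60, -75, -60],
    (5.0, 5.0): [-70, -65, -55],
}

def locate_rss(observed, k=2):
    """Nearest-neighbour pattern matching in RSS signal space: average
    the k surveyed locations whose fingerprints are closest (Euclidean
    distance in dB) to the observed RSS vector."""
    obs = np.asarray(observed, dtype=float)
    scored = sorted(fingerprints.items(),
                    key=lambda item: np.linalg.norm(obs - np.asarray(item[1])))
    best = np.array([loc for loc, _ in scored[:k]])
    return best.mean(axis=0)

print(locate_rss([-42, -69, -79], k=1))   # → [0. 0.]
```

Averaging the k best matches (k > 1) smooths the estimate between grid points, at the cost of some bias toward the interior of the surveyed area.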

Of course, the aforementioned triangulation and fingerprinting techniques are not applicable only to GPS, cellular, or Wi-Fi networks. They can also be readily extended to virtually any pervasive radio-frequency source, in particular to television networks. In fact, the Rosum Corporation of Redwood City, CA took advantage of the 4.5 MHz of bandwidth available in broadcast TV channels. Besides the wide bandwidth available for accurate TOA estimation, the low carrier frequency offered excellent penetration through walls to mitigate NLOS conditions. The performance of different types of RF location systems in terms of cost and accuracy varies widely. Other system design considerations include 2D or 3D location accuracy requirements, power requirements, and whether infrastructure installation is acceptable and, if so, the density of BSs required to achieve the desired accuracy. In designing a system, there will be tradeoffs between performance and cost requirements. For example, typical RF-based positioning accuracies are tens of meters at best and do not provide accurate elevation. Many solutions available to augment GPS leverage surrounding infrastructure such as cell towers, Wi-Fi hotspots, or installed RF tags. The precision of the results varies widely based on the infrastructure location and availability:
  • Cellular survey-based techniques: hundreds or thousands of meters

  • Cellular triangulation techniques: less than 100 m

  • Television triangulation techniques: tens to hundreds of meters

  • Wi-Fi survey-based techniques: tens to hundreds of meters

The latter three techniques require signals from at least three reference stations which could lead to operational lapses indoors. It is not possible to rely on these infrastructure-based solutions in uncontrolled environments such as emergency or combat operations where the infrastructure may not exist; however, for commercial use, the accuracy and reliability provided may be adequate (Baker 2011; Mia 2011; Young 2008).

1.1.2 Indoor Localization

The lucrative business opportunities for location-enabled services are not limited to outdoors. In fact, the potential for indoor services has been projected by different sources as an untapped multibillion dollar industry (Patel 2011). The variety of indoor applications touches every aspect of our lives: from E-911 response to mobile emergency calls, to tracking kids in day-care centers, the elderly in nursing homes, inventories in warehouses, medical devices in hospitals, and personnel in emergency/first responder applications (firefighters). What hinders the emergence of such needed, even life-saving, applications is the difficulty in delivering the required accuracy and reliability in indoor environments. Indoor localization research has been going on for decades in the robotics field (Smith 1986; Durrant-Whyte 1988). The E-911 requirement for improved localization of cell phones indoors spurred more research based on RF infrastructure and signals of opportunity (Pahlavan et al. 1998). The fact that location research is to date a very active research area indicates that there are still many challenges left to resolve. The challenges depend on the required accuracy and reliability dictated by the application. For applications that require only coarse location information and can afford to install a significant amount of infrastructure, there are existing products, for example by the Finnish company Ekahau (EKAHAU 2012) and the CISCO Wireless Location Appliance (CISCO Corporation 2012). These systems capitalize on the RSS location fingerprinting technique to deliver accuracies on the order of a few meters in the indoor environment. However, it has become evident that the effectiveness and robustness of RSS-based fingerprinting techniques are limited to uncluttered environments and outdoors.

As the application domain gravitated toward dense urban and indoor settings, where localizable assets naturally cluster together due to smaller dimensions, an alternative to legacy cellular localization and fingerprinting techniques was needed to push the accuracy boundary to sub-meter, the so-called "holy grail" of indoor localization. Many potential applications were envisioned to benefit from centimeter-level information, from inventory tracking to tracking firefighters/soldiers inside buildings. The fundamental challenge indoors is that the radio frequency environment, characterized by limited coverage, severe multipath signal fading, and NLOS conditions, is not conducive to wireless propagation. Since the limitations are physical in nature, they must be dealt with by any algorithm or technique. To this end, researchers revisited TOA-based techniques, this time applying Ultra-Wideband (UWB) communications, which uses low power but large bandwidth to provide protection against multipath, together with NLOS mitigation algorithms to combat the effects of the propagation environment. A significant portion of this book is dedicated to addressing the challenges of harsh propagation environments.

As the form factor of mobile devices diminished in size yet increased in complexity, a new school of thought emerged from the localization research community around the idea of collaboration using sensor networks. This area is interesting in that wireless sensor networks (WSNs) developed independently from cooperative localization; only when applications were considered for the former did it become obvious to sensor network researchers that location information is indeed vital. At the same time, localization researchers recognized the potential of collaboration between the two areas to address the propagation challenges, and cooperative localization in WSNs is currently a very active research area spanning theory, experimentation, and hardware/software development.

Since geolocation is a dynamic process, navigation and tracking techniques (similar to outdoor GPS) naturally complement "stationary" localization techniques/algorithms. The development of Microelectromechanical Systems (MEMS) technology led to the advent of miniature inertial sensors such as accelerometers and gyroscopes, enabling smartphones and mobile/gaming devices to be equipped with navigation sensors. MEMS-based inertial navigation systems (INS) developed in parallel to RF geolocation techniques and provided another localization dimension. Inertial navigation technologies do have their own challenges and limitations, owing to low-cost hardware that introduces errors, drifts, and biases into the speed/acceleration estimation. The development of inertial technology integrates naturally with the evolution of RF localization in the sense that their complementary error properties can make possible even more accurate and robust geolocation systems.
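The drift problem is easy to demonstrate: because position is obtained by integrating acceleration twice, even a tiny constant sensor bias grows quadratically in time. The 1-D sketch below uses a hypothetical 0.01 m/s² accelerometer bias and shows a stationary device appearing to wander roughly 0.5·b·t² = 18 m in one minute.

```python
import numpy as np

def dead_reckon(accel, dt):
    """Integrate acceleration twice to get position (1-D strapdown sketch)."""
    vel = np.cumsum(accel) * dt
    pos = np.cumsum(vel) * dt
    return pos

dt, t_end = 0.01, 60.0                  # 100 Hz samples for one minute
n = int(t_end / dt)
true_accel = np.zeros(n)                # the device is actually stationary
bias = 0.01                             # hypothetical 0.01 m/s^2 sensor bias

drift = dead_reckon(true_accel + bias, dt)[-1]
print(f"position error after {t_end:.0f} s: {drift:.1f} m")  # ~ 0.5*b*t^2 = 18 m
```

The quadratic growth is why unaided low-cost inertial tracking is only trustworthy over short durations, as noted in the text.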

In general, providing accurate location and navigation indoors will require extensive infrastructure or the implementation of multiple, complementary technologies (RF, gyroscopes, pressure sensors, speed sensors, etc.). In fact, the trend in indoor geolocation research seems to point towards the integration of hybrid sensor technologies. The effectiveness of different sensors can vary based on the environment of operation and the tracked subject's motion: RF propagation depends on building topology and construction material, lighting affects optical sensors, and the tracked subject's motion affects optical and inertial sensors. Inertial and RF-based location sensors provide complementary location information: inertial tracking systems provide high accuracy over short durations, but suffer from significant drift over longer times in the absence of methods to mitigate their drift; in contrast, RF ranging measurements are subject to short-term outages in areas with poor RF connectivity, but can provide long-term stability when fixed references are part of a managed infrastructure. Wi-Fi-only location relies on an unmanaged and often changing network of APs, which cannot be depended upon if a known level of accuracy is required, but may provide adequate accuracy for many consumer applications. Finally, elevation determination (floor location) presents a challenge for RF systems; here, inertial and referenced barometric pressure systems can provide support.
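One minimal way to exploit these complementary error properties is a complementary filter: propagate the position with inertial steps between fixes, and pull the estimate toward each RF fix when one arrives. The 1-D sketch below uses a hypothetical 5 % inertial step bias, an RF fix every tenth step, and an arbitrary blending factor; real systems typically use Kalman-style fusion instead.

```python
import numpy as np

def fuse(dr_steps, rf_fixes, alpha=0.8):
    """Blend dead-reckoned displacement with intermittent RF position
    fixes: propagate with inertial steps and, when an RF fix arrives,
    pull the estimate toward it by a factor (1 - alpha)."""
    x = 0.0
    est = []
    for step, fix in zip(dr_steps, rf_fixes):
        x += step                          # inertial propagation (short-term)
        if fix is not None:                # RF fix available (long-term anchor)
            x = alpha * x + (1 - alpha) * fix
        est.append(x)
    return est

# Hypothetical 1-D walk: 1 m steps measured with a 5% inertial scale
# error, and an RF fix at the true position every 10th step.
true = np.arange(1, 51, dtype=float)
steps = np.full(50, 1.05)                  # biased step estimates
fixes = [t if i % 10 == 9 else None for i, t in enumerate(true)]
est = fuse(steps, fixes)
print(f"final error fused: {abs(est[-1] - true[-1]):.2f} m, "
      f"unfused: {abs(steps.sum() - true[-1]):.2f} m")
```

The fused error stays bounded because each RF fix shrinks the accumulated inertial error, whereas the unfused error grows without limit.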

Developing algorithms to effectively fuse sensor data from multiple sources to produce improved localization results is a hot research topic. One popular technique is Simultaneous Localization and Mapping (SLAM). SLAM relies on data from multiple sensors to build a map of the environment that enables one to navigate for long periods of time by using the map to provide location corrections. SLAM systems use RF and inertial sensors as well as sensors that measure the environment directly such as image, Light Detection and Ranging (LIDAR), and sonar sensors to construct a geometric or topological map of the environment and then use that map for navigation. The environmental sensors help to alleviate some of the problems faced by inertial and RF techniques, but they have their own set of problems and challenges in the path to accurate mapping and localization.

1.2 Organization

The focus of this book is to provide an overview of the different types of infrastructure supported by most commercial localization systems, as well as of the most popular computational techniques which these systems employ. While much of the content presented applies to outdoor systems as well, the specific concentration of the book is on robust systems which can deliver high degrees of accuracy in harsh multipath environments; these environments are most common indoors. Each chapter of this book introduces a different aspect of localization systems and describes solutions that have been developed to address specific challenges.

The organization of the book chapters closely follows the evolution of geolocation techniques over the last couple of decades. In this section we provide a detailed overview of each chapter. Figure 1.1 highlights the overall structure of the book, where the focus is to introduce the fundamentals, challenges, and evolution of localization technology.
Fig. 1.1

Techniques for accurate and robust localization

1.2.1 Chapter 2: Localization in Harsh Multipath Environments

In Chap. 2 the basics of RSS, TOA, and Angle-of-Arrival (AOA) localization are first introduced. The impact of multipath and NLOS is then investigated for the two most popular ranging techniques: TOA and RSS. Finally, measurement and modeling of ranging is presented to provide an empirical analysis of the challenges of harsh multipath environments. For RSS-based systems, multipath causes the well-known fast fading phenomenon, where the received power in a given location fluctuates significantly due to constructive and destructive interference of incoming multipath signals. For TOA-based systems, multipath impacts the distance estimation directly by adding a random bias to the estimate, and it is usually a more serious problem. Typically, the time domain resolution is inversely proportional to the system bandwidth, so in low bandwidth systems the limited time resolution can yield significantly inaccurate distance estimates. For example, the bandwidth of GSM signals is 200 kHz, which translates to 5 μs of time resolution, or 1,500 m! This means that two paths arriving within 1,500 m of each other will not be resolved. In more recent systems (UMTS/WiMAX/LTE) the bandwidth varies between 5 and 20 MHz, where even the highest bandwidth of 20 MHz equates to only ~15 m of time resolution. This resolution, unfortunately, is not suitable for dense multipath environments (such as indoors), where large errors in the final localization solution can make it difficult to localize mobile devices to within even a single floor. The ambiguity resulting from poor multipath resolution is one of the major challenges facing localization technology in multipath-rich environments such as dense urban or indoor settings.
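These resolution figures follow directly from the inverse relationship between bandwidth and time resolution; the short calculation below reproduces them (the 500 MHz row is an added illustration for a UWB-class channel):

```python
C = 299_792_458.0  # speed of light, m/s

def multipath_resolution(bandwidth_hz):
    """Approximate multipath resolution for a given system bandwidth:
    time resolution ~ 1 / bandwidth, distance resolution = c / bandwidth."""
    dt = 1.0 / bandwidth_hz
    return dt, C * dt

for name, bw in [("GSM", 200e3), ("LTE (20 MHz)", 20e6), ("UWB (500 MHz)", 500e6)]:
    dt, dx = multipath_resolution(bw)
    print(f"{name:>14}: {dt * 1e9:8.1f} ns  ->  {dx:8.2f} m")
```

The output confirms the text's figures: roughly 1,500 m for GSM and ~15 m for a 20 MHz LTE channel, versus sub-meter resolution once bandwidths reach the UWB regime.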

The second major challenge facing dense urban/indoor environments is the high probability of obstruction of the LOS between the transmitting and receiving devices. This channel condition is commonly referred to as NLOS. For RSS-based systems, NLOS introduces the problem of shadow fading, where the RSS is attenuated randomly as the mobile device moves from one area to another. Since obstructions (doors, walls, elevators, etc.) vary substantially, the RSS fluctuates accordingly, and this fluctuation makes it difficult to rely on RSS-based range estimates in NLOS conditions. Furthermore, pathloss models that describe the distance-power relationship can be difficult to obtain for the variety of obstructions in realistic urban/indoor environments. For TOA-based systems, NLOS affects the estimation of the delay of the direct path signal. Since in most cases the direct path signal will not be detectable, ranging is achieved through non-direct path components, which bias the TOA-based estimate. This bias can range from a meter to tens of meters depending on the propagation environment and type of obstructions.

A detailed empirical evaluation is further introduced in the last section of Chap. 2, which sheds light on the significance of the multipath and NLOS problems. The ultimate aim of the measurement and modeling of TOA- and RSS-based ranging is to be able to answer the following fundamental questions:
  • How does the system bandwidth improve accuracy?

  • To what extent can the increase in system bandwidth improve accuracy in LOS and NLOS environments?

  • How significant are the NLOS-induced errors experienced in harsh multipath environments?

  • Is the TOA-based ranging error a function of the propagation environment (e.g. building structure)?

  • For a given operational multipath environment, what is the practical ranging coverage that can be achieved for TOA-based techniques? This question is important since a notion of ranging coverage which is analogous to communication coverage is needed in practice.

  • How are RSS-based ranging techniques affected by the LOS/NLOS power variations with location?

These questions are fundamentally important to system engineers designing next-generation ranging and localization systems. In addition, channel measurement and modeling can shed light on the correlation between the channel conditions (LOS vs. NLOS) and signal metrics such as power of the first path, total signal power, etc. These relationships can be exploited in NLOS identification algorithms, which are typically required for reliable and practical ranging and localization in harsh multipath environments (NLOS identification/mitigation algorithms are introduced in  Chap. 3).

1.2.2 Chapter 3: Multipath and NLOS Mitigation Algorithms

Chapter 3 addresses mitigation of the multipath propagation challenges, and it starts by describing two major techniques/technologies to mitigate the multipath problem: super-resolution and UWB. Super-resolution techniques have shown great potential for low-bandwidth systems, and the improvement in time resolution can enhance the accuracy significantly for certain scenarios. UWB is an emerging technology that utilizes very large system bandwidths and has the potential for high data rate communications (in the Gigabit range) and centimeter-level TOA estimation accuracies. From a ranging/localization perspective, full usage of the designated 7.5 GHz bandwidth translates into a distance resolution of 4 cm, which is highly desirable for accurate positioning. There are two main types of UWB systems: single-band and multiband. Single-band UWB is typically known as impulse radio UWB [6], where very narrow pulses in the time domain achieve the bandwidth that defines UWB. Multiband UWB divides the spectrum into sub-bands, and the very popular multiband OFDM (MB-OFDM) implementation has been the main candidate for high data rates and accurate localization. Results of measurements and simulation have shown that UWB has the potential to achieve sub-centimeter accuracy in LOS environments but struggles to match that accuracy in NLOS environments due to the physical obstruction problem.

The NLOS problem is addressed in the second part of Chap. 3, and it typically involves two stages: NLOS identification and NLOS mitigation. This area has received considerable attention in the research community within the last decade, and it continues to provide innovation potential for researchers. NLOS identification techniques are based on estimating or identifying the condition of the channel to infer whether it is LOS or NLOS. Once the "channel" information is available, it is possible to incorporate it into an NLOS mitigation algorithm to improve the accuracy and robustness of the location estimate. NLOS identification typically operates on the physical-layer-sensed signal, from which a "metric" can be extracted that indicates the state of the channel. NLOS mitigation, however, operates at higher levels, closer to the localization algorithm. As a result, identification and mitigation are typically independent; however, there are approaches that combine identification and mitigation in one step. The robustness of an NLOS mitigation algorithm depends inherently on the robustness of the NLOS identification stage: the better the detection accuracy (probability of detection for a given probability of false alarm), the more effective and useful the channel information can be for the mitigation stage and the entire localization algorithm. As a result, it is no surprise that NLOS identification can be the critical element in the mitigation process.
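As a concrete (and deliberately simplistic) example of such a physical-layer metric, one can compare the power of the first arriving path with the total received power: when the direct path is heavily attenuated, that ratio drops. The threshold and power delay profiles below are hypothetical, chosen only to illustrate the decision rule.

```python
import numpy as np

def nlos_likely(cir_powers, threshold_db=-6.0):
    """Flag a channel as NLOS when the first arriving path is weak
    relative to the total received power. `cir_powers` is the power
    delay profile on a linear scale, first entry = earliest path;
    the -6 dB threshold is an illustrative assumption."""
    ratio_db = 10.0 * np.log10(cir_powers[0] / np.sum(cir_powers))
    return ratio_db < threshold_db

los_pdp  = np.array([1.00, 0.20, 0.05])   # dominant direct path
nlos_pdp = np.array([0.05, 0.60, 0.90])   # direct path heavily attenuated
print(nlos_likely(los_pdp), nlos_likely(nlos_pdp))  # False True
```

In practice such a binary decision (or a soft channel-state probability) would feed the mitigation stage, e.g. by down-weighting ranges flagged as NLOS in the position solver.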

1.2.3 Chapter 4: Fingerprinting Techniques

Survey-based localization is the focus of  Chap. 4. The basic idea behind this technique is to associate physically measurable properties to discrete locations throughout a deployment area. These properties, commonly referred to as fingerprints or signatures, can then act as location identifiers. The greater the spatial variability of the signatures, the greater the capacity of the system to discriminate between locations and, in turn, to deliver finer resolution. Therefore, the same physical properties of the environment which render non-survey-based techniques more challenging—in particular multipath fading in radio frequency systems—on the contrary facilitate survey-based techniques.

Location fingerprinting techniques are categorized mainly by the type of properties which are collected. The three major radio frequency properties that have been implemented to date are: RSS, the time domain Channel Impulse Response (CIR) [or the equivalent frequency domain Channel Transfer Function (CTF)], and the Frequency Channel Coherence Function (FCF). RSS is by far the most prevalent in commercially deployed wireless systems. This is due to many factors, most notably its robustness and good penetration in NLOS conditions (especially at lower carrier frequencies), its simple data structure, and the computational ease (inexpensiveness) with which it can be measured. It also stems from the fact that RSS is accessible directly from the firmware in popular wireless standards such as IEEE 802.11. As mentioned earlier, RSS fingerprinting systems have been successfully deployed in dense urban and indoor environments by Skyhook Wireless and Ekahau, respectively. The disadvantage of using RSS as a signature, especially when only a few APs are available, is the lack of uniqueness, meaning that multiple sites in close proximity throughout a deployment area may have similar fingerprints. This translates into limited localization resolution. While CIR, CTF, and FCF provide more distinctive signatures, they also require more complex (expensive) equipment to extract and have larger data storage requirements. The latter can be prohibitive for medium- to large-sized databases (typical indoor environments). In addition, because the data structure is more complex, the pattern matching algorithms are more computationally intensive.

The fingerprinting technique, in a nutshell, is to construct a database of signatures from available wireless network infrastructure, such as APs. Each signature is registered at a unique location, typically at points on a uniformly spaced grid throughout a given environment (e.g. with 1 m² cells). This occurs in an "offline" phase, i.e., before localization is attempted. Figure 1.2 highlights the method of constructing a fingerprint at a given location. The location of a mobile device is then estimated during an "online" phase. For each query, the signature parameters are measured at the mobile device and subsequently compared against the signatures registered in the database through a pattern matching algorithm. The location of the mobile is then designated as the location corresponding to the closest signature in the database. The role of the database in the offline and online stages is illustrated in Fig. 1.3.
Fig. 1.2

Overview of existing fingerprint construction. a Mobile terminal at location X conducts measurements to 3 APs and captures RF signals. b Channel metrics are extracted from the 3 RF signals and a fingerprint is created

Fig. 1.3

Location fingerprinting. a Offline: fingerprint database generation at locations on a grid. b Online pattern recognition and position estimation. Circle is actual position and circle/cross is the estimated position

1.2.4 Chapter 5: Cellular Localization Systems

Cellular localization is of tremendous interest to network operators. As mentioned before, this was publicly stimulated by the FCC requirements for E-911 calls that were published at the end of the 1990s in the US, and in 2003 by the European Commission's E-112 initiative in Europe. However, communication systems like GSM, UMTS, or LTE are designed to use the expensive licensed spectrum efficiently for communication needs. These needs include, e.g., robust coverage as well as high throughput; to fulfill these requirements, the spectrum is used efficiently for data transfer. Localization in cellular systems is performed through fingerprinting or ranging. Fingerprinting methods range from coarse localization through the cell ID to signal-strength-based localization. Signal strength methods are based on premeasured datasets and rely on a known transmitted signal strength. Common time-based ranging methods require precise synchronization between the transmitter and the receiver. Such precise synchronization is not well established in common communication systems, especially not at the mobile terminal. Furthermore, in communications a single BS is enough to cover the basic needs, whereas localization requires three or more differently placed transmitters or receivers. The simplest solution to overcome interference was proposed for UMTS: adding an idle period in the downlink to listen and to synchronize to multiple BSs one after another. Idle periods contradict the idea of an efficient use of spectrum, but they showed that a communication system needs to be designed properly for geolocation to be applied successfully in cellular mobile radio systems. The LTE standardization process intended to improve on this by adding special synchronization sequences for positioning. Chapter 5 presents an overview of the different methods that have been proposed and applied since the 1990s and that are now discussed in the standardization of 3GPP LTE-Advanced.
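Because the mobile terminal's clock is not synchronized to the network, cellular systems typically work with time differences between pairs of BSs rather than absolute TOAs (as in OTDOA in LTE). The following is a hedged 2-D sketch of the resulting hyperbolic positioning, with made-up BS coordinates; differencing against a reference BS cancels the terminal's unknown clock offset.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def locate_tdoa(bs, tdoas, x0, iters=30):
    """Gauss-Newton fit of a position to Time-Difference-of-Arrival
    measurements (differences relative to the first base station),
    which sidesteps the unknown clock offset of the mobile terminal."""
    x = np.asarray(x0, dtype=float)
    rd = C * np.asarray(tdoas)                      # range differences
    for _ in range(iters):
        d = np.linalg.norm(x - bs, axis=1)
        r = (d[1:] - d[0]) - rd                     # residuals
        J = (x - bs[1:]) / d[1:, None] - (x - bs[0]) / d[0]
        x -= np.linalg.solve(J.T @ J, J.T @ r)
    return x

# Hypothetical base stations at the corners of a 1 km cell cluster
bs = np.array([[0.0, 0.0], [1000.0, 0.0], [0.0, 1000.0], [1000.0, 1000.0]])
true_pos = np.array([320.0, 540.0])
d = np.linalg.norm(true_pos - bs, axis=1)
tdoas = (d[1:] - d[0]) / C                          # noise-free TDOAs
print(locate_tdoa(bs, tdoas, x0=[500.0, 500.0]))    # ≈ [320. 540.]
```

Each TDOA constrains the terminal to a hyperbola with the two BSs as foci; with three differences the intersection pins down the 2-D position, which is why localization needs more differently placed stations than communication does.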
Figure 1.4 illustrates how the different radio links are used for positioning in cellular mobile radio systems. Either the BSs or the mobile terminal performs ranging. Furthermore, indoor APs acting as anchors may also be exploited to improve the localization performance.
Fig. 1.4

Cellular mobile radio system indoors and outdoors

1.2.5 Chapters 6 and 7: Cooperative Localization in Wireless Sensor Networks—Centralized and Distributed Algorithms

The falling price and reduced size of wireless sensors in recent years have fueled the proliferation of dense networks to gauge and relay environmental properties such as temperature, light, sound, and vibration. Applications of such networks range from video surveillance and traffic control to health monitoring and industrial automation. In tandem, wireless specifications have been established to support these networks, most notably the ZigBee standard for communication protocols between small, low-power, and low-bit-rate radios designed to operate for years on a single disposable battery. In close relation, the IEEE 802.15.4a standard also enables range measurement between such radios using UWB technology to extract the TOA. In fact, furnishing the locations of the sensors in the network proves as critical as furnishing the spatially sensitive readings themselves in order for an external system to calibrate a network response. In particular, military and public safety operations call for ad hoc localization, such as that of a man down in a building ablaze with zero visibility. This has launched a research area known as cooperative localization, which seeks to aggregate potentially enormous quantities of data to achieve optimal results.

The localization topology of wireless sensor networks differs fundamentally from that of other networks. In the latter, mobile devices enjoy direct connectivity to base stations (BSs) whose locations are known, as illustrated in Fig. 1.5. In the former, however, since the devices operate on low power, their range is limited; even if placed outdoors, they may be unable to reach GPS satellites, cellular BSs, or Wi-Fi hot spots. In addition, due to their compact size, they may lack the computational resources to process range or angle measurements into estimated locations. The implicit assumption in cooperative localization is that only a small fraction of the devices in the network, known as anchor nodes, are able to estimate their locations from BSs. This may be due to favorable placement in the environment which enables connectivity, but for the most part special network devices equipped with higher power and enhanced computational resources will be required.
Fig. 1.5

Localization topology in GPS, cellular, and WLAN networks. The mobile device has direct connectivity to the base stations

Hence in cooperative networks, sensors lacking direct connectivity to anchors must discern their locations through neighboring nodes whose locations are also unknown. In essence, sensors must connect to anchors through multiple hops, as illustrated in Fig. 1.6. A consequence of this complex topology is that simple triangulation algorithms must be replaced with more sophisticated ones. And since each connection on a multi-hop link is subject to measurement error, the reliability of the composite link is diminished with respect to an otherwise direct link. However, since the number of nodes can range from the hundreds to the thousands, WSNs are often densely packed with overlapping coverage. Cooperative localization algorithms take advantage of this redundancy and, despite the multi-hop connectivity, have been shown to deliver good results.
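To see why range measurement error matters, consider the building block that cooperative algorithms generalize: estimating a position from noisy ranges to known anchors by least squares. The sketch below uses a hypothetical scenario with invented anchor positions and ranges, and minimizes the sum of squared range residuals by gradient descent rather than closed-form triangulation.

```python
import math

# Known anchor positions and noisy range measurements to an unknown node.
# Values are illustrative; the true position is roughly (4, 3).
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
ranges = [5.1, 6.6, 8.2]  # each contains measurement error

def trilaterate(anchors, ranges, iters=2000, step=0.01):
    """Least-squares position estimate: gradient descent on the
    sum of squared range residuals."""
    x, y = 5.0, 5.0  # initial guess near the centroid of the anchors
    for _ in range(iters):
        gx = gy = 0.0
        for (ax, ay), r in zip(anchors, ranges):
            d = math.hypot(x - ax, y - ay)
            if d == 0.0:
                continue
            # gradient of (d - r)^2 with respect to (x, y)
            gx += 2 * (d - r) * (x - ax) / d
            gy += 2 * (d - r) * (y - ay) / d
        x -= step * gx
        y -= step * gy
    return x, y

x, y = trilaterate(anchors, ranges)
```

With errors of a few tenths of a meter per range, the estimate lands within a few tenths of a meter of the true position; over a multi-hop chain each hop adds such an error, which is why redundancy is needed.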
Fig. 1.6

Localization topology in wireless sensor networks. The anchors (red) have direct connectivity to the BSs. The sensors (blue) are connected to the anchors through multi-hop links

Chapter 6 introduces centralized cooperative localization. Centralized implies that the range or angle measurements are gathered locally and then forwarded to a central processor, where they can be transformed into the locations of the unknown nodes. Scalability is a key ingredient for future wireless networks, as the number of devices is expected to grow exponentially compared to the number active today. For such networks, it may be infeasible to coordinate the devices through a centralized architecture. A related idea, common in WSNs, consists of spreading the computational load across the entire network while preserving reasonable complexity and low power consumption in each node; such user/agent-centric positioning solutions build on the concept of distributed algorithms. The application requirements (scalability, energy efficiency, and accuracy) influence the design of distributed algorithms. In Chap. 7, a variety of distributed cooperative positioning algorithms, with an emphasis on message passing, is presented.
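A minimal flavor of the distributed idea (not the specific message-passing methods of Chap. 7): each sensor repeatedly nudges its own position estimate so that the distances to its neighbors, whether anchors or other sensors, agree with the measured ranges; the only information exchanged is neighbor position estimates. All positions and ranges below are invented for illustration.

```python
import math

# Two anchors with known positions and two unknown sensors; each sensor
# measures ranges only to its direct neighbors. Values are illustrative.
anchors = {'A': (0.0, 0.0), 'B': (10.0, 0.0)}
est = {'S1': (3.0, 3.0), 'S2': (7.0, 3.0)}   # initial guesses
# (node, neighbor) -> measured range; each sensor knows only its own edges
edges = {('S1', 'A'): 5.0, ('S1', 'S2'): 4.0,
         ('S2', 'B'): 5.0, ('S2', 'S1'): 4.0}

def pos(n):
    """Current position of a node: exact for anchors, estimated otherwise."""
    return anchors.get(n) or est[n]

# Iterative relaxation: every sweep, each sensor moves its own estimate
# downhill on the squared residuals of its measured ranges.
for _ in range(2000):
    for n in est:
        x, y = est[n]
        gx = gy = 0.0
        for (a, b), r in edges.items():
            if a != n:
                continue
            nx, ny = pos(b)
            d = math.hypot(x - nx, y - ny) or 1e-9
            gx += (d - r) * (x - nx) / d
            gy += (d - r) * (y - ny) / d
        est[n] = (x - 0.05 * gx, y - 0.05 * gy)
```

After the sweeps converge, every measured range is consistent with the estimated geometry, even though no sensor ever saw the whole network.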

1.2.6 Chapters 8 and 9: Inertial Navigation Systems

Chapters 8 and 9 introduce sensors and methodologies that have been used in navigation systems for decades but have only recently become applicable to commercial navigation applications, thanks to advances in electronics miniaturization and increased computational power. The sensors discussed in Chaps. 8 and 9 have error characteristics complementary to those of RF sensors and so can help mitigate the effects of multipath and NLOS errors in the location solution.

Chapter 8 is dedicated to INS. An inertial navigation system (INS) provides position, orientation, and velocity estimates based solely on measurements from inertial sensors. Inertial measurements are differential measurements in the sense that they quantify changes in speed or direction. The two primary types of inertial sensors are accelerometers and gyroscopes. Accelerometers measure instantaneous changes in speed (or, equivalently, specific force), and gyroscopes provide a frame of reference with which to measure orientation or, equivalently, change in direction. Given its previous position and orientation as well as accelerometer and gyroscope measurements over an elapsed period of time, an instrumented platform can calculate an estimate of its current position and orientation. Calculating navigation information from differential measurements of speed and direction is termed dead reckoning.
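Dead reckoning in two dimensions can be sketched in a few lines: integrate the gyroscope's angular rate into heading, the accelerometer's forward acceleration into speed, and the resulting velocity into position. The sample stream below is simulated (constant speed, constant turn rate), not real sensor data.

```python
import math

# Planar dead reckoning from simulated inertial samples.
dt = 0.1                                   # sample period (s)
x, y, heading, speed = 0.0, 0.0, 0.0, 1.0  # start moving east at 1 m/s

# 100 samples of (forward acceleration, angular rate): no acceleration,
# a steady turn rate of pi/20 rad/s -> a quarter circle over 10 s.
samples = [(0.0, math.pi / 20)] * 100

for accel, omega in samples:
    heading += omega * dt                  # angular rate -> orientation
    speed += accel * dt                    # acceleration -> speed
    x += speed * math.cos(heading) * dt    # velocity -> position
    y += speed * math.sin(heading) * dt
```

After the loop the platform has turned 90 degrees and traced an arc of radius speed/omega (about 6.4 m), so (x, y) ends near (6.4, 6.4), up to Euler-integration error.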

Inertial navigation systems, by definition, compute their navigation solutions without the use of external references. INS served as a prime means of navigation in twentieth-century maritime, aviation, and spaceflight applications. A main drawback of purely inertial navigation is that errors in the differential measurements necessarily accumulate in the navigation solution over time. Thus, even with highly precise inertial measurements, position estimates based on them degrade over time.
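The accumulation is easy to quantify: a constant, uncorrected accelerometer bias, integrated once into velocity and again into position, produces a position error that grows quadratically with time (roughly bias * T^2 / 2). The bias value below is illustrative.

```python
# Double integration of a constant accelerometer bias.
bias = 0.01                      # m/s^2, a small uncorrected bias
dt = 0.01                        # 100 Hz sampling
vel_err, pos_err = 0.0, 0.0
for _ in range(60 * 100):        # one minute of integration
    vel_err += bias * dt         # bias accumulates into velocity error
    pos_err += vel_err * dt      # which accumulates into position error
# After 60 s this tiny bias alone yields ~18 m of position error
# (0.5 * 0.01 * 60^2), illustrating why drift must be bounded externally.
```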

It is now well accepted that a high-accuracy navigation solution requires fusing input from multiple sensors, making use of all available navigation information. Cross-validation allows inconsistent sensor data to be identified and suppressed in the overall navigation solution. Figure 1.7 illustrates a navigation device that takes input from multiple sources, including sensors and map information.
Fig. 1.7

Robust navigation solutions require input from multiple sources

The key to making inertial sensors part of a precision positioning system is developing methods both to minimize free inertial position error growth and to bound accumulated inertial position errors. In Chap. 8 we discuss fusing inertial sensor data with sensors and/or algorithms that provide estimates of secondary inertial state variables such as velocity, heading, and elevation.
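One simple fusion scheme of this kind is a complementary filter, which trusts the integrated gyroscope over short intervals and a secondary heading reference (here an idealized compass) in the long run. This is a toy sketch with simulated readings, not the fusion machinery of Chap. 8.

```python
# Complementary filter blending a drifting gyro with a stable compass.
dt, alpha = 0.01, 0.98    # 100 Hz; alpha sets the gyro/compass balance
true_heading = 0.5        # rad; the platform is not actually rotating
gyro_bias = 0.05          # rad/s of uncorrected gyro drift
heading = 0.0             # fused estimate, deliberately started wrong
gyro_only = 0.0           # pure integration, for comparison

for _ in range(2000):                    # 20 seconds
    gyro_rate = 0.0 + gyro_bias          # gyro reports only its bias
    compass = true_heading               # compass reports the truth
    gyro_only += gyro_rate * dt          # unaided: drifts without bound
    # Trust the gyro over short intervals, the compass in the long run.
    heading = alpha * (heading + gyro_rate * dt) + (1 - alpha) * compass
```

After 20 s the unaided gyro estimate has drifted a full 1.0 rad, while the fused estimate settles within a few hundredths of a radian of the truth: the secondary reference bounds the inertial error instead of letting it grow.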

SLAM techniques are one approach to fusing information from a variety of sensors. In Chap. 9 we introduce SLAM algorithms, which incorporate past path history and derived or available map information to determine the most probable position estimates conditioned on constraints determined by map information. Figure 1.8 shows a conceptual diagram of SLAM. Both the subject's state x_k (termed the subject pose and indicated by successive triangles) and the locations of select landmarks (indicated by stars) are tracked. The basic idea of SLAM is that if the sensors and algorithms can identify a landmark and its location relative to the tracked subject, then any time that landmark is seen again, its location can be used to correct the tracked subject's location.
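The landmark-correction idea can be captured in a one-dimensional toy example: drift accumulated by dead reckoning is revealed, and removed, the moment a previously mapped landmark is re-observed. All quantities below are invented, and the range sensor is assumed error-free.

```python
# Minimal 1-D illustration of the SLAM loop-closure idea.
true_pos, est_pos = 0.0, 0.0
landmark_true = 2.0                       # actual landmark position

def measure_range():
    """Range to the landmark as an idealized (exact) sensor reports it."""
    return landmark_true - true_pos

# First sighting: record the landmark in the map relative to our estimate.
landmark_map = est_pos + measure_range()  # stored at 2.0 (no drift yet)

# Walk a loop; each odometry step carries a small additive error.
for step in [1.0] * 5 + [-1.0] * 5:
    true_pos += step
    est_pos += step + 0.05                # drift accumulates: 0.5 total

# Re-sighting: the landmark now appears at est_pos + range = 2.5, but the
# map says 2.0. The discrepancy is exactly the accumulated drift.
drift = (est_pos + measure_range()) - landmark_map
est_pos -= drift                          # pose snapped back to truth
```

Real SLAM tracks many landmarks in 2-D or 3-D with noisy sensors, so the correction is probabilistic rather than exact, but the mechanism is the same.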
Fig. 1.8

Simultaneous localization and mapping

We discuss a small set of environmental sensors that can be used in SLAM algorithms, including optical sensors, magnetometers, and inertial sensors, and discuss how features are selected. We give an overview of approaches to solving the SLAM problem and then discuss the results of a particular implementation.



Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • Camillo Gentile, National Institute of Standards and Technology, Gaithersburg, USA
  • Nayef Alsindi, Etisalat BT Innovation Center (EBTIC), Khalifa University of Science, Technology and Research (KUSTAR), Abu Dhabi, United Arab Emirates (UAE)
  • Ronald Raulefs, German Aerospace Center, Wessling, Germany
  • Carole Teolis, TRX Systems, Greenbelt, USA
