Microsoft Research announced the 11 recipients of the SensorMap: Browsing the Physical World in Real-Time awards, totaling $700,000 in funding. The objective of this award is to establish a compelling portfolio of research projects that leverage the SensorMap platform to build an open and diverse community of sensor data publishers/consumers and to develop shared infrastructure and tools for data publishing, data management, and data querying and visualization.
We are currently developing Marmite, an end-user programming tool that makes it easy for anyone to create a website or application that combines content from more than one source into an integrated experience, in other words, a web-based mashup. Marmite uses a spreadsheet-like dataflow architecture, in which changing the value of a variable automatically triggers recalculation of the variables that depend on it, and processes data through a series of operators in a manner similar to UNIX pipes, where the output of each stage feeds directly into the input of the next. We are investigating how to extend Marmite to filter, aggregate, refine, and visualize large sets of real-time sensor data in the context of SensorMap. We are using our Hitchhiking system as a sensor source, which will provide estimates of the amount of activity in a place. Success in this line of research will lead to an improved understanding of end-user programming; better techniques for refining, debugging, and visualizing large sets of sensor data; better techniques for managing semantics and resolving conflicts in how sensor data is represented; and an end-user programming tool (with openly available source code) that makes it easy to integrate existing and future sensor sources with SensorMap.
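The pipe-like operator chain described above can be sketched as follows. This is a minimal illustration of the pipe-and-filter idea, not Marmite's actual implementation; the operators and sample data are hypothetical.

```python
# A minimal pipe-and-filter sketch: each operator consumes the previous
# operator's output, as in a UNIX pipeline. All names are illustrative.

def pipeline(data, *operators):
    """Apply a series of operators, feeding each output into the next."""
    for op in operators:
        data = op(data)
    return data

# Hypothetical activity-sensor readings: (place, activity_level)
readings = [("cafe", 42), ("library", 7), ("gym", 25), ("quad", 3)]

busy_places = pipeline(
    readings,
    lambda rows: [r for r in rows if r[1] >= 10],    # filter out low activity
    lambda rows: sorted(rows, key=lambda r: -r[1]),  # rank by activity level
    lambda rows: [name for name, _ in rows],         # project the place names
)
print(busy_places)  # ['cafe', 'gym']
```

Each stage is an ordinary function, so an end user could in principle assemble, reorder, or remove stages without touching the others, which is the appeal of the pipe model for end-user programming.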
Our project is to design a SensorMap interface for CitySense, an urban-scale sensor network testbed funded by the National Science Foundation and being developed by researchers at Harvard University and BBN Technologies. CitySense will consist of 100 wireless sensors deployed on light poles around the city of Cambridge, MA. Each node will consist of an embedded PC, an 802.11a/b/g interface, and various sensors for monitoring weather conditions and air pollutants. Most importantly, CitySense is intended to be an open testbed that researchers from all over the world can use to evaluate wireless networking and sensor network applications in a large-scale urban setting. We are opening up the CitySense testbed to allow remote users to reprogram the nodes, acquire data, and experiment with novel distributed algorithms and routing protocols. However, integrating the SensorMap platform with CitySense raises a number of interesting challenges that we are investigating: dealing with temporary disconnections and failures of sensor nodes, and representing the state of the network appropriately through the SensorMap interface; supporting arbitrary output from the diverse set of applications running on CitySense, which requires some form of standard data description and a tool that lets SensorMap render user-generated data; and developing new approaches to visualizing both the sensor data produced by CitySense (weather, air quality, and other readings) and the state of the CitySense infrastructure itself. We are also working to enable SensorMap to display the quality of network links, contention across different 802.11 channels, and node characteristics such as CPU load, available memory, and uptime.
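A standard data description for node state of the kind mentioned above might look like the following sketch. The field names, values, and structure are assumptions for illustration, not the project's actual schema.

```python
import json

# Hypothetical status record for one testbed node; every field name here
# is illustrative only, chosen to mirror the characteristics in the text.
node_status = {
    "node_id": "citysense-017",
    "location": {"lat": 42.3736, "lon": -71.1097},  # Cambridge, MA
    "connected": True,          # False would mark a temporary disconnection
    "cpu_load": 0.34,           # 1-minute load average
    "free_memory_mb": 48,
    "uptime_s": 86400,
    "links": [                  # per-neighbor 802.11 link quality
        {"peer": "citysense-018", "channel": 6, "loss_rate": 0.02},
        {"peer": "citysense-021", "channel": 11, "loss_rate": 0.15},
    ],
}

# Serialize for publication to a map-based portal; a renderer only needs
# to agree on the field names to draw links, load, and uptime.
payload = json.dumps(node_status)
restored = json.loads(payload)
print(restored["links"][1]["loss_rate"])  # 0.15
```

The point of such a self-describing record is that a generic rendering tool can display any application's output without code changes, which is what "supporting arbitrary output" requires.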
The National Weather Study Project (NWSP) is a large-scale environmental study deploying hundreds of mini weather stations in schools throughout Singapore. In this project, we are developing a sensor grid that connects these weather stations and automatically collects and aggregates the weather data in real time. Using the Microsoft SensorMap platform, we are building tools to simplify the process of publishing and querying the weather data. We also plan to use this system to conduct an important application case study on the correlation between weather patterns and dengue fever occurrences in Singapore.
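A correlation study of the kind planned here boils down to computing a correlation coefficient between two aligned time series. The sketch below uses the standard Pearson formula on made-up numbers; the data is entirely hypothetical.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical weekly rainfall (mm) and reported dengue cases.
rainfall = [120, 80, 200, 150, 60]
cases    = [30, 22, 55, 40, 15]
print(round(pearson(rainfall, cases), 3))  # 0.997
```

A high coefficient on real data would only indicate association, not causation; the case study would need to control for lags and confounders, since mosquito breeding follows rainfall with a delay.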
The 921 earthquake in 1999 devastated the landscape of Taiwan. Land collapsed and many crevices formed in the rock and soil. When typhoons brought heavy rains, the groundwater level rose and surface runoff concentrated, triggering debris flows that caused severe damage to land, property, and life. Hence, Taiwan’s government has invested heavily in debris flow monitoring and warning systems. Since 2002, Feng Chia University has cooperated with the Soil and Water Conservation Bureau to establish and maintain 13 fixed and 2 mobile debris flow monitoring stations across the island, one of the most complete and advanced such networks in the world. In 2006, Tsing Hua University was brought in to extend the wired, spot-coverage sensor systems with wireless sensors capable of covering an entire region of interest. Through our Microsoft project, we are working toward two goals: (1) to make the debris flow data collected through the wired and wireless sensors available to other researchers around the world, and (2) to cooperate with Microsoft to extend SensorMap with facilities for add-on services and event subscriptions/notifications. The features in the second goal will not only facilitate early warnings and notifications of debris flows, reducing damage, but will also serve as a general platform for other applications that require add-on services and event notification.
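An event subscription/notification facility of the kind proposed in the second goal can be sketched as a simple publish/subscribe dispatcher. The event name, station ID, and rainfall threshold below are assumptions for illustration, not values from the actual warning system.

```python
# Minimal publish/subscribe sketch for debris-flow warnings.
# Event names and thresholds are illustrative assumptions.

class EventBus:
    def __init__(self):
        self.subscribers = {}   # event name -> list of callbacks

    def subscribe(self, event, callback):
        self.subscribers.setdefault(event, []).append(callback)

    def publish(self, event, data):
        for callback in self.subscribers.get(event, []):
            callback(data)

bus = EventBus()
alerts = []
bus.subscribe("debris_flow_warning", alerts.append)

# A monitoring station publishes when rainfall exceeds a threshold.
rainfall_mm_per_hr = 85
if rainfall_mm_per_hr > 80:
    bus.publish("debris_flow_warning",
                {"station": "FCU-07", "rainfall": rainfall_mm_per_hr})

print(alerts)  # [{'station': 'FCU-07', 'rainfall': 85}]
```

Decoupling publishers from subscribers this way is what lets the same facility serve both early warnings and unrelated add-on applications, as the paragraph above envisions.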
This effort seeks to serve up to SensorMap two data sources recently developed at the Ohio State University, along with enhancements to support data history, visualization/animation, data quality descriptions, quality control, and query distribution. One data source captures mobility in urban campus-area habitats; the other captures the health/availability of equipment in a testing facility. The integration tools we develop in the process will be released as open source.
In computer and information science, an ontology is a data model that represents a set of concepts within a domain and the relationships between those concepts, and is used to reason about the objects within that domain. Ontologies serve as a form of knowledge representation in artificial intelligence, the Semantic Web, software engineering, and information architecture. As ontologies become the preferred way of storing data, sensor data providers are likely to develop detailed ontologies for their sensor data descriptions. However, currently envisioned and realized frameworks for publishing sensor data, such as SenseWeb, do not provide a way to utilize provider-defined ontological representations of the sensor data. Instead, they require the provider to undertake potentially tedious and complex steps to register their sensor feed data. Our research takes the provider-defined data models and automatically identifies and semantically aligns relevant concepts from them with those of the publisher’s data models. In addition, we are investigating methods by which the provider data models may be appropriately merged into the publisher’s data models, thereby transforming the publisher’s possibly minimal sensor types into richer, explicit data models. This approach will not only alleviate the burden on the data provider by allowing reuse of existing data representations, but will also reduce the burden on the data publisher by avoiding the need to develop detailed data models for the different sensor types. Furthermore, the richer ontologies that result may be used to support refined queries of the sensor data and new combinations with other existing data feeds. Data from third-party sensor feeds will be obtained and used primarily for evaluation.
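At its simplest, aligning a provider's concepts with a publisher's can be sketched as lexical similarity matching. Real ontology alignment draws on much richer semantics (structure, synonyms, instance data); the function and concept names below are purely illustrative.

```python
from difflib import SequenceMatcher

def align_concepts(provider, publisher, threshold=0.6):
    """Match each provider concept to its closest publisher concept by
    lexical similarity; a crude stand-in for semantic alignment."""
    matches = {}
    for p in provider:
        best, score = None, 0.0
        for q in publisher:
            s = SequenceMatcher(None, p.lower(), q.lower()).ratio()
            if s > score:
                best, score = q, s
        if score >= threshold:
            matches[p] = best          # confident enough to align
    return matches

# Hypothetical concept names from two independently defined data models.
provider_concepts = ["AirTemperature", "WindVelocity", "BatteryLevel"]
publisher_concepts = ["temperature", "wind_speed", "humidity"]
print(align_concepts(provider_concepts, publisher_concepts))
# {'AirTemperature': 'temperature'}
```

Note that "WindVelocity" and "wind_speed" mean the same thing but fail the lexical test, which is precisely why the project pursues semantic rather than string-based alignment.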
Part of this study includes a proposal to the University of Georgia’s campus transit board to set up a wireless sensor network for tracking the campus bus shuttles, with plans to publish the sensor data on the SensorMap portal. This research is significant because it represents a major step toward automating the publication of sensor data feeds with minimal human effort.
Monitoring mobile activities is not adequately supported on SensorMap at present. Current applications on SensorMap focus primarily on static sensors, such as cameras, thermometers, or parking spot detectors, which provide only external measurements of mobile activities, such as pictures of highway traffic. Being external to the activity they measure, they generate what we call third-person accounts of measured activities. In contrast, a significant category of future sensing applications will rely on sensors affixed to the very mobile entities whose activities they monitor. These sensors might be attached to cars, cell phones, animals, or people. Fusing outputs from such mobile sensors can produce activity views as seen by the participants themselves, providing what we call first-person accounts of the activity. Viewing mobile events “in the first person” is invaluable for several scientific, social, and personal applications. For example, an ecologist could monitor bird behavior as recorded by sensors located on the birds themselves, and a medical practitioner could obtain longitudinal measurements of a patient’s progress as recorded by patient-resident sensors. We intend to build the software infrastructure (including data publishing, search, and visualization tools) for monitoring mobile activities in the first person, and to interface it with SensorMap. Two example applications are being implemented: one biological and one social. The first maps movements and behaviors of birds from sensors attached to local species (namely, northern cardinals) near the University of Illinois campus. These sensors are currently used to document and analyze the social behavior and vocalization of birds for biological research purposes. The second application provides maps of human activity as recorded via wearable sensing devices.
We are exporting a virtual space that offers abstractions governing data sharing, privacy settings, and the scope of user and data visibility. Both applications, together with their supporting data publishing, search, and visualization tools, are being made publicly available for use with SensorMap. A client-side activity browser plug-in is being developed to interact with data feeds from mobile activity sensors.
Recent developments in technology, together with widely observed climate change phenomena, have revealed coral reef ecosystems to be critical areas greatly susceptible to the impact of global climate variations and other man-made influences, but also early indicators of such events. The need to understand and protect such delicate ecosystems has created an urgent demand for sensor network technologies to perform essential environmental monitoring and information collection. This data can then be analyzed by higher-level systems such as the Semantic Web, an evolving extension of the World Wide Web in which content is expressed not only in natural language but also in a form that software agents can understand, interpret, and use, permitting them to find, share, and integrate information more easily. The Semantic Web will eventually provide predictive information on destructive events such as coral bleaching. Our SensorMap project on the Great Barrier Reef provides a valuable interface between sensors and the higher-level objectives of multidisciplinary research teams around the world, from sensor network researchers to marine biologists. Utilizing the core infrastructure of a sensor network deployment currently in progress on the Great Barrier Reef, this project will aid in the collection and dissemination of a diverse range of unique sensor data.
We are addressing two key issues that we believe exist in SensorMap: (1) privacy guarantees and (2) quality control for search results. Our research explores these two issues in the context of SensorWeb by producing two valuable sensor streams from our AlarmNet and MetroNet projects, which contain both public and private information in indoor and urban environments; a software system called StreamPublish that supports publishing of private data; and a software system called StreamRank that supports the search, aggregation, and interpretation of data streams.
The SensorMap platform, recently developed by Microsoft Research, enables users around the world to publish and query a diverse range of geographically distributed sensor data in near real time. The current infrastructure, however, offers limited query capabilities: users can only select geographic regions, sensor types, and zoom levels. SensorMap satisfies these requests by collecting data from the selected sensors in the region of interest. In this project, we are pushing SensorMap’s capabilities much further by giving users the ability to define sophisticated high-level events over the low-level sensor data. For example, a parent may register to receive an alert if it starts raining on the soccer field where a child is playing, or if a spouse is stuck in traffic. There are several challenges in enabling such an event detection and notification system. The key challenge comes from the uncertainty of the sensor data. Because sensors are brittle devices spread over large geographic regions, the data they produce is often inaccurate or even missing. The event detection engine must gracefully handle such input errors by enabling users to specify their desired trade-off between detection coverage and precision, associating confidence levels with detected events, and properly displaying events with different levels of confidence. In this project, we are researching the above challenges in order to build a system that enables flexible event detection and notification in a world-wide web of sensor data.
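One simple way to attach a confidence level to a detected event, in the presence of missing readings, is to report the fraction of sensors that support the event. The thresholds and sample data below are illustrative assumptions, not the project's actual detection engine.

```python
def detect_event(readings, threshold, min_confidence=0.5):
    """Declare an event if the fraction of readings above `threshold`
    is at least `min_confidence`. Missing readings (None) lower the
    confidence rather than being silently dropped."""
    if not readings:
        return False, 0.0
    exceeding = sum(1 for r in readings if r is not None and r > threshold)
    confidence = exceeding / len(readings)  # missing values count against
    return confidence >= min_confidence, confidence

# Rain sensors near a soccer field; None marks a failed sensor.
rain_mm = [2.1, None, 3.0, 2.7, None]
raining, conf = detect_event(rain_mm, threshold=1.0)
print(raining, conf)  # True 0.6
```

Raising `min_confidence` trades coverage for precision: fewer false alarms, but events supported by only a few working sensors go unreported. Exposing that knob to users is the trade-off the paragraph above describes.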
Air pollution is an important phenomenon affecting the lives of many people. Like most U.S. cities, the Nashville metropolitan area monitors air quality with a small network of fixed stations that provide only low-resolution sampling of the environment. We are designing a prototype Mobile Air Quality Monitoring Network (MAQUMON) composed of sensor nodes mounted on cars. When the car is in motion, the device samples the pollutants every few seconds and stores the results tagged with a location and time stamp; when the car is parked, samples are taken a few times an hour. When a car is within the coverage area of an available wireless local area network hotspot, all data is uploaded, processed, and published on the SensorMap portal. Given a sufficient number of nodes and diverse mobility patterns, a detailed picture of the air quality in a large area could be obtained at low cost. We are building five prototype sensors from commercial off-the-shelf components; developing the necessary infrastructure to measure the pollutants and to gather, process, and visualize the data; and deploying the system in the Nashville metropolitan area to provide a continuous live data feed on the SensorMap portal. We will push the limits of the SensorMap project by proposing ways to handle mobile sensors and by developing more advanced data collection procedures and new visualization methods.
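The sampling-and-upload policy described above (frequent samples in motion, sparse samples when parked, bulk upload at hotspots) can be sketched as follows. The intervals, coordinates, and readings are hypothetical.

```python
# Adaptive sampling sketch for a car-mounted air quality node.
# Intervals are assumptions matching "every few seconds" / "a few
# times an hour" in the description above.
SAMPLE_INTERVAL_MOVING_S = 5
SAMPLE_INTERVAL_PARKED_S = 1200

def next_sample_interval(speed_kmh):
    """Pick the sampling interval from the vehicle state."""
    if speed_kmh > 0:
        return SAMPLE_INTERVAL_MOVING_S
    return SAMPLE_INTERVAL_PARKED_S

def take_sample(ppm, lat, lon, timestamp):
    """Tag a pollutant reading with location and time."""
    return {"ppm": ppm, "lat": lat, "lon": lon, "t": timestamp}

def maybe_upload(buffer, hotspot_reachable):
    """At a hotspot, hand off buffered samples; otherwise keep them."""
    if hotspot_reachable:
        return list(buffer), []          # (to_upload, remaining buffer)
    return [], list(buffer)

buffer = [take_sample(0.042, 36.1627, -86.7816, 1000),   # Nashville area
          take_sample(0.038, 36.1630, -86.7820, 1005)]

print(next_sample_interval(50))          # 5 (driving)
to_upload, buffer = maybe_upload(buffer, hotspot_reachable=True)
print(len(to_upload), len(buffer))       # 2 0
```

Buffering locally and uploading opportunistically is what makes the scheme cheap: nodes need no cellular link, only occasional Wi-Fi coverage.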