By Rob Knies
July 5, 2006 5:00 AM PT
You’re in pre-movie limbo. It’s 6 p.m., and the film you want to see starts at 7. You’ve got enough time to grab a quick dinner—if you don’t have to wait for a table. You know the neighborhood around the theater has some good restaurants, but there’s no time to waste. Which ones could take your order right away?
Someday soon, perhaps, SensorMap will get you to your movie, relaxed and well-fed.
SensorMap is a platform for publishing and searching real-time data, developed by Microsoft Research’s Networked Embedded Computing group. It shows promise in combining static and real-time data so that users receive search results informed and enriched by their particular information needs.
“You can go to a search engine and search for all the restaurants in Seattle,” says Suman Nath, a researcher based in Microsoft Research’s Redmond lab. “But what you can’t do today is to find me all the restaurants in Seattle that have a waiting time of less than 30 minutes.
“That part is a real-time component, and we want to enable this type of scenario, so that you can search based not just on static data, but also on real-time data.”
SensorMap will be released to the public during the Faculty Summit, an annual event sponsored by Microsoft Research’s External Research & Programs group that brings together academic researchers, faculty, and Microsoft researchers and product-group personnel to discuss areas of mutual concern. This year’s event will be held July 17-18 in Redmond.
The SensorMap platform, part of the SenseWeb project, will serve as a portal for those who want to publish real-time data to make it searchable. And the gathering of academics for the Faculty Summit provides a perfect setting for a coming-out party.
“Many researchers in different universities do research on sensor networks and have deployed sensors for their experiments,” Nath explains. “While doing experiments, they are collecting information about different things, and we think SensorMap can be a good portal for them to publish their data.”
It’s all about collecting the data and sharing the collection.
“Why isn’t there this type of application now?” Nath asks. “The main problem is there is not enough data. Basically, there is this chicken-and-egg problem. There is no application because there is no data, and there is no data because people don’t know how useful that data is. We are trying to show people that if you give us live data, then we could do interesting stuff. Once the data is there, other people can write applications on top of it.”
“SensorMap allows users to easily publish their live data from their sensors,” Nath explains. “We actually don’t care about what that particular data source is. It can be sensors, it can be someone typing something, it can be some other thing that provides data that change over time. We make that part simple so we can encourage people to put more and more data on the Web.”
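The publishing model Nath describes, in which any time-varying source first registers a description of itself and then pushes readings, can be imagined roughly as follows. This is a minimal sketch, not SensorMap’s actual API; every name here (`SensorPortal`, `register`, `publish`, the sensor IDs) is invented for illustration.

```python
import time

class SensorPortal:
    """Toy stand-in for a SensorMap-style portal: sources register
    descriptive metadata once, then push time-stamped readings."""

    def __init__(self):
        self.metadata = {}   # sensor_id -> static description
        self.latest = {}     # sensor_id -> (timestamp, value)

    def register(self, sensor_id, description):
        # Static metadata: what the sensor is, where it is, its units.
        self.metadata[sensor_id] = description

    def publish(self, sensor_id, value):
        # Live data: anything that changes over time, whether it comes
        # from hardware or from a person typing in a number.
        self.latest[sensor_id] = (time.time(), value)

portal = SensorPortal()
portal.register("cafe-42", {"type": "wait-time", "unit": "minutes",
                            "lat": 47.61, "lon": -122.33})
portal.publish("cafe-42", 20)   # current wait: 20 minutes
```

The point of the sketch is the split Nath emphasizes: the portal does not care what produces the value, only that the source described itself once and keeps the live value fresh.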
Once the data is collected, SensorMap indexes that data—“the most difficult part,” Nath says—to make it searchable. A database holds geographically indexed sensor descriptions and works in tandem with a sensor-data publishing Web service and a server-side query processor.
“We are trying to make all these things easier,” Nath says. “We are providing some tools for the publisher to publish data, we are providing this portal, where you can go and easily find that information, and, in the middle, the indexing and processing, this is all done by us.”
A few university projects have recently used geographic Web interfaces to annotate sensors, as, for example, in a UCLA project. And there are Web-based maps that offer some sensor information, such as weather data or traffic information. But those maps are not integrated. SenseWeb organizes sensor data so users can quickly locate relevant information, just as a current search engine does for text documents. The vision of searching the physical world with live sensor data was outlined in 2004 by Feng Zhao and Leonidas Guibas in their book Wireless Sensor Networks: An Information Processing Approach.
“What we are trying to do here,” Nath says, “is to bring all different types of sensors together. And we’re actually trying to provide more useful things—search based on geographic region or based on keywords. Also, depending on the zoom level of a map, we are trying to aggregate data. If you’re querying temperature sensors statewide, we can provide individual sensor information, but we also can show you the average temperature. That sort of aggregation is absent in other industry applications.”
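The zoom-dependent aggregation Nath mentions, with individual readings up close and a summary from far away, can be sketched in a few lines. This is purely illustrative and assumes nothing about SensorMap’s real implementation; the function name and sample readings are invented.

```python
def temperature_view(readings, zoomed_out):
    """Toy zoom-dependent aggregation: return individual sensor
    readings when zoomed in, a single average when zoomed out."""
    if zoomed_out:
        return {"average": sum(readings.values()) / len(readings)}
    return readings

# Hypothetical statewide temperature sensors (degrees Fahrenheit):
sensors = {"seattle": 61.0, "spokane": 75.0, "yakima": 80.0}
temperature_view(sensors, zoomed_out=True)    # {'average': 72.0}
temperature_view(sensors, zoomed_out=False)   # per-sensor readings
```

A real system would aggregate per map tile and support more operators than a mean, but the idea is the same: the zoom level selects how much the data is summarized.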
The expectation is that the SensorMap platform will enable data publishers to define the uses for their contributions.
“Ideally,” Nath says, “what we want to do is to make this platform open in the sense that if you want to add new types of sensors, you can define your own icon and even define what types of things can be done with that sensor. For example, is it OK to compute an average with the data being shared? In the future, we’ll add a control so you can specify who would be able to see your data—just you, your friends, or the world.”
The technology behind SensorMap has come together rather quickly; work didn’t really coalesce until January. The SenseWeb team built a small prototype as a proof of concept and displayed it in early March during TechFest, Microsoft Research’s annual technology fair for Microsoft employees. SensorMap received good feedback and, after some enhancements, was showcased during the International Conference on Information Processing in Sensor Networks, sponsored by the Association for Computing Machinery and the Institute of Electrical and Electronics Engineers and held April 19-21 in Nashville, Tenn. Again, the reception was positive, as it was in early May during Microsoft Research’s Silicon Valley Road Show.
“So far, we have got only good responses from people,” Nath smiles, “and that motivated us to make it public.”
Not only was SensorMap received well, but it also has elicited interest from Microsoft product groups.
“Windows Live Local™ was the default one,” Nath recalls. “We thought they might be interested in that. They are interested in the technology and the data. They said that if the technology matures, they could put it into their system.
“They also were interested in the data. They thought that if we have this type of platform, then more people will be interested in putting data online. And they can probably use that data to build another application on top of it.”
That group also was intrigued by some of the techniques the SensorMap team employs. Its existing system does not yet handle issues such as caching for live data streams, and SensorMap’s techniques could enhance that.
“Every time we make a query,” Nath explains, “we validate it.”
Thus, SensorMap can help answer a question such as, “How many restaurants are in this region of a city?” You simply click a few times on a map to create a polygon encompassing a region, then search within the specified area. The technology also enables searching using free text and searching by sensor type. Again, the more data available, the more useful the results.
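The polygon search described above boils down to a classic point-in-polygon test over the sensors’ locations. Here is a minimal sketch using the standard ray-casting algorithm; the restaurant names and coordinates are made up, and a production system would use proper geographic indexing rather than testing every sensor.

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: does the point fall inside the polygon the
    user drew with a few map clicks? Polygon is a list of (x, y)
    vertices; edges wrap around from the last vertex to the first."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edges that a horizontal ray from the point crosses.
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

# Hypothetical restaurants at (longitude, latitude):
restaurants = {"noodle-bar": (-122.33, 47.61), "far-away": (-120.0, 46.0)}
# A rectangular region drawn around downtown Seattle:
region = [(-122.4, 47.5), (-122.2, 47.5), (-122.2, 47.7), (-122.4, 47.7)]
[name for name, pos in restaurants.items() if point_in_polygon(pos, region)]
# -> ['noodle-bar']
```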
“A sensor has two types of data,” Nath says. “One is the live data, which changes over time. The other is metadata that describes the sensor itself. Over time, that metadata doesn’t change that much.
“In our system, we have one central database where we store all the metadata. When you make a query, the query first goes to that central database, and it returns a list of sensors relevant to the query. Another module talks to those sensors directly, gets the live data, processes that data, and sends it back to the user.”
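The two-phase flow Nath describes, a metadata lookup in the central database followed by live-data retrieval from the matching sensors, can be sketched as below, using the restaurant-wait-time scenario from the opening of the article. All names are invented stand-ins, not SensorMap’s real interfaces.

```python
def query(central_db, live_fetch, wanted_type, max_value):
    """Toy two-phase query: (1) the central database returns sensors
    whose metadata matches the query; (2) a second module contacts
    those sensors for live values and filters on them."""
    # Phase 1: metadata lookup against the central database.
    candidates = [sid for sid, meta in central_db.items()
                  if meta["type"] == wanted_type]
    # Phase 2: fetch live data from each candidate, then filter.
    return [sid for sid in candidates if live_fetch(sid) < max_value]

central_db = {
    "cafe-1":   {"type": "wait-time"},
    "cafe-2":   {"type": "wait-time"},
    "bridge-9": {"type": "traffic"},
}
live_values = {"cafe-1": 45, "cafe-2": 10, "bridge-9": 3}

# "All the restaurants with a waiting time of less than 30 minutes":
query(central_db, live_values.get, "wait-time", max_value=30)
# -> ['cafe-2']
```

The design choice this mirrors is that only the slowly changing metadata is indexed centrally; the fast-changing live values are fetched on demand from the sensors themselves.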
There are challenges, of course, in coping with such a dynamic set of information.
“The biggest problem,” Nath says, “is the query-processing part. And the biggest challenge in this whole space is scalability. If you have thousands of sensors and if you have many, many users, how can we scale efficiently to that? How can we cache data on the back end so we can reuse that data for scaling? There will be lots of issues that we need to solve.”
Things are progressing nicely, though. The team released the latest version of MSR Sense, the Microsoft Research Network Embedded Sensing Toolkit, in January. MSR Sense is a collection of software tools that enable users to collect, process, archive, and visualize data from a sensor network. A second toolkit is under construction that will enable users to build their own SensorMaps. And other refinements are in the works.
“There are many interesting things to be done,” Nath says. “Right now, we search only on the metadata. We have to explore how we can search more effectively on the live data.
“Also, we show all the data as points, but you can imagine a service where we generate some sort of map like a temperature-contour map or a gradient map and overlay those on top of the SensorMap.”
All in good time. For now, Nath and colleagues are pretty happy with what they’ve achieved thus far.
“Right now, you search only for static data,” he says, “but I hope there will someday be another box on a search engine where you can put in some real-time information. You could ask some complicated queries.
“That’s the most exciting part of this: You’ll have lots of data. What can you do with that?”