More and more, it’s a data-driven world. This is particularly true in embedded systems, where devices such as RFID tags, sensors, robots, process machinery, handheld devices and phones constantly generate data. The key issue is how to sift through this data to find, process and operate on the truly valuable pieces of information. The team is focused on addressing this problem. Its goal is to unlock the potential of live and historical data, thereby enabling on-the-fly business insight.
Complex Event Processing - The world of event-driven applications is characterized by high data rates, continuous queries and millisecond latency requirements that make storing data in a relational database impractical. The field of Complex Event Processing (CEP) focuses on handling streams of data by analyzing and correlating events in real time. Our work on Complex Event Processing contributes to Microsoft SQL StreamInsight, which offers deterministic temporal semantics, rich query semantics and high performance. We are particularly interested in enabling a “Monitor/Manage/Mine” virtuous cycle. We “monitor” data from multiple sources for meaningful patterns, trends, exceptions and opportunities by analyzing and correlating data in-flight. We “manage” your business by performing low-latency analytics and responding quickly to areas of opportunity or threat. And we help move your business toward a predictive model by “mining” in-flight and historical data to continuously refine and improve your analytics definitions.
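To make the “monitor” step concrete, the sketch below correlates two in-flight event streams within a time window, without landing them in a database. This is a minimal illustration only, not StreamInsight’s actual LINQ-based query API; the event kinds, thresholds and window length are hypothetical.

```python
from collections import deque

WINDOW_SECONDS = 5  # illustrative correlation window

def correlate(events):
    """Yield (temp_event, pressure_event) pairs whose timestamps fall
    within WINDOW_SECONDS of each other and match a joint anomaly pattern.
    `events` is an iterable of (timestamp, kind, value), ordered by time."""
    recent_temps = deque()
    for ts, kind, value in events:
        if kind == "temperature":
            recent_temps.append((ts, value))
        elif kind == "pressure":
            # Drop temperature readings that have aged out of the window.
            while recent_temps and ts - recent_temps[0][0] > WINDOW_SECONDS:
                recent_temps.popleft()
            for t_ts, t_val in recent_temps:
                if t_val > 90 and value > 200:  # hypothetical pattern
                    yield (t_ts, t_val), (ts, value)

stream = [
    (0, "temperature", 95),
    (2, "pressure", 210),
    (9, "pressure", 250),  # too late to pair with the reading at t=0
]
matches = list(correlate(stream))  # one correlated pair
```

The point of the sketch is that state is bounded by the window, so latency stays in milliseconds regardless of how long the streams run.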
Edge-to-Cloud - Our current focus is enabling real-time analytics on distributed assets and creating analytics that span from edge to cloud. In many verticals (such as manufacturing, automotive and retail) there is a need to process and correlate data in real time to build low-latency analytics for rapid decision making. However, the data being processed are typically generated by distributed devices with limited memory or connectivity, and it is often not possible or desirable to move these data to a central location for processing. Our approach provides local, low-latency complex event processing at the edge of the system, where the data are created, thus reducing the size and frequency of data to transport.
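One way to picture the edge-side reduction is pre-aggregation: rather than shipping every raw reading upstream, an edge node emits one summary per fixed window. The sketch below is an assumption-laden illustration of that idea; the window length and the summary shape are not taken from any product.

```python
def edge_aggregate(readings, window=10):
    """Group (timestamp, value) readings into fixed-size time windows and
    return one (window_start, count, min, max, mean) summary per window,
    so only summaries need to travel to the cloud."""
    buckets = {}
    for ts, value in readings:
        start = ts - ts % window  # align to the window boundary
        buckets.setdefault(start, []).append(value)
    return [
        (start, len(vals), min(vals), max(vals), sum(vals) / len(vals))
        for start, vals in sorted(buckets.items())
    ]

raw = [(1, 20.0), (4, 22.0), (12, 21.0), (15, 23.0)]
# Two windows -> two upstream messages instead of four raw readings.
summaries = edge_aggregate(raw)
```

In a real deployment the edge node would also forward raw events for exceptional conditions, but the routine traffic shrinks to one message per window.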
Software Quality - Empower software development teams to gain insight from product, process, people and customer data. Enable all disciplines, at all levels, to improve product quality and increase productivity at every phase of the software development lifecycle.
IT monitoring - Enable the analysis of traces, performance parameters, alerts and similar telemetry. Correlate results across machines to optimize and troubleshoot at scale.
Manufacturing - Enable asset-based monitoring and aggregation of machine-borne data, such as sensor-based observation of the plant floor. Generate alerts the moment something goes wrong. Provide proactive, condition-based maintenance for key equipment.
Web analytics - Provide immediate click-stream pattern detection and enable low-latency extraction of user behavior and interactions across large-scale deployments.
Transportation - Enable smart local telemetry for charging electric vehicles and analyzing local traffic patterns, as well as global fleet-management analysis and drill-down.
Utilities - Gain operational and environmental efficiencies by moving to smart grids. Support multiple levels of aggregation along the grid. Respond immediately to variations in energy or water consumption to minimize or avoid service disruptions.
Healthcare - Enable patient condition monitoring through portable devices. Offer central management and emergency response through global correlation and analytics.
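The verticals above share a common primitive: raise an alert the moment a monitored value violates a condition, rather than after a batch job runs. A minimal sketch, assuming a simple threshold condition; real deployments would use richer patterns (trends, correlations, absence of expected events).

```python
def monitor(readings, threshold):
    """Yield a (timestamp, value) alert as soon as a reading exceeds
    the threshold, instead of waiting for end-of-batch analysis."""
    for ts, value in readings:
        if value > threshold:
            yield (ts, value)

# Hypothetical sensor feed; only the out-of-range reading triggers an alert.
alerts = list(monitor([(1, 0.5), (2, 1.7), (3, 0.9)], threshold=1.5))
```

Because the generator emits each alert at the offending reading, downstream responders (a plant-floor operator, a grid controller, an emergency service) act with per-event rather than per-batch latency.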