Opening Plenary Session
Research in the 21st Century
|11:45–1:00|Lunch and Brown Bag Sessions|
Five Years of Faculty Fellowships: A Retrospective
In 2005, Microsoft Research created a fellowship program for research faculty that was designed as an investment in the development of talent critical to the future progress of the computing disciplines. Now, after five years of activity and with 25 fellows named, this session examines some of the successful researchers and activities enabled by the awards, as well as future enhancements envisioned for the program.
Presentation: Tom McMail, Microsoft Research Faculty Fellows
Needles in a Haystack: Reading Human Evolution in the Human Genome
The genomes of humans and our closest living relatives allow us to seek the genomic events that drove the unique evolution of our species. One such quest will be described, highlighting the intimate interplay between computation and experiments that allowed it to bear fruit.
Some Vignettes from Learning Theory
A great deal of recent research on computational learning theory and its applications focuses on a paradigm called "regret minimization." Regret-minimizing algorithms solve repeated decision problems (for example, which medical treatment to administer to a patient) and learn from their past mistakes, improving their performance as they gain experience. It is possible to design these algorithms to meet surprisingly strong provable worst-case guarantees, but decision problems "in the wild" often force us to reconsider the assumptions underlying these algorithms and to expand the theory in unexpected ways. In this discussion, we survey a few recent examples that illustrate how the theory is growing and maturing under the influence of applications from domains such as Web search and advertising.
Presentation: Robert Kleinberg, Some Vignettes from Learning Theory
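As a concrete illustration of the regret-minimization paradigm described in the abstract above, here is a minimal Python sketch of the classic multiplicative-weights (Hedge) algorithm. It is not drawn from the talk itself; the function name, the loss-vector input format, and the learning rate `eta` are illustrative assumptions.

```python
import math

def multiplicative_weights(losses, eta=0.5):
    """Run the multiplicative-weights (Hedge) algorithm over a sequence of
    rounds. losses[t][i] is the loss of action i in round t, assumed to lie
    in [0, 1]. Returns the algorithm's expected total loss, the total loss
    of the best fixed action in hindsight, and the regret (their difference)."""
    n = len(losses[0])
    weights = [1.0] * n
    total_loss = 0.0
    for round_losses in losses:
        total = sum(weights)
        probs = [w / total for w in weights]
        # Expected loss of the randomized algorithm this round.
        total_loss += sum(p * l for p, l in zip(probs, round_losses))
        # Penalize each action in proportion to the loss it just incurred.
        weights = [w * math.exp(-eta * l) for w, l in zip(weights, round_losses)]
    best_fixed = min(sum(ls[i] for ls in losses) for i in range(n))
    return total_loss, best_fixed, total_loss - best_fixed
```

With an appropriate learning rate, the regret of this scheme grows only on the order of the square root of the number of rounds, which is the kind of worst-case guarantee the abstract refers to.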
Interactive and Collaborative Data Management in the Cloud
The scientific data management landscape is changing. Improvements in instrumentation and simulation software are giving scientists access to data at an unprecedented scale. This data is increasingly being stored in data centers running thousands of commodity servers. This new environment creates significant data management challenges. In addition to efficient query processing, the magnitude of the data and queries calls for new query management techniques such as runtime query control, intra-query fault tolerance, query composition support, and seamless query sharing. In this talk, we present our ongoing research efforts to provide scientists the tools they need to analyze data at these new scales and in these new environments. We also briefly discuss some of the other research projects in our group.
Presentation: Magdalena Balazinska, Interactive and Collaborative Data Management in the Cloud
Microsoft Cloud Computing Platform
Cloud computing uses data centers to provide on-demand access to services such as data storage and hosted applications that provide scalable Web services and large-scale scientific data analysis. While the architecture of a data center is similar to that of a conventional supercomputer, the two are designed with very different goals. This talk highlights basic cloud computing system architectures and application programming models, including general concepts of data center architecture. We examine cloud computing and storage models with a detailed look at the Microsoft Azure cloud computing platform.
Presentations: Roger Barga, Dennis Gannon, Microsoft Cloud Computing Platform
Core Computer Science
The Design Expo is a Microsoft Research forum where the top graduate design institutions showcase their prototype interaction design ideas. Microsoft Research sponsors a semester-long class at leading interdisciplinary design schools and invites the top class projects to present their ideas as part of the Faculty Summit.
Toward Zero Carbon Energy Production
While the administration of U.S. President Obama has committed US$1.2 billion to green energy research and development, approximately one thirtieth of the U.S. Department of Energy's annual budget, both climate change and energy security remain critical problems to solve. How do we avoid investing in energy sources that yield unintended consequences? What if the energy sources that are getting the most attention are between 25 and 1,000 times more polluting than the best available options? The science community is assessing not only the potential for delivering energy for electricity and vehicles from different sources, but also how those sources affect global warming, human health, energy security, water supply, space requirements, wildlife, water pollution, reliability, and sustainability. Join Stanford University Professor Mark Z. Jacobson, Conservation International Chief Advisor Michael Totten, and Oregon State University Dean and Professor Mark Abbott for an interactive workshop dedicated to using technology to achieve a whole-systems evaluation of competing alternative energy options.
Presentation: Mark Abbott, Climate, Energy, and Economy
|Earth, Energy, and Environment|
Systems Biology and Transformative Healthcare
Moderator: Simon Mercer, Microsoft Research
This session investigates the role of computing in the fields of biological sciences and health care.
Systems Biology and Biotechnology of Microorganisms: Making Systems Biology Work
Systems biology has been changing the paradigm of biological and biotechnological research. It is now possible to perform so-called systems metabolic engineering by integrating metabolic engineering with systems biology. This lecture presents general strategies for systems metabolic engineering and several examples of the production of various bioproducts. Systems metabolic engineering can be considered one of the success stories of systems biology, and it will become an essential strategy for developing microbial processes for the production of chemicals and materials, thus helping us move toward a sustainable bio-based economy.
Presentation: Sang Yup Lee, Systems Biology and Biotechnology of Microorganisms: Making Systems Biology Work
Interpreting Personalized Genetic Information
Although genome-wide association studies (GWAS) are rapidly increasing in number, numerous challenges persist in identifying and explaining the associations between loci and quantitative phenotypes. This project is developing tools to integrate gene association data with protein network information to identify the pathways underlying a patient's genotype. These methods will elevate the study of gene association to a new study of "pathway association." The project is joint work with Richard Karp in the EECS Department at the University of California, Berkeley. Our proposed solution is to explain the associations captured by GWAS in terms of known gene and protein interactions. New technologies have provided a wealth of interaction data ranging from the proteome (protein-protein interaction networks) to the transcriptome (protein-DNA interactions) to the metabolome (metabolic pathways). We will develop computational tools that query these independent networks to identify pathways and sub-networks of interactions underlying the observed set of genome-wide associations. This framework is intended to improve the power of current GWAS by identifying genes in loci with borderline significance that nonetheless have close network proximity to significant genes. Furthermore, it will provide a list of putative physical pathways incorporating the causal genes necessary to affect the phenotype.
Presentation: Trey Ideker, Pathway Association Analysis
|Health and Wellbeing|
Panel: Energy-Efficient Computing: Hype or Science?
This panel provides a forum for lively debate about the directions, challenges, and ideas about building energy-efficient computing systems. The experts examine energy and power issues in hardware and systems design, interconnect and optics, networking fabric, embedded systems, and software design.
Presentation: Feng Zhao, Introduction
Presentation: Trishul Chilimbi, Energy-Efficient Computing: Hype or Science?
Presentation: Fred Chong, Energy-Efficient Computing: Emerging Technologies
Presentation: Rajesh Gupta, Three Observations and Three Lessons from Embedded Systems
Presentation: Philip Levis, Lifting the Energy Veil
Presentation: Chuck Thacker, Energy-Efficient Computing: Hype or Science?
|Core Computer Science|
Highlights from Asia on eScience
Moderator: Lolan Song, Microsoft Research
This session presents some highlights from eScience research in Asia. Three speakers from universities in the Asia-Pacific region discuss their research in environmental science, bioinformatics, and other areas.
The Health-e-Waterways Project: An Exemplar Model for Environmental Monitoring and Resource Management
Numerous state, national, and international agencies are advocating the need for standardized frameworks and procedures for environmental accounting. The Health-e-Waterways project provides an ideal model for delivering a standardized approach to the aggregation of ecosystem health monitoring data and the generation of dynamic, interactive reports (that link back to the raw data sets). The system combines Microsoft Virtual Earth and Microsoft Silverlight to present environmental reports that not only save agencies significant time and money, but can also be used to guide regional, state, and national environmental policy development. Current work includes linking management action strategies to specific spatio-temporal indicators to identify the extent of impact of management actions and investments, enabling adaptive management strategies based on environmental outcomes.
Presentation: Jane Hunter, The Health-e-Waterways Project
A Semantic and "Kansei" Computing System for Analyzing Global Environments
In the design of multimedia database systems, one of the most important issues is how to search and analyze media data (images, music, video, and documents) according to users' impressions and contexts. We introduce a "Kansei" and semantic associative search method based on our Mathematical Model of Meaning (MMM). The concept of "Kansei" includes several meanings related to sensitive recognition, such as "impression," "human senses," "feelings," "sensitivity," "psychological reaction," and "physiological reaction." This model realizes "Kansei" processing and semantic associative search for media data according to various contexts. It is applied to dynamically compute semantic correlations between images, music, video, and documents with a context interpretation mechanism. The main feature of this model is to realize semantic associative search and analysis in a 2,000-dimensional orthogonal semantic space with semantic projection functions. This space is created for dynamically computing semantic equivalence or similarity between media data. One important application of MMM is "Global Environment-Analysis," which aims to evaluate various influences caused by natural disasters in global environments. We have conducted several experiments with a global environment-analysis system based on MMM for natural disasters, especially mud-flow disasters. The results show the feasibility and effectiveness of our "Semantic Computing System" with MMM for realizing deep analysis of global environments.
Presentation: Yasushi Kiyoki, A Semantic and "Kansei" Computing System for Analyzing Global Environments
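To make the idea of context-driven semantic projection more concrete, here is a hypothetical Python sketch, not the published MMM: each media item is a vector over labeled semantic dimensions, the context selects a subspace of dimensions, and similarity is computed only within that projected subspace. All names and the data layout are assumptions for illustration.

```python
def semantic_search(items, query, context, dim_labels):
    """Rank media items against a query inside the subspace of semantic
    dimensions activated by the context. items maps name -> feature vector;
    query is a vector over the same dimensions; context is a set of
    dimension labels; dim_labels gives the label of each dimension."""
    # Indices of the dimensions the current context activates.
    active = [i for i, label in enumerate(dim_labels) if label in context]

    def project(vec):
        # Semantic projection: keep only the context-selected dimensions.
        return [vec[i] for i in active]

    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    q = project(query)
    # Higher inner product in the projected subspace = more semantically similar.
    ranked = sorted(items.items(), key=lambda kv: dot(project(kv[1]), q), reverse=True)
    return [name for name, _ in ranked]
```

The key property this sketch captures is that changing the context changes the projection, so the same items and query can be ranked differently under different contexts.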
Computational Challenge in Analyzing Complex Traits
Most important human diseases and economically important animal and plant traits are complex traits controlled by multiple genes, with gene-to-gene interaction (epistasis) and gene-to-environment (GE) interaction. Detection of polygenes with fixed effects of genes and random effects of GE interaction is often achieved by a mixed-linear-model approach, a statistical method involving the enormous computation of many inverses of an n×n matrix. Because genes are located on chromosomes, a two-dimensional presentation is needed for multiple genes with gene-to-gene interaction. Since genes express differently across developmental stages and environments, the graphic presentation of dynamic gene expression is another challenge for bio-computation.
Presentation: Jun Zhu, Computational Challenges in Analyzing Complex Traits
Computer Games and Learning: Best Practices Using Games to Teach—in Academia and at Microsoft
The Games for Learning Institute is a joint venture of Microsoft Research, New York University, and affiliated New York regional schools. Nine months into its efforts, it has prematurely published its annual report, discussing the latest research about how to make great games and how to make games great vehicles for teaching. This talk is complemented by three efforts at Microsoft in which product groups are using games to teach the esoteric features of Microsoft software, facilitate learning, and improve software development. See some very cool stuff and learn how to get your kids to love math (as Ken Perlin does), or find out how to use a feature in Microsoft Office Word you have not yet discovered.
|Education and Scholarly Communications|
Computer Science Research in Latin America
Improving Meta-Analysis Based GWAS Through Data Quality Management
Defining mappings or indirect relations from genotype to phenotype has long been a challenge for those in the field of biology. The present pace of data generation in the genomic sciences offers unparalleled opportunities in this regard. Prominent examples are Genome Wide Association Studies (GWAS), which jointly analyze thousands of Single Nucleotide Polymorphisms (SNPs) from chosen populations, looking for associations between a specific disease and a given genomic configuration. However, huge costs and project complexity restrict the application of the GWAS approach. One option to overcome this limitation is to combine different studies, applying the so-called Meta-Analysis approach. Efforts such as the Database of Genotypes and Phenotypes (dbGaP) are intended to provide a uniform repository of such studies. However, retrieving, integrating, and interpreting heterogeneous data sources are daunting tasks. Indeed, most successful meta-analyses rely on sophisticated statistics aided by expert inspection and filtering. This approach is slow, costly, and error-prone (for example, it involves multiple subjective decisions), introducing reproducibility problems. The main goal of our work is to provide a data quality assessment environment for GWAS that enables a powerful and reliable application of Meta-Analysis. The environment aims to promote this approach by identifying core concepts and elements that allow model-based, automated, comprehensive, and reproducible data quality management. Furthermore, while Meta-Analysis has been extensively used for combining aggregated data, our approach intends to combine raw data, even from heterogeneous sources.
Presentation: Raul Ruggia, Improving Meta-Analysis–based GWAS Through Data Quality Management
Research at LaFHIS: The Tools and Foundations for Software Engineering Lab at University of Buenos Aires
The Laboratory on Foundations and Tools for Software Engineering (LaFHIS) within the Department of Computing at the Faculty of Science, University of Buenos Aires, aims to conduct leading-edge research in, and technology transfer of, effective engineering methods, tools, and environments for the development of composite, heterogeneous, and complex software-intensive systems. The group has strong interests in the specification, construction, and verification of software-intensive systems. This talk provides an overview of the research conducted at LaFHIS, which focuses on models and automated analysis. It provides particular insight into the researchers' work on model checking, scenario-based specifications, partial behavior modeling, and contract validation. This talk also includes descriptions of ongoing collaborative projects with Microsoft Research on model-based testing and program analysis.
Presentation: Sebastian Uchitel, The Foundations and Tools for Software Engineering Lab
Advancements of the LACCIR Virtual Institute: 2007–2009
With support and sponsorship from Microsoft Research, the Inter-American Development Bank (IADB), and the Organization of American States (OAS), the Latin American and Caribbean Collaborative ICT Research (LACCIR) Virtual Institute was created in May 2007 as a federation of Latin American and Caribbean universities for the advancement of collaborative information and communication technologies (ICT) research applied to the social and economic development of the region. This presentation provides an account of activities and achievements to date in terms of regional research projects, graduate student fellowships, collaboration networks, and research indicators.
Presentation: Ignacio Casas, Latin American and Caribbean Collaborative ICT Research Federation
Devices, Sensors, and Mobility for Healthcare
This session focuses on innovative technologies in the devices, sensors, and mobility space being developed by Microsoft researchers and their external collaborators.
Physiological Computing for Human-Computer Interaction and Medical Sensing
The human body is a complex biological machine and a prolific signal generator. Recent advances in sensing technologies have vastly augmented our ability to decode the signals generated by the body. This talk presents research into utilizing sensors placed on or in the human body in order to create natural and always-available interaction with computers around us. The talk also includes discussion of recent efforts in applying our expertise to build sensors and design experiences centered on medical sensing.
Presentation: Desney Tan, Enhancing Human-Computer Interaction with Physiological Computing
Monitoring and Diagnosing Sleep Apnea in the Home
This talk focuses on technology being developed by the Microsoft Research hardware team to help diagnose sleep apnea and other sleep disturbances. For proper diagnosis, patients typically must check into a sleep clinic (hospital) for monitoring. By re-instrumenting many of the sensors used in this controlled environment into a neck cuff, we posit that we can generate accurate predictions of sleep apnea (with multiple data points) in the comfort of a subject's home.
Presentation: Kristin Tolle, Using the Ubiquity of the Cell Phone to Record Physiological Activities
MAUI: Mobile Assistance Using the Internet
Seamless augmentation of human cognition requires processing and energy that far outstrips the capabilities of mobile hardware. The CPU, memory, I/O, and energy demands of new world applications greatly exceed the capacity of devices that people are willing to carry or wear for extended periods. On such hardware, improving size, weight, and battery life are higher priorities than enhancing compute power. A mobile device can never be too small, too light, or have too long a battery life! This is not just a temporary limitation of current technology, but is intrinsic to mobility. At any given cost and level of technology, considerations of weight, power, size, and ergonomics will exact a penalty in computational resources. Computation on mobile devices will always be a compromise. Cloud computing suggests an obvious solution: Run the application on a distant high-performance computer or compute cluster and access it over the Internet via a mobile computer. Unfortunately, long WAN latencies hurt the crisp interaction that is so critical for seamless augmentation of human cognition. Humans are acutely sensitive to delay and jitter, and it is very difficult to control these parameters at WAN scale. As latency increases and bandwidth drops, interactive response suffers. This distracts the user, and reduces his or her depth of cognitive engagement.
Presentation: Victor Bahl, Mobile Assistance Using the Internet
|Health and Wellbeing|
Computational Thinking Enters the Mainstream
Moderator: Tom McMail, Microsoft Research
The Microsoft Carnegie Mellon Center for Computational Thinking was founded in 2007 to encourage breakthrough research in projects exemplifying this approach to problem solving. This session provides an overview of the investigations conducted at this center over its first two years and presents some interesting possibilities for the future.
The Spread of Computational Thinking
Every educated person should be able to think computationally. That is the thesis first promoted by Jeannette Wing, which formed the foundation of the Microsoft-supported Center for Computational Thinking. In the same manner that mathematical thinking, global thinking, and so on, are critical for succeeding or even surviving in today's world, computational thinking addresses problems which would be unsolvable or solved less well without computational advantages and the mindset required to use them most creatively and effectively. As a means for conceptualizing and solving complex problems in a number of domains in both the sciences and humanities, it has received wide attention in the research, teaching, and policy communities.
With the advent of multicores, we are riding a third or fourth wave of parallel computing, and, perhaps unlike previous ones, this one will break. Many if not most computer science classes, however, remain case studies in how to push students into thinking sequentially. At the earliest stages, for example, we teach students that taking the dot product of two vectors or merging two lists involves starting at one end and sequentially traversing to the other. In reality, many problems, applications, and even algorithms are inherently parallel; the languages and models we use, however, push us to describe and conceptualize them sequentially. This talk describes some of the core concepts in parallel algorithms and points out that these ideas transcend any particular model and are thus largely robust against uncertainties in what parallel machines might look like. It also discusses how programming languages can affect the way we think about algorithms. Ideas from the audience are appreciated.
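The dot-product example mentioned above can be recast in the parallel style the talk advocates. The following is a minimal illustrative sketch, not the speaker's method; the chunking scheme and the use of a thread pool are assumptions. Partial sums over independent chunks are computed concurrently and then reduced, rather than walking both vectors end to end.

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_dot(xs, ys, chunks=4):
    """Data-parallel dot product: split the index range into independent
    chunks, compute a partial sum for each chunk concurrently, then reduce
    the partial sums. The sequential habit of traversing from one end to
    the other is replaced by independent work plus a final combine step."""
    n = len(xs)
    step = max(1, (n + chunks - 1) // chunks)
    spans = [(lo, min(lo + step, n)) for lo in range(0, n, step)]

    def partial(span):
        lo, hi = span
        return sum(xs[i] * ys[i] for i in range(lo, hi))

    with ThreadPoolExecutor(max_workers=chunks) as pool:
        # Each chunk is independent, so the partial sums can run in any order.
        return sum(pool.map(partial, spans))
```

The point of the sketch is conceptual: the algorithm is naturally a map (pairwise products) followed by a reduction (summation), and nothing about it requires a left-to-right traversal.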
Computational Drug Discovery
We are using Computational Thinking to address the problem of designing drugs that evade resistance. Our approach uses two key abstractions. The first is to model the drug design process as a two-player game. Here, a pharmaceutical company makes a move by introducing a drug against a target molecule. The disease then makes a move by introducing mutations that decrease the binding affinity of the drug while preserving the biological function of the target. The second abstraction is to model the physics of molecular interactions and the space of possible mutations as a complex probability distribution. This distribution is efficiently encoded using undirected probabilistic graphical models, and probabilistic queries are answered using inference algorithms. This presentation focuses on the graphical models used in this work.
Music Performance in the Computational Age
Computing has revolutionized music performance, recording, distribution, and listening. To date, most of the revolution has been driven by advances in storage and communication. The next revolution will come from computation, especially interactive real-time systems. We have been exploring how computing can augment musical performance by amateurs and professionals alike. A recent concert featured a 20-piece digital string orchestra playing with a live jazz band. Future work is aimed at interfaces that extend human musical abilities, especially in live performance.
Art and Code
Just as true literacy in English means being able to write as well as read, true literacy in software demands not only knowing how to use commercial software tools, but how to create new software for oneself and for others. Recently, a new set of visually- and musically-oriented programming environments (and accompanying pedagogic techniques) have been developed by artists, and for artists. These toolkits, many of which are free, open-source initiatives, have made enormous inroads toward democratizing the education of computational thinking worldwide. With support from the Computational Thinking Center, a conference concerned with "programming environments for artists, young people, and the rest of us" brought together 15 of the key innovators leading significant revolutions in software-arts education, and provided workshops in 11 different arts-programming languages to an extremely diverse new community of creators.
|Core Computer Science|
Surface and Multitouch Moving Forward
The Microsoft Surface is being used in some very creative and innovative ways. Discover the potential of this fantastic new platform and see how touch computing can be applied in the future. Microsoft Research and the Microsoft Surface Product Group provide presentations and demos.
|Education and Scholarly Communications|
Emerging Transformational Changes in Healthcare Computing
Michael Gillam, Microsoft Research
The foundations for the biggest changes in the future of healthcare are being laid in the field of health information technology today. From the emergence of enterprise computational clouds to the fast-growing area of personally owned digital health records, this session examines the historic medical trends that are defining the most promising areas for success in healthcare computing today.
Presentation: Michael Gillam, Emerging Transformational Changes in Healthcare Computing
Mr. Feynman Wasn’t Joking
Tony Hey, Corporate Vice President, External Research
Presentation: Tony Hey, Mr. Feynman Wasn’t Joking