We work on tools, languages, and methodologies to dramatically increase the productivity of software development. We are interested both in analysis tools for existing software and in asking how the software of the future should be designed. The following is an overview of the research areas we are currently working on: Verification and Testing, Scalable Distributed Systems, Probabilistic Programming, and Concurrency Control.
Verification and Testing
Ensuring correctness of programs is crucial for software development. The focus of this research is to build tools and techniques to ensure that a program meets its specification.
The Q program verifier brings together state-of-the-art tools to enable automated verification of programs written in a variety of programming languages, leveraging symbolic search techniques. Q stands out from comparable tools in its support for analyzing concurrent, multi-core, and distributed systems. The Yogi project pioneered the art of combining static analysis with testing in order to prove safety properties of sequential programs. Seal is a research prototype that infers and summarizes the potential side effects of a method in a C# program. It scales to entire .NET libraries and supports sophisticated C# features such as LINQ, delegates, event handlers, and exceptions. We also work on the application of machine learning techniques for efficient and scalable program verification.
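To make the idea of automated safety verification concrete, here is a minimal sketch of explicit-state reachability checking: exhaustively explore the states of a small transition system and report a trace if an error state is reachable. This is only an illustration of the underlying search problem, not the algorithm used by Q, Yogi, or Corral; the toy model and property are made up for the example.

```python
from collections import deque

def check_safety(init, step, is_error, bound=10_000):
    """Breadth-first search over reachable states; returns a
    counterexample trace if an error state is reachable, else None."""
    seen = {init}
    frontier = deque([(init, [init])])
    while frontier and len(seen) < bound:
        state, trace = frontier.popleft()
        if is_error(state):
            return trace
        for nxt in step(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return None  # no error state found within the bound

# Toy model: a counter that may be incremented by 1 or 2 at each step;
# the (made-up) safety property is that it never reaches exactly 7.
trace = check_safety(
    init=0,
    step=lambda n: [n + 1, n + 2] if n < 10 else [],
    is_error=lambda n: n == 7,
)
```

Real verifiers replace this explicit enumeration with symbolic techniques (SMT solving, abstraction) so that astronomically large state spaces can be searched without enumerating them.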
- Akash Lal and Shaz Qadeer, Powering the Static Driver Verifier using Corral, in Foundations of Software Engineering (FSE), ACM SIGSOFT Distinguished Paper, November 2014
- Akash Lal and Shaz Qadeer, A Program Transformation for Faster Goal-Directed Search, in Formal Methods in Computer-Aided Design (FMCAD), Best paper, October 2014
- Alex Aiken, Aditya V. Nori, and Rahul Sharma, Bias-Variance Tradeoffs in Program Analysis, in Principles of Programming Languages (POPL), January 2014
- Ravichandhran Madhavan, Ganesan Ramalingam, and Kapil Vaswani, Modular Heap Analysis For Higher Order Programs, in Static Analysis Symposium (SAS), September 2012
- Patrice Godefroid, Aditya V. Nori, Sriram K. Rajamani, and Sai Deep Tetali, Compositional May-Must Program Analysis: Unleashing the Power of Alternation, in Principles of Programming Languages (POPL), January 2010
Scalable Distributed Systems
Building systems that can scale seamlessly under load is an art. Scalability is a concern that pervades every aspect of the system stack, including languages, runtimes, and storage. Unfortunately, scalability often conflicts with other desirable properties such as consistency and security. The focus of this research is to rethink programming models, storage systems, and tools with scalability as the primary goal.
The CScale project explores new designs of distributed databases. As part of this project, we have developed a new replication protocol for building consistent and resilient distributed data structures. With Cipherbase, we are rethinking the design of databases and database abstractions with security as a first-class citizen. We are building a compiler that securely migrates existing database applications to public cloud platforms. Perforator is a tool for costing and optimizing queries running on distributed query engines such as Hive and DryadLINQ.
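One idea from this line of work, fault tolerance via idempotence, can be sketched in a few lines: if every request carries a unique identifier and the server records completed requests, then retried (duplicately delivered) requests can be answered from the log instead of being re-executed. The class and method names below are illustrative, not the paper's actual construction.

```python
class IdempotentStore:
    """Sketch of idempotent request processing: duplicate deliveries
    of the same request are detected and replayed, not re-executed."""

    def __init__(self):
        self.balance = 0
        self.completed = {}  # request id -> recorded result

    def deposit(self, request_id, amount):
        if request_id in self.completed:
            # Duplicate delivery (e.g. a client retry): replay the result.
            return self.completed[request_id]
        self.balance += amount  # execute the effect exactly once
        self.completed[request_id] = self.balance
        return self.balance

store = IdempotentStore()
store.deposit("req-1", 100)
store.deposit("req-1", 100)  # retried after a timeout: no double credit
```

Making operations idempotent in this way lets a system tolerate message loss and retries with simple at-least-once delivery, rather than requiring expensive exactly-once delivery from the network.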
- Sagar Chordia, Sriram Rajamani, Kaushik Rajan, G. Ramalingam, and Kapil Vaswani, Asynchronous Resilient Linearizability, in International Symposium on Distributed Computing (DISC), October 2013
- G. Ramalingam and Kapil Vaswani, Fault Tolerance via Idempotence, in Principles of Programming Languages (POPL), ACM, January 2013
- Jose Faleiro, Sriram Rajamani, Kaushik Rajan, G. Ramalingam, and Kapil Vaswani, Generalized Lattice Agreement, in Principles of Distributed Computing (PODC), ACM, July 2012
Probabilistic Programming

Probabilistic programs are usual programs with two added constructs: (1) the ability to draw values at random from distributions, and (2) the ability to condition the values of variables in a program through observations. Models from diverse application areas such as computer vision, coding theory, cryptographic protocols, biology, and reliability analysis can be written as probabilistic programs.
Probabilistic inference is the problem of computing an explicit representation of the probability distribution implicitly specified by a probabilistic program. This research area has deep connections with programming languages and software engineering, including language design and static and dynamic analysis. These ideas form the basis of our tool R2.
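The two constructs, random draws and observations, can be illustrated with a tiny probabilistic program and the simplest possible inference method, rejection sampling: run the program many times and keep only the runs whose observations hold. (R2 uses MCMC sampling; rejection sampling is shown here only because it fits in a few lines. The program and function names are illustrative.)

```python
import random

def two_coins():
    """A tiny probabilistic program: sample two fair coins, then
    observe (condition on) at least one of them coming up heads."""
    c1 = random.random() < 0.5   # draw a value at random
    c2 = random.random() < 0.5
    observation = c1 or c2       # the conditioning constraint
    return observation, c1

def infer(program, runs=100_000):
    """Naive rejection sampling: discard runs whose observation fails,
    then estimate the posterior probability from the surviving runs."""
    accepted = [value for ok, value in (program() for _ in range(runs)) if ok]
    return sum(accepted) / len(accepted)

# P(first coin is heads | at least one coin is heads) = 2/3
estimate = infer(two_coins)
```

Rejection sampling wastes every run whose observation fails, which is hopeless when observations are improbable; efficient samplers such as R2 exploit program structure to avoid that waste.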
- Aditya V. Nori, Chung-Kil Hur, Sriram K. Rajamani, and Selva Samuel, R2: An Efficient MCMC Sampler for Probabilistic Programs, in AAAI Conference on Artificial Intelligence (AAAI), AAAI, July 2014
- Chung-Kil Hur, Aditya V. Nori, Sriram K. Rajamani, and Selva Samuel, Slicing Probabilistic Programs, in Programming Language Design and Implementation (PLDI), ACM, June 2014
- Andrew D. Gordon, Thomas A. Henzinger, Aditya V. Nori, and Sriram K. Rajamani, Probabilistic Programming, in International Conference on Software Engineering (ICSE Future of Software Engineering), IEEE, May 2014
- Guillaume Claret, Sriram K. Rajamani, Aditya V. Nori, Andrew D. Gordon, and Johannes Borgstrom, Bayesian Inference using Data Flow Analysis, in Foundations of Software Engineering (FSE), ACM, August 2013
(Joint work with collaborators at Tel-Aviv University, Technion, and Stanford University.)
Concurrency Control

A key challenge in writing concurrent programs is concurrency control: ensuring that concurrent accesses and modifications to shared mutable state do not interfere with each other in undesirable ways. Atomic sections are a simple way to declaratively specify a desired correctness criterion in the presence of concurrency: an atomic section is a code fragment that must appear to execute in isolation, with no interference from other parallel computation. Our work focuses on atomic sections in various settings, from both an analysis/verification perspective and a synthesis perspective. The problems we have looked at include:
- Converting a sequential data structure into a linearizable data structure. (A linearizable data structure essentially guarantees that each of its methods is atomic.)
- Converting a linearizable data structure into a transactional data structure, one that provides additional hooks enabling clients to perform a sequence of operations atomically.
- Composing transactional data structures.
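The simplest way to obtain linearizability is to run every method of a sequential data structure as an atomic section under a single lock, as in the sketch below. This coarse-grained approach is correct but limits parallelism; the work above studies automatically derived fine-grained and semantic locking instead. The counter example is illustrative, not taken from these papers.

```python
import threading

class AtomicCounter:
    """A sequential counter made linearizable by executing each
    method as an atomic section under one coarse-grained lock."""

    def __init__(self):
        self._value = 0
        self._lock = threading.Lock()

    def increment(self):
        with self._lock:  # atomic section: read-modify-write in isolation
            self._value += 1
            return self._value

# Four threads, 1000 increments each: the lock prevents lost updates.
counter = AtomicCounter()
threads = [
    threading.Thread(target=lambda: [counter.increment() for _ in range(1000)])
    for _ in range(4)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Without the lock, two threads could read the same value and both write back value + 1, losing one of the updates; with it, every increment appears to take effect at a single indivisible point.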
- Oren Zomer, Guy Golan-Gueta, G. Ramalingam, and Mooly Sagiv, Checking Linearizability of Encapsulated Extended Operations. In European Symposium on Programming (ESOP), 2014.
- Guy Golan-Gueta, G. Ramalingam, Mooly Sagiv, and Eran Yahav, Automatic Semantic Locking. In Principles and Practice of Parallel Programming (PPoPP) Poster, February 2014.
- Guy Golan-Gueta, G. Ramalingam, Mooly Sagiv, and Eran Yahav, Concurrent libraries with foresight. In Programming Language Design and Implementation (PLDI), June 2013.
- Guy Golan-Gueta, Nathan Bronson, Alex Aiken, G. Ramalingam, Mooly Sagiv, and Eran Yahav, Automatic Fine-Grain Locking using Shape Properties. In Object-Oriented Programming, Systems, Languages & Applications (OOPSLA), 2011.
- We are looking for graduate interns for summer '15. Please apply here and write to us directly at email@example.com
- We are hiring; please check out the careers page.
- Best paper awards at FMCAD 2014 and FSE 2014!
- Tom Ball and Sriram Rajamani are recipients of the 2011 CAV award!
- Automatic Predicate Abstraction of C Programs wins the Most Influential PLDI Paper award!