Faculty & Research

Programming Language Principles, Design, and Implementation

The research interests of the faculty in the Programming Language Principles, Design, and Implementation group cover a wide spectrum of topics related to the semantic and logical foundations of programming languages, language design, type theory, compilers, program analysis and optimization, program specification and construction, and emerging models of computation.

The following list gives a high-level overview of the major research topics. Please consult the individual researchers' pages for more information:

  • Optimization techniques for scalable heterogeneous parallel computers, including GPGPUs, and applying those techniques to language abstractions that can express parallelism at various levels. Another major focus of the compilers group is developing mathematical models for data locality, using them to optimize high-performance applications, and exploring the interplay of data locality and parallelism.
  • Runtime systems and compilers for actor-like languages
  • Profile-driven compilation and auto-tuning (e.g. granularity adjustment for parallelism)
  • Domain-specific languages for parallelism and non-traditional hardware targets, including embedded processors and sensor networks
  • Formal methods for system design, centering on the formal synthesis of architectures; models, methods, and tools for cyber-physical system design; and infrastructure for synchronous software implementation.
  • Design, theory, and implementation of query and search languages for data models ranging from relational to semi-structured databases. This includes query languages for tree and graph databases and languages designed for parallel computation.

Faculty in this area include:
R. Kent Dybvig, Daniel Friedman, Christopher Haynes, Steven Johnson, Andrew Lumsdaine, Ryan Newton, Gregory J. E. Rawlins, Amr Sabry, Chung-chieh Shan, Jeremy Siek, Dirk Van Gucht