POPL 2024
Sun 14 - Sat 20 January 2024 London, United Kingdom
Sun 14 Jan 2024, 11:20 - 11:30 at Kelvin Lecture (Second Session). Chair(s): Steven Holtzen, Matthijs Vákár

Partial observability is an increasingly important issue for computer systems, as they are frequently deployed in environments whose exact state cannot be fully observed. One way to address partial observability is belief programming: a programming methodology in which the program automatically performs state estimation in non-deterministic environments.

Our work lifts belief programming to a quantitative realm, which enables us to write and verify programs for probabilistic partially observable environments. A probabilistic belief program dynamically updates a probability distribution over all possible states of the environment, in the form of a belief state, and enables sound inferences based on that distribution. Furthermore, we introduce P-BLIMP, a model probabilistic programming language specialized for partial observability. P-BLIMP offers language features to symbolically model the behavior of the partially observable environment, to condition the belief state on observations, and to make inferences about the state of the environment. We also introduce a weakest pre-expectation calculus for probabilistic belief programs and present a case study on how predicates can be manipulated efficiently.
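To make the belief-state mechanics concrete, here is a minimal sketch in Python (not P-BLIMP, whose syntax is not shown here), assuming a discrete state space. The names `predict`, `condition`, and `infer`, and the toy transition and sensor models, are illustrative assumptions and not the paper's API; the sketch only shows the generic Bayes-filter shape of the update the abstract describes.

```python
from collections import defaultdict

def predict(belief, transition):
    """Push the belief forward: transition(s) returns {s_next: P(s_next | s)}."""
    new_belief = defaultdict(float)
    for state, p in belief.items():
        for next_state, q in transition(state).items():
            new_belief[next_state] += p * q
    return dict(new_belief)

def condition(belief, likelihood, observation):
    """Condition the belief on an observation via Bayes' rule;
    likelihood(obs, s) = P(obs | s)."""
    weighted = {s: p * likelihood(observation, s) for s, p in belief.items()}
    total = sum(weighted.values())
    if total == 0:
        raise ValueError("observation has zero probability under the belief")
    return {s: w / total for s, w in weighted.items()}

def infer(belief, predicate):
    """Probability that the (unobserved) true state satisfies a predicate."""
    return sum(p for s, p in belief.items() if predicate(s))

# Toy usage: an agent on integer positions that moves right with prob. 0.8,
# observed by a noisy sensor that reports the true position with prob. 0.9.
if __name__ == "__main__":
    belief = {0: 1.0}
    step = lambda s: {s + 1: 0.8, s: 0.2}
    sensor = lambda obs, s: 0.9 if obs == s else 0.05
    belief = condition(predict(belief, step), sensor, observation=1)
    print(infer(belief, lambda s: s >= 1))  # ~0.986
```

A weakest pre-expectation calculus, as mentioned in the abstract, would then reason about such updates symbolically rather than by executing them on concrete distributions.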

Sun 14 Jan

Displayed time zone: London

11:00 - 12:30
Second Session (LAFI) at Kelvin Lecture
Chair(s): Steven Holtzen (Northeastern University), Matthijs Vákár (Utrecht University)
11:00
10m
Talk
A Tree Sampler for Bounded Context-Free Languages
LAFI
Breandan Considine (McGill University)
File Attached
11:10
10m
Talk
A Multi-language Approach to Probabilistic Program Inference
LAFI
Sam Stites (Northeastern University), Steven Holtzen (Northeastern University)
11:20
10m
Talk
Belief Programming in Partially Observable Probabilistic Environments
LAFI
Tobias Gürtler (Saarland University, Saarland Informatics Campus), Benjamin Lucien Kaminski (Saarland University; University College London)
11:30
10m
Talk
Homomorphic Reverse Differentiation of Iteration (Online)
LAFI
Fernando Lucatelli Nunes (Utrecht University), Gordon Plotkin (Google), Matthijs Vákár (Utrecht University)
File Attached
11:40
10m
Talk
MultiSPPL: extending SPPL with multivariate leaf nodes
LAFI
Matin Ghavami (Massachusetts Institute of Technology), Mathieu Huot (MIT), Martin C. Rinard (Massachusetts Institute of Technology), Vikash K. Mansinghka (Massachusetts Institute of Technology)
11:50
10m
Talk
Reverse mode ADEV via YOLO: tangent estimators transpose to gradient estimators
LAFI
McCoy Reynolds Becker (MIT), Mathieu Huot (MIT), Alexander K. Lew (Massachusetts Institute of Technology), Vikash K. Mansinghka (Massachusetts Institute of Technology)
12:00
10m
Talk
Sparse Differentiation in Computer Graphics
LAFI
Kevin Mu (University of Washington), Jesse Michel (Massachusetts Institute of Technology), William S. Moses (Massachusetts Institute of Technology), Shoaib Kamil (Adobe Research), Zachary Tatlock (University of Washington), Alec Jacobson (University of Toronto), Jonathan Ragan-Kelley (Massachusetts Institute of Technology)
12:10
10m
Talk
A slice sampler for the Indian Buffet Process: expressivity in nonparametric probabilistic programming
LAFI
Maria-Nicoleta Craciun (University of Oxford), C.-H. Luke Ong (NTU), Hugo Paquet (LIPN, Université Sorbonne Paris Nord), Sam Staton (University of Oxford)
12:20
10m
Talk
Effective Sequential Monte Carlo for Language Model Probabilistic Programs
LAFI
Alexander K. Lew (Massachusetts Institute of Technology), Tan Zhi-Xuan (Massachusetts Institute of Technology), Gabriel Grand (Massachusetts Institute of Technology), Jacob Andreas (MIT), Vikash K. Mansinghka (Massachusetts Institute of Technology)