POPL 2024
Sun 14 - Sat 20 January 2024 London, United Kingdom
Sun 14 Jan 2024, 12:20 - 12:30, at Kelvin Lecture (Second Session). Chair(s): Steven Holtzen, Matthijs Vákár

Language model probabilistic programming extends standard probabilistic programming languages (PPLs) with new primitives for sampling from and conditioning on the outputs of large language models (LLMs). In principle, language model probabilistic programs can encode many distributions that would be difficult to elicit via prompting alone. This abstract advocates sequential Monte Carlo (SMC) for efficient inference in language model probabilistic programs. First, we briefly describe our LLaMPPL library for language model probabilistic programming, which makes it easy to rapidly explore a large space of sound SMC algorithms for a given language modeling task, and automates the efficient implementation of SMC, including auto-batching calls to LLMs. We then offer our perspective, informed by our preliminary experiments with LLaMPPL, on two key design challenges faced by users of SMC (designing the intermediate targets, and designing the proposal distributions), through the lens of three example models that outperform state-of-the-art LLMs and constrained-generation techniques on several tasks.
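For intuition, here is a minimal, illustrative sketch of the kind of SMC loop the abstract describes: particles extend token sequences step by step, are reweighted against an intermediate target (here, a hard constraint), and are resampled. This is not the LLaMPPL API; the helpers `propose_next_token` and `constraint_ok` are hypothetical placeholders the user would supply.

```python
import math
import random

def smc_generate(propose_next_token, constraint_ok, n_particles=16, max_tokens=32):
    """Run a simple SMC loop: extend, reweight, and resample token sequences."""
    # Each particle is a (token_sequence, log_weight) pair; all start empty.
    particles = [([], 0.0) for _ in range(n_particles)]
    for _ in range(max_tokens):
        extended = []
        for tokens, logw in particles:
            # Proposal step: sample a next token and an incremental log-weight
            # (target log-probability minus proposal log-probability).
            token, logw_incr = propose_next_token(tokens)
            new_tokens = tokens + [token]
            # Intermediate target: particles violating the constraint get zero weight.
            if not constraint_ok(new_tokens):
                logw_incr = float("-inf")
            extended.append((new_tokens, logw + logw_incr))
        # Multinomial resampling in proportion to the normalized weights.
        max_logw = max(lw for _, lw in extended)
        weights = [math.exp(lw - max_logw) if lw != float("-inf") else 0.0
                   for _, lw in extended]
        if sum(weights) == 0.0:
            raise RuntimeError("every particle violated the constraint")
        chosen = random.choices(range(n_particles), weights=weights, k=n_particles)
        particles = [(extended[i][0], 0.0) for i in chosen]
    return [tokens for tokens, _ in particles]
```

In a real system the proposal would typically come from an LLM's next-token distribution, and, as the abstract notes for LLaMPPL, the per-particle LLM calls would be auto-batched into shared forward passes for efficiency.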

Sun 14 Jan

Displayed time zone: London

11:00 - 12:30
Second Session (LAFI) at Kelvin Lecture
Chair(s): Steven Holtzen (Northeastern University), Matthijs Vákár (Utrecht University)

11:00 (10m) Talk, LAFI: A Tree Sampler for Bounded Context-Free Languages
Breandan Considine (McGill University)
File Attached

11:10 (10m) Talk, LAFI: A Multi-language Approach to Probabilistic Program Inference
Sam Stites (Northeastern University), Steven Holtzen (Northeastern University)

11:20 (10m) Talk, LAFI: Belief Programming in Partially Observable Probabilistic Environments
Tobias Gürtler (Saarland University, Saarland Informatics Campus), Benjamin Lucien Kaminski (Saarland University; University College London)

11:30 (10m) Talk, LAFI (Online): Homomorphic Reverse Differentiation of Iteration
Fernando Lucatelli Nunes (Utrecht University), Gordon Plotkin (Google), Matthijs Vákár (Utrecht University)
File Attached

11:40 (10m) Talk, LAFI: MultiSPPL: extending SPPL with multivariate leaf nodes
Matin Ghavami (Massachusetts Institute of Technology), Mathieu Huot (MIT), Martin C. Rinard (Massachusetts Institute of Technology), Vikash K. Mansinghka (Massachusetts Institute of Technology)

11:50 (10m) Talk, LAFI: Reverse mode ADEV via YOLO: tangent estimators transpose to gradient estimators
McCoy Reynolds Becker (MIT), Mathieu Huot (MIT), Alexander K. Lew (Massachusetts Institute of Technology), Vikash K. Mansinghka (Massachusetts Institute of Technology)

12:00 (10m) Talk, LAFI: Sparse Differentiation in Computer Graphics
Kevin Mu (University of Washington), Jesse Michel (Massachusetts Institute of Technology), William S. Moses (Massachusetts Institute of Technology), Shoaib Kamil (Adobe Research), Zachary Tatlock (University of Washington), Alec Jacobson (University of Toronto), Jonathan Ragan-Kelley (Massachusetts Institute of Technology)

12:10 (10m) Talk, LAFI: A slice sampler for the Indian Buffet Process: expressivity in nonparametric probabilistic programming
Maria-Nicoleta Craciun (University of Oxford), C.-H. Luke Ong (NTU), Hugo Paquet (LIPN, Université Sorbonne Paris Nord), Sam Staton (University of Oxford)

12:20 (10m) Talk, LAFI: Effective Sequential Monte Carlo for Language Model Probabilistic Programs
Alexander K. Lew (Massachusetts Institute of Technology), Tan Zhi-Xuan (Massachusetts Institute of Technology), Gabriel Grand (Massachusetts Institute of Technology), Jacob Andreas (MIT), Vikash K. Mansinghka (Massachusetts Institute of Technology)