What Compilers desire from Weak Memory Semantics (Remote)
Compilers optimize our programs, which is desirable: it makes software perform much better. However, because compilers were historically designed for sequential programs, their safety suffers when we apply them to concurrent programs. Models such as sequential consistency (SC) disallow even simple reorderings of independent code fragments that, in a sequential setting, would be considered harmless code motion. The quest to allow more optimizations has not been an easy one, since permitting aggressive optimizations involving dependencies runs into what is known as the out-of-thin-air problem. Moreover, simply allowing more concurrent behaviors (weak consistency models) does not translate into more safe optimizations. So, where did we go amiss? To shed light on this, we are investigating whether it is possible to derive memory models that preserve the safety of optimizations from one model to another. In this talk, I intend to share our ongoing work on this.
File attached: FoWM (FOWM.pdf), 1.12 MiB
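To make the reordering concern concrete, here is the classic store-buffering (SB) litmus test written as a minimal C11 sketch. This is an illustration of the well-known example, not material taken from the talk or its slides; the function names and the choice of relaxed atomics are assumptions made for the sketch.

```c
/* Store-buffering (SB) litmus test: a minimal C11 sketch (illustration only). */
#include <stdatomic.h>
#include <stdio.h>
#include <threads.h>

atomic_int x, y;   /* shared locations, zero-initialized */
int r1, r2;        /* per-thread observations */

int writer_x_reader_y(void *arg) {
    (void)arg;
    atomic_store_explicit(&x, 1, memory_order_relaxed);  /* store x */
    r1 = atomic_load_explicit(&y, memory_order_relaxed); /* then load y */
    return 0;
}

int writer_y_reader_x(void *arg) {
    (void)arg;
    atomic_store_explicit(&y, 1, memory_order_relaxed);  /* store y */
    r2 = atomic_load_explicit(&x, memory_order_relaxed); /* then load x */
    return 0;
}

int main(void) {
    thrd_t t1, t2;
    thrd_create(&t1, writer_x_reader_y, NULL);
    thrd_create(&t2, writer_y_reader_x, NULL);
    thrd_join(t1, NULL);
    thrd_join(t2, NULL);

    /* Under SC the outcome r1 == 0 && r2 == 0 is forbidden, so a compiler
     * targeting SC may not swap the independent store and load in either
     * thread, even though that reordering is harmless sequentially.  With
     * relaxed atomics (as written) the weak C11 model allows the outcome,
     * so the same reordering becomes a sound optimization.
     *
     * The out-of-thin-air problem arises in the related load-buffering test,
     * where each thread does  r = load(x); store(y, r)  (and symmetrically):
     * naive weak models would justify r1 == r2 == 42 appearing "out of thin
     * air", which is why aggressive dependency-removing optimizations are
     * hard to permit. */
    printf("r1 = %d, r2 = %d\n", r1, r2);
    return 0;
}
```

Built with a C11 toolchain (for example `cc -std=c11 -pthread sb.c`, where the `-pthread` flag may or may not be needed depending on the platform), the relaxed version can occasionally print `r1 = 0, r2 = 0` on real hardware, while replacing `memory_order_relaxed` with `memory_order_seq_cst` rules that outcome out.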
Mon 15 Jan (displayed time zone: London)

14:00 - 15:30: The Future of Weak Memory

| Time | Duration | Type | Title | Speaker |
|------|----------|------|-------|---------|
| 14:00 | 20m | Talk | Compilers should get over themselves and respect semantic dependencies! | |
| 14:20 | 20m | Talk | A case against semantic dependencies | Ori Lahav (Tel Aviv University) |
| 14:40 | 20m | Talk | What Compilers desire from Weak Memory Semantics (Remote) | Akshay Gopalakrishnan (McGill University), file attached |
| 15:00 | 20m | Talk | Programmers love mind-bogglingly complicated weak memory models | Simon Cooksey (NVIDIA), file attached |