Logic Synthesis Software School
July 23, 2021 · Virtual Experience · co-organized with IWLS
About the event
The Logic Synthesis Software School (LSSS) is an informal meeting on software tools for logic synthesis and verification. It brings together researchers and tool developers from electronic design automation, logic synthesis, and verification to foster research on open-science infrastructure and tools in these fields. LSSS is a joint event with IWLS.
Confirmed speakers:
- Bruno Schmitt, EPFL, CH
- Satrajit Chatterjee, Google, USA
- Pierre-Emmanuel Gaillardon, University of Utah, USA
- Kuldeep Meel, National University of Singapore, SG
The meeting was held on Zoom.
Program
07:00 PT | 16:00 CEST | 23:00 JST |
Welcome |
07:00 - 07:40 PT | 16:00 - 16:40 CEST | 23:00 - 23:40 JST |
The EPFL Flagship Libraries in Action: A Development Snapshot of mockturtle & tweedledum (Heinz Riener & Bruno Schmitt) [PDF]

mockturtle is a C++17 logic network library. It provides several logic network implementations (such as and-inverter graphs, majority-inverter graphs, and k-LUT networks) together with generic algorithms for logic synthesis and logic optimization. mockturtle is part of the EPFL logic synthesis libraries, a collection of modular, open software libraries for tackling problems in the field of logic synthesis.

tweedledum is an extensible and efficient open-source library for quantum circuit synthesis and compilation, with state-of-the-art performance in many compilation tasks relevant to near-term applications. In this work, we introduce its design principles and concrete implementation. In particular, we describe the library's core: an intuitive and flexible intermediate representation (IR) that supports different abstraction levels across the same circuit structure. We demonstrate the power of this flexibility through a use case in which we use tweedledum to compile an oracle, defined as a classical Boolean function, into a quantum circuit. This oracle is part of a circuit implementing Grover's search to solve an instance of vertex coloring. We compare this result against hand-optimized implementations and show that our automated flow lets users specify oracles at a higher level of abstraction while delivering highly optimized results. We also present a use case in which we employ tweedledum to map technology-independent circuits to constrained quantum hardware architectures. Our results show that tweedledum is significantly faster than established frameworks while delivering competitive quality of results.

GitHub: mockturtle • tweedledum
|
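To illustrate the kind of data structure a logic network library like mockturtle is built around, here is a minimal, self-contained and-inverter graph (AIG) in Python. This is an illustrative toy, not mockturtle's actual C++ API; the class and method names are our own. It shows two ideas central to such libraries: literals that encode a node plus an optional complement bit, and structural hashing so that structurally identical AND gates are created only once.

```python
# Toy and-inverter graph (AIG): every gate is a two-input AND, and
# inversion is a complement bit on the literal (lit = node_id << 1 | comp).
class Aig:
    def __init__(self):
        self.nodes = []   # each node: ('pi',) or ('and', lit_a, lit_b)
        self.strash = {}  # structural hashing: (lit_a, lit_b) -> node id

    def create_pi(self):
        self.nodes.append(('pi',))
        return (len(self.nodes) - 1) << 1

    def create_not(self, lit):
        return lit ^ 1                      # flip the complement bit

    def create_and(self, a, b):
        if a > b:                           # normalize operand order
            a, b = b, a
        if (a, b) in self.strash:           # reuse an identical gate
            return self.strash[(a, b)] << 1
        self.nodes.append(('and', a, b))
        nid = len(self.nodes) - 1
        self.strash[(a, b)] = nid
        return nid << 1

    def create_or(self, a, b):              # OR via De Morgan: !(!a & !b)
        return self.create_and(a ^ 1, b ^ 1) ^ 1

    def num_gates(self):
        return sum(1 for n in self.nodes if n[0] == 'and')

    def simulate(self, lit, values):        # values: pi node id -> bool
        nid, comp = lit >> 1, lit & 1
        node = self.nodes[nid]
        v = values[nid] if node[0] == 'pi' else (
            self.simulate(node[1], values) and self.simulate(node[2], values))
        return v != bool(comp)

# Build and simulate f = (a & b) | !c
aig = Aig()
a, b, c = aig.create_pi(), aig.create_pi(), aig.create_pi()
f = aig.create_or(aig.create_and(a, b), aig.create_not(c))
```

Requesting `create_and(a, b)` a second time returns the existing literal instead of growing the network, which is what keeps large AIGs compact in practice.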
07:40 - 07:45 PT | 16:40 - 16:45 CEST | 23:40 - 23:45 JST |
Q&A Break |
07:45 - 08:25 PT | 16:45 - 17:25 CEST | 23:45 - 00:25 JST |
Scalable Functional Synthesis: The Child of Perfect Marriage between Machine Learning and Formal Methods (Kuldeep Meel) [PDF]

Don't we all dream of the perfect assistant whom we can simply tell what to do, and who figures out how to accomplish it? Formally, given a specification F(X, Y) over the set of input variables X and output variables Y, we want the assistant, a.k.a. the functional synthesis engine, to design a function G such that (X, Y = G(X)) satisfies F. Functional synthesis has been studied for over 150 years, dating back to Boole in the 1850s, and yet scalability remains a core challenge. Motivated by progress in machine learning, we designed a new algorithmic framework, Manthan, which views functional synthesis as a classification problem, relying on advances in constrained sampling for data generation and on advances in automated reasoning for a novel proof-guided refinement and provable verification. In an extensive and rigorous evaluation over 609 benchmarks, we demonstrate that Manthan significantly improves upon the current state of the art. The significant performance improvements, along with our detailed analysis, highlight several interesting avenues of future work at the intersection of machine learning, constrained sampling, and automated reasoning.

GitHub: Manthan
|
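The functional synthesis problem itself can be stated in a few lines of code. The sketch below is a brute-force baseline for tiny specifications only, not Manthan's algorithm (which learns G as a classifier from sampled data and refines it with proof-guided repair); the function names and the example specification are our own illustration of the problem statement: find G such that F(x, G(x)) holds for every input x.

```python
from itertools import product

def synthesize(spec, n_inputs):
    """Brute-force functional synthesis for one Boolean output:
    enumerate every truth table for G and return the first G with
    spec(x, G(x)) true for all inputs x (None if no G exists)."""
    xs = list(product([False, True], repeat=n_inputs))
    for bits in product([False, True], repeat=len(xs)):
        g = dict(zip(xs, bits))             # candidate function G
        if all(spec(x, g[x]) for x in xs):  # check F(x, G(x)) everywhere
            return g
    return None

# Example spec F(X, Y): the output y must equal XOR of the two inputs.
spec = lambda x, y: y == (x[0] ^ x[1])
g = synthesize(spec, 2)
```

The enumeration is doubly exponential in the number of inputs, which is exactly the scalability wall that motivates learning-guided approaches like Manthan.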
08:25 - 08:30 PT | 17:25 - 17:30 CEST | 00:25 - 00:30 JST |
Q&A Break |
08:30 - 09:10 PT | 17:30 - 18:10 CEST | 00:30 - 01:10 JST |
Under the Hood of LSOracle (Pierre-Emmanuel Gaillardon)

In this talk, we introduce our recent effort in developing a logic synthesis platform, named LSOracle, whose goal is to achieve or approach global optimality by partitioning and classifying portions of a complete logic design in order to apply ad-hoc logic optimizations (MIG-based and others). Rather than relying on a single data structure and class of heuristics (as ABC does), LSOracle can represent and optimize logic using several Boolean representations (AIGs, MIGs, XMGs) and move independent partitions back and forth between those representations to select the best Power-Performance-Area (PPA) improvement per partition. The structure of LSOracle also lends itself naturally to massively parallel optimization and cloud-based deployments.

GitHub: LSOracle
|
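The per-partition selection idea can be sketched in a few lines. This is a hypothetical toy, not LSOracle's API: the engine functions and their cost numbers are invented for illustration, standing in for real AIG/MIG optimization flows whose results would be compared on actual PPA metrics.

```python
# Hedged sketch of mixed synthesis: run several optimization engines on
# each partition independently and keep the cheapest result per partition.
def optimize_mixed(partitions, engines):
    result = []
    for part in partitions:
        best = min((engine(part) for engine in engines),
                   key=lambda r: r["cost"])
        result.append(best)
    return result

# Toy stand-in engines with illustrative cost models: pretend AIG
# rewriting suits AND-heavy logic and MIG rewriting suits majority-heavy
# logic (the numbers are made up for the example).
def aig_engine(part):
    return {"engine": "aig", "cost": part["ands"] + 3 * part["majs"]}

def mig_engine(part):
    return {"engine": "mig", "cost": 2 * part["ands"] + part["majs"]}

parts = [{"ands": 10, "majs": 1},   # AND-dominated partition
         {"ands": 2, "majs": 8}]    # majority-dominated partition
choice = optimize_mixed(parts, [aig_engine, mig_engine])
```

Because each partition is optimized independently, the loop over partitions is embarrassingly parallel, which matches the cloud-deployment point in the abstract.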
09:10 - 09:15 PT | 18:10 - 18:15 CEST | 01:10 - 01:15 JST |
Q&A Break |
09:15 - 09:55 PT | 18:15 - 18:55 CEST | 01:15 - 01:55 JST |
Using Logic to Understand Learning (Satrajit Chatterjee) [PDF]

A fundamental question in deep learning today is the following: why do neural networks generalize when they have sufficient capacity to memorize their training set? In this talk, I will describe how ideas from logic synthesis can help answer this question. In particular, using the idea of small lookup tables, such as those used in FPGAs, we will see whether memorization alone can lead to generalization; and then, using ideas from logic simulation, we will see whether neural networks do in fact behave like lookup tables. Finally, I'll present a brief introduction to a new theory of generalization for deep learning that has emerged from this line of work.

GitHub: ABC
|
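The "can memorization generalize?" question can be made concrete with a single small lookup table. The experiment below is our own toy setup, not the talk's actual construction: a k-input LUT memorizes a few training samples of 3-input majority and falls back to the most common training label on unseen patterns, and we then measure accuracy over all inputs, including the ones it never saw.

```python
from itertools import product
from collections import Counter

# A single k-input lookup table trained purely by memorization.
class LutLearner:
    def __init__(self):
        self.table = {}
        self.default = False

    def fit(self, samples):          # samples: list of (inputs, label)
        for x, y in samples:
            self.table[x] = y        # pure memorization
        # unseen patterns fall back to the most common training label
        self.default = Counter(y for _, y in samples).most_common(1)[0][0]

    def predict(self, x):
        return self.table.get(x, self.default)

def majority3(x):                    # target concept: 3-input majority
    return sum(x) >= 2

train = [((0, 0, 0), False), ((1, 1, 1), True),
         ((1, 1, 0), True), ((0, 0, 1), False)]
lut = LutLearner()
lut.fit(train)

# Accuracy over all 8 input patterns, including the 4 unseen ones.
acc = sum(lut.predict(x) == majority3(x)
          for x in product([0, 1], repeat=3)) / 8
```

In this toy run the memorizing LUT is perfect on its 4 training patterns and right on 2 of the 4 unseen ones, giving 75% accuracy: better than chance, which hints at why the interplay between table size, training coverage, and generalization is worth studying.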
09:55 - 10:00 PT | 18:55 - 19:00 CEST | 01:55 - 02:00 JST |
Q&A Break |
Organizers
- Heinz Riener, EPFL, CH
- Giovanni De Micheli, EPFL, CH