Ziyang Li (UPenn)
October 12, 2023

We present Scallop, a language that combines the benefits of deep learning and logical reasoning. Scallop enables users to write a wide range of neurosymbolic applications and to train them in a data- and compute-efficient manner. It achieves these goals through three key features: 1) a flexible symbolic representation based on the relational data model; 2) a declarative logic programming language based on Datalog that supports recursion, aggregation, and negation; and 3) a framework for automatic and efficient differentiable reasoning based on the theory of provenance semirings. Our evaluation demonstrates that Scallop can express algorithmic reasoning in diverse and challenging AI tasks, and that it provides a succinct interface for machine learning programmers to integrate logical domain knowledge. We further demonstrate Scallop’s role in the era of foundation models: by connecting Scallop to an extensible library of 12 foundation models, we apply our relational programming language to 9 challenging tasks spanning natural language, vision, and structured and vector databases. In no-training settings, our solutions achieve accuracy comparable to or better than competitive baselines.
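
To give a flavor of these features, below is a minimal, hypothetical sketch in Scallop-style syntax, loosely modeled on the digit-sum example commonly used to introduce the language. The relation names (digit, sum_2, edge, path) are illustrative assumptions, and the exact surface syntax may differ from the released toolchain.

    // Probabilistic facts such as digit(i, d) -- "image i depicts digit d" --
    // would typically be predicted by a neural network rather than written by hand.
    type digit(img_id: usize, value: usize)

    // Logical domain knowledge: the sum of the two recognized digits.
    rel sum_2(a + b) = digit(0, a) and digit(1, b)

    // Recursion is supported as well, e.g. reachability over an edge relation.
    type edge(from_node: usize, to_node: usize)
    rel path(a, b) = edge(a, b) or (path(a, c) and edge(c, b))

In a learning setting, the probabilities attached to facts such as digit come from a neural model, and the provenance-semiring machinery mentioned above propagates gradients from the derived sum_2 distribution back to those predictions.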

About Ziyang Li (UPenn)

Ziyang Li is a fifth-year doctoral candidate at the University of Pennsylvania, advised by Prof. Mayur Naik, and an Amazon Fellow. His research spans programming languages and machine learning, with a particular focus on neuro-symbolic techniques, and he is the lead designer of the Scallop language. His work applying neuro-symbolic techniques has been showcased across machine learning, natural language processing, computer vision, programming languages, and security.