
Embed2Sym - Scalable Neuro-Symbolic Reasoning via Clustered Embeddings

EasyChair Preprint 8637

11 pages
Date: August 11, 2022

Abstract

Neuro-symbolic reasoning approaches proposed in recent years combine a neural perception component with a symbolic reasoning component to solve a downstream task. By doing so, these approaches can provide neural networks with symbolic reasoning capabilities, improve their interpretability and enable generalization beyond the training task. However, this often comes at the cost of long training times and potential scalability issues. In this paper, we propose a scalable neuro-symbolic approach called Embed2Sym. We complement a two-stage (perception and reasoning) neural network architecture, trained end-to-end on a downstream task, with a symbolic optimisation method for extracting the learned latent concepts. Specifically, the trained perception network generates clusters in embedding space that are identified and labelled using symbolic knowledge and a symbolic solver. With the latent concepts identified, a neuro-symbolic model is constructed by combining the perception network with the symbolic knowledge of the downstream task, resulting in a model that is interpretable and transferable. Our evaluation shows that Embed2Sym outperforms state-of-the-art neuro-symbolic systems on benchmark tasks in terms of training time by several orders of magnitude, while providing similar or better accuracy.
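
To make the pipeline described in the abstract concrete, below is a minimal, self-contained Python sketch of the two steps: cluster the embeddings produced by a trained perception network, then label the clusters consistently with the symbolic knowledge of the downstream task. Everything here is an illustrative assumption rather than the paper's code: scikit-learn KMeans stands in for the clustering, synthetic blob embeddings stand in for the perception network's output, a small digit-addition task stands in for the downstream supervision, and a brute-force permutation search plays the role of the Answer Set Programming solver used in the paper.

from itertools import permutations

import numpy as np
from sklearn.cluster import KMeans


def cluster_embeddings(embeddings, n_concepts):
    """Step 1: group latent embeddings into one cluster per latent concept."""
    km = KMeans(n_clusters=n_concepts, n_init=10, random_state=0)
    return km.fit_predict(embeddings)


def label_clusters(pair_clusters, observed_sums, n_concepts):
    """Step 2 (toy stand-in for the symbolic solver): choose the cluster-to-label
    mapping under which the labelled pairs best reproduce the observed sums."""
    best_map, best_hits = None, -1
    for perm in permutations(range(n_concepts)):
        labels = np.asarray(perm)
        hits = int(np.sum(labels[pair_clusters[:, 0]] +
                          labels[pair_clusters[:, 1]] == observed_sums))
        if hits > best_hits:
            best_map, best_hits = dict(enumerate(perm)), hits
    return best_map


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_concepts = 4  # small label set so the brute-force search stays cheap
    # Fake "embeddings": well-separated blobs standing in for the output of a
    # trained perception network (an assumption made for a runnable demo).
    true_digits = rng.integers(0, n_concepts, size=2000)
    embeddings = rng.normal(loc=true_digits[:, None] * 5.0, scale=0.3, size=(2000, 8))
    clusters = cluster_embeddings(embeddings, n_concepts)

    # Downstream supervision: sums of digit pairs, as in a digit-addition task.
    pair_idx = rng.integers(0, 2000, size=(500, 2))
    observed_sums = true_digits[pair_idx[:, 0]] + true_digits[pair_idx[:, 1]]
    pair_clusters = clusters[pair_idx]

    print(label_clusters(pair_clusters, observed_sums, n_concepts))

Running the script prints a cluster-to-digit mapping; in Embed2Sym it is this kind of mapping that, combined with the perception network and the symbolic task knowledge, yields the interpretable and transferable neuro-symbolic model.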

Keyphrases: Answer Set Programming, Clustering, Perception & Reasoning, embedding space, neuro-symbolic, unstructured data

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:8637,
  author    = {Yaniv Aspis and Krysia Broda and Jorge Lobo and Alessandra Russo},
  title     = {Embed2Sym - Scalable Neuro-Symbolic Reasoning via Clustered Embeddings},
  howpublished = {EasyChair Preprint 8637},
  year      = {EasyChair, 2022}}