Abstract:
Embedding symbolic logical formulas into a low-dimensional continuous space provides an effective foundation for neural-symbolic systems. However, current studies are constrained to modeling syntactic structure and fail to preserve intrinsic semantics. To this end, we propose a novel model of Contrastive Graph Representations (ConGR) for logical formula embedding. First, it introduces a densely connected graph convolutional network (GCN) with an attention mechanism to process syntax parsing graphs of formulas. Second, contrastive instances for each anchor formula are generated by transformations guided by logical properties. Two types of contrast, global-local and global-global, are carried out to refine formula embeddings with semantic information. Extensive experiments demonstrate that ConGR obtains superior performance over state-of-the-art baselines.
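The global-global contrast described above pairs each anchor formula's embedding with the embedding of its logically transformed view. The abstract does not give the loss; a common choice for this setup is an NT-Xent (normalized temperature-scaled cross-entropy) objective, sketched below as a minimal, dependency-free illustration. The function names and the temperature value are illustrative assumptions, not details from the paper.

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def nt_xent_loss(anchors, positives, temperature=0.5):
    """Illustrative NT-Xent contrastive loss (an assumption, not ConGR's
    published objective): each anchor's positive is its own transformed
    view; the other views in the batch serve as in-batch negatives."""
    n = len(anchors)
    total = 0.0
    for i in range(n):
        # Scaled similarities of anchor i against every view in the batch.
        sims = [math.exp(cosine(anchors[i], positives[j]) / temperature)
                for j in range(n)]
        # Cross-entropy: pull the matching view close, push others apart.
        total += -math.log(sims[i] / sum(sims))
    return total / n
```

As a sanity check, a batch whose anchors align with their own views yields a lower loss than one where the views are mismatched, which is the behavior a semantics-preserving contrast relies on.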
Date of Conference: 13-16 May 2024
Date Added to IEEE Xplore: 23 July 2024