ADAPT Director Prof. John Kelleher Highlights New Framework for Stronger Sentence Representations

23 December 2025

A study by ADAPT Director Prof. John Kelleher and ADAPT researcher Vasudevan Nedumpozhimana has recently been published in the journal Transactions on Machine Learning Research. The paper, titled “Know Yourself and Know Your Neighbour: A Syntactically Informed Self-Supervised Compositional Sentence Representation Learning Framework using a Recursive Hypernetwork”, investigates how to build sentence representations that reflect real structural and semantic understanding rather than simple word averaging.

The team has spent years examining neural language embeddings to understand what linguistic information neural networks encode, and their latest work focuses on improving how models encode that information. They introduce a lightweight recursive hypernetwork trained on top of a pre-trained neural language model. The design builds sentence meaning directly from a parse tree, keeps syntactic and semantic information separate yet linked, and learns through six self-supervised tasks created for this work.
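To make the idea concrete, here is a minimal toy sketch (not the authors' code) of what "recursive hypernetwork composition over a parse tree" can mean: a hypernetwork maps each node's syntactic label embedding to the weights used to compose that node's children's semantic embeddings, so syntax and semantics stay in separate but linked streams. All names, dimensions, and the tree format below are illustrative assumptions.

```python
# Hypothetical sketch of hypernetwork-based recursive composition.
# Assumption: binary parse trees given as ("LABEL", left, right) tuples,
# with string leaves looked up in a word-embedding table.
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # toy embedding size

# Hypernetwork parameters: syntactic label embedding -> flattened
# composition matrix of shape (DIM, 2*DIM).
H = rng.normal(scale=0.1, size=(DIM, 2 * DIM * DIM))

def compose(label_emb, left_sem, right_sem):
    """Generate a composition matrix from the node's syntactic label,
    then apply it to the concatenated child semantic vectors."""
    W = (label_emb @ H).reshape(DIM, 2 * DIM)
    return np.tanh(W @ np.concatenate([left_sem, right_sem]))

def encode(node, word_emb, label_emb):
    """Recursively build a sentence vector bottom-up from the parse tree."""
    if isinstance(node, str):          # leaf: just the word embedding
        return word_emb[node]
    label, left, right = node          # internal node: compose children
    return compose(label_emb[label],
                   encode(left, word_emb, label_emb),
                   encode(right, word_emb, label_emb))

# Toy vocabulary, label set, and parse of "the cat sleeps".
words = {w: rng.normal(size=DIM) for w in ["the", "cat", "sleeps"]}
labels = {l: rng.normal(size=DIM) for l in ["S", "NP"]}
tree = ("S", ("NP", "the", "cat"), "sleeps")

sentence_vec = encode(tree, words, labels)
print(sentence_vec.shape)  # (8,)
```

In a trained system the hypernetwork weights would of course be learned (here via the paper's six self-supervised tasks), and the composition would sit on top of contextual embeddings from a pre-trained language model rather than random vectors; the sketch only shows the structural recursion.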

Experiments show that the method captures deeper linguistic detail than several strong baselines, adapts well to varied syntactic structures, and remains stable across different sentence lengths.

Access the full paper here.