Knowledge Dependencies in Large Language Models

2 May 2024
Start time: 4:00 pm
Doctorate Program in Cognitive and Brain Sciences, CIMeC
Target audience: 
UniTrento alumni
University community
UniTrento staff
Online – Registration required
Registration email: 
Registration deadline: 2 May 2024, 12:00
Contact persons: Matteo De Matola, Davide Mazzaccara, Elisa Pasquini
+39 0464 808617
Mor Geva, Ph.D., Senior Lecturer at the School of Computer Science, Tel Aviv University

Some of the most pressing issues with large language models (LLMs), such as the generation of factually incorrect text and logically flawed reasoning, may be attributed to the way these models represent and recall knowledge internally. In this talk, we will examine how LLMs represent and use knowledge dependencies from two perspectives. First, we will consider the task of knowledge editing, showing that (a) editing a specific fact with various editing methods does not implicitly update other facts that depend on it, and (b) certain facts are hard to disentangle from one another. Next, we will consider the setting of latent multi-hop reasoning, showing that LLMs rely only weakly on knowledge dependencies when answering complex queries. While these shortcomings could potentially be mitigated by intervening on the LLM's computation, they call for better training procedures and possibly new architectures.