Seminar

Knowledge Dependencies in Large Language Models

2 May 2024
Start time: 
16:00
Online
Organized by: 
Doctoral Programme in Cognitive and Brain Sciences, CIMeC
Intended for: 
UniTrento alumni
University community
UniTrento staff
Participation: 
Free admission
Online, booking required
Booking email: 
Booking deadline: 
2 May 2024, 12:00
Contact: 
Matteo De Matola, Davide Mazzaccara, Elisa Pasquini
+39 0464 808617
Speaker: 
Mor Geva, Ph.D., Senior Lecturer, School of Computer Science, Tel Aviv University

Some of the most pressing issues with large language models (LLMs), such as the generation of factually incorrect text and logically incorrect reasoning, may be attributed to the way models represent and recall knowledge internally. In this talk, we will evaluate the representation and utilization of knowledge dependencies in LLMs from two different perspectives. First, we will consider the task of knowledge editing, showing that (a) using various editing methods to edit a specific fact does not implicitly modify other facts that depend on it, and (b) some facts are often hard to disentangle. Next, we will consider the setting of latent multi-hop reasoning, showing that LLMs only weakly rely on knowledge dependencies when answering complex queries. While these shortcomings could potentially be mitigated by intervening on the LLM computation, they call for better training procedures and possibly new architectures.
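To make the latent multi-hop setting concrete, the sketch below (an illustration, not code from the speaker) probes whether a model's answer to a composed two-hop query agrees with the answers it gives to each hop separately; the model name (gpt2), the prompts, and the complete helper are all illustrative assumptions.

```python
from transformers import pipeline

# Hypothetical dependency probe: compare a model's answer to a composed
# two-hop query against the composition of its answers to each hop.
generator = pipeline("text-generation", model="gpt2")

def complete(prompt: str) -> str:
    # Greedy decoding; return only the continuation, not the prompt.
    out = generator(prompt, do_sample=False, max_new_tokens=5)[0]["generated_text"]
    return out[len(prompt):].strip()

# Hop 1: resolve the bridge entity.
bridge = complete("The capital of France is").split()[0].strip(".,")

# Hop 2: query the dependent fact about the bridge entity directly.
direct = complete(f"The river that flows through {bridge} is the")

# Composed query: if the model exploited the knowledge dependency, this
# answer should agree with the direct one above.
composed = complete("The river that flows through the capital of France is the")

print("bridge entity:", bridge)
print("direct fact:  ", direct)
print("composed:     ", composed)
```

A systematic disagreement between the direct and composed answers, across many such fact pairs, would be one symptom of the weak reliance on knowledge dependencies that the talk examines.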