Contributors: Laboratoire d'Informatique et des Systèmes (LIS) (Marseille, Toulon); Aix Marseille Université (AMU)-Université de Toulon (UTLN)-Centre National de la Recherche Scientifique (CNRS); Centre de Recherche en Psychologie et Neurosciences (CRPN); Aix Marseille Université (AMU)-Centre National de la Recherche Scientifique (CNRS); Institute of Language, Communication and the Brain (ILCB); Aix Marseille Université (AMU); Laboratoire d'Informatique Fondamentale de Marseille (LIF); Aix Marseille Université (AMU)-École Centrale de Marseille (ECM)-Centre National de la Recherche Scientifique (CNRS); DYNamiques de l'Information (DYNI); Aix Marseille Université (AMU)-Université de Toulon (UTLN)-Centre National de la Recherche Scientifique (CNRS); Laboratoire de Psychologie Cognitive (LPC); Gary Lupyan; ANR-16-CONV-0002, ILCB: Institute of Language Communication and the Brain (2016); ANR-23-CE23-0031, COMPO: Inductive biases for compositional models of natural language (2023); ANR-23-CE28-0008, HEBBIAN: Hebbian learning of sequences (2023)
Abstract: International audience; The deep learning revolution, partly embodied in transformer architectures and pre-trained self-supervised models, opens many perspectives in the study of linguistics and animal communication. By exploring transfer learning approaches for computational bioacoustics applied to primate vocalizations, we delve into the explainability of pre-trained speech models to understand what they can teach us about the origins of language. To examine divergences and similarities between speech and primate vocalizations from a deep learning perspective, our method consists of probing and fine-tuning experiments based on self-supervised acoustic models. By analyzing their ability to process primate vocalizations, we test the effect of model architectures, pre-training datasets, and task specificities on their transfer learning performance. In doing so, we aim to evaluate the validity of deep transfer learning as a scientific tool in the study of the origins of language from a comparative standpoint.
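The probing approach mentioned in the abstract can be illustrated with a minimal sketch: features from a frozen pre-trained acoustic model are fed to a small linear classifier, and the probe's accuracy indicates how much task-relevant information (here, a hypothetical vocalization-type label) the frozen representations already encode. This is not the authors' actual pipeline; the random class-shifted vectors below merely stand in for wav2vec-style embeddings, and all names and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for frozen self-supervised embeddings (e.g. wav2vec-style
# features): random vectors with a class-dependent shift so that the
# probe has something to learn. Shapes are arbitrary for illustration.
n_per_class, dim, n_classes = 100, 32, 4
centers = rng.normal(size=(n_classes, dim))
X = np.vstack([centers[c] + 0.5 * rng.normal(size=(n_per_class, dim))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

# Linear probe: multinomial logistic regression trained by gradient
# descent while the "backbone" features X stay frozen.
W = np.zeros((dim, n_classes))
b = np.zeros(n_classes)
for _ in range(200):
    logits = X @ W + b
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)                # softmax probabilities
    p[np.arange(len(y)), y] -= 1.0                   # cross-entropy gradient
    W -= 0.1 * X.T @ p / len(y)
    b -= 0.1 * p.mean(axis=0)

acc = ((X @ W + b).argmax(axis=1) == y).mean()
print(f"probe accuracy: {acc:.2f}")
```

In the fine-tuning condition contrasted with probing, the backbone weights would also be updated on the downstream task; comparing the two regimes across architectures and pre-training datasets is what separates what the model already knows from what it can adapt to.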