Humane AI

Projects

Learning or Aligning?

Eedan Amit-Danhi, in cooperation with Qinfeng Zhu and Thijs de Zee. In this project, we examine the role of automated visualization in digital voting aids such as Kieskompas and StemWijzer. We conduct walkthrough interviews with voters, as well as semi-structured interviews with the creators of the voting aids, to find out whether automation fosters greater trust and more informed civic choices.

Visual (AI) disinformation

Kun He, in cooperation with Jiapan Guo. This is a DSSC XS-funded project. Although extensive research has been conducted to detect and analyze text-based mis/disinformation, there is a lack of research on the spread and evolution of visual (AI) disinformation. This project aims to address this gap by analyzing the diffusion and evolution of visual (AI) disinformation on digital platforms.

Responsible, ethical, competitive: Ecologies of Artificial Intelligence in Inter/National Security

Benjamin Johnson. This project will qualitatively analyze recent national, regional, and international efforts to incorporate ethics, broadly understood, into regulatory regimes for artificial intelligence (AI), including its development and use, within a strategic framework of economic and geopolitical competition.

Evaluating language models from a multilingual perspective

Yevgen Matusevych. The abilities of language models often mirror those of humans, and the observed parallels open discussions about cognition-inspired AI. Nevertheless, few studies compare what models and humans can and cannot do, and even less work compares models to multilingual or non-native speakers. This gap hinders our understanding of models and affects the study of bilingualism, which has historically suffered from a lack of formal models. In this project, we evaluate large language and speech models from the perspective of the cognitive science of bi-/multilingualism.
