Exploring Explainability in Multimodal Emotion Prediction: A Comparative Study on the ARTEMIS Dataset
This repository contains the code and related files for our multimodality project: Exploring Explainability in Multimodal Emotion Prediction: A Comparative Study on the ARTEMIS Dataset.
This project is conducted by Longfei Zuo, Xiaoyu Zhao, Pingjun Hong, and Zhen Wang as part of the course Erweiterungsmodul Computerlinguistik at LMU Munich.
Overleaf Link: https://www.overleaf.com/7772528279mnfwpnssndbw#f81f91
ArtEmis: Affective Language for Visual Art
Home Page: https://www.artemisdataset.org/
Introduction: The ArtEmis dataset is built on top of the publicly available WikiArt dataset, which contains 80,031 unique, carefully curated artworks from 1,119 artists (as downloaded in 2015).
WikiArt | All images (120k+)
Home Page: https://www.kaggle.com/datasets/antoinegruson/-wikiart-all-images-120k-link?resource=download
Introduction: Links to WikiArt images, scraped from https://www.wikiart.org/en/paintings-by-style
The metadata CSV has 5 columns: Style, Artwork Name, Artist, Date, Link
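A minimal sketch of loading that metadata with Python's csv module; the exact header names and the sample row are assumptions for illustration, not taken from the actual Kaggle file:

```python
import csv
import io

# The five columns listed above; the exact header spelling in the
# downloaded file may differ slightly (assumed here).
WIKIART_COLUMNS = ["Style", "Artwork Name", "Artist", "Date", "Link"]

def load_wikiart_rows(fileobj):
    """Read the WikiArt metadata CSV into a list of per-artwork dicts."""
    reader = csv.DictReader(fileobj)
    return list(reader)

# Inline sample standing in for the real CSV file.
sample = io.StringIO(
    "Style,Artwork Name,Artist,Date,Link\n"
    "Impressionism,Water Lilies,Claude Monet,1906,https://uploads.wikiart.org/example.jpg\n"
)
rows = load_wikiart_rows(sample)
print(rows[0]["Artist"])  # Claude Monet
```

Each row can then be joined against ArtEmis annotations by artwork name and artist.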
IDEFICS: https://huggingface.co/HuggingFaceM4/idefics-9b-instruct
IDEFICS Playground: https://huggingface.co/spaces/HuggingFaceM4/idefics_playground
Moodle Home Page for the Lecture (Vorlesung): https://moodle.lmu.de/course/view.php?id=34505
Moodle Home Page for the Tutorial (Übung): https://moodle.lmu.de/course/view.php?id=34506