EyeT4Empathy: Eye Movement and Empathy Dataset

Use Case

Computer Vision

Description

Discover the EyeT4Empathy dataset, a resource that pairs eye-movement tracking with empathy assessments.

Overview

Developed through a collaborative effort by Pedro Lencastre, Samip Bhurtel, Anis Yazidi, Gustavo B. M. e Mello, Sergiy Denysov, and Pedro G. Lind, the EyeT4Empathy dataset is a rich resource capturing intricate patterns of eye movement and empathy evaluations from 60 participants. This dataset uniquely merges the exploration of visual information, gaze typing dynamics, and empathy assessment, offering a detailed view into the interaction between visual cognition and emotional understanding.

Tasks and Methodology: Participants engaged in two specific activities:

  1. Visual Exploration: Participants freely observed unstructured images, allowing for natural eye movement tracking.
  2. Gaze Typing: Utilizing an eye-gaze interface, participants completed typing tasks, providing insights into gaze-controlled interactions.

These activities were recorded using advanced eye-tracking technologies, resulting in a dataset that includes gaze positions and pupil diameter metrics. The dataset is divided into two segments, each catering to the distinct tasks performed.

Empathy Assessment: The empathy aspect of the dataset focuses on assessing both cognitive and affective components of empathy, particularly relevant for individuals with movement impairments. This was evaluated through a detailed 40-question survey administered twice to each participant, aiming to gauge changes or consistencies in empathic responses.

Dataset Specifications:

  - Measurements: gaze position, pupil diameter
  - Technology: eye-tracking devices
  - Influencing factors: luminosity, screen distance
  - Sample details:
      - Organism: human
      - Setting: controlled laboratory environment
      - Location: Oslo, Norway

Data Availability: The dataset, which includes eye movement recordings and empathy evaluation responses, is publicly accessible through Scientific Data. It comprises:

  - Eye Tracker Data: 504 CSV files detailing eye movements.
  - Raw Data: 60 TSV files capturing comprehensive eye-tracking data.
  - Questionnaires: files available in PDF and other formats, providing insights into the empathy assessments.
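As a starting point, the per-recording CSV files can be loaded with pandas. The column names below are illustrative stand-ins (the real ones are defined in 'columns_explained.pdf'); the snippet uses an inline sample to show the typical workflow of reading a recording and dropping samples where the tracker lost the eyes:

```python
import io
import pandas as pd

# Hypothetical excerpt mimicking an eye-tracker CSV; actual column
# names and units are documented in 'columns_explained.pdf'.
sample_csv = io.StringIO(
    "Recording timestamp,Gaze point X,Gaze point Y,"
    "Pupil diameter left,Pupil diameter right\n"
    "0,912.4,517.8,3.12,3.05\n"
    "17,915.1,520.3,3.10,3.07\n"
    "33,,,,\n"  # empty fields: tracking loss (e.g., a blink)
    "50,918.9,522.0,3.15,3.11\n"
)

df = pd.read_csv(sample_csv)

# Keep only samples with a valid gaze position before analysis.
valid = df.dropna(subset=["Gaze point X", "Gaze point Y"])
print(len(valid))
print(valid["Pupil diameter left"].mean())
```

The same pattern applies to the 60 raw TSV files by passing `sep="\t"` to `read_csv`.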

Supplementary Documentation: additional resources such as 'columns_explained.pdf' and 'coordinate_system.pdf' are available, offering detailed explanations of the data columns and the coordinate system used.

Dataset Usage: This dataset is a valuable tool for researchers interested in the intersections of eye movement dynamics, user interface design via gaze tracking, and the psychological aspects of empathy. It provides a comprehensive base for studying how eye movement patterns can reflect cognitive and emotional states.
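To illustrate one such analysis, the sketch below separates large gaze jumps (saccade-like movements) from small fixational drift in a toy gaze trace. The trace values and the 50-pixel threshold are illustrative assumptions, not parameters from the dataset:

```python
import numpy as np

# Toy gaze trace in screen pixels; real traces come from the CSV files.
x = np.array([100.0, 105.0, 300.0, 302.0, 301.0])
y = np.array([200.0, 198.0, 400.0, 401.0, 399.0])

# Sample-to-sample displacement: large jumps suggest saccades,
# small ones fixational drift. The 50 px cutoff is illustrative.
dx, dy = np.diff(x), np.diff(y)
step = np.hypot(dx, dy)
saccade_like = step > 50.0

print(step.round(1))
print(int(saccade_like.sum()))  # → 1
```

Counts and amplitudes of such jumps, together with pupil-diameter statistics, are the kind of features typically related to the questionnaire-based empathy scores.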

This dataset is sourced from Kaggle.
