Affect dataset collection

I developed a virtual reality system that was used to collect an affect-labeled dataset. The system uses several novel tasks to establish baseline pupil light responses (PLR) for each participant. After the PLR baseline was established, participants were shown 360-degree and stereoscopic 360-degree videos predicted to elicit particular affective states. While presenting stimuli, the system collected the user's physiological signals, pose data, and heart rate. After each stimulus, participants were shown a second version of the video, generated with custom blur filters, to provide an additional PLR reference point for the stimulus presentation.

Private repository pending publication


Unity | C# | Python | C++ | Fitbit | Virtual Reality


Details

Python

  • Accessed the Fitbit Web API using the CherryPy web framework
  • Formatted Fitbit data to match the C#/Unity logs using NumPy and Pandas
  • Saved new logs as CSV files
  • Generated blurred versions of the videos using OpenCV
  • Hand-implemented convolutional Gaussian filters

Video stream (C++)

  • Accessed the device’s raw eye camera bitstream
  • Converted the bitstream into AVI files
  • Generated metadata for the AVI files from device timecodes and frame numbers
  • Executed the C++ AVI recording code from within Unity using C# to capture eye camera streams during study tasks (a launch sketch follows below)
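
Below is a minimal sketch of how the recorder could be launched from within Unity, assuming the C++ code is compiled as a standalone executable; the executable name, the --output flag, and the exit-on-EOF behavior are placeholders rather than the actual tool's interface.

    using System.Diagnostics;
    using UnityEngine;

    public class EyeCameraRecorderLauncher : MonoBehaviour
    {
        // Hypothetical path to the compiled C++ AVI recorder.
        public string recorderPath = "Tools/EyeCameraRecorder.exe";

        private Process recorderProcess;

        // Start capturing the raw eye camera stream to an AVI file.
        public void StartRecording(string outputPath)
        {
            recorderProcess = new Process();
            recorderProcess.StartInfo.FileName = recorderPath;
            recorderProcess.StartInfo.Arguments = $"--output \"{outputPath}\"";
            recorderProcess.StartInfo.UseShellExecute = false;
            recorderProcess.StartInfo.RedirectStandardInput = true;
            recorderProcess.Start();
        }

        // Ask the recorder to finalize the AVI file and exit.
        public void StopRecording()
        {
            if (recorderProcess != null && !recorderProcess.HasExited)
            {
                // Assumes the recorder finishes and exits when its stdin is closed.
                recorderProcess.StandardInput.Close();
                recorderProcess.WaitForExit();
            }
        }
    }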

Shaders (HLSL)

  • Edited existing shaders to create new shaders for 360-degree and stereoscopic 360-degree video playback
  • Played videos on a sphere game object with inverted normals so that the material rendered on the inside of the sphere, facing the user
  • Created color and light intensity sequencing shaders that cycle through the RGB spectrum and luminance ranges (a driver sketch follows below)
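
The sequencing itself lives in the shaders, but a minimal C# driver for the sweep might look like the sketch below; the _Color and _Intensity property names are assumptions about the shader interface, not the actual ones.

    using UnityEngine;

    public class ColorLuminanceSequencer : MonoBehaviour
    {
        public Renderer sphereRenderer;   // sphere with inverted normals
        public float cycleSeconds = 10f;  // duration of one full sweep

        void Update()
        {
            float t = (Time.time % cycleSeconds) / cycleSeconds;

            // Sweep hue through the RGB spectrum at full saturation and value.
            Color color = Color.HSVToRGB(t, 1f, 1f);

            // Sweep luminance from dim to bright over the same cycle.
            float intensity = Mathf.Lerp(0.1f, 1f, t);

            // _Color and _Intensity are assumed shader property names.
            sphereRenderer.material.SetColor("_Color", color);
            sphereRenderer.material.SetFloat("_Intensity", intensity);
        }
    }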

Interactions (C#)

  • Created timed gaze targets that participants had to fixate for 3 seconds (see the dwell-timer sketch after this list)
  • Generated gaze tracing tasks where users were required to trace a pattern with their eyes
  • Implemented ray casting and ray-cast UI elements using Unity XR
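
A minimal sketch of the dwell-timer logic behind the timed gaze targets, assuming gaze is available as a ray and the target is hit-tested with a physics raycast; the onTargetCompleted event and the per-frame UpdateGaze call are placeholders for however the study code is wired together.

    using UnityEngine;
    using UnityEngine.Events;

    public class TimedGazeTarget : MonoBehaviour
    {
        public float dwellSeconds = 3f;       // required fixation time
        public UnityEvent onTargetCompleted;  // fired once the dwell completes

        private float dwellTimer;
        private bool completed;

        // Called once per frame with the current gaze ray from the eye tracker.
        public void UpdateGaze(Ray gazeRay)
        {
            if (completed) return;

            // Check whether the gaze ray currently hits this target's collider.
            bool gazing = Physics.Raycast(gazeRay, out RaycastHit hit) &&
                          hit.collider.gameObject == gameObject;

            // Accumulate dwell time while gazing; reset when gaze leaves the target.
            dwellTimer = gazing ? dwellTimer + Time.deltaTime : 0f;

            if (dwellTimer >= dwellSeconds)
            {
                completed = true;
                onTargetCompleted?.Invoke();
            }
        }
    }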

Data logging (C#)

  • Automatically created a directory structure for each participant using the System.IO library (see the logging sketch after this list)
  • Generated CSV files using file reader and writer tools from within Unity
  • Recorded device tracker information at 60 Hz
  • Recorded gaze data at 200 Hz
  • Recorded other logs using events and listeners for button presses and less frequent system events (e.g., video start time)
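
A minimal sketch of the per-participant directory setup and CSV writing, assuming a simple timestamped row format; the folder layout, file name, and column names are placeholders rather than the actual log schema.

    using System.IO;
    using UnityEngine;

    public class ParticipantLogger : MonoBehaviour
    {
        private StreamWriter trackerLog;

        // Create the participant's directory tree and open a tracker CSV log.
        public void BeginSession(string participantId)
        {
            string root = Path.Combine(Application.persistentDataPath, participantId);
            Directory.CreateDirectory(Path.Combine(root, "tracker"));
            Directory.CreateDirectory(Path.Combine(root, "gaze"));

            trackerLog = new StreamWriter(Path.Combine(root, "tracker", "tracker.csv"));
            trackerLog.WriteLine("timestamp,pos_x,pos_y,pos_z,rot_x,rot_y,rot_z,rot_w");
        }

        // Append one device tracker sample (called at the 60 Hz tracker rate).
        public void LogTrackerSample(Transform device)
        {
            Vector3 p = device.position;
            Quaternion r = device.rotation;
            trackerLog.WriteLine($"{Time.time},{p.x},{p.y},{p.z},{r.x},{r.y},{r.z},{r.w}");
        }

        // Flush and close the log at the end of the session.
        public void EndSession()
        {
            trackerLog?.Close();
        }
    }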

UI (C#)

  • Designed head-tracked menus with soft viewing-angle constraints by converting the menu location into a 2D coordinate system based on the user's viewport, then computing the menu location and viewing angles using a point projected along the user's forward vector
  • Developed a locator arrow that adjusted its opacity based on its distance from a target object (see the sketch after this list)
  • Kept the arrow within the user's viewport at the position closest to the target object
  • Rotated the arrow toward the target object
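
A minimal sketch of the locator arrow behavior, assuming the arrow is a separate object with an alpha-blended material; the viewport margin, arrow depth, and fade distance are placeholder values.

    using UnityEngine;

    public class LocatorArrow : MonoBehaviour
    {
        public Transform target;          // object the arrow points to
        public Camera headCamera;         // user's head-mounted camera
        public Renderer arrowRenderer;    // arrow material with an alpha channel
        public float arrowDepth = 2f;     // distance of the arrow in front of the camera
        public float fadeDistance = 5f;   // world distance over which the arrow fades out

        void Update()
        {
            // Project the target into viewport space and clamp it inside the view,
            // keeping the arrow at the viewport position closest to the target.
            Vector3 vp = headCamera.WorldToViewportPoint(target.position);
            if (vp.z < 0f) vp = -vp;  // rough handling of targets behind the user
            vp.x = Mathf.Clamp(vp.x, 0.1f, 0.9f);
            vp.y = Mathf.Clamp(vp.y, 0.1f, 0.9f);
            vp.z = arrowDepth;
            transform.position = headCamera.ViewportToWorldPoint(vp);

            // Rotate the arrow toward the target object.
            transform.rotation = Quaternion.LookRotation(target.position - transform.position);

            // Fade the arrow out as it gets close to the target in world space.
            float alpha = Mathf.Clamp01(Vector3.Distance(transform.position, target.position) / fadeDistance);
            Color c = arrowRenderer.material.color;
            c.a = alpha;
            arrowRenderer.material.color = c;
        }
    }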

Video of pupil light response collection