3D Data Visualization of Machine Learning


Demo Video


Project Info

Project Duration Jan - May 2022
Genre Discovery Project
Platform PC, VR, Leap Motion, Ultrahaptics
Project Website
Machine Learning Algorithm K-Means, Fast Fourier Transform, Reinforcement Learning (Q-Learning)
Tools Unity
Team Size 2 Programmers, 2 Artists, 1 Sound Designer, 1 UI/UX Designer, 1 Technical Artist
Team Members Bokang Wang, Anlan Yang, Jenifer Liao, Leah Lee, Jack Wesson, Ruizi Wang, Yigang Wen
My Contributions Producer, Programming, Game Design, Playtesting

Introduction

This is a project with Google at Carnegie Mellon University's Entertainment Technology Center (ETC), exploring how 3D data visualization and interaction can be used to represent the high-dimensional features of machine learning models in an engaging environment.

Machine learning is often regarded as a black box. Our project aimed to open that black box through engaging interactive experiences.

We made five prototypes in total, exploring four data types (traditional numbers, images, music, AI locomotion), three ML algorithms (K-Means, Fast Fourier Transform, Q-Learning), and three platforms (PC, VR, Ultrahaptics).


Target Demographic

People in higher education who feel intimidated by the concept of ML


Global Hypothesis

Providing an engaging experience can raise interest and curiosity, shift attitudes toward ML, and build a cyclical relationship between the pursuit of learning and engagement.


Prototype 1 & 2 - Isochromatic Deconstruction Using K-Means

We visualized RGB data using the K-Means clustering algorithm to split a painting into isochromatic layers. The team treated each pixel's RGB value as an XYZ coordinate and clustered the pixels into layers grouped by color proximity. This experience lifts a 2D image into 3D space, making a unique impression. We built the project in Unity 3D and presented it in a virtual space with an Oculus Quest 2 head-mounted display and trackers.

Multiple factors made the experience successful. First, we visualized direct, unambiguous data: a 2D image of a painting reads clearly as a pixel data set and created a memorable impression. Moreover, allowing customization led to personal investment in image selection, which made the experience more relatable and engaging. The VR variant pulled users further into an immersive experience, adding avenues in which users could invest their attention and offering a more visually striking, more memorable experience.
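The clustering idea behind these prototypes can be sketched in a few lines. This is a minimal, illustrative K-Means over RGB triples treated as XYZ points, not the team's actual Unity code; the toy pixel data and the simple first-k centroid initialization are assumptions for the example.

```python
def kmeans(points, k, iters=20):
    """Cluster 3D points (e.g. RGB triples) into k groups by Euclidean proximity."""
    # Simple deterministic init: take the first k points as centroids.
    # (Production K-Means would use random or k-means++ seeding.)
    centroids = [tuple(p) for p in points[:k]]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: sum((a - b) ** 2
                          for a, b in zip(p, centroids[i])))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster.
        for i, members in enumerate(clusters):
            if members:
                centroids[i] = tuple(sum(c) / len(members) for c in zip(*members))
    return centroids, clusters

# Toy "image": pixels near pure red and pure blue separate into two layers.
pixels = [(250, 5, 5), (5, 0, 250), (240, 10, 0), (0, 10, 245)]
centroids, layers = kmeans(pixels, k=2)
```

In the prototype, each resulting cluster becomes one isochromatic layer floated at its own depth in 3D space.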


Prototype 3 - Haptic Music Using Fast Fourier Transform

We mapped the beats, notes, and accents of a song to independent sensations across the hand, pairing visual metaphors on screen with matching sound and haptics to maximize the experience of 'feeling' music. This let users haptically explore a frequency decomposition of sonic data, rendering any user-selected song through the STRATOS Ultrahaptics device. The novelty of the interaction bolstered interest: though interest began high from novelty alone, that novelty also encouraged return visits, and those subsequent re-engagements led to increasing levels of understanding.
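The frequency analysis underneath this prototype can be illustrated with a naive discrete Fourier transform (the FFT computes the same result, just faster). This sketch picks the dominant frequency bin out of raw audio samples; the 440 Hz test tone and 8000 Hz sample rate are assumptions for the example, and the real prototype's audio pipeline and haptic mapping are not shown.

```python
import math

def dft_magnitudes(samples):
    """Naive discrete Fourier transform: magnitude of each frequency bin."""
    n = len(samples)
    mags = []
    for k in range(n // 2):  # bins up to the Nyquist frequency
        re = sum(s * math.cos(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        im = -sum(s * math.sin(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        mags.append(math.hypot(re, im))
    return mags

# A pure 440 Hz tone sampled at 8000 Hz for 200 samples.
rate, n = 8000, 200
samples = [math.sin(2 * math.pi * 440 * t / rate) for t in range(n)]
mags = dft_magnitudes(samples)
peak_bin = max(range(len(mags)), key=mags.__getitem__)
peak_hz = peak_bin * rate / n  # bin index -> frequency in Hz
```

Peaks like `peak_hz` in successive audio windows are what get mapped to distinct haptic sensations on the hand.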


Prototype 4 - Reinforcement Learning Dinosaur Using Q-Learning

We visualized reinforcement learning using a dinosaur (agent), meat (reward), and lava (punishment) to build a narrative in game form. The primary engagement lesson centered on storytelling, which, combined with related metaphors, increased investment and inclination toward sustained, longer-term interaction. Moreover, gamifying the experience by rewarding and punishing agent behavior boosted user attentiveness and resulted in a higher content-retention rate.
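In Q-Learning terms, the dinosaur learns a table of state-action values from meat (+1) and lava (-1) rewards. A minimal tabular sketch on an assumed 1-D strip of cells, not the prototype's actual environment:

```python
import random

def train_q(n_states=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=1):
    """Tabular Q-learning on a 1-D strip: lava at cell 0 (-1), meat at the last cell (+1)."""
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]  # q[state][action]; 0 = left, 1 = right
    for _ in range(episodes):
        s = n_states // 2  # the dinosaur starts in the middle
        while 0 < s < n_states - 1:
            # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
            a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda x: q[s][x])
            s2 = s - 1 if a == 0 else s + 1
            r = -1.0 if s2 == 0 else (1.0 if s2 == n_states - 1 else 0.0)
            # Q-learning update toward reward plus discounted best future value.
            target = r if s2 in (0, n_states - 1) else r + gamma * max(q[s2])
            q[s][a] += alpha * (target - q[s][a])
            s = s2
    return q

q = train_q()
# Greedy policy for the non-terminal states (1 = move right, toward the meat).
policy = [max((0, 1), key=lambda a: q[s][a]) for s in range(1, 4)]
```

After training, the greedy policy steers the agent toward the meat from every cell, which is exactly the behavior the prototype lets players watch emerge.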


Prototype 5 - K-Means Garden Clustering

We visualized the K-Means algorithm using flowers to represent data points. By scaffolding the instructions, we aimed to show how K-Means works step by step.


Final Conclusion

Relatable visual metaphors increase informational accessibility and emotional memorability.
Greater accessibility brings greater engagement, feeding a cycle that can produce more lasting paradigm shifts.


My Contributions

  • As a programmer, I implemented the Haptic Music prototype using the Fast Fourier Transform, the K-Means Garden prototype, and the Reinforcement Learning Dinosaur prototype.
  • As a producer, I led a team of seven, served as the point of contact with our advising professor and Google scientists, fostered productive team communication, managed the development process, and maintained the development blog and website.