
Hunter Finney

Extended Reality Researcher with 7+ years of laboratory and development experience leading projects from proposal to publication. My interdisciplinary research focuses on understanding how humans perceive, remember, and interact with real and virtual environments.

Extended Reality Researcher

Salt Lake City, Utah

hunter@hunterfinney.com

(662) 404-4868

LinkedIn


Skills

Unity 3D

7+ Years of Project Development

Effective Coding

8+ Years of Experience

Human Research Design

7+ Years of On-the-Job Practice

Technical Communication

5+ Years of Publications & Talks

Languages

C# 

100+ Unity Projects / 350+ Scripts

Java 

100+ Projects / 175+ Files

Python

25+ Projects

C / C++

15+ Projects


Kids in VR

Vision, Audition, & Action in Space & Time (VAAST) Lab
2020 - 2024
University of Utah. Salt Lake City, UT

Kids use VR too! This series of experiments focuses on affordance perception and calibration in kids compared to adults. The projects I led focused on reaching up, reaching out, stepping up, and stepping out within the real and virtual worlds.
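A minimal Unity (C#) sketch of how one of these judgement trials could be driven, where a participant adjusts a shelf until it looks just reachable. The object names, input bindings, and values are placeholders rather than the studies' actual code.

using UnityEngine;

// Hypothetical sketch of an affordance-judgement trial: the participant adjusts
// a shelf's height until it looks "just reachable", and the final height is
// logged relative to their own eye height. All names and values are placeholders.
public class ReachingUpTrial : MonoBehaviour
{
    public Transform shelf;                 // the surface being judged
    public float adjustSpeed = 0.25f;       // metres per second
    public float participantEyeHeight = 1.5f;

    void Update()
    {
        // Raise or lower the shelf with the vertical input axis.
        float delta = Input.GetAxis("Vertical") * adjustSpeed * Time.deltaTime;
        shelf.position += Vector3.up * delta;

        // Confirm the judgement and log the height as a ratio of eye height.
        if (Input.GetButtonDown("Submit"))
        {
            float ratio = shelf.position.y / participantEyeHeight;
            Debug.Log($"Judged maximum reach: {shelf.position.y:F2} m ({ratio:F2} x eye height)");
        }
    }
}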


Multisensory Integration in VR

Vision, Audition, & Action in Space & Time (VAAST) Lab
2021 - 2025
University of Utah. Salt Lake City, UT

Navigation is a cooperative task across all of our senses. This series of experiments focuses on how we integrate those senses. To accomplish this, I developed ways to impair vision within a VR headset and implemented methodologies to pull apart how we rely on each sense. We attack this problem in two ways: we can disrupt or manipulate one of the many senses used to wayfind to discover that sense's contribution, or we can measure the acuity of each sense by isolating the cues available to a particular modality. Below is an example trial of our audio and visual integration experiment.
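One building block for these studies was degrading vision inside the headset. Here is a minimal sketch, assuming Unity's built-in render pipeline and a hypothetical full-screen shader exposing an "_Intensity" property; the actual manipulation used in the experiments may differ.

using UnityEngine;

// Hypothetical sketch: impairs the headset view by blitting the camera image
// through a material (e.g., a blur or contrast-reduction shader). Assumes the
// built-in render pipeline; shader property names are placeholders.
[RequireComponent(typeof(Camera))]
public class VisionImpairment : MonoBehaviour
{
    public Material impairmentMaterial;            // full-screen effect material
    [Range(0f, 1f)] public float intensity = 0.5f;

    void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        if (impairmentMaterial == null)
        {
            Graphics.Blit(src, dst);               // pass through when no effect is assigned
            return;
        }
        impairmentMaterial.SetFloat("_Intensity", intensity);
        Graphics.Blit(src, dst, impairmentMaterial);
    }
}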


Augmented Cues for Navigation

Vision, Audition, & Action in Space & Time (VAAST) Lab
2021 - 2023
University of Utah. Salt Lake City, UT

Soldiers are required to navigate with a high degree of accuracy and with few useful cues to assist their wayfinding, all without disengaging from the primary task at hand. Here we simulated in VR the AR cues a soldier could use to wayfind, while tracking gaze to understand how subjects actually use these cues. We also measured how much attention these cues draw with a detection response task (DRT).
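The DRT itself is simple to sketch in Unity C#: a probe appears at random intervals and the latency of the participant's button press is logged. The names, input binding, and timing values below are placeholders, not the study's exact parameters.

using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of a detection response task (DRT): a probe appears at
// random intervals and the reaction time of a button press is recorded.
public class DetectionResponseTask : MonoBehaviour
{
    public GameObject probe;                                      // small light placed in the periphery
    public Vector2 interStimulusRange = new Vector2(3f, 5f);      // seconds between probes

    readonly List<float> reactionTimes = new List<float>();
    float nextOnset;
    float probeShownAt = -1f;

    void Start()
    {
        probe.SetActive(false);
        nextOnset = Time.time + Random.Range(interStimulusRange.x, interStimulusRange.y);
    }

    void Update()
    {
        // Show the probe when its scheduled onset arrives.
        if (probeShownAt < 0f && Time.time >= nextOnset)
        {
            probe.SetActive(true);
            probeShownAt = Time.time;
        }

        // Record reaction time on response (placeholder: controller button mapped to "Submit").
        if (probeShownAt >= 0f && Input.GetButtonDown("Submit"))
        {
            reactionTimes.Add(Time.time - probeShownAt);
            probe.SetActive(false);
            probeShownAt = -1f;
            nextOnset = Time.time + Random.Range(interStimulusRange.x, interStimulusRange.y);
        }
    }
}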


Near Space Perception in AR & VR

Vision, Audition, & Action in Space & Time (VAAST) Lab
2022 - 2023
University of Utah. Salt Lake City, UT

This work was the capstone of a colleague's dissertation. Here we compared reaching across realities (VR, AR, and the real world), using the Leap Motion integrated into a Varjo XR-3 and its video-passthrough augmented reality to perform reaching judgements.
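In spirit, a reaching judgement compares a target's distance against the participant's calibrated arm length, with the tracked hand confirming actual reach. A hypothetical sketch, with plain Transforms standing in for the Leap Motion hand data:

using UnityEngine;

// Hypothetical sketch of a reachability judgement: the target's distance from
// the shoulder is compared to the participant's measured arm length. "handTip"
// would be driven by hand tracking; here it is just a Transform placeholder.
public class ReachJudgement : MonoBehaviour
{
    public Transform shoulder;       // calibrated shoulder position
    public Transform handTip;        // tracked fingertip
    public Transform target;         // object whose reachability is judged
    public float armLength = 0.7f;   // metres, measured during calibration

    // Ratio of target distance to arm length; values <= 1 are physically reachable.
    public float ReachRatio()
    {
        return Vector3.Distance(shoulder.position, target.position) / armLength;
    }

    // Actual reach performance: did the fingertip get within a small tolerance of the target?
    public bool TouchedTarget(float tolerance = 0.03f)
    {
        return Vector3.Distance(handTip.position, target.position) <= tolerance;
    }
}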


Perceptual Biases in VR

Vision, Audition, & Action in Space & Time (VAAST) Lab
2023 - 2025
University of Utah. Salt Lake City, UT

This precursor to my dissertation is a proof of concept for measuring the perceived distortion induced in VR headset users by a lens's prism effect.
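One way a prism-like displacement can be approximated in Unity is by skewing a single eye's projection matrix. This is a hedged sketch rather than the study's actual implementation; the shift magnitude is a placeholder, and depending on the XR plugin the override may need to be reapplied each frame.

using UnityEngine;

// Hypothetical sketch: induces a lateral image shift in the left eye by
// offsetting that eye's projection matrix, approximating a prism displacement.
[RequireComponent(typeof(Camera))]
public class PrismShift : MonoBehaviour
{
    public float horizontalShift = 0.05f;   // shift in clip-space units (placeholder)

    void Start()
    {
        var cam = GetComponent<Camera>();
        Matrix4x4 proj = cam.GetStereoProjectionMatrix(Camera.StereoscopicEye.Left);

        // m02 is the horizontal off-center term; nudging it skews the frustum sideways.
        proj.m02 += horizontalShift;

        // Some XR plugins rebuild the matrices each frame, so this may need to
        // be reapplied in Update or OnPreCull instead of once in Start.
        cam.SetStereoProjectionMatrix(Camera.StereoscopicEye.Left, proj);
    }
}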


Nagual: A VR Game

VPSC Lab
Class Project
2020-2021
University of Utah. Salt Lake City, UT
Website

This VR game was completed as a group project at the University of Utah. I led VR development with an emphasis on a shape-shifting mechanic, similar to a world-in-miniature approach to locomotion, in which the user shrinks in size while maintaining an IPD relative to their original height. This lets the user access parts of puzzles that would otherwise be unreachable.
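A minimal sketch of one way a shrink like this can work in Unity: uniformly scaling the XR rig's root scales the user's apparent height and effective IPD together. The exact IPD handling in Nagual may have differed; names and values here are placeholders.

using UnityEngine;

// Hypothetical sketch of the shrink mechanic: scaling the rig root keeps the
// player's proportions consistent with their original height while the world
// appears to grow around them.
public class ShapeShift : MonoBehaviour
{
    public Transform rigRoot;          // parent of the XR camera and controllers
    public float shrunkScale = 0.1f;   // 10% of normal size (placeholder value)
    public float lerpSpeed = 2f;

    float targetScale = 1f;

    // Call from the game's shape-shift input or trigger volume.
    public void ToggleShrink()
    {
        targetScale = Mathf.Approximately(targetScale, 1f) ? shrunkScale : 1f;
    }

    void Update()
    {
        // Smoothly interpolate toward the target scale each frame.
        float s = Mathf.Lerp(rigRoot.localScale.x, targetScale, Time.deltaTime * lerpSpeed);
        rigRoot.localScale = Vector3.one * s;
    }
}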


Slap with Leap Motion

High Fidelity Virtual Environments (HI5) Lab
2019
University of Mississippi. Oxford, MS
Download

Slap is a VR experience created for the Imagine the Possibilities Career Expo to expose young students to how VR can be used to teach abstract concepts, such as atomic structure, in a fun and stimulating way. The demo is designed for a user seated at a table, so they have tactile feedback when interacting with the environment. The project requires SteamVR, Unity3D, and a Leap Motion.


Spiderman

High Fidelity Virtual Environments (HI5) Lab
 2018-2020
University of Mississippi. Oxford, MS
Download

Spiderman was the Hi5 Lab's most popular demo for the several years I was a part of it. It debuted at the C-Spire MVMT Technology Experience and has been in main rotation at similar events such as the eSports Eggbowl. Spiderman is a web-slinging adventure through flying islands with a fully tracked body animated using inverse kinematics.
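A minimal sketch of a web-sling mechanic in Unity, using a raycast from the hand and a SpringJoint anchored at the hit point. This is an illustrative approximation, not the demo's actual code; input names and spring values are placeholders.

using UnityEngine;

// Hypothetical web-sling: on button press, raycast from the hand, attach a
// SpringJoint to the hit point, and release it when the button is let go.
[RequireComponent(typeof(Rigidbody))]
public class WebSling : MonoBehaviour
{
    public Transform hand;              // tracked controller transform
    public float maxRange = 50f;

    SpringJoint web;

    void Update()
    {
        if (Input.GetButtonDown("Fire1") && web == null)
        {
            RaycastHit hit;
            if (Physics.Raycast(hand.position, hand.forward, out hit, maxRange))
            {
                web = gameObject.AddComponent<SpringJoint>();
                web.autoConfigureConnectedAnchor = false;
                web.connectedAnchor = hit.point;          // anchor the web to the island
                web.maxDistance = hit.distance * 0.8f;    // let the player swing inward
                web.spring = 20f;
                web.damper = 5f;
            }
        }

        if (Input.GetButtonUp("Fire1") && web != null)
        {
            Destroy(web);                                 // cut the web
        }
    }
}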


Change Blindness

High Fidelity Virtual Environments (HI5) Lab
 2019-2020
University of Mississippi. Oxford, MS
Download

Change Blindness is a senior capstone project for the Hi5 Lab. It takes heavy influence from Dr. Evan Suma Rosenberg, now a professor at the University of Minnesota running his own Illusioneering Lab. The project aims to fool the user into navigating a very large virtual space within a confined physical space. It has several modes, which make the change blindness easier to showcase since the changes can be difficult to discern while in an HMD: a top-down version (seen below), a VR version (also showcased below), and an FPS-style mode with WASD movement and mouse look.
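The core trick can be sketched in a few lines of Unity C#: geometry is only rearranged while it is completely outside the headset's view frustum, so the user never sees the change happen. The references and single-change logic below are placeholders, not the project's actual code.

using UnityEngine;

// Hypothetical sketch of the change-blindness trick: a wall (or doorway) is
// moved only while it is entirely outside the camera's view frustum.
public class ChangeWhenUnseen : MonoBehaviour
{
    public Camera headsetCamera;
    public Renderer wallToMove;          // the piece of geometry that gets rearranged
    public Transform alternatePose;      // where the wall should jump to

    bool changed;

    void Update()
    {
        if (changed) return;

        // Test the wall's bounds against the camera frustum.
        Plane[] frustum = GeometryUtility.CalculateFrustumPlanes(headsetCamera);
        bool visible = GeometryUtility.TestPlanesAABB(frustum, wallToMove.bounds);

        if (!visible)
        {
            // Safe to rearrange the space while the user is looking away.
            wallToMove.transform.SetPositionAndRotation(
                alternatePose.position, alternatePose.rotation);
            changed = true;
        }
    }
}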


FPS Game

High Fidelity Virtual Environments (HI5) Lab
2019 - 2020
University of Mississippi. Oxford, MS
Download

With help from Andrew Jelson, I designed and developed a simple Quake-inspired FPS game to showcase Unity3D abilities in a short amount of time.


Dioptic Playground

High Fidelity Virtual Environments (HI5) Lab
2017 - 2018
University of Mississippi. Oxford, MS
Download

The Dioptic Playground is a simple project that can be used as a framework for modifying either eye of an HMD in Unity. This was more difficult than it sounds, since most HMDs don't want weird and possibly distorting things happening to a user's vision. The project is a brief showcase of what can be achieved, as well as a jumping-off point for those interested in such studies.
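A hedged sketch of the kind of setup such a framework enables: two overlapping cameras, each restricted to a single eye, so an image effect or tint can be attached to just one of them. Camera references are placeholders, and the real project's approach may differ.

using UnityEngine;

// Hypothetical per-eye rig: both cameras render the full scene, but each one
// only outputs to its assigned eye, allowing per-eye manipulations.
public class PerEyeRig : MonoBehaviour
{
    public Camera leftEyeCamera;
    public Camera rightEyeCamera;

    void Start()
    {
        leftEyeCamera.stereoTargetEye = StereoTargetEyeMask.Left;
        rightEyeCamera.stereoTargetEye = StereoTargetEyeMask.Right;
        // An effect component (e.g., a blur or color filter) added to only one
        // of these cameras will then affect only that eye.
    }
}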


Personality Around the Globe

Class Final Project with Jordan Pyper & Jackson Leach
 2019 - 2020
University of Utah. Salt Lake City, UT
See the Site Here

This project explores data collected across the globe from a psychometric survey totaling over 1 million participants!

Our goal was to take this raw data and transform it into a visualization that is digestible for the public. One result is that many countries are strikingly similar in personality on average; the change in personality across cultures was not as apparent as we thought it would be. That isn't a failure, however. Finding how similar we all are on the inside while being so diverse on the outside is a great result to have. Thanks to our backgrounds, we had great success wrangling data this large, presented effective visualizations of it in a clear way, and explored ways to present the data beyond a purely geographical view.