Kyle Marcroft

Taichikitty, LLC
Exploring Guiding User Motion Through Haptic Feedback

About Kyle Marcroft


Who Am I?

I am a seasoned SDET and QAE with over 15 years of industry experience.  After 5 years at Amazon, I decided it was time to reinvent myself as a VR creator.

How Am I Reinventing Myself?

By teaching myself Unity while creating portfolio projects to demonstrate innovation in the VR space.

Why Haptics?

I strongly believe that the sense of touch is underutilized in mainstream VR.  I chose a haptics focus for my first portfolio project in order to showcase something truly "new".

Vision


Haptic feedback has the potential to create a new interaction mechanic for VR, one that conveys information and aids in training muscle memory: the ability to "feel" the space you are interacting with.  Imagine knowing whether your hands are following the correct path while doing Taiji push-hands.  Imagine walking around a 3D data visualization showing averages over time and running your hand through the different segments to "feel" the level of variance in the underlying data sets.  Imagine teaching a player in a roleplaying game the arm gestures needed to cast magic spells.  Imagine what would be possible if a vocabulary of haptic feedback could be created to guide the motion of users' hands without needing auditory or visual cues.

I am working to explore these ideas in order to expand the possibilities for VR interaction design.

Download


Haptic Demo Download - a small demo of the basic concept.  Unzip it and run Haptic_Demo.exe from the destination folder.  Currently this is a very simple proof-of-concept demonstration, without much in the way of polish.  The intent is to start here and iterate on improvements.
Goal - Learn to follow a four-part path by feeling the haptic feedback while you are on the path.  The user interacts with one part of the path at a time.  When the end of the path is reached, a spray of hearts will appear.  (A sketch of how the feedback could be driven follows the controls list below.)
Controls
  - Use the right Touch controller to follow the path
  - The hand triggers cause four lights to appear, advancing from the beginning to the end of the current segment
  - The left thumbstick smoothly moves the user in the horizontal plane
  - The "A" and "X" buttons reset the experience (scene)
  - The "B" and "Y" buttons quit the experience

To Do List


Based on feedback on the demo above, here is a list of proposed enhancements:

* More clearly distinguish the graduated buzzing so that it is obvious what is "early" vs. what is "late" in the segment.  This will give the user a sense of one-dimensional direction (see the sketch after this list)

* Expand the size of the pipes to be more forgiving of hand location

* Introduce programmatic scaling so that users of different body sizes / arm lengths can be accommodated

* Add disjoint sequences to teach stroke order and position for Chinese characters
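
As a rough idea of how the first item could work, here is a small sketch of the proposed graduated buzzing, again assuming the OVRInput-based setup from the demo sketch above; the exact frequency and amplitude curves are assumptions to be tuned by feel.

using UnityEngine;

// Sketch of the proposed graduated buzzing: map progress along the current
// segment to vibration frequency and amplitude so that "early" feels clearly
// different from "late", giving a one-dimensional sense of direction.
public static class GradedHaptics
{
    // t is progress along the segment in [0,1]; 0 = start, 1 = end.
    public static void BuzzForProgress(float t)
    {
        float frequency = Mathf.Lerp(0.2f, 1.0f, t);   // low buzz early, sharp buzz late
        float amplitude = Mathf.Lerp(0.4f, 0.8f, t);   // slightly stronger toward the end
        OVRInput.SetControllerVibration(frequency, amplitude, OVRInput.Controller.RTouch);
    }
}

In the path-following sketch above, this call would replace the constant-strength vibration, passing in the t value computed from the hand's projection onto the segment.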

Meet the "Team"


Kyle Marcroft

Founder

Contact Me


Email

LinkedIn