A seasoned SDET and QAE with over 15 years of industry experience; after 5 years at Amazon I decided it was time to reinvent myself as a VR creator by teaching myself Unity while building portfolio projects that demonstrate innovation in the VR space.
I strongly believe that the sense of touch is underutilized in mainstream VR, so I chose a haptics focus for my first portfolio project in order to showcase something truly "new".
Haptic feedback has the potential to create a new interaction mechanic for VR, one that conveys information and aids in the training of muscle memory: the ability to "feel" the space you are interacting with. Imagine knowing whether your hands are following the correct path while doing Taiji push-hands. Imagine walking around a 3D data visualization showing averages over time and being able to run your hand through the different segments to "feel" the level of variance in the underlying data sets. Imagine teaching a player in a role-playing game the arm gestures needed to cast magic spells. Imagine what would be possible if a vocabulary of haptic feedback could be created to guide the motion of users' hands without needing auditory or visual cues.
I am working to explore these ideas in order to expand the possibilities for VR interaction design.
Based on feedback from the above demo, here is a list of proposed enhancements:
* More clearly distinguish the graduated buzzing so that it is obvious what is "early" vs. what is "late" in the segment. This will give the user a sense of one-dimensional direction
* Expand the size of the pipes to be more forgiving of hand placement
* Introduce programmatic scaling so that users with different body sizes and arm lengths can be accommodated
* Add support for disjoint sequences in order to teach stroke order and position for Chinese characters
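The first enhancement, graduated buzzing that distinguishes "early" from "late" within a segment, can be sketched as a simple mapping from hand position to vibration amplitude. The sketch below is illustrative Python, not the project's actual Unity C# code; the function names (`segment_progress`, `buzz_amplitude`) and the 0.0-1.0 amplitude range are assumptions chosen to mirror how most VR controller haptics APIs express intensity.

```python
def segment_progress(hand_pos, seg_start, seg_end):
    """Project the hand position onto a pipe segment and return
    normalized progress along it, clamped to [0, 1].
    Positions are (x, y, z) tuples."""
    seg = [e - s for s, e in zip(seg_start, seg_end)]
    rel = [h - s for s, h in zip(seg_start, hand_pos)]
    seg_len_sq = sum(c * c for c in seg)
    if seg_len_sq == 0.0:
        return 0.0  # degenerate segment: treat as "start"
    t = sum(a * b for a, b in zip(rel, seg)) / seg_len_sq
    return min(max(t, 0.0), 1.0)

def buzz_amplitude(progress, lo=0.2, hi=1.0):
    """Map progress (0.0 = start of segment, 1.0 = end) to a haptic
    amplitude that ramps linearly from lo to hi, so a stronger buzz
    always means "later" in the segment."""
    progress = min(max(progress, 0.0), 1.0)
    return lo + (hi - lo) * progress
```

With these defaults a hand halfway along the segment would buzz at amplitude 0.6, clearly between the 0.2 felt at the start and the 1.0 felt at the end; frequency could be ramped instead of (or in addition to) amplitude if the hardware distinguishes it better.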