Alternate User Interfaces
ongoing topic...
Hand Gesture Input with Tactile Feedback
There are circumstances and users that require unconventional input devices. Some examples:
Leap Motion, the hand-gesture input device maker, and Ultrahaptics, the leading supplier of ultrasonic haptic (tactile feedback) devices, have joined forces to form Ultraleap (ultraleap.com), a site promoting the combination of these two separate products to provide an enhanced user experience.
The ultrasonic emitter array used in haptic interfaces is similar to the "sound flashlight" recently demonstrated. Some demonstrations allow a user to "feel" simple geometric shapes.
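To make the focusing idea concrete: each transducer is driven with a phase offset proportional to its path length to the focal point, so every emission arrives in phase and builds a localized pressure spot the hand can feel. A minimal sketch of the arithmetic; the 8x8 grid, 10 mm pitch, and 40 kHz carrier are generic assumptions, not Ultrahaptics specifics.

```python
import numpy as np

# Hypothetical 8x8 grid of 40 kHz transducers at 10 mm pitch (typical
# parts for ultrasonic haptics; not the actual Ultrahaptics layout).
FREQ = 40_000.0          # carrier frequency, Hz
SPEED_OF_SOUND = 343.0   # m/s in air
PITCH = 0.010            # transducer spacing, m

xs = np.arange(8) * PITCH
ys = np.arange(8) * PITCH
gx, gy = np.meshgrid(xs, ys)
positions = np.stack([gx.ravel(), gy.ravel(), np.zeros(64)], axis=1)

def focus_phases(focal_point):
    """Per-transducer phase offsets (radians) that make all emissions
    arrive at focal_point in phase, creating a tactile pressure spot."""
    distances = np.linalg.norm(positions - focal_point, axis=1)
    wavelength = SPEED_OF_SOUND / FREQ        # ~8.6 mm at 40 kHz
    # Negative sign: farther elements fire earlier (phase advance).
    return (-2 * np.pi * distances / wavelength) % (2 * np.pi)

# Focus 20 cm above the center of the array.
phases = focus_phases(np.array([0.035, 0.035, 0.20]))
print(phases.reshape(8, 8).round(2))
```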
Leap Motion uses stereo cameras and IR lights to identify hand gestures. It's a processing-intensive and limited approach compared to the scattered-laser-emitter 3D imaging used in Kinect games and iPhone face recognition. I lost some enthusiasm for Leap Motion when they retooled their software for use with VR headsets instead of the desktop; the software, seemingly in perpetual beta, was also discouraging. Leap Motion is most commonly paired with real-time 3D game engines like Unity3D.
Simple Gesture Input Using a Standard Camera
I have recently seen TV commercials for a couple of Google products using a simple camera to discern user gestures: videos playing on Nest home devices could be paused with a "talk to the hand" spread-fingers gesture, and an Android phone was shown scrolling pages with the wave of a hand.
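Small-vocabulary camera gestures can be approximated with very little code. This toy OpenCV sketch (my own rough approach, not Google's) toggles a "paused" flag whenever it sees a burst of motion in front of the webcam, crudely mimicking the hand-wave pause gesture:

```python
import cv2

# Toy motion-gesture detector: a burst of frame-to-frame change in
# front of the camera toggles a "paused" flag, like a hand wave.
cap = cv2.VideoCapture(0)          # default webcam
prev = None
paused = False
cooldown = 0                       # frames to wait before re-triggering

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (21, 21), 0)
    if prev is not None and cooldown == 0:
        diff = cv2.absdiff(prev, gray)
        mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)[1]
        if cv2.countNonZero(mask) > 0.25 * diff.size:  # big motion
            paused = not paused
            cooldown = 30                              # ~1 s at 30 fps
    cooldown = max(0, cooldown - 1)
    prev = gray
    cv2.putText(frame, "PAUSED" if paused else "PLAYING", (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("gesture demo", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```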
Eye-tracking
Eye tracking is most commonly used by physically impaired computer users (quadriplegics), who need some training and practice to use the interface reliably. Eyegaze may be the best-established system for the handicapped. From Eyegaze: 5 Ways the Eyes Can Be Tracked with Eye Tracking Technology
(An aside: Jay Leno's Garage showed an Indy car racer, retired by a spinal-injury crash, using a special interface to drive a Corvette by turning his head and blowing into a straw. Arrow.com spent $1 million engineering the interface.)
PyGaze: an open-source toolbox for eye tracking in Python
It pairs with a $695 commercial development device that may work reliably enough for retail kiosk applications.
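For a feel of the library, the basic PyGaze flow looks roughly like this (adapted from its documentation; the tracker type and timing details depend on the attached hardware and installed backend, so treat it as a sketch):

```python
from pygaze.display import Display
from pygaze.eyetracker import EyeTracker
from pygaze import libtime

disp = Display()              # opens the experiment window
tracker = EyeTracker(disp)    # tracker type comes from PyGaze settings
tracker.calibrate()           # runs the tracker's calibration routine

tracker.start_recording()
t0 = libtime.get_time()
while libtime.get_time() - t0 < 5000:  # sample gaze for five seconds
    x, y = tracker.sample()            # current gaze position in pixels
    print(x, y)
tracker.stop_recording()

tracker.close()
disp.close()
```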
The PyGaze docs do reference older projects using a single camera and OpenCV; the processed images reduce the pupils to tiny specks, so one would guess the results to be hit and miss.
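Those single-camera approaches mostly boil down to finding the darkest round blob in the frame, which is why they are so lighting-sensitive. A rough illustration (the threshold value and the eye.jpg input are placeholders):

```python
import cv2

# Crude single-camera pupil finder: threshold the darkest pixels, then
# take the largest roundish contour. Very sensitive to lighting, which
# is why results with ordinary webcams tend to be hit and miss.
gray = cv2.imread("eye.jpg", cv2.IMREAD_GRAYSCALE)  # close-up of an eye
gray = cv2.medianBlur(gray, 7)
_, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)  # guessed cutoff
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
if contours:
    c = max(contours, key=cv2.contourArea)
    (x, y), r = cv2.minEnclosingCircle(c)
    print(f"pupil guess: center=({x:.0f},{y:.0f}) radius={r:.0f}px")
```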
"Smart Eyes" AI code for laboratory and automotive applications
code "Understands and predicts human intentions and actions"
2011 Oxford University Press book:
Eye Tracking: A Comprehensive Guide to Methods and Measures
Google Play preview of the first 2 of 10 chapters:
Vending Machines
Due to an aging population and a shortage of workers, Japan has developed new applications of robotics and automation; there is one vending machine for every 23 people in Japan. In 2008 a cigarette vending machine used facial recognition to estimate the age of a customer.
In 2010 a vending machine would recommend products based on the age and gender of a customer (with a recognition probability of more than 70%). A 300% increase in sales was reported (perhaps a short-lived novelty boost).
There have been pilot-program vending machines (2014) with a "Store Manager" that would keep regular customers informed via their cell phones as well as greet them as they passed on the street.
A 2017 article covers the future of vending machines using facial recognition and machine learning.
2019 brought announcements of biometrics-equipped machines, including the Facewatch system, which would apparently identify criminals in the UK.
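The common first step in all of these machines is plain face detection; the demographic or identity model then runs on the cropped face. A sketch of that first step using OpenCV's stock Haar cascade; the classify_age_gender() hook is a hypothetical placeholder for whatever model a vendor ships.

```python
import cv2

# Step 1 of a demographic vending machine: find the customer's face.
# The stock Haar cascade ships with OpenCV; no extra downloads needed.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        face = frame[y:y + h, x:x + w]
        # classify_age_gender(face) would go here -- a hypothetical hook
        # for whatever age/gender model the machine's vendor provides.
        print(f"face at ({x},{y}), {w}x{h}px -> send to classifier")
```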
Vending Machines in Big Brother Land
An image of a woman in China making a vending machine purchase, paying with facial recognition, was posted on the TikTok video-sharing app. This is perhaps a benefit of the database used by the Chinese citizen-conduct monitoring system, which uses facial recognition to track people with security cameras and assign them a citizenship grade (one innocuous example application posts "shaming" images of jaywalkers at intersections).
There are a number of Arduino-compatible interface devices (more detail later):
The PAJ7620U2 chip has a built-in camera, IR light, and image processor behind an I2C interface. It identifies hand-gesture swipes (left, right, up, down, in, out) and CW/CCW circles over a limited range, and requires a bit of user training time to get reliable results. (Not versatile enough for untrained kiosk interfaces.)
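On a Raspberry Pi the same chip can be polled over I2C without an Arduino. A sketch using smbus2; the address (0x73) and gesture-flag register (0x43) come from the vendor's Arduino library, so verify them against the datasheet, and note that the chip's lengthy power-up register initialization is omitted here.

```python
from smbus2 import SMBus
import time

PAJ7620_ADDR = 0x73   # I2C address from the vendor's Arduino library
GES_FLAG_REG = 0x43   # bank-0 gesture result register (per that library)
# Flag bits as defined in the vendor's Arduino library:
GESTURES = {0x01: "right", 0x02: "left", 0x04: "up", 0x08: "down",
            0x10: "forward", 0x20: "backward",
            0x40: "clockwise", 0x80: "counter-clockwise"}

with SMBus(1) as bus:  # I2C bus 1 on most Raspberry Pi models
    bus.write_byte_data(PAJ7620_ADDR, 0xEF, 0x00)  # select register bank 0
    # NOTE: the chip's full initialization sequence (dozens of register
    # writes from the datasheet) must be run once first; omitted here.
    while True:
        flags = bus.read_byte_data(PAJ7620_ADDR, GES_FLAG_REG)
        for bit, name in GESTURES.items():
            if flags & bit:
                print("gesture:", name)
        time.sleep(0.1)
```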
Z- and X-axis-only gesture sensor
3D Hall-effect sensors have been used in joysticks; historically, potentiometer wear has been a problem in this heavy-use application.
There are non-contact 3D tablets that use near-field 3D sensing (the Microchip MGC3130?) to determine hand location.
Skywriter HAT: this extra-small tablet fits on a Raspberry Pi.
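Pimoroni publishes a Python library for the Skywriter that exposes the MGC3130's events through decorators; per its documentation, usage looks roughly like this:

```python
import signal
import skywriter   # Pimoroni's library: pip install skywriter

@skywriter.move()
def on_move(x, y, z):
    # Continuous 3D hand position above the pad, each axis roughly 0..1.
    print(f"hand at x={x:.2f} y={y:.2f} z={z:.2f}")

@skywriter.flick()
def on_flick(start, finish):
    # Quick directional swipes reported as strings, e.g. "west" -> "east".
    print(f"flick {start} -> {finish}")

signal.pause()   # keep the script alive waiting for events
```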