2018 - Fellow of the Alfred P. Sloan Foundation
Chris Harrison focuses on Human–computer interaction, Mobile device, Gesture, Computer vision and Artificial intelligence. In his research on the topic of Human–computer interaction, Tilt is strongly related to Wearable computer. His Mobile device research is multidisciplinary, incorporating perspectives in Object, Tabletop computing, Point and Knuckle.
His Gesture study integrates concerns from other disciplines, such as Computer hardware, Surface and Simulation. His work deals with themes such as Moving parts and Interactive displays, which intersect with Computer vision. He has included themes like Stylus and Context menu in his Artificial intelligence study.
His primary scientific interests are in Human–computer interaction, Artificial intelligence, Computer vision, Mobile device and Multimedia. His Human–computer interaction research incorporates elements of Interactivity and Gaze. His Artificial intelligence research is multidisciplinary, incorporating elements of Signal, Surface, Knuckle and Touchpad.
In his research, Finger tracking is intimately related to Smartwatch, which falls under the overarching field of Computer vision. His study on Mobile device also encompasses disciplines like
His scientific interests lie mostly in Human–computer interaction, Computer vision, Artificial intelligence, Smart environment and Activity recognition. His Human–computer interaction study combines topics in areas such as Natural and Gaze. In the subject of general Computer vision, his work in Gesture is often linked to Second finger, thereby combining diverse domains of study.
His work carried out in the field of Gesture brings together such families of science as Interactivity and Smartwatch. Chris Harrison has researched Artificial intelligence in several fields, including Pipeline and Doppler radar. His research investigates the connection between Wearable computer and topics such as Tracking that intersect with problems in Computer graphics.
Chris Harrison mainly focuses on Human–computer interaction, Smart environment, Interactivity, Smartwatch and Activity recognition. As part of his studies on Human–computer interaction, Chris Harrison frequently links adjacent subjects like Feedback loop. His Interactivity research is multidisciplinary, incorporating perspectives in Pose and Multi-touch.
His Smartwatch studies deal with areas such as Finger tracking, Projection, Visibility and Graphics. He has included themes like Leverage and Sound in his Activity recognition study. His study explores the link between Gesture and topics such as Face that intersect with problems in Wearable computer.
This overview was generated by a machine learning system which analysed the scientist's body of work.
TeslaTouch: electrovibration for touch surfaces
Olivier Bau;Ivan Poupyrev;Ali Israr;Chris Harrison.
user interface software and technology (2010)
Skinput: appropriating the body as an input surface
Chris Harrison;Desney Tan;Dan Morris.
human factors in computing systems (2010)
OmniTouch: wearable multitouch interaction everywhere
Chris Harrison;Hrvoje Benko;Andrew D. Wilson.
user interface software and technology (2011)
Touché: enhancing touch interaction on humans, screens, liquids, and everyday objects
Munehiko Sato;Ivan Poupyrev;Chris Harrison.
human factors in computing systems (2012)
Abracadabra: wireless, high-precision, and unpowered finger input for very small mobile devices
Chris Harrison;Scott E. Hudson.
user interface software and technology (2009)
TapSense: enhancing finger interaction on touch surfaces
Chris Harrison;Julia Schwarz;Scott E. Hudson.
user interface software and technology (2011)
Providing dynamically changeable physical buttons on a visual display
Chris Harrison;Scott E. Hudson.
human factors in computing systems (2009)
ZoomBoard: a diminutive qwerty soft keyboard using iterative zooming for ultra-small devices
Stephen Oney;Chris Harrison;Amy Ogan;Jason Wiese.
human factors in computing systems (2013)
Scratch input: creating large, inexpensive, unpowered and mobile finger input surfaces
Chris Harrison;Scott E. Hudson.
user interface software and technology (2008)
Tomo: Wearable, Low-Cost Electrical Impedance Tomography for Hand Gesture Recognition
Yang Zhang;Chris Harrison.
user interface software and technology (2015)
Profile was last updated on December 6th, 2021.
Research.com Ranking is based on data retrieved from the Microsoft Academic Graph (MAG).
The ranking h-index is inferred from publications deemed to belong to the considered discipline.
Carnegie Mellon University
Google (United States)
University of Washington
Carnegie Mellon University
University of Washington
Misapplied Sciences, Inc.
University of Michigan–Ann Arbor
Microsoft (United States)
Carnegie Mellon University
Microsoft (United States)