Acoustico UIST2020 Final
https://www.engineeringtoolbox.com/sound-speed-solids-d_713.html
and wave propagation velocities, making it hard for the model to generalize across different surfaces. Thus, we chose to evaluate the tap localization accuracy within each test condition using the root-mean-square error (RMSE), measured from the center of each target. We calculated the leave-one-session-out accuracy for each participant under each test condition by training a model on the data from three sessions and testing it on the remaining session. The average RMSE for each participant in each test condition was computed by averaging all 4 possible combinations of training and test data. The overall accuracy was then averaged over the RMSEs from all participants.

Overall, the average RMSEs in the X axis and Y axis across all eight tested conditions were 7.57mm (s.e. = 0.20mm) and 4.62mm (s.e. = 0.11mm) respectively. In particular, if we removed the sofa arm condition, the average RMSEs decreased to 7.08mm (s.e. = 0.19mm) and 4.36mm (s.e. = 0.11mm) in the X axis and Y axis (Figure 9 left).

Figure 11: Feature importance for tap localization among the four TDOA estimation methods (top), and the six sensor pairs (bottom).

With Less Training Data. We also analyzed the system accuracy with less training data. We again took the leave-one-session-out approach, but this time we used only the taps from the four corners in three sessions (i.e., 12 taps) to train the machine learning model, and tested the model on the remaining sessions (i.e., 18 taps). Overall, with less training data, the average RMSEs in the X axis and Y axis across all eight tested conditions increased to 9.64mm (s.e. = 0.18mm) and 5.79mm (s.e. = 0.08mm) respectively.

Tap methods and hand configurations. On the wood surface, for the X axis, we found a significant effect of hand configurations (F1,19 = 9.105, p < 0.01). However, no significant effect was found for tap methods (F1,19 = 2.734, p > 0.05). As for the Y axis, we found significant effects of both tap methods (F1,19 = 5.659, p < 0.05) and hand configurations (F1,19 = 7.645, p < 0.05). We found no significant interaction effect for tap methods × hand configurations in either the X axis or the Y axis (both p > 0.05).

The average RMSEs in the X axis and Y axis when tapping with the finger-pad were 7.06mm (s.e. = 0.35mm) and 4.31mm (s.e. = 0.23mm) respectively, while they dropped to 6.49mm (s.e. = 0.36mm) and 3.83mm (s.e. = 0.21mm) when tapping with the fingernail (Figure 10 left). Based on these results, using the fingernail did not help X-axis location interpolation, but it did help Y-axis location interpolation. One possible reason could be that tapping with the fingernail improved the TDOA estimation between the two different types of sensors (i.e., accelerometer and microphone), since the signals received by both types of sensors have clearer and stronger peaks.

Figure 12: Demo applications. (a) Dial pad, (b) Calculator, (c) Whack-a-mole game, (d) Acustico with an optical flow sensor, (e) User taps on the right side to "right click".

Optical Flow + Acustico for Surface-wide Localization: Acustico focuses on tap localization relative to the wristband. We combine Acustico with an optical flow sensor underneath to additionally track the forearm motion on the surface. This combination can be used to expand any of the applications demonstrated above to support tap localization over larger surface-wide AR interfaces. We further implement a mouse experience (Figure 12(e)) where the user simply moves his or her wrist to control the pointer position and taps on the left/right region with respect to the wristband to invoke a left/right click.
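The leave-one-session-out averaging described above can be sketched in a few lines. This is a minimal illustration, not the paper's actual pipeline: the `train` and `predict` callables stand in for the (unspecified) regression model, and the per-session data layout is an assumption.

```python
import math

def rmse(preds, targets):
    """Root-mean-square error measured from the target centers (one axis)."""
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds))

def leave_one_session_out(sessions, train, predict):
    """sessions: list of (taps, targets) pairs, one per recording session.
    Each tap is a feature vector; each target is an (x, y) center in mm.
    Each fold trains on three sessions and tests on the held-out fourth;
    the returned (x_rmse, y_rmse) averages all 4 train/test combinations."""
    folds = []
    for i, (test_taps, test_targets) in enumerate(sessions):
        train_taps = [t for j, s in enumerate(sessions) if j != i for t in s[0]]
        train_targets = [t for j, s in enumerate(sessions) if j != i for t in s[1]]
        model = train(train_taps, train_targets)
        preds = [predict(model, tap) for tap in test_taps]
        folds.append((
            rmse([p[0] for p in preds], [t[0] for t in test_targets]),
            rmse([p[1] for p in preds], [t[1] for t in test_targets]),
        ))
    n = len(folds)
    return (sum(f[0] for f in folds) / n, sum(f[1] for f in folds) / n)
```

The per-participant results from this routine would then simply be averaged again across all participants to obtain the overall accuracy.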
DISCUSSION AND LIMITATIONS
In this section, we discuss the insights gained from this work, along with its limitations and future work.

Active Acoustic Approach. As discussed in Related Work, the active acoustic approach also shows potential to localize a finger [35, 54]. Based on the existing work, we also initially investigated the idea of embedding both a transducer (e.g., speaker, actuator) and a receiver (e.g., microphone, accelerometer) on the wristband, and tried to localize the finger using the reflected signal. However, as also mentioned in FingerIO [35], we found that it was hard to completely isolate the transducer and receiver within a small wristband form factor. The received signal would be largely masked by the emitted signal from the transducer, making it hard to separate and extract the reflected signal. One possible way to solve the masking problem is to increase the frequency of the emitted signal. To test this, we built a prototype using an ultrasound emitter. However, this presents another problem: signal attenuation. High-frequency signals attenuate very quickly, so the received signals do not have enough SNR to make any reasonable inferences. The set-up therefore needed more power and coupling liquids, which makes the approach untenable for practical use.

Tap Detection and Localization in Real-World Settings. For tap detection, we did not evaluate the system when the hand is not on a surface but vibration and sound signals are still detected (e.g., the user's hand hits an object or the user is walking). This is because we assume the system is aware of the user's hand being placed on a surface, which could be easily achieved using a proximity sensor. However, our threshold-based method still needs to be tested more intensively in the field. Another research direction would be exploring the effect of environmental noises on tap localization. While we envision Acustico to be most useful in indoor scenarios where noise is limited, we are interested in quantifying the lowest SNR at which the system still works. Furthermore, it is also interesting to look into how surface properties (e.g., thickness/flatness) would affect system performance. We leave these for future work.

Ecological Validity. Acustico focuses on tap detection and localization in the situation where the user's wrist is placed on a surface. Although we did not receive any negative feedback from participants, we understand that this requirement might introduce fatigue and become uncomfortable after long-term use. Future research should investigate the ecological validity of this approach.

Re-calibration/Retraining for New Surfaces. Since surfaces made of different materials have different properties and wave propagation velocities, our system needs to be re-calibrated and retrained for any new surface that it has not seen before. However, we show in the evaluation that we can make the training set as small as 12 taps and still achieve a reasonable localization accuracy (X: 9.64mm error, Y: 5.79mm error). Another possible solution is to train on the surfaces around the user beforehand and load the specific model when the user needs to interact on that surface.

Practicality. The system we presented is an early-stage proof-of-concept research prototype. Although it is in a constrained wristband form factor similar to current wrist wearables on the market, more work is still needed to incorporate these sensors into a flexible and stretchable wristband. One direction to pursue in this regard is to place only the accelerometers under the wristband in direct contact with the surface and place the microphones on top (in the watch face), since they rely on in-air propagation. This may introduce certain inconsistencies in the sensor distances, which could impact accuracy. Moreover, to capture the small TDOAs, we sampled the data at 1MHz, which is not supported by most current wrist wearable devices due to their limited computational resources and batteries. But we believe this is not impossible as technology advances (e.g., the ADS8330 from TI can sample at 1MHz while consuming only 21mW). Plenty of engineering effort would still be necessary to fully embed this technique into commercial wrist wearable devices.

System Evaluation. We evaluated the system using a region-based approach since Acustico mainly focuses on discrete tap detection and localization. We plan to investigate how to use this technique to facilitate continuous position tracking or gesture-based sensing in our future work. To ensure the studies could be completed in 90 minutes, participants were asked to perform taps sequentially in each session, which might reduce the wrist movement between taps. Evaluation with random tap locations might need to be included in the future.

Finger-up Detection. The mouse demo enables clicking on targets. However, Acustico only detects the finger-down event, which produces the acoustic waves, but not the finger-up event, since it does not produce any acoustic waves. The detection of this release event would enable a dragging state in the mouse [5] and is an interesting challenge for wrist-based sensing.

CONCLUSION
This paper presents a passive acoustic sensing approach for wrist-worn devices to detect and localize taps. We discuss the sensing principle and our investigation of different sensor configurations. We built a wristband prototype with four acoustic sensors, including two accelerometers and two microphones. Through a 20-participant study, we demonstrate that our system can reliably detect taps with an F1-score of 0.9987 across different environmental noises and yield high localization accuracy with root-mean-square errors of 7.6mm (X-axis) and 4.6mm (Y-axis) across different surfaces and tapping techniques. Our work presents a novel sensing methodology for always-available input on any unmodified surface. We believe it holds the potential to further enrich the input expressiveness of today's computing devices (e.g., wearables and AR devices).
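As a side note to the Practicality discussion above: the TDOAs in question are the relative lags between sensor channels, so at a 1MHz sampling rate one sample corresponds to 1µs of delay. The sketch below is a minimal, hypothetical illustration of lag estimation by brute-force cross-correlation; it is not one of the four TDOA estimation methods the paper actually compares.

```python
def tdoa_samples(a, b, max_lag):
    """Estimate the lag (in samples) of channel b relative to channel a
    by brute-force cross-correlation over lags in [-max_lag, max_lag].
    Dividing the returned lag by the sampling rate gives the TDOA."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        # Correlate a against b shifted by `lag`, clipping to valid indices.
        score = sum(a[i] * b[i + lag]
                    for i in range(len(a))
                    if 0 <= i + lag < len(b))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag
```

For example, if channel b receives the same transient 4 samples after channel a, the estimated lag is 4, i.e., a 4µs TDOA at 1MHz.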
REFERENCES
1. Hideki Koike, Yoichi Sato and Yoshinori Kobayashi. 2001. Integrating paper and digital information on EnhancedDesk: a method for realtime finger tracking on an augmented desk system. ACM Trans. Comput.-Hum. Interact., 8 (4). 307–322. DOI=https://doi.org/10.1145/504704.504706
2. Ankur Agarwal, Shahram Izadi, Manmohan Chandraker and Andrew Blake. 2007. High Precision Multi-touch Sensing on Surfaces using Overhead Cameras. In Second Annual IEEE International Workshop on Horizontal Interactive Human-Computer Systems (TABLETOP'07), 197-200. DOI=https://doi.org/10.1109/TABLETOP.2007.29
3. Michael Boyle and Saul Greenberg. 2005. The language of privacy: Learning from video media space analysis and design. ACM Trans. Comput.-Hum. Interact., 12 (2). 328-370. DOI=https://doi.org/10.1145/1067860.1067868
4. Alex Butler, Shahram Izadi and Steve Hodges. 2008. SideSight: multi-"touch" interaction around small devices. In Proceedings of the 21st annual ACM symposium on User interface software and technology (UIST'08), 201-204. DOI=https://doi.org/10.1145/1449715.1449746
5. William Buxton. 1990. A three-state model of graphical input. In Proceedings of the IFIP TC13 Third International Conference on Human-Computer Interaction (INTERACT'90), 449–456.
6. Liwei Chan, Yi-Ling Chen, Chi-Hao Hsieh, Rong-Hao Liang and Bing-Yu Chen. 2015. CyclopsRing: Enabling Whole-Hand and Context-Aware Interactions Through a Fisheye Ring. In Proceedings of the 28th Annual ACM Symposium on User Interface Software and Technology (UIST'15), 549-556. DOI=https://doi.org/10.1145/2807442.2807450
7. Jae Sik Chang, Eun Yi Kim, KeeChul Jung and Hang Joon Kim. 2005. Real time hand tracking based on active contour model. In Proceedings of the 2005 international conference on Computational Science and Its Applications, 999. DOI=https://doi.org/10.1007/11424925_104
8. Ke-Yu Chen, Shwetak N. Patel and Sean Keller. 2016. Finexus: Tracking Precise Motions of Multiple Fingertips Using Magnetic Sensing. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI'16), 1504–1514. DOI=https://doi.org/10.1145/2858036.2858125
9. Artem Dementyev and Joseph A. Paradiso. 2014. WristFlex: low-power gesture input with wrist-worn pressure sensors. In Proceedings of the 27th annual ACM symposium on User interface software and technology (UIST'14), 161-166. DOI=https://doi.org/10.1145/2642918.2647396
10. Fabrice Devige and Jean-Pierre Nikolovski. 2003. Accurate Interactive Acoustic Plate. US Patent Application No. US2003/0066692 A1.
11. Rui Fukui, Masahiko Watanabe, Tomoaki Gyota, Masamichi Shimosaka and Tomomasa Sato. 2011. Hand shape classification with a wrist contour sensor: development of a prototype device. In Proceedings of the 13th international conference on Ubiquitous computing (Ubicomp'11), 311-314. DOI=https://doi.org/10.1145/2030112.2030154
12. Mayank Goel, Brendan Lee, Md. Tanvir Islam Aumi, Shwetak Patel, Gaetano Borriello, Stacie Hibino and Bo Begole. 2014. SurfaceLink: using inertial and acoustic sensing to enable multi-device interaction on a surface. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI'14), 1387–1396. DOI=https://doi.org/10.1145/2556288.2557120
13. Jun Gong, Xing-Dong Yang and Pourang Irani. 2016. WristWhirl: One-handed Continuous Smartwatch Input using Wrist Gestures. In Proceedings of the 29th Annual ACM Symposium on User Interface Software and Technology (UIST'16). DOI=https://doi.org/10.1145/2984511.2984563
14. Jun Gong, Yang Zhang, Xia Zhou and Xing-Dong Yang. 2017. Pyro: Thumb-Tip Gesture Recognition Using Pyroelectric Infrared Sensing. In Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology (UIST'17), 553-563. DOI=https://doi.org/10.1145/3126594.3126615
15. Yizheng Gu, Chun Yu, Zhipeng Li, Weiqi Li, Shuchang Xu, Xiaoying Wei and Yuanchun Shi. 2019. Accurate and Low-Latency Sensing of Touch Contact on Any Surface with Finger-Worn IMU Sensor. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST'19), 1059–1070. DOI=https://doi.org/10.1145/3332165.3347947
16. Jefferson Y. Han. 2005. Low-cost multi-touch sensing through frustrated total internal reflection. In Proceedings of the 18th annual ACM symposium on User interface software and technology (UIST'05), 115–118. DOI=https://doi.org/10.1145/1095034.1095054
17. Teng Han, Khalad Hasan, Keisuke Nakamura, Randy Gomez and Pourang Irani. 2017. SoundCraft: Enabling Spatial Interactions on Smartwatches using Hand Generated Acoustics. In Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology (UIST'17), 579–591. DOI=https://doi.org/10.1145/3126594.3126612
18. Chris Harrison. 2010. Appropriated Interaction Surfaces. Computer, 43 (6). 86-89. DOI=https://doi.org/10.1109/MC.2010.158
19. Chris Harrison, Hrvoje Benko and Andrew D. Wilson. 2011. OmniTouch: wearable multitouch interaction everywhere. In Proceedings of the 24th annual ACM symposium on User interface software and technology (UIST'11), 441-450. DOI=https://doi.org/10.1145/2047196.2047255
20. Chris Harrison and Scott E. Hudson. 2008. Scratch input: creating large, inexpensive, unpowered and mobile finger input surfaces. In Proceedings of the 21st annual ACM symposium on User interface software and technology (UIST'08), 205-208. DOI=https://doi.org/10.1145/1449715.1449747
21. Chris Harrison, Desney S. Tan and Dan Morris. 2010. Skinput: appropriating the body as an input surface. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI'10), 453-462. DOI=https://doi.org/10.1145/1753326.1753394
22. Scott E. Hudson and Ian Smith. 1996. Techniques for addressing fundamental privacy and disruption tradeoffs in awareness support systems. In Proceedings of the 1996 ACM conference on Computer supported cooperative work (CSCW'96), 248–257. DOI=https://doi.org/10.1145/240080.240295
23. Yasha Iravantchi, Yang Zhang, Evi Bernitsas, Mayank Goel and Chris Harrison. 2019. Interferi: Gesture Sensing using On-Body Acoustic Interferometry. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI'19), 276. DOI=https://doi.org/10.1145/3290605.3300506
24. Hiroshi Ishii, Craig Wisneski, Julian Orbanes, Ben Chun and Joe Paradiso. 1999. PingPongPlus: design of an athletic-tangible interface for computer-supported cooperative play. In Proceedings of the SIGCHI conference on Human Factors in Computing Systems (CHI'99), 394–401. DOI=https://doi.org/10.1145/302979.303115
25. Shaun K. Kane, Daniel Avrahami, Jacob O. Wobbrock, Beverly Harrison, Adam D. Rea, Matthai Philipose and Anthony LaMarca. 2009. Bonfire: a nomadic system for hybrid laptop-tabletop interaction. In Proceedings of the 22nd annual ACM symposium on User interface software and technology (UIST'09), 129–138. DOI=https://doi.org/10.1145/1622176.1622202
26. Wolf Kienzle and Ken Hinckley. 2014. LightRing: always-available 2D input on any surface. In Proceedings of the 27th annual ACM symposium on User interface software and technology (UIST'14), 157-160. DOI=https://doi.org/10.1145/2642918.2647376
27. A. H. F. Lam, W. J. Li, Liu Yunhui and Xi Ning. 2002. MIDS: micro input devices system using MEMS sensors. In IEEE/RSJ International Conference on Intelligent Robots and Systems, 1184-1189 vol.1182. DOI=https://doi.org/10.1109/IRDS.2002.1043893
28. Gierad Laput, Robert Xiao and Chris Harrison. 2016. ViBand: High-Fidelity Bio-Acoustic Sensing Using Commodity Smartwatch Accelerometers. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology (UIST'16), 321-333. DOI=https://doi.org/10.1145/2984511.2984582
29. SK Lee, William Buxton and K. C. Smith. 1985. A multi-touch three dimensional touch-sensitive tablet. SIGCHI Bull., 16 (4). 21–25. DOI=https://doi.org/10.1145/1165385.317461
30. Julien Letessier and François Bérard. 2004. Visual tracking of bare fingers for interactive surfaces. In Proceedings of the 17th annual ACM symposium on User interface software and technology (UIST'04), 119–122. DOI=https://doi.org/10.1145/1029632.1029652
31. Jaime Lien, Nicholas Gillian, M. Emre Karagozler, Patrick Amihood, Carsten Schwesig, Erik Olson, Hakim Raja and Ivan Poupyrev. 2016. Soli: Ubiquitous Gesture Sensing with Millimeter Wave Radar. In ACM Trans. Graph (SIGGRAPH'16), 10. DOI=https://doi.org/10.1145/2897824.2925953
32. Nobuyuki Matsushita and Jun Rekimoto. 1997. HoloWall: designing a finger, hand, body, and object sensitive wall. In Proceedings of the 10th annual ACM symposium on User interface software and technology (UIST'97), 209–210. DOI=https://doi.org/10.1145/263407.263549
33. Jess McIntosh, Asier Marzo and Mike Fraser. 2017. SensIR: Detecting Hand Gestures with a Wearable Bracelet using Infrared Transmission and Reflection. In Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology (UIST'17), 593-597. DOI=https://doi.org/10.1145/3126594.3126604
34. Dan Morris, T. Scott Saponas and Desney Tan. 2011. Emerging Input Technologies for Always-Available Mobile Interaction. Found. Trends Hum.-Comput. Interact., 4 (4). 245–316. DOI=https://doi.org/10.1561/1100000023
35. Rajalakshmi Nandakumar, Vikram Iyer, Desney Tan and Shyamnath Gollakota. 2016. FingerIO: Using Active Sonar for Fine-Grained Finger Tracking. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI'16), 1515-1525. DOI=https://doi.org/10.1145/2858036.2858580
36. Shijia Pan, Ceferino G. Ramirez, Mostafa Mirshekari, Jonathon Fagert, Albert J. Chung, Chih C. Hu, John P. Shen, Hae Y. Noh and Pei Zhang. 2017. SurfaceVibe: Vibration-Based Tap and Swipe Tracking on Ubiquitous Surfaces. In 16th ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN'17), 197-208.
37. J. A. Paradiso, Leo Che King, N. Checka and Hsiao Kaijen. 2002. Passive acoustic sensing for tracking knocks atop large interactive displays. In SENSORS, 2002 IEEE, 521-527 vol.521. DOI=https://doi.org/10.1109/ICSENS.2002.1037150
38. Jun Rekimoto. 2001. GestureWrist and GesturePad: Unobtrusive Wearable Interaction Devices. In Proceedings of the 5th IEEE International Symposium on Wearable Computers (ISWC'01), 21. DOI=https://doi.org/10.1109/ISWC.2001.962092
39. Elliot N. Saba, Eric C. Larson and Shwetak N. Patel. 2012. Dante vision: In-air and touch gesture sensing for natural surface interaction with combined depth and thermal cameras. In 2012 IEEE International Conference on Emerging Signal Processing Applications, 167-170. DOI=https://doi.org/10.1109/ESPA.2012.6152472
40. T. Scott Saponas, Desney S. Tan, Dan Morris, Ravin Balakrishnan, Jim Turner and James A. Landay. 2009. Enabling always-available input with muscle-computer interfaces. In Proceedings of the 22nd annual ACM symposium on User interface software and technology (UIST'09), 167-176. DOI=https://doi.org/10.1145/1622176.1622208
41. Katie A. Siek, Yvonne Rogers and Kay H. Connelly. 2005. Fat finger worries: how older and younger users physically interact with PDAs. In Proceedings of the 2005 IFIP TC13 international conference on Human-Computer Interaction, 267-280. DOI=https://doi.org/10.1007/11555261_24
42. Hoang Truong, Shuo Zhang, Ufuk Muncuk, Phuc Nguyen, Nam Bui, Anh Nguyen, Qin Lv, Kaushik Chowdhury, Thang Dinh and Tam Vu. 2018. CapBand: Battery-free Successive Capacitance Sensing Wristband for Hand Gesture Recognition. In Proceedings of the 16th ACM Conference on Embedded Networked Sensor Systems (SenSys'18), 54–67. DOI=https://doi.org/10.1145/3274783.3274854
43. Dong Wei, Steven Z. Zhou and Du Xie. 2010. MTMR: A conceptual interior design framework integrating Mixed Reality with the Multi-Touch tabletop interface. In 2010 IEEE International Symposium on Mixed and Augmented Reality, 279-280. DOI=https://doi.org/10.1109/ISMAR.2010.5643606
44. Hongyi Wen, Julian Ramos Rojas and Anind K. Dey. 2016. Serendipity: Finger Gesture Recognition using an Off-the-Shelf Smartwatch. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI'16), 3847–3851. DOI=https://doi.org/10.1145/2858036.2858466
45. Andrew D. Wilson. 2004. TouchLight: an imaging touch screen and display for gesture-based interaction. In Proceedings of the 6th international conference on Multimodal interfaces (ICMI'04), 69–76. DOI=https://doi.org/10.1145/1027933.1027946
46. George S. K. Wong. 1986. Speed of sound in standard air. The Journal of the Acoustical Society of America, 79 (5). 1359-1366. DOI=https://doi.org/10.1121/1.393664
47. Robert Xiao, Julia Schwarz, Nick Throm, Andrew D. Wilson and Hrvoje Benko. 2018. MRTouch: Adding Touch Input to Head-Mounted Mixed Reality. IEEE Transactions on Visualization and Computer Graphics, 24 (4). 1653-1660. DOI=https://doi.org/10.1109/TVCG.2018.2794222
48. Robert Xiao, Greg Lew, James Marsanico, Divya Hariharan, Scott Hudson and Chris Harrison. 2014. Toffee: enabling ad hoc, around-device interaction with acoustic time-of-arrival correlation. In Proceedings of the 16th international conference on Human-computer interaction with mobile devices and services (MobileHCI'14), 67-76. DOI=https://doi.org/10.1145/2628363.2628383
49. Ming Yang. 2011. In-Solid Acoustic Source Localization Using Likelihood Mapping Algorithm. Open Journal of Acoustics, 1. 34-40. DOI=https://doi.org/10.4236/oja.2011.12005
50. Xing-Dong Yang, Tovi Grossman, Daniel Wigdor and George Fitzmaurice. 2012. Magic Finger: Always-Available Input through Finger Instrumentation. In Proceedings of the 25th annual ACM symposium on User interface software and technology (UIST'12), 147-156. DOI=https://doi.org/10.1145/2380116.2380137
51. Sang Ho Yoon, Ke Huo, Yunbo Zhang, Guiming Chen, Luis Paredes, Subramanian Chidambaram and Karthik Ramani. 2017. iSoft: A Customizable Soft Sensor with Real-time Continuous Contact and Stretching Sensing. In Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology (UIST'17), 665-678. DOI=https://doi.org/10.1145/3126594.3126654
52. Cheng Zhang, AbdelKareem Bedri, Gabriel Reyes, Bailey Bercik, Omer T. Inan, Thad E. Starner and Gregory D. Abowd. 2016. TapSkin: Recognizing On-Skin Input for Smartwatches. In Proceedings of the 2016 ACM International Conference on Interactive Surfaces and Spaces, 13–22. DOI=https://doi.org/10.1145/2992154.2992187
53. Cheng Zhang, Anandghan Waghmare, Pranav Kundra, Yiming Pu, Scott Gilliland, Thomas Ploetz, Thad E. Starner, Omer T. Inan and Gregory D. Abowd. 2017. FingerSound: Recognizing unistroke thumb gestures using a ring. In Proc. ACM Interact. Mob. Wearable Ubiquitous Technol., 1 (3). Article 120. DOI=https://doi.org/10.1145/3130985
54. Cheng Zhang, Qiuyue Xue, Anandghan Waghmare, Sumeet Jain, Yiming Pu, Sinan Hersek, Kent Lyons, Kenneth A. Cunefare, Omer T. Inan and Gregory D. Abowd. 2017. SoundTrak: Continuous 3D Tracking of a Finger Using Active Acoustics. In Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. DOI=https://doi.org/10.1145/3090095
55. Yang Zhang and Chris Harrison. 2018. Pulp Nonfiction: Low-Cost Touch Tracking for Paper. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI'18), 1-11. DOI=https://doi.org/10.1145/3173574.3173691
56. Yang Zhang and Chris Harrison. 2015. Tomo: Wearable, Low-Cost Electrical Impedance Tomography for Hand Gesture Recognition. In Proceedings of the 28th Annual ACM Symposium on User Interface Software and Technology (UIST'15), 167-173. DOI=https://doi.org/10.1145/2807442.2807480
57. Yang Zhang, Wolf Kienzle, Yanjun Ma, Shiu S. Ng, Hrvoje Benko and Chris Harrison. 2019. ActiTouch: Robust Touch Detection for On-Skin AR/VR Interfaces. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST'19), 1151–1159. DOI=https://doi.org/10.1145/3332165.3347869
58. Yang Zhang, Gierad Laput and Chris Harrison. 2017. Electrick: Low-Cost Touch Sensing Using Electric Field Tomography. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI'17), 1-14. DOI=https://doi.org/10.1145/3025453.3025842
59. Yang Zhang, Chouchang Yang, Scott E. Hudson, Chris Harrison and Alanson Sample. 2018. Wall++: Room-Scale Interactive and Context-Aware Sensing. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 273. DOI=https://doi.org/10.1145/3173574.3173847
60. Yang Zhang, Junhan Zhou, Gierad Laput and Chris Harrison. 2016. SkinTrack: Using the Body as an Electrical Waveguide for Continuous Finger Tracking on the Skin. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI'16), 1491-1503. DOI=https://doi.org/10.1145/2858036.2858082