NTU Theses and Dissertations Repository
College of Management · Department of Information Management
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/69551

Full metadata record (DC field: value [language])
dc.contributor.advisor: 陳炳宇 (Bing-Yu Chen)
dc.contributor.author: Wei-Lun Lien
dc.contributor.author: 李瑋倫 [zh_TW]
dc.date.accessioned: 2021-06-17T03:19:02Z
dc.date.available: 2021-07-06
dc.date.copyright: 2018-07-06
dc.date.issued: 2018
dc.date.submitted: 2018-06-27
dc.identifier.citation:
[1] C. Ahlberg and B. Shneiderman. The alphaslider: A compact and rapid selector. In Proceedings
of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’94,
pages 365–371, New York, NY, USA, 1994. ACM.
[2] D. Bonnet, C. Appert, and M. Beaudouin-Lafon. Extending the vocabulary of touch events
with thumbrock. In Proceedings of Graphics Interface 2013, GI ’13, pages 221–228,
Toronto, Ont., Canada, 2013. Canadian Information Processing Society.
[3] G. Bradski and A. Kaehler. OpenCV. Dr. Dobb’s Journal of Software Tools, 3, 2000.
[4] L. Bretzner, I. Laptev, and T. Lindeberg. Hand gesture recognition using multi-scale colour
features, hierarchical models and particle filtering. In Proceedings of Fifth IEEE International
Conference on Automatic Face Gesture Recognition, pages 423–428, May 2002.
[5] S. A. Brewster and L. M. Brown. Non-visual information display using tactons. In CHI ’04
Extended Abstracts on Human Factors in Computing Systems, CHI EA ’04, pages 787–788,
New York, NY, USA, 2004. ACM.
[6] L. M. Brown, S. A. Brewster, and H. C. Purchase. A first investigation into the effectiveness
of tactons. In First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for
Virtual Environment and Teleoperator Systems. World Haptics Conference, pages 167–176,
March 2005.
[7] J. R. Cauchard, J. L. Cheng, T. Pietrzak, and J. A. Landay. Activibe: Design and evaluation
of vibrations for progress monitoring. In Proceedings of the 2016 CHI Conference on
Human Factors in Computing Systems, CHI ’16, pages 3261–3271, New York, NY, USA,
2016. ACM.
[8] X. A. Chen, J. Schwarz, C. Harrison, J. Mankoff, and S. E. Hudson. Air+Touch: Interweaving
touch & in-air gestures. In Proceedings of the 27th Annual ACM Symposium on User
Interface Software and Technology, UIST ’14, pages 519–525, New York, NY, USA, 2014.
ACM.
[9] G. Cohn, D. Morris, S. Patel, and D. Tan. Humantenna: Using the body as an antenna for
real-time whole-body interaction. In Proceedings of the SIGCHI Conference on Human
Factors in Computing Systems, CHI ’12, pages 1901–1910, New York, NY, USA, 2012.
ACM.
[10] C. Corsten, B. Daehlmann, S. Voelker, and J. Borchers. Backxpress: Using back-of-device
finger pressure to augment touchscreen input on smartphones. In Proceedings of the 2017
CHI Conference on Human Factors in Computing Systems, CHI ’17, pages 4654–4666, New
York, NY, USA, 2017. ACM.
[11] N. H. Dardas and N. D. Georganas. Real-time hand gesture detection and recognition using
bag-of-features and support vector machine techniques. IEEE Transactions on Instrumentation
and Measurement, 60(11):3592–3607, Nov 2011.
[12] M. V. den Bergh and L. V. Gool. Combining rgb and tof cameras for real-time 3d hand
gesture interaction. In 2011 IEEE Workshop on Applications of Computer Vision (WACV),
pages 66–72, Jan 2011.
[13] H. Francke, J. Ruiz-del Solar, and R. Verschae. Real-time hand gesture detection and recognition
using boosted classifiers and active learning. In D. Mery and L. Rueda, editors, Advances
in Image and Video Technology, pages 533–547, Berlin, Heidelberg, 2007. Springer
Berlin Heidelberg.
[14] J. Griffin. Smart multi-tap text input, June 3 2008. US Patent 7,382,359.
[15] S. Gupta, D. Morris, S. Patel, and D. Tan. Soundwave: Using the doppler effect to sense gestures.
In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems,
CHI ’12, pages 1911–1914, New York, NY, USA, 2012. ACM.
[16] H. Gutowitz. Method and apparatus for improved multi-tap text input, Apr. 17 2001. US
Patent 6,219,731.
[17] C. Harrison and S. Hudson. Using shear as a supplemental two-dimensional input channel
for rich touchscreen interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’12, pages 3149–3152, New York, NY, USA, 2012.
ACM.
[18] C. Harrison, J. Schwarz, and S. E. Hudson. Tapsense: Enhancing finger interaction on touch
surfaces. In Proceedings of the 24th Annual ACM Symposium on User Interface Software
and Technology, UIST ’11, pages 627–636, New York, NY, USA, 2011. ACM.
[19] S. Heo and G. Lee. Forcetap: Extending the input vocabulary of mobile touch screens
by adding tap gestures. In Proceedings of the 13th International Conference on Human
Computer Interaction with Mobile Devices and Services, MobileHCI ’11, pages 113–122,
New York, NY, USA, 2011. ACM.
[20] S. Herz, S. Forstall, and M. Matas. Portable multifunction device, method, and graphical
user interface for interpreting a finger gesture, Jan. 5 2016. US Patent 9,229,634.
[21] S. Hwang, M. Ahn, and K.-y. Wohn. Maggetz: Customizable passive tangible controllers
on and around conventional mobile devices. In Proceedings of the 26th Annual ACM Symposium
on User Interface Software and Technology, UIST ’13, pages 411–416, New York,
NY, USA, 2013. ACM.
[22] B. Kellogg, V. Talla, and S. Gollakota. Bringing gesture recognition to all devices. In
Proceedings of the 11th USENIX Conference on Networked Systems Design and Implementation,
NSDI’14, pages 303–316, Berkeley, CA, USA, 2014. USENIX Association.
[23] W. Kienzle and K. Hinckley. Lightring: Always-available 2d input on any surface. In Proceedings
of the 27th Annual ACM Symposium on User Interface Software and Technology,
UIST ’14, pages 157–160, New York, NY, USA, 2014. ACM.
[24] M. Le Goc, S. Taylor, S. Izadi, and C. Keskin. A low-cost transparent electric field sensor
for 3d interaction on mobile devices. In Proceedings of the 32Nd Annual ACM Conference
on Human Factors in Computing Systems, CHI ’14, pages 3167–3170, New York, NY, USA,
2014. ACM.
[25] E. C. Lechelt. Temporal numerosity discrimination: Intermodal comparisons revisited.
British Journal of Psychology, 66(1):101–108, 1975.
[26] Y.-C. Liao, Y.-C. Chen, L. Chan, and B.-Y. Chen. Dwell+: Multi-level mode selection using vibrotactile cues. In Proceedings of the 30th Annual ACM Symposium on User Interface
Software and Technology, UIST ’17, pages 5–16, New York, NY, USA, 2017. ACM.
[27] J. Lien, N. Gillian, M. E. Karagozler, P. Amihood, C. Schwesig, E. Olson, H. Raja, and
I. Poupyrev. Soli: Ubiquitous gesture sensing with millimeter wave radar. ACM Trans.
Graph., 35(4):142:1–142:19, July 2016.
[28] S.-Y. Lin, C.-H. Su, K.-Y. Cheng, R.-H. Liang, T.-H. Kuo, and B.-Y. Chen. Pub - point
upon body: Exploring eyes-free interaction and methods on an arm. In Proceedings of the
24th Annual ACM Symposium on User Interface Software and Technology, UIST ’11, pages
481–488, New York, NY, USA, 2011. ACM.
[29] X. Liu and K. Fujimura. Hand gesture recognition using depth data. In Sixth IEEE International
Conference on Automatic Face and Gesture Recognition, 2004. Proceedings., pages
529–534, May 2004.
[30] E. Ohn-Bar and M. M. Trivedi. Hand gesture recognition in real time for automotive interfaces:
A multimodal vision-based approach and evaluations. IEEE Transactions on Intelligent
Transportation Systems, 15(6):2368–2377, 2014.
[31] A. Pavlovych and W. Stuerzlinger. Model for non-expert text entry speed on 12-button
phone keypads. In Proceedings of the SIGCHI Conference on Human Factors in Computing
Systems, CHI ’04, pages 351–358, New York, NY, USA, 2004. ACM.
[32] Q. Pu, S. Gupta, S. Gollakota, and S. Patel. Whole-home gesture recognition using wireless
signals. In Proceedings of the 19th Annual International Conference on Mobile Computing
& Networking, MobiCom ’13, pages 27–38, New York, NY, USA, 2013. ACM.
[33] J. Segen and S. Kumar. Gesture VR: Vision-based 3D hand interface for spatial interaction.
In Proceedings of the Sixth ACM International Conference on Multimedia, MULTIMEDIA
’98, pages 455–464, New York, NY, USA, 1998. ACM.
[34] A. Shahrokni, J. Jenaro, T. Gustafsson, A. Vinnberg, J. Sandsjö, and M. Fjeld. One-dimensional
force feedback slider: Going from an analogue to a digital platform. In Proceedings
of the 4th Nordic Conference on Human-computer Interaction: Changing Roles,
NordiCHI ’06, pages 453–456, New York, NY, USA, 2006. ACM.
[35] S. S. Snibbe, K. E. MacLean, R. Shaw, J. Roderick, W. L. Verplank, and M. Scheeff. Haptic
techniques for media control. In Proceedings of the 14th Annual ACM Symposium on User
Interface Software and Technology, UIST ’01, pages 199–208, New York, NY, USA, 2001.
ACM.
[36] Q. Wan, Y. Li, C. Li, and R. Pal. Gesture recognition for smart home applications using
portable radar sensors. In 2014 36th Annual International Conference of the IEEE Engineering
in Medicine and Biology Society, pages 6414–6417, Aug 2014.
[37] C. Zhao, K.-Y. Chen, M. T. I. Aumi, S. Patel, and M. S. Reynolds. Sideswipe: Detecting
in-air gestures around mobile devices using actual gsm signal. In Proceedings of the 27th
Annual ACM Symposium on User Interface Software and Technology, UIST ’14, pages 527–
534, New York, NY, USA, 2014. ACM.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/69551
dc.description.abstract: Although gesture-sensing technologies have been thoroughly explored in recent years and can already recognize very fine movements, human perceptual and motor abilities are not precise enough to perform such small gestures, which greatly limits the possibilities of human-machine interaction. To effectively enlarge the control space, this thesis proposes HapTick, which uses haptic feedback to increase the expressiveness of one-dimensional swipe gestures. By counting the vibrations felt along the swipe path, users can accurately identify the selected target or mode within a single conventional swipe. We validated the idea with three experiments. In the first, we found that with tick spacing greater than 3 mm, participants' accuracy exceeded 95%. In the second, we compared HapTick with the conventional multiple-swipe gesture and collected users' subjective impressions while performing the tasks; the results show that HapTick clearly outperformed conventional swiping in physical demand and overall preference. In the third, we applied HapTick in different interaction settings (on the forearm, on object surfaces, and in mid-air) and evaluated its accuracy and completion time. Finally, we also propose several interaction scenarios and applications. [zh_TW]
dc.description.abstract: While high-resolution, miniature gesture-sensing technology has been widely explored, the interaction space remains limited by the low resolution of the human proprioceptive sense. To better utilize the control space, we introduce HapTick, a method that discretizes one-dimensional swipe gestures with prompt tactile cues. By counting the tactile stimuli along the swipe path, the user can effectively select a numeric target in one typical swipe. We first derived the effective interval between modes: with a distance of more than 3 mm between ticks, an overall accuracy of 95% can be achieved. In a second study, we compared two methods for selecting a digit from 1 to 10. While there was no difference in completion time between multiple-swipe selection and HapTick (3.2 s vs. 3.4 s), HapTick outperformed it in both physical demand (5 vs. 2*) and overall preference (2.41 vs. 4.41*). Lastly, an exploratory study confirmed the feasibility of applying HapTick to other interaction domains, e.g., on-forearm swiping, input on 2D surfaces, and in-air gestures. Several scenarios were also proposed based on our findings. [en]
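The single-swipe, tick-counting selection described in the abstract can be sketched in a few lines. This is an illustrative model only, not the thesis implementation: the function names, the 3 mm default interval, and the 1-to-10 target range are assumptions drawn from the numbers reported above.

```python
from typing import Optional

def ticks_felt(swipe_mm: float, interval_mm: float = 3.0) -> int:
    """Count the vibrotactile ticks crossed along a swipe of swipe_mm
    millimetres, one tick per interval_mm (the pilot study suggests
    intervals above 3 mm keep counting accuracy above 95%)."""
    if interval_mm <= 0:
        raise ValueError("tick interval must be positive")
    return int(abs(swipe_mm) // interval_mm)

def selected_target(swipe_mm: float, interval_mm: float = 3.0,
                    n_targets: int = 10) -> Optional[int]:
    """Map one swipe to a target in 1..n_targets by its tick count;
    a swipe shorter than one interval selects nothing."""
    ticks = ticks_felt(swipe_mm, interval_mm)
    if ticks == 0:
        return None
    return min(ticks, n_targets)
```

For example, a 10.5 mm swipe crosses three 3 mm ticks and therefore selects target 3; capping the count at `n_targets` mirrors the 1-to-10 digit-selection task in the second study.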
dc.description.provenance: Made available in DSpace on 2021-06-17T03:19:02Z (GMT). No. of bitstreams: 1. ntu-107-R05725005-1.pdf: 10889985 bytes, checksum: 40041fb7361f4b6e67629f20a87a2c21 (MD5). Previous issue date: 2018 [en]
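The provenance field above records an MD5 checksum for the archived bitstream, so a downloaded copy can be verified against the record. A generic sketch using Python's standard `hashlib` (the filename is the one listed in the provenance field):

```python
import hashlib

def md5_hex(data: bytes) -> str:
    """MD5 hex digest of an in-memory byte string."""
    return hashlib.md5(data).hexdigest()

def md5_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through MD5 in 1 MiB chunks and return the hex digest."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the checksum recorded in dc.description.provenance:
# md5_of_file("ntu-107-R05725005-1.pdf") == "40041fb7361f4b6e67629f20a87a2c21"
```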
dc.description.tableofcontents:
中文摘要 (Chinese Abstract)
Abstract
List of Figures
Chapter 1 Introduction
1.1 HapTick
Chapter 2 Related Work
2.1 Designing Swipe Interaction
2.2 Technique of Sensing Gesture
2.3 Numerosity Perception
2.4 Exploring Finger-Touch Modalities
Chapter 3 STUDY OVERVIEW
3.1 PILOT STUDY: EXPLORING THE LENGTH OF GAP OF HAPTICK
3.1.1 Study Design
3.1.2 Tasks and Procedures
3.1.3 Participants
3.1.4 Results
3.2 USER STUDY 1: BASELINE PERFORMANCE OF HAPTICK ON HAND
3.2.1 Study Design
3.2.2 Apparatus
3.2.3 Participants
3.2.4 Tasks and Procedures
3.2.5 Results
3.2.6 Discussion
3.3 USER STUDY 2: SUBJECTIVE ANALYSIS BETWEEN MULTIPLE SWIPING SELECTION AND HAPTICK
3.3.1 Study Design
3.3.2 Participants and Apparatus
3.3.3 Tasks and Procedures
3.3.4 Longitudinal Pilot
3.3.5 Subjective Rating Analysis
3.3.6 Results
3.3.7 Discussion
3.4 USER STUDY 3: HAPTICK ON TABLE, ON FOREARM, IN AIR
3.4.1 Study Design
3.4.2 Participants and Apparatus
3.4.3 Tasks and Procedures
3.4.4 Results
3.4.5 Discussion
Chapter 4 INTERACTION SCENARIOS
4.1 Controlling IoT Devices
4.2 Eyes-free Fast Selection on Wearables
4.3 Private and Subtle Scenario
4.4 Integrated into Modern Devices and Other Works
Chapter 5 DISCUSSION
5.1 Higher-Dimensional Scenario
5.2 Different Fingers and Body Parts
5.3 Integrate with Other Input Methods to Increase Modality
5.4 The Trade-off Between Length of Interval and Completion Time
Chapter 6 LIMITATIONS AND FUTURE WORK
6.1 Real-world Scenario and Multi-tasking
6.2 Evaluation of Applications
Chapter 7 CONCLUSION
Bibliography
dc.language.iso: en
dc.subject: 輸入 (input) [zh_TW]
dc.subject: 滑動 (swipe) [zh_TW]
dc.subject: 觸摸 (touch) [zh_TW]
dc.subject: 震動回饋 (vibrotactile feedback) [zh_TW]
dc.subject: 手指 (finger) [zh_TW]
dc.subject: 實驗 (experiment) [zh_TW]
dc.subject: Experiment [en]
dc.subject: Haptically-augmented Input [en]
dc.subject: Input Modality [en]
dc.subject: Finger [en]
dc.subject: Swipe [en]
dc.subject: Touch [en]
dc.subject: Vibrotactile Feedback [en]
dc.subject: Numerosity Perception [en]
dc.title: 利用震動回饋改善滑動手勢用以增加多個輸入模式 (Improving Swipe Gestures with Vibrotactile Feedback to Enable Multiple Input Modes) [zh_TW]
dc.title: HapTick: Highly Accessible Gestures Using Tactile Cues [en]
dc.type: Thesis
dc.date.schoolyear: 106-2
dc.description.degree: 碩士 (Master)
dc.contributor.oralexamcommittee: 余能豪, 詹力韋, 黃大源, 張永儒
dc.subject.keyword: 滑動 (swipe), 觸摸 (touch), 震動回饋 (vibrotactile feedback), 輸入 (input), 手指 (finger), 實驗 (experiment) [zh_TW]
dc.subject.keyword: Swipe, Touch, Vibrotactile Feedback, Numerosity Perception, Haptically-augmented Input, Input Modality, Finger, Experiment [en]
dc.relation.page: 42
dc.identifier.doi: 10.6342/NTU201801152
dc.rights.note: 有償授權 (fee-based authorization)
dc.date.accepted: 2018-06-27
dc.contributor.author-college: 管理學院 (College of Management) [zh_TW]
dc.contributor.author-dept: 資訊管理學研究所 (Graduate Institute of Information Management) [zh_TW]
Appears in Collections: Department of Information Management (資訊管理學系)

Files in This Item:
File: ntu-107-1.pdf · Size: 10.63 MB · Format: Adobe PDF · Access: restricted (not authorized for public access)


Except where their copyright terms are otherwise specified, items in this system are protected by copyright, with all rights reserved.

Contact:
No. 1, Sec. 4, Roosevelt Rd., Da'an Dist., Taipei 10617, Taiwan (R.O.C.)
Tel: (02) 33662353
Email: ntuetds@ntu.edu.tw
© NTU Library All Rights Reserved