Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/58263
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 陳彥仰 | |
dc.contributor.author | Min-Lun Tsai | en |
dc.contributor.author | 蔡銘倫 | zh_TW |
dc.date.accessioned | 2021-06-16T08:09:42Z | - |
dc.date.available | 2019-07-22 | |
dc.date.copyright | 2014-07-22 | |
dc.date.issued | 2014 | |
dc.date.submitted | 2014-04-24 | |
dc.identifier.citation | [1] X. Bi, T. Moscovich, G. Ramos, R. Balakrishnan, and K. Hinckley. An exploration of pen rolling for pen-based interaction. In Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology, UIST ’08, pages 191–200, New York, NY, USA, 2008. ACM.
[2] S. Boring, D. Ledo, X. A. Chen, N. Marquardt, A. Tang, and S. Greenberg. The fat thumb: Using the thumb’s contact size for single-handed mobile interaction. In Proceedings of the 14th International Conference on Human-computer Interaction with Mobile Devices and Services Companion, MobileHCI ’12, pages 207–208, New York, NY, USA, 2012. ACM.
[3] B. L. Harrison, K. P. Fishkin, A. Gujar, C. Mochon, and R. Want. Squeeze me, hold me, tilt me! An exploration of manipulative user interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’98, pages 17–24, New York, NY, USA, 1998. ACM Press/Addison-Wesley Publishing Co.
[4] C. Harrison and S. Hudson. Using shear as a supplemental two-dimensional input channel for rich touchscreen interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’12, pages 3149–3152, New York, NY, USA, 2012. ACM.
[5] C. Harrison, J. Schwarz, and S. E. Hudson. Tapsense: Enhancing finger interaction on touch surfaces. In Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, UIST ’11, pages 627–636, New York, NY, USA, 2011. ACM.
[6] K. Hasan, X.-D. Yang, A. Bunt, and P. Irani. A-coord input: Coordinating auxiliary input streams for augmenting contextual pen-based interactions. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’12, pages 805–814, New York, NY, USA, 2012. ACM.
[7] S. Heo and G. Lee. Force gestures: Augmenting touch screen gestures with normal and tangential forces. In Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, UIST ’11, pages 621–626, New York, NY, USA, 2011. ACM.
[8] S. Heo and G. Lee. Forcetap: Extending the input vocabulary of mobile touch screens by adding tap gestures. In Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services, MobileHCI ’11, pages 113–122, New York, NY, USA, 2011. ACM.
[9] S. Heo and G. Lee. Forcedrag: Using pressure as a touch input modifier. In Proceedings of the 24th Australian Computer-Human Interaction Conference, OzCHI ’12, pages 204–207, New York, NY, USA, 2012. ACM.
[10] S. Heo and G. Lee. Indirect shear force estimation for multi-point shear force operations. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’13, pages 281–284, New York, NY, USA, 2013. ACM.
[11] K. Hinckley and H. Song. Sensor synaesthesia: Touch in motion, and motion in touch. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’11, pages 801–810, New York, NY, USA, 2011. ACM.
[12] C. Holz and P. Baudisch. The generalized perceived input point model and how to double touch accuracy by extracting fingerprints. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’10, pages 581–590, New York, NY, USA, 2010. ACM.
[13] C. Holz and P. Baudisch. Fiberio: A touchscreen that senses fingerprints. In Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology, UIST ’13, pages 41–50, New York, NY, USA, 2013. ACM.
[14] K. Iwasaki, T. Miyaki, and J. Rekimoto. Expressive typing: A new way to sense typing pressure and its applications. In CHI ’09 Extended Abstracts on Human Factors in Computing Systems, CHI EA ’09, pages 4369–4374, New York, NY, USA, 2009. ACM.
[15] K. Partridge, S. Chatterjee, V. Sazawal, G. Borriello, and R. Want. Tilttype: Accelerometer-supported text entry for very small devices. In Proceedings of the 15th Annual ACM Symposium on User Interface Software and Technology, UIST ’02, pages 201–204, New York, NY, USA, 2002. ACM.
[16] G. Ramos and R. Balakrishnan. Zliding: Fluid zooming and sliding for high precision parameter manipulation. In Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology, UIST ’05, pages 143–152, New York, NY, USA, 2005. ACM.
[17] G. Ramos, M. Boulos, and R. Balakrishnan. Pressure widgets. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’04, pages 487–494, New York, NY, USA, 2004. ACM.
[18] V. Roth and T. Turner. Bezel swipe: Conflict-free scrolling and multiple selection on mobile touch screen devices. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’09, pages 1523–1526, New York, NY, USA, 2009. ACM.
[19] A. Roudaut, E. Lecolinet, and Y. Guiard. Microrolls: Expanding touch-screen input vocabulary by distinguishing rolls vs. slides of the thumb. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’09, pages 927–936, New York, NY, USA, 2009. ACM.
[20] A. Sugiura and Y. Koseki. A user interface using fingerprint recognition: Holding commands and data objects on fingers. In Proceedings of the 11th Annual ACM Symposium on User Interface Software and Technology, UIST ’98, pages 71–79, New York, NY, USA, 1998. ACM.
[21] Y. Suzuki, K. Misue, and J. Tanaka. Stylus enhancement to enrich interaction with computers. In Proceedings of the 12th International Conference on Human-computer Interaction: Interaction Platforms and Techniques, HCI’07, pages 133–142, Berlin, Heidelberg, 2007. Springer-Verlag.
[22] F. Tian, X. Ao, H. Wang, V. Setlur, and G. Dai. The tilt cursor: Enhancing stimulus-response compatibility by providing 3d orientation cue of pen. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’07, pages 303–306, New York, NY, USA, 2007. ACM.
[23] F. Tian, L. Xu, H. Wang, X. Zhang, Y. Liu, V. Setlur, and G. Dai. Tilt menu: Using the 3d orientation information of pen devices to extend the selection capability of pen-based user interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’08, pages 1371–1380, New York, NY, USA, 2008. ACM.
[24] D. Vogel and G. Casiez. Conte: Multimodal input inspired by an artist’s crayon. In Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, UIST ’11, pages 357–366, New York, NY, USA, 2011. ACM.
[25] D. Wigdor and R. Balakrishnan. Tilttext: Using tilt for text input to mobile phones. In Proceedings of the 16th Annual ACM Symposium on User Interface Software and Technology, UIST ’03, pages 81–90, New York, NY, USA, 2003. ACM. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/58263 | - |
dc.description.abstract | 「TouchSense」是一種旨在讓使用者在智慧型穿戴裝置上快速切換輸入模式的技術。以智慧型手錶為例,其操作介面還是以一般觸控螢幕方式作互動,但穿戴式裝置因為追求體積小,所能提供的操作介面相對縮小,一般的輸入方式,並不適合此操作情境(譬如:由於螢幕面積小,較難以雙指作 pinch 手勢來放大地圖),其原因來自於在一般的操作介面下,每一次的手指點按,只能產生基本的輸入資訊(按下/非按下),導致互動方式因此受限。
「TouchSense」提出了一個嶄新的想法,我們決定在人的食指上不同區域安裝不同的功能,當使用者使用不同手指部位觸按螢幕,代表著不同的輸入模式,即可跳脫出目前觸控螢幕使用方式的框架。我們設計了兩個使用者實驗,找出使用者區分手指不同部位的能力,並提出互動方式的設計方針。之後我們將兩個動態傳感器 (inertial motion unit) 分別安裝在使用者的手指及智慧型手錶上,並透過 SVM 技術辨識出手指相對於觸控螢幕的觸控姿勢,進一步地即時推論出使用者欲使用哪個手指部位觸按螢幕。最後設計了數個實驗比較「TouchSense」與數個在智慧型手錶上常用的模式切換方法,實驗結果表示「TouchSense」讓使用者能夠在模式切換上擁有較佳的表現。 | zh_TW |
dc.description.abstract | We present TouchSense, which provides additional touchscreen input vocabulary by distinguishing the areas of users’ finger pads contacting the touchscreen. It requires minimal touch input area and minimal movement, making it especially ideal for wearable devices such as smart watches and smart glasses. For example, users of a calculator application on a smart watch could tap normally to enter numbers, and tap with the right side of their fingers to enter the operators (e.g. +, -, =). Results from two human-factor studies showed that users could tap a touchscreen with five or more distinct areas of their finger pads. Also, they were able to tap with more distinct areas closer to their fingertips. As a proof of concept, we developed a TouchSense smart watch prototype using inertial measurement sensors, and three example applications: a calculator, a map viewer, and a text editor. In a follow-up study, we further reported user performance and user feedback on the TouchSense applications. | en |
dc.description.provenance | Made available in DSpace on 2021-06-16T08:09:42Z (GMT). No. of bitstreams: 1 ntu-103-R00944005-1.pdf: 16197617 bytes, checksum: 0f231d5f78f014b52bfb29e4c06388ec (MD5) Previous issue date: 2014 | en |
dc.description.tableofcontents | 誌謝 iii
摘要 iv
Abstract v
1 Introduction 1
2 Related Works 4
2.1 Mode-Switching Techniques Using a Stylus 4
2.2 Mode-Switching Techniques Using Finger Touches 5
2.2.1 Single-Tap Mode Switching 5
2.2.2 Multi-Step Mode Switching 6
3 User Study A: Targeting On a Finger Pad 7
3.1 Rationale behind the use of Contact Point Model 7
3.2 Interface and Apparatus 9
3.3 Task and Procedure 9
3.4 Participants 10
3.5 Results and Discussion 10
4 User Study B: Working with touch interaction 13
4.1 Interface and Apparatus 13
4.2 Task and Procedure 13
4.3 Participants 14
4.4 Result and Discussion 15
5 Prototype 16
5.1 Design Consideration 16
5.2 Implementation 16
5.3 Example Applications 17
6 Understanding TouchSense on Real Applications 20
6.1 Calculator: Discrete Interaction 20
6.1.1 Procedure 20
6.1.2 Task 21
6.1.3 Participants 21
6.1.4 Hypothesis 21
6.1.5 Results 21
6.2 Map Viewer: Continuous Interaction 22
6.2.1 Procedure 22
6.2.2 Task 23
6.2.3 Hypothesis 23
6.2.4 Results 24
6.3 Discussion 25
7 Conclusion 27
Bibliography 28 | |
dc.language.iso | en | |
dc.title | 基於不同手指部位快速切換輸入模式之研究 | zh_TW |
dc.title | TouchSense: Expanding Touchscreen Vocabulary using Different Areas of Users’ Finger Pads | en |
dc.type | Thesis | |
dc.date.schoolyear | 102-1 | |
dc.description.degree | 碩士 | |
dc.contributor.oralexamcommittee | 梁容輝,唐玄輝,江振維 | |
dc.subject.keyword | 模式切換,智慧型手錶,行動裝置,觸控螢幕 | zh_TW |
dc.subject.keyword | Augmented finger input, Single-tap mode switching, Input modality, Smart watch, Small-screen devices | en |
dc.relation.page | 31 | |
dc.rights.note | 有償授權 | |
dc.date.accepted | 2014-04-24 | |
dc.contributor.author-college | 電機資訊學院 | zh_TW |
dc.contributor.author-dept | 資訊網路與多媒體研究所 | zh_TW |
Appears in collections: | 資訊網路與多媒體研究所
Files in this item:
File | Size | Format | |
---|---|---|---|
ntu-103-1.pdf (currently not authorized for public access) | 15.82 MB | Adobe PDF |
Except where otherwise noted in their individual copyright terms, all items in this repository are protected by copyright, with all rights reserved.