Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/51726
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 莊裕澤(Yuh-Jzer Joung) | |
dc.contributor.author | Wei-Hung Chen | en |
dc.contributor.author | 陳威宏 | zh_TW |
dc.date.accessioned | 2021-06-15T13:46:32Z | - |
dc.date.available | 2017-12-02 | |
dc.date.copyright | 2015-12-02 | |
dc.date.issued | 2015 | |
dc.date.submitted | 2015-11-25 | |
dc.identifier.citation | [1] Huang, D.-Y., Tsai, M.-C., Tung, Y.-C., Tsai, M.-L., Yeh, Y.-T., Chan, L., Hung, Y.-P., and Chen, M. Y. TouchSense: Expanding touchscreen input vocabulary using different areas of users' finger pads. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '14, ACM (New York, NY, USA, 2014), 189–192.
[2] Xiao, R., Laput, G., and Harrison, C. Expanding the input expressivity of smartwatches with mechanical pan, twist, tilt and click. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '14, ACM (New York, NY, USA, 2014), 193–196.
[3] Perrault, S. T., Lecolinet, E., Eagan, J., and Guiard, Y. WatchIt: Simple gestures and eyes-free interaction for wristwatches and bracelets. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '13, ACM (New York, NY, USA, 2013), 1451–1460.
[4] Baudisch, P., and Chu, G. Back-of-device interaction allows creating very small touch devices. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '09, ACM (New York, NY, USA, 2009), 1923–1932.
[5] Butler, A., Izadi, S., and Hodges, S. SideSight: Multi-"touch" interaction around small devices. In Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology, UIST '08, ACM (New York, NY, USA, 2008), 201–204.
[6] Harrison, C., and Hudson, S. E. Abracadabra: Wireless, high-precision, and unpowered finger input for very small mobile devices. In Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology, UIST '09, ACM (New York, NY, USA, 2009), 121–124.
[7] Rekimoto, J. GestureWrist and GesturePad: Unobtrusive wearable interaction devices. In Proceedings of the 5th IEEE International Symposium on Wearable Computers, ISWC '01, IEEE Computer Society (Washington, DC, USA, 2001), 21–.
[8] Su, C.-H., Chan, L., Weng, C.-T., Liang, R.-H., Cheng, K.-Y., and Chen, B.-Y. NailDisplay: Bringing an always available visual display to fingertips. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '13, ACM (New York, NY, USA, 2013), 1461–1464.
[9] Laput, G., Xiao, R., Chen, X. A., Hudson, S. E., and Harrison, C. Skin Buttons: Cheap, small, low-power and clickable fixed-icon laser projectors. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, UIST '14, ACM (New York, NY, USA, 2014), 389–394.
[10] Patel, S. N., and Abowd, G. D. BLUI: Low-cost localized blowable user interfaces. In Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology, UIST '07, ACM (New York, NY, USA, 2007), 217–220.
[11] Igarashi, T., and Hughes, J. F. Voice as sound: Using non-verbal voice input for interactive control. In Proceedings of the 14th Annual ACM Symposium on User Interface Software and Technology, UIST '01, ACM (New York, NY, USA, 2001), 155–156.
[12] Harada, S., Landay, J. A., Malkin, J., Li, X., and Bilmes, J. A. The Vocal Joystick: Evaluation of voice-based cursor control techniques. In Proceedings of the 8th International ACM SIGACCESS Conference on Computers and Accessibility, Assets '06, ACM (New York, NY, USA, 2006), 197–204.
[13] Saponas, T. S., Tan, D. S., Morris, D., Balakrishnan, R., Turner, J., and Landay, J. A. Enabling always-available input with muscle-computer interfaces. In Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology, UIST '09, ACM (New York, NY, USA, 2009), 167–176.
[14] Karimullah, A. S., Sears, A., Lin, M., and Goldman, R. Speech-based cursor control: Understanding the effects of variable cursor speed on target selection. In Proceedings of HCII 2003 (2003), 681–685.
[15] Feijó Filho, J., Prata, W., and Valle, T. Breath Mobile: A low-cost software-based breathing controlled mobile phone interface. In Proceedings of the 14th International Conference on Human-Computer Interaction with Mobile Devices and Services Companion, MobileHCI '12, ACM (San Francisco, CA, USA, September 2012).
[16] Dai, L., Goldman, R., Sears, A., and Lozier, J. Speech-based cursor control: A study of grid-based solutions. In Proceedings of the 6th International ACM SIGACCESS Conference on Computers and Accessibility, Assets '04, ACM (Atlanta, GA, USA, 2004). doi:10.1145/1028630.1028648
[17] Zhu, Z., and Ji, Q. Eye and gaze tracking for interactive graphic display. Machine Vision and Applications 15, 3 (July 2004), 139–148.
[18] Morimoto, C. H., and Mimica, M. R. M. Eye gaze tracking techniques for interactive applications. Computer Vision and Image Understanding 98, 1 (April 2005), 4–24. doi:10.1016/j.cviu.2004.07.010
[19] Ajzen, I. The theory of planned behavior. Organizational Behavior and Human Decision Processes 50, 2 (1991), 179–211.
[20] Piezoelectric Sound Meter: http://www.nerdkits.com/videos/sound_meter/ | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/51726 | - |
dc.description.abstract | The emergence of smart devices such as phones and tablets has brought great convenience, lowering barriers to interpersonal communication and making information easier to obtain. Under the current mainstream interaction paradigm, users interact with these smart mobile devices mainly through touch interfaces. Because of this limitation of the devices' interaction style, in situations where both hands are occupied (carrying objects, holding an umbrella, riding public transit, or cycling), users often cannot easily satisfy their information needs through touch operations. Likewise, for some people with physical disabilities, operating these smart devices is extremely difficult or nearly impossible. In this study we therefore introduce a novel human-computer interaction technique, Blowatch. Using blowing as an alternative user input modality, it provides hands-free interaction for these usage scenarios and for users with special needs. We demonstrate example interaction techniques supported by Blowatch and related application directions, and propose design guidelines and interface conventions for blow-based interaction. | zh_TW |
dc.description.abstract | The emergence of smart devices promises to further enhance the convenience of everyday communication and information retrieval tasks. However, in some scenarios users have trouble operating touch screens, such as when carrying a heavy bag or riding a bike. We introduce Blowatch, a novel input method for wrist-wearable devices, and potentially for other mobile devices, that provides hands-free interaction for scenarios in which the hands are already busy. Users blow air at the wrist-worn device to invoke operations such as adjusting music volume, taking pictures, or answering calls. In this paper, we explore the input dimensions of the blow-based gesture set and examine the potential use of blow-based gestures in a user study. To illustrate the potential of our approach, we developed a set of example applications for blow-based gestures and crafted a proof-of-concept prototype. Finally, we compared the social acceptability of common hands-free interaction techniques with that of our approach. | en |
dc.description.provenance | Made available in DSpace on 2021-06-15T13:46:32Z (GMT). No. of bitstreams: 1 ntu-104-R02725033-1.pdf: 13381731 bytes, checksum: 52b56940d36cc6feecb8bd794e691970 (MD5) Previous issue date: 2015 | en |
dc.description.tableofcontents | Certification by the Oral Examination Committee
Acknowledgements ... i
Abstract (Chinese) ... ii
Abstract (English) ... iii
List of Figures ... vi
List of Tables ... viii
Chapter 1 Introduction ... 1
1.1 Motivation and Overview ... 2
1.2 Research Questions and Objectives ... 3
Chapter 2 Related Work ... 4
2.1 The Occlusion (Fat-Finger) Problem on Mobile Touchscreens ... 4
2.2 Hands-Free Interaction Design ... 11
Chapter 3 User Study and Experimental Method ... 21
3.1 User-Centered Design ... 21
3.2 User-Defined Gesture Set ... 22
3.3 User Preferences and In-Depth Interviews ... 26
Chapter 4 Prototype Implementation and Proof of Concept ... 31
4.1 A Piezo Element as the Signal-Sensing Device ... 31
4.2 User Interface and Interaction Design ... 39
4.3 Usability Testing and User Feedback ... 49
4.4 Social Acceptance Evaluation ... 68
Chapter 5 Conclusions and Suggestions ... 73
5.1 Conclusions ... 73
5.2 Suggestions ... 76
5.3 Directions for Future Research ... 78
References ... 81 | |
dc.language.iso | zh-TW | |
dc.title | 以自然吹氣作為人機互動介面的研究 | zh_TW |
dc.title | Research on blowable and hands-free interaction | en |
dc.type | Thesis | |
dc.date.schoolyear | 104-1 | |
dc.description.degree | Master | |
dc.contributor.oralexamcommittee | 陳炳宇(Ping-Yu Chen),岳修平(H-P Yueh) | |
dc.subject.keyword | 人機互動,介面設計,免手持互動,穿戴式裝置,吹氣式互動 | zh_TW |
dc.subject.keyword | Human-computer interaction,wearable device,hands-free interaction,breath-based interface,interaction design,human-centered design | en |
dc.relation.page | 83 | |
dc.rights.note | Paid authorization | |
dc.date.accepted | 2015-11-25 | |
dc.contributor.author-college | College of Management | zh_TW |
dc.contributor.author-dept | Graduate Institute of Information Management | zh_TW |
Appears in Collections: | Department of Information Management |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-104-1.pdf (currently not authorized for public access) | 13.07 MB | Adobe PDF |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
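Chapter 4 of the thesis builds its proof-of-concept prototype around a piezo element as the blow sensor. As a rough illustration only, and with no values taken from the thesis (the function names, window size, and thresholds below are all hypothetical), blow detection over a sampled amplitude signal can be framed as sustained-RMS thresholding:

```python
import math

def rms(window):
    """Root-mean-square amplitude of one window of samples."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def detect_blow(samples, window=256, threshold=0.3, min_windows=4):
    """Report a blow when the RMS envelope stays above `threshold`
    for at least `min_windows` consecutive windows, i.e. a sustained
    loud signal rather than a brief tap or noise spike."""
    run = 0
    for i in range(0, len(samples) - window + 1, window):
        if rms(samples[i:i + window]) > threshold:
            run += 1
            if run >= min_windows:
                return True
        else:
            run = 0
    return False

# Synthetic check: low-level noise vs. a sustained high-amplitude burst.
quiet = [0.01 * ((i % 7) - 3) for i in range(4096)]
blow = quiet[:1024] + [0.8 * math.sin(i / 3.0) for i in range(2048)] + quiet[:1024]
```

Requiring the threshold to hold across several consecutive windows is what separates a deliberate, sustained blow from transient spikes such as a bump against the sensor; the thesis's own sensing pipeline and parameters may differ.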