Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/57834
Full metadata record
DC field: value (language)
dc.contributor.advisor: 陳彥仰
dc.contributor.author: Wei-Chen Chu (en)
dc.contributor.author: 朱唯辰 (zh_TW)
dc.date.accessioned: 2021-06-16T07:06:25Z
dc.date.available: 2014-07-15
dc.date.copyright: 2014-07-15
dc.date.issued: 2014
dc.date.submitted: 2014-07-10
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/57834
dc.description.abstract: PalmType provides a new text-input method for smart glasses and wearable displays such as Google Glass. Leveraging proprioception, users can type on their palms while looking straight ahead, with corresponding visual feedback shown on the display, so they can enter text even without remembering the exact key positions. An array of 15 infrared proximity sensors worn on the wrist detects where a finger presses on the palm and when a tap occurs, and users do not need to hold any additional device. We ran design sessions to understand how users lay out a QWERTY keyboard on their palms based on proprioception. To evaluate PalmType's typing performance and users' preferences, we conducted a 12-person user study with Google Glass and a Vicon motion-capture system; the results show that PalmType with a user-defined QWERTY layout is 39% faster than the existing touchpad input method, and 92% of participants preferred PalmType as the text-input method for smart glasses. (zh_TW)
dc.description.abstract: We present PalmType, which uses palms as interactive keyboards for smart wearable displays, such as Google Glass. PalmType leverages users' innate ability to pinpoint specific areas of their palms and fingers without visual attention (i.e., proprioception) and provides visual feedback via the wearable display. With wrist-worn sensors and wearable displays, PalmType enables typing without requiring users to hold any device or to look at their hands. We conducted design sessions to see how users map a QWERTY layout to their hands based on proprioception. To evaluate typing performance and preference, we conducted a 12-person user study using Google Glass and a Vicon motion tracking system, which showed that PalmType with a user-defined QWERTY layout is 39% faster than current touchpad-based keyboards. In addition, PalmType was preferred by 92% of the participants as the input method for smart glasses. We demonstrate the feasibility of wearable PalmType by building a prototype that uses a wrist-worn array of 15 infrared sensors to detect users' finger positions and taps, and provides visual feedback via Google Glass. (en)
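As a rough illustration of the sensing pipeline the abstract describes (a wrist-worn array of 15 IR proximity sensors detecting where and when a finger taps the palm, then mapping that location to a key of a user-defined QWERTY layout), here is a minimal sketch. The sensor geometry, the TAP_THRESHOLD value, and the SENSOR_TO_KEY mapping are illustrative assumptions, not the implementation reported in the thesis.

```python
# Hypothetical sketch of a PalmType-style sensing loop.
# The 15-sensor geometry, threshold, and key map below are illustrative
# assumptions, not the prototype described in the thesis.
from typing import Optional, Sequence, Tuple

NUM_SENSORS = 15        # wrist-worn IR proximity sensors (per the abstract)
TAP_THRESHOLD = 0.6     # normalized proximity treated as "finger down" (assumed)

# Assumed mapping from the most strongly activated sensor to a key of a
# user-defined QWERTY layout laid out across the palm.
SENSOR_TO_KEY = {
    0: "q", 1: "w", 2: "e", 3: "r", 4: "t",
    5: "a", 6: "s", 7: "d", 8: "f", 9: "g",
    10: "z", 11: "x", 12: "c", 13: "v", 14: "b",
}


def detect_key(readings: Sequence[float], was_down: bool) -> Tuple[Optional[str], bool]:
    """Return (key, is_down); a key is emitted only on the down-going edge of a tap."""
    if len(readings) != NUM_SENSORS:
        raise ValueError(f"expected {NUM_SENSORS} readings, got {len(readings)}")

    # Pick the sensor with the strongest proximity response.
    peak_index = max(range(NUM_SENSORS), key=lambda i: readings[i])
    is_down = readings[peak_index] >= TAP_THRESHOLD

    # Emit a key only when the finger transitions from "up" to "down",
    # so a held press does not repeat characters.
    if is_down and not was_down:
        return SENSOR_TO_KEY.get(peak_index), True
    return None, is_down


if __name__ == "__main__":
    frames = [
        [0.1] * NUM_SENSORS,                    # no touch
        [0.1] * 6 + [0.9] + [0.1] * 8,          # tap over sensor 6 -> "s"
        [0.1] * 6 + [0.9] + [0.1] * 8,          # still held, no repeat
        [0.1] * NUM_SENSORS,                    # release
    ]
    down = False
    for frame in frames:
        key, down = detect_key(frame, down)
        if key:
            print("typed:", key)
```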
dc.description.provenance: Made available in DSpace on 2021-06-16T07:06:25Z (GMT). No. of bitstreams: 1; ntu-103-R01922002-1.pdf: 19099312 bytes, checksum: a829f1abeca62b95bee1ec50f59f81fa (MD5). Previous issue date: 2014. (en)
dc.description.tableofcontents:
Certification by the Oral Examination Committee ii
Acknowledgements iii
Abstract (Chinese) iv
Abstract v
1 Introduction 1
2 RELATED WORK 4
2.1 Palm- and Arm-based Interfaces 4
2.2 Mid-air Text Input 5
2.3 Virtual QWERTY Keyboards 5
2.4 Wearable Sensing Techniques 6
3 PALMTYPE DESIGN 8
3.1 Design Sessions 8
3.2 Results 9
4 EVALUATION 11
4.1 Implementation using Motion Tracking System 11
4.2 Design 12
4.3 Participants 14
4.4 Results 15
4.5 Not Corrected Error Rate and Corrected Error Rate 16
4.6 Subjective Measures 17
5 WEARABLE PALMTYPE PROTOTYPE 20
5.1 Sensor Board 20
5.2 Touch Event Detection Accuracy 21
6 DISCUSSION 23
6.1 Additional Keyboard Layouts 23
6.2 IR Proximity Sensor Sensitivity 23
7 LIMITATION AND FUTURE WORK 25
8 Conclusion 26
Bibliography 27
dc.language.iso: en
dc.title: 為智慧眼鏡設計之手掌鍵盤研究 (zh_TW)
dc.title: PalmType: Using Palms as Keyboards for Smart Glasses (en)
dc.type: Thesis
dc.date.schoolyear: 102-2
dc.description.degree: Master's (碩士)
dc.contributor.oralexamcommittee: 陳顥齡, 陳炳宇, 梁容輝, 林維真
dc.subject.keyword: 鍵盤, 智慧眼鏡, 手掌 (keyboard, smart glasses, palm) (zh_TW)
dc.subject.keyword: keyboard, smart glasses, palm (en)
dc.relation.page: 30
dc.rights.note: Paid authorization (有償授權)
dc.date.accepted: 2014-07-10
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science) (zh_TW)
dc.contributor.author-dept: 資訊工程學研究所 (Graduate Institute of Computer Science and Information Engineering) (zh_TW)
Appears in Collections: 資訊工程學系 (Department of Computer Science and Information Engineering)

Files in this item:
File: ntu-103-1.pdf (public access not currently authorized)
Size: 18.65 MB
Format: Adobe PDF


Unless their copyright terms state otherwise, all items in this system are protected by copyright, with all rights reserved.
