NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/74354
Full metadata record
DC Field / Value / Language
dc.contributor.advisor: 陳彥仰 (Mike Y. Chen)
dc.contributor.author: Tzu-Chuan Chen [en]
dc.contributor.author: 陳子權 (Tzu-Chuan Chen) [zh_TW]
dc.date.accessioned: 2021-06-17T08:31:22Z
dc.date.available: 2020-08-13
dc.date.copyright: 2019-08-13
dc.date.issued: 2019
dc.date.submitted: 2019-08-12
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/74354
dc.description.abstract: Elderly users and people with upper-limb motor impairments have difficulty operating touch devices, and the default recognizers that ship with today's phones are not necessarily suited to users with special needs. We present DeepGesture, a two-stage model that learns from the user's own behavior. In the first stage, a convolutional network classifies the user's gesture, markedly improving the recognition rate of common gestures such as tap and pan. Once a tap is recognized, a novel tap optimizer selects the most significant touch point to raise the tap success rate. The results show that DeepGesture achieves a higher success rate than the system's default recognizer. [zh_TW]
dc.description.abstract: Elderly people and motor-impaired users have difficulty interacting with touch-screen devices. Commonly used mobile systems rely on a general model for gesture recognition, but a general threshold-based model may not meet these users' special needs. Hence, we present DeepGesture, a two-stage model that learns each user's gestures. In the first stage, a convolutional neural network classifies the gesture, markedly improving the success rate of recognizing common gestures such as tap and pan. Once a tapping gesture is recognized, a novel tap optimizer chooses the most significant touch point to obtain a higher tapping success rate. The results show that DeepGesture achieves a higher success rate than the iOS default recognizer. [en]
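The two-stage pipeline the abstract describes can be sketched as below. This is an illustrative stand-in, not the thesis code: the real first stage is a trained convolutional neural network, replaced here by a simple displacement threshold so the control flow is runnable, and the tap optimizer's "most significant touch point" criterion is approximated by a highest-pressure heuristic. All names, fields, and thresholds are hypothetical.

```python
# Hypothetical sketch of a DeepGesture-style two-stage recognizer.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class TouchPoint:
    x: float        # screen coordinates
    y: float
    t: float        # timestamp in seconds
    pressure: float # normalized 0..1

def classify_gesture(trace: List[TouchPoint]) -> str:
    """Stage 1: label a touch trace as 'tap' or 'pan'.

    Stand-in for the CNN classifier: a trace whose start-to-end
    displacement stays small is treated as a tap, anything longer
    as a pan. (10.0 is an arbitrary illustrative threshold.)
    """
    dx = trace[-1].x - trace[0].x
    dy = trace[-1].y - trace[0].y
    return "tap" if (dx * dx + dy * dy) ** 0.5 < 10.0 else "pan"

def tap_optimizer(trace: List[TouchPoint]) -> TouchPoint:
    """Stage 2: choose the single touch point reported to the UI.

    Stand-in heuristic: return the highest-pressure sample, on the
    assumption that it best reflects the intended target.
    """
    return max(trace, key=lambda p: p.pressure)

def recognize(trace: List[TouchPoint]) -> Tuple[str, Optional[TouchPoint]]:
    """Run stage 1; only taps are refined by the stage-2 optimizer."""
    gesture = classify_gesture(trace)
    if gesture == "tap":
        return gesture, tap_optimizer(trace)
    return gesture, None
```

A tap trace with slight jitter is classified as a tap and reduced to one representative point, while a long horizontal trace falls through as a pan with no optimized point.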
dc.description.provenance: Made available in DSpace on 2021-06-17T08:31:22Z (GMT). No. of bitstreams: 1. ntu-108-R06922088-1.pdf: 5529697 bytes, checksum: d3d44ad0b2f8b56c0e8715f4d92c5ba2 (MD5). Previous issue date: 2019 [en]
dc.description.tableofcontents:
Acknowledgements
Abstract (Chinese)
Abstract
0.1 Introduction
0.2 Related Work
0.2.1 Behavior of Motor-Impaired and Elderly Users
0.2.2 Personal Model for Optimization of Specific Gestures
0.2.3 Touch Gesture Classification
0.2.4 Kinematic Feature
0.3 Gesture Classes
0.4 User Study
0.4.1 Procedure
0.4.2 Participants
0.5 Behavior Analysis
0.5.1 Kinematics Features
0.5.2 Duration of Gesture
0.5.3 Gesture Event Analysis
0.6 System
0.6.1 Task Classification Model
0.6.2 Tap Optimizer
0.7 Success Verification
0.7.1 Task Classification Verification
0.7.2 Tap Optimizer Verification
0.8 Result
0.8.1 Classification
0.8.2 Tap Optimizer Results
0.9 Discussion
0.10 Limitation
0.11 Conclusion
Bibliography
dc.language.iso: en
dc.subject: touch gesture recognition (觸控手勢辨識) [zh_TW]
dc.subject: deep learning (深度學習) [zh_TW]
dc.subject: accessibility (輔助使用) [zh_TW]
dc.subject: accessibility [en]
dc.subject: deep learning [en]
dc.subject: touch gesture recognition [en]
dc.title: DeepGesture: improving touch-gesture recognition for users with motor impairments using a convolutional neural network [zh_TW]
dc.title: DeepGesture: Improving Touchscreen Gesture Recognition using Convolutional Neural Network for Users with Varying Motor Skill Levels [en]
dc.type: Thesis
dc.date.schoolyear: 107-2
dc.description.degree: Master
dc.contributor.oralexamcommittee: 黃大源 (Da-Yuan Huang), 詹力韋 (Li-Wei Chan), 鄭龍磻 (Lung-Pan Cheng)
dc.subject.keyword: accessibility, deep learning, touch gesture recognition [zh_TW]
dc.subject.keyword: accessibility, deep learning, touch gesture recognition [en]
dc.relation.page: 48
dc.identifier.doi: 10.6342/NTU201902596
dc.rights.note: Licensed for a fee (有償授權)
dc.date.accepted: 2019-08-12
dc.contributor.author-college: College of Electrical Engineering and Computer Science [zh_TW]
dc.contributor.author-dept: Graduate Institute of Computer Science and Information Engineering [zh_TW]
Appears in collections: Department of Computer Science and Information Engineering

Files in this item:
File / Size / Format
ntu-108-1.pdf (not authorized for public access) · 5.4 MB · Adobe PDF
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
