DSpace
NTU Theses and Dissertations Repository

The DSpace institutional repository preserves digital materials of all kinds (e.g., text, images, PDF) and makes them easy to access.

Please use this Handle URI to cite or link to this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/63344
Full metadata record

DC Field: Value [Language]

dc.contributor.advisor: 陳彥仰 (Mike Y. Chen)
dc.contributor.author: Lung-Pan Cheng [en]
dc.contributor.author: 鄭龍磻 [zh_TW]
dc.date.accessioned: 2021-06-16T16:36:04Z
dc.date.available: 2012-11-22
dc.date.copyright: 2012-11-22
dc.date.issued: 2012
dc.date.submitted: 2012-10-23
dc.identifier.citation:
1. Balakrishnan, R., Baudel, T., Kurtenbach, G., and Fitzmaurice, G. The Rockin'Mouse: integral 3D manipulation on a plane. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM (1997), 311–318.
2. Bartlett, J.F. Rock 'n' Scroll is Here to Stay. IEEE Computer Graphics and Applications, May (2000), 40–45.
3. Bazen, A.M. and Veldhuis, R.N.J. Likelihood-Ratio-Based Biometric Verification. IEEE Transactions on Circuits and Systems for Video Technology 14, 1 (2004), 86–94.
4. Blasko, G., Beaver, W., Kamvar, M., and Feiner, S. Workplane-orientation sensing techniques for tablet PCs. Proceedings of the 17th Annual ACM Symposium on User Interface Software and Technology, (2004).
5. Bradski, G.R. Computer Vision Face Tracking For Use in a Perceptual User Interface. Intel Technology Journal 2, 2 (1998), 12–21.
6. Butler, A., Izadi, S., and Hodges, S. SideSight: multi-touch interaction around small devices. Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology, ACM (2008), 201–204.
7. Chang, C.-C. and Lin, C.-J. LIBSVM: a library for support vector machines. ACM Transactions on Intelligent Systems and Technology (TIST) 2, 3 (2011).
8. Fitzmaurice, G.W., Balakrishnan, R., Kurtenbach, G., and Buxton, B. An exploration into supporting artwork orientation in the user interface. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: the CHI is the Limit, ACM (1999), 167–174.
9. Forstall, S. and Blumenberg, C. Portrait-Landscape Rotation Heuristics for a Portable Multifunction Device. US Patent App. 20,080/165,144, 2011. http://www.freepatentsonline.com/y2008/0165144.html.
10. Gu, J. and Lee, G. TouchString: a flexible linear multi-touch sensor for prototyping a freeform multi-touch surface. Adjunct Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, ACM (2011), 75–76.
11. Hannuksela, J., Sangi, P., Turtinen, M., and Heikkila, J. Face tracking for spatially aware mobile user interfaces. Image and Signal Processing, (2008), 405–412.
12. Hinckley, K., Pierce, J., Sinclair, M., and Horvitz, E. Sensing techniques for mobile interaction. Proceedings of the 13th Annual ACM Symposium on User Interface Software and Technology, ACM (2000), 91–100.
13. Hinckley, K., Sinclair, M., Hanson, E., Szeliski, R., and Conway, M. The VideoMouse: a camera-based multi-degree-of-freedom input device. Proceedings of the 12th Annual ACM Symposium on User Interface Software and Technology, ACM (1999), 103–112.
14. Hinckley, K. and Song, H. Sensor synaesthesia: touch in motion, and motion in touch. Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems, ACM (2011), 801–810.
15. Janicek, M. Capturing an Image with a Camera Integrated in an Electronic Display. US Patent App. 20,090/009,628, 2007. http://www.freepatentsonline.com/y2009/0009628.html.
16. Kim, K., Chang, W., Cho, S.-J., et al. Hand grip pattern recognition for mobile user interfaces. Proceedings of the National Conference on Artificial Intelligence, AAAI Press (2006), 1789–1794.
17. Lienhart, R., Kuranov, A., and Pisarevsky, V. Empirical analysis of detection cascades of boosted classifiers for rapid object detection. Proceedings of the 25th DAGM Pattern Recognition Symposium (DAGM '03), (2003), 297–304.
18. Nybergh, K. and Himberg, J. Touch detection system for mobile terminals. Mobile HCI 2005: 7th International Conference on Human-Computer Interaction with Mobile Devices and Services, (2005), 331–336.
19. Oliver, B.M. Time Domain Reflectometry. HP Journal 15, 6 (1964).
20. Ording, B., Van Os, M., and Chaudhri, I. Screen Rotation Gestures on a Portable Multifunction Device. US Patent App. 20,080/211,778, 2011. http://www.freepatentsonline.com/y2008/0211778.html.
21. Pai, D. and VanDerLoo, E. The Tango: a tangible tangoreceptive whole-hand human interface. WHC '05: Proceedings of the First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, (2005), 141–147.
22. Platt, J. Sequential minimal optimization: a fast algorithm for training support vector machines. (1998), 1–21.
23. Schmidt, A., Beigl, M., and Gellersen, H.W. There is more to context than location. Computers & Graphics 23, 6 (1999), 893–901.
24. Sohn, M. and Lee, G. ISeeU: camera-based user interface for a handheld computer. Proceedings of the 7th International Conference on Human Computer Interaction with Mobile Devices & Services, ACM (2005), 299–302.
25. Taylor, B.T. and Bove Jr., V.M. Graspables: grasp-recognition as a user interface. Proceedings of the 27th International Conference on Human Factors in Computing Systems, ACM (2009), 917–926.
26. Veldhuis, R., Bazen, A., Kauffman, J., and Hartel, P. Biometric verification based on grip-pattern recognition. Security, Steganography, and Watermarking of Multimedia Contents, Volume 5306 of Proceedings of SPIE, (2004), 634–641.
27. Wang, J. and Canny, J. TinyMotion: camera phone based interaction methods. CHI '06 Extended Abstracts on Human Factors in Computing Systems, ACM (2006), 339–344.
28. Wimmer, R. and Baudisch, P. Modular and deformable touch-sensitive surfaces based on time domain reflectometry. Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, ACM Press (2011), 517.
29. Wimmer, R. and Boring, S. HandSense: discriminating different ways of grasping and holding a tangible user interface. Proceedings of the 3rd International Conference on Tangible and Embedded Interaction, ACM (2009), 359–362.
30. Wimmer, R. FlyEye: grasp-sensitive surfaces using optical fiber. TEI '10: Proceedings of the Fourth International Conference on Tangible, Embedded, and Embodied Interaction, (2010), 245–248.
31. Wimmer, R. Grasp sensing for human-computer interaction. TEI '11: Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction, (2011), 221–228.
32. Yu, N.-H., Chan, L.-W., Lau, S.-Y., et al. TUIC: enabling tangible interaction on capacitive multi-touch displays. CHI '11: Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems, (2011), 2995–3004.
33. Yu, N.-H., Tsai, S.-S., Hsiao, I.-C., et al. Clip-on gadgets: expanding multi-touch interaction area with unpowered tactile controls. Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, ACM (2011), 367–372.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/63344
dc.description.abstract: Automatic screen rotation improves the viewing experience on mobile devices, but the current gravity-based approach does not work in every user posture, such as lying on one side, and manually switching the screen orientation requires explicit input from the user, such as a physical or on-screen button. In our survey of 513 users, 42% encounter the screen auto-rotating to the wrong orientation at least once a week, and 24% consider it a serious problem.

We propose two approaches, iRotate and iGrasp, that automatically rotate the screen of a mobile device to the orientation from which the user is viewing it. iRotate augments the current gravity-based approach by using the device's front-facing camera to detect the user's face and rotate the screen toward it. It requires no additional user input and works in any posture and viewing orientation. We implemented real-time iRotate prototypes on the iPhone and iPad and ran a 20-participant feasibility study to measure iRotate's accuracy and limitations.

iGrasp detects how the user is holding the device and automatically rotates the screen to the user's current viewing orientation. We found that a user's grasp is consistent within a given orientation and differs significantly from the grasps used in other orientations. Our iGrasp prototype embeds 32 photoresistors along the four sides and the back of an iPod Touch and uses a support vector machine (SVM) to recognize the user's grasp at a 25 Hz sampling rate. We collected data from 6 users under 54 conditions: 1) holding with the left hand, right hand, or both hands, 2) scrolling, zooming, or typing, 3) portrait, landscape-left, or landscape-right orientation, and 4) sitting or lying on one side. The results show that the grasp-based approach is feasible, and our iGrasp prototype rotates the screen to the correct orientation with 86.7–90.5% accuracy. [zh_TW]
dc.description.abstract: Automatic screen rotation improves the viewing experience and usability of mobile devices, but current gravity-based approaches do not support all postures, such as lying on one side, and manual rotation switches require explicit user input. Our survey of 513 users shows that 42% currently experience unintentional auto-rotation to an incorrect viewing orientation at least several times a week, and 24% rate the problem as very serious to extremely serious.

We present two approaches, iRotate and iGrasp, that automatically rotate screens on mobile devices to match users' viewing orientation. iRotate augments the gravity-based approach: it uses the front camera on the mobile device to detect the user's face and rotates the screen accordingly. It requires no explicit user input and supports different user postures and device orientations. We have implemented an iRotate prototype that works in real time on the iPhone and iPad, and we assess the accuracy and limitations of iRotate through a 20-participant feasibility study.
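The record contains no source code, so the following Python sketch only illustrates the face-based idea, using OpenCV's Haar cascade detector (the boosted-cascade method of Lienhart et al., reference 17 in the citation list). The orientation names, rotation order, and detector parameters are illustrative assumptions, not the authors' implementation: an upright-face detector is run on the camera frame rotated into each candidate orientation, and the first orientation that yields a face is chosen.

    import cv2

    # Hypothetical sketch: pick the screen orientation whose rotated
    # front-camera frame contains an upright face.
    FACE_CASCADE = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    ORIENTATIONS = [  # rotation applied to the camera frame before detection
        ("portrait", None),
        ("landscape-left", cv2.ROTATE_90_CLOCKWISE),
        ("landscape-right", cv2.ROTATE_90_COUNTERCLOCKWISE),
        ("upside-down", cv2.ROTATE_180),
    ]

    def detect_viewing_orientation(frame):
        """Return the first orientation whose rotated frame shows an upright face."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for name, rotation in ORIENTATIONS:
            view = gray if rotation is None else cv2.rotate(gray, rotation)
            faces = FACE_CASCADE.detectMultiScale(view, scaleFactor=1.2,
                                                  minNeighbors=5)
            if len(faces) > 0:
                return name
        return None  # no face detected; fall back to gravity-based rotation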
iGrasp automatically rotates the screen of a mobile device to match the user's viewing orientation based on how the user is grasping the device. Our insight is that users' grasps are consistent within each orientation but differ significantly between orientations. Our prototype embeds a total of 32 light sensors along the four sides and the back of an iPod Touch, and uses a support vector machine (SVM) to recognize grasps at 25 Hz. We collected usage data from 6 users under 54 conditions (3 grasps × 3 tasks × 3 orientations × 2 postures): 1) grasping the device with the left hand, the right hand, or both hands, 2) scrolling, zooming, or typing, 3) in portrait, landscape-left, or landscape-right orientation, and 4) while sitting or lying down on one side. Results show that our grasp-based approach is promising: the iGrasp prototype correctly rotated the screen 86.7–90.5% of the time when training and testing on different users. [en]
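The abstract says grasps are classified with LIBSVM (reference 7 in the citation list); as a rough, hypothetical sketch of that pipeline, the snippet below trains scikit-learn's LIBSVM-backed SVC on labeled 32-value sensor frames and maps new frames to a viewing orientation. The data, labels, and kernel choice are placeholders, not the authors' configuration.

    import numpy as np
    from sklearn.svm import SVC  # scikit-learn's SVC wraps LIBSVM

    ORIENTATIONS = ["portrait", "landscape-left", "landscape-right"]

    # Placeholder training data: one row per 25 Hz sample, 32 columns for the
    # 32 light sensors along the four sides and the back of the device.
    rng = np.random.default_rng(0)
    X_train = rng.random((540, 32))               # fake sensor readings
    y_train = rng.choice(ORIENTATIONS, size=540)  # fake orientation labels

    clf = SVC(kernel="rbf")  # kernel is an assumption; the record does not say
    clf.fit(X_train, y_train)

    def classify_grasp(sensor_frame):
        """Map one 32-value sensor frame to a predicted screen orientation."""
        return clf.predict(np.asarray(sensor_frame).reshape(1, -1))[0]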
dc.description.provenance: Made available in DSpace on 2021-06-16T16:36:04Z (GMT). No. of bitstreams: 1. ntu-101-R98922168-1.pdf: 6317207 bytes, checksum: 6a7cb8b83a78359a01e1d1d9354e1abf (MD5). Previous issue date: 2012. [en]
dc.description.tableofcontents:
口試委員會審定書 (Oral examination committee certification) ... i
誌謝 (Acknowledgements) ... ii
中文摘要 (Chinese abstract) ... iii
ABSTRACT ... iv
CONTENTS ... vi
LIST OF FIGURES ... viii
LIST OF TABLES ... x
Chapter 1 Introduction ... 1
Chapter 2 Related Work ... 7
2.1 Screen Rotation ... 7
2.2 Face Detection Applications on Mobile Devices ... 8
2.3 Grasp Sensing ... 8
2.4 Grasp-based User Interface ... 9
Chapter 3 Survey: Auto-rotation in the wild ... 10
3.1 When and How Incorrect Rotation Occurs ... 10
3.2 Awareness and Usability of Rotation Lock ... 11
Chapter 4 User Study on Face Detection ... 13
4.1 Device ... 13
4.2 Experiment ... 14
4.3 Design ... 14
4.3.1 Orientation Threshold ... 15
4.4 Implementation ... 16
4.5 Detection Performance ... 17
4.6 Feasibility Analysis ... 20
Chapter 5 User Study on Grasp Sensing ... 23
5.1 Feasibility Study ... 23
5.2 Prototyping ... 23
5.3 Recognizing Grasp Orientation ... 26
5.4 Evaluation ... 27
5.4.1 Data Collection ... 27
5.4.2 Recognition Accuracy ... 28
5.4.3 Online Evaluation ... 30
Chapter 6 Discussion ... 32
6.1 Orientation Detection based on Partial Faces ... 32
6.2 Limitation ... 32
6.3 Unusual Grasps and Personalized Learning ... 33
6.4 Sensing Grasps ... 33
6.5 Determining User Posture ... 35
Chapter 7 Conclusion ... 36
REFERENCE ... 38
dc.language.iso: en
dc.subject: 螢幕自動旋轉 (automatic screen rotation) [zh_TW]
dc.subject: 裝置方向 (device orientation) [zh_TW]
dc.subject: 握法辨識 (grasp recognition) [zh_TW]
dc.subject: 人臉辨識 (face detection) [zh_TW]
dc.subject: 行動裝置 (mobile devices) [zh_TW]
dc.subject: Device orientation [en]
dc.subject: Face Detection [en]
dc.subject: Mobile Devices [en]
dc.subject: Grasp Recognition [en]
dc.subject: Automatic Screen Rotation [en]
dc.title: iRotate:智慧型使用者螢幕方向感測系統 (iRotate: an intelligent user screen-orientation sensing system) [zh_TW]
dc.title: iRotate: An Intelligent Automatic Screen Rotation System [en]
dc.type: Thesis
dc.date.schoolyear: 101-1
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 王浩全 (Hao-Chuan Wang), 鄧怡莘 (Yi-Shin Deng), 許聞廉 (Wen-Lian Hsu)
dc.subject.keyword: 螢幕自動旋轉, 裝置方向, 握法辨識, 人臉辨識, 行動裝置 [zh_TW]
dc.subject.keyword: Automatic Screen Rotation, Grasp Recognition, Device orientation, Face Detection, Mobile Devices [en]
dc.relation.page: 42
dc.rights.note: 有償授權 (fee-based authorization)
dc.date.accepted: 2012-10-23
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science) [zh_TW]
dc.contributor.author-dept: 資訊工程學研究所 (Graduate Institute of Computer Science and Information Engineering) [zh_TW]
Appears in collections: 資訊工程學系 (Department of Computer Science and Information Engineering)

Files in this item:

File: ntu-101-1.pdf (restricted; not authorized for public access)
Size: 6.17 MB
Format: Adobe PDF


Except where copyright terms are otherwise indicated, all items in this repository are protected by copyright, with all rights reserved.
