NTU Theses and Dissertations Repository › College of Electrical Engineering and Computer Science › Department of Computer Science and Information Engineering
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/61913

Full metadata record

DC Field | Value | Language
dc.contributor.advisor: 洪一平 (Yi-Ping Hung)
dc.contributor.author: Chia-Wei Hsu [en]
dc.contributor.author: 許家瑋 [zh_TW]
dc.date.accessioned: 2021-06-16T13:18:27Z
dc.date.available: 2017-07-31
dc.date.copyright: 2013-07-31
dc.date.issued: 2013
dc.date.submitted: 2013-07-29
dc.identifier.citation[1]Seifried, T., Rendl, C., Perteneder, F., Leitner, J., Haller, M., Sakamoto, D., & Scott,S. D. (2009, August). CRISTAL, control of remotely interfaced systems using touch-based actions in living spaces. In ACM SIGGRAPH 2009 Emerging Technologies (p. 6). ACM.
[2]Eun, D., Rhee, T. H., Kang, S., Choi, M., Lee, S., & Kim, H. J. (2011). Virtual Bridge: AR-Based Mobile Interaction for Easy Multimedia Control of Remote Home Devices. In HCI International 2011–Posters’ Extended Abstracts (pp. 102-106). Springer Berlin Heidelberg.
[3]Flynn, C. (2011). Visualising and Interacting with a CAVE using Real-World Sensor Data. In Irish HCI 2011, 8th-9th September 2011, Cork
[4]Vlahakis, V., Ioannidis, M., Karigiannis, J., Tsotros, M., Gounaris, M., Stricker, D., Gleue, T., Daehne, P., & Almeida, L. (2002). Archeoguide: An augmented reality guide for archaeological sites. Computer Graphics and Applications, IEEE, 22(5), 52-60.
[5]Trahanias, P., and Argyros, A. (2000). TOURBOT: Interactive Museum Telepresence through Robotic Avatars. International World Wide Web Conference Culture Track, Session A-2: Museums on the Web - Case Study, Organizer: A.M. Ronchi, Amsterdam, Netherlands.
[6]Hwang, J., Jung, J., & Kim, G. J. (2006, November). Hand-held virtual reality: a feasibility study. In Proceedings of the ACM symposium on Virtual reality software and technology (pp. 356-363). ACM.
[7]Joshi, N., Kar, A., & Cohen, M. (2012, May). Looking At You: fused gyro and face tracking for viewing large imagery on mobile devices. In Proceedings of the 2012 ACM annual conference on Human Factors in Computing Systems (pp. 2211-2220). ACM.
[8]Hinckley, K., Pierce, J., Sinclair, M., & Horvitz, E. (2000, November). Sensing techniques for mobile interaction. In Proceedings of the 13th annual ACM symposium on User interface software and technology (pp. 91-100). ACM.
[9]Wang, J., Zhai, S., & Canny, J. (2006, October). Camera phone based motion sensing: interaction techniques, applications and performance study. In Proceedings of the 19th annual ACM symposium on User interface software and technology (pp. 101-110). ACM.
[10]Hachet, M., Pouderoux, J., & Guitton, P. (2005, April). A camera-based interface for interaction with mobile handheld computers. In Proceedings of the 2005 symposium on Interactive 3D graphics and games (pp. 65-72). ACM.
[11]Sankar, A., & Seitz, S. (2012, October). Capturing indoor scenes with smartphones. In Proceedings of the 25th annual ACM symposium on User interface software and technology (pp. 403-412). ACM.
[12]Fan, M., Patterson, D., & Shi, Y. (2012, September). When camera meets accelerometer: a novel way for 3d interaction of mobile phone. In Proceedings of the 14th international conference on Human-computer interaction with mobile devices and services companion (pp. 131-136). ACM.
[13]Boring, S., Baur, D., Butz, A., Gustafson, S., & Baudisch, P. (2010, April). Touch projector: mobile interaction through video. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 2287-2296). ACM.
[14]Wagner, J., Huot, S., & Mackay, W. (2012, May). Bitouch and bipad: Designing bimanual interaction for hand-held tablets. In Proceedings of the 2012 ACM annual conference on Human Factors in Computing Systems (pp. 2317-2326). ACM.
[15]Campbell, J., Sukthankar, R., Nourbakhsh, I., & Pahwa, A. (2005, April). A robust visual odometry and precipice detection system using consumer-grade monocular vision. In Robotics and Automation, 2005. ICRA 2005. Proceedings of the 2005 IEEE International Conference on (pp. 3421-3427). IEEE.
[16]Shafie, A. A., Hafiz, F., & Ali, M. H. (2009). Motion detection techniques using optical flow. World Academy of Science, Engineering and Technology, 56, 559-561.
[17]Baraldi, P., De Micheli, E., & Uras, S. (1989, September). Motion and depth from optical flow. In 5th Alvey Vision Conference, Reading, UK.
[18]Kitani, K. M., Okabe, T., Sato, Y., & Sugimoto, A. (2011, June). Fast unsupervised ego-action learning for first-person sports videos. In Computer Vision and Pattern Recognition (CVPR), 2011 IEEE Conference on (pp. 3241-3248). IEEE.
[19]Tuck, K. (2007). Tilt sensing using linear accelerometers. Freescale Semiconductor Application Note AN3107.
[20]Ouilhet, H. (2010, September). Google Sky Map: using your phone as an interface. In Proceedings of the 12th international conference on Human computer interaction with mobile devices and services (pp. 419-422). ACM.
[21]Ayub, S., Heravi, B. M., Bahraminasab, A., & Honary, B. (2012, September). Pedestrian Direction of Movement Determination using Smartphone. In Next Generation Mobile Applications, Services and Technologies (NGMAST), 2012 6th International Conference on (pp. 64-69). IEEE.
[22]Android Developer Website. Available at: http://developer.android.com/ [accessed 26 June 2013].
[23]Breiman, L. (2001). Random forests. Machine Learning, 45(1), 5-32.
[24]Mladenov, M., & Mock, M. (2009, June). A step counter service for Java-enabled devices using a built-in accelerometer. In Proceedings of the 1st international workshop on context-aware middleware and services: affiliated with the 4th international conference on communication system software and middleware (COMSWARE 2009) (pp. 1-5). ACM.
[25]Butterworth, S. (1930). On the theory of filter amplifiers. Wireless Engineer, 7, 536-541.
[26]Kim, H. L., Kim, D. H., Ryu, Y. S., & Kim, Y. K. (1996, November). A study on pitch detection using the local peak and valley for Korean speech recognition. In TENCON'96. Proceedings. 1996 IEEE TENCON. Digital Signal Processing Applications (Vol. 1, pp. 107-112). IEEE.
[27]Lu, Z., Luo, W., Sun, Z., Ben-Ezra, M., & Brown, M. S. (2012). Imaging buddhist art with a digital large-format camera: A field study report from the dunhuang caves. Journal on Computing and Cultural Heritage (JOCCH), 5(3), 9.
[28]Bay, H., Tuytelaars, T., & Van Gool, L. (2006). Surf: Speeded up robust features. In Computer Vision–ECCV 2006 (pp. 404-417). Springer Berlin Heidelberg.
[29]Mattern, F. (2000). State of the art and future trends in distributed systems and ubiquitous computing. Vontobel TeKnoBase.
[30]Lee, Y., Oh, S., Shin, C., & Woo, W. (2008, July). Recent trends in ubiquitous virtual reality. In Ubiquitous Virtual Reality, 2008. ISUVR 2008. International Symposium on (pp. 33-36). IEEE.
[31]Bowman, D. A., Gabbard, J. L., & Hix, D. (2002). A survey of usability evaluation in virtual environments: classification and comparison of methods. Presence: Teleoperators & Virtual Environments, 11(4), 404-424.
[32]Bowman, D. A., & Hodges, L. F. (1997, April). An evaluation of techniques for grabbing and manipulating remote objects in immersive virtual environments. In Proceedings of the 1997 symposium on Interactive 3D graphics (pp. 35-ff). ACM.
[33]Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence questionnaire. Presence, 7(3), 225-240.
[34]Kalman, R. E. (1960). A new approach to linear filtering and prediction problems. Journal of basic Engineering, 82(1), 35-45.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/61913
dc.description.abstract: In this thesis, we propose a technique for interacting with spaces through mobile devices, allowing users to interact with a space intuitively. Such interaction takes many forms: most commonly, directly controlling objects in the space (e.g., appliances), retrieving information from the space, or coming to know the space better through the interaction itself. To achieve intuitive operation, the technique must support first-person navigation of the space, so we combine optical-flow features from the device camera with orientation and rotation readings from its sensors to detect and track the user's own motion and viewpoint changes. To make the experience immersive, we further propose two ways of rendering the space on the device screen: a 2D method, which builds the virtual space from a 720-degree spherical panorama, and a 3D method, which builds it from a 3D model of the space. With this tracking and rendering, the virtual space preserves the real world's sense of space, and navigation within it is driven by the user's own body motion. We then apply the technique to two categories of use: unfamiliar spaces and familiar spaces. For unfamiliar spaces we implemented "Dream of Dunhuang", a remote guided-tour system that uses a high-resolution tablet and motion-based interaction to immerse users in a virtual Dunhuang cave. For familiar spaces we implemented the "Mobile Central Control Room" and the "Smart Home Control Interface", both fast, intuitive, and portable control interfaces. With the former, security staff away from the control room can still carry out monitoring intuitively, relying on their spatial memory and sense of the room; with the latter, users away from home can still operate their appliances as intuitively as if they were there. Finally, we compare the proposed interface with traditional interfaces using immersion, intuitiveness, completion time, and error rate as criteria; the experimental results show that our method outperforms the other interfaces. [zh_TW]
dc.description.abstract: In this thesis, we present TelePort, a technique that builds on users' spatial perception to let them interact intuitively with spaces using mobile devices. The interaction is versatile: directly controlling or manipulating objects in a space, acquiring information from it, touring it, and so on. To provide intuitive interfaces, we support first-person navigation during the interaction with the space. We therefore combine an optical cue and an orientation cue, extracted respectively from the mobile device's built-in camera and sensors, to estimate the user's motion and viewpoint. To immerse the user seamlessly in the virtual space, we provide two methods for visualizing it on the device screen: a 2D method, in which we build a 720-degree spherical panorama to reconstruct the scene, and a 3D method, in which we build a 3D model of the space for spatial representation. Based on the proposed estimation and visualization methods, the user's real-world spatial perception is preserved in the virtual space, and navigation there mirrors the user's body motion. Moreover, we apply the technique to two categories of applications: unfamiliar spaces and familiar spaces. For unfamiliar spaces, we demonstrate "Dream of Dunhuang", a remote guide system that makes visitors feel as if they were inside a Dunhuang cave using only a tablet. For familiar spaces, we demonstrate "Mobile Central Control Room for Surveillance" and "Smart Home Control Interface", fast, intuitive, and portable interfaces that let users revisit a central control room or their own house from anywhere and interact with the objects there based on their spatial memory. Finally, we compare our method with other traditional interfaces for environment interaction, evaluating immersion and presence, completion time, and error rate. The experimental results show that our interface outperforms the other methods. [en]
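The abstract's optical cue — detecting forward/backward (z-axis) movement from the camera's optical-flow field (cf. Section 4.3 in the table of contents) — can be sketched as follows. This is a minimal illustration under the common assumption that moving forward produces a radially expanding flow field; the function name, threshold, and synthetic data are ours, not the thesis's actual implementation.

```python
import numpy as np

def classify_z_motion(flow, threshold=0.1):
    """Classify camera motion along the viewing (z) axis.

    flow: (H, W, 2) array of per-pixel optical-flow vectors (dx, dy).
    Moving forward makes the flow expand radially from the image centre;
    moving backward makes it contract. We measure the mean projection of
    each flow vector onto its outward radial direction and threshold it.
    Returns 'forward', 'backward', or 'still'.
    """
    h, w = flow.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    # Outward radial unit vectors from the image centre.
    rx, ry = xs - w / 2.0, ys - h / 2.0
    norm = np.sqrt(rx**2 + ry**2) + 1e-9
    rx, ry = rx / norm, ry / norm
    # Mean radial component of the flow (positive = expansion).
    score = (flow[..., 0] * rx + flow[..., 1] * ry).mean()
    if score > threshold:
        return "forward"
    if score < -threshold:
        return "backward"
    return "still"

# Synthetic expanding flow field, as a forward step would produce.
h, w = 64, 64
ys, xs = np.mgrid[0:h, 0:w].astype(float)
expanding = np.dstack([(xs - w / 2) * 0.05, (ys - h / 2) * 0.05])
print(classify_z_motion(expanding))            # forward
print(classify_z_motion(-expanding))           # backward
print(classify_z_motion(np.zeros((h, w, 2))))  # still
```

In practice the flow field would come from a dense optical-flow estimator on consecutive camera frames, and the decision would be fused with the accelerometer/orientation cue the abstract describes.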
dc.description.provenance: Made available in DSpace on 2021-06-16T13:18:27Z (GMT). No. of bitstreams: 1. ntu-102-R00922025-1.pdf: 5998470 bytes, checksum: cb4a310f665c94e9426b1b860f45cdca (MD5). Previous issue date: 2013 [en]
dc.description.tableofcontents: 口試委員會審定書 (Committee Certification) #
誌謝 (Acknowledgements) i
摘要 (Chinese Abstract) ii
Abstract iv
Contents vi
List of Figures ix
List of Tables xii
Chapter 1 Introduction 1
Chapter 2 Related Work 4
2.1 Intuitive Interaction and Navigation in Space 4
2.2 Sensor-Based Control Interfaces 7
2.3 Motion Estimation Using Optical Flow 9
Chapter 3 System Overview 11
Chapter 4 Motion Estimation for Mobile Devices 14
4.1 Sensors in Mobile Devices 15
4.2 Estimating Orientation of Mobile Devices 16
4.3 Detecting Forward/Backward Movement 23
4.3.1 Detection of Z-Axis Motion 24
4.3.2 Classification of Optical Flow 25
4.3.3 Algorithm 26
4.3.4 Evaluation 28
4.4 Estimating the Number of Steps 29
4.4.1 Algorithm 29
4.4.2 Evaluation 32
Chapter 5 Visualization 34
5.1 2D Visualization 34
5.2 3D Visualization 37
Chapter 6 Applications 40
6.1 Applications to Unfamiliar Space 40
6.1.1 Dream of Dunhuang 41
6.1.1.1 System Architecture 42
6.1.1.2 Multimedia Content Production 46
6.1.1.3 Pattern Matching for Gateways 47
6.2 Applications to Familiar Space 49
6.2.1 Smart Home Control Interface 51
6.2.2 Mobile Central Control Room for Surveillance 52
Chapter 7 Experiments 54
7.1 User Study 1: View Manipulation Using TelePort vs. Touch Screen (TS) 55
7.1.1 Space Selection for Experiment 55
7.1.2 Interfaces 56
7.1.3 Tasks 57
7.1.4 Design and Participants 59
7.1.5 Hypothesis 60
7.1.6 Result 61
7.1.7 Observation and Discussion 63
7.2 User Study 2: TelePort vs. Traditional Interfaces 65
7.2.1 Interfaces 65
7.2.2 Tasks 66
7.2.3 Design and Participants 66
7.2.4 Hypothesis 67
7.2.5 Result 67
7.2.6 Observation and Discussion 68
Chapter 8 Conclusion and Future Work 69
References 71
dc.language.iso: en
dc.subject: 行動 [zh_TW]
dc.subject: 瀏覽 [zh_TW]
dc.subject: 人機介面 [zh_TW]
dc.subject: 空間感 [zh_TW]
dc.subject: 空間記憶 [zh_TW]
dc.subject: Mobile [en]
dc.subject: Navigation [en]
dc.subject: HCI [en]
dc.subject: Spatial Perception [en]
dc.subject: Spatial Memory [en]
dc.title: 任意門:運用行動裝置與空間互動之技術 [zh_TW]
dc.title: TelePort: A Technique for Interacting with Space Using Mobile Devices [en]
dc.type: Thesis
dc.date.schoolyear: 101-2
dc.description.degree: 碩士 (Master)
dc.contributor.oralexamcommittee: 張智星 (Jyh-Shing Jang), 李明穗 (Ming-Sui Lee), 余孟杰 (Meng-Chieh Yu), 黃俊翔 (Chun-Hsiang Huang)
dc.subject.keyword: 行動, 瀏覽, 人機介面, 空間感, 空間記憶 [zh_TW]
dc.subject.keyword: Mobile, Navigation, HCI, Spatial Perception, Spatial Memory [en]
dc.relation.page: 75
dc.rights.note: 有償授權 (paid authorization)
dc.date.accepted: 2013-07-29
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science) [zh_TW]
dc.contributor.author-dept: 資訊工程學研究所 (Graduate Institute of Computer Science and Information Engineering) [zh_TW]
Appears in Collections: 資訊工程學系 (Department of Computer Science and Information Engineering)

Files in This Item:
ntu-102-1.pdf — restricted (not publicly available) — 5.86 MB — Adobe PDF


All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
