Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/51329
Full Metadata Record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 康仕仲(Shih-Chung Kang) | |
dc.contributor.author | Ming-Chang Wen | en |
dc.contributor.author | 溫明璋 | zh_TW |
dc.date.accessioned | 2021-06-15T13:30:42Z | - |
dc.date.available | 2018-03-08 | |
dc.date.copyright | 2016-03-08 | |
dc.date.issued | 2016 | |
dc.date.submitted | 2016-02-03 | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/51329 | - |
dc.description.abstract | This research develops a telepresence system to address the shortcomings of the human-machine interfaces of existing remote-control systems. The visual interfaces of current remote-control systems mostly use two-dimensional images; the lack of depth information easily leads to poor awareness of the remote environment, the operator can judge the speed and size of objects in the image, and the distances between them, only in relative terms, and this imprecise spatial sense creates safety concerns. In addition, for remote view control, current systems apply multi-degree-of-freedom control to a single viewpoint at a fixed position, which causes latency and disorientation; in practice, operating efficiency therefore depends heavily on operator experience. Accordingly, this research developed a telepresence system that applies stereoscopic vision and motion tracking to improve distance perception and the control interface. A sensor module matched to the perspective of the human eyes was developed to capture binocular images, which are displayed to the operator through a head-mounted display after wireless transmission and image processing. An inertial measurement sensor embedded in the head-mounted display tracks the rotation of the operator's head about each axis in real time and synchronizes it with the servo gimbal carrying the stereoscopic vision sensors. Through three-dimensional stereoscopic visual feedback and head-rotation view control, the system provides the ability to judge depth and simplifies the degrees of freedom of the control interface, reducing misjudgments caused by image distortion and control complexity, thereby lowering risk and improving efficiency. For validation, this research designed an experiment in which five subjects repeatedly performed specific tasks, comparing an existing system with the developed system to simulate actual operating scenarios. The results show that, compared with the current system, the developed system significantly reduced the distance-judgment error of four operators, by 66.84%, 47.75%, 33.41%, and 40.58% respectively, and significantly shortened the task time of four operators, by 19.24%, 23.86%, 22.63%, and 0.93% respectively. In conclusion, the developed system improves the operator's awareness of the remote environment and operating efficiency, thereby improving task performance and reducing accidents caused by human misjudgment. | zh_TW |
dc.description.abstract | This research develops a telepresence system aimed at reducing visual distortion of the remote environment. Most teleoperation applications take place in remote environments that are unstructured and not precisely known a priori, so fully autonomous control of an unmanned system is infeasible in such cases. To bring human intelligence to bear on the task and cope with uncertainty, an improved telepresence system that closes the gap between machine and operator is essential. Current methods mainly rely on a two-dimensional (2D) image covering a restricted field of view (FOV) as the visual aid and on numeric onboard sensor readings as references for the operator. As a result, significant sensory cues are lost, including ambient visual information, depth information, and kinesthetic input. Moreover, in cases where an unmanned vehicle system (UVS) is used to inspect fully unknown areas, sensor information is not usable until it has been analyzed or processed. Instead of sending multiple sensors into the field with unmanned vehicles and retrieving a large amount of data for time-consuming post-analysis, an instinctive, sensory method for quickly understanding and reconstructing the onsite situation is needed.
The major effort of this research is to take advantage of state-of-the-art three-dimensional (3D) input/output technologies and develop an avatar-like mechanism that synchronizes with the physical behavior of an operator. Two 3D input/output methods are used in this research: stereoscopic vision and motion tracking. The system is designed to work as a closed-loop control with human feedback involved. Two cameras whose optics are matched to the angle of view (AOV) of the human eyes, to accommodate the human perspective, serve as the stereoscopic vision input on the unmanned vehicle. The stereoscopic image is streamed back to the operator through a radio downlink in real time. On the operator end, a head-mounted display (HMD) displays the stereoscopic image and tracks the operator's head movement with embedded sensors. The head-tracking data are interpreted into control signals and returned to the unmanned vehicle to drive a three-axis gimbal on which the two cameras are installed; the loop is then closed by synchronizing the operator's and the cameras' FOV. The Teleyes system was validated in a designed experimental application scenario in which five operators compared it with current methods. The results show that Teleyes significantly reduced the distance error of four test operators by 66.84%, 47.75%, 33.41%, and 40.58%, and significantly reduced the time usage of four test operators by 19.24%, 23.86%, 22.63%, and 0.93%. In conclusion, the Teleyes system improves the visual experience and operating efficiency, with potential for saving resources and expanding applications: it provides the operator with an immersive first-person view (FPV), a visual experience as if being onboard. | en
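The abstract describes a closed control loop in which head-orientation readings from the HMD's inertial sensor are converted into angle commands for the three-axis camera gimbal. A minimal sketch of that conversion step is given below; it is an illustration, not the thesis's actual implementation, and the axis names, mechanical limits, and smoothing factor are assumptions chosen for the example.

```python
# Illustrative sketch: map HMD head-tracking angles to three-axis
# gimbal commands. Limits and smoothing factor are assumed values,
# not taken from the thesis.

# Assumed mechanical range of each gimbal axis, in degrees.
GIMBAL_LIMITS = {"yaw": (-170.0, 170.0), "pitch": (-90.0, 30.0), "roll": (-45.0, 45.0)}

def clamp(value, low, high):
    """Keep a commanded angle within the gimbal's mechanical range."""
    return max(low, min(high, value))

def head_to_gimbal(head_angles, previous=None, alpha=0.8):
    """Convert head-tracking angles (degrees) to gimbal angle commands.

    When a previous command is available, an exponential low-pass
    filter (weight `alpha` on the new target) suppresses sensor jitter
    so small involuntary head movements do not shake the cameras.
    """
    command = {}
    for axis, angle in head_angles.items():
        low, high = GIMBAL_LIMITS[axis]
        target = clamp(angle, low, high)
        if previous is not None:
            target = alpha * target + (1 - alpha) * previous[axis]
        command[axis] = target
    return command
```

For example, a head yaw of 200 degrees would be clamped to the assumed 170-degree limit before being sent to the gimbal, keeping the command within the mechanism's range while the operator's and cameras' fields of view stay synchronized.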
dc.description.provenance | Made available in DSpace on 2021-06-15T13:30:42Z (GMT). No. of bitstreams: 1 ntu-105-R02521609-1.pdf: 18149376 bytes, checksum: 43a47e916e81deade3bc5938b350dd70 (MD5) Previous issue date: 2016 | en |
dc.description.tableofcontents | Abstract (Chinese) i
Abstract ii
Contents iv
List of Figures vi
List of Tables vii
1 Introduction 1
2 Literature Review 4
2.1 Teleoperated Construction Systems 4
2.2 Telepresence Systems of UVS 5
2.3 Human-Machine Interface 7
2.4 Visual Aid and Perception 8
2.5 User Operation 9
3 Research Goal 10
4 Teleyes Methods 11
4.1 Control Method 11
4.2 Motion Tracking 13
4.3 Stereoscopic Vision 14
5 Teleyes Implementation 19
5.1 System Architecture 19
5.2 Stereoscopic Vision 23
5.3 Data Transmission 24
5.4 Head Tracking 25
6 Validation 27
6.1 Design of Experiment 27
6.2 Result 33
6.3 Discussion 35
7 Conclusion 38
Appendix A Raw Data of Experiment 39
Bibliography 45 | |
dc.language.iso | en | |
dc.title | A Telepresence System Based on Stereoscopic Vision and Head Motion Tracking | zh_TW |
dc.title | Teleyes: a Telepresence System based on Stereoscopic Vision and Head Motion Tracking | en |
dc.type | Thesis | |
dc.date.schoolyear | 104-1 | |
dc.description.degree | Master's | |
dc.contributor.oralexamcommittee | 張國鎮(Kuo-Chun Chang),羅仁權(Ren Luo),賴維祥(Wei-Hsiang Lai),劉寅春(Peter Liu) | |
dc.subject.keyword | telepresence, stereoscopic vision, motion tracking, teleoperation, human-machine interface | zh_TW |
dc.subject.keyword | telepresence, stereoscopic vision, motion tracking, teleoperation, human-machine interface | en |
dc.relation.page | 52 | |
dc.rights.note | Paid authorization | |
dc.date.accepted | 2016-02-03 | |
dc.contributor.author-college | College of Engineering | zh_TW |
dc.contributor.author-dept | Graduate Institute of Civil Engineering | zh_TW |
Appears in Collections: | Department of Civil Engineering
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-105-1.pdf (currently not authorized for public access) | 17.72 MB | Adobe PDF |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.