NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/94055

Full metadata record:
dc.contributor.advisor: 陳伶志 (zh_TW)
dc.contributor.advisor: Ling-Jyh Chen (en)
dc.contributor.author: 何珮語 (zh_TW)
dc.contributor.author: Pei-Yu Ho (en)
dc.date.accessioned: 2024-08-14T16:27:55Z
dc.date.available: 2024-08-15
dc.date.copyright: 2024-08-13
dc.date.issued: 2024
dc.date.submitted: 2024-08-09
dc.identifier.citation: [1] Robert V. Levine and Ara Norenzayan. The Pace of Life in 31 Countries. Journal of Cross-Cultural Psychology, 30(2):178–205, March 1999.
[2] Marc H. Bornstein and Helen G. Bornstein. The pace of life. Nature, 259(5544):557–559, February 1976.
[3] Marek Franěk. Environmental Factors Influencing Pedestrian Walking Speed. Perceptual and Motor Skills, 116(3):992–1019, June 2013.
[4] Catrine Tudor-Locke and David A. Rowe. Using Cadence to Study Free-Living Ambulatory Behaviour. Sports Medicine, 42(5):381–398, May 2012.
[5] Xuan Wang, Guoliang Chen, Xiaoxiang Cao, Zhenghua Zhang, Mengyi Yang, and Saizhou Jin. Robust and Accurate Step Counting Based on Motion Mode Recognition for Pedestrian Indoor Positioning Using a Smartphone. IEEE Sensors Journal, 22(6):4893–4907, March 2022.
[6] Ying Xu, Guofeng Li, Zeyu Li, Hao Yu, Jianhui Cui, Jin Wang, and Yu Chen. Smartphone-Based Unconstrained Step Detection Fusing a Variable Sliding Window and an Adaptive Threshold. Remote Sensing, 14(12):2926, January 2022.
[7] Xin Liu, Ning Li, Geng Xu, and Yonggang Zhang. A Novel Robust Step Detection Algorithm for Foot-Mounted IMU. IEEE Sensors Journal, 21(4):5331–5339, February 2021.
[8] Apratim Shrivastav, Utkarsh Kuchhal, Samarth Singhal, Tanmay Jain, Divyashikha Sethia, Vidushi Chaudhary, Rajiv Nigam, and Ashish Kumar Namdeo. Optimised Algorithm for Step Count Estimation Using Sensor Data from Smartphones and Wearables. In 2022 IEEE Region 10 Symposium (TENSYMP), pages 1–6, July 2022.
[9] XiangChen Wu, Xiaoqin Zeng, Xiaoxiang Lu, and Keman Zhang. Step detection in complex walking environments based on continuous wavelet transform. Multimedia Tools and Applications, 83(12):1–25, May 2023.
[10] Warnnaphorn Suksuganjana, Seksan Laitrakun, Krit Athikulwongse, Yuko Hara-Azumi, and Somrudee Deepaisarn. Improved Step Detection with Smartphone Handheld Mode Recognition. In 2021 13th International Conference on Knowledge and Smart Technology (KST), pages 55–60. IEEE, January 2021.
[11] G. Prigent, E. Barthelet, K. Aminian, and A. Paraschiv-Ionescu. Walking and running cadence estimation using a single trunk-fixed accelerometer for daily physical activities assessment. In 2022 44th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), pages 3645–3648, July 2022.
[12] Marta Karas, Jacek K. Urbanek, Vittorio P. Illiano, Guy Bogaarts, Ciprian M. Crainiceanu, and Jonas F. Dorn. Estimation of free-living walking cadence from wrist-worn sensor accelerometry data and its association with SF-36 quality of life scores. Physiological Measurement, 42(6):065006, June 2021.
[13] Shehroz S. Khan and Ali Abedi. Step Counting with Attention-based LSTM. In 2022 IEEE Symposium Series on Computational Intelligence (SSCI), pages 559–566. IEEE, December 2022.
[14] Stef Vandermeeren and Heidi Steendam. Deep-Learning-Based Step Detection and Step Length Estimation With a Handheld IMU. IEEE Sensors Journal, 22(24):24205–24221, February 2022.
[15] Long Luu, Arvind Pillai, Halsey Lea, Ruben Buendia, Faisal M. Khan, and Glynn Dennis. Accurate Step Count with Generalized and Personalized Deep Learning on Accelerometer Data. Sensors, 22(11):3989, January 2022.
[16] Yuyang Qian, Kaiming Yang, Yu Zhu, Wei Wang, and Chenhui Wan. Combining deep learning and model-based method using Bayesian Inference for walking speed estimation. Biomedical Signal Processing and Control, 62:102117, September 2020.
[17] Zihan Song, Hye-Jin Park, Ngeemasara Thapa, Ja-Gyeong Yang, Kenji Harada, Sangyoon Lee, Hiroyuki Shimada, Hyuntae Park, and Byung-Kwon Park. Carrying Position-Independent Ensemble Machine Learning Step-Counting Algorithm for Smartphones. Sensors, 22(10):3736, January 2022.
[18] Parastoo Alinia, Ramin Fallahzadeh, Christopher P. Connolly, and Hassan Ghasemzadeh. ParaLabel: Autonomous Parameter Learning for Cross-Domain Step Counting in Wearable Sensors. IEEE Sensors Journal, 20(23):13867–13879, February 2020.
[19] Aawesh Shrestha and Myounggyu Won. DeepWalking: Enabling Smartphone-Based Walking Speed Estimation Using Deep Learning. In 2018 IEEE Global Communications Conference (GLOBECOM), pages 1–6, February 2018.
[20] Kun Qian, Chenshu Wu, Zheng Yang, Yunhao Liu, and Kyle Jamieson. Widar: Decimeter-Level Passive Tracking via Velocity Monitoring with Commodity Wi-Fi. In Proceedings of the 18th ACM International Symposium on Mobile Ad Hoc Networking and Computing, pages 1–10, July 2017.
[21] Yang Xu, Wei Yang, Jianxin Wang, Xing Zhou, Hong Li, and Liusheng Huang. WiStep: Device-free Step Counting with WiFi Signals. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol., 1(4):172:1–172:23, January 2018.
[22] Jing He and Wei Yang. IMep: Device-Free Multiplayer Step Counting With WiFi Signals. IEEE Transactions on Mobile Computing, 22(10):5887–5899, October 2023.
[23] Yohanna MejiaCruz, Juan M. Caicedo, Zhaoshuo Jiang, and Jean M. Franco. Probabilistic Estimation of Cadence and Walking Speed From Floor Vibrations. IEEE Journal of Translational Engineering in Health and Medicine, 12:508–519, June 2024.
[24] Dan Jiao and Teng Fei. Pedestrian walking speed monitoring at street scale by an in-flight drone. PeerJ Computer Science, 9:e1226, January 2023.
[25] Chien-Yao Wang, Alexey Bochkovskiy, and Hong-Yuan Mark Liao. Yolov7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. In 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pages 7464–7475, June 2023.
[26] Alex Bewley, Zongyuan Ge, Lionel Ott, Fabio Ramos, and Ben Upcroft. Simple Online and Realtime Tracking. In 2016 IEEE International Conference on Image Processing (ICIP), pages 3464–3468, September 2016.
[27] Walter Pirker and Regina Katzenschlager. Gait disorders in adults and the elderly: A clinical guide. Wiener Klinische Wochenschrift, 129(3-4):81–95, February 2017.
[28] Stuart P. Lloyd. Least squares quantization in PCM. IEEE Transactions on Information Theory, 28(2):129–137, March 1982.
[29] Robert L. Thorndike. Who belongs in the family? Psychometrika, 18(4):267–276, December 1953.
[30] 行政院主計總處 (Directorate-General of Budget, Accounting and Statistics, Executive Yuan). 111年家庭收支調查報告 (Report on the Survey of Family Income and Expenditure, 2022). October 2023.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/94055
dc.description.abstract (zh_TW): 在都市規劃、公共安全與經濟發展領域中,行人動態分析提供了重要的見解。行人步頻可做為評估城市行人流動性的一個指標,不僅顯示了城市規劃的效果,亦反映出城市生活節奏(pace of life)。本研究旨在運用電腦視覺與訊號處理技術,進行行人步頻探測,探索行人步頻與城市環境之間的關聯性。
本研究方法包含三個模組:即時影像收集、行人特徵擷取、行人步頻估計。即時影像收集模組從多個公共直播攝影機自動化獲取影像數據;行人特徵擷取模組透過You Only Look Once version 7(YOLOv7)模型與Simple Online and Realtime Tracking(SORT)演算法處理影像數據,以此獲得行人特徵時間序列數據;步頻估計模組則運用訊號處理技術分析這些行人特徵時間序列數據,從而估計出行人步頻。這種非侵入式的分析方法允許在不干擾行人自然行為的情況下進行數據收集與分析,從而提高了數據的真實性和可靠性。
本研究應用此方法於多個公開直播攝影機,以估計和分析行人步頻,並分析不同地理位置和地點類型(居住教育、旅遊、商業、交通)之間的行人步頻差異。透過對這多個影像源的行人步頻累積分布函數(cumulative distribution function, CDF)數據進行K-均值聚類(K-Means clustering)分析,探討這些聚類形成的背後成因。此外,本研究還探討了台灣不同城市之間,收入與行人步頻的相關性。總結而言,本研究為城市規劃者和政策制定者提供了見解,旨在促進智慧城市的發展,並提高城市規劃的效率和效能。
dc.description.abstract (en): In the fields of urban planning, public safety, and economic development, pedestrian dynamic analysis provides critical insights. Pedestrian cadence serves as an indicator for assessing urban pedestrian mobility, reflecting not only the effectiveness of urban planning but also the pace of life. This study employs computer vision and signal processing technologies to detect pedestrian cadence, aiming to explore the relationship between pedestrian cadence and urban environments.
The methodology of this study comprises three modules: Real-Time Image Collection, Pedestrian Feature Extraction, and Pedestrian Cadence Estimation. The Real-Time Image Collection Module automatically captures video data from multiple public live cameras; the Pedestrian Feature Extraction Module processes this data using the You Only Look Once version 7 (YOLOv7) model and the Simple Online and Realtime Tracking (SORT) algorithm to acquire time-series data of pedestrian features; the Pedestrian Cadence Estimation Module then analyzes these time series using signal processing techniques to compute pedestrian cadence. This non-invasive analytical approach allows data collection and analysis without disturbing natural pedestrian behavior, thereby enhancing the authenticity and reliability of the data.
The method was applied to several public live cameras to estimate and analyze pedestrian cadence, examining variations in cadence across different geographical locations and location types (living and learning, tourism, business, and traffic). K-Means clustering was then applied to the cumulative distribution function (CDF) data of pedestrian cadence from these locations, and the reasons behind the resulting cluster formations were analyzed. Additionally, this study explored the correlation between income and pedestrian cadence across different cities in Taiwan. In sum, this study provides insights for urban planners and policymakers, aiming to facilitate the development of smart cities and improve the efficiency and effectiveness of urban planning.
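The cadence-estimation step is only summarized in this record, not spelled out. Below is a minimal sketch of one plausible spectral approach, assuming a per-pedestrian feature signal (e.g., a bounding-box measurement sampled from SORT tracks at the camera frame rate) and treating the dominant spectral peak in the walking band as the step frequency; the function name and band limits are illustrative, not taken from the thesis:

```python
import numpy as np

def estimate_cadence(signal, fps, min_hz=1.0, max_hz=3.5):
    """Estimate cadence (steps/min) as the dominant spectral peak
    inside a plausible walking-frequency band."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                      # remove the DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= min_hz) & (freqs <= max_hz)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

# Synthetic check: a noisy 2 Hz gait oscillation sampled at 30 fps for 10 s
rng = np.random.default_rng(0)
fps = 30
t = np.arange(0, 10, 1.0 / fps)
sig = np.sin(2 * np.pi * 2.0 * t) + 0.3 * rng.standard_normal(t.size)
print(estimate_cadence(sig, fps))  # 120.0 steps per minute
```

In practice the per-track signal is short and irregular, so windowing, interpolation over missed detections, and a track-quality check would likely precede the FFT.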
dc.description.provenance (en): Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2024-08-14T16:27:55Z. No. of bitstreams: 0
dc.description.provenance (en): Made available in DSpace on 2024-08-14T16:27:55Z (GMT). No. of bitstreams: 0
dc.description.tableofcontents:
摘要 (Chinese Abstract) ... i
Abstract ... ii
目次 (Table of Contents) ... iv
圖次 (List of Figures) ... vi
表次 (List of Tables) ... vii
第一章 緒論 (Chapter 1: Introduction) ... 1
第二章 文獻回顧 (Chapter 2: Literature Review) ... 4
2.1 主動式測量方法 (Active Measurement Methods) ... 4
2.2 被動式測量方法 (Passive Measurement Methods) ... 6
第三章 研究方法 (Chapter 3: Methodology) ... 8
3.1 即時影像收集模組 (Real-Time Image Collection Module) ... 8
3.2 行人特徵提取模組 (Pedestrian Feature Extraction Module) ... 11
3.3 行人步頻估計模組 (Pedestrian Cadence Estimation Module) ... 12
3.4 方法限制 (Method Limitations) ... 15
第四章 資料集 (Chapter 4: Dataset) ... 16
第五章 研究結果和討論 (Chapter 5: Results and Discussion) ... 23
5.1 步伐計算準確度 (Step-Counting Accuracy) ... 23
5.2 步頻統計數據 (Cadence Statistics) ... 27
5.3 台灣行人步頻聚類分析 (Clustering Analysis of Pedestrian Cadence in Taiwan) ... 28
5.4 多國行人步頻聚類分析 (Clustering Analysis of Pedestrian Cadence across Countries) ... 31
5.5 台灣收入與行人步頻之間的相關性 (Correlation between Income and Pedestrian Cadence in Taiwan) ... 35
5.6 討論 (Discussion) ... 37
第六章 結論 (Chapter 6: Conclusion) ... 40
參考文獻 (References) ... 42
dc.language.iso: zh_TW
dc.subject: 電腦視覺 (zh_TW)
dc.subject: 行人步頻 (zh_TW)
dc.subject: 生活節奏 (zh_TW)
dc.subject: 智慧城市 (zh_TW)
dc.subject: 訊號處理 (zh_TW)
dc.subject: Pedestrian Cadence (en)
dc.subject: Computer Vision (en)
dc.subject: Pace of Life (en)
dc.subject: Smart City (en)
dc.subject: Signal Processing (en)
dc.title: 利用電腦視覺與訊號處理技術於公開直播攝影機進行行人步頻探測 (zh_TW)
dc.title: Pedestrian Cadence Sensing in Public Live Cameras Using Computer Vision and Signal Processing (en)
dc.type: Thesis
dc.date.schoolyear: 112-2
dc.description.degree: 碩士 (Master's)
dc.contributor.coadvisor: 陳祝嵩 (zh_TW)
dc.contributor.coadvisor: Chu-Song Chen (en)
dc.contributor.oralexamcommittee: 賀耀華; 巫芳璟 (zh_TW)
dc.contributor.oralexamcommittee: Yao-Hua Ho; Fang-Jing Wu (en)
dc.subject.keyword: 行人步頻, 電腦視覺, 訊號處理, 智慧城市, 生活節奏 (zh_TW)
dc.subject.keyword: Pedestrian Cadence, Computer Vision, Signal Processing, Smart City, Pace of Life (en)
dc.relation.page: 45
dc.identifier.doi: 10.6342/NTU202404122
dc.rights.note: 未授權 (Not authorized for public access)
dc.date.accepted: 2024-08-12
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science)
dc.contributor.author-dept: 資料科學學位學程 (Data Science Degree Program)
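The record's abstract describes K-Means clustering over per-location cadence CDFs (Sections 5.3–5.4 of the thesis). A small self-contained sketch of that style of analysis follows; the camera cadence distributions are synthetic stand-ins, and the K-Means is a plain Lloyd's-algorithm implementation [28] rather than the thesis's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-camera cadence samples (steps/min); in the study these
# would come from the live-camera pipeline.
samples = [rng.normal(mu, 8.0, 500) for mu in (104, 107, 118, 121, 133)]

# Represent each camera as its empirical CDF on a shared cadence grid,
# so every location becomes a fixed-length feature vector.
grid = np.linspace(80, 160, 81)
cdfs = np.array([[np.mean(s <= g) for g in grid] for s in samples])

def kmeans(X, k, iters=100):
    """Plain Lloyd's algorithm with a simple deterministic init
    (evenly spaced rows; k-means++ would be the usual refinement)."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels

labels = kmeans(cdfs, 2)
print(labels)  # slower-paced cameras share one label, faster ones the other
```

Choosing k by the elbow method [29] and interpreting why locations co-cluster (location type, geography, income) is then the analytical step the abstract refers to.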
Appears in Collections: 資料科學學位學程 (Data Science Degree Program)

Files in This Item:
File | Size | Format
ntu-112-2.pdf (not authorized for public access) | 5.34 MB | Adobe PDF

All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
