NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/42436
Full metadata record (DC field: value [language])
dc.contributor.advisor: 吳家麟 (Ja-Ling Wu)
dc.contributor.author: Ping-Chieh Chang [en]
dc.contributor.author: 張炳傑 [zh_TW]
dc.date.accessioned: 2021-06-15T01:13:45Z
dc.date.available: 2014-08-04
dc.date.copyright: 2009-08-04
dc.date.issued: 2009
dc.date.submitted: 2009-07-29
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/42436
dc.description.abstract: This thesis studies how to perform gender classification from human gait, an important problem that is still not fully solved. We show that the GEI (Gait Energy Image) can effectively describe human gait observed from different view angles. Using the GEI as the feature, we construct view-angle classification and gender classification methods with several different approaches. Finally, experiments show that a system built with the proposed methods can effectively apply real-time gender classification in practical settings. [zh_TW]
dc.description.abstract: In this thesis, we investigate an important but understudied problem: gender classification from human gait. We demonstrate that the GEI (Gait Energy Image) can represent human gait for arbitrary view angles. Using the GEI as a discriminative feature, we construct view-angle classifiers and gender classifiers with several different approaches. Experiments show that our system achieves good performance and can be applied to real-world applications. [en]
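The abstracts above rest on the Gait Energy Image (GEI) as the gait representation. As background, here is a minimal sketch (not code from the thesis) of how a GEI is typically computed: size-normalized, centre-aligned binary silhouettes from one gait cycle are averaged into a single grey-level image. The function name gait_energy_image, the 128x88 resolution, and the random stand-in silhouettes are illustrative assumptions only.

    import numpy as np

    def gait_energy_image(silhouettes):
        # Average a sequence of size-normalized, centre-aligned binary
        # silhouettes (one per frame of a gait cycle) into one grey-level
        # image: bright pixels are body parts that barely move, mid-grey
        # pixels capture limb swing.
        frames = np.stack([np.asarray(s, dtype=np.float64) for s in silhouettes])
        return frames.mean(axis=0)

    # Illustrative usage with random stand-in "silhouettes"; real frames would
    # come from background subtraction, cropping, and height normalization.
    rng = np.random.default_rng(0)
    fake_cycle = rng.integers(0, 2, size=(30, 128, 88))   # 30 frames, 128x88 pixels
    gei = gait_energy_image(fake_cycle)
    print(gei.shape, float(gei.min()), float(gei.max()))

A fixed-size GEI of this kind is the sort of feature that the view-angle and gender classifiers described in the abstract would consume.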
dc.description.provenance: Made available in DSpace on 2021-06-15T01:13:45Z (GMT). No. of bitstreams: 1; ntu-98-R96944006-1.pdf: 1730933 bytes, checksum: b94f84ec149ddab02eb75e312d7d0f3e (MD5). Previous issue date: 2009. [en]
dc.description.tableofcontents: 中文摘要 (Chinese Abstract) ii
Abstract iii
Chapter 1 Introduction 1
1.1 MOTIVATION 1
1.2 RELATED WORKS 3
1.2.1 Psychophysical Studies 3
1.2.2 Computational Approach to Gender Classification from Human Gait 5
1.2.3 Gait Energy Image 7
Chapter 2 Human Gait Modeling 13
Chapter 3 Angle Classification 19
3.1 ELEVEN-CLASS ANGLE CLASSIFICATION 19
3.2 FIVE-GROUP ANGLE CLASSIFICATION 26
Chapter 4 Gender Classification 28
4.1 FISHER-BOOSTING 28
4.2 ELEVEN-CLASS GENDER CLASSIFICATION 30
4.3 FIVE-GROUP GENDER CLASSIFICATION 33
Chapter 5 Experimental Results 35
5.1 SYSTEM OVERVIEW 35
5.2 ANGLE CLASSIFICATION + GENDER CLASSIFICATION 37
5.2.1 Eleven-Class Approach 37
5.2.2 Five-Group Approach 38
5.3 REAL-WORLD VIDEO TESTING 38
Chapter 6 Conclusion and Future Work 40
Reference 42
dc.language.iso: en
dc.subject: 性別辨識 (gender classification) [zh_TW]
dc.subject: 步態能量圖 (gait energy image) [zh_TW]
dc.subject: 線性判別分析 (linear discriminant analysis) [zh_TW]
dc.subject: 視覺監視系統 (visual surveillance) [zh_TW]
dc.subject: 人類步態 (human gait) [zh_TW]
dc.subject: Visual Surveillance [en]
dc.subject: Gender classification [en]
dc.subject: Human Gait [en]
dc.subject: GEI (Gait Energy Image) [en]
dc.subject: LDA [en]
dc.subject: Fisher-Boosting [en]
dc.title: 基於人類步態之任意角度即時性別辨識 [zh_TW]
dc.title: Real-time Gender Classification From Human Gait for Arbitrary View Angles [en]
dc.type: Thesis
dc.date.schoolyear: 97-2
dc.description.degree: 碩士 (Master's)
dc.contributor.oralexamcommittee: 許永真 (Yung-Jen Hsu), 莊永裕 (Yung-Yu Chuang), 許秋婷 (Chiou-Ting Hsu)
dc.subject.keyword: 性別辨識, 人類步態, 步態能量圖, 線性判別分析, 視覺監視系統 [zh_TW]
dc.subject.keyword: Gender classification, Human Gait, GEI (Gait Energy Image), LDA, Fisher-Boosting, Visual Surveillance [en]
dc.relation.page: 47
dc.rights.note: 有償授權 (paid authorization)
dc.date.accepted: 2009-07-29
dc.contributor.author-college: 電機資訊學院 (College of Electrical Engineering and Computer Science) [zh_TW]
dc.contributor.author-dept: 資訊網路與多媒體研究所 (Graduate Institute of Networking and Multimedia) [zh_TW]
Appears in Collections: 資訊網路與多媒體研究所 (Graduate Institute of Networking and Multimedia)

Files in this item:
File: ntu-98-1.pdf (restricted; not authorized for public access)
Size: 1.69 MB
Format: Adobe PDF


Unless their copyright terms state otherwise, all items in this repository are protected by copyright, with all rights reserved.
