Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/48543

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 葉素玲(Su-Ling Yeh) | |
| dc.contributor.author | Kuan-Ming Chen | en |
| dc.contributor.author | 陳冠銘 | zh_TW |
| dc.date.accessioned | 2021-06-15T07:01:19Z | - |
| dc.date.available | 2011-02-09 | |
| dc.date.copyright | 2011-02-09 | |
| dc.date.issued | 2011 | |
| dc.date.submitted | 2011-01-20 | |
| dc.identifier.citation | Alais, D., & Burr, D. (2003). The 'Flash-Lag' effect occurs in audition and cross-modally. Current Biology, 13, 59-63.
Alais, D., & Burr, D. (2004a). No direction-specific bimodal facilitation for audiovisual motion detection. Cognitive Brain Research, 19, 185-194. Alais, D., & Burr, D. (2004b). The ventriloquist effect results from near-optimal bimodal integration. Current Biology, 14, 257-262. Andersen, T. S., Tiippana, K., & Sams, M. (2004). Factors influencing audiovisual fission and fusion illusions. Cognitive Brain Research, 21, 301-308. Angrilli, A., Cherubini, P., Pavese, A., & Manfredini, S. (1997). The influence of affective factors on time perception. Perception & Psychophysics, 59, 972-982. Arrighi, R., Alais, D., & Burr, D. (2006). Perceptual synchrony of audiovisual streams for natural and artificial motion sequences. Journal of Vision, 6, 260-268. Aschersleben, G., & Bertelson, P. (2003). Temporal ventriloquism: Crossmodal interaction on the time dimension. 2. Evidence from sensorimotor synchronization. International Journal of Psychophysiology, 50, 157-163. Battaglia, P. W. (2010). Bayesian perceptual inference in linear Gaussian models (Computer Science and Artificial Intelligence Laboratory Technical Report). Cambridge, MA, USA: Massachusetts Institute of Technology. Battaglia, P. W., Di Luca, M., Ernst, M. O., Schrater, P. R., Machulla, T., & Kersten, D. (2010). Within- and cross-modal distance information disambiguate visual size-change perception. PLoS Computational Biology, 6, e1000697. Battaglia, P. W., Jacobs, R. A., & Aslin, R. N. (2003). Bayesian integration of visual and auditory signals for spatial localization. Journal of the Optical Society of America. A, Optics, Image Science, and Vision, 20, 1391-1397. Berger, T. D., Martelli, M., & Pelli, D. G. (2003). Flicker flutter: Is an illusory event as good as the real thing? Journal of Vision, 3, 406-412. Bernstein, L. E., Auer, E. T., Jr., & Moore, J. K. (2004). Audiovisual speech binding: Convergence or association? In G. A. Calvert, C. Spence & B. E. 
Stein (Eds.), The handbook of multisensory processes (pp. 203-223). Cambridge, MA: The MIT Press. Bertelson, P., & de Gelder, B. (2004). The psychology of multimodal perception. In C. Spence & J. Driver (Eds.), Crossmodal space and crossmodal attention (pp. 141-177). Oxford: Oxford University Press. Bertelson, P., Pavani, F., Ladavas, E., Vroomen, J., & de Gelder, B. (2000). Ventriloquism in patients with unilateral visual neglect. Neuropsychologia, 38, 1634-1642. Bertelson, P., Vroomen, J., de Gelder, B., & Driver, J. (2000). The ventriloquist effect does not depend on the direction of deliberate visual attention. Perception & Psychophysics, 62, 321-332. Block, R. A., & Zakay, D. (1997). Prospective and retrospective duration judgments: A meta-analytic review. Psychonomic Bulletin & Review, 4, 184-197. Botvinick, M., & Cohen, J. (1998). Rubber hands 'feel' touch that eyes see. Nature, 391, 756. Bueti, D., Walsh, V., Frith, C., & Rees, G. (2008). Different brain circuits underlie motor and perceptual representations of temporal intervals. Journal of Cognitive Neuroscience, 20, 204-214. Burr, D., & Alais, D. (2006). Combining visual and auditory information. Progress in Brain Research, 155, 243-258. Burr, D., Banks, M. S., & Morrone, M. C. (2009). Auditory dominance over vision in the perception of interval duration. Experimental Brain Research, 198, 49-57. Burr, D., Morrone, M. C., & Banks, M. (2006). Auditory capture of visual stimuli in time is statistically optimal. Journal of Vision, 6, 387-387. Cappe, C., Thut, G., Romei, V., & Murray, M. M. (2009). Selective integration of auditory-visual looming cues by humans. Neuropsychologia, 47, 1045-1052. Chen, K. M., & Yeh, S. L. (2009). Asymmetric cross-modal effects in time perception. Acta Psychologica, 130, 225-234. Chen, Y. C., & Yeh, S. L. (2008). Visual events modulated by sound in repetition blindness. Psychonomic Bulletin & Review, 15, 404-408. Clarke, J. J., & Yuille, A. L. (1990). 
Data fusion for sensory information processing. Boston, MA: Kluwer Academic. de Gelder, B., & Bertelson, P. (2003). Multisensory integration, perception and ecological validity. Trends in Cognitive Sciences, 7, 460-467. Deneve, S., Latham, P. E., & Pouget, A. (2001). Efficient computation and cue integration with noisy population codes. Nature Neuroscience, 4, 826-831. Dixon, N. F., & Spitz, L. (1980). The detection of auditory visual desynchrony. Perception, 9, 719-721. Driver, J., & Spence, C. (2000). Multisensory perception: Beyond modularity and convergence. Current Biology, 10, R731-R735. Droit-Volet, S., & Meck, W. H. (2007). How emotions colour our perception of time. Trends in Cognitive Sciences, 11, 504-513. Eagleman, D. M., Tse, P. U., Buonomano, D., Janssen, P., Nobre, A. C., & Holcombe, A. O. (2005). Time and the brain: How subjective time relates to neural time. The Journal of Neuroscience, 25, 10369-10371. Efron, B., & Tibshirani, R. (1993). An introduction to the bootstrap. New York: Chapman & Hall. Efron, R. (1970a). The minimum duration of a perception. Neuropsychologia, 8, 57-63. Efron, R. (1970b). The relationship between the duration of a stimulus and the duration of a perception. Neuropsychologia, 8, 37-55. Ehrsson, H. H. (2007). The experimental induction of out-of-body experiences. Science, 317, 1048. Eliason, S. R. (1993). Maximum likelihood estimation : Logic and practice. Newbury Park, CA: SAGE Publications. Epstein, R. (1984). The principle of parsimony and some applications in Psychology. The Journal of Mind and Behavior, 5, 119-130. Ernst, M. O., & Banks, M. S. (2002). Humans integrate visual and haptic information in a statistically optimal fashion. Nature, 415, 429-433. Ernst, M. O., & Bulthoff, H. H. (2004). Merging the senses into a robust percept. Trends in Cognitive Sciences, 8, 162-169. Fraisse, P. (1984). Perception and estimation of time. Annual Review of Psychology, 35, 1-36. Freides, D. (1974). 
Human information processing and sensory modality: Cross-modal functions, information complexity, memory, and deficit. Psychological Bulletin, 81, 284-310. Gepshtein, S., & Banks, M. S. (2003). Viewing geometry determines how vision and haptics combine in size perception. Current Biology, 13, 483-488. Ghazanfar, A. A., & Schroeder, C. E. (2006). Is neocortex essentially multisensory? Trends in Cognitive Sciences, 10, 278-285. Gilaie-Dotan, S., Kanai, R., & Rees, G. (2010). Individual differences in time perception indicate different modality-independent mechanisms for different temporal durations. Journal of Vision, 10, 1407. Goldstone, S., & Lhamon, W. T. (1974). Studies of auditory-visual differences in human time judgment: 1. Sounds are judged longer than lights. Perceptual and Motor Skills, 39, 63-82. Guttman, S. E., Gilroy, L. A., & Blake, R. (2005). Hearing what the eyes see. Psychological Science, 16, 228-235. Hay, J. C., Pick, H. L., & Ikeda, K. (1965). Visual capture produced by prism spectacles. Psychonomic Science, 2, 215-216. Hicks, R. E., Miller, G. W., Gaes, G., & Bierman, K. (1977). Concurrent processing demands and the experience of time-in-passing. American Journal of Psychology, 90, 431-446. Hicks, R. E., Miller, G. W., & Kinsbourne, M. (1976). Prospective and retrospective judgments of time as a function of amount of information processed. American Journal of Psychology, 89, 719-730. Hillis, J. M., Ernst, M. O., Banks, M. S., & Landy, M. S. (2002). Combining sensory information: Mandatory fusion within, but not between, senses. Science, 298, 1627-1630. Hillis, J. M., Watt, S. J., Landy, M. S., & Banks, M. S. (2004). Slant from texture and disparity cues: Optimal cue combination. Journal of Vision, 4, 967-992. Howard, I. P., & Templeton, W. B. (1966). Human spatial orientation. London: Wiley. Jacobs, R. A. (1999). Optimal integration of texture and motion cues to depth. Vision Research, 39, 3621-3629. Jousmaki, V., & Hari, R. (1998). 
Parchment-skin illusion: Sound-biased touch. Current Biology, 8, R190. Kanai, R., Paffen, C. L., Hogendoorn, H., & Verstraten, F. A. J. (2006). Time dilation in dynamic visual display. Journal of Vision, 6, 1421-1430. Kanai, R., & Watanabe, M. (2006). Visual onset expands subjective time. Perception & Psychophysics, 68, 1113-1123. King, A. J. (2004). Development of multisensory spatial integration. In C. Spence & J. Driver (Eds.), Crossmodal space and crossmodal attention (pp. 1-24). Oxford: Oxford University Press. Knill, D. C., Kersten, D., & Yuille, A. L. (1996). Introduction: A Bayesian formulation of visual perception. In D. C. Knill & W. Richards (Eds.), Perception as Bayesian inference (pp. 1-21). Cambridge: Cambridge University Press. Ladavas, E., & Farne, A. (2004). Neuropsychological evidence for multimodal representations of space near specific body parts. In C. Spence & J. Driver (Eds.), Crossmodal space and crossmodal attention (pp. 69-98). Oxford: Oxford University Press. Landy, M. S., Maloney, L. T., Johnston, E. B., & Young, M. (1995). Measurement and modeling of depth cue combination: In defense of weak fusion. Vision Research, 35, 389-412. Lapid, E., Ulrich, R., & Rammsayer, T. (2008). On estimating the difference limen in duration discrimination tasks: A comparison of the 2AFC and the reminder task. Perception & Psychophysics, 70, 291-305. Lederman, S. J., & Klatzky, R. L. (2004). Multisensory texture perception. In G. A. Calvert, C. Spence & B. E. Stein (Eds.), The handbook of multisensory processes (pp. 107-122). Cambridge, MA: The MIT Press. Lhamon, W. T., & Goldstone, S. (1974). Studies of auditory-visual differences in human time judgment: 2. More transmitted information with sounds than lights. Perceptual and Motor Skills, 39, 295-307. Lindley, D. V., & Smith, A. F. M. (1972). Bayes estimates for the linear model. Journal of the Royal Statistical Society. Series B (Methodological), 34, 1-41. Luo, H., Liu, Z., & Poeppel, D. (2010). 
Auditory cortex tracks both auditory and visual stimulus dynamics using low-frequency neuronal phase modulation. PLoS Biology, 8, e1000445. MacDonald, J., & McGurk, H. (1978). Visual influences on speech perception processes. Perception & Psychophysics, 24, 253-257. Maier, J. X., Neuhoff, J. G., Logothetis, N. K., & Ghazanfar, A. A. (2004). Multisensory integration of looming signals by rhesus monkeys. Neuron, 43, 177-181. McGurk, H., & MacDonald, J. (1976). Hearing lips and seeing voices. Nature, 264, 746-748. Morein-Zamir, S., Soto-Faraco, S., & Kingstone, A. (2003). Auditory capture of vision: Examining temporal ventriloquism. Cognitive Brain Research, 17, 154-163. Morgan, M. J., Watamaniuk, S. N., & McKee, S. P. (2000). The use of an implicit standard for measuring discrimination thresholds. Vision Research, 40, 2341-2349. Nachmias, J. (2006). The role of virtual standards in visual discrimination. Vision Research, 46, 2456-2464. Ono, F., & Kawahara, J. (2007). The subjective size of visual stimuli affects the perceived duration of their presentation. Perception & Psychophysics, 69, 952-957. Oruç, I., Maloney, L. T., & Landy, M. S. (2003). Weighted linear cue combination with possibly correlated error. Vision Research, 43, 2451-2468. Partan, S. R. (2004). Multisensory animal communication. In G. A. Calvert, C. Spence & B. E. Stein (Eds.), The handbook of multisensory processes (pp. 225-240). Cambridge, MA: The MIT Press. Partan, S. R., & Marler, P. (1999). Communication goes multimodal. Science, 283, 1272-1273. Pavani, F., Spence, C., & Driver, J. (2000). Visual capture of touch: Out-of-the-body experiences with rubber gloves. Psychological Science, 11, 353-359. Pick, H. L., Warren, D. H., & Hay, J. C. (1969). Sensory conflict in judgments of spatial direction. Perception & Psychophysics, 6, 203-205. Pouget, A., Deneve, S., & Duhamel, J.-R. (2004). A computational neural theory of multisensory spatial representations. In C. Spence & J. 
Driver (Eds.), Crossmodal space and crossmodal attention (pp. 123-140). Oxford: Oxford University Press. Radeau, M., & Bertelson, P. (1976). The effect of a textured visual field on modality dominance in a ventriloquism situation. Perception & Psychophysics, 20, 227-235. Radeau, M., & Bertelson, P. (1987). Auditory-visual interaction and the timing of inputs: Thomas (1941) revisited. Psychological Research, 49, 17-22. Repp, B. H., & Penel, A. (2002). Auditory dominance in temporal processing: New evidence from synchronization with simultaneous visual and auditory sequences. Journal of Experimental Psychology: Human Perception & Performance, 28, 1085-1099. Rock, I., & Victor, J. (1964). Vision and touch: An experimentally created conflict between the two senses. Science, 143, 594-596. Schiff, W., Caviness, J. A., & Gibson, J. J. (1962). Persistent fear responses in rhesus monkeys to the optical stimulus of 'looming'. Science, 136, 982-983. Shams, L., & Beierholm, U. R. (2010). Causal inference in perception. Trends in Cognitive Sciences, 14, 425-432. Shams, L., Kamitani, Y., & Shimojo, S. (2000). Illusions. What you see is what you hear. Nature, 408, 788. Shams, L., Kamitani, Y., & Shimojo, S. (2004). Modulations of visual perception by sound. In G. A. Calvert, C. Spence & B. E. Stein (Eds.), The handbook of multisensory processes (pp. 27-33). Cambridge, MA: The MIT Press. Shams, L., Kamitani, Y., Thompson, S., & Shimojo, S. (2001). Sound alters visual evoked potentials in humans. NeuroReport, 12, 3849-3852. Shams, L., & Seitz, A. R. (2008). Benefits of multisensory learning. Trends in Cognitive Sciences, 12, 411-417. Shimojo, S., & Shams, L. (2001). Sensory modalities are not separate modalities: Plasticity and interactions. Current Opinion in Neurobiology, 11, 505-509. Shipley, T. (1964). Auditory flutter-driving of visual flicker. Science, 145, 1328-1330. Shore, D. I., Spence, C., & Klein, R. M. (2001). Visual prior entry. Psychological Science, 12, 205-212. 
Sober, E. (1981). The principle of parsimony. British Journal for the Philosophy of Science, 32, 145-156. Soto-Faraco, S., & Kingstone, A. (2004). Multisensory integration of dynamic information. In G. A. Calvert, C. Spence & B. E. Stein (Eds.), The handbook of multisensory processes (pp. 49-67). Cambridge, MA: The MIT Press. Spence, C., Shore, D. I., & Klein, R. M. (2001). Multisensory prior entry. Journal of Experimental Psychology: General, 130, 799-832. Stein, B. E., & Meredith, M. A. (1993). The merging of the senses. Cambridge, MA: MIT Press. Sugita, Y., & Suzuki, Y. (2003). Audiovisual perception: Implicit estimation of sound-arrival time. Nature, 421, 911. Tsakiris, M., & Haggard, P. (2005). The rubber hand illusion revisited: Visuotactile integration and self-attribution. Journal of Experimental Psychology: Human Perception & Performance, 31, 80-91. Tse, P. U., Intriligator, J., Rivest, J., & Cavanagh, P. (2004). Attention and the subjective expansion of time. Perception & Psychophysics, 66, 1171-1189. Ulbrich, P., Churan, J., Fink, M., & Wittmann, M. (2007). Temporal reproduction: Further evidence for two processes. Acta Psychologica, 125, 51-65. van Beers, R. J., Sittig, A. C., & Gon, J. J. (1999). Integration of proprioceptive and visual position-information: An experimentally supported model. Journal of Neurophysiology, 81, 1355-1364. van Wassenhove, V., Buonomano, D. V., Shimojo, S., & Shams, L. (2008). Distortions of subjective time perception within and across senses. PLoS ONE, 3, e1437. Vroomen, J., Bertelson, P., & de Gelder, B. (2001). The ventriloquist effect does not depend on the direction of automatic visual attention. Perception & Psychophysics, 63, 651-659. Vroomen, J., & de Gelder, B. (2000). Sound enhances visual perception: Cross-modal effects of auditory organization on vision. Journal of Experimental Psychology: Human Perception and Performance, 26, 1583-1590. Wada, Y., Kitagawa, N., & Noguchi, K. (2003). 
Audio-visual integration in temporal perception. International Journal of Psychophysiology, 50, 117-124. Walker, J. T., & Scott, K. J. (1981). Auditory-visual conflicts in the perceived duration of lights, tones and gaps. Journal of Experimental Psychology: Human Perception and Performance, 7, 1327-1339. Wearden, J. H., Edwards, H., Fakhri, M., & Percival, A. (1998). Why 'sounds are judged longer than lights': Application of a model of the internal clock in humans. Quarterly Journal of Experimental Psychology: Section B, 51, 97 - 120. Welch, R. B. (1999). Meaning, attention, and the unity assumption in the intersensory bias of spatial and temporal perceptions. In G. Aschersleben, T. Bachmann & J. Musseler (Eds.), Cognitive contributions to the perception of spatial and temporal events (pp. 371-387). Amsterdam: Elsevier. Welch, R. B., DuttonHurt, L. D., & Warren, D. H. (1986). Contributions of audition and vision to temporal rate perception. Perception & Psychophysics, 39, 294-300. Welch, R. B., & Warren, D. H. (1980). Immediate perceptual response to intersensory discrepancy. Psychological Bulletin, 88, 638-667. Xuan, B., Zhang, D., He, S., & Chen, X. (2007). Larger stimuli are judged to last longer. Journal of Vision, 7, 1-5. Young, M. J., Landy, M. S., & Maloney, L. T. (1993). A perturbation analysis of depth perception from combinations of texture and motion cues. Vision Research, 33, 2685-2696. Yuille, A. L., & Bulthoff, H. H. (1996). Bayesian decision theory and psychophysics. In D. C. Knill & W. Richards (Eds.), Perception as Bayesian inference (pp. 123-161). Cambridge: Cambridge University Press. Zakay, D. (1993). Time estimation methods - Do they influence prospective duration estimates? Perception, 22, 91-101. Zakay, D., & Block, R. A. (1997). Temporal cognition. Current Directions in Psychological Science, 6, 12-16. Zampini, M., Shore, D. I., & Spence, C. (2003). Audiovisual temporal order judgments. Experimental Brain Research, 152, 198-210. | |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/48543 | - |
| dc.description.abstract | 本研究以時間知覺為對象,檢驗跨感官處理的兩種假說。感官適切性假說預測在視聽訊息皆呈現的情況下,時間向度的處理會以聽覺為判斷來源;最大似然率估計假說則以感官信號可靠性為權重標準,以不同信號間的線性組合使形成多重感官最佳整合為解釋方向,因此預測視覺與聽覺皆會影響時間知覺,而影響的程度依個別感官的可靠性而定,愈可靠的感官在整合時具有愈高的權重。實驗一採用一前一後出現具兩種時間長度的刺激,標準刺激感官呈現的方式為視覺或聽覺,比較刺激則為視覺、聽覺、或二者皆有,令參與者依據某一感官判斷何者的時間較長。結果顯示視覺刺激時間長度的判斷會因聽覺刺激出現而延展,但聽覺刺激時間長度則不受視覺刺激出現的影響,支持感官適切性假說。實驗二操弄標準刺激的時間長度,發現僅在中間的幾個時間長度,聽覺刺激會展延視覺刺激的時間長度,但視覺刺激對聽覺刺激的時間長度皆無影響。實驗三操弄在比較刺激與標準刺激皆共同呈現視聽刺激,發現在最長時間長度下,符合最大似然率估計假說的預測,可由視覺和聽覺刺激單獨呈現的表現來預測。實驗四操弄聽覺刺激時間長度的模糊程度(可靠性),發現聽覺刺激的權重遠大於視覺刺激的權重,而在聽覺刺激較不可靠時,才符合最大似然率估計假說的預測。實驗五增強視覺刺激的時間可用性,發現視覺刺激權重亦可能大於聽覺刺激,而以視覺為判斷來源。綜觀本研究的五個實驗結果,皆指出聽覺偏好為時間知覺的基礎,但當聽覺信號可靠性降低,或視覺信號具時間判斷的可用性時,聽覺偏好現象便產生改變,使得視覺訊息影響時間長度的判斷。因此感官適切性假說與最大似然率估計假說皆無法解釋本文的實驗結果。本文提出整合的理論模型:在時間知覺中,聽覺偏好為其先前假設,因此人類在面對多重感官刺激以判斷時間長度時,基於先前假設會使聽覺訊息的貢獻增加,而視覺訊息的貢獻則被減少。基於此先前假設整合訊息時,則以個別感官信號的可靠性產生各自感官的權重,依此權重做線性組合,產生視聽訊息整合後的時間感。 | zh_TW |
| dc.description.abstract | We focus on time perception to test two hypotheses about multisensory perception. The modality appropriateness hypothesis states that audition determines the temporal judgment of audiovisual stimuli, whereas the maximum likelihood estimation (MLE) model proposes an optimal cue combination in which visual and auditory signals are integrated with weights proportional to the reliability of each signal's estimate. In Experiment 1, observers compared the durations of two intervals based on a common modality. The standard stimulus was presented in the visual or auditory modality, and the comparison stimulus was presented in the visual, auditory, or both modalities. The results support the modality appropriateness hypothesis: the sound expanded the perceived visual duration, whereas the disk did not affect the perceived auditory duration. In Experiment 2, the standard stimulus was set at different durations; the sound's effect on perceived visual duration was observed only at the four intermediate durations, whereas the disk did not affect perceived auditory duration at any duration. In Experiment 3, when both the standard and comparison stimuli were presented bimodally, the bimodal performance was predictable from the unimodal performances according to the MLE model only at the longest duration. In Experiment 4, the auditory signal was manipulated to different levels of reliability. The contribution of the auditory modality to the bimodal estimate was greater than that of the visual modality, and the prediction of the MLE model held only when the auditory signal was less reliable. In Experiment 5, a looming disk was used to make vision more temporally informative. The results showed that the visual modality can then determine the bimodal judgment. Across the five experiments, an auditory bias in bimodal time perception was observed. However, with reduced reliability of the auditory signal or a more temporally informative visual signal, the prior auditory bias declined and the visual signal contributed more. In conclusion, neither the modality appropriateness hypothesis nor the MLE model can explain our results, and a hybrid model is proposed: the auditory bias acts as a prior assumption held by observers in time perception when combining sensory signals, and the signals are integrated with weights determined by their reliabilities. | en |
| dc.description.provenance | Made available in DSpace on 2021-06-15T07:01:19Z (GMT). No. of bitstreams: 1 ntu-100-D92227003-1.pdf: 10854568 bytes, checksum: 11e2296ed60fd002867889333831ee06 (MD5) Previous issue date: 2011 | en |
| dc.description.tableofcontents | Chapter 1 Introduction......1
The Modality Appropriateness Hypothesis......4 The Maximum Likelihood Estimation Model......12 Details of the MLE Model......18 Multisensory Temporal Processing......22 Overview of the Current Study......26 Chapter 2 Asymmetric Cross-modal Effects in Time Perception......29 Experiment 1......30 Method......32 Results and discussion......35 Experiment 2......44 Method......45 Results and discussion......46 Chapter 3 Multisensory Interaction in Time Perception......51 Experiment 3......52 Method......53 Results and discussion......55 Experiment 4......61 Method......63 Results and discussion......66 Experiment 5......79 Method......81 Results and discussion......82 Chapter 4 General Discussion......93 Summary of Five Experiments......93 Modeling Multisensory Time Perception......96 The Hybrid Model......107 Contributions of the Current Study......117 Future Directions......119 References......121 Figures......137 | |
| dc.language.iso | en | |
| dc.subject | 跨感官整合 | zh_TW |
| dc.subject | 時間知覺 | zh_TW |
| dc.subject | 感官適切性假說 | zh_TW |
| dc.subject | 最大似然率估計假說 | zh_TW |
| dc.subject | modality appropriateness hypothesis | en |
| dc.subject | maximum likelihood estimation model | en |
| dc.subject | multisensory integration | en |
| dc.subject | time perception | en |
| dc.title | 視聽互動機制:以時間知覺為例 | zh_TW |
| dc.title | Audiovisual Interaction: A Case of Time Perception | en |
| dc.type | Thesis | |
| dc.date.schoolyear | 99-1 | |
| dc.description.degree | Doctoral | |
| dc.contributor.oralexamcommittee | 吳嫻 (Denise Hsien Wu); 徐永豐 (Yung-Fong Hsu); 黃榮村 (Jong-Tsun Huang); 梁庚辰 (Keng-Chen Liang); 陳一平 (I-Ping Chen) | |
| dc.subject.keyword | 時間知覺,跨感官整合,感官適切性假說,最大似然率估計假說 | zh_TW |
| dc.subject.keyword | maximum likelihood estimation model, modality appropriateness hypothesis, multisensory integration, time perception | en |
| dc.relation.page | 171 | |
| dc.rights.note | Paid authorization | |
| dc.date.accepted | 2011-01-20 | |
| dc.contributor.author-college | 理學院 (College of Science) | zh_TW |
| dc.contributor.author-dept | 心理學研究所 (Graduate Institute of Psychology) | zh_TW |
| Appears in Collections: | 心理學系 (Department of Psychology) | |
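The MLE cue-combination rule summarized in the abstract (each modality weighted in proportion to its reliability, i.e., the inverse of its variance) can be sketched as follows. This is an illustrative example with hypothetical numbers, not code or data from the thesis.

```python
# Illustrative sketch of reliability-weighted (inverse-variance) cue combination,
# the MLE scheme the abstract describes. All values below are hypothetical.

def mle_combine(estimates, sigmas):
    """Linearly combine unimodal duration estimates using inverse-variance weights."""
    reliabilities = [1.0 / s**2 for s in sigmas]   # reliability = 1 / variance
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]   # weights sum to 1
    combined = sum(w * e for w, e in zip(weights, estimates))
    combined_var = 1.0 / total                     # predicted bimodal variance
    return combined, weights, combined_var

# Hypothetical unimodal estimates: audition 600 ms (sigma = 30 ms),
# vision 500 ms (sigma = 60 ms); audition gets the larger weight.
duration, (w_a, w_v), var = mle_combine([600.0, 500.0], [30.0, 60.0])
```

Under this scheme, the predicted variance of the bimodal estimate is smaller than either unimodal variance; that prediction is the benchmark the experiments test the MLE model against.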
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-100-1.pdf (restricted access) | 10.6 MB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
