Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/21477

Full metadata record

DC Field | Value | Language
dc.contributor.advisor: 葉素玲
dc.contributor.author: Da Lien
dc.contributor.author: 李達 zh_TW
dc.date.accessioned: 2021-06-08T03:35:13Z
dc.date.copyright: 2019-08-05
dc.date.issued: 2019
dc.date.submitted: 2019-07-31
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/21477
dc.description.abstract: Visual sensitivity (d’) to a picture can be enhanced crossmodally by an auditory cue that is semantically congruent, rather than incongruent, with it. However, it remains unknown whether such crossmodal semantic congruency can also modulate the metacognitive sensitivity of picture processing (meta-d’, the ability to discriminate whether one’s own perceptual judgment is correct). We examined this issue by measuring d’ and meta-d’ in a picture detection task; their quotient (meta-d’/d’, the M-ratio) serves as an index of metacognitive efficiency that controls for task difficulty. The auditory cue (a naturalistic sound or a spoken word) and the object picture (presented for 37 ms, sandwiched between two 13-ms masks) were either congruent (e.g., a dog barking or the spoken word “dog” paired with a dog picture) or incongruent (e.g., a piano note or the spoken word “piano” paired with a dog picture). Auditory cues were presented at stimulus onset asynchronies (SOAs) of -1000, -350, 0, and 500 ms, where negative values indicate that the auditory cue led the picture. Participants had to detect the presence of an object picture and then rate their confidence in that detection judgment. When a naturalistic sound or a spoken word was presented earlier than or simultaneously with the picture, d’ was higher in the congruent than in the incongruent condition; only spoken words, however, elicited a semantic congruency effect on d’ even at the 500-ms SOA. Spoken words also induced semantic congruency effects on meta-d’ across all SOAs, whereas naturalistic sounds elicited a congruency effect on meta-d’ only at the -350 ms SOA. Interestingly, metacognitive efficiency was higher in the congruent than in the incongruent condition only when spoken words were presented simultaneously with or later than the picture.
Hence, hearing a semantically congruent (as compared with incongruent) auditory cue can facilitate not only visual perception but also metacognition, and spoken words elicit more pronounced semantic congruency effects on both. We provide a unified framework to explain the different congruency effects of spoken words and naturalistic sounds. en
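The abstract's key quantities come from standard signal detection theory. As a minimal sketch (not the thesis's actual analysis code), the Type-1 sensitivity d’ and the M-ratio can be computed as follows; the hit/false-alarm rates and the meta-d’ value here are illustrative assumptions, since fitting meta-d’ itself requires a confidence-rating model (e.g., Maniscalco & Lau's maximum-likelihood procedure) that is not shown:

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Equal-variance SDT sensitivity: d' = z(H) - z(F)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Illustrative (made-up) detection data for one congruency condition:
d1 = d_prime(0.80, 0.20)    # about 1.68

# meta-d' must be fitted from confidence ratings; we assume a value here.
meta_d = 1.4                # hypothetical fitted meta-d'
m_ratio = meta_d / d1       # metacognitive efficiency (M-ratio)
```

An M-ratio of 1 would mean confidence ratings track accuracy as well as the Type-1 evidence allows; values below 1 indicate metacognitive inefficiency. Dividing by d’ is what controls for task difficulty when comparing conditions.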
dc.description.provenance: Made available in DSpace on 2021-06-08T03:35:13Z (GMT). No. of bitstreams: 1
ntu-108-R05227119-1.pdf: 1733817 bytes, checksum: 2985a790af8e2c330f2ed05dcd35e597 (MD5)
Previous issue date: 2019
en
dc.description.tableofcontents: Introduction 1
Auditory Semantic Modulations on Visual Perception 2
Metacognition 7
Goal of the Present Study 9
General Methods 11
Participants 11
Apparatus and Stimuli 11
Design 12
Procedure 14
Data Analysis 16
Experiment 1: Spoken Words 20
Results 20
Discussion 25
Experiment 2: Naturalistic Sound 29
Results 29
Discussion 35
General Discussion 37
References 46
Appendix 54
dc.language.iso: en
dc.title: 聽覺意義訊息對視知覺和後設認知的影響 zh_TW
dc.title: Auditory Semantic Modulations on Visual Perception and Metacognition en
dc.type: Thesis
dc.date.schoolyear: 107-2
dc.description.degree: 碩士 (Master)
dc.contributor.oralexamcommittee: 陳奕全, 黃榮村, 吳建德
dc.subject.keyword: 跨感官, 語音, 自然音, 意義一致性, 後設認知 zh_TW
dc.subject.keyword: crossmodal, spoken word, naturalistic sound, semantic congruency, metacognition en
dc.relation.page: 59
dc.identifier.doi: 10.6342/NTU201902154
dc.rights.note: 未授權 (not authorized for public access)
dc.date.accepted: 2019-07-31
dc.contributor.author-college: 理學院 zh_TW
dc.contributor.author-dept: 心理學研究所 zh_TW
Appears in Collections: 心理學系

Files in This Item:
File | Size | Format
ntu-108-1.pdf (restricted; not authorized for public access) | 1.69 MB | Adobe PDF
Except where otherwise noted in their licensing terms, all items in this repository are protected by copyright, with all rights reserved.
