Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/96845
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 鄭瑋 | zh_TW |
dc.contributor.advisor | Wei Jeng | en |
dc.contributor.author | 高庭萱 | zh_TW |
dc.contributor.author | Ting Hsuan Kao | en |
dc.date.accessioned | 2025-02-24T16:13:43Z | - |
dc.date.available | 2025-02-25 | - |
dc.date.copyright | 2025-02-24 | - |
dc.date.issued | 2025 | - |
dc.date.submitted | 2025-01-08 | - |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/96845 | - |
dc.description.abstract | 研究誠信乃是學者體現良好科學態度之重要要素,隨著學術界逐漸產生再現性危機之討論聲浪,使得研究通透度與研究再現性為維護研究誠信提供的正面價值備受重視。國際間對於開放科學議題之討論及實踐已有長足發展,其中又以心理學領域展現其豐碩的研究成果。然而,現今國內學術環境對於研究資料開放及再現研究之議題的探討較稀缺。本研究以此為缺口,以臺灣心理學領域作為研究探索的基礎,挖掘期刊編輯與作者於研究通透度與再現性之意見,並探究本土心理學期刊於實踐資料政策之意向。
本研究以期刊編輯與作者為研究對象,使用半結構式深度訪談探索編輯觀點,並輔以問卷調查法蒐集期刊作者意見。本研究透過預先設計之訪談綱要和問卷題項,探索學者對於研究不實行為與研究通透度之間的認知,並討論本土心理學期刊於推動資料政策之意願。本研究額外運用人物誌方法,將研究結果聚斂成三種人物誌,提供具有實境感之學者立場,提供未來圖書館服務規劃之參酌。
研究結果體現出本土心理學者於開放科學的發展抱持肯定價值,而受訪者也為本土期刊是否應推動資料政策進行評估並提出考量。本研究最終彙整學者意見並繪製成三種人物誌,分別刻畫「本土保守派」、「積極響應派」及「理性推動派」三方學者立場。人物誌特質除了展現出重視本土研究價值,且擔憂資料政策影響本土期刊發展之守舊面向;也體現支持學術邁向資料通透、鼓勵期刊推動資料政策,並為政策內容提供細節規劃之積極面向。藉由人物誌揭露之學者觀點與需求,本研究亦以圖書館策劃學術服務之角度,論述相關服務策略之內容。
本研究在於揭示心理學領域學者對於研究通透度與再現性之認知與政策實務推動之考量,以展現本土心理學領域於維護科學誠信之目的。本研究亦描繪出多種面向的人物誌立場,為圖書館提供其策劃學術服務與資源時的參考雛型。 | zh_TW |
dc.description.abstract | Research integrity is a crucial element of a sound scientific attitude among scholars. With growing discourse on the reproducibility crisis in academia, the positive value of research transparency and reproducibility in maintaining research integrity has garnered significant attention. International discussion and practice of open science have advanced considerably, particularly in psychology, which has produced notable results in this area. However, within the Taiwanese academic environment, discussions of open research data and reproducibility studies remain scarce. This study addresses this gap by investigating journal editors' and authors' perceptions of research transparency and reproducibility in the field of psychology in Taiwan, and by examining the willingness of Taiwanese psychology journals to implement data-sharing policies.
The study employs a mixed-methods approach, combining semi-structured in-depth interviews with journal editors to gather qualitative insights and a questionnaire survey to collect quantitative data from authors. Using a pre-designed interview outline and a tailored questionnaire, this research explores scholars' perceptions of the relationship between questionable research practices and research transparency, and their attitudes toward adopting data-sharing policies. Additionally, the study uses the persona method to synthesize the findings into three representative personas, providing nuanced and contextually rich perspectives to inform future academic library service planning. The results reflect strong support among Taiwanese psychologists for the value of open science. Respondents also weighed the considerations and risks involved in Taiwanese journals adopting data policies. The study consolidates these perspectives into three personas: the "Local Preservationist," the "Proactive Advocate," and the "Pragmatic Facilitator." These personas illustrate a range of attitudes, from emphasizing the protection of local research values and voicing concern about the potential impact of data-sharing policies on journal development, to supporting greater data transparency and encouraging journals to implement data-sharing policies with detailed planning. Drawing on the perspectives and needs these personas reveal, the study proposes strategies for academic library services that support open science initiatives.
This study reveals Taiwanese psychologists' perceptions of research transparency and reproducibility, as well as their considerations in putting data-sharing policies into practice, with the aim of protecting scientific integrity within Taiwanese psychology. The study also depicts the personas' diverse viewpoints, providing libraries with reference prototypes for planning academic services and resources. | en |
dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2025-02-24T16:13:43Z No. of bitstreams: 0 | en |
dc.description.provenance | Made available in DSpace on 2025-02-24T16:13:43Z (GMT). No. of bitstreams: 0 | en |
dc.description.tableofcontents | Acknowledgments
Abstract (Chinese)
Abstract (English)
List of Figures
List of Tables
Chapter 1 Introduction
1.1 Open Science, Scientific Attitude, and Scientific Integrity
1.2 Domestic and International Emphasis on Research Transparency and Reproducibility
1.3 The Reproducibility Crisis in Psychology
1.4 Research Motivation and Research Questions
Chapter 2 Literature Review
2.1 Questionable Research Practices
2.2 International Practices Promoting Scientific Transparency and Reproducibility
2.3 Journal Data Policies for Enhancing Research Transparency and Reproducibility
2.4 Scientific Values and Replication Research in Psychology
2.5 Personas and Library Services
Chapter 3 Research Design and Implementation
3.1 Research Methods
3.2 Research Subjects and Expected Sampling
3.3 Research Instruments and Design
3.4 Data Collection
3.5 Data Processing and Analysis
Chapter 4 Results and Discussion
4.1 Participant Backgrounds and Research Fields
4.2 Taiwanese Psychology Scholars' Perceptions of Research Transparency and Reproducibility
4.3 Taiwanese Psychology Scholars' Attitudes toward Journals' Implementation of Data Policies
4.4 The Journal Ecosystem of the Taiwanese Academic Environment
Chapter 5 Library Services and Conclusion
5.1 Personas of Psychology Scholars
5.2 Planning Library Open Science Services
5.3 Conclusion
5.4 Limitations and Concluding Remarks
References
Appendix 1
Appendix 2 | - |
dc.language.iso | zh_TW | - |
dc.title | 心理學期刊開放科學政策與研究通透度:應用人物誌於圖書館服務 | zh_TW |
dc.title | Open Science Policy and Perspectives of Transparency in Taiwanese Psychology Communities: Using Personas in Library Service Design | en |
dc.type | Thesis | - |
dc.date.schoolyear | 113-1 | - |
dc.description.degree | Master | - |
dc.contributor.oralexamcommittee | 唐牧群;林雯瑤 | zh_TW |
dc.contributor.oralexamcommittee | Muh-Chyun Tang;Wen-Yau Lin | en |
dc.subject.keyword | 開放科學,研究通透度,期刊資料分享政策,人物誌, | zh_TW |
dc.subject.keyword | Open Science,Research Transparency,Data-Sharing Policies,Persona, | en |
dc.relation.page | 159 | - |
dc.identifier.doi | 10.6342/NTU202500048 | - |
dc.rights.note | Authorized (open access worldwide) | - |
dc.date.accepted | 2025-01-09 | - |
dc.contributor.author-college | College of Liberal Arts | - |
dc.contributor.author-dept | Department of Library and Information Science | - |
dc.date.embargo-lift | 2025-02-25 | - |
Appears in Collections: | Department of Library and Information Science
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-113-1.pdf | 1.42 MB | Adobe PDF | View/Open |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.