Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/96082

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 鄭瑋 | zh_TW |
| dc.contributor.advisor | Wei Jeng | en |
| dc.contributor.author | 丁昱寧 | zh_TW |
| dc.contributor.author | Yu-Ning Ting | en |
| dc.date.accessioned | 2024-10-14T16:06:41Z | - |
| dc.date.available | 2024-10-15 | - |
| dc.date.copyright | 2024-10-14 | - |
| dc.date.issued | 2024 | - |
| dc.date.submitted | 2024-09-29 | - |
| dc.identifier.citation | 王梅玲(2002)。焦點團體研究法的理論與應用。圖資與檔案學刊,40,29-46。
吳紹群(2002)。內容分析法與圖書館學研究。圖資與檔案學刊,40,47-61。
林雯瑤(2020年5月8日)。當學術期刊編輯遇上圖書館員。CONCERT 2020 研習暨座談會。https://concert.stpi.narl.org.tw/uploads/schedule_file/speaker_file/file/561/%E6%9E%97%E9%9B%AF%E7%91%A4%E6%95%99%E6%8E%88%E8%AC%9B%E7%BE%A9_renew.pdf
周倩、潘璿安(2020)。我捏造了一系列實驗!荷蘭社會心理學家 Diederik Stapel 假造研究數據案(上)。科技部研究誠信電子報,39,3-14。
教育部(2023)。112學年度大專校院一覽表:心理學系。大專校院一覽表。https://udb.moe.edu.tw/ulist/ISCED
齊力、林本炫(2005)。質性研究方法與資料分析。南華教社所。
Allard, S., Mack, T. R., & Feltner-Reichert, M. (2005). The librarian’s role in institutional repositories: A content analysis of the literature. Reference Services Review, 33(3), 325-336. https://doi.org/10.1108/00907320510611357
Arizona State Library, Archives and Public Records (2023, March 29). Research data: Reproducibility librarian. Arizona State Library. https://azlibrary.gov/jobline/job/7074
Association of College & Research Libraries (2023, September 1). Scholarly communication toolkit: Scholarly communication overview. ACRL LibGuides. https://acrl.libguides.com/scholcomm/toolkit/
Babbie, E. (2021)。研究方法:基礎理論與技巧(蔡毓智譯;三版)。新加坡商聖智學習。(原著出版於2011年)
Baker, M. (2015a, April 30). First results from psychology’s largest reproducibility test. Nature. https://doi.org/10.1038/nature.2015.17433
Baker, M. (2015b, August 27). Over half of psychology studies fail reproducibility test. Nature. https://doi.org/10.1038/nature.2015.18248
Bakker, M., Veldkamp, C. L. S., van Assen, M. A. L. M., Crompvoets, E. A. V., Ong, H. H., Nosek, B. A., Soderberg, C. K., Mellor, D., & Wicherts, J. M. (2020). Ensuring the quality and specificity of preregistrations. PLOS Biology, 18(12). https://doi.org/10.1371/journal.pbio.3000937
Barbour, R. (2010)。焦點團體研究法(張可婷譯)。韋伯。(原著出版於2007年)
Begley, C., & Ellis, L. (2012). Raise standards for preclinical cancer research. Nature, 483(7391), 531-533. https://doi.org/10.1038/483531a
Berelson, B. (1952). Content analysis in communication research. Free Press.
Bosnjak, M., Fiebach, C. J., Mellor, D., Mueller, S., O'Connor, D. B., Oswald, F. L., & Sokol-Chang, R. I. (2022). A template for preregistration of quantitative research in psychology: Report of the joint psychological societies preregistration task force. American Psychologist, 77(4), 602-615. https://doi.org/10.1037/amp0000879
Camerer, C. F., Dreber, A., Holzmeister, F., Ho, T.-H., Huber, J., Johannesson, M., Kirchler, M., Nave, G., Nosek, B. A., Pfeiffer, T., Altmejd, A., Buttrick, N., Chan, T., Chen, Y., Forsell, E., Gampa, A., Heikensten, E., Hummer, L., Imai, T., … Wu, H. (2018). Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 2, 637-644. https://doi.org/10.1038/s41562-018-0399-z
Chan, A.-W., Hróbjartsson, A., Haahr, M. T., Gøtzsche, P. C., & Altman, D. G. (2004). Empirical evidence for selective reporting of outcomes in randomized trials: Comparison of protocols to published articles. JAMA, 291(20), 2457-2465. https://doi.org/10.1001/jama.291.20.2457
Chambers, C. D., & Tzavella, L. (2022). The past, present and future of Registered Reports. Nature Human Behaviour, 6, 29-42. https://doi.org/10.1038/s41562-021-01193-7
Charmaz, K. (2009)。建構扎根理論(顏寧、黃詠光、吳欣隆譯)。五南。(原著出版於2006年)
Chavan, V., & Penev, L. (2011). The data paper: A mechanism to incentivize data publishing in biodiversity science. BMC Bioinformatics, 12. https://doi.org/10.1186/1471-2105-12-S15-S2
Claesen, A., Gomes, S., Tuerlinckx, F., & Vanpaemel, W. (2021). Comparing dream to reality: An assessment of adherence of the first generation of preregistered studies. Royal Society Open Science, 8(10). https://doi.org/10.1098/rsos.211037
Conry-Murray, C., McConnon, A., & Bower, M. (2022). The effect of preregistration and p-value patterns on trust in psychology and biology research. Collabra: Psychology, 8(1). https://doi.org/10.1525/collabra.36306
Cragin, M. H., Palmer, C. L., Carlson, J., & Witt, M. (2010). Data sharing, small science and institutional repositories. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 368(1926), 4023-4038.
Cruwell, S., & Evans, N. J. (2021). Preregistration in diverse contexts: A preregistration template for the application of cognitive models. Royal Society Open Science, 8(10). https://doi.org/10.1098/rsos.210155
Dejong, M., & Schellens, P. J. (1997). Reader-focused text evaluation: An overview of goals and methods. Journal of Business and Technical Communication, 11(4), 402-432. https://doi.org/10.1177/1050651997011004003
Eder, A. B., & Frings, C. (2021). Registered Report 2.0: The PCI RR initiative. Experimental Psychology, 68(1), 1-3. https://doi.org/10.1027/1618-3169/a000512
Ewart, R., Lausen, H., & Millian, N. (2009). Undisclosed changes in outcomes in randomized controlled trials: An observational study. Annals of Family Medicine, 7(6), 542-546. https://doi.org/10.1370/afm.1017
Field, S. M., Wagenmakers, E.-J., Kiers, H. A. L., Hoekstra, R., Ernst, A. F., & van Ravenzwaaij, D. (2020). The effect of preregistration on trust in empirical research findings: Results of a registered report. Royal Society Open Science, 7(4). https://doi.org/10.1098/rsos.181351
Fleming, P. S., Koletsi, D., Dwan, K., & Pandis, N. (2015). Outcome discrepancies and selective reporting: Impacting the leading journals. PLOS ONE, 10(5). https://doi.org/10.1371/journal.pone.0127495
Fraser, H., Parker, T., Nakagawa, S., Barnett, A., & Fidler, F. (2018). Questionable research practices in ecology and evolution. PLOS ONE, 13(7). https://doi.org/10.1371/journal.pone.0200303
Fu, K. K., Yang, M. C., & Wood, K. L. (2016). Design principles: Literature review, analysis, and future directions. Journal of Mechanical Design, 138(10). https://doi.org/10.1115/1.4034105
Ganier, F. (2004). Factors affecting the processing of procedural instructions: Implications for document design. IEEE Transactions on Professional Communication, 47(1), 15-26. https://doi.org/10.1109/TPC.2004.824289
Gilbert, D. T., King, G., Pettigrew, S., & Wilson, T. D. (2016). Comment on "Estimating the reproducibility of psychological science". Science, 351(6277), 1037. https://doi.org/10.1126/science.aad7243
Glaser, B., & Strauss, A. (1967). The discovery of grounded theory: Strategies for qualitative research. Sociology Press.
Grossetta Nardini, H. K., Batten, J., Funaro, M. C., Garcia-Milian, R., Nyhan, K., Spak, J. M., Wang, L., & Glover, J. G. (2019). Librarians as methodological peer reviewers for systematic reviews: Results of an online survey. Research Integrity and Peer Review, 4. https://doi.org/10.1186/s41073-019-0083-5
Hagger, M. S., Chatzisarantis, N. L. D., Alberts, H. C. O., Calvillo, D. P., Campbell, W. K., Cannon, P. R., Carlucci, M., Carruth, N. P., Cheung, T., Crowell, A., De Ridder, D. T. D., Dewitte, S. M., Elson, M., Evans, J. R., Fay, B. A., Fennis, B. M., Finley, A., Francis, Z., Heise, E., … Zwienenberg, M. (2016). A multilab preregistered replication of the ego-depletion effect. Perspectives on Psychological Science, 11(4), 546-573. https://doi.org/10.1177/1745691616652873
Hardwicke, T. E., & Ioannidis, J. P. A. (2018). Mapping the universe of registered reports. Nature Human Behaviour, 2(11), 793-796. https://doi.org/10.1038/s41562-018-0444-y
Haven, T. L., & van Grootel, L. (2019). Preregistering qualitative research. Accountability in Research, 26(3), 229-244. https://doi.org/10.1080/08989621.2019.1580147
Heirene, R., LaPlante, D., Louderback, E. R., Keen, B., Bakker, M., Serafimovska, A., & Gainsbury, S. M. (2021). Preregistration specificity & adherence: A review of preregistered gambling studies & cross-disciplinary comparison [Preprint]. https://doi.org/10.31234/osf.io/nj4es
Hettne, K., Proppert, R., Nab, L., Rojas-Saunero, L. P., & Gawehns, D. (2020). ReprohackNL 2019: How libraries can promote research reproducibility through community engagement. IASSIST Quarterly, 44(1-2), 1-10. https://doi.org/10.29173/iq977
Huma, B., & Joyce, J. B. (2023). ‘One size doesn't fit all’: Lessons from interaction analysis on tailoring open science practices to qualitative research. British Journal of Social Psychology, 62(4), 1590-1604. https://doi.org/10.1111/bjso.12568
IBM. (2023, November 2). What are large language models (LLMs)? IBM. https://www.ibm.com/topics/large-language-models
Ikeda, A., Xu, H., Fuji, N., Zhu, S., & Yamada, Y. (2019). Questionable research practices following pre-registration. Japanese Psychological Review, 62(3), 281-295. https://doi.org/10.24602/sjpr.62.3_281
Ioannidis, J., Allison, D., Ball, C., Coulibaly, I., Cui, X., Culhane, A., Falchi, M., Furlanello, C., Game, L., Jurman, G., Mangion, J., Mehta, T., Nitzberg, M., Page, G., Petretto, E., & van Noort, V. (2009). Repeatability of published microarray gene expression analyses. Nature Genetics, 41, 149-155. https://doi.org/10.1038/ng.295
Kern, F. G., & Gleditsch, K. S. (2017). Exploring pre-registration and pre-analysis plans for qualitative inference [Preprint]. https://www.researchgate.net/publication/319141144_Exploring_Pre-registration_and_Pre-analysis_Plans_for_Qualitative_Inference
Krishna, A. (2021). The need for synergy in academic policies: An introduction to the dialogue on pre-registration. Journal of Consumer Psychology, 31(1), 146-150. https://doi.org/10.1002/jcpy.1211
Krueger, R. A., & Casey, M. A. (2000). Focus groups: A practical guide for applied research. Sage Publications.
LaPolla, F. W. Z., Bakker, C. J., Exner, N., Montnech, T., Surkis, A., & Ye, H. (2022). Rigor and reproducibility instruction in academic medical libraries. Journal of the Medical Library Association, 110(3), 281-293. https://doi.org/10.5195/jmla.2022.1443
Lee, J.-S., & Jeng, W. (2019). The landscape of archived studies in a social science data infrastructure: Investigating the ICPSR metadata records. Proceedings of the Association for Information Science and Technology, 56, 147-156. https://doi.org/10.1002/pra2.62
Lee, S. M. (2021, August 26). A famous honesty researcher is retracting a study over fake data. BuzzFeed News. https://www.buzzfeednews.com/article/stephaniemlee/dan-ariely-honesty-study-retraction
Lin, L., & Evans, S. (2012). Structural patterns in empirical research articles: A cross-disciplinary study. English for Specific Purposes, 31(3), 150-160. https://doi.org/10.1016/j.esp.2011.10.002
Liu, L., & Liu, W. (2023). The engagement of academic libraries in open science: A systematic review. The Journal of Academic Librarianship, 49(3). https://doi.org/10.1016/j.acalib.2023.102711
MacEachern, S. N., & Van Zandt, T. (2019). Preregistration of modeling exercises may not be useful. Computational Brain & Behavior, 2, 179-182. https://doi.org/10.1007/s42113-019-00038-x
Maier, M. (2017). Content analysis: Advantages and disadvantages. The SAGE Encyclopedia of Communication Research Methods, 4, 240-242. https://doi.org/10.4135/9781483381411
Mathieu, S., Boutron, I., Moher, D., Altman, D. G., & Ravaud, P. (2009). Comparison of registered and published primary outcomes in randomized controlled trials. JAMA, 302(9), 977-984. https://doi.org/10.1001/jama.2009.1242
McKinsey & Company (2024, March 22). What is prompt engineering? McKinsey & Company. https://www.mckinsey.com/featured-insights/mckinsey-explainers/what-is-prompt-engineering
McIntyre, L. (2021)。科學態度(楊惟芬譯)。國立陽明交通大學出版社。(原著出版於2019年)
Mertens, G., & Krypotos, A.-M. (2019). Preregistration of analyses of preexisting data. Psychologica Belgica, 59(1), 338-352. https://doi.org/10.5334/pb.493
Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., du Sert, N. P., Simonsohn, U., Wagenmakers, E.-J., Ware, J. J., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1. https://doi.org/10.1038/s41562-016-0021
Nielsen, J. (2012, January 3). Usability 101: Introduction to usability. Nielsen Norman Group. https://www.nngroup.com/articles/usability-101-introduction-to-usability/
Nosek, B. A., Ebersole, C. R., DeHaven, A. C., & Mellor, D. T. (2018). The preregistration revolution. Proceedings of the National Academy of Sciences, 115(11), 2600-2606. https://doi.org/10.1073/pnas.1708274114
Nosek, B. A., & Lindsay, D. S. (2018, February 28). Preregistration becoming the norm in psychological science. APS Observer, 31(3). https://www.psychologicalscience.org/observer/preregistration-becoming-the-norm-in-psychological-science
Nosek, B. A., Beck, E. D., Campbell, L., Flake, J. K., Hardwicke, T. E., Mellor, D. T., van't Veer, A. E., & Vazire, S. (2019). Preregistration is hard, and worthwhile. Trends in Cognitive Sciences, 23(10), 815-818. https://doi.org/10.1016/j.tics.2019.07.009
Ofosu, G., & Posner, D. (2023). Pre-analysis plans: An early stocktaking. Perspectives on Politics, 21(1), 174-190. https://doi.org/10.1017/S1537592721000931
Ogungbeni, J. I., Obiamalu, A. R., Ssemambo, S., & Bazibu, C. M. (2018). The roles of academic libraries in propagating open science: A qualitative literature review. Information Development, 34(2), 113-121. https://doi.org/10.1177/0266666916678444
Open Science Collaboration (2015). Estimating the reproducibility of psychological science. Science, 349(6251). https://doi.org/10.1126/science.aac4716
Palmer, C. (2018, May 1). How to review a manuscript. Monitor on Psychology, 49(5). https://www.apa.org/monitor/2018/05/review-manuscript
Patel, S. C., Drury, C. G., & Prabhu, P. (1993). Design and usability evaluation of work control documentation. Proceedings of the Human Factors and Ergonomics Society 37th Annual Meeting, 1156-1160.
Pham, M. T., & Oh, T. T. (2021). Preregistration is neither sufficient nor necessary for good science. Journal of Consumer Psychology, 31(1), 163-176. https://doi.org/10.1002/jcpy.1209
PLOS. (2024). Peer review training resources. PLOS. https://plos.org/resource/peer-review-training-resources/
Rethlefsen, M. L., Farrell, A. M., Osterhaus Trzasko, L. C., & Brigham, T. J. (2015). Librarian co-authors correlated with higher quality reported search strategies in general internal medicine systematic reviews. Journal of Clinical Epidemiology, 68(6), 617-626. https://doi.org/10.1016/j.jclinepi.2014.11.025
Sarafoglou, A., Hoogeveen, S., & Wagenmakers, E.-J. (2023). Comparing analysis blinding with preregistration in the many-analysts religion project. Advances in Methods and Practices in Psychological Science, 6(1). https://doi.org/10.1177/25152459221128319
Sayre, F., & Riegelman, A. (2018). The reproducibility crisis and academic libraries. College & Research Libraries, 79(1). https://doi.org/10.5860/crl.79.1.2
Sayre, F., & Riegelman, A. (2019). Replicable services for reproducible research: A model for academic libraries. College & Research Libraries, 80(2), 260-272. https://doi.org/10.5860/crl.80.2.260
Schneider, J., Backfisch, I., & Lachner, A. (2022). Facilitating open science practices for research syntheses: PreregRS guides preregistration. Research Synthesis Methods, 13(2), 284-289. https://doi.org/10.1002/jrsm.1540
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22(11), 1359-1366. https://doi.org/10.1177/0956797611417632
Stanovich, K. E. (2018)。這才是心理學(楊中芳譯)。遠流。(原著出版於2018年)
Steeves, V. (2017). Reproducibility librarianship. Collaborative Librarianship, 9(2). https://digitalcommons.du.edu/collaborativelibrarianship/vol9/iss2/4
Stodden, V., Bailey, D. H., Borwein, J. M., LeVeque, R. J., Rider, W. J., & Stein, W. A. (2013). Setting the default to reproducible: Reproducibility in computational and experimental mathematics. https://icerm.brown.edu/topical_workshops/tw12-5-rcem/icerm_report.pdf
Strauss, A., & Corbin, J. (2001)。質性研究入門:紮根理論研究方法(吳芝儀、廖梅花譯)。濤石。(原著出版於1998年)
TARG Meta-Research Group and Collaborators (2022). Discrepancy review: A feasibility study of a novel peer review intervention to reduce undisclosed discrepancies between registrations and publications. Royal Society Open Science, 9(7). https://doi.org/10.1098/rsos.220142
Tenopir, C., Allard, S., Douglass, K., Aydinoglu, U., Wu, L., Read, E., Manoff, M., & Frame, M. (2011). Data sharing by scientists: Practices and perceptions. PLoS ONE, 6(6). https://doi.org/10.1371/journal.pone.0021101
Tenopir, C., Dalton, E. D., Allard, S., Frame, M., Pjesivac, I., Birch, B., Pollock, D., & Dorsett, K. (2015). Changes in data sharing and data reuse practices and perceptions among scientists worldwide. PLoS ONE, 10(8). https://doi.org/10.1371/journal.pone.0134826
Thibault, R. T., Pennington, C. R., & Munafò, M. R. (2023). Reflections on preregistration: Core criteria, badges, complementary workflows. Journal of Trial & Error. https://doi.org/10.36850/mr6
Tzanova, S. (2020). Changes in academic libraries in the era of open science. Education for Information, 36(3), 281-299. https://doi.org/10.3233/EFI-190259
UNESCO. (2023, September 21). UNESCO Recommendation on Open Science. UNESCO. https://www.unesco.org/en/open-science/about?hub=686
van den Akker, O. R. (2019). Protocol for Matching 3.0. Open Science Framework. https://osf.io/ta3yd/
van den Akker, O. R., Weston, S., Campbell, L., Chopik, B., Damian, R., Davis-Kean, P., Hall, A., Kosie, J., Kruse, E., Olsen, J., Ritchie, S., Valentine, K. D., Veer, A. V., & Bakker, M. (2021). Preregistration of secondary data analysis: A template and tutorial. Meta-Psychology, 5. https://doi.org/10.15626/MP.2020.2625
van den Akker, O. R., van Assen, M. A. L. M., Enting, M., Jonge, M., Ong, H. H., Rüffer, F., Schoenmakers, M., Stoevenbelt, A. H., Wicherts, J., & Bakker, M. (2023a). Selective hypothesis reporting in psychology: Comparing preregistrations and corresponding publications. Advances in Methods and Practices in Psychological Science, 6(3). https://doi.org/10.1177/25152459231187988
van den Akker, O. R., Bakker, M., Assen, M., Pennington, C., Verweij, L., Elsherif, M., Claesen, A., Gaillard, S., Yeung, S. K., Frankenberger, J.-L., Krautter, K., Cockcroft, J., Kreuer, K., Evans, T., Heppel, F., Schoch, S., Korbmacher, M., Yamada, Y., Albayrak-Aydemir, N., … Wicherts, J. (2023b). The effectiveness of preregistration in psychology: Assessing preregistration strictness and preregistration-study consistency [Preprint]. https://doi.org/10.31222/osf.io/h8xjw
Vassar, M., Roberts, W., Cooper, C. M., Wayant, C., & Bibens, M. (2020). Evaluation of selective outcome reporting and trial registration practices among addiction clinical trials. Addiction, 115(6), 1172-1179. https://doi.org/10.1111/add.14902
Vitale, C. R. H. (2016). Is research reproducibility the new data management for libraries? Bulletin of the Association for Information Science and Technology, 42(3), 38-41. https://doi.org/10.1002/bul2.2016.1720420313
Wagenmakers, E.-J., Wetzels, R., Borsboom, D., van der Maas, H. L. J., & Kievit, R. A. (2012). An agenda for purely confirmatory research. Perspectives on Psychological Science, 7(6), 632-638. https://doi.org/10.1177/1745691612463078
Wagenmakers, E.-J., & Evans, N. (2018, November 26). “Don’t interfere with my art”: On the disputed role of preregistration in exploratory model building. Bayesian Spectacles. https://www.bayesianspectacles.org/dont-interfere-with-my-art-on-the-disputed-role-of-preregistration-in-exploratory-model-building/
Wicherts, J. M., Veldkamp, C. L. S., Augusteijn, H. E. M., Bakker, M., van Aert, R. C. M., & van Assen, M. A. L. M. (2016). Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking. Frontiers in Psychology, 7. https://doi.org/10.3389/fpsyg.2016.01832
Wilkinson, M., Dumontier, M., Aalbersberg, I., Appleton, G., Axton, M., Baak, A., Blomberg, N., Boiten, J.-W., Santos, L. O. B. D. S., Bourne, P. E., Bouwman, J., Brookes, A. J., Clark, T., Crosas, M., Dillo, I., Dumon, O., Edmunds, S., Evelo, C. T., Finkers, R., … Mons, B. (2016). The FAIR Guiding Principles for scientific data management and stewardship. Scientific Data, 3. https://doi.org/10.1038/sdata.2016.18
Willroth, E. C., & Atherton, O. E. (2024). Best laid plans: A guide to reporting preregistration deviations. Advances in Methods and Practices in Psychological Science, 7(1). https://doi.org/10.1177/25152459231213802 | - |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/96082 | - |
| dc.description.abstract | 近年來,科學界發現許多已發表於期刊的研究結果難以被他人重現,引發對科學知識長遠發展的擔憂。為提升研究通透度,預註冊(preregistration)逐漸被視為用以區別探索性與驗證性研究的實踐方法。然而,愈來愈多的研究表明預註冊計畫與最終發表的論文之間,往往存在未被揭露的不一致之處,可能影響研究的可信度,且期刊通常缺乏足夠的人力或資源進行預註冊遵循度審查。在開放科學概念的推廣下,學術圖書館憑藉其成熟的系統與豐富的資源,也逐漸成為開放科學服務的提供者,值得探索這類非特定學科領域的開放科學專家,在預註冊遵循度評估中所能協助的潛在角色。
對此,本研究以心理學為例,探討開放科學專家在協助評估預註冊偏差的潛力。在第一部分透過內容分析法,依循了一群心理學專家評估預註冊偏差的框架,對25份預註冊計畫和文章進行分析,比較開放科學專家與心理學專家對預註冊偏差的評估差異。第二部分則採用焦點團體法,討論開放科學專家與心理學專家對預註冊審查協議的形式偏好,探索可如何提供審查者更有效的支持。
研究結果顯示,開放科學專家與心理學專家對相同的預註冊計畫樣本的遵循度評估,一致率為72.6%。兩者評估的預註冊審查項目中,結果最為分歧的為研究的排除標準,且開放科學專家在不熟悉心理學的隱性規範下,可能難以辨識領域特定的知識,進而將描述較為模糊的內容視為非偏差。此外,關於對預註冊審查協議的形式偏好,開放科學專家較為傾向有提供具體操作指示的審查協議,心理學專家則更重視審查過程中的自主性。不過整體上,兩群專家皆認為理想的預註冊審查協議,應結合結構式與開放式的回應,以便審查者能清晰地闡釋其審查判定理由。
針對上述研究發現,本研究歸納出可通用於不同專業背景的審查者之預註冊審查協議設計原則,共涵蓋三個面向,第一,「賦能」審查者,使其具備足夠知識與能力進行審查;第二,「促進效率與效能」,透過降低審查者的認知負荷來優化審查過程;第三,「深化互動」,確保審查者的審查意見能充分回饋至被審稿者。藉由整理出預註冊審查協議的設計原則,有助於未來期刊重視預註冊遵循度的議題時,能有效地支持更多人參與審查以及完成審查任務。 | zh_TW |
| dc.description.abstract | As scientific fields have discovered that many published research results are difficult to replicate, concerns about the long-term development of scientific knowledge have arisen. To enhance research transparency, preregistration has gradually been recognized as a practice for distinguishing between exploratory and confirmatory research. However, increasing evidence shows that there are often undisclosed deviations between preregistration plans and the final published papers, which can undermine research credibility. Furthermore, journals typically lack sufficient resources to review preregistration adherence.
With the promotion of open science, academic libraries, drawing on their well-established systems and abundant resources, are becoming providers of open science services. This makes it worth exploring the potential role that open science experts, who are not tied to any specific discipline, could play in evaluating preregistration adherence. Taking psychology as an example, this study investigates the potential of open science experts to assist in evaluating preregistration deviations. The first part employed content analysis, following a framework previously used by psychology experts to assess preregistration deviations: twenty-five preregistration plans and their corresponding articles were analyzed to compare the deviation assessments of open science experts with those of psychology experts. The second part used focus groups to discuss both groups' preferences regarding the format of preregistration review protocols and to explore how reviewers could be supported more effectively.
Results show that the agreement rate between open science experts and psychology experts on adherence assessments of the same preregistration samples was 72.6%. The largest discrepancies concerned the studies' exclusion criteria; open science experts, unfamiliar with the implicit norms of psychology, may struggle to recognize domain-specific knowledge and may therefore judge ambiguously described content as non-deviating. Regarding format preferences, open science experts favored review protocols that provide specific operational instructions, whereas psychology experts valued autonomy in the review process. Overall, however, both groups agreed that an ideal protocol should combine structured and open-ended responses, allowing reviewers to clearly articulate the reasoning behind their review decisions.
Based on these findings, this study outlines design principles for preregistration review protocols that can be applied across professional backgrounds, covering three aspects: first, “empowering” reviewers by ensuring they have sufficient knowledge and capability to conduct reviews; second, “enhancing efficiency and effectiveness” by reducing reviewers’ cognitive load during the review process; and third, “deepening interaction” to ensure that reviewers’ comments are fully communicated to authors. By establishing these design principles, the study aims to support broader participation in preregistration review as journals come to prioritize preregistration adherence. | en |
| dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2024-10-14T16:06:41Z No. of bitstreams: 0 | en |
| dc.description.provenance | Made available in DSpace on 2024-10-14T16:06:41Z (GMT). No. of bitstreams: 0 | en |
| dc.description.tableofcontents | 謝辭 I
摘要 III
ABSTRACT V
圖次 X
表次 XI
第一章 緒論 1
第一節 研究背景與動機 1
第二節 研究目的與研究問題 6
第二章 文獻回顧 8
第一節 預註冊對科學界的影響 8
一、預註冊的重要性 8
二、預註冊適用的研究範圍限制 10
三、心理學的預註冊採用情形 12
第二節 預註冊遵循度評估 13
一、預註冊遵循度評估的問題 13
二、提升預註冊遵循度的工具 15
第三節 學術圖書館與開放科學的關係 20
一、學術圖書館在開放科學扮演的角色 21
二、學術圖書館之開放科學服務 22
第三章 研究設計與實施 27
第一節 整體研究架構與方法 27
第二節 子研究一:預註冊研究遵循度評估差異 30
一、子研究一—研究方法 30
二、子研究一—研究標的與抽樣 31
三、子研究一—資料蒐集與分析 32
第三節 子研究二:預註冊項目重要性與預註冊審查協議形式 38
一、子研究二—研究方法 39
二、子研究二—研究對象與招募方式 39
三、子研究二—資料蒐集 41
四、子研究二—研究工具設計與實施 41
五、子研究二—研究執行流程 45
六、子研究二—資料處理與分析 48
第四章 子研究一:預註冊遵循度研究評估差異結果與發現 54
第一節 心理學專家與開放科學專家判定預註冊之差異 54
一、判定標準差異 56
二、Claesen等(2021)判定標準的制訂與執行的一致性問題 57
三、領域知識差異 57
第二節 開放科學專家判定預註冊偏差上的難題 57
一、預註冊本身的格式與存放位置 58
二、作者書寫風格 58
三、審查者的背景知識 60
四、小結 60
第五章 子研究二:預註冊審查協議偏好研究結果與發現 62
第一節 受訪者背景資訊 62
第二節 專家對預註冊審查協議之偏好 65
一、結構式預註冊審查協議好用性評估 68
二、半結構式預註冊審查協議好用性評估 76
三、小結 81
第三節 預註冊審查協議設計要點 82
一、賦能 85
二、執行效率與效能 87
三、深化互動 89
第四節 專家對預註冊項目的重要性排序 91
一、預註冊項目對心理學領域內讀者的重要性 92
二、預註冊項目對心理學領域外讀者的重要性 95
第六章 綜合討論與結論 97
第一節 研究結果討論 97
一、影響專家評估差異的三大面向:審查者、預註冊範本及作者 97
二、專家對預註冊審查協議偏好的差異與共通性 98
三、預註冊項目的重要性對預註冊遵循度及審查共識的影響 99
第二節 結論與建議 100
一、結論 100
二、研究限制 101
三、未來研究方向 103
四、研究貢獻 104
參考文獻 106
附錄一 結構式預註冊審查協議 119
附錄二 半結構式預註冊審查協議 129
附錄三 討論題綱 135
附錄四 子研究一預註冊審查清單 141 | - |
| dc.language.iso | zh_TW | - |
| dc.title | 心理學專家與開放科學專家對預註冊遵循度辨識的差異性 | zh_TW |
| dc.title | Differences in Preregistration Adherence Recognition Between Psychology Experts and Open Science Experts | en |
| dc.type | Thesis | - |
| dc.date.schoolyear | 113-1 | - |
| dc.description.degree | 碩士 | - |
| dc.contributor.oralexamcommittee | 唐牧群;林君昱 | zh_TW |
| dc.contributor.oralexamcommittee | Muh-Chyun Tang;Chun-Yu Lin | en |
| dc.subject.keyword | 開放科學,預註冊,心理學,學術傳播,學術圖書館 | zh_TW |
| dc.subject.keyword | open science, preregistration, psychology, scholarly communication, academic library | en |
| dc.relation.page | 143 | - |
| dc.identifier.doi | 10.6342/NTU202404429 | - |
| dc.rights.note | 同意授權(全球公開) | - |
| dc.date.accepted | 2024-09-30 | - |
| dc.contributor.author-college | 文學院 | - |
| dc.contributor.author-dept | 圖書資訊學系 | - |
Appears in Collections: 圖書資訊學系
Files in This Item:
| File | Size | Format |
|---|---|---|
| ntu-113-1.pdf | 3.06 MB | Adobe PDF |