Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/70848
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 陳慧玲(Huey-Ling Chen) | |
dc.contributor.author | Kuei-Ting Tung | en |
dc.contributor.author | 董奎廷 | zh_TW |
dc.date.accessioned | 2021-06-17T04:40:51Z | - |
dc.date.available | 2018-09-06 | |
dc.date.copyright | 2018-09-06 | |
dc.date.issued | 2018 | |
dc.date.submitted | 2018-08-06 | |
dc.identifier.citation | 1. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach 2007;29(7):648-54. 2. Frank JR, Snell LS, Cate OT, Holmboe ES, et al. Competency-based medical education: theory to practice. Med Teach 2010;32(8):638-45.
3. Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach 2007;29(7):642-7. 4. Hsieh BS. Primary care training as basis for clinical education. J Med Education 2000;4:273-4. 5. Chen YY, Wu CC, Hsu HC. The postgraduate general medical training in Taiwan: past, present and future. J Med Education 2013;17:80-91. 6. Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990;65(9):63-67. 7. Mehay R. The Essential Handbook for GP Training and Education. 2009. Chapter 29: Assessment and Competence, p. 414. 8. Toolbox of Assessment Methods. Accreditation Council for Graduate Medical Education (ACGME) and American Board of Medical Specialties (ABMS). Version 1.1. https://www.partners.org/Assets/Documents/Graduate-Medical-Education/ToolTable.pdf 9. Swing SR. Assessing the ACGME general competencies: general considerations and assessment methods. Acad Emerg Med 2002;9:1278-1288. 10. Frenk J, Chen L, Bhutta ZA, Cohen J, et al. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet 2010;376(9756):1923-58. 11. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review. JAMA 2009;302(12):1316-26. 12. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J 1975;1:447-51. 13. Khan KZ, Ramachandran S, Gaunt K, Pushkar P. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part I: an historical and theoretical perspective. Med Teach 2013;35(9):e1437-e1446. 14. Patrício M, et al. Systematic review on the feasibility, reliability and validity of OSCE in undergraduate medical students. https://bemecollaboration.org/Reviews+In+Progress/OSCE/ 15.
Sloan DA, Donnelly MB, Schwartz RW, Strodel WE. The Objective Structured Clinical Examination: the new gold standard for evaluating postgraduate clinical performance. Ann Surg 1995;222(6):735-42. 16. Petrusa ER, Blackwell TA, Ainsworth MA. Reliability and validity of an objective structured clinical examination for assessing the clinical performance of residents. Arch Intern Med 1990;150:573-7. 17. Joorabchi B, Devries JM. Evaluation of clinical competence: the gap between expectation and performance. Pediatrics 1996;97(2):179-84. 18. Kreptul D, Thomas RE. Family medicine resident OSCEs: a systematic review. Educ Prim Care 2016;27(6):471-47. 19. Patrício MF, Julião M, Fareleira F, Carneiro AV. Is the OSCE a feasible tool to assess competencies in undergraduate medical education? Med Teach 2013;35(6):503-14. 20. Pell G, Fuller R, Homer M, Roberts T. How to measure the quality of the OSCE: a review of metrics. AMEE Guide No. 49. Med Teach 2010;32(10):802-11. 21. Tavakol M, Dennick R. Post-examination interpretation of objective test data: monitoring and improving the quality of high-stakes examinations. AMEE Guide No. 66. Med Teach 2012;34(3):e161-e175. 22. Simmons BJ, Zoghbi Y, Askari M, et al. Significance of objective structured clinical examinations to plastic surgery residency training. Ann Plast Surg 2017;79(3):312-9. 23. McMurray L, Hall AK, Rich J, Merchant S, Chaplin T. The Nightmares Course: a longitudinal, multidisciplinary, simulation-based curriculum to train and assess resident competence in resuscitation. J Grad Med Educ 2017;9(4):503-8. 24. Brannick MT, Erol-Korkmaz HT, Prewett M. A systematic review of the reliability of objective structured clinical examination scores. Med Educ 2011;45:1181-9. 25. Yang YY, Lee FY, Hsu HC, et al. Assessment of first-year post-graduate residents: usefulness of multiple tools.
J Chin Med Assoc 2011;74:531-8. 26. Lin JL, Hsu YW, Lin RL, et al. Small-scale OSCE is useful for evaluation of the ACGME general competencies of PGY1 residents in internal medicine. J Med Education 2014;18:114-23. 27. Downing SM. Validity: on the meaningful interpretation of assessment data. Med Educ 2003;37:830-7. | |
dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/70848 | - |
dc.description.abstract | Background
The objective structured clinical examination (OSCE) has been found reliable and valid for assessing medical students and several postgraduate specialties. However, some scholars have raised concerns about its feasibility and its suitability for assessing competency-based medical education. Data on multi-specialty OSCEs in postgraduate medical education are limited. Taiwan has a unique postgraduate general medicine training and postgraduate year (PGY) residency system, and evidence on the reliability and validity of the OSCE in this population is also limited. Aims and Research Questions 1. Evaluate the feasibility of a multi-specialty OSCE for residents in the postgraduate general medicine training program. 2. Examine the reliability and validity of the OSCE in this population and setting. 3. Explore factors that may affect the validity of this examination. Methods In June 2016 and June 2017, a total of 83 PGY trainees took the end-of-year OSCE at Far Eastern Memorial Hospital. We report the examination design, blueprint, examiner training, standardized patient training, scores, and questionnaire surveys as evidence of technical feasibility. For economic feasibility, we estimated the cost of one examination. Reliability and validity evaluation included across-station internal consistency, within-station item internal consistency, inter-rater reliability, scores and pass rates, and the correlation between item scores and global ratings. We evaluated the effects of the number of stations, examiner training, and multiple examiners on reliability. Results Although the OSCE required considerable resources, it was a feasible examination and received positive responses from faculty and trainees. Across-station reliability (Cronbach's α 0.104~0.464) and across-item reliability (Cronbach's α -0.217 to 0.483) were low. Inter-rater reliability was generally moderate to good, and the correlations between item scores and global ratings were generally high. Using multiple raters increased across-station reliability, and the inter-rater reliability of three raters was higher than that of two. Discussion We provide feasibility evidence for a multi-specialty OSCE for postgraduate general medicine residents at a single teaching hospital; the same assessment format may not be equally feasible at training hospitals of different sizes. At Far Eastern Memorial Hospital, with the support of the hospital's medical education department and faculty, the assessment was technically and economically feasible; it also required close collaboration across specialties. Across-station and across-item reliabilities were low. One possible explanation is that the multi-specialty examination assessed distinct constructs, and the long-station design assessed different core competencies and clinical skills within a single station. We found that using the scores of multiple raters improved across-station reliability. In the literature, other factors that may affect across-station and across-item reliability include the number of stations, the number of checklist items, and the clinical skills assessed. In our examination, inter-rater reliability was generally high, and the inter-rater reliability of three raters was higher than that of two. Overall, the correlation between item scores and global ratings, the high inter-rater reliability, and evidence of content, response process, and internal structure provide reliability and validity evidence for this assessment. Conclusion In Taiwan's postgraduate general medicine training, a multi-specialty OSCE is technically and economically feasible as an assessment model. It can assess multiple core competencies and important clinical skills. Reliability may be affected by various factors, and using two raters is an efficient way to increase reliability. | zh_TW |
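The abstract above notes that averaging the scores of multiple raters improved across-station reliability, and that three raters outperformed two. A standard way to quantify this effect is the Spearman-Brown prophecy formula, which predicts the reliability of the average of m raters from a single-rater reliability. The sketch below is illustrative only; the coefficient values are hypothetical and not taken from the thesis.

```python
def spearman_brown(r_single, m):
    """Predicted reliability of the average of m raters, given the
    reliability r_single of a single rater (Spearman-Brown prophecy)."""
    return m * r_single / (1 + (m - 1) * r_single)

# With a hypothetical single-rater reliability of 0.5, averaging raters gives:
print(round(spearman_brown(0.5, 2), 3))  # two raters -> 0.667
print(round(spearman_brown(0.5, 3), 3))  # three raters -> 0.75
```

The formula shows diminishing returns: the jump from one rater to two is larger than from two to three, which is consistent with the thesis's conclusion that two raters are an efficient way to raise reliability.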
dc.description.abstract | Background
The objective structured clinical examination (OSCE) is known as a reliable and valid assessment of undergraduate medical students and of several postgraduate medical specialties. However, there are concerns regarding its feasibility and its suitability as an assessment for competency-based medical education (CBME). Data on the use of multi-specialty OSCEs for assessment in postgraduate medicine are limited. Taiwan also has a unique postgraduate year (PGY) general medicine training program, and reliability and validity evidence for the OSCE in this population is limited. Aim and Research Questions 1. Evaluate the feasibility of a multi-specialty OSCE in the postgraduate year (PGY) general medicine training program. 2. Examine the reliability and validity of the multi-specialty OSCE in this population and setting. 3. Identify potential factors that affect the reliability of this assessment. Concise Methods In June 2016 and June 2017, 83 PGY residents participated in four separate end-of-year OSCE assessments at Far Eastern Memorial Hospital (FEMH). The design, blueprint, faculty and standardized patient training, outcomes, and questionnaire responses were reported as evidence of technical feasibility. Economic feasibility was evaluated by estimating the cost of administering an OSCE. Reliability and validity evidence was gathered through analysis of across-station, across-item, and inter-rater reliability, as well as score and correlation parameters. The effects of rater training, different station lengths, and multiple raters were examined. Results The OSCE was a feasible but resource-demanding method of assessment, with positive responses and satisfaction from faculty and trainees. Across-station reliability (Cronbach's α 0.104~0.464) and across-item reliability (Cronbach's α -0.217 to 0.483) were low.
Overall, correlation between checklist items and global ratings was good (coefficient of determination R² 0.32~0.907), and inter-rater reliability was moderate to good. Using the scores of multiple raters improved across-station reliability, and the inter-rater reliability of three raters was higher than that of two. Discussion We provide evidence of the feasibility of a multi-specialty OSCE for PGY residents of the general medicine training program at one hospital. However, the same format may not be feasible at all hospitals, depending on the size of each program. It was technically and economically feasible at FEMH because of strong support from the hospital's medical education department and faculty, and close collaboration among the specialties involved. Across-station and across-item reliability were in general low; a potential explanation is the evaluation of distinct constructs due to the multi-specialty design and the measurement of multiple clinical skills and competencies within long stations. We found that having multiple raters improved across-station reliability. Based on previous literature, other potential factors that may affect across-station and across-item reliability include the number of stations, the length of the checklist, and the clinical skill tested. Overall inter-rater reliability was good, and three raters in general improved inter-rater reliability compared with two. Overall, the correlation between checklists and global ratings, the inter-rater reliability, and validity evidence in the form of content, response process, internal structure, and criterion validity support the overall fair reliability and validity of this assessment. Conclusion A multi-specialty OSCE as an end-of-year summative and formative assessment in the Taiwanese PGY residents' general medicine training program is technically and economically feasible. It can be used to assess multiple core competencies and important clinical skills.
Reliability may be affected by various factors, and the use of two raters is an effective way to increase reliability. | en |
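The abstract reports across-station and across-item reliability as Cronbach's α. As a minimal sketch of how that statistic is computed from a score matrix, the following uses only the Python standard library; the score values are hypothetical toy data, not numbers from the thesis.

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a score matrix.

    `scores` is a list of examinees, each a list of k item (or station)
    scores.  alpha = k/(k-1) * (1 - sum of item variances / variance of
    total scores), with population variances applied consistently to
    items and totals.
    """
    k = len(scores[0])
    item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
    total_var = pvariance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Toy data: 5 examinees scored on 4 stations (hypothetical numbers).
station_scores = [
    [3, 4, 3, 5],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
    [1, 2, 2, 2],
    [3, 3, 4, 4],
]
print(round(cronbach_alpha(station_scores), 3))
```

Applied per station (items within a station) this gives across-item reliability, and applied across station totals it gives across-station reliability; low values, as reported here, suggest the stations or items measure distinct constructs.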
dc.description.provenance | Made available in DSpace on 2021-06-17T04:40:51Z (GMT). No. of bitstreams: 1 ntu-107-R05457001-1.pdf: 1757280 bytes, checksum: 64870507e9ad444fa13978ac341e67ad (MD5) Previous issue date: 2018 | en |
dc.description.tableofcontents | Oral defense committee certification i; Acknowledgements ii; Chinese abstract iii; Abstract iv; Chapter 1 Introduction 1; 1.1 Background 1; 1.2 Evaluation of clinical and core competencies 2; 1.3 OSCE for resident evaluation 4; 1.4 Research Question and Aim 5; Chapter 2 Methods 7; 2.1 Study Population and Settings 7; 2.2 Evaluation of OSCE Feasibility 7; 2.3 Examining the OSCE reliability and validity 8; 2.4 Factors that affect OSCE reliability 9; 2.5 Statistical analysis 9; Chapter 3 Results 10; 3.1 Overview, design and feasibility of OSCE 10; 3.2 OSCE scores and reliability analysis 20; 3.3 Effect of multiple raters on reliability 27; Chapter 4 Discussion 29; 4.1 Feasibility of OSCE 30; 4.2 Reliability of OSCE 32; 4.3 Validity of OSCE 36; 4.4 Limitations of OSCE 37; 4.5 Recommendation for improving OSCE 37; Chapter 5 Limitation and recommendation for future research 39; Chapter 6 Conclusion 41; References 42; Appendix 45 | |
dc.language.iso | en | |
dc.title | 畢業後一般醫學訓練住院醫師「客觀結構式臨床技能測驗」的可行性評估 | zh_TW |
dc.title | Evaluating the feasibility of Objective Structured Clinical Examination (OSCE) in the Postgraduate Year (PGY) General Medicine Training Program | en |
dc.type | Thesis | |
dc.date.schoolyear | 106-2 | |
dc.description.degree | Master's | |
dc.contributor.oralexamcommittee | 楊志偉(Chih-Wei Yang),朱宗信(Tzong-Shinn Chu) | |
dc.subject.keyword | objective structured clinical examination, postgraduate general medicine training, feasibility, reliability, validity, OSCE, PGY | zh_TW |
dc.subject.keyword | objective structured clinical examination, OSCE, postgraduate year, PGY, feasibility, reliability, validity | en |
dc.relation.page | 48 | |
dc.identifier.doi | 10.6342/NTU201802505 | |
dc.rights.note | Paid authorization | |
dc.date.accepted | 2018-08-06 | |
dc.contributor.author-college | College of Medicine | zh_TW |
dc.contributor.author-dept | Graduate Institute of Medical Education and Bioethics | zh_TW |
Appears in Collections: | Graduate Institute of Medical Education and Bioethics |
Files in This Item:
File | Size | Format | |
---|---|---|---|
ntu-107-1.pdf (currently not authorized for public access) | 1.72 MB | Adobe PDF |
Unless their copyright terms are otherwise specified, all items in this repository are protected by copyright, with all rights reserved.