NTU Theses and Dissertations Repository
Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/94301
Title: 數位平台內容審查之治理研析
Research and Analysis on the Governance of Digital Platform Content Moderation
Authors: 繆欣儒
Hsin-Ju Miao
Advisor: 李建良
Chien-Liang Lee
Keywords: digital platforms, content moderation, algorithms, automated decision-making, co-regulation, transparency, accountability, due process
Publication Year: 2024
Degree: Master
Abstract: In the current digital era, digital platforms have gradually become the primary means of information exchange, from individual self-expression to commercial advertising, and an indispensable pillar of people's daily lives. Indeed, to keep pace with a constantly changing and innovating society, people have little choice but to participate in these platforms. However, as digital platforms grow more popular, the sheer volume of information, the speed of its transmission, and its unlimited reach have also led to explosive growth in illegal and harmful content online, threatening public order and social welfare.
To control the spread of illegal content online, some countries have turned to increasing the liability of digital platforms, and platforms themselves have adopted automated tools for content moderation. Yet, owing to the factors weighted in review and the limitations of the technology, such moderation may result in excessive and improper restriction of online speech. This paper therefore examines a framework for the governance of content moderation by digital platforms.
This paper begins by analyzing the factors behind, types of, and technical limitations of content moderation on digital platforms, and summarizes the systemic risks that such moderation may create, which are the problems that content moderation governance must address. It then highlights the conflicts of fundamental rights among the various stakeholders and explains why protecting only one side is untenable.
Next, it surveys the governance frameworks proposed by different regulatory actors in various countries and analyzes their strengths and weaknesses. It then proposes possible solutions, arguing that enacting dedicated legislation is the more appropriate approach. Examining both the form and the content of regulation, it concludes that a co-regulation model that emphasizes procedural requirements for content moderation, rather than restrictions on the content itself, is less likely to raise constitutional concerns.
In addition, this paper suggests that, under the current laws and regulations, the courts can provide a sufficient procedural framework through judgments in individual cases to protect users, affected parties, and the public interest.
Finally, taking the EU Digital Services Act as a reference and comparing it with the draft Digital Intermediary Services Act, this paper recommends retaining certain features of the draft, namely multi-stakeholder participation in combating illegal content, strengthened transparency and accountability in platform content moderation, and alternative dispute resolution mechanisms, and urges prompt legislation to address the systemic risks that content moderation may create.
URI: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/94301
DOI: 10.6342/NTU202402021
Full-text Authorization: Not authorized
Appears in Collections: Department of Law

Files in This Item:
File: ntu-112-2.pdf (currently not authorized for public access)
Size: 3.18 MB
Format: Adobe PDF


Unless their copyright terms are otherwise specified, all items in this repository are protected by copyright, with all rights reserved.
