Please use this Handle URI to cite this item: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/89079
Full metadata record
dc.contributor.advisor [zh_TW]: 蘇慧婕
dc.contributor.advisor [en]: Hui-chieh Suen
dc.contributor.author [zh_TW]: 鄭詠綺
dc.contributor.author [en]: Yong-Ci Jheng
dc.date.accessioned: 2023-08-16T17:02:37Z
dc.date.available: 2023-11-09
dc.date.copyright: 2023-08-16
dc.date.issued: 2023
dc.date.submitted: 2023-08-11
dc.identifier.citation:
K. Stern(著),蔡宗珍(譯),(2019),〈基本權保護義務之功能──法學上的一大發現〉,《月旦法學雜誌》,175期,頁46-59。
王澤鑑(2012),《人格權法:法釋義學、比較法、案例研究》,自版。
李震山(2020),〈論資訊自決權〉,收於:氏著,《人性尊嚴與人權保障(五版)》,頁239-314,元照。
林子儀(2015),〈公共隱私權〉,收於:國立臺灣大學法律學院(編),《馬漢寶講座論文彙編,第五屆》,頁7-62,財團法人馬氏思上文教基金會。
林子儀(2015),〈隱私權法制的新議題:監控與隱私自我管理〉,收於:國立臺灣大學法律學院(編),《馬漢寶講座論文彙編,第五屆》,頁65-115,財團法人馬氏思上文教基金會。
邱文聰(2009),〈從資訊自決與資訊隱私的概念區分:評「電腦處理個人資料保護法修正草案」的結構性問題〉,《月旦法學雜誌》,168期,頁172-189。
邱文聰(2018),〈初探人工智慧中的個資保護發展趨勢與潛在的反歧視難題〉,收於:劉靜怡(編),《人工智慧相關法律議題芻議》,頁149-175,元照。
邱文聰(2023),〈亦步亦趨的模仿還是超前部署的控制?--AI的兩種能力和它們帶來的挑戰〉,收於:李建良、林文源(編),《人文社會的跨領域AI探索》,頁285-299,國立清華大學出版社。
翁逸泓(2022),〈社群媒體不實訊息之治理:以個資保護模式為選項〉,《中研院法學期刊》,30期,頁171-230。
張陳弘(2018),〈新興科技下的資訊隱私保護:「告知後同意原則」的侷限性與修正方法之提出〉,《臺大法學論叢》,47卷1期,頁201-297。
張陳弘(2021),〈科技智慧防疫與個人資料保護:陌生但關鍵的資料保護影響評估程序〉,《臺大法學論叢》,50卷2期,頁337-400。
陳榮華(2017),《海德格《存在與時間》闡釋》(三版),臺大出版中心。
黃舒芃(2015),〈歐盟基本權利憲章對各會員國之拘束:由新進實務發展與理論爭議反思基本權利保障在歐盟的實踐途徑〉,收於:洪德欽、陳淳文(編),《歐盟法之基礎原則與實務發展(上)》,頁223-254,國立臺灣大學出版中心。
葉俊榮(2016),〈探尋隱私權的空間意涵:大法官對基本權利的脈絡論證〉,《中研院法學期刊》,18期,頁1-40。
劉靜怡(2012),〈社群網路時代的隱私困境:以Facebook為討論對象〉,《臺大法學論叢》,41卷1期,頁1-70。
劉靜怡(2019),〈淺談 GDPR 的國際衝擊及其可能因應之道〉,《月旦法學雜誌》,286期,頁5-31。
劉靜怡(2021),〈說故事的自由?:從歐洲人權法院近年隱私權相關判決談起〉,收於:氏著,《網路時代的隱私保護困境》,頁51-81,元照。
蘇慧婕(2016),〈歐盟被遺忘權的概念發展——以歐盟法院Google Spain v. AEPD判決分析為中心〉,《憲政時代》,41卷4期,頁473-516。
蘇慧婕(2019),〈假訊息管制與資訊揭露義務──以選罷法、公投法及其修正草案為中心〉,《月旦法學雜誌》,292期,頁42-59。
蘇慧婕(2020),〈正當平台程序作為網路中介者的免責要件:德國網路執行法的合憲性評析〉,《臺大法學論叢》,49卷4期,頁1915-1977。
蘇慧婕(2022),〈歐盟被遺忘權的內國保障:德國聯邦憲法法院第一、二次被遺忘權判決評析〉,《臺大法學論叢》,51卷1期,頁1-65。
Acquisti, A., & Grossklags, J. (2005). Privacy and rationality in individual decision making. IEEE Security & Privacy, 3(1), 26-33.
Alexy, R. (2002). A Theory of Constitutional Rights (J. Rivers, Trans.). Oxford University Press.
Andreou, A., Venkatadri, G., Goga, O., Gummadi, K. P., Loiseau, P., & Mislove, A. (2018). Investigating ad transparency mechanisms in social media: A case study of Facebook's explanations. Network and Distributed System Security Symposium 2018.
Ansorge, J. T. (2016). Identify and sort: How digital power changed world politics. Oxford University Press.
Article 29 Data Protection Working Party. (2007). Opinion 4/2007 on the concept of personal data. https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2007/wp136_en.pdf
Article 29 Data Protection Working Party. (2012). Opinion 1/2012 on the data protection reform proposals. https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2012/wp191_en.pdf
Article 29 Data Protection Working Party. (2013). Advice paper on essential elements of a definition and a provision on profiling within the EU General Data Protection Regulation.
Article 29 Data Protection Working Party. (2017). Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679. https://ec.europa.eu/newsroom/just/document.cfm?doc_id=47711
Article 29 Data Protection Working Party. (2018). Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679. https://ec.europa.eu/newsroom/article29/redirection/document/49826
Balkin, J. M. (2021). To Reform Social Media, Reform Informational Capitalism. In Bollinger L. C. & Stone G. R. (Eds.), Social Media, Freedom of Speech and the Future of Our Democracy (pp. 233-254). Oxford University Press.
Bogard, W. (2012). Simulation and post-panopticism. In K. Ball, K. Haggerty, & D. Lyon (Eds.), Routledge handbook of surveillance studies (pp. 30-37). Routledge.
boyd, d., & Crawford, K. (2012). Critical Questions for Big Data. Information, Communication & Society, 15(5), 662-679. https://doi.org/10.1080/1369118X.2012.678878
boyd, d., & Ellison, N. B. (2007). Social network sites: Definition, history, and scholarship. Journal of Computer-Mediated Communication, 13(1), 210-230.
Bozdag, E. (2013). Bias in Algorithmic Filtering and Personalization. Ethics and information technology, 15, 209-227.
Brkan, M. (2017). The Court of Justice of the EU, privacy and data protection: Judge-made law as a leitmotif in fundamental rights protection. In M. Brkan & E. Psychogiopoulou (Eds.), Courts, Privacy and Data Protection in the Digital Environment. Edward Elgar Publishing. https://doi.org/10.4337/9781784718718
Burk, D. L. (2020). Algorithmic Legal Metrics. Notre Dame Law Review, 96(3), 1147-1204.
Bygrave, L. A. (2014). Data Privacy Law: An International Perspective. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199675555.001.0001
Bygrave, L. A. (2020a). Article 4(4) Profiling. In C. Kuner, L. A. Bygrave, C. Docksey, & L. Drechsler (Eds.), The EU General Data Protection Regulation (GDPR): A Commentary (pp. 127–131). Oxford University Press. https://doi.org/10.1093/oso/9780198826491.003.0010
Bygrave, L. A. (2020b). Article 25 Data protection by design and by default. In C. Kuner, L. A. Bygrave, C. Docksey, & L. Drechsler (Eds.), The EU General Data Protection Regulation (GDPR): A Commentary (pp. 571–581). Oxford University Press. https://doi.org/10.1093/oso/9780198826491.003.0060
Cohen, J. E. (2000). Examined lives: Informational privacy and the subject as object. Stanford Law Review, 52(5), 1373-1438. https://doi.org/10.2307/1229517
Cohen, J. E. (2012). Configuring the networked self: Law, code, and the play of everyday practice. Yale University Press.
Cohen, J. E. (2013). What Privacy Is For. Harvard Law Review, 126(7), 1904-1933.
Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press.
Crawford, K. (2021). The atlas of AI: Power, politics, and the planetary costs of artificial intelligence. Yale University Press.
Deleuze, G. (1992). Postscript on the Societies of Control. October, 59, 3-7.
van Dijck, J. (2013). The culture of connectivity: A critical history of social media. Oxford University Press.
van Dijck, J., Poell, T., & De Waal, M. (2018). The platform society: Public values in a connective world. Oxford University Press.
Dobber, T., Ó Fathaigh, R., & Zuiderveen Borgesius, F. (2019). The Regulation of Online Political Micro-targeting in Europe. Internet Policy Review, 8(4), 1-20.
van Drunen, M., et al. (2022, January 25). Transparency and (no) more in the Political Advertising Regulation (opinion). Internet Policy Review. Retrieved July 3, 2023, from https://policyreview.info/articles/news/transparency-and-no-more-political-advertising-regulation/1616
Durante, M. (2017). Ethics, law and the politics of information: a guide to the philosophy of Luciano Floridi. Springer.
Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. St. Martin's Press.
European Court of Human Rights. (2021). Guide to the Case-Law of the European Court of Human Rights: Data Protection. https://ks.echr.coe.int/documents/d/echr-ks/guide_art_8_eng
European Data Protection Board. (2021). Guidelines 8/2020 on the targeting of social media users. https://edpb.europa.eu/system/files/2021-04/edpb_guidelines_082020_on_the_targeting_of_social_media_users_en.pdf
European Data Protection Supervisor. (2021). Opinion 1/2021 on the Proposal for a Digital Services Act. https://edps.europa.eu/system/files/2021-02/21-02-10-opinion_on_digital_services_act_en.pdf
European Data Protection Board. (2022). Binding Decision 4/2022 on the dispute submitted by the Irish SA on Meta Platforms Ireland Limited and its Instagram service (Art. 65 GDPR).
Fabbrini, F. (2015). The EU Charter of Fundamental Rights and the rights to data privacy: the EU Court of Justice as a Human Rights Court. In S. d. Vries, U. Bernitz, & S. Weatherill (Eds.), The EU Charter of Fundamental Rights as a Binding Instrument (pp. 261-286). Hart.
Federal Constitutional Court. (2022). Decisions of the Federal Constitutional Court: General right of personality. Nomos.
Floridi, L. (2011). The Informational Nature of Personal Identity. Minds and Machines, 21, 549-566.
Floridi, L. (2013). The Ethics of Information. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199641321.001.0001
Frantziou, E. (2015). The Horizontal Effect of the Charter of Fundamental Rights of the EU: Rediscovering the Reasons for Horizontality. European Law Journal, 21(5), 657-679.
Fuchs, C. (2014). Social Media: A Critical Introduction. SAGE.
Fuster, G. G. (2014). The emergence of personal data protection as a fundamental right of the EU. Springer.
Garland, D. (1997). 'Governmentality' and the problem of crime: Foucault, criminology, sociology. Theoretical Criminology, 1(2), 173-214.
Georgieva, L., & Kuner, C. (2020). Article 9 Processing of special categories of personal data. In C. Kuner, L. A. Bygrave, C. Docksey, & L. Drechsler (Eds.), The EU General Data Protection Regulation (GDPR): A Commentary (pp. 365–384). Oxford University Press. https://doi.org/10.1093/oso/9780198826491.003.0038
Gibbons, T. (2020). Providing a Platform for Speech: Possible Duties and Responsibilities. In A. T. Kenyon & A. Scott (Eds.), Positive Free Speech: Rationales, Methods and Implications (pp. 11-23). Bloomsbury Publishing.
Gorton, W. A. (2016). Manipulating Citizens: How Political Campaigns’ Use of Behavioral Social Science Harms Democracy. New Political Science, 38(1), 61-80. https://doi.org/10.1080/07393148.2015.1125119
Grafanaki, S. (2018). Platforms, the First Amendment and Online Speech: Regulating the Filters. Pace Law Review, 39(1), 111-162.
De Gregorio, G. (2022). Digital constitutionalism in Europe: Reframing rights and powers in the algorithmic society. Cambridge University Press.
Helberger, N., Dobber, T., & de Vreese, C. (2021). Towards unfair political practices law: Learning lessons from the regulation of unfair commercial practices for online political advertising. Journal of Intellectual Property, Information Technology and Electronic Commerce Law, 12, 273.
Helberger, N., van Drunen, M., Vrijenhoek, S., & Möller, J. (2021, February 26). Regulation of news recommenders in the Digital Services Act: empowering David against the Very Large Online Goliath (opinion). Internet Policy Review. Retrieved July 3, 2023, from https://policyreview.info/articles/news/regulation-news-recommenders-digital-services-act-empowering-david-against-very-large
Henman, P. (2020). Governing by algorithms and algorithmic governmentality: Towards machinic judgement. In M. Schuilenburg & R. Peeters (Eds.), The Algorithmic Society: Technology, Power, and Knowledge (pp. 19-34). Routledge.
Hermann, E. (2022). Artificial intelligence and mass personalization of communication content—An ethical and literacy perspective. New Media & Society, 24(5), 1258-1277.
Hildebrandt, M. (2006). Profiling: From Data to Knowledge. Datenschutz und Datensicherheit, 30(9), 548-552.
Hildebrandt, M. (2008). Defining Profiling: A New Type of Knowledge? In M. Hildebrandt & S. Gutwirth (Eds.), Profiling the European Citizen (pp. 17-30). Springer.
Husovec, M., & Roche Laguna, I. (2023). Digital Services Act: A Short Primer [Unpublished manuscript]. In M. Husovec & I. Roche Laguna (Eds.), Principles of the Digital Services Act. Available at: https://ssrn.com/abstract=4153796 or http://dx.doi.org/10.2139/ssrn.4153796
Information Commissioner’s Office. (2018). Investigation into the Use of Data Analytics in Political Campaigns. https://ico.org.uk/media/action-weve-taken/2259371/investigation-into-data-analytics-for-political-purposes-update.pdf
Jaquet-Chiffelle, D.-O. (2008). Reply: Direct and indirect profiling in the light of virtual persons. In M. Hildebrandt & S. Gutwirth (Eds.), Profiling the European Citizen (pp. 34-43). Springer.
Jordan, M. I., & Mitchell, T. M. (2015). Machine learning: Trends, perspectives, and prospects. Science, 349(6245), 255-260.
Kantardzic, M. (2020). Data mining: Concepts, models, methods, and algorithms (3rd ed.). John Wiley & Sons.
Kokott, J., & Sobotta, C. (2013). The Distinction Between Privacy and Data Protection in the Jurisprudence of the CJEU and the ECtHR. International Data Privacy Law, 3(4), 222-228.
Kotras, B. (2020). Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge. Big Data & Society, 7(2), 1-14.
Kramer, L. (2022). A Deliberate Leap in the Opposite Direction: The Need to Rethink Free Speech. In L. C. Bollinger & G. R. Stone (Eds.), Social Media, Freedom of Speech and the Future of Our Democracy (pp. 17-39). Oxford University Press.
Kranenborg, H. (2020). Article 2 Material scope. In C. Kuner, L. A. Bygrave, C. Docksey, & L. Drechsler (Eds.), The EU General Data Protection Regulation (GDPR): A Commentary (pp. 60–73). Oxford University Press. https://doi.org/10.1093/oso/9780198826491.003.0004
Kranenborg, H. (2021). Article 8. In S. Peers, T. Hervey, J. Kenner, & A. Ward (Eds.), The EU Charter of Fundamental Rights : A Commentary (pp. 231-290). Bloomsbury Publishing.
Lindroos-Hovinheimo, S. (2021). Private Selves: Legal Personhood in European Privacy Protection. Cambridge University Press. https://doi.org/10.1017/9781108781381
Lindroos-Hovinheimo, S. (2022, February 23). The Proposed EU Regulation on Political Advertising Has Good Intentions, But Too Wide a Scope. European Law Blog. Retrieved July 3, 2023, from https://europeanlawblog.eu/2022/02/23/the-proposed-eu-regulation-on-political-advertising-has-good-intentions-but-too-wide-a-scope/
Lock, T. (2019). Article 16 CFR. In M. Kellerbauer, M. Klamert, & J. Tomkin (Eds.), The EU Treaties and the Charter of Fundamental Rights: A Commentary (pp. 2147–2148). Oxford University Press. https://doi.org/10.1093/oso/9780198759393.003.536
Lynskey, O. (2015). The foundations of EU data protection law. Oxford University Press.
Lynskey, O. (2017, February). Regulating 'platform power' (LSE Legal Studies, Working Paper No. 1/2017). https://ssrn.com/abstract=2921021 or http://dx.doi.org/10.2139/ssrn.2921021
Lyon, D. (2003a). Introduction. In D. Lyon (Ed.), Surveillance as Social Sorting: Privacy, Risk and Digital Discrimination (pp. 1-9). Routledge.
Lyon, D. (2003b). Surveillance as social sorting: Computer codes and mobile bodies. In D. Lyon (Ed.), Surveillance as Social Sorting: Privacy, Risk and Digital Discrimination (pp. 13-30). Routledge.
Lyon, D. (2007). Surveillance studies: An overview. Polity.
Mangan, D. (2021). Article 7 (Private Life, Home and Communications). In S. Peers, T. Hervey, J. Kenner, & A. Ward (Eds.), The EU Charter of Fundamental Rights : A Commentary (2 ed., pp. 151-194). Bloomsbury Publishing.
Marcuse, H. (2013). One-dimensional man: Studies in the ideology of advanced industrial society. Routledge.
Matz, S. C., Appel, R. E., & Kosinski, M. (2020). Privacy in the age of psychological targeting. Current opinion in psychology, 31, 116-121.
Mayer-Schönberger, V., & Cukier, K. (2013). Big data: A revolution that will transform how we live, work, and think. Houghton Mifflin Harcourt.
Mock, W. B. T., & Demuro, G. (2010). Foreword: The Charter's History and Current Legal Status. In W. B. T. Mock & G. Demuro (Eds.), Human Rights in Europe: Commentary on the Charter of Fundamental Rights of The European Union (pp. i-xxx). Carolina Academic Press.
Moreham, N. A. (2008). The Right to Respect for Private Life in the European Convention on Human Rights: A Re-Examination. European Human Rights Law Review, 9(3), 44-79.
Napoli, P. M. (2019). Social Media and the Public Interest: Media Regulation in the Disinformation Age. Columbia University Press. https://doi.org/10.7312/napo18454
Nissenbaum, H. (2009). Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford University Press. http://www.sup.org/books/title/?id=8862
O’Callaghan, P. (2015). Article 8 ECHR as a General Personality Right? Journal of European Tort Law, 6(1), 69-84.
Pariser, E. (2011). The filter bubble: How the new personalized web is changing what we read and how we think. Penguin.
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press.
Post, R. C. (1989). The social foundations of privacy: Community and self in the common law tort. California Law Review, 77, 957-1010.
Post, R. C. (2000). Three concepts of privacy. Georgetown Law Journal, 89, 2087-2098.
Post, R. C. (2018). Data Privacy and Dignitary Privacy: Google Spain, the Right to Be Forgotten, and the Construction of the Public Sphere. Duke Law Journal, 67(5), 981-1072.
Prosser, W. L. (1960). Privacy. California Law Review, 48, 383-423.
Purtova, N. (2018). The law of everything. Broad concept of personal data and future of EU data protection law. Law, Innovation and Technology, 10(1), 40-81.
American Law Institute. (1977). Restatement (Second) of Torts.
Ricci, F., Rokach, L., & Shapira, B. (2010). Introduction to Recommender Systems Handbook. In Recommender Systems Handbook (pp. 1-35). Springer.
Richardson, M. (2017). The Right to Privacy: Origins and Influence of a Nineteenth-Century Idea. Cambridge University Press.
Ross Arguedas, A., Robertson, C., Fletcher, R., & Nielsen, R. (2022). Echo chambers, filter bubbles, and polarisation: A literature review. https://reutersinstitute.politics.ox.ac.uk/echo-chambers-filter-bubbles-and-polarisation-literature-review.
Schauer, F. (2006). Profiles, Probabilities and Stereotypes. Harvard University Press.
Schwartz, B. (1968). The Social Psychology of Privacy. American Journal of Sociology, 73(6), 741-752.
Schwartz, P. M. (1999). Privacy and Democracy in Cyberspace. Vanderbilt Law Review, 52(6), 1607-1702.
Solove, D. J. (2002). Conceptualizing privacy. California Law Review, 90(4), 1087-1155. https://doi.org/10.2307/3481326
Solove, D. J. (2004). The digital person: Technology and privacy in the information age. NYU Press.
Spencer, S. B. (2020). The Problem of Online Manipulation. University of Illinois Law Review, 2020(3), 959-1006.
Stępkowski, Ł. (2014). The Right to be Forgotten under the EU Personal Data Protection Regime. In M. Sitek, P. Terem, & M. Wójcicka (Eds.), Collective Human Rights in the First Half of the 21st Century (pp. 105-118). Alcide De Gasperi University of Euroregional Economy in Józefów.
Sunstein, C. R. (2004). Democracy and filtering. Communications of the ACM, 47(12), 57-59.
Sunstein, C. R. (2016). The ethics of influence: Government in the age of behavioral science. Cambridge University Press.
Sunstein, C. R. (2018). #Republic: Divided democracy in the age of social media. Princeton University Press.
Susser, D., Roessler, B., & Nissenbaum, H. (2019). Online manipulation: Hidden influences in a digital world. Georgetown Law Technology Review, 4(1), 1-46.
Virilio, P. (1994). The vision machine. Indiana University Press.
Vrabec, H. U. (2021). Data Subject Rights under the GDPR. Oxford University Press. https://doi.org/10.1093/oso/9780198868422.001.0001
Warren, S., & Brandeis, L. (1890). The right to privacy. Harvard Law Review, 4(5), 193-220.
Woods, L. (2017a). Digital freedom of expression in the EU. In S. Douglas-Scott, N. Hatzis, & L. Woods (Eds.), Research Handbook on EU Law and Human Rights. Edward Elgar Publishing. https://doi.org/10.4337/9781782546405
Woods, L. (2017b). Social media: It is not just about Article 10. In L. Woods, D. Mangan, & L. E. Gillies (Eds.), The Legal Challenges of Social Media. Edward Elgar Publishing. https://doi.org/10.4337/9781785364518.00019
Woods, L. (2021). Article 11 Freedom of Expression and Information. In S. Peers, T. Hervey, J. Kenner, & A. Ward (Eds.), The EU Charter of Fundamental Rights: A Commentary. Bloomsbury Publishing. https://doi.org/10.5771/9783748913245
Wu, T. (2017). The Attention Merchants: The Epic Scramble to Get Inside Our Heads (Vintage Books Edition). Vintage Books.
Wu, T. (2018). Is the first amendment obsolete? Michigan Law Review, 117, 547-581. https://doi.org/10.36644/mlr.117.3.first
Zanfir-Fortuna, G. (2020). Article 21 Right to object. In C. Kuner, L. A. Bygrave, C. Docksey, & L. Drechsler (Eds.), The EU General Data Protection Regulation (GDPR): A Commentary (pp. 508–521). Oxford University Press. https://doi.org/10.1093/oso/9780198826491.003.0054
Zarsky, T. Z. (2019). Privacy and manipulation in the digital age. Theoretical Inquiries in Law, 20(1), 157-188.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. PublicAffairs.
Zuiderveen Borgesius, F., Möller, J., Kruikemeier, S., Ó Fathaigh, R., Irion, K., Dobber, T., Bodo, B., & de Vreese, C. H. (2018). Online political microtargeting: promises and threats for democracy. Utrecht Law Review, 14(1), 82-96.
dc.identifier.uri: http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/89079
dc.description.abstract [zh_TW]: 本文旨在釐清歐盟基本權憲章中的私生活權是否、如何作為次級法管制社群媒體平台精準投放之初級法基礎。著眼於歐盟次級法中,個資法制與內容法制在精準投放管制領域中之匯流現狀,純然從個資保護出發不再足以正當化所有對精準投放的管制。本文以歐盟基本權憲章之私生活權切入,論證保障私生活權是除了個資保護權外,管制精準投放之立法所欲追求的正當目的。本文所要回答的問題是:精準投放管制與資料保護一般規則間的落差,是否能夠透過保障私生活權之目的說明?如果可以,現有的次級法管制有何不當與不足之處?
本文主張,私生活權保護開放、持續的人格發展所處的外在環境,精準投放具有使人格發展僵固及受到操弄的危險,從而形成平台對使用者私生活權之限制。相比於個資保護權,私生活權額外地考慮了個資處理對人格發展的影響。從保障私生活權的立法目的出發,現有歐盟次級法對於精準投放的管制部分延續個資保護法制的中性技術管制模式,未依據訊息投放對人格發展的影響區分,而有不當之處;以個資的處理作為管制對象,未能涵蓋某些有害的操弄形式,亦未能更宏觀地處理精準投放的演算法對資訊環境所生影響,而有不足之處。
本文論證分為三個部分:歐盟次級法的精準投放管制現狀分析(第二章)、歐盟基本權憲章的理論分析與解釋(第三章)與透過第三章確立的理論檢驗歐盟次級法對精準投放之管制架構(第四章)。
第二章透過歐盟資料保護一般規則、數位服務法與政治廣告透明性及投放規則草案三者之分析與比較,呈現歐盟次級法的個資法制與內容法制在平台精準投放的議題上匯流。此種匯流肇因於精準投放是平台透過個資進行內容篩選排序之活動本質,並反映於立法目的、管制手段的重合,以及在內容法制中隱含的個資保護考量。除此之外,次級法現狀亦彰顯歐盟立法者對於精準投放管制,並非只考量個資保護,從而需要額外的理論基礎。
第三章分析憲章之私生活權與個資保護權的內涵,與精準投放如何構成權利之限制。本文指出,歐盟法院對私生活權的理解相容於考量人格發展環境之自主觀點,從而保障領域能夠擴張及於人格發展的環境。平台的精準投放透過塑造僵化、單一的人格發展條件,以及對於脆弱性的操弄,形成對私生活權的限制。相比於著重制衡個資處理權力的個資保護權,私生活權能夠將精準投放對人格發展的影響納入考量。兩者形成一體、但作用不盡相同的保障。
第四章從基本權的視角出發,評析歐盟管制架構的不當與不足之處。本文指出,在個資保護法制的基礎框架外,次級法額外對於精準投放施加的管制,不當之處在於部分因循個資保護的特種個資、特徵剖繪之概念,而非著眼於訊息投放的可能影響,而使得管制手段未能貼合保障私生活權之目的。不足之處則在於,對廣告系統及特種個資的管制未能涵蓋其他有害的操弄形式,以及對超大型線上平台的推薦系統管制將目光限縮於前階段的特徵剖繪,而非評估整體演算法對使用者接觸之資訊環境所造成的影響。
dc.description.abstract [en]: This thesis seeks to clarify whether, and to what extent, the right to respect for private life in the Charter of Fundamental Rights of the European Union serves as a primary-law basis for the secondary law regulating micro-targeting on social media platforms. As data protection law and the content regime converge in the EU secondary law governing micro-targeting, a purely data-protection perspective is no longer sufficient to legitimize every form of micro-targeting regulation. The thesis focuses on the right to respect for private life under the Charter and argues that, in addition to the right to data protection, safeguarding the right to respect for private life is a legitimate aim of regulating micro-targeting. The question it answers is whether the gap between micro-targeting regulation and the General Data Protection Regulation can be justified by the aim of safeguarding the right to respect for private life and, if so, where the current secondary law is inadequate or insufficient.
The thesis argues that the right to respect for private life safeguards the external environment essential for the open and continuous development of personality, and that platforms' micro-targeting interferes with this right by rigidifying personality development and exposing users to the risk of manipulation. Compared with the right to data protection, the right to respect for private life additionally addresses the influence of data processing on personality development. Measured against this aim, the current regulation of micro-targeting under EU secondary law partly follows the technology-neutral regulatory model of data protection law: it is inadequate in that it fails to differentiate targeting according to its impact on personality development, and it is insufficient in that, by taking the processing of personal data as its object, it fails to cover certain harmful manipulative practices and to address the broader impact of micro-targeting algorithms on the information environment.
The argument proceeds in three parts: an analysis of the current state of micro-targeting regulation in EU secondary law (Chapter 2); a theoretical analysis and interpretation of the EU Charter of Fundamental Rights (Chapter 3); and an examination of the regulatory framework for micro-targeting in EU secondary law against the theoretical foundation established in Chapter 3 (Chapter 4).
Chapter 2 presents the convergence of data protection law and the content regime in the context of platform micro-targeting through an analysis and comparison of three EU secondary laws: the General Data Protection Regulation, the Digital Services Act, and the proposal for a regulation on the transparency and targeting of political advertising. This convergence stems from the nature of micro-targeting as content curation based on personal data, and it is reflected in overlapping legislative aims and regulatory means, as well as in the data protection considerations implicit in content regulation. The current secondary law also indicates that the EU legislator's concern in regulating micro-targeting is not solely data protection, so an additional theoretical foundation is required.
Chapter 3 analyzes the scope of the right to respect for private life and the right to data protection as enshrined in the Charter, and how micro-targeting may interfere with these rights. It shows that the Court of Justice of the European Union's interpretation of the right to respect for private life is compatible with a notion of autonomy that accounts for the environment in which personality develops, so that the right's scope can extend to that environment. Platform micro-targeting limits the right by structuring a rigid and homogeneous information environment and by exploiting vulnerabilities for manipulation. Whereas the right to data protection focuses on checking the power of data processing, the right to respect for private life can address the impact of micro-targeting on personality development. The two rights constitute a coherent but functionally distinct protection.
Chapter 4 evaluates the inadequacies and insufficiencies of the EU regulatory framework from the perspective of fundamental rights. Beyond the baseline of data protection law, the additional rules on micro-targeting are inadequate because they borrow the data-protection concepts of "special categories of personal data" and "profiling" rather than focusing on the potential impact of content targeting, so the regulatory means are ill-suited to the aim of safeguarding the right to respect for private life. They are insufficient because the rules on advertising systems and special categories of personal data fail to cover other harmful forms of manipulation, and because the regulation of very large online platforms' recommender systems concentrates on the prior profiling stage rather than assessing the overall impact of algorithms on the information environment that users encounter.
dc.description.provenance [en]: Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-08-16T17:02:37Z. No. of bitstreams: 0
dc.description.provenance [en]: Made available in DSpace on 2023-08-16T17:02:37Z (GMT). No. of bitstreams: 0
dc.description.tableofcontents: 誌謝 i
摘要 v
ABSTRACT vii
簡目 xi
詳目 xiii
圖目錄 xix
表目錄 xxi
第一章 緒論 1
第一節 問題提出 1
第一項 何謂精準投放? 1
第一款 精準投放的特性 1
第二款 精準投放中的特徵剖繪 2
第三款 使用特徵剖繪進行內容篩選排序 4
第二項 精準投放的現象與傷害 6
第三項 回應精準投放:如何正當化公部門管制? 11
第二節 研究範圍 13
第一項 社群媒體平台 13
第二項 歐盟法 15
第三節 論證結構 17
第二章 歐盟次級法下的精準投放管制 19
第一節 個資法制與內容法制的分立與匯流 20
第二節 歐盟資料保護一般規則 23
第一項 適用範圍 24
第一款 個人資料之「處理」 24
第二款 已識別或可得識別的個人資料 25
第二項 資料主體自主權能保障 27
第一款 賦予自主控制權:事前同意與事後控制 27
第二款 強化決定能力:資訊提供義務及個資近用權 29
第一目 資訊提供義務 29
第二目 近用權 30
第三項 資料控制者客觀義務 31
第一款 多重資料控制者之共同責任 31
第二款 實體要件中的風險類型化 32
第一目 特徵剖繪的定義與定位 32
第二目 風險權衡作為合法性判準 34
第三目 類型化的脆弱性保障強化 35
第三款 技術、組織及程序上義務 37
第一目 設計階段之個資保護與預設個資保護設定 37
第二目 資料保護影響評估 38
第四項 資料保護一般規則的管制圖像分析 41
第三節 數位服務法 43
第一項 數位服務法的規範對象 43
第二項 數位服務法的精準投放管制 44
第一款 未成年人保護 44
第二款 廣告透明性與特種個資特徵剖繪禁止 45
第三款 推薦系統透明性與參數調整選項 46
第四款 系統性風險評估及風險調節義務 47
第三項 數位服務法的管制圖像分析 49
第四節 政治廣告透明性及投放規則草案 52
第一項 立法目的及適用範圍 52
第二項 政治廣告的透明性 53
第一款 政治「廣告」的概念 53
第二款 透明性義務的內容 54
第三項 針對政治廣告投放與推播之規範 55
第一款 特種個資使用禁止 56
第二款 內部政策制定、紀錄保存與透明性義務 56
第四項 分析:政治廣告草案作為數位服務法之延伸 57
第五節 個資法制與內容法制的交集與分歧 58
第三章 平台精準投放管制的憲章上基礎 63
第一節 歐盟基本權憲章之沿革與解釋基礎 64
第二節 私生活權的保障領域詮釋與分析 66
第一項 私生活權與個資保護的發展、競合關係與理論疑義 66
第二項 歐洲人權法院與歐洲人權公約第 8 條私生活權 68
第一款 社會關係面向的私生活內涵 68
第二款 純粹私人性質之資訊 68
第三款 非私密資訊之保護:個資保護與合理隱私期待 70
第一目 個資保護的一般性論述 70
第二目 系統性、長期性的監控紀錄 72
第三目 逾越「合理隱私期待」作為限制的認定標準 72
第四款 分析:私生活的廣泛意涵與保障門檻 75
第三項 歐盟法院與歐盟基本權憲章第 7 條私生活權 76
第一款 私生活權保障下的個人資料 77
第一目 涉及個人資料保護的私生活權 77
第二目 公共領域中的個人資料 79
第二款 私生活權的「本質部分」 81
第三款 私生活權干預強度評估之要素 81
第一目 權衡要素:個資性質與處理態樣 82
第二目 個資來源的公開與否 82
第三目 個資集聚的私生活推論效果 82
第四款 分析:刻意模糊的私生活權邊界 83
第五款 小結 85
第四項 私生活權的保障領域理論建構 86
第一款 私生活權作為自主的社會互動保護 87
第一目 核心領域與空間隱私 87
第二目 資訊隱私與資訊的社會規範 91
第三目 小結:社會互動視角的隱私理論與限制 92
第二款 私生活權作為人格發展條件之保護 93
第一目 人格發展的背景條件:從自主控制到自主環境 93
第二目 作為私生活權保障客體的人格觀點 94
第三目 判決先例中的私生活權與社會觀點的人格 97
第三款 監控對人格發展條件的限制 98
第一目 監控的意義 99
第二目 順服的身體與積極的主體 99
第三目 數位環境中的分類與操弄 101
第四款 分析:精準投放與私生活權的限制 105
第一目 特徵剖繪與個人化的內容投放 105
第二目 操弄性的內容投放 106
第五項 社群媒體平台精準投放之私生活權限制 107
第三節 個資保護權的保障領域詮釋與分析 108
第一項 歐洲個資保護法制發展概述 108
第一款 歐洲理事會第 108 號公約 108
第二款 個資保護指令 109
第二項 歐盟法院與歐盟基本權憲章第 8 條個資保護權 110
第三項 個資保護權的理論基礎與保障領域 111
第一款 個資保護之理論基礎:監控與識別 111
第二款 個資保護權之保障領域 113
第一目 識別性與個人資料的概念範圍 113
第二目 個資保護權的權利內涵 115
第四項 社群媒體平台精準投放之個資保護權限制 116
第四節 小結:精準投放作為私生活權與個資保護權之限制 118
第四章 歐盟法精準投放管制之架構分析 121
第一節 憲章基本權之水平效力與歐盟立法者角色 122
第一項 歐盟基本權憲章之水平效力 122
第二項 平台與使用者之關係及歐盟立法者角色 123
第二節 精準投放管制與基本權保護目的之連結 125
第一項 歐盟次級法精準投放管制架構之理論分析 125
第一款 特徵剖繪階段之管制 126
第一目 廣告系統特徵剖繪禁止規定的疑慮 127
第二目 超大型線上平台推薦系統參數調整選項 128
第二款 訊息投放階段之管制 129
第一目 廣告之來源標示與脈絡資訊 129
第二目 系統邏輯透明性要求及其侷限 130
第三目 廣告存檔及聚合資料公開 132
第三款 針對資訊環境之管制 132
第二項 以私生活權為基礎的精準投放管制結構 132
第三項 小結:歐盟法精準投放管制之理論定位 134
第三節 基本權衝突之調和:營業自由與言論自由 136
第一項 憲章第 16 條營業自由 136
第二項 憲章第 11 條言論自由 137
第一款 平台不得就篩選排序主張言論自由 138
第二款 管制手段須與使用者言論自由調和 139
第三項 分析:管制目的與營業自由、言論自由之調和 141
第四節 小結:歐盟精準投放管制的正當性與限制 142
第五章 結論 143
第一節 論證總結 143
第二節 未來展望 146
參考文獻 149
dc.language.iso: zh_TW
dc.subject [zh_TW]: 私生活權
dc.subject [zh_TW]: 數位服務法
dc.subject [zh_TW]: 資料保護一般規則
dc.subject [zh_TW]: 內容篩選排序
dc.subject [zh_TW]: 社群媒體平台
dc.subject [zh_TW]: 精準投放
dc.subject [zh_TW]: 個資保護
dc.subject [zh_TW]: 操弄
dc.subject [zh_TW]: 監控
dc.subject [zh_TW]: 政治廣告透明性及投放規則草案
dc.subject [en]: Manipulation
dc.subject [en]: Micro-targeting
dc.subject [en]: Social Media Platform
dc.subject [en]: Content Curation
dc.subject [en]: General Data Protection Regulation (GDPR)
dc.subject [en]: Digital Services Act (DSA)
dc.subject [en]: Proposal for Regulation on the Transparency and Targeting of Political Advertising
dc.subject [en]: Data Protection
dc.subject [en]: Right to Respect for Private Life
dc.subject [en]: Surveillance
dc.title [zh_TW]: 論歐盟社群媒體平台精準投放之管制架構:以私生活權之保障為中心
dc.title [en]: The Regulatory Framework for Micro-targeting on Social Media Platforms in the European Union: A Study on the Protection of the Right to Respect for Private Life
dc.type: Thesis
dc.date.schoolyear: 111-2
dc.description.degree: 碩士
dc.contributor.oralexamcommittee [zh_TW]: 林子儀;劉定基
dc.contributor.oralexamcommittee [en]: Tzu-Yi Lin;Ting-Chi Liu
dc.subject.keyword [zh_TW]: 精準投放,社群媒體平台,內容篩選排序,資料保護一般規則,數位服務法,政治廣告透明性及投放規則草案,個資保護,私生活權,監控,操弄
dc.subject.keyword [en]: Micro-targeting,Social Media Platform,Content Curation,General Data Protection Regulation (GDPR),Digital Services Act (DSA),Proposal for Regulation on the Transparency and Targeting of Political Advertising,Data Protection,Right to Respect for Private Life,Surveillance,Manipulation
dc.relation.page: 160
dc.identifier.doi: 10.6342/NTU202303795
dc.rights.note: 同意授權(限校園內公開)
dc.date.accepted: 2023-08-11
dc.contributor.author-college: 法律學院
dc.contributor.author-dept: 法律學系
Appears in Collections: 法律學系

Files in this item:
ntu-111-2.pdf (11.42 MB, Adobe PDF). Access is restricted to NTU campus IP addresses (use the VPN service for off-campus access).


Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
