Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/99521

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 高英哲 | zh_TW |
| dc.contributor.advisor | Ying-Jer Kao | en |
| dc.contributor.author | 曾晟倢 | zh_TW |
| dc.contributor.author | Sheng-Chieh Tseng | en |
| dc.date.accessioned | 2025-09-10T16:32:46Z | - |
| dc.date.available | 2025-09-11 | - |
| dc.date.copyright | 2025-09-10 | - |
| dc.date.issued | 2025 | - |
| dc.date.submitted | 2025-07-23 | - |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/99521 | - |
| dc.description.abstract | 本研究針對變分量子電路(Variational Quantum Circuits, VQC)在噪聲中尺度量子(NISQ)硬體上受「貧瘠平原」(barren plateaus)現象限制而難以訓練的問題,提出了一種結合分層正規化與殘差連結的多層變分量子電路(Multilayer VQC, MLVQC)架構,以提升深度電路的可訓練性。每一層 MLVQC 由資料編碼區塊(Encoding Block)與參數化量子區塊(Parameterized Quantum Block)組成,並比較了三種正規化方案(無正規化、L2 範數、Linf 範數)與兩種量測方式(Pauli‐$Z$ 期望值與測量機率)。
於 MNIST、FashionMNIST 及 CIFAR-10(各取二類、五類及十類子集)之分類任務中進行實驗。在大多數情境下,L2 正規化搭配 Pauli‐$Z$ 量測可提供最佳準確率:五層 MLVQC 在 CIFAR-10 十類任務上達到最高 32.7%,較未正規化基準提升逾 10%。引入殘差連結後,模型最高準確率進一步提升至 35.7%,且可穩定擴展至六層以上而無效能衰退。 理論分析基於 Haar 隨機電路證明,L2 正規化能有效阻止梯度方差的指數衰減。綜合實驗與理論結果,本研究所提出的設計原則為訓練深度 VQC 提供了一條可擴展且穩健的途徑,進而推動量子機器學習在 NISQ 時代的實際應用。 | zh_TW |
| dc.description.abstract | Variational Quantum Circuits (VQCs) hold promise for hybrid quantum–classical machine learning on noisy intermediate-scale quantum (NISQ) hardware, yet their practical utility is severely constrained by barren plateaus—regions of vanishing gradient that render parameter optimization infeasible as system size grows. In this work, we introduce a Multilayer Variational Quantum Circuit (MLVQC) architecture that integrates layer-wise normalization and residual connections to mitigate barren-plateau effects and enhance trainability. Each MLVQC layer comprises an input encoding block followed by a parameterized quantum block; we systematically evaluate three normalization schemes (none, L2, Linf) and two measurement protocols (Pauli-$Z$ expectation and outcome probabilities). We further incorporate a learnable residual gate—analogous to classical network shortcuts—that adaptively blends each layer’s output with its input, preserving gradient flow in deeper circuits.
| We benchmark our approach on classification tasks drawn from MNIST, FashionMNIST, and CIFAR-10, each reduced to two-, five-, and ten-class subsets, and observe that in the vast majority of scenarios, L2 normalization combined with Pauli-$Z$ measurements yields the highest accuracy. With a five-layer MLVQC, this setup achieves up to 32.7% accuracy on the ten-class CIFAR-10 task, an improvement of over 10% versus unnormalized baselines. Furthermore, the introduction of residual connections elevates peak performance to 35.7% and allows circuits to scale beyond six layers without any drop in accuracy. Our theoretical analysis, based on Haar-random circuit models, corroborates these findings by showing that L2 normalization arrests the exponential decay of gradient variance. Collectively, these design principles offer a scalable pathway for training deep quantum circuits, broadening the applicability of VQCs in practical quantum machine learning. | en |
| dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2025-09-10T16:32:46Z No. of bitstreams: 0 | en |
| dc.description.provenance | Made available in DSpace on 2025-09-10T16:32:46Z (GMT). No. of bitstreams: 0 | en |
| dc.description.tableofcontents | Certification by the Oral Examination Committee i
Abstract (in Chinese) iii
Abstract v
Contents vii
List of Figures ix
List of Tables xi
Chapter 1 Introduction 1
1.1 Relevant Works 2
1.2 Variational Quantum Circuits 2
1.3 Barren Plateaus 3
Chapter 2 Method 7
2.1 Multilayer Variational Quantum Circuit 7
2.1.1 Model Architecture 8
2.1.2 Data Preprocessing 10
2.1.3 Measurement Method 11
2.1.4 Normalization Method 11
2.1.5 Residual Block 12
2.2 Barren Plateaus 13
2.2.1 Simplify Variational Quantum Circuit 13
2.2.2 Theoretical Analysis 15
Chapter 3 Results 18
3.1 Multilayer VQC Performance on Real-World Data 18
3.1.1 Without Residual Block 18
3.1.2 With Residual Block 26
3.2 Barren Plateaus 27
3.2.1 Simplify Variational Quantum Circuit 27
3.2.2 Multilayer Variational Quantum Circuit Without Residual Block 27
3.2.3 Multilayer Variational Quantum Circuit With Residual Block 31
Chapter 4 Summary and Outlook 34
References 36 | - |
| dc.language.iso | en | - |
| dc.subject | 機器學習 | zh_TW |
| dc.subject | 量子機器學習 | zh_TW |
| dc.subject | 貧瘠平原 | zh_TW |
| dc.subject | 變分量子電路 | zh_TW |
| dc.subject | 量子計算 | zh_TW |
| dc.subject | quantum computing | en |
| dc.subject | Variational Quantum Circuit | en |
| dc.subject | barren plateaus | en |
| dc.subject | machine learning | en |
| dc.subject | Quantum machine learning | en |
| dc.title | 透過多層變分量子電路緩解貧瘠平原問題 | zh_TW |
| dc.title | Mitigation of the barren plateaus through a Multilayer Variational Quantum Circuit | en |
| dc.type | Thesis | - |
| dc.date.schoolyear | 113-2 | - |
| dc.description.degree | Master's | - |
| dc.contributor.oralexamcommittee | 陳柏中;鄭皓中 | zh_TW |
| dc.contributor.oralexamcommittee | Po-Chung Chen;Hao-Chung Cheng | en |
| dc.subject.keyword | 量子機器學習,機器學習,量子計算,變分量子電路,貧瘠平原 | zh_TW |
| dc.subject.keyword | Quantum machine learning, machine learning, quantum computing, Variational Quantum Circuit, barren plateaus | en |
| dc.relation.page | 43 | - |
| dc.identifier.doi | 10.6342/NTU202501116 | - |
| dc.rights.note | Authorized for release (campus access only) | - |
| dc.date.accepted | 2025-07-24 | - |
| dc.contributor.author-college | College of Science | - |
| dc.contributor.author-dept | Department of Physics | - |
| dc.date.embargo-lift | 2030-07-22 | - |
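The abstract describes two trainability mechanisms: L2 normalization of each layer's features and a learnable residual gate that blends a layer's output with its input. A minimal sketch of both, in plain Python, is given below; the function names, the gate's convex-blend form, and the toy "layer" are our illustrative assumptions, not the thesis's actual implementation.

```python
import math

def l2_normalize(v):
    # Rescale a feature vector to unit L2 norm before it is encoded
    # into the next layer -- our reading of the abstract's "L2 scheme".
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v] if norm > 0.0 else list(v)

def residual_gate(layer_fn, x, alpha):
    # Learnable residual gate (alpha would be a trainable scalar):
    # blend the layer's output with its own input so gradient signal
    # can bypass deep layers, as in classical shortcut connections.
    y = layer_fn(x)
    return [alpha * yi + (1.0 - alpha) * xi for yi, xi in zip(y, x)]

# Toy usage: a sign-flipping "layer" stands in for the vector of
# Pauli-Z expectation values a real quantum layer would return.
features = l2_normalize([3.0, 4.0])  # -> [0.6, 0.8]
out = residual_gate(lambda v: [-f for f in v], features, alpha=0.25)
```

With `alpha = 0` the gate reduces to an identity skip (the input passes through unchanged), which is the property that preserves gradient flow as depth grows.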
| Appears in Collections: | Department of Physics |
Files in this item:
| File | Size | Format | |
|---|---|---|---|
| ntu-113-2.pdf (restricted; not publicly available) | 829.82 kB | Adobe PDF | View/Open |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
