Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88577

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 吳家麟 | zh_TW |
| dc.contributor.advisor | Ja-Ling Wu | en |
| dc.contributor.author | 吳佩玲 | zh_TW |
| dc.contributor.author | Pei-Ling Wu | en |
| dc.date.accessioned | 2023-08-15T16:54:41Z | - |
| dc.date.available | 2023-11-09 | - |
| dc.date.copyright | 2023-08-15 | - |
| dc.date.issued | 2023 | - |
| dc.date.submitted | 2023-07-28 | - |
| dc.identifier.citation | 1. Yang, Y., S. Mandt, and L. Theis, An introduction to neural data compression. Foundations and Trends in Computer Graphics and Vision, 2023. 15(2): p. 113-200.<br>2. Goodfellow, I., et al., Generative adversarial networks. Communications of the ACM, 2020. 63(11): p. 139-144.<br>3. Kingma, D.P. and M. Welling, An introduction to variational autoencoders. Foundations and Trends in Machine Learning, 2019. 12(4): p. 4-89.<br>4. Papamakarios, G., et al., Normalizing flows for probabilistic modeling and inference. Journal of Machine Learning Research, 2021. 22(1): p. 2617-2680.<br>5. Van den Oord, A., N. Kalchbrenner, and K. Kavukcuoglu, Pixel recurrent neural networks. In International Conference on Machine Learning. 2016. PMLR.<br>6. Ho, J., A. Jain, and P. Abbeel, Denoising diffusion probabilistic models. Advances in Neural Information Processing Systems, 2020. 33: p. 6840-6851.<br>7. Zhang, S., et al., iFlow: Numerically invertible flows for efficient lossless compression via a uniform coder. Advances in Neural Information Processing Systems, 2021. 34: p. 5822-5833.<br>8. Zhang, S., et al., iVPF: Numerical invertible volume preserving flow for efficient lossless compression. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2021.<br>9. Ho, J., E. Lohn, and P. Abbeel, Compression with flows via local bits-back coding. Advances in Neural Information Processing Systems, 2019. 32.<br>10. Kingma, D., et al., Variational diffusion models. Advances in Neural Information Processing Systems, 2021. 34: p. 21696-21707.<br>11. Townsend, J., T. Bird, and D. Barber, Practical lossless compression with latent variables using bits back coding. arXiv preprint arXiv:1901.04866, 2019.<br>12. Kingma, F., P. Abbeel, and J. Ho, Bit-Swap: Recursive bits-back coding for lossless compression with hierarchical latent variables. In International Conference on Machine Learning. 2019. PMLR.<br>13. Townsend, J., et al., HiLLoC: Lossless image compression with hierarchical latent variable models. arXiv preprint arXiv:1912.09953, 2019.<br>14. Ryder, T., et al., Split hierarchical variational compression. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2022.<br>15. Ballé, J., et al., Variational image compression with a scale hyperprior. arXiv preprint arXiv:1802.01436, 2018.<br>16. Blei, D., R. Ranganath, and S. Mohamed, Variational inference: Foundations and modern methods. NIPS Tutorial, 2016.<br>17. White, T., Sampling generative networks. arXiv preprint arXiv:1609.04468, 2016.<br>18. Kingma, D.P. and M. Welling, Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114, 2013.<br>19. Hsieh, P.A. and J.L. Wu, A review of the asymmetric numeral system and its applications to digital images. Entropy, 2022. 24(3): p. 375.<br>20. MacKay, D.J., Information Theory, Inference and Learning Algorithms. 2003: Cambridge University Press.<br>21. Wallace, C.S., Classification by minimum-message-length inference. In Advances in Computing and Information (ICCI '90), Niagara Falls, Canada, May 23-26, 1990, Proceedings. 1990. Springer.<br>22. Hinton, G.E. and D. Van Camp, Keeping the neural networks simple by minimizing the description length of the weights. In Proceedings of the Sixth Annual Conference on Computational Learning Theory. 1993.<br>23. Frey, B.J. and G.E. Hinton, Free energy coding. In Proceedings of the Data Compression Conference (DCC '96). 1996. IEEE.<br>24. Frey, B.J., Bits-back coding software guide. 1996.<br>25. Duda, J., Asymmetric numeral systems. arXiv preprint arXiv:0902.0271, 2009.<br>26. Duda, J., Asymmetric numeral systems: entropy coding combining speed of Huffman coding with compression rate of arithmetic coding. arXiv preprint arXiv:1311.2540, 2013.<br>27. Kingma, F.H., Improving Data Compression Based on Deep Learning. 2019, Erasmus University Rotterdam.<br>28. Townsend, J., Lossless compression with latent variable models. arXiv preprint arXiv:2104.10544, 2021.<br>29. Long, J., E. Shelhamer, and T. Darrell, Fully convolutional networks for semantic segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2015.<br>30. Kingma, D.P., et al., Improved variational inference with inverse autoregressive flow. Advances in Neural Information Processing Systems, 2016. 29.<br>31. Flamich, G., M. Havasi, and J.M. Hernández-Lobato, Compressing images by encoding their latent representations with relative entropy coding. Advances in Neural Information Processing Systems, 2020. 33: p. 16131-16141.<br>32. Flamich, G., S. Markou, and J.M. Hernández-Lobato, Fast relative entropy coding with A* coding. In International Conference on Machine Learning. 2022. PMLR.<br>33. Giesen, F., Interleaved entropy coders. arXiv preprint arXiv:1402.3392, 2014.<br>34. Havasi, M., R. Peharz, and J.M. Hernández-Lobato, Minimal random code learning: Getting bits back from compressed model parameters. arXiv preprint arXiv:1810.00440, 2018.<br>35. Liu, A., S. Mandt, and G. Van den Broeck, Lossless compression with probabilistic circuits. arXiv preprint arXiv:2111.11632, 2021.<br>36. Yang, Y., R. Bamler, and S. Mandt, Improving inference for neural image compression. Advances in Neural Information Processing Systems, 2020. 33: p. 573-584. | - |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/88577 | - |
| dc.description.abstract | 本研究著重於無失真的變分影像壓縮,採用了變分自動編碼器(VAE)模型進行無失真壓縮,同時搭配Bits-Back Coding演算法,該演算法天生適用於VAE模型。在本篇研究中,我們提供了對Bits-Back Coding的詳細介紹,並整合了幾篇結合Bits-Back Coding和VAE的相關論文的程式碼,進行了實驗比較。最終,針對無失真的變分影像壓縮,我們提供了在不同限制條件和情境下的建議,以協助選擇合適的模型或工具套件。這篇論文的貢獻在於深入探討了神經網路無失真資料壓縮的其中一種方法和技術,並提供了實用的指導方針,期望能對進一步的相關研究和拓展神經網路資料壓縮技術的實際應用效果有所助益。 | zh_TW |
| dc.description.abstract | This study focuses on lossless variational image compression using the VAE model, combined with the Bits-Back Coding algorithm, which is naturally suited for VAE models. We comprehensively introduce Bits-Back Coding and integrate code from several related papers that combine Bits-Back Coding with VAE for experimental comparison. Ultimately, we offer recommendations for lossless variational image compression under different constraints and scenarios to assist in choosing suitable models or toolkits. The contribution of this paper lies in its in-depth exploration of one method and technique for neural network-based lossless data compression, along with practical guidelines that can facilitate further research and the practical application of neural data compression technology. | en |
| dc.description.provenance | Submitted by admin ntu (admin@lib.ntu.edu.tw) on 2023-08-15T16:54:41Z No. of bitstreams: 0 | en |
| dc.description.provenance | Made available in DSpace on 2023-08-15T16:54:41Z (GMT). No. of bitstreams: 0 | en |
| dc.description.tableofcontents | Acknowledgements i<br>Chinese Abstract ii<br>Abstract ii<br>Table of Contents iii<br>List of Figures iv<br>List of Tables v<br>Chapter 1 Introduction 1<br>Chapter 2 Background 3<br>2.1 Variational Image Compression 3<br>2.2 Bits-Back Coding (BBC) 13<br>2.3 Limitations of BB-ANS 17<br>Chapter 3 Literature Review 19<br>3.1 Bit-Swap 19<br>3.2 HiLLoC 21<br>3.3 SHVC 23<br>Chapter 4 Experiments 26<br>Chapter 5 Discussion 32<br>Chapter 6 Future Work 34<br>Chapter 7 Conclusion 35<br>Chapter 8 References 36 | - |
| dc.language.iso | zh_TW | - |
| dc.subject | Bits-Back Coding | zh_TW |
| dc.subject | Variational Autoencoder | zh_TW |
| dc.subject | 無失真影像壓縮 | zh_TW |
| dc.subject | Lossless image compression | en |
| dc.subject | Bits-Back Coding | en |
| dc.subject | Variational Autoencoder | en |
| dc.title | 使用Bits-Back Coding於無失真變分影像壓縮之演算法實現與性能分析 | zh_TW |
| dc.title | Implementation and Performance Analysis of Lossless Variational Image Compression using Bits-Back Coding | en |
| dc.type | Thesis | - |
| dc.date.schoolyear | 111-2 | - |
| dc.description.degree | Master | - |
| dc.contributor.oralexamcommittee | 陳文進;許超雲 | zh_TW |
| dc.contributor.oralexamcommittee | Wen-Chin Chen;Chau-Yun Hsu | en |
| dc.subject.keyword | Bits-Back Coding, Variational Autoencoder, 無失真影像壓縮 | zh_TW |
| dc.subject.keyword | Bits-Back Coding, Variational Autoencoder, Lossless image compression | en |
| dc.relation.page | 38 | - |
| dc.identifier.doi | 10.6342/NTU202301813 | - |
| dc.rights.note | Authorized (restricted to campus access) | - |
| dc.date.accepted | 2023-08-01 | - |
| dc.contributor.author-college | College of Electrical Engineering and Computer Science | - |
| dc.contributor.author-dept | Graduate Institute of Networking and Multimedia | - |
| dc.date.embargo-lift | 2026-01-01 | - |
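The abstract above notes that bits-back coding is "naturally suited" to VAE models: the encoder first decodes a latent z from previously written bits using the approximate posterior q(z|x), then encodes x with p(x|z) and z with p(z), so the expected net cost equals the negative ELBO in bits. A minimal numerical sketch of this rate accounting, using a toy two-symbol model (the distributions and the function name `bits_back_net_rate` are illustrative, not taken from the thesis):

```python
import math

# Toy discrete latent-variable model: z in {0, 1}, x in {0, 1}.
p_z = {0: 0.5, 1: 0.5}                                    # prior over latents
p_x_given_z = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}  # likelihood p(x|z)
q_z_given_x = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.25, 1: 0.75}}  # approx. posterior

def bits_back_net_rate(x):
    """Expected net bits to code x with bits-back:
    decode z with q(z|x)  -> gets  -log2 q(z|x) bits back,
    encode x with p(x|z)  -> pays  -log2 p(x|z) bits,
    encode z with p(z)    -> pays  -log2 p(z)   bits.
    The expectation over q equals the negative ELBO of x, in bits."""
    rate = 0.0
    for z, qz in q_z_given_x[x].items():
        rate += qz * (-math.log2(p_x_given_z[z][x])
                      - math.log2(p_z[z])
                      + math.log2(qz))
    return rate

for x in (0, 1):
    marginal = sum(p_z[z] * p_x_given_z[z][x] for z in p_z)
    print(f"x={x}: net rate {bits_back_net_rate(x):.4f} bits "
          f"(ideal -log2 p(x) = {-math.log2(marginal):.4f} bits)")
```

The printout illustrates why a good approximate posterior matters: the bits-back net rate can never drop below the ideal codelength -log2 p(x), and the gap between them is exactly the KL divergence from q(z|x) to the true posterior.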
| Appears in Collections: | Graduate Institute of Networking and Multimedia | |
Files in This Item:
| File | Size | Format | |
|---|---|---|---|
| ntu-111-2.pdf (restricted access) | 2.08 MB | Adobe PDF | View/Open |
All items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated.
