Please use this Handle URI to cite this item:
http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/70879

Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.advisor | 葉仲基(Chung-Kee Yeh) | |
| dc.contributor.author | Fu-Kai Wang | en |
| dc.contributor.author | 王富凱 | zh_TW |
| dc.date.accessioned | 2021-06-17T04:42:08Z | - |
| dc.date.available | 2021-08-07 | |
| dc.date.copyright | 2018-08-07 | |
| dc.date.issued | 2018 | |
| dc.date.submitted | 2018-08-06 | |
| dc.identifier.citation | 王明茂、顏克安、賴鑫騰。2004。小葉菜類收割機之試驗改良。農業世界 48:36-41。
王家麒。1994。以視覺為基礎的自動導引車在農業環境中的定位作業。碩士論文。臺北:臺灣大學農機系。
行政院農委會農糧署。2017。臺灣地區蔬菜年生產量。臺北:行政院農委會。網址:http://www.afa.gov.tw/GrainStatistics_index.aspx。上網日期:2018-06-25。
呂信賢。2017。平行機構機械手臂機器視覺定位系統之設計與實現。碩士論文。桃園:龍華科技大學機械工程系。
李柔靜。2009。番茄採收機械視覺系統之研究。碩士論文。臺北:臺灣大學生物產業機電工程學系。
邱榮志。2016。草莓機器人採收系統之整合研究。碩士論文。宜蘭:宜蘭大學生物產業機電工程學系。
周奕宇、周瑞仁、劉冠佑。1998。自動導航車自我定位系統之設計與製作。1999自動控制研討會論文集:233-238。
林家鋒。2009。設施內番茄採收機器人系統之研究。碩士論文。宜蘭:宜蘭大學生物產業機電工程學系。
陳令錫、林聖泉。2000。農業履帶車自動操控系統之開發研究。農業機械論文發表會論文摘要集(3):21-22。
陳熙霖。2000。視覺計算-人類感知能力的延伸。測控技術 19:7-12。
祥儀齒輪。2017。IG-32RGM外型與規格表。網址:http://www.shayangye.com/product-inner.aspx?f=s&i=87。上網日期:2018-06-25。
張文宏。1999。以機器視覺引導機器人選別水果。農業機械學刊 2(3):11-24。
張晉倫。2006。應用溫室內多功能監測系統於甘藍種苗生長性狀判別之研究。碩士論文。臺北:臺灣大學生物產業機電工程學系。
許韶方。2016。植物工廠自動化萵苣採收機構之研製。碩士論文。宜蘭:宜蘭大學生物產業機電工程學系。
鳥哥的Linux私房菜。2018。Linux是甚麼與學習。網址:http://linux.vbird.org/linux_basic/0110whatislinux.php。上網日期:2018-06-25。
繆紹綱。2009。數位影像處理。初版。432-436,726-749。臺北:培生教育出版股份有限公司。
劉寶信。2003。電子構裝三維尺寸之雷射量測。碩士論文。臺南:成功大學工程科學系。
駱易辰。2007。HSV色彩空間前景物體抽取及其於人體動作辨識系統應用。碩士論文。新竹:交通大學電控工程學系。
藍光宏。1996。機器手臂應用在番茄大小選別作業之研究。碩士論文。臺中:中興大學農業機械工程學系。
羅技攝影鏡頭。2018。HD 網路攝影機 C310。網址:https://www.logitech.com/zh-tw/product/hd-webcam-c310。上網日期:2018-06-25。
Bernstein, R. 1976. Digital image processing of earth observation sensor data. IBM Journal of Research and Development 20(1): 40-57.
Blasco, J., N. Aleixos and E. Molto. 2003. Machine vision system for automatic quality grading of fruit. Biosystems Engineering 85(4): 415-423.
Marr, D. 1982. Vision: A computational investigation into the human representation and processing of visual information. 1st ed., 29-61. San Francisco: W. H. Freeman and Company.
Iyad, A. and M. Hassan. 2010. Human face detection system using HSV. WSEAS Paper No. 13-16. Stevens Point, Wisconsin: WSEAS.
Jin, J., L. Jinwei, L. Guiping, Y. Xiaojuan and C. C. V. Leo. 2009. Methodology for potatoes defects detection with computer vision. ISIP Paper No. 346-351. Huangshan, P. R. China: ISIP.
Jinno, T. and M. Okuda. 2012. Multiple exposure fusion for high dynamic range image acquisition. IEEE Transactions on Image Processing 21(1): 358-365.
Kumar, T. and V. Karun. 2010. A theory based on conversion of RGB image to gray image. International Journal of Computer Applications 7(2): 5-12.
Marcos, S., H. U. Lyle and B. Ruzena. 1996. Active learning for vision-based robot grasping. Machine Learning 23(2-3): 251-278.
McDonald, T. and Y. R. Chen. 1990. Application of morphological image processing in agriculture. ASAE 33(4): 1345-1352.
Mertens, T., J. Kautz and F. Van Reeth. 2007. Exposure fusion. 15th Pacific Conference on Computer Graphics and Applications: 382-390.
Otsu, N. 1979. A threshold selection method from gray-level histograms. IEEE Transactions on Systems, Man, and Cybernetics 9(1): 62-66.
Qiao, J., N. Wang, M. O. Ngadi and S. Kazemi. 2007. Predicting mechanical properties of fried chicken nuggets using image processing and neural network techniques. Journal of Food Engineering 79: 1065-1070.
Raspberry Pi台灣樹梅派。2018。樹梅派規格。網址:https://www.raspberrypi.com.tw/。上網日期:2018-06-25。
RGB轉換HSV及HSL。2018。色彩空間示意圖。網址:http://www.ginifab.com.tw/tools/colors/rgb_to_hsv_hsl.html。上網日期:2018-06-03。
Shamik, S., Q. Gang and P. Sakti. 2002. Segmentation and histogram generation using the HSV color space for image retrieval. IEEE 2(2): 589-592.
Uner, M. K., L. C. Ramac, P. K. Varshney and M. G. Alford. 1997. Concealed weapon detection: an image fusion approach. SPIE 2942: 123-132.
Chen, Y. R., K. Chao and M. S. Kim. 2002. Machine vision technology for agricultural applications. Computers and Electronics in Agriculture 36(2): 173-191.
| |
| dc.identifier.uri | http://tdr.lib.ntu.edu.tw/jspui/handle/123456789/70879 | - |
| dc.description.abstract | 根據農委會的統計,在臺灣五十歲到六十四歲的農業勞動力人口佔了全農業勞動力人口的44.5%,此數據突顯了農業人力結構的問題,再加上臺灣農業人口的高齡化以及農村人口外流使得作物採收期雇工不易,使得作物無法預期收成。若錯過採收期將使得作物的品項不佳,嚴重者甚至枯黃。在目前還沒有開放農業外勞人口的政策下,臺灣需透過農業自動化來改善農業缺工問題。
本研究主要目的為研製出一台能夠在溫網室內利用影像處理進行行走與採收之葉菜採收機。實驗分為三個部分,分別為尋找葉菜適當的影像處理法,履帶車割刀的行為控制以及履帶馬達的行走控制。割刀馬達控制的部分先針對40張在臺灣大學拍攝的作物照片進行影像處理,每張照片再分成作物區與土壤區。經由作物區的白點數計算,來估計作物應有的白點數量。最後執行大樣本、母體變異數未知的臨界值檢定,找出代表作物的臨界值,來決定割刀是否應該作動。實驗結果顯示,當相機畫素為4068*3456時,作物區面積大於5825540,有95%信心水準為待收割作物。
履帶馬達的行走控制部分,使用一輛自製履帶車,連接Raspberry Pi並裝上網路攝影鏡頭與馬達驅動板,來控制整台履帶車的行走。實驗中將網路攝影機拍攝之圖像進行HSV色彩空間轉換,再進行二值化後,利用程式計算網路攝影機左右兩側作物區之面積差異,透過PID參數的調整來控制馬達轉速,達成最小的穩態誤差與理想的最大超越量。實驗結果顯示,當kp=0.2時,可以達成最好的循跡效果。 | zh_TW |
| dc.description.abstract | According to the Council of Agriculture, R.O.C. (Taiwan), those aged between 50 and 64 years account for 44.5% of Taiwan’s total agricultural labor force. This figure highlights Taiwan’s problematic agricultural labor force structure. Because of the aging agricultural labor force and increasing urbanization, finding workers to hire during the harvest season has become challenging. Accordingly, crops cannot always be harvested in time, so crop quality deteriorates and, in severe cases, the crop is lost entirely. Because current policy does not allow foreign agricultural workers to work in Taiwan, agricultural automation is required to resolve this labor shortage.
In this study, we developed a leafy vegetable harvester that can move within a screened greenhouse and harvest leafy vegetables through image processing. The work comprised three parts: selecting a suitable image-processing method for leafy vegetables, developing behavioral control for the crawler-track cutter, and developing movement control for the crawler-track motor. To develop the motor control for the cutter, 40 images of crops photographed at National Taiwan University underwent image processing, and each image was divided into crop sections and soil sections. The white pixels in the crop sections were counted to estimate the number of white pixels a crop should contain. Finally, the critical-value method for a large sample with unknown population variance was adopted to ascertain the critical value representing a crop and thus to determine whether the cutter should act. The results revealed that when the camera resolution was 4068 × 3456 and the area of the crop section exceeded 5,825,540 pixels, the area reflected crops to be harvested at a 95% confidence level. For movement control of the crawler-track motor, a crawler-track vehicle built for this study was used. A Raspberry Pi, a web camera, and a motor driver board were installed on the vehicle to control and monitor its movement. In the experiment, images obtained from the web camera were converted to the hue-saturation-value color space and then binarized, and software calculated the difference in crop-section area between the left-hand and right-hand sides of the image. The motor’s rotational speed was controlled by adjusting the proportional-integral-derivative parameters to achieve minimal steady-state error and the desired maximum overshoot. The results revealed that the best tracking performance was achieved when kp = 0.2. | en |
| dc.description.provenance | Made available in DSpace on 2021-06-17T04:42:08Z (GMT). No. of bitstreams: 1 ntu-107-R05631022-1.pdf: 7486534 bytes, checksum: f3024ce468f40417141cb7d3cbd64b1f (MD5) Previous issue date: 2018 | en |
| dc.description.tableofcontents | Acknowledgements ii
Chinese Abstract iii
English Abstract iv
List of Figures vi
List of Tables viii
Chapter 1 Introduction 1
1.1 Motivation 1
1.2 Objectives 3
Chapter 2 Literature Review 4
2.1 Agricultural Vehicles 4
2.1.1 Leafy Vegetable Harvesters 4
2.1.2 Agricultural Automated Guided Vehicles 6
2.2 Image Processing 7
2.2.1 Color Spaces 7
2.2.2 RGB Color Space 7
2.2.3 HSV Color Space 8
2.2.4 Grayscale Images 9
2.2.5 Binarized Images 9
2.2.6 Exposure Fusion 11
2.2.7 Image Processing in Agriculture 11
2.3 Machine Vision 12
Chapter 3 Materials and Methods 13
3.1 Experimental Equipment and Materials 13
3.1.1 Overview of the Vehicle Hardware System 13
3.1.2 Overview of the Vehicle System Software 19
3.1.3 OpenCV 19
3.1.4 Raspberry Pi Operating System 19
3.2 Experimental Methods 20
3.2.1 Experiment 1 20
3.2.2 Experiment 2 22
3.2.3 Experiment 3 24
Chapter 4 Results and Discussion 25
4.1 Extraction of Leafy Vegetable Crops 25
4.1.1 Via RGB Image Conversion 25
4.1.2 Via HSV Image Conversion 30
4.2 Behavioral Control of the Cutter Motor 34
4.3 Movement Control of the Crawler-Track Motor 35
4.3.1 Proportional Control Parameter Tuning 35
Chapter 5 Conclusions and Suggestions 39
5.1 Conclusions 39
5.2 Suggestions 39
References 40
Appendix 1 Source Code 44
Appendix 2 Image Processing Results 57 | |
| dc.language.iso | zh-TW | |
| dc.subject | 機器視覺 | zh_TW |
| dc.subject | 影像處理 | zh_TW |
| dc.subject | Raspberry Pi | zh_TW |
| dc.subject | 農業 | zh_TW |
| dc.subject | image processing | en |
| dc.subject | agriculture | en |
| dc.subject | Raspberry Pi | en |
| dc.subject | machine vision | en |
| dc.title | 影像處理應用於葉菜收穫機行走與採收之研究 | zh_TW |
| dc.title | A Study on Image Processing on Self-Propelled and Movement for a Leafy Vegetable Harvester | en |
| dc.type | Thesis | |
| dc.date.schoolyear | 106-2 | |
| dc.description.degree | 碩士 | |
| dc.contributor.oralexamcommittee | 黃振康(Chen-Kang Huang),吳剛智(Gang-Jhih Wu) | |
| dc.subject.keyword | 影像處理, 機器視覺, Raspberry Pi, 農業 | zh_TW |
| dc.subject.keyword | image processing, machine vision, Raspberry Pi, agriculture | en |
| dc.relation.page | 64 | |
| dc.identifier.doi | 10.6342/NTU201802526 | |
| dc.rights.note | 有償授權 | |
| dc.date.accepted | 2018-08-06 | |
| dc.contributor.author-college | 生物資源暨農學院 | zh_TW |
| dc.contributor.author-dept | 生物產業機電工程學研究所 | zh_TW |
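
The abstracts above state that the 5,825,540-pixel cutter-actuation threshold was obtained from a critical-value test for a large sample with unknown population variance at a 95% confidence level. The record does not reproduce the sample mean, standard deviation, or sample size, so the block below only sketches the standard form such a one-sided large-sample critical value takes; \bar{x}, s, and n are placeholders, not values from the thesis.

```latex
% Standard one-sided, large-sample critical value with unknown population
% variance (symbols are placeholders, not values reproduced from the thesis):
c \;=\; \bar{x} \;-\; z_{0.05}\,\frac{s}{\sqrt{n}},
\qquad z_{0.05} \approx 1.645,
\qquad \text{actuate the cutter if the measured crop-section area } A > c .
```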
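
A minimal Python/OpenCV sketch of the cutter's behavioral control as summarized in the abstract: convert the photo to HSV, binarize the crop region, count the white pixels, and actuate only when the count exceeds the reported threshold. The thesis's own program is in Appendix 1 and is not reproduced in this record, so the HSV bounds, file name, and function names here are illustrative assumptions; only the 4068 × 3456 resolution, the 5,825,540-pixel threshold, and the 95% confidence level come from the abstract.

```python
# Illustrative sketch only, not the thesis code (see Appendix 1 of the thesis).
import cv2
import numpy as np

CROP_AREA_THRESHOLD = 5_825_540         # critical value (pixels) from the abstract
LOWER_GREEN = np.array([35, 40, 40])    # assumed lower HSV bound for leafy crops
UPPER_GREEN = np.array([85, 255, 255])  # assumed upper HSV bound for leafy crops


def should_cut(image_path: str) -> bool:
    """Return True when the crop-section area indicates a harvestable crop."""
    bgr = cv2.imread(image_path)                       # e.g. a 4068 x 3456 field photo
    if bgr is None:
        raise FileNotFoundError(image_path)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)         # RGB/BGR -> HSV conversion
    mask = cv2.inRange(hsv, LOWER_GREEN, UPPER_GREEN)  # binarization: crop -> white
    white_pixels = cv2.countNonZero(mask)              # area of the crop section
    return white_pixels > CROP_AREA_THRESHOLD          # 95%-confidence threshold


if __name__ == "__main__":
    print(should_cut("crop_photo.jpg"))                # hypothetical input file
```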
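
Similarly, a sketch of the row-following idea described for the crawler-track motor: binarize a webcam frame in HSV, compare the crop area in the left and right halves, and use a proportional term with kp = 0.2 (the gain reported in the abstract) to derive differential speeds for the two tracks. The HSV bounds, base speed, normalization, and sign convention are assumptions, and the sketch stops short of driving the actual motor driver board.

```python
# Illustrative sketch only, not the thesis code: differential track speeds
# from one webcam frame using proportional control (kp = 0.2 per the abstract).
import cv2
import numpy as np

KP = 0.2             # proportional gain reported in the abstract
BASE_SPEED = 50.0    # assumed nominal duty cycle (%) for both tracks
LOWER_GREEN = np.array([35, 40, 40])     # assumed HSV bounds for leafy crops
UPPER_GREEN = np.array([85, 255, 255])


def track_speeds(frame: np.ndarray) -> tuple:
    """Return (left_speed, right_speed) computed from one BGR webcam frame."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_GREEN, UPPER_GREEN)  # binarized crop mask
    h, w = mask.shape
    left_area = cv2.countNonZero(mask[:, : w // 2])    # crop pixels, left half
    right_area = cv2.countNonZero(mask[:, w // 2:])    # crop pixels, right half
    error = (left_area - right_area) / float(h * w)    # normalized area difference
    correction = KP * error * 100.0                    # proportional term only
    # Sign convention is an assumption; swap +/- if the vehicle steers off the row.
    return BASE_SPEED - correction, BASE_SPEED + correction


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)    # e.g. the Logitech C310 webcam cited in the record
    ok, frame = cap.read()
    if ok:
        print(track_speeds(frame))
    cap.release()
```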
| Appears in Collections: | 生物機電工程學系 |
Files in this item:
| File | Size | Format |
|---|---|---|
| ntu-107-1.pdf (restricted; not available for public access) | 7.31 MB | Adobe PDF |
All items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.
