
Automated Plant Phenotyping Using 3D Machine Vision and Robotics

Posted on: 2019-12-12
Degree: Ph.D.
Type: Dissertation
University: Iowa State University
Candidate: Bao, Yin
Full Text: PDF
GTID: 1448390002971097
Subject: Agricultural Engineering
Abstract/Summary:
With rapid advances in genotyping technologies, plant phenotyping has become the bottleneck in exploiting massive genomic data for crop improvement. The common practice of plant phenotyping relies on human effort, which is labor-intensive, time-consuming, and prone to error. This dissertation documents our research in automated plant phenotyping using 3D machine vision and robotics.

Sorghum and maize are economically important crops for food, feed, fuel, and fiber production, and manipulation of plant architecture plays a vital role in yield improvement through breeding. A high-throughput, field-based robotic phenotyping system was developed to characterize the architecture of tall-growing sorghum plants grown at high population density with dense canopies. Side-viewing stereo cameras were used for 3D reconstruction of the plants, and a novel data-processing pipeline was developed to measure plant height, width, convex hull volume, surface area, and stem diameter. These image-derived traits were highly correlated with in-field manual measurements and showed high repeatability.

Additionally, Time-of-Flight 3D imaging was used to collect side-view point clouds of maize plants under field conditions. Algorithms were developed to extract plant height, leaf angle, plant orientation, and stem diameter at the individual-plant level. A customized skeletonization algorithm reduces a large point cloud to a skeleton graph, and a 3D Hough line detection algorithm finds the individual stems. The image-derived traits showed satisfactory accuracy, except for stem diameter, which was limited by the sensor's depth-sensing precision.

Various instrumentation devices for plant physiology studies require accurate placement of their sensor probes on the leaf surface. A robotic leaf-probing system was developed for a controlled environment using a Time-of-Flight sensor, a laser profilometer, and a six-DOF robotic manipulator. The Time-of-Flight sensor and the laser profilometer were used for environment mapping and for high-precision scanning of plant canopies, respectively. The environment point cloud was used for collision-free motion planning and individual-plant segmentation, while the high-resolution canopy point cloud was analyzed for leaf segmentation and probing-point extraction. The system achieved an average motion-planning time of 0.4 s, an average probe positioning error of 1.5 mm, and an average probe orientation error of 0.84 degrees.
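As an illustration of the kind of trait extraction described above, the following is a minimal sketch, not the dissertation's actual pipeline, of deriving whole-plant architectural traits from an already segmented 3D point cloud using NumPy and SciPy's convex hull. The function name, the metric units, and the choice of z as the vertical axis are assumptions made for the example.

```python
# Minimal illustrative sketch, not the dissertation's pipeline: derive
# whole-plant architectural traits from a plant point cloud given as an
# N x 3 array of (x, y, z) coordinates. Assumes the cloud has already been
# segmented to a single plant/plot and that z is the vertical axis (both
# assumptions, not details from the abstract).
import numpy as np
from scipy.spatial import ConvexHull

def architectural_traits(points: np.ndarray) -> dict:
    """Plant height, width, convex hull volume, and surface area."""
    height = np.ptp(points[:, 2])                      # vertical extent
    width = np.ptp(points[:, :2], axis=0).max()        # largest horizontal extent
    hull = ConvexHull(points)                          # 3D convex hull of the canopy
    return {
        "height": float(height),
        "width": float(width),
        "convex_hull_volume": float(hull.volume),      # .volume is 3D volume
        "convex_hull_surface_area": float(hull.area),  # .area is surface area in 3D
    }

# Example with a synthetic cloud standing in for a reconstructed plant:
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cloud = rng.uniform([0.0, 0.0, 0.0], [0.5, 0.5, 2.0], size=(5000, 3))
    print(architectural_traits(cloud))
```

Stem diameter, by contrast, requires localizing the stem first (e.g., via the skeletonization and 3D Hough line detection mentioned above) and is not shown in this sketch.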
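Similarly, the probing-point extraction step of the leaf-probing system can be illustrated with a small sketch: given a segmented, roughly planar leaf patch, a least-squares plane fit yields a candidate probing point and an approach normal. This is only an illustration of the general idea; the dissertation's actual leaf segmentation and pose-selection methods are not reproduced here, and the function name and plane-fit approach are assumptions.

```python
# Minimal illustrative sketch, not the dissertation's method: estimate a
# probing point and an approach normal for a segmented leaf patch given as
# an N x 3 array of points belonging to one roughly planar leaf region.
import numpy as np

def probing_pose(leaf_points: np.ndarray):
    """Return (probing point, unit surface normal) from a least-squares plane fit."""
    centroid = leaf_points.mean(axis=0)        # candidate probing point
    centered = leaf_points - centroid
    # The normal of the best-fit plane is the right singular vector associated
    # with the smallest singular value of the centered point matrix.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1] / np.linalg.norm(vt[-1])
    return centroid, normal
```

A manipulator would then align the probe axis with the returned normal and approach the probing point, subject to the collision-free motion planning described in the abstract.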