Harvester-based sensing system for cotton fiber quality mapping

Vincent P. Schielack, J. Alex Thomasson, Ruixiu Sui, Yufeng Ge

Research output: Contribution to journal › Article › peer-review

Abstract

Precision agriculture in cotton production attempts to maximize profitability by exploiting information on field spatial variability to optimize fiber yield and quality. For precision agriculture to be economically viable, collection of spatial-variability data within a field must be automated and incorporated into normal harvesting and ginning operations. An automated prototype system that uses image processing to estimate the micronaire value of cotton fiber during harvest was designed and built. The system was based on a camera with a visible indium gallium arsenide (InGaAs) detector sensitive to a broad range of visible and near-infrared (NIR) energy. Image processing algorithms were developed to identify foreign matter in the images so that it could be excluded from the measurement of reflectance in three NIR wavebands. After the effects of foreign matter were removed, the NIR reflectance measurements had a strong relationship to standard micronaire measurements, even though the measurements were made on seed cotton, which has a high level of foreign matter compared with fiber samples. A simplified version of the system could be constructed from a similar camera with only three optical band-pass filters, at 650, 1550, and 1600 nm. The prototype system shows promise for in-situ measurement of cotton fiber quality, specifically micronaire, and can enable the creation of fiber quality maps to improve crop management and, ultimately, profitability.
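
As a rough illustration of the pipeline the abstract describes, the Python sketch below masks foreign-matter pixels with a simple reflectance threshold, averages the reflectance of the remaining lint pixels in the three wavebands (650, 1550, and 1600 nm), and applies a linear calibration to estimate micronaire. This is a minimal sketch, not the authors' algorithm: the masking rule, the linear model form, the threshold, and all coefficients are illustrative assumptions, and a real system would calibrate against standard micronaire measurements as the paper does.

    # Illustrative sketch (not the paper's code) of foreign-matter masking
    # and micronaire estimation from mean band reflectance.
    import numpy as np

    def lint_mask(band_650, threshold=0.45):
        """Flag pixels as clean lint where 650 nm reflectance is high; darker
        pixels (leaf, bark, stem) are treated as foreign matter and excluded.
        The threshold is an assumed value for illustration only."""
        return band_650 > threshold

    def band_means(bands, mask):
        """Mean reflectance of the masked (clean-lint) pixels in each band."""
        return np.array([band[mask].mean() for band in bands])

    def estimate_micronaire(r650, r1550, r1600, coeffs=(2.0, 1.5, -4.0, 3.0)):
        """Hypothetical linear calibration:
        micronaire = b0 + b1*R650 + b2*R1550 + b3*R1600.
        Coefficients are placeholders, not fitted values from the study."""
        b0, b1, b2, b3 = coeffs
        return b0 + b1 * r650 + b2 * r1550 + b3 * r1600

    # Example with synthetic 100x100 band images, reflectance scaled 0..1:
    rng = np.random.default_rng(0)
    img650, img1550, img1600 = (rng.uniform(0.2, 0.9, (100, 100)) for _ in range(3))
    mask = lint_mask(img650)
    m650, m1550, m1600 = band_means([img650, img1550, img1600], mask)
    print(round(estimate_micronaire(m650, m1550, m1600), 2))

In practice the mask quality drives the result: seed cotton carries far more foreign matter than ginned fiber samples, so excluding contaminated pixels before averaging is what makes the band reflectance usable for micronaire prediction.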

Original language: English (US)
Pages (from-to): 386-393
Number of pages: 8
Journal: Journal of Cotton Science
Volume: 20
Issue number: 4
State: Published - 2016

ASJC Scopus subject areas

  • General Materials Science
