A Novel Multi-camera Fusion Approach at Plant Scale: From 2D to 3D

Research output: Contribution to journal › Article › peer-review

Abstract

Non-invasive crop phenotyping is essential for crop modeling and relies on image processing techniques. This research presents a plant-scale vision system that acquires multispectral plant data in agricultural fields. The paper proposes a sensory fusion method that uses three cameras: two multispectral cameras and an RGB-depth camera. The method applies pattern recognition and statistical optimization to produce a single multispectral 3D image that combines thermal and near-infrared (NIR) images of crops. The multi-camera fusion incorporates five multispectral bands: three from the visible range and two from the non-visible range, namely NIR and mid-infrared. The object recognition step examines about 7000 features in each image and runs only once, during calibration. The outcome of the sensory fusion process is a homographic transformation model that integrates multispectral and RGB data into a coherent 3D representation. The approach handles occlusions, allowing accurate extraction of crop features. The result is a 3D point cloud containing thermal and NIR multispectral data that were originally acquired separately in 2D.
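The abstract describes registering the multispectral images to the RGB-depth view through a homographic transformation estimated once from matched features. As a minimal sketch of that idea (not the authors' implementation, which uses ~7000 features and statistical optimization), the following NumPy-only code estimates a 3x3 homography from point correspondences via the direct linear transform (DLT) and applies it to map pixels from one camera's image plane to another's:

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src points to dst points.

    Uses the direct linear transform (DLT): each correspondence
    (x, y) -> (u, v) contributes two linear equations in the nine
    entries of H, solved in least squares via SVD. Requires at least
    four non-degenerate point pairs.
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right singular vector with smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so the bottom-right entry is 1

def apply_homography(H, pts):
    """Map an (N, 2) array of points through H using homogeneous coordinates."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]
```

In a calibration run like the one described, `src`/`dst` would come from features matched between, e.g., the NIR and RGB images; the recovered `H` can then be reused to project every multispectral pixel into the RGB-depth frame, where depth lifts it into the 3D point cloud.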
Original language: English
Article number: 582
Number of pages: 17
Journal: SN Computer Science
Volume: 5
Issue number: 5
DOIs
State: Published - 23 May 2024

Keywords

  • Multi-spectral imagery
  • Phenotyping
  • Plant modeling
  • 3D plant morphology
  • Light-field plenoptic cameras

