Sunflower Dataset
Crop/weed segmentation dataset for sunflower (Helianthus annuus) crops


Selective weeding is one of the key challenges in the field of agricultural robotics. To accomplish this task, a farm robot should be able to accurately detect plants and to distinguish between crops and weeds. Most of the promising state-of-the-art approaches make use of appearance-based models trained on large annotated datasets. Unfortunately, creating large agricultural datasets with pixel-level annotations is an extremely time-consuming task, which penalizes the usage of data-driven techniques. To this end, we used a custom-built agricultural field robot to record three datasets on a sunflower farm in Jesi (Italy), at the Assam facility, over a period of one month in spring 2016.
Images in the datasets are arranged according to the following folder list:
- rgb: folder containing RGB images.
- nir: folder containing near-infrared, single-channel images.
- gt: folder containing grayscale, single-channel pixel-wise annotations (0: soil; 1: crop; 2: weed).
- gt_color: folder containing RGB, 3-channel pixel-wise annotations (black: soil; green: crop; red: weeds).
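
As an illustration of how the folders fit together, the sketch below loads one sample (RGB, NIR, and grayscale annotation) and recolors the annotation following the gt_color convention. The dataset root path, the file name, and the use of OpenCV are assumptions for the example, not part of the dataset specification; adjust them to the actual file naming on disk.

```python
import os
import cv2
import numpy as np

root = "sunflower_dataset"   # hypothetical dataset root
name = "0001.png"            # hypothetical sample name

# Load the three aligned views of one sample.
rgb = cv2.imread(os.path.join(root, "rgb", name), cv2.IMREAD_COLOR)
nir = cv2.imread(os.path.join(root, "nir", name), cv2.IMREAD_GRAYSCALE)
gt  = cv2.imread(os.path.join(root, "gt", name), cv2.IMREAD_GRAYSCALE)

# Map the grayscale labels (0 soil, 1 crop, 2 weed) to the gt_color
# convention: black soil, green crop, red weeds (BGR order for OpenCV).
palette = np.array([[0, 0, 0],     # 0: soil  -> black
                    [0, 255, 0],   # 1: crop  -> green
                    [0, 0, 255]],  # 2: weed  -> red
                   dtype=np.uint8)
gt_color = palette[gt]

# Write the recolored annotation for a quick visual check against gt_color/.
cv2.imwrite("gt_color_check.png", gt_color)
```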