Source:
PubMed "smart farming"
Front Plant Sci. 2025 Dec 16;16:1722007. doi: 10.3389/fpls.2025.1722007. eCollection 2025.

ABSTRACT

This study introduces a novel publicly available computer vision dataset specifically designed for weeding behavior analysis in tea plantations. Targeting the pressing challenges posed by weed infestations and aligning with global food security strategies, the dataset aims to advance intelligent weeding behavior recognition systems. The collection comprises 108 high-definition video sequences and 6,473 annotated images, capturing a wide range of weeding activities in real tea plantation environments. Data acquisition followed a hybrid approach combining field recordings with web-crawled resources, and encompasses six categories of weeding behaviors: manual weeding, tool-assisted weeding, machine-based weeding, tool-specific actions (including hoe and rake), handheld weeding machine use, and non-working states. A key innovation of the dataset lies in its multi-view acquisition strategy, integrating frontal, lateral, and top-down perspectives to ensure robust three-dimensional understanding of weeding behaviors. Annotations are provided in both COCO and YOLO formats, ensuring compatibility with mainstream object detection frameworks. Benchmark evaluations conducted with advanced algorithms such as YOLOv8, SSD, and Faster R-CNN demonstrate the effectiveness of the dataset, with Faster R-CNN achieving a mean Average Precision (mAP) of 82.24%.

The proposed dataset establishes a valuable foundation for the development of intelligent weeding robots, precision agriculture monitoring systems, and computer vision applications in complex agricultural environments.

PMID: 41477254 | PMC: PMC12748151 | DOI: 10.3389/fpls.2025.1722007
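Because the annotations ship in YOLO format, the dataset can in principle be used directly with off-the-shelf detection tooling. The following is a minimal sketch of fine-tuning a YOLOv8 model with the ultralytics package; the data.yaml path, class names, and training hyperparameters are illustrative assumptions, not details taken from the paper.

# Minimal sketch: fine-tuning YOLOv8 on the tea-plantation weeding dataset,
# assuming its YOLO-format annotations have been unpacked locally.
# The dataset config below is a hypothetical placeholder, e.g. a data.yaml with:
#   path: datasets/tea_weeding
#   train: images/train
#   val: images/val
#   names:
#     0: manual_weeding
#     1: tool_assisted_weeding
#     2: machine_based_weeding
#     3: hoe_or_rake_action
#     4: handheld_weeding_machine
#     5: non_working
from ultralytics import YOLO

model = YOLO("yolov8n.pt")        # start from a pretrained nano checkpoint
model.train(
    data="data.yaml",             # hypothetical dataset config described above
    epochs=100,
    imgsz=640,
)
metrics = model.val()             # evaluate on the validation split
print(metrics.box.map50)          # mAP@0.5, comparable in spirit to the reported mAP

The same YOLO-format labels can also be converted to or from the provided COCO annotations if a Faster R-CNN or SSD baseline is preferred, since both formats are included in the release.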