Proximal sensing for geometric characterization of vines: A review of the latest advances
In: Computers and Electronics in Agriculture, Vol. 210, p. 107901
11 results
In: Computers and Electronics in Agriculture, Vol. 122, pp. 67-73
In: Computers and Electronics in Agriculture, Vol. 214, p. 108324
In: Computers and Electronics in Agriculture, Vol. 176, p. 105638
The authors wish to thank Pedro Hernáiz and his team (ICA-CSIC) for their invaluable help in the field trials, and acknowledge the invaluable technical support of Damián Rodríguez.

Author Contributions: The work was developed as a collaboration among all authors. J.M. Bengochea-Guevara and A. Ribeiro designed the study. J.M. Bengochea-Guevara carried out the system integration and programming. D. Andújar proposed the field experiments. J. Conesa-Muñoz mainly contributed to the development of the planner and provided support in the field tests with D. Andújar. A. Ribeiro directed the research, collaborating in the testing and the discussion of the results. The manuscript was mainly drafted by J.M. Bengochea-Guevara and A. Ribeiro and was revised and corrected by all co-authors. All authors have read and approved the final manuscript.

The concept of precision agriculture, which proposes farming management adapted to crop variability, has emerged in recent years. To implement precision agriculture effectively, data must be gathered from the field in an automated manner at minimal cost. In this study, a small autonomous field inspection vehicle was developed to minimise the impact of scouting on the crop and on soil compaction. The proposed approach integrates a camera with a GPS receiver to obtain a set of basic behaviours required of an autonomous mobile robot to inspect a crop field with full coverage. A path planner considered the field contour and the crop type to determine the best inspection route. An image-processing method was developed that extracts the central crop row in real time, under uncontrolled lighting conditions, from images acquired with a reflex camera mounted on the front of the robot. Two fuzzy controllers were also designed and developed to achieve vision-guided navigation, and a method for detecting the end of a crop row from the camera-acquired images was developed.
In addition, the manoeuvres necessary for the robot to change rows were established. These manoeuvres enable the robot to autonomously cover the entire crop by following a previously established plan without stepping on the crop rows, an essential behaviour for covering crops such as maize without damaging them.

The Spanish Government has provided full and continuing support for this research work through projects AGL2011-30442-C02-02 and AGL2014-52465-C4-3-R. We acknowledge support by the CSIC Open Access Publication Initiative through its Unit of Information Resources for Research (URICI).

Peer reviewed
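The abstract mentions two fuzzy controllers for vision-guided row following but does not give their design. A minimal Mamdani-style sketch of such a controller is shown below; the function names, membership ranges, and the (deliberately abbreviated) rule base are all illustrative assumptions, not the authors' actual design:

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_steer(offset_m, heading_deg):
    """Map the lateral offset (m) and heading error (deg) of the detected
    crop row to a steering command in [-1, 1] (negative = steer left).
    Ranges and rules are hypothetical placeholders."""
    # Fuzzify each input into negative / zero / positive sets.
    off = {"neg": tri(offset_m, -0.4, -0.2, 0.0),
           "zero": tri(offset_m, -0.2, 0.0, 0.2),
           "pos": tri(offset_m, 0.0, 0.2, 0.4)}
    head = {"neg": tri(heading_deg, -30, -15, 0),
            "zero": tri(heading_deg, -15, 0, 15),
            "pos": tri(heading_deg, 0, 15, 30)}
    # Output singletons for the steering command.
    out = {"hard_left": -1.0, "left": -0.5, "straight": 0.0,
           "right": 0.5, "hard_right": 1.0}
    # Abbreviated rule base: steer against the combined error.
    rules = [
        (min(off["pos"], head["pos"]), out["hard_left"]),
        (min(off["pos"], head["zero"]), out["left"]),
        (min(off["zero"], head["zero"]), out["straight"]),
        (min(off["neg"], head["zero"]), out["right"]),
        (min(off["neg"], head["neg"]), out["hard_right"]),
    ]
    # Weighted-average (centroid of singletons) defuzzification.
    num = sum(w * s for w, s in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

A full controller would cover every input combination and typically add a second controller for speed, but the fuzzify/infer/defuzzify structure is the same.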
BASE
In: Computers and Electronics in Agriculture, Vol. 217, p. 108576
In: Computers and Electronics in Agriculture, Vol. 157, pp. 351-358
In: Computers and Electronics in Agriculture, Vol. 92, pp. 11-15
Crop monitoring is an essential practice within the field of precision agriculture, since it is based on observing, measuring and properly responding to inter- and intra-field variability. In particular, "on ground crop inspection" potentially allows early detection of certain crop problems, or precision treatment to be carried out simultaneously with pest detection. "On ground monitoring" is also of great interest for woody crops. This paper explores the development of a low-cost crop monitoring system that can automatically create accurate 3D models (clouds of coloured points) of woody crop rows. The system consists of a mobile platform that allows the easy acquisition of information in the field at an average speed of 3 km/h. Among other sensors, the platform integrates an RGB-D sensor that provides RGB information as well as an array of distances to the objects closest to the sensor. The RGB-D information, plus the geographical positions of relevant points such as the starting and ending points of the row, allows the generation of a 3D reconstruction of a woody crop row in which every point of the cloud has a geographical location as well as RGB colour values. The proposed approach for automatic 3D reconstruction is not limited by the size of the sampled space and includes a method for removing the drift that appears in the reconstruction of large crop rows.

The Spanish Government has provided full and continuing support for this research work through project AGL2014-52465-C4-3-R. The authors wish to thank the Codorniu S.A. company for the use of the facilities on the estate of Raimat and extend their gratitude to Jordi Recasens and his team (Weed Science and Plant Ecology Research Group of the UdL) for their invaluable help in the field trials. Karla Cantuña thanks the Cotopaxi Technical University for the remuneration granted through her service commission. The authors also wish to acknowledge the ongoing technical support of Damián Rodríguez.

Peer reviewed
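The abstract does not detail how the closing drift of a long row is removed. One common first-order approach, sketched here purely as an illustration (the function, its signature, and the linear error redistribution are assumptions, not the paper's published method), anchors the local reconstruction at the surveyed row start and spreads the closing error over the cloud in proportion to each point's progress along the row:

```python
import numpy as np

def georeference_row(local_pts, start_gps, end_gps, est_end_local):
    """Georeference a row reconstruction and remove first-order drift.

    local_pts:     (N, 3) points in the sensor frame, row start at origin (m).
    start_gps:     surveyed position of the row start (projected coords, m).
    end_gps:       surveyed position of the row end.
    est_end_local: where the reconstruction placed the row end.

    The closing error (estimated end minus surveyed end) is redistributed
    linearly along the row, so the start and end snap to their GPS fixes.
    """
    est_end_geo = start_gps + est_end_local
    closing_err = est_end_geo - end_gps            # accumulated drift vector
    row_len_sq = float(est_end_local @ est_end_local)
    # Fraction of along-row progress for each point: 0 at start, 1 at end.
    frac = np.clip((local_pts @ est_end_local) / row_len_sq, 0.0, 1.0)
    return start_gps + local_pts - frac[:, None] * closing_err
```

With this correction the first point maps exactly to the start fix and the estimated end maps exactly to the surveyed end; intermediate points are shifted proportionally, which is adequate when drift grows roughly linearly with distance travelled.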
A series of scenarios was assessed by simulating various weed decision thresholds (WDT) and different weed detection and herbicide application resolutions. Variable responses were obtained depending on the spatial distribution pattern of the weed species. For a patchily distributed species (Sorghum halepense), errors in the spraying decision increased as the resolution increased and the WDT decreased. In contrast, for a uniformly distributed species (Abutilon theophrasti), errors in the spraying decision increased when both the resolution and the WDT increased. The weed decision threshold was essential in determining the suitability of patch spraying. Consequently, site-specific control would depend primarily on the ability to detect low weed densities.

This research was funded by the European Union (Seventh Framework Programme, Project no. 245986: Automation and robotics for sustainable crop and forestry management).

Peer reviewed
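The interaction of resolution and WDT described above can be simulated with a simple thresholding experiment: coarsen a fine weed-density map, take the spray decision at each resolution, and count the cells whose decision flips. The sketch below is a hypothetical illustration of that idea (grid layout, function names, and the assumption that grid dimensions divide evenly by the block size are all mine, not the study's protocol):

```python
def coarsen(density, k):
    """Average k x k blocks of a fine density grid (plants per m^2) to
    simulate a coarser detection/application resolution. Assumes the
    grid dimensions are multiples of k."""
    n, m = len(density), len(density[0])
    return [[sum(density[i + di][j + dj]
                 for di in range(k) for dj in range(k)) / (k * k)
             for j in range(0, m, k)]
            for i in range(0, n, k)]

def decision_errors(fine, k, wdt):
    """Fraction of fine cells whose spray decision (density >= WDT)
    flips when the decision is instead taken on the k-times-coarser
    map: commission errors (spraying clean cells) plus omission
    errors (missing infested cells)."""
    coarse = coarsen(fine, k)
    errs = total = 0
    for i, row in enumerate(fine):
        for j, d in enumerate(row):
            total += 1
            if (d >= wdt) != (coarse[i // k][j // k] >= wdt):
                errs += 1
    return errs / total
```

For a patchy map such as `[[10, 0], [0, 0]]` with `wdt=3`, averaging the 2x2 block dilutes the patch below the threshold and the infested cell is missed, which is the kind of resolution-dependent error the study quantifies.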
Weather conditions can affect sensors' readings when sampling outdoors. Although sensors are usually set up to cover a wide range of conditions, their operational range must be established. In recent years, depth cameras have been shown to be a promising tool for plant phenotyping and other related uses. However, the use of these devices is still challenged by prevailing field conditions. Although the influence of lighting conditions on the performance of these cameras has already been established, the effect of wind is still unknown. This study establishes the associated errors when modeling some tree characteristics at different wind speeds. A system using a Kinect v2 sensor and custom software was tested from null wind speed up to 10 m·s⁻¹. Two tree species with contrasting architecture, poplars and plums, were used as model plants. The results showed different responses depending on tree species and wind speed. Estimations of leaf area (LA) and tree volume were generally more consistent at high wind speeds in plum trees. Poplars were particularly affected by wind speeds higher than 5 m·s⁻¹. In contrast, height measurements were more consistent for poplars than for plum trees. These results show that the use of depth cameras for tree characterization must take wind conditions in the field into consideration. In general, 5 m·s⁻¹ (18 km·h⁻¹) could be established as a conservative limit for good estimations.

The Spanish Government has provided full and continuing support for this research work through project AGL2014-52465-C4. We acknowledge support by the CSIC Open Access Publication Initiative through its Unit of Information Resources for Research (URICI).
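In practice, the reported wind limit translates into a simple acquisition guard: log the anemometer reading with each depth scan and discard scans taken above the limit. A minimal sketch (the data layout and function name are hypothetical; only the 5 m·s⁻¹ default comes from the study's conservative recommendation):

```python
def filter_scans(scans, limit_ms=5.0):
    """Keep only depth-camera scans acquired at or below the wind-speed
    limit. scans: iterable of (scan_id, wind_speed_m_per_s) pairs.
    The 5.0 m/s default follows the conservative limit suggested for
    good LA/volume estimations; tighten it for wind-sensitive species
    such as poplar."""
    return [scan_id for scan_id, wind in scans if wind <= limit_ms]
```

Because poplars degraded above 5 m·s⁻¹ while plums tolerated higher speeds, the limit is best treated as a per-species parameter rather than a fixed constant.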