
Combining Domain Adaptation and Spatial Consistency for Unseen Fruits Counting: A Quasi-Unsupervised Approach / Bellocchio, E.; Costante, G.; Cascianelli, S.; Fravolini, M. L.; Valigi, P.. - In: IEEE ROBOTICS AND AUTOMATION LETTERS. - ISSN 2377-3766. - 5:2(2020), pp. 1079-1086. [10.1109/LRA.2020.2966398]

Combining Domain Adaptation and Spatial Consistency for Unseen Fruits Counting: A Quasi-Unsupervised Approach

2020

Abstract

Autonomous robotic platforms can be effectively used to perform automatic fruit yield estimation. To this end, robots need data-driven models that process image streams and count, even approximately, the number of fruits in an orchard. However, training such models under a supervised paradigm is expensive and impractical. Extending pre-trained models to estimate the yield of a completely new type of fruit is even more challenging, yet of practical interest, since this situation is typical in the field. In this work, we combine a state-of-the-art weakly-supervised fruit counting model with an unsupervised style transfer method to address this task. In this sense, our proposed approach is quasi-unsupervised. In particular, we use a Cycle-Generative Adversarial Network (C-GAN) to perform unsupervised domain adaptation and train it alongside a Presence-Absence Classifier (PAC) that discriminates between images that contain fruits and images that do not. The PAC produces the weak-supervision signal for the counting network, which can then be applied directly to the target orchard. Experiments on datasets collected in four different orchards show that the proposed approach is more accurate than supervised baseline methods.
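To make the weak-supervision idea concrete, the sketch below shows one common way a presence/absence label can supervise a counting network without per-fruit annotations. This is an illustrative assumption, not the paper's exact loss: the function name, the scalar-count interface, and the hinge-style penalty are all hypothetical.

```python
# Hedged sketch: turning a binary Presence-Absence Classifier (PAC) label
# into a training signal for a fruit-counting network.
# Assumptions (not taken from the paper): the counting network outputs a
# scalar count estimate per image, and the PAC label is True (fruit
# present) or False (fruit absent).

def pac_weak_supervision_loss(predicted_count: float, fruit_present: bool) -> float:
    """Penalty that pushes the count toward zero on absence images and
    toward at least one on presence images (a weak-counting surrogate)."""
    if fruit_present:
        # At least one fruit should be counted: hinge on (1 - count),
        # so counts >= 1 incur no penalty.
        return max(0.0, 1.0 - predicted_count) ** 2
    # No fruit in the image: any positive count is penalized quadratically.
    return predicted_count ** 2
```

With this surrogate, a perfect count on an absence image (0.0) and any count of at least one on a presence image both yield zero loss, so the gradient only corrects images the counter gets qualitatively wrong.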
Volume: 5 · Issue: 2 · Pages: 1079-1086
Bellocchio, E.; Costante, G.; Cascianelli, S.; Fravolini, M. L.; Valigi, P.
Files in this item:
Combining_Domain_Adaptation_and_Spatial_Consistency_for_Unseen_Fruits_Counting_A_Quasi-Unsupervised_Approach.pdf — Publisher's version (published version), Adobe PDF, 2.94 MB, not available (copy on request)

Use this identifier to cite or link to this item: https://hdl.handle.net/11380/1200006
Citations
  • PMC: not available
  • Scopus: 11
  • Web of Science: 11