
dc.contributor.author: Karila, Kirsi
dc.contributor.author: Alves Oliveira, Raquel
dc.contributor.author: Ek, Johannes
dc.contributor.author: Kaivosoja, Jere
dc.contributor.author: Koivumäki, Niko
dc.contributor.author: Korhonen, Panu
dc.contributor.author: Niemeläinen, Oiva
dc.contributor.author: Nyholm, Laura
dc.contributor.author: Näsi, Roope
dc.contributor.author: Pölönen, Ilkka
dc.contributor.author: Honkavaara, Eija
dc.date.accessioned: 2022-08-17T07:25:54Z
dc.date.available: 2022-08-17T07:25:54Z
dc.date.issued: 2022
dc.identifier.citation: Karila, K., Alves Oliveira, R., Ek, J., Kaivosoja, J., Koivumäki, N., Korhonen, P., Niemeläinen, O., Nyholm, L., Näsi, R., Pölönen, I., & Honkavaara, E. (2022). Estimating Grass Sward Quality and Quantity Parameters Using Drone Remote Sensing with Deep Neural Networks. Remote Sensing, 14(11), Article 2692. https://doi.org/10.3390/rs14112692
dc.identifier.other: CONVID_150887493
dc.identifier.uri: https://jyx.jyu.fi/handle/123456789/82622
dc.description.abstract: The objective of this study is to investigate the potential of novel neural network architectures for measuring the quality and quantity parameters of silage grass swards, using drone RGB and hyperspectral images (HSI), and to compare the results with the random forest (RF) method and handcrafted features. The parameters included fresh and dry biomass (FY, DMY), the digestibility of organic matter in dry matter (D-value), neutral detergent fiber (NDF), indigestible neutral detergent fiber (iNDF), water-soluble carbohydrates (WSC), nitrogen concentration (Ncont) and nitrogen uptake (NU); datasets from spring and summer growth were used. Deep pre-trained neural network architectures, the VGG16 and the Vision Transformer (ViT), and simple 2D and 3D convolutional neural networks (CNN) were studied. In most cases, the neural networks outperformed RF. On an independent test dataset, the normalized root-mean-square errors (NRMSE) of the best models were 19% for FY (2104 kg/ha), 21% for DMY (512 kg DM/ha), 1.2% for D-value (8.6 g/kg DM), 12% for iNDF (5.1 g/kg DM), 1.1% for NDF (6.2 g/kg DM), 10% for WSC (10.5 g/kg DM), 9% for Ncont (2 g N/kg DM), and 22% for NU (11.9 kg N/ha). The RGB data provided good results, particularly for FY, DMY, WSC and NU. The HSI datasets provided advantages for some parameters. The ViT and VGG provided the best results with the RGB data, whereas the simple 3D-CNN was the most consistent with the HSI data. [en]
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: MDPI AG
dc.relation.ispartofseries: Remote Sensing
dc.rights: CC BY 4.0
dc.subject.other: drone
dc.subject.other: remote sensing
dc.subject.other: hyperspectral
dc.subject.other: RGB
dc.subject.other: CNN
dc.subject.other: image transformer
dc.subject.other: silage production
dc.subject.other: grass sward
dc.title: Estimating Grass Sward Quality and Quantity Parameters Using Drone Remote Sensing with Deep Neural Networks
dc.type: article
dc.identifier.urn: URN:NBN:fi:jyu-202208174166
dc.contributor.laitos: Informaatioteknologian tiedekunta [fi]
dc.contributor.laitos: Faculty of Information Technology [en]
dc.contributor.oppiaine: Laskennallinen tiede [fi]
dc.contributor.oppiaine: Computing, Information Technology and Mathematics [fi]
dc.contributor.oppiaine: Tietotekniikka [fi]
dc.contributor.oppiaine: Computational Science [en]
dc.contributor.oppiaine: Computing, Information Technology and Mathematics [en]
dc.contributor.oppiaine: Mathematical Information Technology [en]
dc.type.uri: http://purl.org/eprint/type/JournalArticle
dc.type.coar: http://purl.org/coar/resource_type/c_2df8fbb1
dc.description.reviewstatus: peerReviewed
dc.relation.issn: 2072-4292
dc.relation.numberinseries: 11
dc.relation.volume: 14
dc.type.version: publishedVersion
dc.rights.copyright: © 2022 The Author(s).
dc.rights.accesslevel: openAccess [fi]
dc.subject.yso: kaukokartoitus (remote sensing)
dc.subject.yso: miehittämättömät ilma-alukset (unmanned aerial vehicles)
dc.subject.yso: nurmet (grass swards)
dc.subject.yso: hyperspektrikuvantaminen (hyperspectral imaging)
dc.subject.yso: ilmakuvakartoitus (aerial photo mapping)
dc.subject.yso: neuroverkot (neural networks)
dc.subject.yso: nurmiviljely (grass cultivation)
dc.subject.yso: rehuntuotanto (feed production)
dc.format.content: fulltext
jyx.subject.uri: http://www.yso.fi/onto/yso/p2521
jyx.subject.uri: http://www.yso.fi/onto/yso/p24149
jyx.subject.uri: http://www.yso.fi/onto/yso/p21042
jyx.subject.uri: http://www.yso.fi/onto/yso/p39290
jyx.subject.uri: http://www.yso.fi/onto/yso/p2520
jyx.subject.uri: http://www.yso.fi/onto/yso/p7292
jyx.subject.uri: http://www.yso.fi/onto/yso/p2668
jyx.subject.uri: http://www.yso.fi/onto/yso/p38907
dc.rights.url: https://creativecommons.org/licenses/by/4.0/
dc.relation.doi: 10.3390/rs14112692
jyx.fundinginformation: This research was funded by the Academy of Finland ICT 2023 project Smart-HSI, "Smart hyperspectral imaging solutions for a new era in Earth and planetary observations" (Decision no. 335612); by the European Agricultural Fund for Rural Development (Europe investing in rural areas), Pohjois-Savon ELY-keskus (Grant no. 145346); and by the European Regional Development Fund for the project "CyberGrass I: Introduction to remote sensing and artificial intelligence assisted silage production" (ID 20302863) in the European Union Interreg Botnia-Atlantica programme. This research was carried out in affiliation with the Academy of Finland Flagship ecosystem "Forest-Human-Machine Interplay: Building Resilience, Redefining Value Networks and Enabling Meaningful Experiences (UNITE)" (Decision no. 337127).
dc.type.okm: A1
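
The abstract above reports model accuracy as normalized root-mean-square error (NRMSE) percentages alongside absolute errors. As an illustration only, here is a minimal Python sketch of how such a metric is commonly computed; the helper name nrmse and the choice of normalizer are assumptions, since the abstract does not state whether RMSE is divided by the mean or by the range of the reference measurements.

    import numpy as np

    def nrmse(y_true, y_pred, normalizer="mean"):
        """Normalized RMSE as a percentage of the reference scale.

        Assumption: the paper may normalize by either the mean or the
        range of the reference values; both options are offered here.
        """
        y_true = np.asarray(y_true, dtype=float)
        y_pred = np.asarray(y_pred, dtype=float)
        rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
        if normalizer == "mean":
            scale = y_true.mean()
        elif normalizer == "range":
            scale = y_true.max() - y_true.min()
        else:
            raise ValueError("normalizer must be 'mean' or 'range'")
        return 100.0 * rmse / scale

    # Made-up fresh-yield values in kg/ha, not data from the paper.
    reference = [9800.0, 11200.0, 12400.0, 10100.0]
    predicted = [10150.0, 10800.0, 13050.0, 9600.0]
    print(f"NRMSE (mean-normalized): {nrmse(reference, predicted):.1f}%")

As a sanity check against the reported figures, an FY NRMSE of 19% with an absolute error of 2104 kg/ha implies a normalizing scale of roughly 11,000 kg/ha (2104 / 0.19), which is plausible under either convention, so the exact normalizer should be checked in the full text.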


Files in this item


This item appears in the following collection(s)


CC BY 4.0
Except where otherwise noted, this item's license is CC BY 4.0.