Tree species classification of drone hyperspectral and RGB imagery with deep learning convolutional neural networks
Nezami, S., Khoramshahi, E., Nevalainen, O., Pölönen, I., & Honkavaara, E. (2020). Tree species classification of drone hyperspectral and RGB imagery with deep learning convolutional neural networks. Remote Sensing, 12(7), Article 1070. https://doi.org/10.3390/rs12071070
Published in: Remote Sensing
Date: 2020
Copyright: © 2020 by the authors. Licensee MDPI, Basel, Switzerland
Interest in drone solutions for forestry applications is growing. Using drones, datasets can be captured flexibly and at high spatial and temporal resolutions whenever needed. Fundamental tasks in forestry applications include the detection of individual trees, tree species classification, and biomass estimation. Deep neural networks (DNNs) have shown superior results compared with conventional machine learning methods, such as the multi-layer perceptron (MLP), when the input data are large. The objective of this research is to investigate 3D convolutional neural networks (3D-CNNs) for classifying three major tree species in a boreal forest: pine, spruce, and birch. The proposed 3D-CNN models were employed to classify tree species at a test site in Finland. The classifiers were trained with a dataset of 3039 manually labelled trees, and their accuracies were then assessed with an independent dataset of 803 records. To find the most efficient feature combination, we compared the performance of 3D-CNN models trained with hyperspectral (HS) channels, Red-Green-Blue (RGB) channels, and the canopy height model (CHM), separately and combined. The proposed 3D-CNN model with RGB and HS layers produced the highest classification accuracy. The producer accuracies of the best 3D-CNN classifier on the test dataset were 99.6%, 94.8%, and 97.4% for pines, spruces, and birches, respectively. The best 3D-CNN classifier produced ~5% better classification accuracy than the MLP with all layers. Our results suggest that the proposed method provides excellent classification results with acceptable performance metrics for HS datasets. They also show that the pine class was detectable in most layers, spruce was most detectable in the RGB data, and birch was most detectable in the HS layers. Furthermore, the RGB datasets provide acceptable results for many low-accuracy applications.
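For illustration only, the sketch below shows how a 3D-CNN of the kind described in the abstract could be set up to classify a single tree-crown patch into pine, spruce, or birch. It is not the authors' architecture: the framework (TensorFlow/Keras), the number of stacked input layers (standing in for HS + RGB + CHM bands), the patch size, and all layer widths are assumptions chosen only to keep the example small and runnable.

```python
# Minimal sketch (not the published architecture) of a 3D-CNN that classifies
# a tree-crown patch into pine, spruce, or birch. Assumes each sample is a
# stack of spectral/CHM layers cropped to a fixed window around a treetop;
# all sizes below are illustrative placeholders.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

N_BANDS = 36     # hypothetical number of stacked spectral/CHM layers
PATCH = 25       # hypothetical spatial window (pixels) around a treetop
N_CLASSES = 3    # pine, spruce, birch

def build_3d_cnn():
    # Input shape (bands, rows, cols, 1): 3D convolutions slide over both
    # the spatial and the spectral dimensions.
    inputs = layers.Input(shape=(N_BANDS, PATCH, PATCH, 1))
    x = layers.Conv3D(16, kernel_size=(7, 3, 3), activation="relu")(inputs)
    x = layers.MaxPooling3D(pool_size=(2, 2, 2))(x)
    x = layers.Conv3D(32, kernel_size=(5, 3, 3), activation="relu")(x)
    x = layers.MaxPooling3D(pool_size=(2, 2, 2))(x)
    x = layers.Flatten()(x)
    x = layers.Dense(128, activation="relu")(x)
    x = layers.Dropout(0.5)(x)
    outputs = layers.Dense(N_CLASSES, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_3d_cnn()
    # Dummy batch standing in for manually labelled crown patches.
    patches = np.random.rand(8, N_BANDS, PATCH, PATCH, 1).astype("float32")
    labels = np.random.randint(0, N_CLASSES, size=8)
    model.fit(patches, labels, epochs=1, verbose=0)
    print(model.predict(patches).shape)  # (8, 3) class probabilities
```

The point of the 3D convolutions is that the kernels span the spectral dimension as well as the two spatial dimensions, which is what distinguishes this kind of classifier from a per-pixel MLP baseline of the sort the abstract compares against.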
...
Publisher: MDPI AG
ISSN: 2072-4292
Publication in the research information system: https://converis.jyu.fi/converis/portal/detail/Publication/35665698
Funding information: This research was financially supported by the Business Finland DroneKnowledge project (Dnro 973 1617/31/2016) and by the Academy of Finland project "Autonomous tree health analyzer based on imaging UAV spectrometry" (Decision number 327861).
Similar material
Showing items with a similar title or keywords.
- Comparison of Deep Neural Networks in the Classification of Bark Beetle-Induced Spruce Damage Using UAS Images
  Turkulainen, Emma; Honkavaara, Eija; Näsi, Roope; Oliveira, Raquel A.; Hakala, Teemu; Junttila, Samuli; Karila, Kirsi; Koivumäki, Niko; Pelto-Arvo, Mikko; Tuviala, Johanna; Östersund, Madeleine; Pölönen, Ilkka; Lyytikäinen-Saarenmaa, Päivi (MDPI AG, 2023) The widespread tree mortality caused by the European spruce bark beetle (Ips typographus L.) is a significant concern for Norway spruce-dominated (Picea abies H. Karst) forests in Europe and there is evidence of increases ...
- Chlorophyll Concentration Retrieval by Training Convolutional Neural Network for Stochastic Model of Leaf Optical Properties (SLOP) Inversion
  Annala, Leevi; Honkavaara, Eija; Tuominen, Sakari; Pölönen, Ilkka (MDPI AG, 2020) Miniaturized hyperspectral imaging techniques have developed rapidly in recent years and have become widely available for different applications. Combining calibrated hyperspectral imagery with inverse physically based ...
- Estimating Tree Health Decline Caused by Ips typographus L. from UAS RGB Images Using a Deep One-Stage Object Detection Neural Network
  Kanerva, Heini; Honkavaara, Eija; Näsi, Roope; Hakala, Teemu; Junttila, Samuli; Karila, Kirsi; Koivumäki, Niko; Alves Oliveira, Raquel; Pelto-Arvo, Mikko; Pölönen, Ilkka; Tuviala, Johanna; Östersund, Madeleine; Lyytikäinen-Saarenmaa, Päivi (MDPI, 2022) Various biotic and abiotic stresses are causing decline in forest health globally. Presently, one of the major biotic stress agents in Europe is the European spruce bark beetle (Ips typographus L.) which is increasingly ...
- Estimating Grass Sward Quality and Quantity Parameters Using Drone Remote Sensing with Deep Neural Networks
  Karila, Kirsi; Alves Oliveira, Raquel; Ek, Johannes; Kaivosoja, Jere; Koivumäki, Niko; Korhonen, Panu; Niemeläinen, Oiva; Nyholm, Laura; Näsi, Roope; Pölönen, Ilkka; Honkavaara, Eija (MDPI AG, 2022) The objective of this study is to investigate the potential of novel neural network architectures for measuring the quality and quantity parameters of silage grass swards, using drone RGB and hyperspectral images (HSI), ...
- Using Aerial Platforms in Predicting Water Quality Parameters from Hyperspectral Imaging Data with Deep Neural Networks
  Hakala, Taina; Pölönen, Ilkka; Honkavaara, Eija; Näsi, Roope; Hakala, Teemu; Lindfors, Antti (Springer, 2020) In near future it is assumable that automated unmanned aerial platforms are coming more common. There are visions that transportation of different goods would be done with large planes, which can handle over 1000 kg payloads. ...
Unless otherwise stated, publicly available JYX metadata (excluding abstracts) may be freely reused under the CC0 license.