Show simple item record

dc.contributor.author: Cronin, Neil J.
dc.date.accessioned: 2021-05-25T04:26:12Z
dc.date.available: 2021-05-25T04:26:12Z
dc.date.issued: 2021
dc.identifier.citation: Cronin, N. J. (2021). Using deep neural networks for kinematic analysis : challenges and opportunities. Journal of Biomechanics, 123, Article 110460. https://doi.org/10.1016/j.jbiomech.2021.110460
dc.identifier.other: CONVID_68771597
dc.identifier.uri: https://jyx.jyu.fi/handle/123456789/75928
dc.description.abstract: Kinematic analysis is often performed in a lab using optical cameras combined with reflective markers. With the advent of artificial intelligence techniques such as deep neural networks, it is now possible to perform such analyses without markers, making outdoor applications feasible. In this paper I summarise 2D markerless approaches for estimating joint angles, highlighting their strengths and limitations. In computer science, so-called “pose estimation” algorithms have existed for many years. These methods involve training a neural network to detect features (e.g. anatomical landmarks) using a process called supervised learning, which requires “training” images to be manually annotated. Manual labelling has several limitations, including labeller subjectivity, the requirement for anatomical knowledge, and issues related to training data quality and quantity. Neural networks typically require thousands of training examples before they can make accurate predictions, so training datasets are usually labelled by multiple people, each of whom has their own biases, which ultimately affects neural network performance. A recent approach, called transfer learning, involves modifying a model trained to perform a certain task so that it retains some learned features and is then re-trained to perform a new task. This can drastically reduce the required number of training images. Although development is ongoing, existing markerless systems may already be accurate enough for some applications, e.g. coaching or rehabilitation. Accuracy may be further improved by leveraging novel approaches and incorporating realistic physiological constraints, ultimately resulting in low-cost markerless systems that could be deployed both in and outside of the lab. (en)
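
The abstract describes transfer learning as a way to cut the number of manually labelled images needed for markerless pose estimation. The following is a minimal illustrative sketch of that idea, not code from the article: it assumes PyTorch and torchvision are available, and the backbone choice (resnet18), the regression head, and N_KEYPOINTS are hypothetical. A network pre-trained on a generic image task is partially frozen, and only a new keypoint head is re-trained on a small annotated dataset.

    # Illustrative sketch of transfer learning for 2D keypoint (pose) estimation.
    # Assumes PyTorch/torchvision; backbone choice and N_KEYPOINTS are hypothetical.
    import torch
    import torch.nn as nn
    import torchvision

    N_KEYPOINTS = 17  # hypothetical number of anatomical landmarks

    # Start from a backbone pre-trained on a generic image task (e.g. ImageNet).
    backbone = torchvision.models.resnet18(weights="DEFAULT")

    # Freeze the pre-trained layers so their learned low-level features are retained.
    for param in backbone.parameters():
        param.requires_grad = False

    # Replace the classification head with a small regression head predicting
    # (x, y) coordinates per keypoint; only this head is re-trained on the
    # new, much smaller labelled dataset.
    backbone.fc = nn.Linear(backbone.fc.in_features, N_KEYPOINTS * 2)

    optimizer = torch.optim.Adam(
        [p for p in backbone.parameters() if p.requires_grad], lr=1e-3
    )
    loss_fn = nn.MSELoss()

    # One hypothetical training step on a batch of manually annotated frames.
    images = torch.randn(8, 3, 224, 224)       # batch of input frames
    targets = torch.randn(8, N_KEYPOINTS * 2)  # manually labelled landmark coordinates
    optimizer.zero_grad()
    loss = loss_fn(backbone(images), targets)
    loss.backward()
    optimizer.step()

In practice, dedicated pose-estimation toolkits wrap this workflow, but the core step is the same: reuse previously learned features and re-train only a small part of the network, which is why far fewer training images are needed.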
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: Elsevier BV
dc.relation.ispartofseries: Journal of Biomechanics
dc.rights: CC BY 4.0
dc.subject.other: motion analysis
dc.subject.other: kinematics
dc.subject.other: deep neural network
dc.subject.other: markerless tracking
dc.subject.other: artificial intelligence
dc.title: Using deep neural networks for kinematic analysis : challenges and opportunities
dc.type: article
dc.identifier.urn: URN:NBN:fi:jyu-202105253187
dc.contributor.laitos: Liikuntatieteellinen tiedekunta (fi)
dc.contributor.laitos: Faculty of Sport and Health Sciences (en)
dc.type.uri: http://purl.org/eprint/type/JournalArticle
dc.type.coar: http://purl.org/coar/resource_type/c_2df8fbb1
dc.description.reviewstatus: peerReviewed
dc.relation.issn: 0021-9290
dc.relation.volume: 123
dc.type.version: publishedVersion
dc.rights.copyright: © 2021 The Author(s). Published by Elsevier Ltd.
dc.rights.accesslevel: openAccess (fi)
dc.relation.grantnumber: 323473
dc.subject.yso: liikeoppi (kinematics)
dc.subject.yso: liikeanalyysi (motion analysis)
dc.subject.yso: neuroverkot (neural networks)
dc.subject.yso: koneoppiminen (machine learning)
dc.format.content: fulltext
jyx.subject.uri: http://www.yso.fi/onto/yso/p16028
jyx.subject.uri: http://www.yso.fi/onto/yso/p24952
jyx.subject.uri: http://www.yso.fi/onto/yso/p7292
jyx.subject.uri: http://www.yso.fi/onto/yso/p21846
dc.rights.url: https://creativecommons.org/licenses/by/4.0/
dc.relation.doi: 10.1016/j.jbiomech.2021.110460
dc.relation.funder: Research Council of Finland (en)
dc.relation.funder: Suomen Akatemia (fi)
jyx.fundingprogram: Academy Project, AoF (en)
jyx.fundingprogram: Akatemiahanke, SA (fi)
jyx.fundinginformation: Academy of Finland for funding (decision number: 323473).
dc.type.okm: A1

