
dc.contributor.author: Psyridou, Maria
dc.contributor.author: Tolvanen, Asko
dc.contributor.author: Patel, Priyanka
dc.contributor.author: Khanolainen, Daria
dc.contributor.author: Lerkkanen, Marja-Kristiina
dc.contributor.author: Poikkeus, Anna-Maija
dc.contributor.author: Torppa, Minna
dc.date.accessioned: 2022-08-15T13:05:43Z
dc.date.available: 2022-08-15T13:05:43Z
dc.date.issued: 2023
dc.identifier.citation: Psyridou, M., Tolvanen, A., Patel, P., Khanolainen, D., Lerkkanen, M.-K., Poikkeus, A.-M., & Torppa, M. (2023). Reading Difficulties Identification: A Comparison of Neural Networks, Linear, and Mixture Models. Scientific Studies of Reading, 27(1), 39-66. https://doi.org/10.1080/10888438.2022.2095281
dc.identifier.other: CONVID_150876586
dc.identifier.uri: https://jyx.jyu.fi/handle/123456789/82553
dc.description.abstract: Purpose: We aim to identify the most accurate model for predicting adolescent (Grade 9) reading difficulties (RD) in reading fluency and reading comprehension using 17 kindergarten-age variables. Three models (neural networks, linear, and mixture) were compared based on their accuracy in predicting RD. We also examined whether the same or a different set of kindergarten-age factors emerges as the strongest predictors of reading fluency and comprehension difficulties across the models. Method: RD were identified in a Finnish sample (N ≈ 2,000) based on Grade 9 difficulties in reading fluency and reading comprehension. The predictors assessed in kindergarten included gender, parental factors (e.g., parental RD, education level), cognitive skills (e.g., phonological awareness, RAN), home literacy environment, and task-avoidant behavior. Results: The results suggested that the neural network model is the most accurate method, compared with the linear and mixture models or their combination, for the early prediction of adolescent reading fluency and reading comprehension difficulties. The three models yielded rather similar results regarding the predictors, highlighting the importance of RAN, letter knowledge, vocabulary, word reading, number counting, gender, and maternal education. Conclusion: The results suggest that neural networks hold strong promise in the field of reading research for the early identification of RD. (en)
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: Taylor & Francis
dc.relation.ispartofseries: Scientific Studies of Reading
dc.rights: CC BY 4.0
dc.title: Reading Difficulties Identification: A Comparison of Neural Networks, Linear, and Mixture Models
dc.type: article
dc.identifier.urn: URN:NBN:fi:jyu-202208154097
dc.contributor.laitos: Psykologian laitos (fi)
dc.contributor.laitos: Opettajankoulutuslaitos (fi)
dc.contributor.laitos: Department of Psychology (en)
dc.contributor.laitos: Department of Teacher Education (en)
dc.contributor.oppiaine: Kasvatuspsykologia (fi)
dc.contributor.oppiaine: Psykologia (fi)
dc.contributor.oppiaine: Esi- ja alkuopetus (fi)
dc.contributor.oppiaine: Resurssiviisausyhteisö (fi)
dc.contributor.oppiaine: Kasvatuspsykologia (en)
dc.contributor.oppiaine: Psychology (en)
dc.contributor.oppiaine: Pre- and Early Childhood Education (en)
dc.contributor.oppiaine: School of Resource Wisdom (en)
dc.type.uri: http://purl.org/eprint/type/JournalArticle
dc.type.coar: http://purl.org/coar/resource_type/c_2df8fbb1
dc.description.reviewstatus: peerReviewed
dc.format.pagerange: 39-66
dc.relation.issn: 1088-8438
dc.relation.numberinseries: 1
dc.relation.volume: 27
dc.type.version: publishedVersion
dc.rights.copyright: © 2022 The Author(s). Published with license by Taylor & Francis Group, LLC.
dc.rights.accesslevel: openAccess (fi)
dc.relation.grantnumber: 339418
dc.relation.grantnumber: 284439
dc.relation.grantnumber: 292466
dc.relation.grantnumber: 276239
dc.relation.grantnumber: 268586
dc.relation.grantnumber: 313768
dc.subject.yso: mallit (mallintaminen) [models (modelling)]
dc.subject.yso: kognitiiviset taidot [cognitive skills]
dc.subject.yso: hermoverkot (biologia) [neural networks (biology)]
dc.subject.yso: lukihäiriöt [dyslexia]
dc.subject.yso: oppimisvaikeudet [learning difficulties]
dc.subject.yso: tunnistaminen [identification]
dc.subject.yso: lukutaito [literacy]
dc.subject.yso: ennustettavuus [predictability]
dc.subject.yso: luetun ymmärtäminen [reading comprehension]
dc.subject.yso: lukeminen [reading]
dc.format.content: fulltext
jyx.subject.uri: http://www.yso.fi/onto/yso/p510
jyx.subject.uri: http://www.yso.fi/onto/yso/p24920
jyx.subject.uri: http://www.yso.fi/onto/yso/p38811
jyx.subject.uri: http://www.yso.fi/onto/yso/p5301
jyx.subject.uri: http://www.yso.fi/onto/yso/p5302
jyx.subject.uri: http://www.yso.fi/onto/yso/p8265
jyx.subject.uri: http://www.yso.fi/onto/yso/p11405
jyx.subject.uri: http://www.yso.fi/onto/yso/p9701
jyx.subject.uri: http://www.yso.fi/onto/yso/p20611
jyx.subject.uri: http://www.yso.fi/onto/yso/p11406
dc.rights.url: https://creativecommons.org/licenses/by/4.0/
dc.relation.doi: 10.1080/10888438.2022.2095281
dc.relation.funder: Research Council of Finland (en)
dc.relation.funder: Research Council of Finland (en)
dc.relation.funder: Research Council of Finland (en)
dc.relation.funder: Research Council of Finland (en)
dc.relation.funder: Research Council of Finland (en)
dc.relation.funder: Research Council of Finland (en)
dc.relation.funder: Suomen Akatemia (fi)
dc.relation.funder: Suomen Akatemia (fi)
dc.relation.funder: Suomen Akatemia (fi)
dc.relation.funder: Suomen Akatemia (fi)
dc.relation.funder: Suomen Akatemia (fi)
dc.relation.funder: Suomen Akatemia (fi)
jyx.fundingprogram: Postdoctoral Researcher, AoF (en)
jyx.fundingprogram: Research costs of Academy Research Fellow, AoF (en)
jyx.fundingprogram: Research profiles, AoF (en)
jyx.fundingprogram: Academy Research Fellow, AoF (en)
jyx.fundingprogram: Academy Project, AoF (en)
jyx.fundingprogram: Research costs of Academy Research Fellow, AoF (en)
jyx.fundingprogram: Tutkijatohtori, SA (fi)
jyx.fundingprogram: Akatemiatutkijan tutkimuskulut, SA (fi)
jyx.fundingprogram: Profilointi, SA (fi)
jyx.fundingprogram: Akatemiatutkija, SA (fi)
jyx.fundingprogram: Akatemiahanke, SA (fi)
jyx.fundingprogram: Akatemiatutkijan tutkimuskulut, SA (fi)
jyx.fundinginformation: This work was supported by the Academy of Finland [Grant numbers #263891, #268586, #276239, #284439, #292466, #313768, #339418].
dc.type.okm: A1
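The abstract above describes a comparison of neural network, linear, and mixture models for predicting a binary reading-difficulty (RD) outcome from 17 kindergarten-age predictors. The record does not include the authors' analysis code, so the following is only a minimal, hypothetical sketch of how such a model comparison could be set up. It uses Python with scikit-learn, synthetic data in place of the study's measures, and logistic regression as the linear model; none of this should be read as the authors' actual method or results.

```python
# Hypothetical sketch: comparing a neural network and a linear (logistic) model
# for predicting a binary reading-difficulty (RD) outcome from early predictors.
# Synthetic data stands in for the study's measures; all settings are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for roughly 2,000 children with 17 kindergarten-age predictors
# and a minority RD class (class imbalance chosen for illustration only).
X, y = make_classification(n_samples=2000, n_features=17, n_informative=8,
                           weights=[0.85, 0.15], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

models = {
    "linear (logistic regression)": make_pipeline(
        StandardScaler(), LogisticRegression(max_iter=1000)),
    "neural network (MLP)": make_pipeline(
        StandardScaler(), MLPClassifier(hidden_layer_sizes=(32, 16),
                                        max_iter=2000, random_state=0)),
}

# Fit each model on the training split and compare predictive accuracy on held-out data.
for name, model in models.items():
    model.fit(X_train, y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"{name}: test ROC AUC = {auc:.3f}")
```

A fuller comparison along the lines of the abstract would also include a mixture (latent profile) model and report cut-off-based classification measures such as sensitivity and specificity rather than only ROC AUC.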

