Publication:
Recognizing gaits across views through correlated motion co-clustering

dc.contributor.author: Worapan Kusakunniran
dc.contributor.author: Qiang Wu
dc.contributor.author: Jian Zhang
dc.contributor.author: Hongdong Li
dc.contributor.author: Liang Wang
dc.contributor.other: Mahidol University
dc.contributor.other: University of Technology Sydney
dc.contributor.other: CSIRO Data61
dc.contributor.other: Australian National University
dc.contributor.other: Institute of Automation, Chinese Academy of Sciences
dc.date.accessioned: 2018-11-09T02:09:29Z
dc.date.available: 2018-11-09T02:09:29Z
dc.date.issued: 2014-02-01
dc.description.abstract: Human gait is an important biometric feature that can be used to identify a person remotely. However, a change of viewing angle causes significant difficulties for gait recognition because it substantially alters the visual features available for matching. Moreover, different parts of the gait are affected differently by a view change. By exploring the relations between gaits observed from two different views, it is also observed that a given part of the gait in one view is more strongly correlated with one particular part of the gait in the other view than with any other part. The method proposed in this paper exploits this variation of correlations between gaits across views, which is not explicitly analyzed by existing methods. In our method, a novel motion co-clustering is carried out to partition the most related parts of gaits from different views into the same group. In this way, relationships between gaits from different views are described more precisely by multiple groups from the motion co-clustering rather than by a single correlation descriptor. Within each group, the linear correlation between gait information across views is further maximized through canonical correlation analysis (CCA). Consequently, gait information in one view can be projected onto another view through a linear approximation in the trained CCA subspaces. Finally, the similarity between gaits originally recorded from different views can be measured under approximately the same view. Comprehensive experiments on widely adopted gait databases show that our method outperforms the state of the art. © 2013 IEEE.
dc.identifier.citation: IEEE Transactions on Image Processing. Vol.23, No.2 (2014), 696-709
dc.identifier.doi: 10.1109/TIP.2013.2294552
dc.identifier.issn: 1057-7149
dc.identifier.other: 2-s2.0-84892596841
dc.identifier.uri: https://repository.li.mahidol.ac.th/handle/20.500.14594/33686
dc.rights: Mahidol University
dc.rights.holder: SCOPUS
dc.source.uri: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=84892596841&origin=inward
dc.subject: Computer Science
dc.title: Recognizing gaits across views through correlated motion co-clustering
dc.type: Article
dspace.entity.type: Publication
mu.datasource.scopus: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=84892596841&origin=inward
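
The abstract above describes two main steps: grouping the most related gait parts across views via motion co-clustering, and maximizing a per-group linear correlation with CCA so gait features can be projected between views and matched. The following is a minimal Python sketch of that idea, not the authors' implementation: it uses synthetic features, scikit-learn's CCA, and fixed index blocks as a hypothetical stand-in for the learned co-clusters.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)

# Synthetic stand-in features: n training subjects, d-dimensional gait
# descriptors (e.g. flattened gait energy images) observed under two views.
n_subjects, dim = 60, 90
view_a = rng.normal(size=(n_subjects, dim))
view_b = 0.7 * view_a + 0.3 * rng.normal(size=(n_subjects, dim))  # correlated views

# Hypothetical grouping: the paper derives these groups via correlated motion
# co-clustering; here three fixed index blocks stand in for that result.
groups = [np.arange(0, 30), np.arange(30, 60), np.arange(60, 90)]

# Train one CCA model per group so that each group's cross-view linear
# correlation is maximized independently.
models = []
for idx in groups:
    cca = CCA(n_components=5)
    cca.fit(view_a[:, idx], view_b[:, idx])
    models.append(cca)

def cross_view_similarity(gait_a, gait_b):
    """Project a view-A gait and a view-B gait into the per-group CCA
    subspaces and accumulate cosine similarities over the groups."""
    score = 0.0
    for idx, cca in zip(groups, models):
        pa, gb = cca.transform(gait_a[idx][None, :], gait_b[idx][None, :])
        num = float(np.dot(pa.ravel(), gb.ravel()))
        den = np.linalg.norm(pa) * np.linalg.norm(gb) + 1e-12
        score += num / den
    return score

# The matching pair is expected to score higher than the mismatched pair.
print(cross_view_similarity(view_a[0], view_b[0]))
print(cross_view_similarity(view_a[0], view_b[1]))
```

In this sketch the per-group scores are simply summed; the paper measures similarity in the CCA subspaces after projecting one view onto the other, so the aggregation rule here is only an illustrative choice.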
