
dc.identifier.uri           http://hdl.handle.net/1951/55969
dc.identifier.uri           http://hdl.handle.net/11401/71574
dc.description.sponsorship  This work is sponsored by the Stony Brook University Graduate School in compliance with the requirements for completion of degree.  en_US
dc.format                   Monograph
dc.format.medium            Electronic Resource  en_US
dc.language.iso             en_US
dc.publisher                The Graduate School, Stony Brook University: Stony Brook, NY.
dc.type                     Thesis
dcterms.abstract            Recognizing moves and movements of the human body is a challenging problem due to the body's self-occluding nature and the many degrees of freedom of its numerous joints. This work presents a method to tag human actions and interactions by first discovering the human skeleton from depth images acquired by infrared range sensors and then exploiting the resulting skeletal tracking. Instead of estimating the pose of each body part contributing to a set of moves in a decoupled way, we represent a single-person move or a two-person interaction in terms of skeletal joint positions. A single-person move is thus defined by the spatial and temporal arrangement of the performer's skeletal framework over the episode of the associated move, and a two-person interactive event is defined in terms of both participating agents' skeletal frameworks over time. We experiment with two different modes of tagging human moves and movements. In collaboration with the Music department, we explore an innovative way to tag a single person's moves with music: as a participating agent performs a set of movements, musical notes are generated according to the velocity, acceleration, and change in position of the agent's body parts. We also classify human interactions into a set of well-defined classes. We present the K-10 Interaction Dataset, containing ten classes of two-person interactions performed by six different agents and captured with the Kinect for Xbox 360. We construct interaction representations in terms of local space-time features and integrate these representations with SVM classification schemes for recognition. We further align the clips in our dataset using the Canonical Time Warping algorithm, which improves the interaction classification results.
dcterms.available           2012-05-17T12:20:06Z
dcterms.available           2015-04-24T14:47:59Z
dcterms.contributor         Dimitris Samaras  en_US
dcterms.contributor         Tamara L. Berg  en_US
dcterms.contributor         Alexander Berg  en_US
dcterms.contributor         Margaret Anne Schedel  en_US
dcterms.creator             Chattopadhyay, Debaleena
dcterms.dateAccepted        2012-05-17T12:20:06Z
dcterms.dateAccepted        2015-04-24T14:47:59Z
dcterms.dateSubmitted       2012-05-17T12:20:06Z
dcterms.dateSubmitted       2015-04-24T14:47:59Z
dcterms.description         Department of Computer Science  en_US
dcterms.format              Application/PDF  en_US
dcterms.format              Monograph
dcterms.identifier          Chattopadhyay_grad.sunysb_0771M_10470.pdf  en_US
dcterms.identifier          http://hdl.handle.net/1951/55969
dcterms.identifier          http://hdl.handle.net/11401/71574
dcterms.issued              2011-05-01
dcterms.language            en_US
dcterms.provenance          Made available in DSpace on 2012-05-17T12:20:06Z (GMT). No. of bitstreams: 1 Chattopadhyay_grad.sunysb_0771M_10470.pdf: 2981305 bytes, checksum: 7b8818e894712fe2d6fa8f351d3653f7 (MD5) Previous issue date: 1  en
dcterms.provenance          Made available in DSpace on 2015-04-24T14:47:59Z (GMT). No. of bitstreams: 3 Chattopadhyay_grad.sunysb_0771M_10470.pdf.jpg: 1894 bytes, checksum: a6009c46e6ec8251b348085684cba80d (MD5) Chattopadhyay_grad.sunysb_0771M_10470.pdf: 2981305 bytes, checksum: 7b8818e894712fe2d6fa8f351d3653f7 (MD5) Chattopadhyay_grad.sunysb_0771M_10470.pdf.txt: 55021 bytes, checksum: bbc9d23bb4b62dd47569be59c1c72532 (MD5) Previous issue date: 1  en
dcterms.publisher           The Graduate School, Stony Brook University: Stony Brook, NY.
dcterms.subject             Action Recognition, Computer Vision, Interaction Recognition, Kinect, motion to music
dcterms.subject             Computer Science
dcterms.title               Multimodal Tagging of Human Motion Using Skeletal Tracking With Kinect
dcterms.type                Thesis
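
The abstract above describes two technical components: mapping the velocity, acceleration, and change in position of tracked body parts to musical notes, and classifying skeletal joint sequences of interactions. The short Python sketch below is illustrative only, not the thesis implementation; it assumes a Kinect-style skeleton stream shaped (frames, joints, 3) and uses hypothetical names such as kinematic_features, showing how per-joint speed and acceleration magnitudes could be obtained by finite differences before being mapped to musical parameters or classifier features.

import numpy as np

def kinematic_features(joint_positions, fps=30.0):
    """Per-joint speed and acceleration magnitudes from a skeleton sequence.

    joint_positions: array of shape (T, J, 3) -- T frames, J joints, xyz.
    Returns (speed, accel) with shapes (T-1, J) and (T-2, J).
    """
    dt = 1.0 / fps
    # Finite differences approximate velocity and acceleration per joint.
    velocity = np.diff(joint_positions, axis=0) / dt        # (T-1, J, 3)
    acceleration = np.diff(velocity, axis=0) / dt           # (T-2, J, 3)
    # Scalar magnitudes are convenient quantities to map onto, e.g., pitch or loudness.
    speed = np.linalg.norm(velocity, axis=-1)
    accel = np.linalg.norm(acceleration, axis=-1)
    return speed, accel

if __name__ == "__main__":
    # Synthetic stand-in for Kinect output: 90 frames, 20 joints, 3-D positions.
    rng = np.random.default_rng(0)
    traj = np.cumsum(rng.normal(scale=0.01, size=(90, 20, 3)), axis=0)
    speed, accel = kinematic_features(traj)
    print(speed.shape, accel.shape)  # (89, 20) (88, 20)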

