
dc.identifier.uri: http://hdl.handle.net/1951/55391
dc.identifier.uri: http://hdl.handle.net/11401/70965
dc.description.sponsorship: This work is sponsored by the Stony Brook University Graduate School in compliance with the requirements for completion of the degree. [en_US]
dc.format: Monograph
dc.format.medium: Electronic Resource [en_US]
dc.language.iso: en_US
dc.publisher: The Graduate School, Stony Brook University: Stony Brook, NY.
dc.type: Dissertation
dcterms.abstract: Multiple object tracking and association are key capabilities in mobile sensor based applications (e.g., large-scale flexible surveillance systems and multi-robot systems). Such systems track and identify multiple objects autonomously and intelligently, without human operators. They also flexibly control the deployed sensors to maximize resource utilization as well as system performance. Moreover, methodologies for tracking and association should be robust against non-ideal phenomena such as false or failed data processing. In this thesis, we address various issues in collaborative and heterogeneous signal processing for such applications and present approaches to resolve them.

Multiple object association (finding the correspondence of objects among cameras) is an important capability in multi-camera environments. We introduce a locally initiated, line-based object association method that supports flexible camera movements. The method can be extended to multiple cameras through pair-wise collaboration on the object association. While pair-wise collaboration is effective for objects with sufficient separation, the association is not well established for objects without sufficient separation and may produce false associations. We therefore extend the locally initiated, homographic-line-based association method to two multi-camera collaboration strategies that reduce false associations. Collaboration matrices are defined with the minimum separation required for effective collaboration. The first strategy uses the collaboration matrices to select, from the available cameras, the best pair with the maximum separation to collaborate efficiently on the object association. The association information in the selected cameras is propagated to the unselected cameras through global information constructed from the associated targets. While the first strategy requires a long operation time to achieve a high association rate, owing to the limited view of the best pair, it reduces the computational cost by using homographic lines. The second strategy initiates the collaborative association process for all camera pairings, regardless of separation. While the repeated association processes improve association performance, the number of homographic-line transformations grows exponentially.

Identification of tracked objects is achieved by using two different signals. An RFID tag is used for object identification, and visual sensors are used for estimating object movements: they find the correspondence of objects among cameras and localize them. Tracked positions are associated with identities by exploiting the dynamics of objects crossing the modeled boundaries of the identification sensors. The proposed association method provides recovery from tracking and association failures. We also consider the coverage uncertainty induced by identification signal characteristics or by multiple objects near the boundary of identification sensor coverage. Group association and incomplete group association are introduced to resolve identification problems under coverage uncertainty. Simulation results demonstrate the stability of the proposed method against non-ideal phenomena such as false detection, false tracking, and inaccurate coverage models.

Finally, a novel self-localization method is presented to support mobile sensors. The algorithm estimates the coordinates and orientation of a mobile sensor using references projected onto the visual image. The proposed method accounts for the lens non-linearity of the camera and compensates for the distortion by using a calibration table. The algorithm can be used in mobile robot navigation as well as in positioning applications where accurate self-localization is necessary.
dcterms.available: 2012-05-15T18:02:43Z
dcterms.available: 2015-04-24T14:45:20Z
dcterms.contributor: Hong, Sangjin [en_US]
dcterms.contributor: Murali Subbarao [en_US]
dcterms.contributor: Monica Fernandez-Bugallo [en_US]
dcterms.contributor: Hongshik Ahn [en_US]
dcterms.creator: Cho, Shung Han
dcterms.dateAccepted: 2012-05-15T18:02:43Z
dcterms.dateAccepted: 2015-04-24T14:45:20Z
dcterms.dateSubmitted: 2012-05-15T18:02:43Z
dcterms.dateSubmitted: 2015-04-24T14:45:20Z
dcterms.description: Department of Computer Engineering [en_US]
dcterms.format: Monograph
dcterms.format: Application/PDF [en_US]
dcterms.identifier: http://hdl.handle.net/1951/55391
dcterms.identifier: Cho_grad.sunysb_0771E_10195.pdf [en_US]
dcterms.identifier: http://hdl.handle.net/11401/70965
dcterms.issued: 2010-08-01
dcterms.language: en_US
dcterms.provenance: Made available in DSpace on 2012-05-15T18:02:43Z (GMT). No. of bitstreams: 1. Cho_grad.sunysb_0771E_10195.pdf: 7284456 bytes, checksum: a045c05537ffa9ecfe90cfd34e3071b0 (MD5). Previous issue date: 1 [en]
dcterms.provenance: Made available in DSpace on 2015-04-24T14:45:20Z (GMT). No. of bitstreams: 3. Cho_grad.sunysb_0771E_10195.pdf.jpg: 1894 bytes, checksum: a6009c46e6ec8251b348085684cba80d (MD5). Cho_grad.sunysb_0771E_10195.pdf.txt: 243015 bytes, checksum: 965f9d742257a24bf729f906bf8ae89d (MD5). Cho_grad.sunysb_0771E_10195.pdf: 7284456 bytes, checksum: a045c05537ffa9ecfe90cfd34e3071b0 (MD5). Previous issue date: 1 [en]
dcterms.publisher: The Graduate School, Stony Brook University: Stony Brook, NY.
dcterms.subject: Heterogeneous sensor network, Multiple camera collaboration, Multiple object association, Multiple object identification, Multiple object tracking, Self-localization
dcterms.subject: Computer Engineering
dcterms.title: Collaborative and Heterogeneous Signal Processing Methodology for Mobile Sensor Based Applications
dcterms.type: Dissertation

