Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/114819
DC Field | Value | Language
dc.contributor | Department of Aeronautical and Aviation Engineering | -
dc.creator | Lee, MJL | -
dc.creator | Ng, HF | -
dc.creator | Hsu, LT | -
dc.date.accessioned | 2025-08-28T09:29:11Z | -
dc.date.available | 2025-08-28T09:29:11Z | -
dc.identifier.issn | 0018-9456 | -
dc.identifier.uri | http://hdl.handle.net/10397/114819 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | © 2025 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication M. J. L. Lee, H. -F. Ng and L. -T. Hsu, 'Automated Camera Positioning for Digital Twinning: Incorporating 360° Imaging, Differential Rendering, and 3-D Spatial Data,' in IEEE Transactions on Instrumentation and Measurement, vol. 74, pp. 1-9, 2025, Art no. 5038909 is available at 10.1109/TIM.2025.3584142. | en_US
dc.subject | 3-D spatial data | en_US
dc.subject | 360 imaging | en_US
dc.subject | Camera positioning | en_US
dc.subject | Differential rendering | en_US
dc.subject | Digital twin | en_US
dc.title | Automated camera positioning for digital twinning : incorporating 360° imaging, differential rendering, and 3-D spatial data | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 74 | -
dc.identifier.doi | 10.1109/TIM.2025.3584142 | -
dcterms.abstract | Efficient and accurate camera positioning is crucial in dynamic environments, particularly within the construction and facilities management industry. This study presents an innovative method that combines 360° imaging, differential rendering, and 3-D spatial data to achieve precise camera positioning. Initially, Global Navigation Satellite Systems (GNSS)-based positioning is used, which is then refined using a 3-D spatial model. Synthetic views are generated for comparison with real-world panoramas captured by a 360° camera. By minimizing discrepancies through differential rendering—specifically by comparing material and color attributes of buildings—we enhance camera positioning accuracy and orientation. Experimental results demonstrate the effectiveness of our method across various environments, achieving positioning errors as low as 0.84 m in close proximity to structures and maintaining consistent accuracy across different scenarios. Our approach leverages surrounding buildings and employs level of detail (LOD) weighting, outperforming existing techniques, including smartphone-based positioning, weighted-least-square methods, and the 3D-mapping-aided algorithm. This research addresses key industry challenges, offering substantial time and labor savings, and provides a solid foundation for future advancements. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | IEEE transactions on instrumentation and measurement, 2025, v. 74, 5038909 | -
dcterms.isPartOf | IEEE transactions on instrumentation and measurement | -
dcterms.issued | 2025 | -
dc.identifier.scopus | 2-s2.0-105010050969 | -
dc.identifier.eissn | 1557-9662 | -
dc.identifier.artn | 5038909 | -
dc.description.validate | 202508 bcch | -
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.SubFormID | G000109/2025-08 | en_US
dc.description.fundingSource | Self-funded | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Green (AAM) | en_US
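
Note: the abstract above outlines a rendering-based pose refinement, in which a coarse GNSS fix is refined by comparing synthetic panoramas rendered from a 3-D spatial model against the captured 360° image. The Python sketch below only illustrates the shape of that loop. The names (Pose, render_panorama, photometric_loss, refine_pose), the toy renderer, and the derivative-free search are illustrative placeholders and simplifications, not the paper's actual differential-rendering implementation.

# A minimal, hypothetical sketch of the pipeline described in the abstract:
# start from a coarse GNSS fix, render synthetic panoramas from a 3-D city
# model at candidate poses, and keep the pose whose rendering best matches
# the captured 360-degree image (colour discrepancy, optionally LOD-weighted).
# All names here are illustrative placeholders, not the authors' code.

from __future__ import annotations

from dataclasses import dataclass

import numpy as np


@dataclass
class Pose:
    x: float        # east offset in metres
    y: float        # north offset in metres
    heading: float  # yaw in radians


def render_panorama(model: np.ndarray, pose: Pose,
                    width: int = 64, height: int = 32) -> np.ndarray:
    """Placeholder renderer returning an RGB panorama for `pose`.

    A real system would ray-cast the LOD city model; this toy version only
    produces a pose-dependent image so the refinement loop below is runnable.
    """
    u = np.linspace(0.0, 2.0 * np.pi, width) + pose.heading
    v = np.linspace(-1.0, 1.0, height)[:, None]
    r = (0.5 + 0.5 * np.sin(u + 0.1 * pose.x)) * np.ones_like(v)
    g = (0.5 + 0.5 * np.cos(2.0 * u + 0.1 * pose.y)) * np.ones_like(v)
    b = (0.5 + 0.5 * v) * np.ones_like(u)
    return np.stack([r, g, b], axis=-1)


def photometric_loss(rendered: np.ndarray, observed: np.ndarray,
                     weights: np.ndarray | None = None) -> float:
    """Mean squared colour discrepancy between synthetic and real panoramas.

    `weights` could up-weight pixels covered by high-LOD building facades,
    mirroring the LOD weighting mentioned in the abstract.
    """
    diff = (rendered - observed) ** 2
    if weights is not None:
        diff = diff * weights[..., None]
    return float(diff.mean())


def refine_pose(model: np.ndarray, observed: np.ndarray, init: Pose,
                step: float = 1.0, iters: int = 30) -> Pose:
    """Derivative-free local search around the GNSS initial pose.

    The paper minimises the rendering discrepancy via differential rendering;
    this sketch uses simple coordinate descent purely to illustrate the loop.
    """
    best = init
    best_loss = photometric_loss(render_panorama(model, best), observed)
    for _ in range(iters):
        improved = False
        for dx, dy, dh in [(step, 0, 0), (-step, 0, 0), (0, step, 0),
                           (0, -step, 0), (0, 0, 0.05), (0, 0, -0.05)]:
            cand = Pose(best.x + dx, best.y + dy, best.heading + dh)
            loss = photometric_loss(render_panorama(model, cand), observed)
            if loss < best_loss:
                best, best_loss, improved = cand, loss, True
        if not improved:
            step *= 0.5  # shrink the search radius once no neighbour improves
    return best


if __name__ == "__main__":
    city_model = np.zeros(1)                    # stand-in for the 3-D spatial model
    observed_pano = render_panorama(city_model, Pose(3.0, -2.0, 0.3))
    gnss_fix = Pose(8.0, 4.0, 0.0)              # coarse GNSS-based initial estimate
    refined = refine_pose(city_model, observed_pano, gnss_fix)
    print(f"refined pose: x={refined.x:.2f} m, y={refined.y:.2f} m, "
          f"heading={refined.heading:.2f} rad")

In the published method, the discrepancy is minimised through differential rendering over material and colour attributes with LOD weighting of surrounding buildings; the coordinate-descent stand-in above only shows where those components would plug in.
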
Appears in Collections: Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Lee_Automated_Camera_Positioning.pdf | Pre-Published version | 35.07 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.