Please use this identifier to cite or link to this item: http://localhost:8081/jspui/handle/123456789/18551
Full metadata record
DC Field | Value | Language
dc.contributor.author | Jain, Priyal | -
dc.date.accessioned | 2025-12-18T06:18:55Z | -
dc.date.available | 2025-12-18T06:18:55Z | -
dc.date.issued | 2024-06 | -
dc.identifier.uri | http://localhost:8081/jspui/handle/123456789/18551 | -
dc.guide | Balasubramanian, R. | en_US
dc.description.abstract | 3D point clouds are widely used in tasks such as scene understanding, classification, and shape retrieval, and their analysis has attracted significant attention in recent years. A major challenge in 3D point cloud analysis, however, is achieving rotation invariance: many existing methods classify point cloud data effectively, but their accuracy degrades when the data is subjected to unseen rotations, because such methods are typically translation- and permutation-invariant but not rotation-invariant. In this work, we propose using local geometric features as the input to the neural network to achieve rotation invariance. We first apply farthest point sampling to select representative points from the point cloud, then use k-nearest neighbors to determine the local neighborhood of each sampled point, and finally compute local geometric features over these neighborhoods and feed them to the network. Because these features depend only on the relative geometry of each neighborhood rather than on absolute coordinates, the model's predictions are unaffected by rotations of the input. On the ModelNet40 and ShapeNet datasets, we demonstrate that learning from local features enables a 3D classification model to maintain high accuracy under arbitrary rotations. | en_US
dc.language.iso | en | en_US
dc.publisher | IIT, Roorkee | en_US
dc.title | ROTATION INVARIANT CONVOLUTION: 3D POINT CLOUD CLASSIFICATION AND PART SEGMENTATION | en_US
dc.type | Dissertations | en_US
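
The abstract describes a three-step pipeline: farthest point sampling to pick representative points, k-nearest neighbors to group each point's neighborhood, and rotation-invariant local features fed to the network. The record does not include code, so the following is only a minimal NumPy sketch of such a pipeline; the function names and the specific distance/angle features are assumptions for illustration, not the author's implementation.

    import numpy as np

    def farthest_point_sampling(points, n_samples):
        # Greedy FPS: start from point 0, then repeatedly pick the point
        # farthest from everything selected so far.
        selected = np.zeros(n_samples, dtype=int)
        min_dist = np.full(points.shape[0], np.inf)
        for i in range(1, n_samples):
            d = np.linalg.norm(points - points[selected[i - 1]], axis=1)
            min_dist = np.minimum(min_dist, d)
            selected[i] = int(np.argmax(min_dist))
        return selected

    def knn(points, query, k):
        # Indices of the k nearest neighbors of `query`, skipping the query
        # itself (assumes `query` is one of `points`).
        d = np.linalg.norm(points - query, axis=1)
        return np.argsort(d)[1:k + 1]

    def local_features(points, center, neighbor_idx):
        # Per-neighbor distance to the center and angle to the neighborhood
        # centroid. Both depend only on relative geometry, so they are
        # unchanged when the whole cloud is rotated (illustrative choice).
        rel = points[neighbor_idx] - center
        dist = np.linalg.norm(rel, axis=1)
        centroid = rel.mean(axis=0)
        cos_angle = rel @ centroid / (dist * np.linalg.norm(centroid) + 1e-9)
        return np.stack([dist, cos_angle], axis=1)  # shape (k, 2)

    # Toy run: 1024 random points, 64 sampled centers, 16-point neighborhoods.
    pts = np.random.randn(1024, 3).astype(np.float32)
    for c in farthest_point_sampling(pts, 64):
        feats = local_features(pts, pts[c], knn(pts, pts[c], 16))
    assert feats.shape == (16, 2)

In a full model, such per-neighborhood feature matrices would replace raw xyz coordinates as the input to a PointNet-style classifier; the exact feature set used in the thesis may differ.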
Appears in Collections: MASTERS' THESES (CSE)

Files in This Item:
File | Description | Size | Format
22535022_PRIYAL JAIN.pdf |  | 957.85 kB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.