Representing Source Movement in Sequences of Telescopic Images Based on Contrastive Learning for Asteroid Detection
Issued Date
2023-01-01
Resource Type
Scopus ID
2-s2.0-85180154806
Journal Title
27th International Computer Science and Engineering Conference 2023, ICSEC 2023
Start Page
9
End Page
14
Rights Holder(s)
SCOPUS
Bibliographic Citation
27th International Computer Science and Engineering Conference 2023, ICSEC 2023 (2023), 9-14
Suggested Citation
Kongsathitporn N., Supratak A., Awiphan S., Ackley K., Dyer M.J., Lyman J., Jimenez-Ibarra F., Steeghs D., Galloway D.K., Dhillon V., O'Brien P., Ramsay G., Kotak R., Breton R.P., Nuttall L.K., Pallé E., Pollacco D., Killestein T., Kumar A., Taka N., Rattanasai R., Noysena K. Representing Source Movement in Sequences of Telescopic Images Based on Contrastive Learning for Asteroid Detection. 27th International Computer Science and Engineering Conference 2023, ICSEC 2023 (2023), 9-14. doi:10.1109/ICSEC59635.2023.10329669 Retrieved from: https://repository.li.mahidol.ac.th/handle/20.500.14594/96339
Title
Representing Source Movement in Sequences of Telescopic Images Based on Contrastive Learning for Asteroid Detection
Author's Affiliation
National Astronomical Research Institute of Thailand
University of Leicester
University of Portsmouth
University of Warwick
Monash University
Armagh Observatory
Mahidol University
Instituto Astrofisico de Canarias
Turun yliopisto
The University of Manchester
The University of Sheffield
Chiang Mai University
Corresponding Author(s)
Other Contributor(s)
Abstract
The study of asteroids, small rocky objects that move across the sky, not only makes it feasible to prevent hazardous collisions but also provides a better understanding of the solar system in its early stage. However, existing software for asteroid detection requires manual parameter setup, a sensitive task that demands an experienced operator. Moreover, a sequence of images records only the brightness of each frame, while the key feature for asteroid detection is the source's movement. In this research, we propose a contrastive deep learning model that learns the motion representation of asteroids in a sequence of images. The representation is used to classify a sequence of images by comparing distances computed with Euclidean distance and cosine similarity. In addition, simple classifiers, including k-nearest neighbors (KNN) and logistic regression (LR), are implemented to evaluate their ability to classify the motion representation. The representation generation model is trained on sky images from the Gravitational-wave Optical Transient Observer (GOTO) survey. For the motion representation, the best classifier achieves an F1 score of 88.32% on the validation set and 86.60% on the test set, while the k-nearest neighbors model underperforms the best model by 3.62% F1 on the test set. As a result, our approach replaces the hand-engineered parameter setup and delivers more promising performance.
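The distance-based classification described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the embedding arrays, their dimensions, and the labels below are assumed placeholders standing in for the motion representations produced by the contrastive model.

```python
import numpy as np

# Hypothetical motion representations: each row stands in for the embedding
# of one image sequence from the contrastive model (names and dimensions
# are assumptions, not values from the paper).
rng = np.random.default_rng(0)
train_emb = rng.normal(size=(100, 32))       # reference embeddings
train_labels = rng.integers(0, 2, size=100)  # 1 = asteroid, 0 = non-asteroid
query = rng.normal(size=(32,))               # embedding to classify

# Euclidean distance from the query to every reference embedding.
euclidean = np.linalg.norm(train_emb - query, axis=1)

# Cosine similarity: dot product of L2-normalised vectors.
cos_sim = train_emb @ query / (
    np.linalg.norm(train_emb, axis=1) * np.linalg.norm(query)
)

# A minimal k-nearest-neighbours vote over the representation space,
# analogous to the KNN classifier evaluated in the paper.
k = 5
nearest = np.argsort(euclidean)[:k]
pred = int(np.bincount(train_labels[nearest]).argmax())
```

Either distance can drive the vote; with cosine similarity one would select the `k` largest similarities instead of the `k` smallest distances.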