Document Type
Poster
Publication Date
Summer 2025
Abstract
LIGO, the Laser Interferometer Gravitational-Wave Observatory, uses advanced laser technology to detect gravitational waves, which are ripples in spacetime caused by massive cosmic events. Because of its extreme sensitivity, LIGO’s detectors are influenced by many sources of noise, from small temperature fluctuations to distant vibrations such as passing cars. Loud, transient noise events, known as glitches, are especially challenging for gravitational-wave detection. To address this, LIGO records extensive auxiliary sensor data, which can be used to characterize and even predict glitches in the main gravitational-wave channel. This auxiliary data is reduced into feature sets, such as the strength and frequency of noise signals within a time window. Over the summer, I focused on tuning the hyperparameters of machine learning algorithms (MLAs) designed to read these auxiliary features and predict glitches in the strain data. The performance of our two MLAs, GIANTS and TITAN, can be greatly improved by selecting appropriate hyperparameters. We hypothesized that Bayesian optimization techniques would improve their performance. Our main tuning efforts centered on building deep learning models with more hidden layers. In this poster, I present the process and results of hyperparameter tuning. Looking ahead, I plan to compare our MLAs to the current algorithm used by LIGO.
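The tuning loop the abstract describes can be sketched in miniature. The abstract's project used Bayesian optimization on GIANTS and TITAN; the sketch below substitutes a plain random search over two hypothetical hyperparameters (hidden-layer count and learning rate) with a toy scoring function, so it stays self-contained. The objective, parameter ranges, and names here are illustrative assumptions, not the poster's actual setup.

```python
import random

def toy_score(n_layers, log10_lr):
    """Toy stand-in for a validation metric (e.g. glitch-prediction
    accuracy). Purely illustrative: it peaks at 4 hidden layers and a
    learning rate of 1e-3, and is not the poster's real objective."""
    return -((n_layers - 4) ** 2) - (log10_lr + 3) ** 2

def random_search(n_trials=200, seed=0):
    """Sample hyperparameters at random, keep the best-scoring set.
    A Bayesian optimizer would instead fit a surrogate model to past
    trials and pick the next point to evaluate from it."""
    rng = random.Random(seed)
    best_score, best_params = None, None
    for _ in range(n_trials):
        params = {
            "n_layers": rng.randint(1, 10),    # hidden-layer count
            "log10_lr": rng.uniform(-5.0, -1.0),  # log10 learning rate
        }
        score = toy_score(**params)
        if best_score is None or score > best_score:
            best_score, best_params = score, params
    return best_score, best_params

best_score, best_params = random_search()
print(best_params, best_score)
```

Swapping the random sampler for a surrogate-guided proposer (as in Bayesian optimization) changes only how `params` is chosen each trial; the evaluate-and-keep-best loop is the same.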
Recommended Citation
Chintala, David; Wade, Leslie; and Wade, Madeline, "Hyperparameter Tuning LIGO Glitch MLAs" (2025). Kenyon Summer Science Scholars Program. Paper 784.
https://digital.kenyon.edu/summerscienceprogram/784
