Extending the reach of gravitational-wave detectors with machine learning

Abstract

Since the sensitivity upgrade in 2015, the Laser Interferometer Gravitational-wave Observatory (LIGO) has detected a number of black-hole and neutron-star mergers. However, because even strong sources of gravitational waves (GWs) produce a typical mirror displacement of only about 10⁻¹⁸ m (roughly 1,000 times smaller than the diameter of a proton), techniques to reduce and filter instrumental and environmental noise have become increasingly important for detecting weaker and more distant sources. Our group applied the Long Short-Term Memory (LSTM) neural network, a class of machine learning algorithms (MLAs) well suited to sequential data, to noise regression analysis at LIGO. Given time series from the GW channel and from auxiliary witness channels, the network predicts, and subsequently subtracts, the background noise over a frequency band. Moreover, unlike the Wiener filter currently used for noise subtraction, nonparametric MLAs such as LSTMs can, once trained, learn both linear and nonlinear noise-coupling mechanisms. For the linear contribution, the network matches the subtraction performance of the Wiener filter; for the nonlinear contribution, it performs well on generated mock data. Our framework is generic enough to apply to a wide variety of time-series regression problems across many areas of science.
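The witness-based subtraction described above can be sketched with a toy linear (Wiener-like) example on mock data. Everything here is an illustrative assumption, not the group's actual pipeline: the mock witness signal, the FIR coupling coefficients, and the tap count are invented for demonstration. An LSTM would replace the least-squares fit below when the coupling is nonlinear.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096

# Hypothetical witness channel (e.g., a seismometer or microphone readout)
witness = rng.standard_normal(n)

# Mock GW channel: an astrophysical "signal" plus noise linearly coupled
# from the witness channel through an assumed FIR response h
signal = np.sin(2 * np.pi * 50 * np.arange(n) / n)
h = np.array([0.5, -0.3, 0.2])  # assumed coupling coefficients (illustrative)
coupled = np.convolve(witness, h, mode="full")[:n]
target = signal + coupled

# Wiener-style subtraction: least-squares FIR fit of the target
# on lagged copies of the witness channel, then subtract the prediction
taps = 5
X = np.stack([np.roll(witness, k) for k in range(taps)], axis=1)
X[:taps] = 0.0  # zero out samples wrapped around by np.roll
coef, *_ = np.linalg.lstsq(X, target, rcond=None)
residual = target - X @ coef  # residual should approximate the clean signal
```

The fitted coefficients recover the assumed coupling `h`, so the residual is close to the clean sinusoid; an LSTM generalizes this step by learning the witness-to-noise map without assuming it is linear.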

Date
Location
Seattle, WA