The tails of gravitational waves result from the non-linear interaction between the usual quadrupole radiation generated by an isolated system (with total mass-energy M) and the static monopole field associated with M. Their contribution to the field at large distances from the system includes a particular modulation of the phase in the Fourier domain, having M as a factor and carrying a characteristic frequency dependence. In this paper we investigate the level at which this tail effect could be detected by future laser-interferometric detectors. We consider a family of matched filters for inspiralling compact-binary signals, allowing for this effect and parametrized by a set of independent 'test' parameters including M. Detecting the effect is equivalent to attributing, by optimal signal processing, a non-zero value to M. The error bar in the measurement of M is computed by analytical and numerical methods as a function of the optimal signal-to-noise ratio (SNR). We find the minimal values of the SNR required for detection of the tail effect, both for neutron-star binaries (where they depend on the type of noise in the detector and on our a priori knowledge of the binary) and for a black-hole binary. It is argued that some of these values, at least for black-hole binaries, could be achieved in future generations of detectors, following the currently planned VIRGO and LIGO detectors.
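The error-bar computation mentioned above rests on the standard Fisher-information treatment of matched-filter parameter estimation; the following is a sketch of that general background formalism (with a generic parameter vector θ containing M among its components), not a reproduction of the paper's specific calculation:

```latex
% Noise-weighted inner product between waveforms a and b,
% with one-sided detector noise spectral density S_n(f):
(a \,|\, b) \;=\; 2 \int_0^{\infty} \frac{\tilde a(f)\,\tilde b^{*}(f)
      + \tilde a^{*}(f)\,\tilde b(f)}{S_n(f)}\, \mathrm{d}f .

% Optimal signal-to-noise ratio of a template h(\theta):
\rho^2 \;=\; (h \,|\, h).

% Fisher information matrix over the test parameters \theta^i:
\Gamma_{ij} \;=\; \Bigl( \frac{\partial h}{\partial \theta^i}
      \,\Big|\, \frac{\partial h}{\partial \theta^j} \Bigr).

% At large SNR the 1\sigma error on \theta^i (e.g. on M) is
\sigma_{\theta^i} \;=\; \sqrt{\,(\Gamma^{-1})^{ii}\,},
\qquad \sigma_{\theta^i} \propto \frac{1}{\rho},
```

Since the error bars scale as 1/ρ, a detection criterion of the form σ_M < M translates directly into a minimal SNR threshold, which is how statements of the kind quoted in the abstract arise.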