MIOModDelay is a dynamic, interpolating delay that uses an audio-rate control signal to set the delay time continuously during processing. The input control signal is clipped to the range 0.0…1.0 and then multiplied by the maximum delay specified by Delay(samp). The resulting fractional delay is applied to the input signal to form the output signal.

Since the delay is variable on a sample-by-sample basis and supports fractional delays, MIOModDelay can be used for modulation effects such as vibrato and chorus.
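
As a rough illustration of the behavior described above, the sketch below implements a control-driven fractional delay in C++. The class name ModDelaySketch, the single-sample process() interface, and the use of linear interpolation are assumptions made for the example only; they are not taken from the actual MIOModDelay implementation.

```cpp
#include <vector>
#include <cstddef>
#include <algorithm>

// Minimal sketch of a modulated fractional delay, assuming linear interpolation.
class ModDelaySketch {
public:
    // maxDelaySamples corresponds to the Delay(samp) parameter.
    explicit ModDelaySketch(std::size_t maxDelaySamples)
        : maxDelay(maxDelaySamples),
          buffer(maxDelaySamples + 2, 0.0f),
          writePos(0) {}

    // Process one sample: 'control' is the audio-rate control signal.
    float process(float input, float control) {
        buffer[writePos] = input;

        // Clip the control signal to 0.0...1.0 and scale by the maximum delay.
        float c = std::clamp(control, 0.0f, 1.0f);
        float delay = c * static_cast<float>(maxDelay);   // fractional delay in samples

        // Locate the (fractional) read position behind the write position.
        float readPos = static_cast<float>(writePos) - delay;
        if (readPos < 0.0f)
            readPos += static_cast<float>(buffer.size());

        std::size_t i0 = static_cast<std::size_t>(readPos);
        float frac = readPos - static_cast<float>(i0);
        std::size_t i1 = (i0 + 1) % buffer.size();

        // Linear interpolation between the two neighboring samples.
        float out = buffer[i0] + frac * (buffer[i1] - buffer[i0]);

        writePos = (writePos + 1) % buffer.size();
        return out;
    }

private:
    std::size_t maxDelay;
    std::vector<float> buffer;
    std::size_t writePos;
};
```

Driving the control input with a slow, low-amplitude oscillator around a small offset would produce the kind of vibrato or chorus behavior mentioned above.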