uRTcmix: the LOCALIZE() Instrument
rtcmix.org documentation for LOCALIZE()
LOCALIZE() video tutorial
initial setup
The LOCALIZE() RTcmix instrument uses a sound 'ray-tracing' approach combined with a simple HRTF model to position a sound source in virtual space relative to a listener. In Unity, the 'sound listener' for a given scene is usually the Main Camera. We will use that as our destination for sound emanating from a source.
We need to add the getMyTransform.cs C# script
to the Main Camera. This will allow us to get the proper set of transform
coordinates to use with the LOCALIZE() instrument. Add this C# script
to your Assets and drag it onto the Main Camera.
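The original page linked to the script itself. As a rough illustration only (the real getMyTransform.cs ships with uRTcmix, so the field names here are assumptions), a script doing this job might simply expose the camera's transform each frame:

```csharp
using UnityEngine;

// Illustrative sketch of a "getMyTransform.cs"-style script attached to the
// Main Camera. It caches the listener position so sound-source objects can
// query it and compute their position relative to the listener.
public class getMyTransform : MonoBehaviour
{
    // World-space position of the listener (the Main Camera).
    public Vector3 myPosition;

    void Update()
    {
        // Record the camera position every frame.
        myPosition = transform.position;
    }
}
```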
localize a sound source
Instead of having the LOCALIZE() instrument operate on the Main Camera object (keeping all the incoming audio streams separate for individual localization would be very difficult), we will instead apply the localization delays and amplitude calculations at the sound source. To do this, we will need to find out from the Main Camera what the sound source position is relative to the camera so that the correct parameters can be used. This is what the "getMyTransform.cs" script does.
We declare a class variable to reference the getMyTransform component
we have added to the Main Camera:
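The original code listing was lost in conversion. A sketch of what such a declaration might look like (member names here are illustrative assumptions, not the documented ones):

```csharp
using UnityEngine;

public class LocalizedSource : MonoBehaviour
{
    // Reference to the getMyTransform component we added to the Main Camera.
    private getMyTransform camTransform;

    void Start()
    {
        // Cache the component so we can read the listener position each frame.
        camTransform = Camera.main.GetComponent<getMyTransform>();
    }
}
```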
Of course, this setup of the LOCALIZE() instrument assumes that the
sound source has been configured to pass through the LOCALIZE()
instrument. This is done by using the bus_config()
scorefile command in a chain of sound-producing game objects.
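For example, a scorefile routing that chains a sound-producing instrument into LOCALIZE() might look like this (instrument choice and bus numbers are illustrative, not taken from the original page):

```
// route a synthesis instrument's output onto aux bus 0
bus_config("WAVETABLE", "aux 0 out")
// LOCALIZE() reads from aux bus 0 and writes to the stereo output
bus_config("LOCALIZE", "aux 0 in", "out 0-1")
```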
One warning: if a sound source is moving quickly, a rapid clicking
or 'zipper' noise might occur. This is because the source position is
being updated at the slow frame rate, causing visually imperceptible
'jumps' in the position for each frame rendered. However, these
'jumps' can introduce
discontinuities in the audio signal, resulting in the noise. This
can be minimized or eliminated by using the "smooth"
filter type of the RTcmix
makefilter() scorefile command. makefilter() with a "smooth"
filter type interpolates values coming
through a PField, acting as a low-pass filter to smooth out the
discontinuities. An example of how this is set up:
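The example listing did not survive conversion. A hedged sketch of the scorefile setup (the makeconnection("inlet", ...) PField source and the lag value are illustrative assumptions about how the dynamic value arrives from Unity):

```
// dynamic source-x coordinate coming in from Unity
srcx = makeconnection("inlet", 1, 0.0)
// "smooth" interpolates the incoming values; lag ranges 0-100,
// higher values give smoother (but more sluggish) tracking
smoothx = makefilter(srcx, "smooth", 70)
```

The smoothed PField (smoothx) is then passed to LOCALIZE() in place of the raw one.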
LOCALIZE() instrument parameters
Parameters with an asterisk (*) are dynamic PFields:
p0 = output skip time
p1 = input skip time
p2 = duration
*p3 = overall amp
*p4 = source x
*p5 = source y
*p6 = source z
*p7 = dest x
*p8 = dest y
*p9 = dest z
*p10 = headwidth (units)
p11 = feet/unit scaler
p12 = input channel
p13 = behind head filter on/off (simple HRTF)
p14 = amp/distance calculation flag
   0: no amp/distance
   1: linear amp/distance
   2: inverse square amp/distance (physically 'correct')
p15 = minimum amp/distance multiplier (the smallest amp calculated)
p16 = maximum distance (for linear amp/distance scaling, p14 == 1)

Often a large 'overall amp' (p3) value will need to be used, especially if the inverse-square distance calculation (p14 = 2) is used. This is essentially how sound works in the Real World, but we have no reflection calculations from nearby surfaces, so the amplitude falls off very rapidly. It's as if everything were happening in an anechoic chamber. Altering the overall amp multiplier can compensate for this. Use whatever values are necessary to achieve the results you want.
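Putting this together, a sketch of a LOCALIZE() call (all values are illustrative, and the source-coordinate PFields are assumed to come from a makeconnection()/makefilter() setup as above):

```
// out-skip in-skip dur  amp   source x/y/z        dest x/y/z  headwidth  ft/unit  chan  HRTF  dist-flag
LOCALIZE(0,  0,     7.0, 30.0, srcx, srcy, srcz,   0, 0, 0,    0.2,       3.28,    0,    1,    2)
```

Here the large amp (30.0) compensates for the rapid inverse-square falloff selected by the final parameter.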