Add optional realtime rendering support #1
PART 1 - The AudioCallback static class
First of all, you need a generic AudioCallback class, a singleton or an extern pointer that is initialized and released in the main class of the project:
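A minimal sketch of that "global callback" pattern (class and variable names here are illustrative, not from the original comment), assuming a standard JUCE application class:

```cpp
#include <JuceHeader.h>
#include "AudioCallback.h"   // hypothetical header for the class described below

// One global pointer, defined in a single .cpp file:
AudioCallback* gAudioCallback = nullptr;

// In the JUCEApplication subclass:
void MainApplication::initialise (const juce::String& /*commandLine*/)
{
    gAudioCallback = new AudioCallback();   // sets up the audio device (see below)
}

void MainApplication::shutdown()
{
    delete gAudioCallback;                  // releases the audio device
    gAudioCallback = nullptr;
}
```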
The AudioCallback is derived from JUCE's base class AudioIODeviceCallback. There are three abstract methods to override:
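A sketch of the class declaration with those three overrides. Note this uses the older JUCE signature of `audioDeviceIOCallback`; recent JUCE versions replaced it with `audioDeviceIOCallbackWithContext`:

```cpp
#include <JuceHeader.h>

class AudioCallback : public juce::AudioIODeviceCallback
{
public:
    // Called on the audio thread for every block of samples:
    void audioDeviceIOCallback (const float** inputChannelData, int numInputChannels,
                                float** outputChannelData, int numOutputChannels,
                                int numSamples) override;

    // Called before streaming starts; sample rate and buffer size are known here:
    void audioDeviceAboutToStart (juce::AudioIODevice* device) override;

    // Called when the device stops; release per-stream resources here:
    void audioDeviceStopped() override;
};
```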
The class needs to embed the JUCE AudioDeviceManager class, which is the root of the audio system initialization. Just call its initialise method; it takes six parameters:
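A sketch of the call, with the six parameters of `AudioDeviceManager::initialise` as they appear in the JUCE API (check the documentation of your JUCE version, as defaults have shifted over time):

```cpp
// deviceManager is a juce::AudioDeviceManager member of AudioCallback.
juce::String error = deviceManager.initialise (
    2,         // numInputChannelsNeeded
    2,         // numOutputChannelsNeeded
    nullptr,   // savedState (XmlElement*), nullptr for defaults
    true,      // selectDefaultDeviceOnFailure
    {},        // preferredDefaultDeviceName (empty = no preference)
    nullptr);  // preferredSetupOptions (AudioDeviceSetup*, see below)

if (error.isNotEmpty())
    DBG ("Audio device error: " + error);   // non-empty string means failure
```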
For example, if you want to select the buffer size or the sample rate of the audio callback, you can fill in an AudioDeviceSetup and pass it to the initialise call.
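For instance (the requested values below are arbitrary examples; the device may still pick the closest supported ones):

```cpp
juce::AudioDeviceManager::AudioDeviceSetup setup;
setup.sampleRate = 44100.0;   // requested sample rate in Hz
setup.bufferSize = 256;       // requested buffer size in samples

// Passed as the last parameter of initialise:
deviceManager.initialise (2, 2, nullptr, true, {}, &setup);
```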
Now you need to add your callback to the AudioDeviceManager: if initialise returns no error, just add one line in your constructor, right after the initialization. If the selected input is a live source like a microphone, the callback will then basically pass the microphone through to the output. You'll need to have a source player in the main callback:
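A sketch of both pieces, assuming an `AudioSourcePlayer` member named `sourcePlayer` (an illustrative name). The device callback simply forwards to the player, which is itself an AudioIODeviceCallback; remember to forward `audioDeviceAboutToStart` and `audioDeviceStopped` to it as well:

```cpp
// In the AudioCallback constructor, after a successful initialise():
deviceManager.addAudioCallback (this);

// The main callback delegates to the source player:
void AudioCallback::audioDeviceIOCallback (const float** inputChannelData, int numInputChannels,
                                           float** outputChannelData, int numOutputChannels,
                                           int numSamples)
{
    sourcePlayer.audioDeviceIOCallback (inputChannelData, numInputChannels,
                                        outputChannelData, numOutputChannels,
                                        numSamples);
}
```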
So you need to add an AudioSourcePlayer member.

PART 2 - Load and play an audio file stream
In order to play an audio file and modify the signal in real time, your AudioCallback class needs two other JUCE audio classes:
The first one, the AudioTransportSource, gives you 'transport' control (start/stop, ...) over the audio file you want to stream. The second, the MixerAudioSource, lets you connect the transport to the source player. In the AudioCallback constructor, you'll need to wire the different classes like this:
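A sketch of the wiring, assuming members `transportSource`, `mixerSource` and `sourcePlayer` (illustrative names):

```cpp
AudioCallback::AudioCallback()
{
    // transport -> mixer: the mixer does not take ownership (second arg = false)
    mixerSource.addInputSource (&transportSource, false);

    // mixer -> player: the player pulls its audio from the mixer
    sourcePlayer.setSource (&mixerSource);
}
```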
By symmetry, the destructor of the AudioCallback will be:
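That is, undoing the connections in reverse order (a sketch, same assumed member names):

```cpp
AudioCallback::~AudioCallback()
{
    sourcePlayer.setSource (nullptr);         // disconnect the mixer from the player
    mixerSource.removeAllInputs();            // disconnect the transport from the mixer
    transportSource.setSource (nullptr);      // release any loaded file source
    deviceManager.removeAudioCallback (this); // stop receiving device callbacks
}
```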
Now your class is able to load an audio file and play it through the transport class, so add a new public method to your AudioCallback, let's say a loadFile method.
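A possible sketch of such a method (the name `loadFile` is hypothetical). It assumes an `AudioFormatManager` member on which `registerBasicFormats()` was called, and a `std::unique_ptr<juce::AudioFormatReaderSource>` member to keep the file source alive:

```cpp
void AudioCallback::loadFile (const juce::File& file)
{
    // formatManager.registerBasicFormats() must have been called once (e.g. in the constructor)
    if (auto* reader = formatManager.createReaderFor (file))
    {
        // The reader source takes ownership of the reader (second arg = true)
        readerSource.reset (new juce::AudioFormatReaderSource (reader, true));

        // Connect it to the transport, correcting for the file's sample rate
        transportSource.setSource (readerSource.get(), 0, nullptr, reader->sampleRate);
    }
}
```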
Your application GUI needs to have control over the audio transport; because your AudioCallback is a static pointer (singleton or extern pointer), add a public accessor to your transport class.
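For example (assuming the callback is reachable through a global pointer, called `gAudioCallback` here for illustration):

```cpp
// Public accessor on AudioCallback:
juce::AudioTransportSource& getTransport() { return transportSource; }

// GUI code (e.g. button handlers) can then drive the transport:
gAudioCallback->getTransport().setPosition (0.0); // rewind to the start
gAudioCallback->getTransport().start();           // play
gAudioCallback->getTransport().stop();            // stop
```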
PART 3 - Process/modify the audio stream in real time
Now your AudioCallback class is able to play different sources, but does nothing more. If you want to alter the signal with a process (in our case, a WDF circuit), you need to create a new class that inherits from JUCE's AudioProcessor. The AudioProcessor is basically the JUCE base class for all the plugins generated by its wrappers (VST, ...). The AudioProcessor has a lot of methods and accessors (I see that a concept of Bus was recently added to the JUCE documentation for the AudioProcessor class), but basically a derived class only needs to override a few methods:
Of course, the three main methods to initialize and produce the audio stream are:
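A sketch of a processor subclass showing those three methods (`WDFProcessor` and `processSample` are illustrative names; the other pure virtuals of AudioProcessor, such as getName or getStateInformation, are omitted here for brevity):

```cpp
class WDFProcessor : public juce::AudioProcessor
{
public:
    void prepareToPlay (double sampleRate, int samplesPerBlock) override
    {
        // Allocate buffers / initialize the WDF circuit for this sample rate
    }

    void processBlock (juce::AudioBuffer<float>& buffer, juce::MidiBuffer&) override
    {
        // Called on the audio thread for each block: modify the samples in place
        for (int ch = 0; ch < buffer.getNumChannels(); ++ch)
        {
            auto* samples = buffer.getWritePointer (ch);
            for (int i = 0; i < buffer.getNumSamples(); ++i)
                samples[i] = processSample (samples[i]);   // hypothetical per-sample WDF process
        }
    }

    void releaseResources() override
    {
        // Free anything allocated in prepareToPlay
    }
};
```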
You can also add a GUI to your processor by creating a new 'Editor' class that inherits from JUCE's AudioProcessorEditor class, and returning a pointer to it from the AudioProcessor's createEditor method.
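A minimal sketch of that pairing (editor contents are placeholders):

```cpp
class WDFProcessorEditor : public juce::AudioProcessorEditor
{
public:
    explicit WDFProcessorEditor (WDFProcessor& p) : juce::AudioProcessorEditor (p)
    {
        setSize (400, 300);   // placeholder editor size
    }

    void paint (juce::Graphics& g) override { g.fillAll (juce::Colours::black); }
};

// In WDFProcessor:
bool WDFProcessor::hasEditor() const                     { return true; }
juce::AudioProcessorEditor* WDFProcessor::createEditor() { return new WDFProcessorEditor (*this); }
```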
Now, how do you connect an AudioProcessor to your AudioCallback stream? It's very simple: you just need to add a new AudioProcessorPlayer member to your callback class. Your callback method becomes:
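One possible arrangement (assumed, since the original snippet is not shown here): render the file/mixer sources first, then run the processor player over the result in place. The AudioProcessorPlayer is itself an AudioIODeviceCallback, so it also needs `audioDeviceAboutToStart` / `audioDeviceStopped` forwarded to it:

```cpp
void AudioCallback::audioDeviceIOCallback (const float** inputChannelData, int numInputChannels,
                                           float** outputChannelData, int numOutputChannels,
                                           int numSamples)
{
    // 1) Render the source chain (microphone pass-through / file transport) into the outputs
    sourcePlayer.audioDeviceIOCallback (inputChannelData, numInputChannels,
                                        outputChannelData, numOutputChannels, numSamples);

    // 2) Run the processor over the rendered audio, in place
    processorPlayer.audioDeviceIOCallback (const_cast<const float**> (outputChannelData), numOutputChannels,
                                           outputChannelData, numOutputChannels, numSamples);
}
```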
In the AudioCallback constructor (or wherever you want to set the current processor, depending on your application's strategy), just add two lines to connect your AudioProcessor class (e.g. WDFProcessor) to the AudioProcessorPlayer:
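For example (member names are illustrative):

```cpp
// In the AudioCallback constructor:
wdfProcessor = new WDFProcessor();            // create the processor instance (member pointer)
processorPlayer.setProcessor (wdfProcessor);  // route the audio stream through it
```

Call `processorPlayer.setProcessor (nullptr)` before deleting the processor, so the audio thread never touches a dangling pointer.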
Now your audio stream (depending on the source: microphone or audio file) will be processed by the code of your AudioProcessor. That's all folks. Hope that helps; questions or suggestions are welcome.
Add an option to switch to realtime WDF rendering. Maybe only enable this option after a prior offline benchmark test?