Pixel Streaming audio capture
Pixel Streaming is a plugin for Unreal Engine that streams rendered frames and synchronizes input/output via WebRTC. The application runs and renders on the server side, while the client side (typically a browser) handles display and user interaction. For more details on Pixel Streaming and its setup, refer to the Pixel Streaming documentation.
Audio Capture with Pixel Streaming
Pixel Streaming supports audio capture, but it differs from the plugin's default Capturable Sound Wave implementation. A Capturable Sound Wave captures audio from input devices on the server side. Capturing audio from the client side (the browser) instead requires the Synth-based Sound Wave, which extends Capturable Sound Wave and is designed to capture audio from Synth Component sources, including the Pixel Streaming Audio Component.
Implementation steps
Add a Pixel Streaming Audio component to an actor that will hold audio-related components.
Your actor should now include this component.
Next, select this component, go to the Details panel, and set Auto Activate to False.
In your Blueprints (or C++), create a Synth-based Sound Wave and pass the previously added Pixel Streaming Audio component as a parameter. This will make it possible for the sound wave to internally gather audio data from the audio component.
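In C++, this creation step might be sketched as below. The class and factory names used here (`USynthBasedSoundWave`, `CreateSynthBasedSoundWave`, `PixelStreamingAudioComponent`) are illustrative assumptions, not verified API; check the plugin's headers for the exact signatures.

```cpp
// Sketch only — class and function names are assumptions.
// Assumes an actor that already owns the Pixel Streaming Audio
// component added in the previous step, with Auto Activate disabled.

void AMyAudioActor::CreateSoundWave()
{
	// Pass the Pixel Streaming Audio component so the sound wave can
	// internally gather audio data from it
	SoundWave = USynthBasedSoundWave::CreateSynthBasedSoundWave(PixelStreamingAudioComponent);
}
```

Keeping the sound wave in a `UPROPERTY()` member (as `SoundWave` above) prevents it from being garbage-collected while capture is in progress.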
Use the StartCapture function as you would with a Capturable Sound Wave.
Finally, your implementation might look like this:
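As a rough C++ equivalent of the steps above, the whole flow could look like the following sketch. All names apart from StartCapture are assumptions about the plugin's API and may differ from the actual headers; the StartCapture parameters may also vary between the Capturable and Synth-based variants.

```cpp
// Illustrative sketch — names such as USynthBasedSoundWave and
// CreateSynthBasedSoundWave are assumptions; consult the plugin's
// headers for the real API.

void AMyAudioActor::BeginClientAudioCapture()
{
	// 1. The Pixel Streaming Audio component was added in the editor
	//    with Auto Activate set to False (see the steps above)

	// 2. Create the Synth-based Sound Wave, passing the audio
	//    component so it can gather audio data from it
	SoundWave = USynthBasedSoundWave::CreateSynthBasedSoundWave(PixelStreamingAudioComponent);

	// 3. Start capturing, as with a Capturable Sound Wave
	SoundWave->StartCapture();
}
```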