Pixel streaming audio capture

Pixel Streaming is a plugin for Unreal Engine that streams rendered frames and synchronizes input/output via WebRTC. The application runs and renders on the server side, while the client (typically a web browser) displays the stream and handles user interaction. For more details on Pixel Streaming and its setup, refer to the Pixel Streaming Documentation.

Audio Capture with Pixel Streaming

Pixel Streaming supports audio capture, but it works differently from the plugin's default Capturable Sound Wave implementation. With Pixel Streaming, a Capturable Sound Wave captures audio from input devices on the server side. To capture audio from the client side (the browser), use the Synth-based Sound Wave instead: it extends Capturable Sound Wave and is designed to capture audio from Synth Component sources, including the Pixel Streaming Audio Component.

Implementation steps

Add a Pixel Streaming Audio component to an actor that will hold audio-related components.

Add Pixel Streaming Audio Component


Your actor should now include this component:

Pixel Streaming Audio Component


Next, select this component, go to the Details panel, and set Auto Activate to False.

Auto Activate to False
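If you prefer to set this up in C++ rather than in the editor, the two steps above can be sketched as follows. This is a minimal sketch: it assumes the Pixel Streaming plugin is enabled and its module is added to your Build.cs dependencies, and the actor/property names are illustrative.

```cpp
// AudioCaptureActor.h — illustrative actor holding the audio-related components.
// Assumes the PixelStreaming plugin module is enabled and referenced in Build.cs.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "PixelStreamingAudioComponent.h"
#include "AudioCaptureActor.generated.h"

UCLASS()
class AAudioCaptureActor : public AActor
{
    GENERATED_BODY()

public:
    AAudioCaptureActor()
    {
        // Equivalent of adding the Pixel Streaming Audio component in the editor
        PixelStreamingAudio = CreateDefaultSubobject<UPixelStreamingAudioComponent>(TEXT("PixelStreamingAudio"));

        // Equivalent of setting Auto Activate to False in the Details panel
        PixelStreamingAudio->SetAutoActivate(false);
    }

    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Audio")
    UPixelStreamingAudioComponent* PixelStreamingAudio;
};
```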


In Blueprints (or C++), create a Synth-based Sound Wave and pass the previously added Pixel Streaming Audio component as a parameter. This allows the sound wave to internally gather audio data from the audio component.

Create Synth Based Sound Wave node

Use the StartCapture function as you would with Capturable Sound Wave.
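In C++, the creation and capture steps might look like the sketch below. The class and factory function names (`USynthBasedSoundWave`, `CreateSynthBasedSoundWave`) are assumed from the Blueprint node naming above, and the `StartCapture` signature is assumed to mirror Capturable Sound Wave's usage; check the plugin headers for the exact declarations.

```cpp
// Sketch: create a Synth-based Sound Wave from the Pixel Streaming Audio
// component, then start capturing. Names are assumed from the Blueprint
// nodes shown above — verify them against the plugin's headers.
void AAudioCaptureActor::BeginCapture()
{
    // Pass the Pixel Streaming Audio component so the sound wave can
    // gather audio data from it internally
    USynthBasedSoundWave* SoundWave =
        USynthBasedSoundWave::CreateSynthBasedSoundWave(PixelStreamingAudio);

    if (SoundWave)
    {
        // Same usage pattern as Capturable Sound Wave
        // (exact parameters may differ; consult the plugin API)
        SoundWave->StartCapture();
    }
}
```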

Finally, your implementation might look like this:

An example of using a synth based sound wave

Note

Synth-based Sound Wave supports capturing audio from Synth Component sources, including the Pixel Streaming Audio Component. Although it should in theory work with any class derived from Synth Component, it has only been tested with the Pixel Streaming Audio Component.