Examples

This page provides complete, ready-to-use examples that demonstrate common workflows with the Runtime Local LLM plugin. Each example includes both Blueprint and C++ implementations.

Make sure you've read How to use the plugin first for an overview of the API.

Simple Chat

Create an LLM instance, load a model by name, send a message, and display the response token by token.

Download, Load, and Chat

Download a model from a URL at runtime, load it, and start a conversation. The download is skipped if the model already exists on disk.

Pre-download Models

Download models to disk without loading them, which is useful for a settings screen or loading menu where users pick models to cache ahead of time.

NPC Dialogue

A basic NPC dialogue system where the player types a message and the NPC responds. The conversation context persists between messages so the NPC remembers previous exchanges.