Examples
This page provides complete, ready-to-use examples that demonstrate common workflows with the Runtime Local LLM plugin. Each example includes both a Blueprint and a C++ implementation.
Make sure to read How To Use The Plugin first for an overview of the API.
Simple Chat
Create an LLM instance, load a model by name, send a message, and display the response token by token.
- Blueprint
- C++
// Assuming "this" is an AActor with a UPROPERTY() URuntimeLocalLLM* LLM member
// Note: Callback functions (OnModelReady, OnToken, OnComplete, OnLLMError) must be marked as UFUNCTION() in your header file
void AMyActor::BeginPlay()
{
    Super::BeginPlay();

    LLM = URuntimeLocalLLM::CreateRuntimeLocalLLM();
    LLM->OnModelLoadedNative.AddUObject(this, &AMyActor::OnModelReady);
    LLM->OnTokenGeneratedNative.AddUObject(this, &AMyActor::OnToken);
    LLM->OnGenerationCompleteNative.AddUObject(this, &AMyActor::OnComplete);
    LLM->OnErrorNative.AddUObject(this, &AMyActor::OnLLMError);

    FLLMInferenceParams Params;
    Params.MaxTokens = 512;
    Params.Temperature = 0.7f;
    Params.SystemPrompt = TEXT("You are a helpful assistant.");

    TArray<FLLMModelMetadata> DownloadedModels = URuntimeLLMLibrary::GetAllDownloadedModelMetadata();

    // For demonstration purposes we simply pick the first available model; swap this with your own selection logic as needed
    if (DownloadedModels.Num() > 0)
    {
        const FLLMModelMetadata& Model = DownloadedModels[0]; // Select the first available model
        FString ModelFileName = URuntimeLLMLibrary::GetModelFileName(Model);
        LLM->LoadModelByName(FName(*ModelFileName), Params);
    }
}

void AMyActor::OnModelReady(const FLLMModelMetadata& Metadata)
{
    UE_LOG(LogTemp, Log, TEXT("Model ready: %s"), *Metadata.ModelDisplayName);
    LLM->SendMessage(TEXT("Hello! Tell me a fun fact."));
}

void AMyActor::OnToken(const FString& Token)
{
    // Handle each token as it's generated (e.g. append to a string, update the UI, stream to audio, etc.)
    UE_LOG(LogTemp, Log, TEXT("%s"), *Token);
}

void AMyActor::OnComplete(const FString& FullResponse, float Duration, int32 Tokens, float TPS)
{
    UE_LOG(LogTemp, Log, TEXT("Done: %d tokens in %.1fs (%.1f tok/s)"), Tokens, Duration, TPS);
}

void AMyActor::OnLLMError(ELLMErrorCode ErrorCode)
{
    FText Desc = URuntimeLLMUtils::GetLLMErrorDescription(ErrorCode);
    UE_LOG(LogTemp, Error, TEXT("LLM Error: %s"), *Desc.ToString());
}
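For reference, the example above assumes a header roughly like the one below. This is a minimal sketch, not part of the plugin: the include paths for the plugin types (URuntimeLocalLLM, FLLMModelMetadata, ELLMErrorCode) and the exact class layout depend on your project.

// MyActor.h - minimal header sketch matching the example above
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
// #include the plugin headers that declare URuntimeLocalLLM, FLLMModelMetadata,
// FLLMInferenceParams, and ELLMErrorCode (paths depend on your setup)
#include "MyActor.generated.h"

UCLASS()
class AMyActor : public AActor
{
    GENERATED_BODY()

public:
    virtual void BeginPlay() override;

protected:
    // Keeping the instance in a UPROPERTY() prevents it from being garbage collected
    UPROPERTY()
    URuntimeLocalLLM* LLM;

    // Callback functions bound to the plugin's delegates, marked as UFUNCTION() as noted above
    UFUNCTION()
    void OnModelReady(const FLLMModelMetadata& Metadata);

    UFUNCTION()
    void OnToken(const FString& Token);

    UFUNCTION()
    void OnComplete(const FString& FullResponse, float Duration, int32 Tokens, float TPS);

    UFUNCTION()
    void OnLLMError(ELLMErrorCode ErrorCode);
};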
Download, Load, and Chat
Download a model from a URL at runtime, load it, and start a conversation. If the model already exists on disk, the download is skipped.
- Blueprint
- C++
// Assuming "this" is an AActor with a UPROPERTY() URuntimeLocalLLM* LLM member
// Note: Callback functions (OnProgress, OnModelReady, OnToken, OnComplete, OnLLMError) must be marked as UFUNCTION() in your header file
void AMyActor::BeginPlay()
{
    Super::BeginPlay();

    LLM = URuntimeLocalLLM::CreateRuntimeLocalLLM();
    LLM->OnDownloadProgressNative.AddUObject(this, &AMyActor::OnProgress);
    LLM->OnModelLoadedNative.AddUObject(this, &AMyActor::OnModelReady);
    LLM->OnTokenGeneratedNative.AddUObject(this, &AMyActor::OnToken);
    LLM->OnGenerationCompleteNative.AddUObject(this, &AMyActor::OnComplete);
    LLM->OnErrorNative.AddUObject(this, &AMyActor::OnLLMError);

    FLLMInferenceParams Params;
    Params.MaxTokens = 256;

    LLM->LoadModelFromURLSimple(
        TEXT("https://huggingface.co/bartowski/Llama-3.2-1B-Instruct-GGUF/resolve/main/Llama-3.2-1B-Instruct-Q4_K_M.gguf"), Params);
}

void AMyActor::OnProgress(float Progress, int64 BytesReceived, int64 TotalBytes)
{
    FString ProgressText = URuntimeLLMUtils::FormatDownloadProgress(BytesReceived, TotalBytes);
    UE_LOG(LogTemp, Log, TEXT("Downloading: %s"), *ProgressText);
}

void AMyActor::OnModelReady(const FLLMModelMetadata& Metadata)
{
    LLM->SendMessage(TEXT("What is the capital of France?"));
}

void AMyActor::OnToken(const FString& Token)
{
    // Handle each token as it's generated (e.g. append to a string, update the UI, stream to audio, etc.)
    UE_LOG(LogTemp, Log, TEXT("%s"), *Token);
}

void AMyActor::OnComplete(const FString& FullResponse, float Duration, int32 Tokens, float TPS)
{
    UE_LOG(LogTemp, Log, TEXT("Done: %d tokens in %.1fs (%.1f tok/s)"), Tokens, Duration, TPS);
}

void AMyActor::OnLLMError(ELLMErrorCode ErrorCode)
{
    FText Desc = URuntimeLLMUtils::GetLLMErrorDescription(ErrorCode);
    UE_LOG(LogTemp, Error, TEXT("LLM Error: %s"), *Desc.ToString());
}
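If you want to drive a UI from the progress callback instead of logging, a minimal sketch could look like the following. DownloadBar is a hypothetical UPROPERTY() UProgressBar* member (for example bound from a HUD widget); it is not part of the plugin.

#include "Components/ProgressBar.h"

void AMyActor::OnProgress(float Progress, int64 BytesReceived, int64 TotalBytes)
{
    // Derive a 0-1 fraction from the byte counts reported by the delegate
    if (DownloadBar && TotalBytes > 0)
    {
        DownloadBar->SetPercent(static_cast<float>(BytesReceived) / static_cast<float>(TotalBytes));
    }
}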
Pre-downloading a Model
Download a model to disk without loading it. This is useful for settings screens or loading menus where users can choose ahead of time which models to cache.
- Blueprint
- C++
// Assuming "this" is an AActor with a UPROPERTY() URuntimeLocalLLM* LLM member
// Note: Callback functions (OnProgress, OnDownloaded, OnLLMError) must be marked as UFUNCTION() in your header file
void AMyActor::StartPredownload()
{
    LLM = URuntimeLocalLLM::CreateRuntimeLocalLLM();
    LLM->OnDownloadProgressNative.AddUObject(this, &AMyActor::OnProgress);
    LLM->OnModelDownloadedNative.AddUObject(this, &AMyActor::OnDownloaded);
    LLM->OnErrorNative.AddUObject(this, &AMyActor::OnLLMError);

    LLM->DownloadModelFromURL(
        TEXT("https://huggingface.co/user/model/resolve/main/model-Q4_K_M.gguf"));
}

void AMyActor::OnProgress(float Progress, int64 BytesReceived, int64 TotalBytes)
{
    FString ProgressText = URuntimeLLMUtils::FormatDownloadProgress(BytesReceived, TotalBytes);
    UE_LOG(LogTemp, Log, TEXT("Downloading: %s"), *ProgressText);
}

void AMyActor::OnDownloaded(const FString& FilePath, const FLLMModelMetadata& Metadata)
{
    UE_LOG(LogTemp, Log, TEXT("Model saved to: %s"), *FilePath);
    // Model is now on disk - load it later with LoadModelByName or LoadModelFromFile
}

void AMyActor::OnLLMError(ELLMErrorCode ErrorCode)
{
    FText Desc = URuntimeLLMUtils::GetLLMErrorDescription(ErrorCode);
    UE_LOG(LogTemp, Error, TEXT("LLM Error: %s"), *Desc.ToString());
}
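To use the cached model later, you can load it the same way as in the Simple Chat example. The helper below is a hypothetical function name, and it assumes you kept the FLLMModelMetadata passed to OnDownloaded (for example in a member variable); remember to also bind OnModelLoadedNative if you want a callback once loading finishes.

void AMyActor::LoadCachedModel(const FLLMModelMetadata& Metadata)
{
    FLLMInferenceParams Params;
    Params.MaxTokens = 256;

    // Resolve the on-disk file name from the metadata and load it by name,
    // exactly as in the Simple Chat example above
    FString ModelFileName = URuntimeLLMLibrary::GetModelFileName(Metadata);
    LLM->LoadModelByName(FName(*ModelFileName), Params);
}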
NPC Dialogue
A basic NPC dialogue system: the player types a message and the NPC replies. The conversation context persists between messages, so the NPC remembers earlier exchanges.
- Blueprint
- C++
// Assuming "this" is an AActor with a UPROPERTY() URuntimeLocalLLM* LLM member
// Note: Callback functions (OnToken, OnComplete, OnLLMError) must be marked as UFUNCTION() in your header file
void ANPCActor::BeginPlay()
{
    Super::BeginPlay();

    LLM = URuntimeLocalLLM::CreateRuntimeLocalLLM();
    LLM->OnTokenGeneratedNative.AddUObject(this, &ANPCActor::OnToken);
    LLM->OnGenerationCompleteNative.AddUObject(this, &ANPCActor::OnComplete);
    LLM->OnErrorNative.AddUObject(this, &ANPCActor::OnLLMError);

    FLLMInferenceParams Params;
    Params.MaxTokens = 150;
    Params.Temperature = 0.9f;
    Params.SystemPrompt = TEXT(
        "You are a grumpy blacksmith in a medieval village. "
        "Keep responses to 2-3 sentences. Stay in character.");

    TArray<FLLMModelMetadata> DownloadedModels = URuntimeLLMLibrary::GetAllDownloadedModelMetadata();

    // For demonstration purposes we simply pick the first available model; swap this with your own selection logic as needed
    if (DownloadedModels.Num() > 0)
    {
        const FLLMModelMetadata& Model = DownloadedModels[0]; // Select the first available model
        FString ModelFileName = URuntimeLLMLibrary::GetModelFileName(Model);
        LLM->LoadModelByName(FName(*ModelFileName), Params);
    }
}

void ANPCActor::OnPlayerInteract(const FString& PlayerMessage)
{
    if (LLM->IsModelLoaded() && !LLM->IsGenerating())
    {
        LLM->SendMessage(PlayerMessage);
    }
}

void ANPCActor::OnToken(const FString& Token)
{
    // Stream token into dialogue widget
}

void ANPCActor::OnComplete(const FString& FullResponse, float Duration, int32 Tokens, float TPS)
{
    // Re-enable player input
}

void ANPCActor::OnLLMError(ELLMErrorCode ErrorCode)
{
    FText Desc = URuntimeLLMUtils::GetLLMErrorDescription(ErrorCode);
    UE_LOG(LogTemp, Error, TEXT("LLM Error: %s"), *Desc.ToString());
}

void ANPCActor::OnPlayerLeaves()
{
    // Reset for next encounter
    LLM->ResetContext(true); // Keep system prompt
}



