Raymond Maarloeve
LlmManager Class Reference

Manages communication with the LLM server, including model registration, loading, unloading, and chat requests. Handles request queuing and ensures only one POST request is processed at a time. More...

Public Member Functions

void Setup (string api)
 Sets up the LLM manager with the specified API base URL.
 
IEnumerator Get< T > (string endpoint, Action< T > onSuccess, Action< string > onError)
 Sends a GET request to the specified endpoint and deserializes the response to type T.
 
void QueuePostRequest< T, TRequest > (string endpoint, TRequest data, Action< T > onSuccess, Action< string > onError)
 Queues a POST request, ensuring that only one executes at a time.
 
void Status (Action< StatusDTO > onComplete, Action< string > onError)
 Gets the status of the LLM server.
 
void LoadModel (string modelID, string path, Action< MessageDTO > onComplete, Action< string > onError)
 Loads a model on the LLM server.
 
void UnloadModel (string modelID, Action< MessageDTO > onComplete, Action< string > onError)
 Unloads a model from the LLM server.
 
void Register (string modelID, string path, Action< MessageDTO > onComplete, Action< string > onError)
 Registers a model with the LLM server, making it available for loading and inference.
 
void Chat (string modelID, List< Message > messages, Action< ChatResponseDTO > onComplete, Action< string > onError, float top_p=0.95f, float temperature=0.8f, int maxTokens=4096)
 Sends a chat request to the LLM server using the specified model and message history.
 
void Connect (Action< bool > onComplete)
 Connects to the LLM server and unloads all currently loaded models.
 
void GenericComplete (MessageDTO message)
 Generic callback for handling MessageDTO responses, logs success or error messages.
 

Static Public Member Functions

static bool StatusCommand ()
 Console command to check the status of the LLM server.
 
static bool LLMQueue ()
 

Public Attributes

bool LogDebug = false
 Whether to log requests and responses.
 

Static Public Attributes

static LlmManager Instance
 Singleton instance of the LlmManager.
 

Properties

bool IsConnected [get]
 Indicates whether the manager is connected to the LLM server.
 

Detailed Description

Manages communication with the LLM server, including model registration, loading, unloading, and chat requests. Handles request queuing and ensures only one POST request is processed at a time.

Member Function Documentation

◆ Chat()

void LlmManager.Chat ( string  modelID,
List< Message >  messages,
Action< ChatResponseDTO >  onComplete,
Action< string >  onError,
float  top_p = 0.95f,
float  temperature = 0.8f,
int  maxTokens = 4096 
)

Sends a chat request to the LLM server using the specified model and message history.

Parameters
modelID: Unique identifier for the model to use.
messages: List of messages forming the conversation history.
onComplete: Callback on successful response.
onError: Callback on error.
top_p: The top_p (nucleus sampling) LLM parameter.
temperature: The temperature LLM parameter.
maxTokens: Maximum number of tokens to generate.
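
A minimal usage sketch. The model ID, the message contents, and the `Role`/`Content` fields on `Message` are assumptions for illustration; the actual `Message` type is defined elsewhere in the project.

```csharp
// Hypothetical sketch: assumes a Message type with Role/Content fields
// and a model already registered and loaded under the ID "npc-model".
var history = new List<Message>
{
    new Message { Role = "system", Content = "You are a village elder." },
    new Message { Role = "user", Content = "What happened last night?" }
};

LlmManager.Instance.Chat(
    "npc-model",
    history,
    response => Debug.Log("LLM reply received."),      // ChatResponseDTO
    error => Debug.LogError($"Chat failed: {error}"),
    top_p: 0.9f,
    temperature: 0.7f,
    maxTokens: 512);
```

The call is queued internally, so issuing several Chat requests in a row is safe; they execute one at a time.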

◆ Connect()

void LlmManager.Connect ( Action< bool >  onComplete)

Connects to the LLM server and unloads all currently loaded models.

Parameters
onComplete: Callback with connection status (true if healthy).
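
Since Connect also unloads all currently loaded models, it is typically called once at startup, before any Register or LoadModel calls. A sketch:

```csharp
// Establish the connection, then check the IsConnected property.
LlmManager.Instance.Connect(healthy =>
{
    if (healthy)
        Debug.Log($"Connected: {LlmManager.Instance.IsConnected}");
    else
        Debug.LogError("LLM server is not reachable.");
});
```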

◆ GenericComplete()

void LlmManager.GenericComplete ( MessageDTO  message)

Generic callback for handling MessageDTO responses, logs success or error messages.

Parameters
messageThe: The message DTO returned from the server.

◆ Get< T >()

IEnumerator LlmManager.Get< T > ( string  endpoint,
Action< T >  onSuccess,
Action< string >  onError 
)

Sends a GET request to the specified endpoint and deserializes the response to type T.

Template Parameters
T: Type to deserialize the response to.
Parameters
endpoint: API endpoint.
onSuccess: Callback on successful response.
onError: Callback on error.
Returns
Coroutine enumerator.
Type Constraints
T : class 
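
Because Get< T > returns an IEnumerator, it must be driven as a Unity coroutine from a MonoBehaviour. A sketch with a hypothetical DTO and endpoint (both are assumptions, not part of this API):

```csharp
// Hypothetical response DTO for illustration only.
[System.Serializable]
public class HealthDTO { public string status; }

// Inside a MonoBehaviour:
StartCoroutine(LlmManager.Instance.Get<HealthDTO>(
    "/health",                                    // endpoint is an assumption
    dto => Debug.Log($"Server status: {dto.status}"),
    err => Debug.LogError(err)));
```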

◆ LLMQueue()

static bool LlmManager.LLMQueue ( )
static

◆ LoadModel()

void LlmManager.LoadModel ( string  modelID,
string  path,
Action< MessageDTO >  onComplete,
Action< string >  onError 
)

Loads a model on the LLM server.

Parameters
modelID: Unique identifier for the model.
path: File system path to the model file.
onComplete: Callback on successful response.
onError: Callback on error.
Todo:
Make it randomized.

◆ QueuePostRequest< T, TRequest >()

void LlmManager.QueuePostRequest< T, TRequest > ( string  endpoint,
TRequest  data,
Action< T >  onSuccess,
Action< string >  onError 
)

Queues a POST request, ensuring that only one executes at a time.

Template Parameters
T: Type to deserialize the response to.
TRequest: Type of the request data.
Parameters
endpoint: API endpoint.
data: Request data.
onSuccess: Callback on successful response.
onError: Callback on error.
Type Constraints
T : class 
TRequest : class 
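
Higher-level methods such as LoadModel and Chat presumably route through this queue. Calling it directly might look like the following; the endpoint path and the request DTO shape are assumptions for illustration only.

```csharp
// Hypothetical request DTO; the real request types are project-specific.
[System.Serializable]
public class LoadRequest { public string model_id; public string path; }

LlmManager.Instance.QueuePostRequest<MessageDTO, LoadRequest>(
    "/load",                                      // endpoint is an assumption
    new LoadRequest { model_id = "npc-model", path = "Models/npc.gguf" },
    LlmManager.Instance.GenericComplete,          // logs success or error
    err => Debug.LogError(err));
```

If another POST is already in flight, the request waits in the queue and runs when the current one finishes.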

◆ Register()

void LlmManager.Register ( string  modelID,
string  path,
Action< MessageDTO >  onComplete,
Action< string >  onError 
)

Registers a model with the LLM server, making it available for loading and inference.

Parameters
modelID: Unique identifier for the model.
path: File system path to the model file.
onComplete: Callback on successful response.
onError: Callback on error.
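
A model must be registered before it can be loaded, so the two calls are naturally chained. A sketch (model ID and path are placeholders):

```csharp
// Register the model, then load it once registration succeeds.
LlmManager.Instance.Register("npc-model", "Models/npc.gguf",
    _ => LlmManager.Instance.LoadModel("npc-model", "Models/npc.gguf",
        LlmManager.Instance.GenericComplete,      // logs success or error
        err => Debug.LogError($"Load failed: {err}")),
    err => Debug.LogError($"Register failed: {err}"));
```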

◆ Setup()

void LlmManager.Setup ( string  api)

Sets up the LLM manager with the specified API base URL.

Parameters
api: Base URL of the LLM server API.
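
Setup only stores the base URL; it does not contact the server. A sketch of pointing the manager at a local server and then querying it (the URL is a placeholder):

```csharp
// Configure the base URL, then verify the server responds.
LlmManager.Instance.Setup("http://localhost:8000");
LlmManager.Instance.Status(
    status => Debug.Log("LLM server responded."),
    err => Debug.LogError($"Status check failed: {err}"));
```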

◆ Status()

void LlmManager.Status ( Action< StatusDTO >  onComplete,
Action< string >  onError 
)

Gets the status of the LLM server.

Parameters
onComplete: Callback on successful response.
onError: Callback on error.

◆ StatusCommand()

static bool LlmManager.StatusCommand ( )
static

Console command to check the status of the LLM server.

Returns
True if the command was executed.

◆ UnloadModel()

void LlmManager.UnloadModel ( string  modelID,
Action< MessageDTO >  onComplete,
Action< string >  onError 
)

Unloads a model from the LLM server.

Parameters
modelID: Unique identifier for the model.
onComplete: Callback on successful response.
onError: Callback on error.

Member Data Documentation

◆ Instance

LlmManager LlmManager.Instance
static

Singleton instance of the LlmManager.

◆ LogDebug

bool LlmManager.LogDebug = false

Whether to log requests and responses.

Property Documentation

◆ IsConnected

bool LlmManager.IsConnected
get

Indicates whether the manager is connected to the LLM server.


The documentation for this class was generated from the following file: