OmniLLM API Reference¶
The omnillm subpackage provides the OpenAI provider adapter for omnillm-core.
Provider¶
New¶
Creates a new OpenAI provider with the given configuration.
Config¶
| Field | Type | Description |
|---|---|---|
| APIKey | string | OpenAI API key (required) |
| BaseURL | string | Custom API endpoint |
| Organization | string | OpenAI organization ID |
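Putting `New` and `Config` together, construction might look like the sketch below. The import path, the error return on `New`, and the `log`-based error handling are assumptions for illustration; only the `Config` field names come from the table above.

```go
package main

import (
	"log"

	"github.com/example/omnillm" // hypothetical import path, not given in this reference
)

func main() {
	// Field names follow the Config table above; the error return
	// from New is an assumption.
	p, err := omnillm.New(omnillm.Config{
		APIKey:       "sk-...",  // required
		Organization: "org-...", // optional OpenAI organization ID
	})
	if err != nil {
		log.Fatal(err)
	}
	defer p.Close() // release provider resources when done
}
```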
Methods¶
Name¶
Returns the provider identifier.
Capabilities¶
Returns the provider's supported features.
Returns:

```go
core.Capabilities{
	Tools:             true,
	Streaming:         true,
	Vision:            true,
	JSON:              true,
	SystemRole:        true,
	MaxContextWindow:  128000,
	SupportsMaxTokens: true,
}
```
CreateChatCompletion¶
Sends a chat completion request.
```go
func (p *Provider) CreateChatCompletion(ctx context.Context, req *core.ChatCompletionRequest) (*core.ChatCompletionResponse, error)
```
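A minimal call might look like the following sketch. The `ChatCompletionRequest` field names (`Model`, `Messages`, `Role`, `Content`) and the shape of the response are assumptions, not confirmed by this reference:

```go
// Hypothetical request fields; only the method signature above is documented.
resp, err := p.CreateChatCompletion(ctx, &core.ChatCompletionRequest{
	Model: "gpt-4o",
	Messages: []core.Message{
		{Role: "user", Content: "Hello"},
	},
})
if err != nil {
	return err
}
// Assumed response shape: a list of choices, each carrying a message.
fmt.Println(resp.Choices[0].Message.Content)
```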
CreateChatCompletionStream¶
Creates a streaming chat completion.
```go
func (p *Provider) CreateChatCompletionStream(ctx context.Context, req *core.ChatCompletionRequest) (core.ChatCompletionStream, error)
```
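Consuming the stream might follow the common `Recv`-until-`io.EOF` pattern sketched below. The `Recv` and `Close` methods on `core.ChatCompletionStream` and the chunk's `Delta` field are assumptions for illustration:

```go
stream, err := p.CreateChatCompletionStream(ctx, req)
if err != nil {
	return err
}
defer stream.Close() // assumed: streams must be closed when done

for {
	chunk, err := stream.Recv() // assumed method name
	if errors.Is(err, io.EOF) {
		break // stream finished
	}
	if err != nil {
		return err
	}
	// Assumed chunk shape: incremental content deltas.
	fmt.Print(chunk.Choices[0].Delta.Content)
}
```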
Close¶
Releases resources held by the provider.
Auto-Registration¶
The package auto-registers with omnillm-core on import.
This registers the OpenAI provider with priority PriorityThick (10), which overrides any thin providers.
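Side-effect registration of this kind is normally triggered with a blank import, as in the sketch below; the import path is hypothetical, since this reference does not state it:

```go
import (
	// Blank import runs the package's init function, which registers
	// the OpenAI provider with omnillm-core at PriorityThick (10).
	_ "github.com/example/omnillm" // hypothetical import path
)
```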