LanguageModelChatProvider
A LanguageModelChatProvider implements access to language models, which users can then use through the chat view, or through the extension API by acquiring a LanguageModelChat. An example would be an OpenAI provider that offers models like gpt-5, o3, etc.
Functions
abstract fun provideLanguageModelChatInformation(options: PrepareLanguageModelChatModelOptions, token: CancellationToken): ProviderResult<ReadonlyArray<T>>
Get the list of available language models provided by this provider.
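As a rough TypeScript sketch (the Kotlin signatures on this page wrap the VS Code TypeScript API), a provider might return a static catalog of models. The stub interfaces below, and field names such as maxInputTokens, are local stand-ins chosen for illustration, not the real vscode module:

```typescript
// Local stand-ins for the types named in the signature above (assumptions,
// not the actual vscode declarations).
interface CancellationToken { isCancellationRequested: boolean; }
interface PrepareLanguageModelChatModelOptions { silent: boolean; }
interface LanguageModelChatInformation {
  id: string;
  name: string;
  family: string;
  maxInputTokens: number;
  maxOutputTokens: number;
}

// A provider exposing a fixed list of models; a real implementation might
// query the backend's model-listing endpoint instead.
function provideLanguageModelChatInformation(
  options: PrepareLanguageModelChatModelOptions,
  token: CancellationToken,
): readonly LanguageModelChatInformation[] {
  return [
    { id: "gpt-5", name: "GPT-5", family: "gpt", maxInputTokens: 128_000, maxOutputTokens: 16_000 },
    { id: "o3", name: "o3", family: "o", maxInputTokens: 200_000, maxOutputTokens: 100_000 },
  ];
}
```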
abstract fun provideLanguageModelChatResponse(model: T, messages: ReadonlyArray<LanguageModelChatRequestMessage>, options: ProvideLanguageModelChatResponseOptions, progress: Progress<LanguageModelResponsePart>, token: CancellationToken): PromiseLike<Void>
Returns the response for a chat request, passing the results to the progress callback. The LanguageModelChatProvider must emit the response parts to the progress callback as they are received from the language model.
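The streaming contract above can be sketched as follows. This is a minimal illustration with local stub types, dropping the model and options parameters for brevity; the canned async generator stands in for a real network stream from the model backend:

```typescript
// Local stand-ins for the vscode types in the signature (assumptions).
interface CancellationToken { isCancellationRequested: boolean; }
interface Progress<T> { report(value: T): void; }
interface LanguageModelTextPart { value: string; }
type LanguageModelResponsePart = LanguageModelTextPart;
interface LanguageModelChatRequestMessage { role: string; content: string; }

// Emits each response part to `progress` as it arrives, stopping early if
// cancellation is requested, per the contract described above.
async function provideLanguageModelChatResponse(
  messages: readonly LanguageModelChatRequestMessage[],
  progress: Progress<LanguageModelResponsePart>,
  token: CancellationToken,
): Promise<void> {
  // Stand-in for a streaming call to the model backend.
  async function* fakeModelStream(): AsyncGenerator<string> {
    for (const chunk of ["Hello", ", ", "world", "!"]) yield chunk;
  }
  for await (const chunk of fakeModelStream()) {
    if (token.isCancellationRequested) return;
    progress.report({ value: chunk });
  }
}
```

Reporting parts incrementally, rather than buffering the full response, is what lets the chat view render tokens as they stream in.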
abstract fun provideTokenCount(model: T, text: String, token: CancellationToken): PromiseLike<JsInt>
Returns the number of tokens for a given text using the model-specific tokenizer logic.
abstract fun provideTokenCount(model: T, text: LanguageModelChatRequestMessage, token: CancellationToken): PromiseLike<JsInt>
Returns the number of tokens for a given message using the model-specific tokenizer logic.
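A deliberately naive sketch of the text overload, dropping the model parameter: a real provider would delegate to the model's own tokenizer (for example a BPE vocabulary), so the whitespace split here is purely a placeholder to show the shape of the contract:

```typescript
// Placeholder tokenizer: counts whitespace-separated chunks. Real providers
// must use the model-specific tokenizer, as the documentation above notes.
function provideTokenCount(text: string): Promise<number> {
  const trimmed = text.trim();
  const tokens = trimmed.length === 0 ? [] : trimmed.split(/\s+/);
  return Promise.resolve(tokens.length);
}
```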