Create Message Request
@Serializable class CreateMessageRequest
A request from the server to sample an LLM via the client. The client has full discretion over which model to select. The client should also inform the user before beginning sampling to allow them to inspect the request (human in the loop) and decide whether to approve it.
Constructors
constructor(
    messages: List<SamplingMessage>,
    systemPrompt: String?,
    includeContext: CreateMessageRequest.IncludeContext?,
    temperature: Double?,
    maxTokens: Int,
    stopSequences: List<String>?,
    metadata: JsonObject = EmptyJsonObject,
    modelPreferences: ModelPreferences?,
    _meta: JsonObject = EmptyJsonObject
)
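The snippet below sketches how a server might assemble this request before handing it to the client for sampling. It is a minimal example against the constructor above; SamplingMessage, TextContent, and Role are assumed to mirror the MCP schema (a role plus a content block) and live alongside this class in the SDK, so adjust the names to the SDK version you are using.

val request = CreateMessageRequest(
    messages = listOf(
        SamplingMessage(
            role = Role.user,                      // assumed enum constant for the "user" role
            content = TextContent(text = "Summarize the latest build log."),
        )
    ),
    systemPrompt = "You are a concise assistant.", // the client MAY modify or omit this
    includeContext = null,                         // or an IncludeContext value to request server context
    temperature = 0.7,
    maxTokens = 256,
    stopSequences = null,
    modelPreferences = null,                       // leave model selection entirely to the client
)

The client receiving this request still decides which model to run and should surface the request to the user for approval before sampling begins.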
Properties
includeContext
A request to include context from one or more MCP servers (including the caller), to be attached to the prompt. The client MAY ignore this request.
modelPreferences
The server's preferences for which model to select. A usage sketch follows this property list.
systemPrompt
An optional system prompt the server wants to use for sampling. The client MAY modify or omit this prompt.
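Where the server does want to steer model selection, modelPreferences can carry advisory hints and priorities. The sketch below assumes ModelPreferences and ModelHint expose the fields defined by the MCP specification (hints, costPriority, speedPriority, intelligencePriority); the client MAY still ignore all of them.

val preferences = ModelPreferences(
    hints = listOf(ModelHint(name = "claude-3-5-sonnet")), // advisory name fragments, matched by the client
    costPriority = 0.3,          // 0.0 to 1.0: how much to weight cost
    speedPriority = 0.5,         // 0.0 to 1.0: how much to weight latency
    intelligencePriority = 0.8,  // 0.0 to 1.0: how much to weight capability
)

Passing this object as the modelPreferences argument of the constructor above expresses a preference without removing the client's discretion over the final model choice.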