Mistral AI Dart Client #
Unofficial Dart client for the Mistral AI API.
Note: Mistral AI API is currently in closed beta. You can request access here.
Features #
Generated from the official Mistral AI OpenAPI specification
Fully type-safe, documented and tested
All platforms supported (including streaming on web)
Custom base URL, headers and query params support (e.g. HTTP proxies)
Custom HTTP client support (e.g. SOCKS5 proxies or advanced use cases)
Supported endpoints:
Chat Completions (with streaming support)
Embeddings
Models
Table of contents #
Usage
Chat Completions
Embeddings
Models
List models
Advanced Usage
Default HTTP client
Custom HTTP client
Using a proxy
HTTP proxy
SOCKS5 proxy
Acknowledgements
License
Usage #
Refer to the documentation for more information about the API.
Chat Completions #
Given a list of messages comprising a conversation, the model will return a response.
Generate chat completion:
```dart
final client = MistralAIClient(apiKey: 'MISTRAL_API_KEY');

final res = await client.createChatCompletion(
  request: ChatCompletionRequest(
    model: ChatCompletionModel.model(ChatCompletionModels.mistralMedium),
    temperature: 0,
    messages: [
      ChatCompletionMessage(
        role: ChatCompletionMessageRole.user,
        content: 'Why is the sky blue?',
      ),
    ],
  ),
);
print(res.choices.first.message?.content);
// The sky appears blue due to a phenomenon called Rayleigh scattering...
```
ChatCompletionModel is a sealed class that offers two ways to specify the model:
ChatCompletionModel.modelId('model-id'): the model ID as string (e.g. 'mistral-small').
ChatCompletionModel.model(ChatCompletionModels.mistralMedium): a value from ChatCompletionModels enum which lists all the available models.
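For example, both of the following specify Mistral Medium (a sketch; the string ID is taken from the model list below):

```dart
// Two equivalent ways to specify the same model: via the enum,
// or via its string ID.
const byEnum = ChatCompletionModel.model(ChatCompletionModels.mistralMedium);
const byId = ChatCompletionModel.modelId('mistral-medium');
```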
The following models are available at the moment:
mistral-tiny: Mistral 7B Instruct v0.2 (a minor release of Mistral 7B Instruct). It only works in English and obtains 7.6 on MT-Bench.
mistral-small: Mixtral 8x7B. It masters English/French/Italian/German/Spanish and code and obtains 8.3 on MT-Bench.
mistral-medium: a prototype model that is currently among the top serviced models available, based on standard benchmarks. It masters English/French/Italian/German/Spanish and code and obtains a score of 8.6 on MT-Bench.
Mind that this list may not be up-to-date. Refer to the documentation for the updated list.
Stream chat completion:
```dart
final stream = client.createChatCompletionStream(
  request: const ChatCompletionRequest(
    model: ChatCompletionModel.model(ChatCompletionModels.mistralMedium),
    temperature: 0,
    messages: [
      ChatCompletionMessage(
        role: ChatCompletionMessageRole.user,
        content: 'Why is the sky blue?',
      ),
    ],
  ),
);
String text = '';
await for (final res in stream) {
  text += res.choices.first.delta.content?.trim() ?? '';
}
print(text);
// The sky appears blue due to a phenomenon called Rayleigh scattering...
```
Embeddings #
Given a prompt, the model will generate an embedding representing the prompt.
Generate embedding:
```dart
final generated = await client.createEmbedding(
  request: const EmbeddingRequest(
    model: EmbeddingModel.model(EmbeddingModels.mistralEmbed),
    input: ['Why is the sky blue?'],
  ),
);
print(generated.data.first.embedding);
// [-0.0182342529296875, 0.03594970703125, 0.0286102294921875, ...]
```
EmbeddingModel is a sealed class that offers two ways to specify the model:
EmbeddingModel.modelId('model-id'): the model ID as string (e.g. 'mistral-embed').
EmbeddingModel.model(EmbeddingModels.mistralEmbed): a value from EmbeddingModels enum which lists all the available models.
The following models are available at the moment:
mistral-embed: an embedding model with 1024 embedding dimensions, designed with retrieval capabilities in mind. It achieves a retrieval score of 55.26 on MTEB.
Models #
List models #
List models that are available.
```dart
final res = await client.listModels();
print(res.data);
// [Model(id: mistral-medium, object: model, created: 1702396611, ownedBy: mistralai), ...]
```
Advanced Usage #
Default HTTP client #
By default, the client uses https://api.mistral.ai/v1 as the baseUrl and the following implementations of http.Client:
Non-web: IOClient
Web: FetchClient (to support streaming on web)
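If the defaults suit your use case, only the API key is needed (a minimal sketch):

```dart
// Relies on the default baseUrl (https://api.mistral.ai/v1) and the
// platform-appropriate http.Client described above.
final client = MistralAIClient(apiKey: 'MISTRAL_API_KEY');
```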
Custom HTTP client #
You can always provide your own implementation of http.Client for further customization:
```dart
final client = MistralAIClient(
  apiKey: 'MISTRAL_API_KEY',
  client: MyHttpClient(),
);
```
Using a proxy #
HTTP proxy #
You can use your own HTTP proxy by overriding the baseUrl and providing your required headers:
```dart
final client = MistralAIClient(
  baseUrl: 'https://my-proxy.com',
  headers: {
    'x-my-proxy-header': 'value',
  },
);
```
If you need further customization, you can always provide your own http.Client.
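As a sketch of what such a custom client could look like, here is a hypothetical `http.Client` wrapper (`LoggingClient` is illustrative, not part of this package) that logs each request before delegating:

```dart
import 'package:http/http.dart' as http;

/// Hypothetical example: logs every outgoing request, then delegates
/// to the wrapped client.
class LoggingClient extends http.BaseClient {
  LoggingClient(this._inner);

  final http.Client _inner;

  @override
  Future<http.StreamedResponse> send(http.BaseRequest request) {
    print('${request.method} ${request.url}');
    return _inner.send(request);
  }
}
```

It can then be passed to the constructor via the `client` parameter, e.g. `MistralAIClient(apiKey: 'MISTRAL_API_KEY', client: LoggingClient(http.Client()))`.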
SOCKS5 proxy #
To use a SOCKS5 proxy, you can use the socks5_proxy package:
```dart
final baseHttpClient = HttpClient();
SocksTCPClient.assignToHttpClient(baseHttpClient, [
  ProxySettings(InternetAddress.loopbackIPv4, 1080),
]);
final httpClient = IOClient(baseHttpClient);

final client = MistralAIClient(
  apiKey: 'MISTRAL_API_KEY',
  client: httpClient,
);
```
Acknowledgements #
The generation of this client was made possible by the openapi_spec package.
License #
Mistral AI Dart Client is licensed under the MIT License.