Commit a84b07a

docs(provider/ollama): showcase ollama-ai-provider-v2 and ai-sdk-ollama (vercel#7998)
## Background

vercel#6924 (comment)

The current community-made Ollama provider is unresponsive to issues and PRs, has not been worked on in 7 months, and is missing essential features such as:

- tool streaming
- reasoning support
- Ollama's think toggle, for models like Qwen3

## Summary

This PR updates the docs to point new users toward an actively maintained, up-to-date package for use with Ollama.
1 parent 9c42bb2 commit a84b07a

File tree

1 file changed: +39 −27 lines


content/providers/03-community-providers/03-ollama.mdx

@@ -5,47 +5,40 @@ description: Learn how to use the Ollama provider.
 
 # Ollama Provider
 
-<Note type="warning">
-  This community provider is not yet compatible with AI SDK 5. It uses the
-  deprecated `.embedding()` method instead of the standard
-  `.textEmbeddingModel()` method. Please wait for the provider to be updated or
-  consider using an [AI SDK 5 compatible provider](/providers/ai-sdk-providers).
-</Note>
-
-[sgomez/ollama-ai-provider](https://github.com/sgomez/ollama-ai-provider) is a community provider that uses [Ollama](https://ollama.com/) to provide language model support for the AI SDK.
+[nordwestt/ollama-ai-provider-v2](https://github.com/nordwestt/ollama-ai-provider-v2) is a community provider that uses [Ollama](https://ollama.com/) to provide language model support for the AI SDK.
 
 ## Setup
 
-The Ollama provider is available in the `ollama-ai-provider` module. You can install it with
+The Ollama provider is available in the `ollama-ai-provider-v2` module. You can install it with
 
 <Tabs items={['pnpm', 'npm', 'yarn', 'bun']}>
   <Tab>
-    <Snippet text="pnpm add ollama-ai-provider" dark />
+    <Snippet text="pnpm add ollama-ai-provider-v2" dark />
   </Tab>
   <Tab>
-    <Snippet text="npm install ollama-ai-provider" dark />
+    <Snippet text="npm install ollama-ai-provider-v2" dark />
   </Tab>
   <Tab>
-    <Snippet text="yarn add ollama-ai-provider" dark />
+    <Snippet text="yarn add ollama-ai-provider-v2" dark />
   </Tab>
 
   <Tab>
-    <Snippet text="bun add ollama-ai-provider" dark />
+    <Snippet text="bun add ollama-ai-provider-v2" dark />
   </Tab>
 </Tabs>
 
 ## Provider Instance
 
-You can import the default provider instance `ollama` from `ollama-ai-provider`:
+You can import the default provider instance `ollama` from `ollama-ai-provider-v2`:
 
 ```ts
-import { ollama } from 'ollama-ai-provider';
+import { ollama } from 'ollama-ai-provider-v2';
 ```
 
-If you need a customized setup, you can import `createOllama` from `ollama-ai-provider` and create a provider instance with your settings:
+If you need a customized setup, you can import `createOllama` from `ollama-ai-provider-v2` and create a provider instance with your settings:
 
 ```ts
-import { createOllama } from 'ollama-ai-provider';
+import { createOllama } from 'ollama-ai-provider-v2';
 
 const ollama = createOllama({
   // optional settings, e.g.
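The hunk above cuts off before the optional `createOllama` settings. As a hedged sketch of what such a provider typically does with a custom endpoint (the option name `baseURL` and the default `http://localhost:11434/api` follow common AI SDK provider conventions and are assumptions here, not confirmed by this diff), the endpoint resolution can be modeled as:

```typescript
// Hypothetical helper modeling how a provider might resolve its endpoint.
// The `baseURL` option name and the localhost default are assumptions;
// check the ollama-ai-provider-v2 README for the actual settings.
function resolveBaseURL(options: { baseURL?: string } = {}): string {
  const url = options.baseURL ?? 'http://localhost:11434/api';
  // Strip a trailing slash so path segments can be appended uniformly.
  return url.replace(/\/+$/, '');
}

console.log(resolveBaseURL());
console.log(resolveBaseURL({ baseURL: 'http://gpu-box:11434/api/' }));
```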
@@ -77,21 +70,40 @@ You can find more models on the [Ollama Library](https://ollama.com/library) hom
 
 ### Model Capabilities
 
-This provider is capable of generating and streaming text and objects. Object generation may fail depending
-on the model used and the schema used.
-
-The following models have been tested with image inputs:
+This provider is capable of using hybrid reasoning models such as qwen3, allowing toggling of reasoning between messages.
 
-- llava
-- llava-llama3
-- llava-phi3
-- moondream
+```ts
+import { ollama } from 'ollama-ai-provider-v2';
+import { generateText } from 'ai';
+
+const { text } = await generateText({
+  model: ollama('qwen3:4b'),
+  providerOptions: { ollama: { think: true } },
+  prompt:
+    'Write a vegetarian lasagna recipe for 4 people, but really think about it',
+});
+```
 
 ## Embedding Models
 
 You can create models that call the [Ollama embeddings API](https://github.com/ollama/ollama/blob/main/docs/api.md#generate-embeddings)
-using the `.embedding()` factory method.
+using the `.textEmbeddingModel()` factory method.
 
 ```ts
-const model = ollama.embedding('nomic-embed-text');
+const model = ollama.textEmbeddingModel('nomic-embed-text');
+
+const { embeddings } = await embedMany({
+  model: model,
+  values: ['sunny day at the beach', 'rainy afternoon in the city'],
+});
+
+console.log(
+  `cosine similarity: ${cosineSimilarity(embeddings[0], embeddings[1])}`,
+);
 ```
+
+## Alternative Providers
+
+There is an alternative provider package called [`ai-sdk-ollama` by jagreehal](https://github.com/jagreehal/ai-sdk-ollama), which is fundamentally different from this provider. Instead of using the HTTP API directly, it leverages the [`ollama`](https://www.npmjs.com/package/ollama) package for communication.
+
+This approach may have different tradeoffs in terms of performance, compatibility, and features. It may be better or worse for your use case, so you may want to review both options to decide which fits your needs best.
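Whichever package is used, both ultimately talk to the same Ollama server endpoint. As a hedged illustration of what that boils down to, the following sketch builds the JSON body for Ollama's `POST /api/chat` route, including the `think` flag the commit message highlights (field names follow Ollama's REST API docs; `buildChatRequest` itself is a hypothetical helper, not part of either package):

```typescript
// Hypothetical sketch: the request that an Ollama provider ultimately sends
// to POST /api/chat. `model`, `messages`, `stream`, and `think` are fields
// documented in Ollama's REST API; the helper itself is illustrative only.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

function buildChatRequest(model: string, prompt: string, think = false) {
  const body = {
    model,
    messages: [{ role: 'user', content: prompt }] as ChatMessage[],
    stream: false,
    // `think` toggles reasoning on hybrid models such as qwen3.
    think,
  };
  return { url: 'http://localhost:11434/api/chat', body };
}

const req = buildChatRequest('qwen3:4b', 'Why is the sky blue?', true);
console.log(JSON.stringify(req.body));
```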
