Merge branch 'main' into main
justkahdri authored Apr 29, 2024
2 parents 3b2392d + d2bc2aa commit b3aa8ea
Showing 69 changed files with 2,472 additions and 496 deletions.
46 changes: 31 additions & 15 deletions docs/pages/docs/ai-core/anthropic.mdx
@@ -41,30 +41,37 @@ yarn add @ai-sdk/anthropic

## Provider Instance

You can import the default provider instance `anthropic` from `@ai-sdk/anthropic`:

```ts
import { anthropic } from '@ai-sdk/anthropic';
```

If you need a customized setup, you can import `createAnthropic` from `@ai-sdk/anthropic` and create a provider instance with your settings:

```ts
import { createAnthropic } from '@ai-sdk/anthropic';

const anthropic = createAnthropic({
  // custom settings
});
```

You can use the following optional settings to customize the Anthropic provider instance:
- **baseURL** _string_

Use a different URL prefix for API calls, e.g. to use proxy servers.
The default prefix is `https://api.anthropic.com/v1`.

- **apiKey** _string_

API key that is sent using the `x-api-key` header.
It defaults to the `ANTHROPIC_API_KEY` environment variable.

- **headers** _Record&lt;string,string&gt;_

Custom headers to include in the requests.
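
As a rough sketch of how these settings interact (an illustration only, not the SDK's actual implementation — `resolveRequestConfig` and the `env` parameter are hypothetical): the API key falls back to an environment variable, and custom headers are merged into the request headers.

```ts
// Hypothetical sketch of the settings resolution described above.
interface ProviderSettings {
  baseURL?: string;
  apiKey?: string;
  headers?: Record<string, string>;
}

function resolveRequestConfig(
  settings: ProviderSettings = {},
  env: Record<string, string | undefined> = {}, // stands in for process.env
) {
  // the API key defaults to the ANTHROPIC_API_KEY environment variable
  const apiKey = settings.apiKey ?? env['ANTHROPIC_API_KEY'];
  if (apiKey == null) {
    throw new Error('Anthropic API key is missing.');
  }
  return {
    // a custom baseURL overrides the default prefix
    baseURL: settings.baseURL ?? 'https://api.anthropic.com/v1',
    headers: {
      'x-api-key': apiKey, // the key is sent via the x-api-key header
      ...settings.headers, // custom headers are merged in
    },
  };
}
```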

## Models

@@ -84,3 +91,12 @@

```ts
const model = anthropic('claude-3-haiku-20240307', {
  topK: 0.2,
});
```

The following optional settings are available for Anthropic models:

- **topK** _number_

Only sample from the top K options for each subsequent token.

Used to remove "long tail" low probability responses.
Recommended for advanced use cases only. You usually only need to use temperature.
115 changes: 76 additions & 39 deletions docs/pages/docs/ai-core/custom-provider.mdx
@@ -19,62 +19,99 @@ It can be imported from `'@ai-sdk/provider'`.

We also provide utilities that make it easier to implement a custom provider. You can find them in the `@ai-sdk/provider-utils` package ([source code](https://github.com/vercel/ai/tree/main/packages/provider-utils)).

There are several reference implementations, e.g. the [Mistral reference implementation](https://github.com/vercel/ai/tree/main/packages/mistral).

## Provider Entry Point

Each AI SDK provider should follow the pattern of using a factory function that returns a provider instance,
and should export a default instance.

```ts filename="custom-provider.ts"
import {
  generateId,
  loadApiKey,
  withoutTrailingSlash,
} from '@ai-sdk/provider-utils';
import { CustomChatLanguageModel } from './custom-chat-language-model';
import { CustomChatModelId, CustomChatSettings } from './custom-chat-settings';

// model factory function with additional methods and properties
export interface CustomProvider {
  (
    modelId: CustomChatModelId,
    settings?: CustomChatSettings,
  ): CustomChatLanguageModel;

  // explicit method for targeting a specific API in case there are several
  chat(
    modelId: CustomChatModelId,
    settings?: CustomChatSettings,
  ): CustomChatLanguageModel;
}

// optional settings for the provider
export interface CustomProviderSettings {
  /**
Use a different URL prefix for API calls, e.g. to use proxy servers.
   */
  baseURL?: string;

  /**
API key.
   */
  apiKey?: string;

  /**
Custom headers to include in the requests.
   */
  headers?: Record<string, string>;

  /**
Generate a unique identifier for each request.
   */
  generateId?: () => string;
}

// provider factory function
export function createCustomProvider(
  options: CustomProviderSettings = {},
): CustomProvider {
  const createModel = (
    modelId: CustomChatModelId,
    settings: CustomChatSettings = {},
  ) =>
    new CustomChatLanguageModel(modelId, settings, {
      provider: 'custom.chat',
      baseURL:
        withoutTrailingSlash(options.baseURL) ?? 'https://custom.ai/api/v1',
      headers: () => ({
        Authorization: `Bearer ${loadApiKey({
          apiKey: options.apiKey,
          environmentVariableName: 'CUSTOM_API_KEY',
          description: 'Custom Provider',
        })}`,
        ...options.headers,
      }),
      generateId: options.generateId ?? generateId,
    });

  const provider = function (
    modelId: CustomChatModelId,
    settings?: CustomChatSettings,
  ) {
    if (new.target) {
      throw new Error(
        'The model factory function cannot be called with the new keyword.',
      );
    }

    return createModel(modelId, settings);
  };

  provider.chat = createModel;

  return provider as CustomProvider;
}

/**
 * Default custom provider instance.
 */
export const customProvider = createCustomProvider();
```
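
Stripped of the API specifics, the callable-provider pattern above can be shown as a self-contained sketch — `StubModel` and `createStubProvider` below are hypothetical stand-ins for the real model class and factory:

```ts
// A stub model class standing in for a real language model implementation.
class StubModel {
  constructor(
    readonly modelId: string,
    readonly settings: Record<string, unknown> = {},
  ) {}
}

// callable provider interface: invoke directly or via an explicit method
interface StubProvider {
  (modelId: string, settings?: Record<string, unknown>): StubModel;
  chat(modelId: string, settings?: Record<string, unknown>): StubModel;
}

function createStubProvider(): StubProvider {
  const createModel = (
    modelId: string,
    settings: Record<string, unknown> = {},
  ) => new StubModel(modelId, settings);

  const provider = function (
    modelId: string,
    settings?: Record<string, unknown>,
  ) {
    // guard against accidental construction with `new`
    if (new.target) {
      throw new Error(
        'The model factory function cannot be called with the new keyword.',
      );
    }
    return createModel(modelId, settings);
  };

  provider.chat = createModel;

  return provider as StubProvider;
}

const stubProvider = createStubProvider();
const direct = stubProvider('model-a'); // call the provider directly
const viaChat = stubProvider.chat('model-b'); // or target the chat API explicitly
```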

## Language Model Implementation
47 changes: 32 additions & 15 deletions docs/pages/docs/ai-core/google.mdx
@@ -39,30 +39,37 @@ yarn add @ai-sdk/google

## Provider Instance

You can import the default provider instance `google` from `@ai-sdk/google`:

```ts
import { google } from '@ai-sdk/google';
```

If you need a customized setup, you can import `createGoogleGenerativeAI` from `@ai-sdk/google` and create a provider instance with your settings:

```ts
import { createGoogleGenerativeAI } from '@ai-sdk/google';

const google = createGoogleGenerativeAI({
  // custom settings
});
```

You can use the following optional settings to customize the Google Generative AI provider instance:
- **baseURL** _string_

Use a different URL prefix for API calls, e.g. to use proxy servers.
The default prefix is `https://generativelanguage.googleapis.com/v1beta`.

- **apiKey** _string_

API key that is sent using the `x-goog-api-key` header.
It defaults to the `GOOGLE_GENERATIVE_AI_API_KEY` environment variable.

- **headers** _Record&lt;string,string&gt;_

Custom headers to include in the requests.

## Models

@@ -82,3 +89,13 @@

```ts
const model = google('models/gemini-pro', {
  topK: 0.2,
});
```

The following optional settings are available for Google Generative AI models:

- **topK** _number_

Optional. The maximum number of tokens to consider when sampling.

Models use nucleus sampling or combined Top-k and nucleus sampling.
Top-k sampling considers the set of topK most probable tokens.
Models running with nucleus sampling don't allow topK setting.
50 changes: 33 additions & 17 deletions docs/pages/docs/ai-core/mistral.mdx
@@ -39,34 +39,42 @@ yarn add @ai-sdk/mistral

## Provider Instance

You can import the default provider instance `mistral` from `@ai-sdk/mistral`:

```ts
import { mistral } from '@ai-sdk/mistral';
```

If you need a customized setup, you can import `createMistral` from `@ai-sdk/mistral`
and create a provider instance with your settings:

```ts
import { createMistral } from '@ai-sdk/mistral';

const mistral = createMistral({
  // custom settings
});
```

You can use the following optional settings to customize the Mistral provider instance:
- **baseURL** _string_

Use a different URL prefix for API calls, e.g. to use proxy servers.
The default prefix is `https://api.mistral.ai/v1`.

- **apiKey** _string_

API key that is sent using the `Authorization` header.
It defaults to the `MISTRAL_API_KEY` environment variable.

- **headers** _Record&lt;string,string&gt;_

Custom headers to include in the requests.
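
As a rough sketch of how the settings above are applied (an illustration only, not the SDK's actual implementation — `mistralAuthHeaders` and the `env` parameter are hypothetical): the key resolves from the setting or the environment, and is sent as a bearer token alongside any custom headers.

```ts
// Hypothetical sketch of how the Authorization header described above is built.
function mistralAuthHeaders(
  settings: { apiKey?: string; headers?: Record<string, string> } = {},
  env: Record<string, string | undefined> = {}, // stands in for process.env
): Record<string, string> {
  // the API key defaults to the MISTRAL_API_KEY environment variable
  const apiKey = settings.apiKey ?? env['MISTRAL_API_KEY'];
  if (apiKey == null) {
    throw new Error('Mistral API key is missing.');
  }
  return {
    Authorization: `Bearer ${apiKey}`, // sent via the Authorization header
    ...settings.headers, // custom headers are merged in
  };
}
```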

## Models

You can create models that call the [Mistral chat API](https://docs.mistral.ai/api/#operation/createChatCompletion) using the provider instance.
The first argument is the model id, e.g. `mistral-large-latest`.
Some Mistral chat models support tool calls.

@@ -78,7 +86,15 @@ Mistral chat models also support additional model settings that are not part of the standard call settings.
You can pass them as an options argument:

```ts
const model = mistral('mistral-large-latest', {
safePrompt: true, // optional safety prompt injection
});
```

The following optional settings are available for Mistral models:

- **safePrompt** _boolean_

Whether to inject a safety prompt before all conversations.

Defaults to `false`.
0 comments on commit b3aa8ea