Overview

Embedchain supports several embedding models from the following providers:

OpenAI

To use the OpenAI embedding function, you have to set the OPENAI_API_KEY environment variable. You can obtain the OpenAI API key from the OpenAI Platform.

Once you have obtained the key, you can use it like this:
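
A minimal sketch of the typical usage, assuming the openai embedder provider key and an App.from_config call that loads a YAML config file; the API key value is a placeholder:

```python
import os

from embedchain import App

os.environ["OPENAI_API_KEY"] = "sk-xxx"  # placeholder, set your real key

# Select the OpenAI embedder through a YAML config file.
config_yaml = """
embedder:
  provider: openai
"""
with open("config.yaml", "w") as f:
    f.write(config_yaml)

app = App.from_config(config_path="config.yaml")
app.add("https://en.wikipedia.org/wiki/OpenAI")
print(app.query("What is OpenAI?"))
```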

  • OpenAI announced two new embedding models: text-embedding-3-small and text-embedding-3-large. Embedchain supports both of these models. Below you can find the YAML config for both:
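
A sketch of both configs, assuming the model is selected through the embedder.config.model key; the YAML is embedded as Python strings here so it can be written out and loaded the same way as above (OPENAI_API_KEY must already be set):

```python
from embedchain import App

# Assumed config for text-embedding-3-small.
config_small = """
embedder:
  provider: openai
  config:
    model: text-embedding-3-small
"""

# Assumed config for text-embedding-3-large.
config_large = """
embedder:
  provider: openai
  config:
    model: text-embedding-3-large
"""

with open("config.yaml", "w") as f:
    f.write(config_small)  # swap in config_large to use the larger model

app = App.from_config(config_path="config.yaml")
```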

Google AI

To use the Google AI embedding function, you have to set the GOOGLE_API_KEY environment variable. You can obtain the Google API key from the Google Maker Suite.
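
A hedged sketch, assuming the google provider key and the models/embedding-001 model name; the key value is a placeholder and the LLM is configured separately:

```python
import os

from embedchain import App

os.environ["GOOGLE_API_KEY"] = "xxx"  # placeholder

# Assumed config: 'google' embedder provider with Google's embedding-001 model.
# This only selects the embedding model; the LLM is configured separately.
config_yaml = """
embedder:
  provider: google
  config:
    model: models/embedding-001
"""
with open("google.yaml", "w") as f:
    f.write(config_yaml)

app = App.from_config(config_path="google.yaml")
```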


For more details regarding the Google AI embedding model, please refer to the Google AI documentation.

Azure OpenAI

To use the Azure OpenAI embedding model, you have to set some of the Azure OpenAI related environment variables, as shown in the code block below:
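
A sketch under these assumptions: the azure_openai provider key, the standard Azure OpenAI environment variable names shown, and a hypothetical deployment name; all values are placeholders:

```python
import os

from embedchain import App

# Azure OpenAI environment variables (values are placeholders).
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["AZURE_OPENAI_ENDPOINT"] = "https://<your-resource>.openai.azure.com/"
os.environ["AZURE_OPENAI_API_KEY"] = "xxx"
os.environ["OPENAI_API_VERSION"] = "2023-05-15"

# Assumed config: 'azure_openai' provider; deployment_name must match the
# embedding model deployment you created on the Azure OpenAI Platform.
config_yaml = """
embedder:
  provider: azure_openai
  config:
    model: text-embedding-ada-002
    deployment_name: my-embedding-deployment
"""
with open("azure_openai.yaml", "w") as f:
    f.write(config_yaml)

app = App.from_config(config_path="azure_openai.yaml")
```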

You can find the list of models and deployment names on the Azure OpenAI Platform.

GPT4ALL

GPT4All supports generating high-quality embeddings of arbitrary-length text documents using a CPU-optimized, contrastively trained Sentence Transformer.
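
A sketch assuming the gpt4all provider key; the embedding model runs locally on CPU, so no API key is needed for the embedder itself:

```python
from embedchain import App

# Assumed config: local GPT4All embedder (CPU-only, no API key for embeddings).
# Note: the LLM is configured separately and may still require its own key.
config_yaml = """
embedder:
  provider: gpt4all
"""
with open("gpt4all.yaml", "w") as f:
    f.write(config_yaml)

app = App.from_config(config_path="gpt4all.yaml")
```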

Hugging Face

Hugging Face supports generating embeddings of arbitrary-length text documents using the Sentence Transformers library. An example of how to generate embeddings with Hugging Face is given below:
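
A sketch assuming the huggingface provider key, with sentence-transformers/all-mpnet-base-v2 as an illustrative model choice:

```python
from embedchain import App

# Assumed config: 'huggingface' provider backed by a Sentence Transformers model.
config_yaml = """
embedder:
  provider: huggingface
  config:
    model: sentence-transformers/all-mpnet-base-v2
"""
with open("huggingface.yaml", "w") as f:
    f.write(config_yaml)

app = App.from_config(config_path="huggingface.yaml")
```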

Vertex AI

Embedchain supports Google’s Vertex AI embedding models through a simple interface. You just have to pass the model_name in the YAML config and it works out of the box.
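
A sketch assuming the vertexai provider key and textembedding-gecko as the model; the exact config key for the model name may differ by version, and Google Cloud application-default credentials are assumed to be configured:

```python
from embedchain import App

# Assumed config: 'vertexai' provider with a Vertex AI text embedding model.
# Requires Google Cloud credentials (e.g. `gcloud auth application-default login`).
config_yaml = """
embedder:
  provider: vertexai
  config:
    model: textembedding-gecko
"""
with open("vertexai.yaml", "w") as f:
    f.write(config_yaml)

app = App.from_config(config_path="vertexai.yaml")
```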