This section gives a quickstart example that uses Mistral as the open-source LLM and Sentence Transformers as the open-source embedding model. These models are free and run mostly on your local machine. We are using Mistral hosted on Hugging Face, so you will need a Hugging Face token to run this example. It's free and you can create one here.
import os

# replace this with your HF key
os.environ["HUGGINGFACE_ACCESS_TOKEN"] = "hf_xxxx"

from embedchain import App

app = App.from_config("mistral.yaml")
app.add("https://www.forbes.com/profile/elon-musk")
app.add("https://en.wikipedia.org/wiki/Elon_Musk")
app.query("What is the net worth of Elon Musk today?")
# Answer: The net worth of Elon Musk today is $258.7 billion.
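The mistral.yaml file referenced above tells Embedchain which LLM and embedder to load. Below is a minimal sketch of what such a config might contain; the huggingface provider name and the model identifiers mistralai/Mistral-7B-Instruct-v0.2 and sentence-transformers/all-mpnet-base-v2 are assumptions for illustration, so substitute the models you actually want to use.

mistral.yaml

llm:
  provider: huggingface
  config:
    model: 'mistralai/Mistral-7B-Instruct-v0.2'  # assumed model id; any Hugging Face-hosted Mistral variant should work
    top_p: 0.5

embedder:
  provider: huggingface
  config:
    model: 'sentence-transformers/all-mpnet-base-v2'  # assumed Sentence Transformers embedding model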
In this section, we will use both the LLM and the embedding model from OpenAI.
quickstart.py
import os

# replace this with your OpenAI key
os.environ["OPENAI_API_KEY"] = "sk-xxxx"

from embedchain import App

app = App()
app.add("https://www.forbes.com/profile/elon-musk")
app.add("https://en.wikipedia.org/wiki/Elon_Musk")
app.query("What is the net worth of Elon Musk today?")
# Answer: The net worth of Elon Musk today is $258.7 billion.