💬 chat
The `chat()` method allows you to chat over your data sources using a user-friendly chat API. You can find the signature below:
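A sketch of the signature, reconstructed from the parameter descriptions below; apart from `session_id` and `citations`, the parameter names, order, and types here are assumptions rather than the exact signature:

```python
from typing import Optional, Tuple, Union

def chat(
    input_query: str,                          # question to ask
    config: Optional["BaseLlmConfig"] = None,  # LLM settings (prompt, temperature, ...)
    dry_run: bool = False,                     # test the prompt without LLM inference
    where: Optional[dict] = None,              # metadata filters for vector-db chunks
    session_id: str = "default",               # keeps chat history per session
    citations: bool = False,                   # also return source citations
) -> Union[str, Tuple[str, list]]:
    ...
```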
Parameters
- `input_query`: the question to ask.
- `config`: configure different LLM settings such as prompt, temperature, number_documents, etc.
- `dry_run`: test the prompt structure without actually running LLM inference. Defaults to `False`.
- `where`: a dictionary of key-value pairs to filter the chunks from the vector database. Defaults to `None`.
- `session_id`: session ID of the chat. This can be used to maintain chat history across different user sessions. Default value: `"default"`.
- `citations`: return citations along with the LLM answer. Defaults to `False`.
Returns
- If `citations=False`, returns a stringified answer to the question asked.
- If `citations=True`, returns a tuple with the answer and the citations, respectively.
Usage
With citations
If you want to get the answer to a question and return both the answer and the citations, use the following code snippet:
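A minimal sketch, assuming the embedchain `App` API; the data source URL and question are illustrative:

```python
from embedchain import App

app = App()
app.add("https://www.forbes.com/profile/elon-musk")  # illustrative source

# With citations=True, chat() returns an (answer, sources) tuple
answer, sources = app.chat("What is the net worth of Elon Musk?", citations=True)
print(answer)
for chunk, metadata in sources:
    print(metadata["url"], metadata["score"])
```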
When `citations=True`, note that the returned `sources` are a list of tuples where each tuple has two elements (in the following order):
- source chunk
- dictionary with metadata about the source chunk
  - `url`: url of the source
  - `doc_id`: document id (used for bookkeeping purposes)
  - `score`: score of the source chunk with respect to the question
  - other metadata you might have added at the time of adding the source
Without citations
If you just want to return answers and don't want to return citations, you can use the following example:
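Again a minimal sketch under the same embedchain `App` assumption:

```python
from embedchain import App

app = App()
app.add("https://www.forbes.com/profile/elon-musk")

# With the default citations=False, chat() returns just the answer string
answer = app.chat("What is the net worth of Elon Musk?")
print(answer)
```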
With session id
If you want to maintain chat sessions for different users, you can simply pass the `session_id` keyword argument. See the example below:
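A sketch under the same assumptions; the session IDs here are arbitrary labels:

```python
from embedchain import App

app = App()
app.add("https://www.forbes.com/profile/elon-musk")

# Each session_id keeps its own chat history
app.chat("What is the net worth of Elon Musk?", session_id="user_1")
app.chat("What did I just ask you?", session_id="user_1")  # sees user_1's history

app.chat("What did I just ask you?", session_id="user_2")  # fresh history for user_2
```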
With custom context window
If you want to customize the context window used during chat (the default context window is 3 document chunks), you can do so using the following code snippet:
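A sketch assuming the config object is embedchain's `BaseLlmConfig` and that its `number_documents` option controls how many chunks are retrieved:

```python
from embedchain import App
from embedchain.config import BaseLlmConfig

app = App()
app.add("https://www.forbes.com/profile/elon-musk")

# Retrieve 5 document chunks instead of the default 3
config = BaseLlmConfig(number_documents=5)
answer = app.chat("What is the net worth of Elon Musk?", config=config)
print(answer)
```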