Retrieve Answers
This endpoint retrieves answers from your bot.
Body
Prompt to generate completions. If this parameter is omitted from your request, the instruction from Fini's UI setup page will be used instead. We recommend always providing it for faster responses, since it skips the database lookup of the stored instruction.
The user question for which the answer needs to be fetched.
A list of messages comprising the conversation so far.
A list of functions the model may generate JSON inputs for.
Controls how the model responds to function calls. “none” means the model does not call a function and responds to the end user. “auto” means the model can pick between responding to the end user or calling a function. Specifying a particular function via {"name": "my_function"} forces the model to call that function. “none” is the default when no functions are present; “auto” is the default if functions are present.
Whether to stream back partial progress.
What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. We recommend using 0.4 as the default.
A list of sequences at which the API will stop generating further tokens. The returned text will not contain the stop sequence. For example, it can be set to ["Optional", "stop", "words"].
List of categories that you want to use to categorize the Q&A pair.
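As a reference, below is a minimal request sketch in Python. The endpoint URL, authentication header, and the exact field names (instruction, question, messages, stream, temperature, stop, categories) are assumptions for illustration only; check your Fini dashboard and bot configuration for the actual values.

```python
import requests

# Hypothetical endpoint URL and API key; replace with the values from your
# Fini dashboard.
API_URL = "https://api.usefini.com/v2/bots/ask"  # assumed path
API_KEY = "YOUR_API_KEY"

payload = {
    # Assumed field names, mirroring the body parameters described above.
    "instruction": "You are a helpful support agent. Answer concisely.",
    "question": "How do I reset my password?",
    "messages": [
        {"role": "user", "content": "How do I reset my password?"}
    ],
    "stream": False,
    "temperature": 0.4,  # recommended default
    "stop": ["Optional", "stop", "words"],
    "categories": ["account", "security"],
}

response = requests.post(
    API_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```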
Response
Indicates whether the call was successful. 1 if successful, 0 if not.
The contents of the “based_on” field. “based_on” is a list of all data blocks from the knowledge base that were used to generate the answer.
A list of messages comprising the conversation so far.
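Continuing from the request sketch above, here is one way the response might be consumed. The top-level field names ("success", "based_on", "messages") are assumptions based on the descriptions in this section, not confirmed names.

```python
data = response.json()

# "success" is assumed to be the 1/0 status flag described above.
if data.get("success") == 1:
    # "based_on": data blocks from the knowledge base used to generate the answer.
    for block in data.get("based_on", []):
        print("source block:", block)

    # "messages": the conversation so far, including the newly generated answer.
    for message in data.get("messages", []):
        print(message.get("role"), ":", message.get("content"))
else:
    print("Call was not successful")
```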