# Chat

Chat node
The LLM: Chat node lets you use conversation-tuned large language models like GPT-4.

Using the model selection, you can choose different models (and versions) from different providers.

Additional model parameters are available, depending on the model you choose.

# Inputs

| Name | Type | Description |
| --- | --- | --- |
| prompt | text | The prompt sent to the LLM as the last message. |
| conversation | struct | The full conversation to be sent: a list of chat messages. |
| model | text | The model to use, e.g. one of the OpenAI models. |
| temperature | number | The temperature parameter. |
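To make the input shapes concrete, here is an illustrative sketch of the inputs as plain data. The exact field names inside a chat message are an assumption here; conversations are commonly represented as a list of role/content messages, with `prompt` sent as the final message.

```python
# Illustrative shape of the Chat node's inputs (field names inside the
# messages are an assumption, following the common role/content format).
inputs = {
    "prompt": "Summarize the conversation so far.",      # text
    "conversation": [                                    # struct: list of chat messages
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hi, I need help planning a trip."},
        {"role": "assistant", "content": "Sure, where would you like to go?"},
    ],
    "model": "gpt-4",                                    # text
    "temperature": 0.7,                                  # number
}
```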

# Outputs

| Name | Type | Description |
| --- | --- | --- |
| text | text | The content of the model's last response. |
| response | struct | The full vendor response, useful for inspecting e.g. the tokens used. |
| conversation | struct | The full conversation: the input conversation with the last response appended. |
| error | struct | The error response from the model vendor, if an error occurred. |
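The sketch below shows roughly how these outputs could relate to an OpenAI-style vendor response; this assumes the node wraps a chat-completions style API, which is not guaranteed by the docs.

```python
# Rough mapping from an OpenAI-style vendor response to the node outputs
# (assumption: a chat-completions style API backs the node).
from openai import OpenAI

client = OpenAI()
conversation = [{"role": "user", "content": "Say hello."}]

response = client.chat.completions.create(model="gpt-4", messages=conversation)

text = response.choices[0].message.content    # -> `text` output
tokens_used = response.usage.total_tokens     # inspectable via the `response` struct
new_conversation = conversation + [           # -> `conversation` output
    {"role": "assistant", "content": text},
]
```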

# Usage examples

The most basic LLM workflow: take some user input, build a prompt, and send it to an LLM. The workflow returns the last LLM response.

LLM chain
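As a rough equivalent in plain code, this is what the basic workflow does end to end. It assumes an OpenAI-style client; in practice the workflow wires nodes together rather than calling the API directly.

```python
# Minimal sketch of the basic workflow: user input -> prompt -> LLM -> last response.
from openai import OpenAI

client = OpenAI()

def answer(user_input: str) -> str:
    # Build a prompt from the user input.
    prompt = f"Answer the following question concisely:\n{user_input}"
    # Send it to the model and return the content of the last response.
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(answer("What is a chat node?"))
```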

A more advanced example combines conversation and memory nodes to remember user conversations. It takes an additional userId input and uses it to look up and store conversations in our Key/Value store.

Remember conversation
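The memory pattern itself is simple, as this sketch shows: look up the stored conversation by userId, append the new exchange, and store it back. A plain dict stands in for the platform's Key/Value store, and the function and variable names are illustrative.

```python
# Sketch of the conversation-memory pattern keyed by userId.
# A dict stands in for the platform's Key/Value store.
from openai import OpenAI

client = OpenAI()
kv_store: dict[str, list[dict]] = {}   # stand-in for the Key/Value store

def chat_with_memory(user_id: str, prompt: str) -> str:
    # Look up any previous conversation for this user.
    conversation = kv_store.get(user_id, [])
    response = client.chat.completions.create(
        model="gpt-4",
        messages=conversation + [{"role": "user", "content": prompt}],
    )
    reply = response.choices[0].message.content
    # Store the conversation back, with the new exchange appended.
    kv_store[user_id] = conversation + [
        {"role": "user", "content": prompt},
        {"role": "assistant", "content": reply},
    ]
    return reply
```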