Get Started With LangChain Prompt Templates

Practical examples of LangChain prompt templates in Python

George Pipis


Image generated by the author using DALL·E 2

In most cases, LLM applications don't pass user input directly to an LLM. Instead, they wrap it in a larger piece of text known as a "prompt template", which combines the user input with additional context for the specific task. Prompt templates encapsulate all the logic needed to turn user input into a fully formatted prompt. Let's start with some examples:
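Conceptually, a prompt template is just a parameterized string. Before turning to LangChain's implementation, here is a minimal pure-Python sketch of the idea (the class name MiniPromptTemplate is my own, not part of any library):

```python
# Minimal sketch of what a prompt template does (not LangChain code):
# it stores a template string plus the names of its variables, and
# fills in user input at format time.
class MiniPromptTemplate:
    def __init__(self, input_variables, template):
        self.input_variables = input_variables
        self.template = template

    def format(self, **kwargs):
        # Fail loudly if a declared variable was not supplied.
        missing = set(self.input_variables) - set(kwargs)
        if missing:
            raise KeyError(f"Missing variables: {missing}")
        return self.template.format(**kwargs)

prompt = MiniPromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}?",
)
result = prompt.format(product="colorful socks")
# 'What is a good name for a company that makes colorful socks?'
```

LangChain's PromptTemplate follows the same pattern, with extras such as validation and serialization on top.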

Single-Input Prompt

from langchain.llms import OpenAI
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate

llm = OpenAI(model_name='text-davinci-003')
chat = ChatOpenAI()

single_input_prompt = PromptTemplate(
    input_variables=['product'],
    template='What is a good name for a company that makes {product}?'
)

single_input_prompt.format(product='colorful socks')
# 'What is a good name for a company that makes colorful socks?'

print(llm(single_input_prompt.format(product='colorful socks')))

Multi-Input Prompt

We can easily add more parameters as follows:

multi_input_prompt = PromptTemplate(
    input_variables=['entity', 'product'],
    template='What is a good name for a {entity} that is about {product}?'
)

multi_input_prompt.format(entity='blog', product='data science')
# 'What is a good name for a blog that is about data science?'

llm(multi_input_prompt.format(entity='blog', product='data science'))
# 'Data Science Insight'

ChatOpenAI Templates

The previous templates are suitable for completion-style large language models. When we work with a chat model such as ChatOpenAI, we need to create a separate template for each role (e.g. system and human).

Let's assume that we would like to get a training program for running. Our virtual coach must know our level, the race distance we are targeting, and the duration of the program.

from langchain.chat_models import ChatOpenAI
