How to Save and Load Prompt Templates with LangChain

Did you know that you can save and load your prompts?

George Pipis
2 min read · Aug 3, 2023

In a previous tutorial, we showed you how to work with LangChain Prompt Templates. Prompt templates allow us to automate tasks and work much more efficiently. The good news is that you can save your templates as JSON objects, so you can reuse them later or share them with others. In this tutorial, we will show you how to save and load prompt templates. First, let’s start with a simple prompt template:

from langchain.llms import OpenAI
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate

# Set up a completion model and a chat model
# (the chat model is not needed for this example)
llm = OpenAI(model_name='text-davinci-003')
chat = ChatOpenAI()

# A prompt template with a single input variable, 'product'
single_input_prompt = PromptTemplate(
    input_variables=['product'],
    template='What is a good name for a company that makes {product}?'
)

We can inspect the template by printing it:

single_input_prompt

Output:

PromptTemplate(input_variables=['product'], output_parser=None, partial_variables={}, template='What is a good name for a company that makes {product}?', template_format='f-string', validate_template=True)
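Before saving the template, we can check that it works by filling in the input variable with format() ('colorful socks' below is just an illustrative value):

# Fill in the single input variable of the template
single_input_prompt.format(product='colorful socks')

Output:

'What is a good name for a company that makes colorful socks?'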

Save the Prompt Template
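
Prompt templates come with a save() method that serializes the template to a file; the file extension determines the format, so a .json path gives us a JSON object. A minimal sketch, assuming we store the template as prompt_template.json in the working directory:

# Serialize the prompt template to disk as JSON
# (a .yaml extension would save it as YAML instead)
single_input_prompt.save('prompt_template.json')

Load the Prompt Template

To reuse the template later, or after someone has shared it with us, LangChain provides a load_prompt helper. A minimal sketch, assuming the prompt_template.json file created above:

from langchain.prompts import load_prompt

# Recreate the prompt template from the saved JSON file
loaded_prompt = load_prompt('prompt_template.json')

# The loaded template behaves exactly like the original one
loaded_prompt.format(product='colorful socks')

The loaded object is a regular PromptTemplate, so it can be passed to an LLM or a chain exactly like the one we defined at the start.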
