type-safe-prompt: A lightweight library for type-safe variable injection in prompts
I couldn't find a library that did exactly this, so I quickly created and published one. I'd like to introduce it here.
What I Wanted to Do
When interacting with APIs like ChatGPT from TypeScript, you often want to abstract things by creating prompt templates and embedding variables into them.
For example, in a prompt to translate arbitrary text into English:
```ts
const myPrompt = `
あなたは優れた翻訳者です。
以下のテキストを自然な英語に直してください。
\`\`\`
{{originalText}}
\`\`\`
`
```
The `originalText` part is the variable to embed.
If you try to abstract this straightforwardly with a function:
```ts
const myPrompt = (vars: { [K in "originalText"]: string }) => {
  return `
あなたは優れた翻訳者です。
以下のテキストを自然な英語に直してください。
\`\`\`
${vars.originalText}
\`\`\`
`
}
```
It looks like this.
It's not bad, but the indentation gets messed up and becomes hard to read, and keeping track of the variables (and writing a function every time) is a hassle.
Alternatively, if you're using LangChain, there's a way to use PromptTemplate:
```ts
const myPromptText = `
あなたは優れた翻訳者です。
以下のテキストを自然な英語に直してください。
\`\`\`
{originalText}
\`\`\`
`

const myPrompt = new PromptTemplate({
  inputVariables: ["originalText"],
  template: myPromptText,
})

// Apply
myPrompt.invoke({
  originalText: "こんにちは!",
})
```
However, the lack of type checking was a bit disappointing, and unless you're already using LangChain for other purposes, it felt like overkill just for this.
This isn't directly related to the main point of the article, but personally, I haven't found much advantage in abstracting with LangChain compared to using language-level features. In fact, it feels a bit counter-intuitive to me, so I'm not very keen on adopting it.
So I Made This
You can install it from npm.
```sh
$ pnpm add type-safe-prompt
```
The usage is simple: just pass the template defined as a string to the function provided by the library.
```ts
import { fillPrompt } from "type-safe-prompt"

const promptTemplate = `
あなたは優れた翻訳者です。
以下のテキストを自然な英語に直してください。
\`\`\`
{{originalText}}
\`\`\`
`

const prompt = fillPrompt(promptTemplate, {
  originalText: "こんにちは!",
})
```
fillPrompt recursively extracts patterns like {{varName}} from the type of promptTemplate and requires the necessary values as typed arguments in the second parameter.
So, for example, if you make a mistake in the variable name like this:
```ts
const prompt = fillPrompt(promptTemplate, {
  originalTextMissed: "こんにちは!",
})
```
It will result in a type error.

Also, the return value is typed, so it's convenient to see what kind of prompt will be resolved without even running it.
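To get an intuition for how this kind of extraction can work, here is a simplified sketch using TypeScript's template literal types. This is not type-safe-prompt's actual implementation (for one thing, the real library also types the return value as the resolved string literal); `ExtractVars` and `fill` are hypothetical names for illustration.

```ts
// A simplified sketch (NOT the library's internals) of extracting
// `{{varName}}` placeholders at the type level.
type ExtractVars<T extends string> =
  T extends `${string}{{${infer Var}}}${infer Rest}`
    ? Var | ExtractVars<Rest>
    : never

function fill<T extends string>(
  template: T,
  vars: { [K in ExtractVars<T>]: string },
): string {
  // Runtime counterpart: substitute each {{name}} with its value
  return template.replace(/\{\{(\w+)\}\}/g, (_match, name: string) =>
    (vars as unknown as Record<string, string>)[name],
  )
}

const template = "Translate into English:\n{{originalText}}"
const result = fill(template, { originalText: "こんにちは!" })
// Passing { originalTextMissed: "..." } instead would be a type error
```

Because `ExtractVars` recurses on the rest of the string, templates with multiple placeholders produce a union of required keys, which is what makes the typo check possible.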

If you're using pnpm, tsup + release-it was great for release management
Publishing an npm library properly is actually quite a hassle:
- Dual-package support, since we're in the transition period from CommonJS to ES Modules
  - You need to fill in the `exports` field and build separately for ESM and CJS
- Incrementing `package.json#version`
- Writing a CHANGELOG
- Creating tags and GitHub releases
- Publishing to npm
Handling these requirements is quite a lot of work.
However, using tsup + release-it took care of most of these headaches and made it very easy.
First, by using tsup:
- Transpiling TypeScript files
- Bundling
  - Since it's bundled, you only need to specify one entry in the `exports` field, which is convenient
- Generating type definitions

can all be executed with a single command.
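For reference, a tsup setup for this kind of dual build can be as small as the following. The options shown are illustrative, and the entry path is an assumption:

```ts
// tsup.config.ts — illustrative config for a dual CJS/ESM build
import { defineConfig } from "tsup"

export default defineConfig({
  entry: ["src/index.ts"], // assumed entry point
  format: ["cjs", "esm"],  // build both module formats
  dts: true,               // emit .d.ts type definitions
  clean: true,             // clear the output dir before each build
})
```

By default tsup emits `dist/index.js` (CJS), `dist/index.mjs` (ESM), and `dist/index.d.ts`, so the `exports` field only needs to point at those files.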
And while there are several tools to handle release-related tasks:
- sindresorhus/np targets npm and doesn't support pnpm
- semantic-release/semantic-release (great if you want to do things properly) felt like a high hurdle for me personally, because I couldn't picture managing versions strictly through commit messages alone

I didn't know of a tool that was "just right", but with release-it, you can interactively:
- Update the version
- Update the CHANGELOG
- Create tags & release commits
- Publish
- Create a GitHub Release
handle everything smoothly, making it very user-friendly.
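As a reference, a minimal `.release-it.json` along these lines might look like this. The options are illustrative, and the CHANGELOG plugin shown is one of several available:

```json
{
  "git": {
    "commitMessage": "chore: release v${version}"
  },
  "github": {
    "release": true
  },
  "npm": {
    "publish": true
  },
  "plugins": {
    "@release-it/keep-a-changelog": {
      "filename": "CHANGELOG.md"
    }
  }
}
```

With a config like this, running `release-it` walks you through the version bump, CHANGELOG update, tagging, npm publish, and GitHub release in one interactive session.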
It felt like a great setup to adopt whenever making a small library to publish easily.
Conclusion
I've introduced a library I made for easily embedding variables when writing LLM prompts.
It's very lightweight and easy to add or remove, so please give it a try if you're interested.