
Using Local LLMs (Ollama)

Ollama lets you run large language models (LLMs) on your own machine, so your text never leaves your computer and translation keeps working offline.

Install Ollama

  1. Download and install from https://ollama.ai
  2. Start Ollama
  3. Pull a model (example):
```bash
ollama pull llama3
```
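Before configuring Supervertaler, it can help to confirm that the Ollama server is actually reachable and that your model was pulled. The sketch below queries Ollama's local HTTP API (`GET /api/tags` on the default port 11434, which lists installed models); the function name and error handling are illustrative, not part of Supervertaler.

```python
import json
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint

def list_local_models(base_url: str = OLLAMA_URL):
    """Return the names of locally pulled models, or None if the
    server is unreachable (e.g. Ollama is not running)."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None

models = list_local_models()
if models is None:
    print("Ollama is not running - start it and try again.")
else:
    print("Available models:", models)
```

If the script prints `None`/"not running", start Ollama and re-run it before pointing Supervertaler at the local provider.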

Use in Supervertaler

Once Ollama is installed and running, Supervertaler can use it as a provider.

INFO

Local models vary a lot in quality. For best results, test a few models on your typical content.
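One way to compare candidates on your typical content is to send the same sentence to each model through Ollama's `/api/generate` endpoint and inspect the translations side by side. This is a minimal sketch assuming the default local port; the model names, target language, and helper names are hypothetical examples, not Supervertaler internals.

```python
import json
import urllib.error
import urllib.request

OLLAMA_GENERATE = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_payload(model: str, text: str, target_lang: str = "Dutch") -> dict:
    """Build a non-streaming /api/generate request body for a translation prompt."""
    return {
        "model": model,
        "prompt": f"Translate into {target_lang}: {text}",
        "stream": False,  # ask for one complete JSON response instead of a stream
    }

def translate_with(model: str, text: str, target_lang: str = "Dutch"):
    """Send the prompt to a local Ollama model; return None if the server is down."""
    data = json.dumps(build_payload(model, text, target_lang)).encode()
    req = urllib.request.Request(
        OLLAMA_GENERATE, data=data, headers={"Content-Type": "application/json"}
    )
    try:
        with urllib.request.urlopen(req, timeout=120) as resp:
            return json.load(resp)["response"]
    except (urllib.error.URLError, OSError):
        return None

# Compare a few candidate models on the same sentence:
for model in ("llama3", "mistral"):
    result = translate_with(model, "The weather is nice today.")
    print(model, "->", result if result else "(Ollama not reachable)")
```

Swap in the models you have pulled and a sentence representative of your projects; differences in fluency and terminology usually show up quickly.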
