2 Comments
Thomas:

Did you run into problems using Ollama because of the prompt length?

Stephen Turner:

Not with these small examples, but context lengths are getting longer for many open models these days.
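For longer prompts, Ollama also lets you raise the per-request context window with the `num_ctx` option on `/api/generate`. A minimal sketch of building such a request payload (the model name and prompt below are placeholders, not from the discussion):

```python
import json

def build_ollama_request(model: str, prompt: str, num_ctx: int = 8192) -> dict:
    """Return a JSON-serializable payload for Ollama's POST /api/generate."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        # num_ctx sets the context window (in tokens) for this request,
        # overriding the model's default.
        "options": {"num_ctx": num_ctx},
    }

# Placeholder model/prompt for illustration only.
payload = build_ollama_request("llama3.2", "Summarize this document...", num_ctx=16384)
print(json.dumps(payload))
```

Whether a given model actually uses the larger window depends on what context length it was trained with.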