LINQPad + Ollama

After a good long while, I have updated my local installation of LINQPad, a .NET programmer's scratchpad application that allows for quick computation and data analysis.

Much to my bemusement, the application now integrates LLM providers like OpenAI and Anthropic. However, LINQPad also supports the use of a custom endpoint, meaning you can roll your own Ollama setup and waste your own power instead of paying to waste someone else’s.

To configure Ollama, open the AI Settings and add a new AI provider.

When prompted to select the provider type, select “Custom OpenAI”.

Give the endpoint a proper name (e.g. Ollama) and set your endpoint to:

http://localhost:11434/v1/chat/completions/
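Before wiring the endpoint into LINQPad, it can be worth confirming that Ollama actually answers on it. A minimal smoke test from the command line might look like this (a sketch assuming Ollama is running on its default port and that a model named qwen3 has already been pulled):

```shell
# Send a minimal chat completion request to Ollama's OpenAI-compatible API.
# Assumes Ollama is running locally and the qwen3 model is installed.
curl -s http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "qwen3",
        "messages": [
          { "role": "user", "content": "Say hello in five words or fewer." }
        ]
      }'
```

If Ollama is up, the response is a standard OpenAI-style JSON object containing a choices array with the model's reply.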

Uncheck the existing OpenAI and Anthropic models and add your own local ones that you have installed with ollama pull <modelname>.
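For reference, pulling a model and checking which ones are available locally looks like this (assuming the ollama CLI is on your PATH; qwen3 is simply the model I happened to use):

```shell
# Download a model from the Ollama library for local use.
ollama pull qwen3

# List locally installed models; these names are what you add in LINQPad.
ollama list
```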

I found qwen3 to be a somewhat competent test model.

Using this setup, I was able to play around with the LLM integration in LINQPad. Unfortunately, as of version 9.6.6, there is no way to completely hide all traces of the AI stuff. If you are not interested in using these features, be it for monetary or compliance reasons, or out of conviction, you will constantly be reminded that they exist.

Published by tsukasa, the fool's herald.
