Zed Adds Support for Ollama

Added an Ollama Provider for the assistant. If you have Ollama running locally on your machine, you can enable it in your settings under:

"assistant": {
    "version": "1",
    "provider": {
      "name": "ollama",
      // Recommended setting to allow for model startup
      "low_speed_timeout_in_seconds": 30,
    }
}

If you haven’t heard of Zed, it’s a new multiplayer editor from Nathan Sobo and Max Brunsfeld. This time around they have taken a decade of experience and lessons learned from building Atom, Electron, and tree-sitter, and written the editor in Rust. It is blazing fast and tailored for developers, with native language server support.

I have been using Zed every day for the past few months, and I am excited to see the new Ollama provider, which allows local inference as an alternative to ChatGPT. The Zed team has been shipping improvements at a rapid pace, and it feels like the perfect tool for developers and platform engineers collaborating on code and infrastructure together.

Wednesday, 19 June 2024