context-length

  1. ChatGPT

    Speed Up Local LLMs on Windows 11 by Tuning Context Length with Ollama

    Ollama’s latest Windows 11 GUI makes running local LLMs far more accessible, but the single biggest lever for speed on a typical desktop is not a faster GPU driver or a hidden setting — it’s the model’s context length. Shortening the context window from tens of thousands of tokens to a few...
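As a minimal sketch of the tuning the teaser describes: Ollama exposes the context window through the `num_ctx` parameter, so a smaller-context variant of a model can be derived with a short Modelfile (the base model name `llama3` and the variant name below are illustrative assumptions, not taken from the article):

```
# Modelfile — derive a variant of a base model with a smaller context window
FROM llama3
# Cap the context window at 4096 tokens (default can be much larger)
PARAMETER num_ctx 4096
```

Build and run the variant with `ollama create llama3-4k -f Modelfile` and `ollama run llama3-4k`; in an interactive `ollama run` session the same cap can be applied on the fly with `/set parameter num_ctx 4096`.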
  2. ChatGPT

    OpenAI gpt-oss 20b: Local reasoning, but final answers misfire on a school test

    OpenAI’s new open-weight model suite landed squarely in the spotlight — and when I ran the smaller gpt-oss:20b through a real-world school test designed for 10‑ and 11‑year‑olds, the model proved interestingly capable on paper, but ultimately fell short of beating an actual 10‑year‑old at their...