4 points by appsoftware, 3 hours ago | 1 comment
  • appsoftware, 3 hours ago
    This is a walkthrough of my setup of local LLM capability on a Lenovo ThinkPad P1 Gen 4 (with an RTX A3000 graphics card, 6GB VRAM), using Ollama for CLI and VS Code Copilot chat access, and LM Studio as a GUI option.
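    Once Ollama is running, it exposes a local HTTP API (port 11434 by default) that can be scripted alongside the CLI. A minimal sketch, assuming a model such as `llama3.2` has already been pulled with `ollama pull` (the model name here is illustrative):

    ```python
    import json
    import urllib.request

    # Ollama's default local endpoint for one-shot (non-chat) generation.
    OLLAMA_URL = "http://localhost:11434/api/generate"

    def build_request(model: str, prompt: str) -> urllib.request.Request:
        """Build a non-streaming generate request for the local Ollama server."""
        payload = json.dumps(
            {"model": model, "prompt": prompt, "stream": False}
        ).encode()
        return urllib.request.Request(
            OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
        )

    def ask(model: str, prompt: str) -> str:
        """Send the prompt and return the model's full response text."""
        with urllib.request.urlopen(build_request(model, prompt)) as resp:
            return json.loads(resp.read())["response"]

    if __name__ == "__main__":
        # Requires a running `ollama serve` and a pulled model.
        print(ask("llama3.2", "Why is the sky blue?"))
    ```

    LM Studio offers a similar OpenAI-compatible local server, so the same pattern applies there with a different endpoint.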

    My Lenovo ThinkPad P1 Gen 4 is coming up on 4 years old. It is a powerful workstation with a good, though by no means state-of-the-art, GPU in the RTX A3000. My expectation is that many developers will have a PC capable of running local LLMs as I have set them up here.
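    Whether a given model fits in 6GB VRAM comes down to rough arithmetic: weight memory is roughly parameter count times bits per weight, plus some overhead for the KV cache and activations. A back-of-the-envelope sketch (the 4.5 bits/weight figure for a Q4-style quantization and the 1GB overhead are assumptions, not measurements):

    ```python
    def model_vram_gb(params_b: float, bits_per_weight: float,
                      overhead_gb: float = 1.0) -> float:
        """Rough VRAM estimate: weights plus a fixed overhead for
        KV cache and activations. params_b is parameters in billions."""
        weights_gb = params_b * bits_per_weight / 8  # billions of params -> GB
        return weights_gb + overhead_gb

    # A 7B model at ~4.5 bits/weight (Q4-style quantization) fits in 6GB:
    print(round(model_vram_gb(7, 4.5), 1))   # ~4.9 GB
    # The same model at FP16 (16 bits/weight) does not:
    print(round(model_vram_gb(7, 16), 1))    # ~15.0 GB
    ```

    This is why 4-bit quantized 7B-class models are the sweet spot for a 6GB card, while larger or less aggressively quantized models spill into system RAM and slow down sharply.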

    See the GitHub repository for the full walkthrough:

    https://github.com/gbro3n/local-ai/blob/main/docs/local-llm-...

    Ref: https://www.appsoftware.com/blog/local-llm-setup-on-windows-...