• 0 Posts
  • 8 Comments
Joined 1 year ago
Cake day: March 1st, 2025




  • It's all local. Ollama is the application; DeepSeek, Llama, Qwen, and the rest are just model weights. The models aren't executables, and they don't ping external services. The models are safe. Ollama itself is meant for hosting models locally, and I don't believe it even has the capability to do anything besides run local models.

    Where it gets more complicated is "agentic" assistants, which can read files or execute things in the terminal. The most advanced code assistants do this. But this is NOT a function of Ollama or the model; it's a function of the chat UI or code editor plugin that glues the model's output together with a web search, filesystem, terminal session, etc.

    So in short, Ollama just runs models. It's all local and private, no worries.
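    To illustrate the "weights are data, not programs" point: a GGUF file (the local-model format Ollama uses) begins with the magic bytes `GGUF`, while a native Linux executable begins with ELF's `\x7fELF`. A minimal sketch — the sample headers below are fabricated for illustration, not read from real files:

    ```python
    # Sketch: tell a model-weight blob apart from an executable by its
    # leading magic bytes. GGUF weights are inert data either way; only
    # an application like Ollama gives them meaning.

    def classify_header(header: bytes) -> str:
        if header.startswith(b"GGUF"):
            return "GGUF model weights (data, not a program)"
        if header.startswith(b"\x7fELF"):
            return "ELF executable"
        return "unknown"

    # Fabricated example headers, just for illustration:
    print(classify_header(b"GGUF" + bytes(4)))    # model weights
    print(classify_header(b"\x7fELF" + bytes(4))) # executable
    ```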
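    The "agentic" glue is worth making concrete: the model only emits text, and it's the surrounding harness (chat UI or editor plugin) that parses that text and decides what, if anything, actually touches your filesystem or terminal. A hypothetical sketch — `fake_model`, the tool names, and the JSON shape are invented for illustration, not any real assistant's API:

    ```python
    import json

    # Hypothetical sketch of an "agentic" harness. The model itself only
    # produces text; the harness below is what actually executes tools.

    def fake_model(prompt: str) -> str:
        # Stand-in for a local LLM call; returns a tool request as JSON text.
        return json.dumps({"tool": "read_file", "arg": "notes.txt"})

    TOOLS = {
        # The harness, not the model, defines what is allowed to run.
        "read_file": lambda path: f"(contents of {path})",
    }

    def run_agent_step(prompt: str) -> str:
        request = json.loads(fake_model(prompt))  # model output is just text
        tool = TOOLS.get(request["tool"])
        if tool is None:
            return "refused: unknown tool"        # harness enforces the policy
        return tool(request["arg"])

    print(run_agent_step("summarize my notes"))
    ```

    Swap out `fake_model` for a bare model runner like Ollama and nothing happens at all: without this kind of glue, the text just sits there.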




  • rutrum@programming.dev to Linux@lemmy.ml · How do you backup?
    1 year ago

    I use Borg the same way you describe. Part of my NixOS config builds a systemd unit that starts a backup of various directories on my machine at midnight every day. I have two repos: one stored both locally and on a cloud backup provider (BorgBase), and another that's stored only locally — that is, on another computer in my house. The local-only repo is for all my home media. I haven't yet put that large dataset of photos and videos on the cloud or offsite.
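    For reference, the kind of systemd unit pair such a config might generate looks roughly like this. This is a hand-written sketch, not output from my NixOS config; the paths, repo URL, and directory list are placeholders:

    ```ini
    # borg-backup.service (sketch; paths and repo URL are hypothetical)
    [Unit]
    Description=Nightly Borg backup

    [Service]
    Type=oneshot
    ExecStart=/run/current-system/sw/bin/borg create \
        ssh://user@example.repo.borgbase.com/./repo::{hostname}-{now} \
        /home/user/documents /home/user/projects

    # borg-backup.timer (sketch)
    [Unit]
    Description=Run borg-backup.service at midnight

    [Timer]
    OnCalendar=*-*-* 00:00:00
    Persistent=true

    [Install]
    WantedBy=timers.target
    ```

    `Persistent=true` is the useful bit for a desktop machine: if it was off at midnight, the backup runs at next boot instead of being skipped.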