I’ve been playing around with Ollama. Given that you download the model yourself, can you trust it isn’t sending telemetry?

  • Jack@slrpnk.net · 1 month ago

    Can’t you run it from a container? I guess that will slow it down, but it will deny access to your files.
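    For what it’s worth, a sandboxed run might look something like this. This is only a sketch, assuming Docker and the official ollama/ollama image: model data lives in a named volume rather than a bind mount, so the host filesystem stays out of reach.

    ```shell
    # Sketch: run Ollama in a container so it can't touch host files.
    # Models go in a named Docker volume, not a directory from your home.
    docker run -d --name ollama \
      -v ollama-models:/root/.ollama \
      -p 127.0.0.1:11434:11434 \
      ollama/ollama

    # Once the models you want are pulled, you could go further and
    # restart with networking disabled entirely, so nothing can phone home:
    #   docker run -d --name ollama-offline --network none \
    #     -v ollama-models:/root/.ollama ollama/ollama
    #   docker exec -it ollama-offline ollama run llama3
    ```

    The `--network none` variant means you have to interact via `docker exec` instead of the HTTP API, but it rules out telemetry by construction.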

    • marcie (she/her)@lemmy.ml · 1 month ago

      yeah you could. though i don’t see any evidence that the big open source llm frontends like jan.ai or ollama are doing anything wrong with their programs or files. chucking it in a sandbox would solve the problem for good though