wuphysics87@lemmy.ml to Privacy@lemmy.ml · 1 month ago

**Can you trust locally run LLMs?**
I’ve been playing around with ollama. Given that you download the model, can you trust it isn’t sending telemetry?
foremanguy@lemmy.ml · 1 month ago

The only real way of checking is by inspecting the packets it sends and/or auditing the source code. This “problem” isn’t limited to local AI; it applies to open-source software in general.
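As a starting point for the packet-checking approach, here is a minimal sketch of how one might spot unexpected outbound connections on Linux by parsing the `/proc/<pid>/net/tcp` table of the ollama process. This is an illustrative assumption, not an official ollama tool: the file format is the standard Linux procfs one, the sample data below is made up, and a real audit would also cover UDP, IPv6, and DNS lookups (e.g. with `tcpdump` or Wireshark).

```python
# Hedged sketch: flag a process's outbound TCP connections by parsing the
# Linux /proc/<pid>/net/tcp table. Assumes Linux; sample data is hypothetical.

def hex_to_addr(hexaddr: str) -> str:
    """Decode procfs 'XXXXXXXX:PPPP' (little-endian IPv4 + hex port)."""
    ip_hex, port_hex = hexaddr.split(":")
    # /proc stores the IPv4 address in little-endian byte order
    octets = [str(int(ip_hex[i:i + 2], 16)) for i in (6, 4, 2, 0)]
    return ".".join(octets) + ":" + str(int(port_hex, 16))

def parse_proc_net_tcp(text: str) -> list[tuple[str, str, str]]:
    """Return (local, remote, state) tuples from /proc/net/tcp content."""
    conns = []
    for line in text.strip().splitlines()[1:]:  # skip the header line
        fields = line.split()
        conns.append((hex_to_addr(fields[1]), hex_to_addr(fields[2]), fields[3]))
    return conns

def suspicious(conns: list[tuple[str, str, str]]) -> list[tuple[str, str, str]]:
    """Keep connections whose remote end is neither loopback nor unspecified."""
    return [c for c in conns if not c[1].startswith(("127.", "0.0.0.0"))]

# Hypothetical table: a local listener on ollama's default port 11434,
# plus one established connection to an outside host.
sample = """  sl  local_address rem_address   st
   0: 0100007F:2CAA 00000000:0000 0A
   1: 0501A8C0:9C40 22D8B85D:01BB 01
"""

for local, remote, state in suspicious(parse_proc_net_tcp(sample)):
    print(f"outbound: {local} -> {remote} (state {state})")
```

With the sample above, only the second row is flagged, since the first is a purely local listener. In practice you would read `/proc/$(pgrep ollama)/net/tcp` instead of the hardcoded sample, and treat any unexplained remote address as something to investigate rather than proof of telemetry.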