If one chats or emails with a person using Windows, then despite using secure, private protocols, every message will be stored by Microsoft's Windows Recall. Unless I'm missing something, this feature seems like the most grotesque breach of online privacy/security.

What are ways to avoid this except for using obfuscated text?

  • GetOffMyLan@programming.dev · 8 days ago

    It can be turned off, so it's up to the person you're messaging. Once you send something, the person at the other end is in control of what happens to it.
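    For reference, Microsoft documents a "Turn off saving snapshots for Windows" policy that disables Recall's snapshotting, backed by a registry value; a sketch of applying it from an elevated Command Prompt on a machine you administer (key names as documented for Windows 11 24H2, so verify against your build):

    ```shell
    :: Disable Recall snapshots via the documented policy value
    :: (DisableAIDataAnalysis = 1 under the WindowsAI policy key).
    reg add "HKCU\Software\Policies\Microsoft\Windows\WindowsAI" /v DisableAIDataAnalysis /t REG_DWORD /d 1 /f

    :: Optionally, remove the Recall optional component entirely
    :: (available on builds where Recall ships as a Windows feature):
    Dism /Online /Disable-Feature /FeatureName:Recall
    ```

    Of course, this only helps on your own machine; the point above stands that you must trust the other party to have done the same.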

    • arsCynic@slrpnk.net (OP) · 8 days ago

      > Once you send something the person at the other end is in control of what happens to it.

      True, but this is the beauty of trust. I decide to communicate one way or another with someone depending on the level of trust, and them deciding to break that trust is a risk I chose to take. However, I do not choose to communicate with Microsoft whatsoever. Windows Recall is the most blatant piece of spyware ever; it is beyond comprehension how this has become so normalized.

      • BananaTrifleViolin@lemmy.world · 8 days ago

        Then you have to trust that the person you are communicating with has turned off Windows Recall. That has to be the starting position.

        Tools will come to block or break Windows Recall, but it will still be based on trust that the recipient is using them. Privacy-centred apps like Signal wouldn't want Windows screenshotting every message, for example. There are many apps and tools, including in the professional sphere, that would not want their data leaking via Recall, so countermeasures will come.

        Unfortunately, that may come late in the professional realm, probably only after scandals break, such as employers using Recall data to investigate staff; it's bound to happen eventually.

        My own organisation, a huge health organisation, has opted in to Copilot. It's crazy in my view, even if our data is ring-fenced in some way. I don't want private patient information being used to train Microsoft's shitty tools, or stored on their servers. Regulation and the law are way behind when it comes to this stuff.