
  • If I try to do the threat modeling, I guess I’m seeing three levels:

    1. Intelligence agencies. They probably have access to all possible data about you. Don’t make them your enemy. Hopefully they never turn evil in your country.
    2. Large technology companies. They build the infrastructure: phone operating systems, things you can't really get around on the modern internet like Cloudflare, and so on. Legislation like the GDPR can rein them in somewhat, but only to a degree. At least they have reasonably good security, so you don't fully lose control of your data. The worst thing they will do to you is try to convince you to buy stuff, which isn't all that bad.
    3. Smaller or non-tech companies that just aren't competent enough to keep your data secure. They use dependencies that spy on you, like Google Analytics or Android app frameworks that inject location tracking: think of an online pharmacy that runs Facebook scripts and thereby shares all your medical purchases with Facebook, or wherever else. A lot of this would be illegal, but it is hard to detect and enforce, and it turns into a whack-a-mole game. It's hard to know where your data goes, and it is probably being sold to whoever wants to pay; for example, local police buying location data from data brokers (worth double-checking, but I think this actually happens). Since there is no limit to who can access the data, this is more worrying. But for these things you kind of have the big tech companies on your side: browsers and phones tend to have built-in tracker blocking these days, and you yourself can choose to be careful about what software you run from this category.

    My point is that we should be clear about why we are concerned about the future. Who is the threat, and how could they use your data against you? Breaking it down and pointing to a concrete harm will help the people around you understand why you are concerned.