Is there something that can generate random Internet usage to make the real sites I go to a bit obfuscated?
I’m thinking something that runs on my server, and simply visits a random website. It probably shouldn’t actually be random, and some sort of tweaking would be great. Like the ability to have it visit every news site there is. That way the ISP will have a harder time telling my political bias.
The threat model for this sits a step below using a VPN for everyday browsing, although getting a dedicated VPN IP address is a project for another day.
Yea. My issue now is finding a list of these sites.
Turn on your browser history for a while then use that.
Just start listing the most popular and generic sites. Then Google a topic like technology and copy whatever those sites are. I imagine you could have a pretty decent list populated in 15 minutes. You could also just ask ChatGPT to create lists of the top 100 sites for “x”.
What would you write it in? I might be willing to help, because this interests me as well.
That’s a good idea.
Probably just a shell script. Someone mentioned using curl so that’d be pretty easy
Let me know if you start working on anything. I want to try to use Greasemonkey; I haven’t used it in years.
Little curl shell script that works:
#!/bin/bash
# Random_Curl_Request.sh

# CSV file containing websites (URL in the first column)
CSV_FILE="/home/user/Documents/randomSiteVisitor/websites.csv"

# Make a curl request to a random website every minute
while true; do
    # Pick a random line from the CSV file and extract the URL
    WEBSITE=$(shuf -n 1 "$CSV_FILE" | cut -d ',' -f 1)
    curl "$WEBSITE"
    sleep 60
done
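For anyone trying this: the script expects a CSV with the URL in the first column, and extra columns are ignored by the cut. A minimal sketch of what that file could look like (the sites and the "category" column here are just placeholder examples, not part of the script):

```shell
# Create a minimal websites.csv in the format the script reads:
# URL in column 1, anything else after the comma is ignored.
cat > websites.csv <<'EOF'
https://news.ycombinator.com,tech
https://www.bbc.com,news
https://www.reuters.com,news
EOF

# Sanity-check the selection logic the script uses each iteration:
# pick one random line and strip everything after the first comma.
shuf -n 1 websites.csv | cut -d ',' -f 1
```

The last line prints one of the three URLs at random, which is exactly what gets handed to curl inside the loop.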
Oh dang, nice. Simple and sweet. Thanks.
Use an adblock list and just visit the URLs one by one with curl.
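Most adblock lists are distributed in hosts-file format ("0.0.0.0 domain" per line), so they need a little massaging before curl can use them. A rough sketch, assuming you've downloaded such a list to a file called blocklist.txt (the file name and the sample entries below are made up for illustration):

```shell
# Fake a tiny hosts-format blocklist so the pipeline below has input;
# in practice you'd download a real list to blocklist.txt instead.
printf '%s\n' \
  '# sample hosts-format entries' \
  '0.0.0.0 ads.example.com' \
  '0.0.0.0 tracker.example.net' > blocklist.txt

# Drop comment lines, keep the domain column, and prepend a scheme
# so curl accepts each entry as a URL.
grep -v '^[[:space:]]*#' blocklist.txt \
  | awk 'NF >= 2 { print "https://" $2 }' > urls.txt

cat urls.txt
# prints:
# https://ads.example.com
# https://tracker.example.net
```

From there you can feed urls.txt straight into the loop script (swap the CSV for it). One caveat: these are mostly ad and tracker domains, so the "noise" you generate looks like ad traffic rather than browsing.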