V

  • 8 Posts
  • 74 Comments
Joined 1 year ago
Cake day: June 21st, 2023





  • I could be misinformed, but as I understand it this isn’t limited to Spark; I believe a lot of (maybe all?) third-party clients do the same thing. They act as an intermediary between you and the mail server so they can deliver push notifications.

    However, Spark’s privacy policy states that they don’t read or scan the contents of your emails, and using an app-specific password rather than your actual email password means they can only access your mail and nothing else on the account.

    Pretty sure others such as Canary, Airmail, Edison, etc. all do/did the same thing, but it was the lack of clarity in Spark’s privacy policy that made them the main target for scrutiny. I think they’ve since cleared that up.

    I could be mistaken, though.





  • I replied to another comment on here saying that I’d tried this once before, via a Docker container, but just wasn’t getting any results back (kept getting timeouts from all the search engines).

    I’ve just revisited it, and still get the timeouts. Reckon you’re able to help me troubleshoot it?

    Below are the logs from Portainer:

     File "/usr/local/searxng/searx/network/__init__.py", line 165, in get
        return request('get', url, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 98, in request
        raise httpx.TimeoutException('Timeout', request=None) from e
    httpx.TimeoutException: Timeout
    2023-08-06 09:58:13,651 ERROR:searx.engines.soundcloud: Fail to initialize
    Traceback (most recent call last):
      File "/usr/local/searxng/searx/network/__init__.py", line 96, in request
        return future.result(timeout)
               ^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/lib/python3.11/concurrent/futures/_base.py", line 458, in result
        raise TimeoutError()
    TimeoutError
    The above exception was the direct cause of the following exception:
    Traceback (most recent call last):
      File "/usr/local/searxng/searx/search/processors/abstract.py", line 75, in initialize
        self.engine.init(get_engine_from_settings(self.engine_name))
      File "/usr/local/searxng/searx/engines/soundcloud.py", line 69, in init
        guest_client_id = get_client_id()
                          ^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/engines/soundcloud.py", line 45, in get_client_id
        response = http_get("https://soundcloud.com")
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 165, in get
        return request('get', url, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 98, in request
        raise httpx.TimeoutException('Timeout', request=None) from e
    httpx.TimeoutException: Timeout
    2023-08-06 09:58:13,654 ERROR:searx.engines.soundcloud: Fail to initialize
    Traceback (most recent call last):
      File "/usr/local/searxng/searx/network/__init__.py", line 96, in request
        return future.result(timeout)
               ^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/lib/python3.11/concurrent/futures/_base.py", line 458, in result
        raise TimeoutError()
    TimeoutError
    The above exception was the direct cause of the following exception:
    Traceback (most recent call last):
      File "/usr/local/searxng/searx/search/processors/abstract.py", line 75, in initialize
        self.engine.init(get_engine_from_settings(self.engine_name))
      File "/usr/local/searxng/searx/engines/soundcloud.py", line 69, in init
        guest_client_id = get_client_id()
                          ^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/engines/soundcloud.py", line 45, in get_client_id
        response = http_get("https://soundcloud.com")
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 165, in get
        return request('get', url, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/usr/local/searxng/searx/network/__init__.py", line 98, in request
        raise httpx.TimeoutException('Timeout', request=None) from e
    httpx.TimeoutException: Timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.wikidata: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.duckduckgo: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.google: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.qwant: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.startpage: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.wikibooks: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.wikiquote: engine timeout
    2023-08-06 10:02:05,024 ERROR:searx.engines.wikisource: engine timeout
    2023-08-06 10:02:05,025 ERROR:searx.engines.wikipecies: engine timeout
    2023-08-06 10:02:05,025 ERROR:searx.engines.wikiversity: engine timeout
    2023-08-06 10:02:05,025 ERROR:searx.engines.wikivoyage: engine timeout
    2023-08-06 10:02:05,025 ERROR:searx.engines.brave: engine timeout
    2023-08-06 10:02:05,481 WARNING:searx.engines.wikidata: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,481 ERROR:searx.engines.wikidata: HTTP requests timeout (search duration : 6.457878380082548 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,482 WARNING:searx.engines.wikisource: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,484 ERROR:searx.engines.wikisource: HTTP requests timeout (search duration : 6.460748491808772 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,485 WARNING:searx.engines.brave: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,485 ERROR:searx.engines.brave: HTTP requests timeout (search duration : 6.461546086706221 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,487 WARNING:searx.engines.google: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,487 ERROR:searx.engines.google: HTTP requests timeout (search duration : 6.463769535068423 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,489 WARNING:searx.engines.wikiversity: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,489 ERROR:searx.engines.wikiversity: HTTP requests timeout (search duration : 6.466003180015832 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,490 WARNING:searx.engines.wikivoyage: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,490 ERROR:searx.engines.wikivoyage: HTTP requests timeout (search duration : 6.466597221791744 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,490 WARNING:searx.engines.qwant: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,490 ERROR:searx.engines.qwant: HTTP requests timeout (search duration : 6.4669976509176195 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,491 WARNING:searx.engines.wikibooks: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,491 ERROR:searx.engines.wikibooks: HTTP requests timeout (search duration : 6.4674198678694665 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,491 WARNING:searx.engines.wikiquote: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,492 WARNING:searx.engines.wikipecies: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,492 ERROR:searx.engines.wikiquote: HTTP requests timeout (search duration : 6.468321242835373 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,492 ERROR:searx.engines.wikipecies: HTTP requests timeout (search duration : 6.468797960784286 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,496 WARNING:searx.engines.duckduckgo: ErrorContext('searx/engines/duckduckgo.py', 98, 'res = get(query_url, headers=headers)', 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,497 ERROR:searx.engines.duckduckgo: HTTP requests timeout (search duration : 6.47349306801334 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:02:05,511 WARNING:searx.engines.startpage: ErrorContext('searx/engines/startpage.py', 214, 'resp = get(get_sc_url, headers=headers)', 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:02:05,511 ERROR:searx.engines.startpage: HTTP requests timeout (search duration : 6.487425099126995 s, timeout: 6.0 s) : TimeoutException
    2023-08-06 10:04:27,475 ERROR:searx.engines.duckduckgo: engine timeout
    2023-08-06 10:04:27,770 WARNING:searx.engines.duckduckgo: ErrorContext('searx/search/processors/online.py', 118, "response = req(params['url'], **request_args)", 'httpx.TimeoutException', None, (None, None, None)) False
    2023-08-06 10:04:27,771 ERROR:searx.engines.duckduckgo: HTTP requests timeout (search duration : 3.2968566291965544 s, timeout: 3.0 s) : TimeoutException
    2023-08-06 10:04:50,094 ERROR:searx.engines.duckduckgo: engine timeout
    2023-08-06 10:04:50,187 WARNING:searx.engines.duckduckgo: ErrorContext('searx/engines/duckduckgo.py', 98, 'res = get(query_url, headers=headers)', 'httpx.ConnectTimeout', None, (None, None, 'duckduckgo.com')) False
    2023-08-06 10:04:50,187 ERROR:searx.engines.duckduckgo: HTTP requests timeout (search duration : 3.0933595369569957 s, timeout: 3.0 s) : ConnectTimeout
    

    The above is from a simple search for “best privacy focused search engines 2023”, followed by the same search again with the ddg! bang in front of it.

    I can post my docker-compose if it helps?
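
    For reference, the only timeout-related knobs I know of live under the outgoing: section of settings.yml, plus the per-engine disabled flag - something like the below. This is just a sketch of where I’d poke first (the values are illustrative, and the soundcloud bit only silences the init traceback, it doesn’t explain the other engines timing out):

     # settings.yml (excerpt) - illustrative values, not a known fix
     outgoing:
       request_timeout: 10.0       # per-request timeout in seconds
       max_request_timeout: 15.0   # upper bound when an engine asks for more time

     engines:
       - name: soundcloud
         disabled: true            # skip this engine while troubleshooting

    And to rule out the container simply having no outbound access at all, something like this from the host should print some HTML (assuming the container is named searxng and the image ships busybox wget):

     docker exec -it searxng wget -qO- https://duckduckgo.com | head -c 200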








  • I’ve always just used Safari as my browser on iOS and macOS, so I’ve never paid attention to reviews/opinions on the newer browsers such as Brave. Before I switched to Mac I always used Firefox on my Windows machines, so I know how privacy-focused they’ve always been. But I’m hearing a lot of positives about Brave, and so far it seems pretty decent.

    I’ve tried Arc but wasn’t entirely convinced. And at work I have a Windows machine, so I’ve been tied to Edge (although I’ve recently put in a request for Firefox and had it approved).

    I guess it’d be nice to have a consistent search experience across the board, which the likes of DDG would give me. But I’m definitely seeing good things about Brave and Brave Search.





  • Before putting Pi-hole behind Traefik, it worked perfectly via :/admin. The Traefik logs now show that Pi-hole is up and working, and I get the login page, but I just can’t get beyond it.

    The guides I’ve seen show how to structure the Traefik labels with and without the addprefix middleware, and both apparently work. So I’m wondering if by following several guides and taking bits from each, I’ve ended up overlooking something.

    I’ll try exposing port 80 and see if it makes a difference, but like I say, everything is up and running on the backend; I just can’t get past the login screen on the frontend.




  • Just a quick update on where I’m up to…

    I’ve managed to get all my containers working behind the Traefik reverse proxy with SSL. I’ve also deployed a Cloudflare DDNS container in Docker and linked the external IP address of my Synology NAS to Cloudflare. I haven’t forwarded ports 80 and 443, though, so nothing is accessible from the internet. Instead, I’ve added local DNS records in Pi-hole so I can access all the containers via subdomains.

    I’ve also deployed an Authelia container and have started working through my containers, putting 2FA in front of each of them.
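
    For anyone curious, the 2FA bit is just Authelia’s standard forwardAuth middleware on the Traefik side, attached to each router - roughly the labels below (the auth subdomain, container name and “myservice” router are placeholders, and the /api/verify endpoint is the older one, so it depends on your Authelia version):

     # define the middleware once (placeholders throughout)
     - traefik.http.middlewares.authelia.forwardauth.address=http://authelia:9091/api/verify?rd=https://auth.mydomain.com
     - traefik.http.middlewares.authelia.forwardauth.trustForwardHeader=true
     - traefik.http.middlewares.authelia.forwardauth.authResponseHeaders=Remote-User,Remote-Groups,Remote-Name,Remote-Email

     # then attach it to each protected container's router
     - traefik.http.routers.myservice.middlewares=authelia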

    I should probably point out at this juncture that if I hit any errors, the HTTP 404 page I get is a Cloudflare one - I assume that’s expected behaviour?

    So, the final three bits I’m struggling with now are:

    • Pi-hole behind the reverse proxy
    • Portainer behind the reverse proxy
    • Accessing Vaultwarden over the internet (because as soon as I leave my house, if the vault hasn’t synced then I don’t have access to all my passwords) - unless anybody has a better suggestion?

    Portainer - I have no idea how to do that one, because I use Portainer to manage my containers, so I don’t have the config for Portainer in Portainer (obviously). If I screw up the config, how do I get back into Portainer to fix it?
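
    (For what it’s worth, Portainer itself had to be created outside Portainer in the first place - via docker run or a compose file on the host - so presumably its Traefik labels would live there, roughly along the lines below. Paths and domain are placeholders, and it would also need to share a Docker network with Traefik:)

     # hypothetical compose for Portainer itself, managed from the host rather than from Portainer
     services:
       portainer:
         container_name: portainer
         image: portainer/portainer-ce:latest
         restart: unless-stopped
         volumes:
           - /var/run/docker.sock:/var/run/docker.sock
           - /path/to/portainer:/data
         labels:
           - traefik.enable=true
           - traefik.http.routers.portainer.rule=Host(`portainer.mydomain.com`)
           - traefik.http.routers.portainer.entrypoints=https
           - traefik.http.routers.portainer.tls=true
           - traefik.http.services.portainer.loadbalancer.server.port=9000   # Portainer CE web UI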

    And the far more troubling one is Pi-hole. I just cannot get that thing working behind the reverse proxy.

    I’ve followed a few different guides (though none of them are recent), and below is the latest docker-compose I have. It brings up the login page, but when I log in it just returns me to the login page - it never gets through to the main admin page.

    version: "3.7"
    
    services:
      pihole:
        container_name: pihole
        image: pihole/pihole:latest
        restart: unless-stopped
        networks:
          - medianet
          - npm_network
        ports:
          - 8008:80
          - 53:53/tcp
          - 53:53/udp
        environment:
          - TZ=Europe/London
          - WEBPASSWORD=xxxxxxxxxx
          - FTLCONF_LOCAL_IPV4=192.168.1.116
          - WEBTHEME=default-auto
          - DNSMASQ_LISTENING=ALL
          - VIRTUAL_HOST=pihole.mydomain.com
        volumes:
          - /path/to/pihole:/etc/pihole
          - /path/to/pihole/dnsmasq.d:/etc/dnsmasq.d
        cap_add:
          - NET_ADMIN
        labels:
          - traefik.enable=true
          - traefik.http.routers.pihole.entrypoints=http
          - traefik.http.routers.pihole.rule=Host(`pihole.mydomain.com`)
          - traefik.http.middlewares.pihole-https-redirect.redirectscheme.scheme=https
          - traefik.http.routers.pihole.middlewares=pihole-https-redirect
          - traefik.http.middlewares.pihole-addprefix.addprefix.prefix=/admin
          - traefik.http.routers.pihole.middlewares=pihole-addprefix
          - traefik.http.routers.pihole-secure.entrypoints=https
          - traefik.http.routers.pihole-secure.rule=Host(`pihole.mydomain.com`)
          - traefik.http.routers.pihole-secure.tls=true
          - traefik.http.routers.pihole-secure.service=pihole
          - traefik.http.services.pihole.loadbalancer.server.port=80
    
    networks:
      medianet:
        external: true
      npm_network:
        external: true
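
    One thing that stands out to me looking at it again: the traefik.http.routers.pihole.middlewares label is set twice, and since labels end up as a key/value map only the last one survives - so the HTTPS redirect gets dropped, and the pihole-secure router has no middlewares at all. A chained arrangement would look roughly like this (untested, and the /admin prefix handling may still need tweaking for Pi-hole’s login redirect):

         labels:
           - traefik.enable=true
           # HTTP router: its only job is to redirect to HTTPS
           - traefik.http.routers.pihole.entrypoints=http
           - traefik.http.routers.pihole.rule=Host(`pihole.mydomain.com`)
           - traefik.http.middlewares.pihole-https-redirect.redirectscheme.scheme=https
           - traefik.http.routers.pihole.middlewares=pihole-https-redirect
           # HTTPS router: this is the one that serves the UI, so the /admin prefix goes here
           - traefik.http.middlewares.pihole-addprefix.addprefix.prefix=/admin
           - traefik.http.routers.pihole-secure.entrypoints=https
           - traefik.http.routers.pihole-secure.rule=Host(`pihole.mydomain.com`)
           - traefik.http.routers.pihole-secure.tls=true
           - traefik.http.routers.pihole-secure.middlewares=pihole-addprefix
           - traefik.http.routers.pihole-secure.service=pihole
           - traefik.http.services.pihole.loadbalancer.server.port=80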