mullvad / mullvadvpn-app

The Mullvad VPN client app for desktop and mobile
https://mullvad.net/
GNU General Public License v3.0

Mullvad app stuck at "Connecting to mullvad system service" on macOS Big Sur 11.6 #3034

Closed · kevholder closed this issue 3 years ago

kevholder commented 3 years ago

Issue report

Operating system: macOS Big Sur 11.6

App version: 2021.4

Issue description

Mullvad app stuck at "Connecting to mullvad system service".

Logs:

====================
Log: ~/Library/Logs/Mullvad VPN/frontend-main.log
====================
[2021-10-11 17:42:47.051][debug] Chromium sandbox is enabled
[2021-10-11 17:42:47.052][info] Running version 2021.4
[2021-10-11 17:42:47.120][info] Detected locale: en-US
[2021-10-11 17:42:57.124][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:43:10.125][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:43:10.126][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:43:23.130][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:43:23.130][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:43:36.136][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:43:36.136][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:43:49.145][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:43:49.145][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:44:02.153][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:44:02.153][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:44:15.156][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:44:15.157][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:44:28.163][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:44:28.163][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:44:41.171][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:44:41.171][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:44:54.179][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:44:54.179][error] Failed to reconnect - Error: Failed to connect before the deadline

====================
Log: ~/Library/Logs/Mullvad VPN/frontend-main.old.log
====================
[2021-10-11 17:38:45.057][debug] Chromium sandbox is enabled
[2021-10-11 17:38:45.063][info] Running version 2021.4
[2021-10-11 17:38:45.129][info] Detected locale: en-US
[2021-10-11 17:38:55.135][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:39:08.141][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:39:08.142][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:39:21.151][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:39:21.151][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:39:34.157][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:39:34.157][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:39:47.163][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:39:47.163][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:40:00.166][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:40:00.166][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:40:13.170][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:40:13.171][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:40:26.171][error] Failed to connect to daemon: Failed to connect before the deadline
[2021-10-11 17:40:26.171][error] Failed to reconnect - Error: Failed to connect before the deadline
[2021-10-11 17:40:33.603][info] Cannot close the tunnel because there is no active connection to daemon.

====================
Log: ~/Library/Logs/Mullvad VPN/frontend-renderer.log
====================
[2021-10-11 17:42:47.576][debug] Tunnel state: disconnected

====================
Log: ~/Library/Logs/Mullvad VPN/frontend-renderer.old.log
====================
[2021-10-11 17:38:45.673][debug] Tunnel state: disconnected

Tried rebooting and reinstalling but nothing works.

kevholder commented 3 years ago

Also, I'm getting "Failed to send" when trying to use the "Report a problem" feature.

pinkisemils commented 3 years ago

It seems like the daemon doesn't even start. Would you mind trying to run the daemon from a shell manually? I'd be interested in the output. Feel free to only share whatever seems relevant, if you're unsure about disclosing too much info.

cd /Applications/Mullvad\ VPN.app/Contents/Resources
sudo ./mullvad-daemon -v

This seems horribly broken.

kevholder commented 3 years ago

Launching from terminal worked:

Applications/Mullvad VPN.app/Contents/Resources$ sudo ./mullvad-daemon -v
Password:
[2021-10-11 22:01:55.782][mullvad_daemon::version][INFO] Starting mullvad-daemon - 2021.4 2021-06-30
[2021-10-11 22:01:55.783][mullvad_daemon][INFO] Logging to /var/log/mullvad-vpn
[2021-10-11 22:01:55.785][mullvad_daemon::rpc_uniqueness_check][DEBUG] Failed to locate/connect to another daemon instance, assuming there isn't one
[2021-10-11 22:01:55.787][mullvad_daemon][INFO] Management interface listening on /var/run/mullvad-vpn
[2021-10-11 22:01:55.787][mullvad_rpc::address_cache][DEBUG] Loading API addresses from "/Library/Caches/mullvad-vpn/api-ip-address.txt"
[2021-10-11 22:01:55.787][mullvad_rpc::address_cache][DEBUG] Loading API addresses from "/Applications/Mullvad VPN.app/Contents/Resources/api-ip-address.txt"
[2021-10-11 22:01:55.788][mullvad_rpc::address_cache][DEBUG] Using API address: 193.138.218.78:443
[2021-10-11 22:01:55.827][mullvad_daemon::relays][DEBUG] Reading relays from /Library/Caches/mullvad-vpn/relays.json
[2021-10-11 22:01:55.827][mullvad_daemon::relays][DEBUG] Reading relays from /Applications/Mullvad VPN.app/Contents/Resources/relays.json
[2021-10-11 22:01:55.859][mullvad_daemon::relays][INFO] Initialized with 771 cached relays from 2021-06-30 10:51:45.000
[2021-10-11 22:01:55.859][mullvad_daemon::settings][INFO] Loading settings from /etc/mullvad-vpn/settings.json
[2021-10-11 22:01:55.860][mullvad_types::settings::migrations::v2][INFO] Migrating settings format to V3
[2021-10-11 22:01:55.860][mullvad_types::settings::migrations::v3][INFO] Migrating settings format to V4
[2021-10-11 22:01:55.860][mullvad_daemon::settings][DEBUG] Writing settings to /etc/mullvad-vpn/settings.json
[2021-10-11 22:01:55.867][mullvad_daemon::version_check][DEBUG] Loading version check cache from /Library/Caches/mullvad-vpn/version-info.json
[2021-10-11 22:01:55.867][mullvad_daemon::version_check][WARN] Error: Unable to load cached version info
Caused by: Failed to open app version cache file for reading
Caused by: No such file or directory (os error 2)
[2021-10-11 22:01:55.880][mullvad_daemon::account_history][INFO] Opening account history file in /etc/mullvad-vpn/account-history.json
[2021-10-11 22:01:55.931][mullvad_daemon][DEBUG] No cached target state to load
[2021-10-11 22:01:55.957][talpid_core::firewall][INFO] Resetting firewall policy
[2021-10-11 22:01:55.958][mullvad_daemon][INFO] Account already has WireGuard key
[2021-10-11 22:01:55.958][mullvad_daemon::wireguard][DEBUG] Starting automatic key rotation job
[2021-10-11 22:01:57.092][mullvad_daemon::relays][DEBUG] Writing relays cache to /Library/Caches/mullvad-vpn/relays.json
[2021-10-11 22:01:57.118][mullvad_daemon::relays][INFO] Downloaded relay inventory has 763 relays
[2021-10-11 22:01:57.120][mullvad_daemon::management_interface][DEBUG] Broadcasting new relay list
[2021-10-11 22:01:57.361][mullvad_daemon::management_interface][DEBUG] get_account_history
[2021-10-11 22:01:57.366][mullvad_daemon::management_interface][DEBUG] get_tunnel_state
[2021-10-11 22:01:57.370][mullvad_daemon::management_interface][DEBUG] get_settings
[2021-10-11 22:01:57.370][mullvad_daemon::management_interface][DEBUG] get_current_location
[2021-10-11 22:01:57.379][mullvad_daemon::management_interface][DEBUG] get_relay_locations
[2021-10-11 22:01:57.379][mullvad_daemon::management_interface][DEBUG] get_account_history
[2021-10-11 22:01:57.379][mullvad_daemon::management_interface][DEBUG] get_wireguard_key
[2021-10-11 22:01:57.379][mullvad_daemon::management_interface][DEBUG] get_account_data
[2021-10-11 22:01:57.480][mullvad_daemon::management_interface][DEBUG] get_current_version
[2021-10-11 22:01:57.482][mullvad_daemon::management_interface][DEBUG] get_version_info
[2021-10-11 22:01:57.482][mullvad_daemon][DEBUG] No version cache found. Fetching new info
[2021-10-11 22:01:57.575][mullvad_daemon::version_check][DEBUG] Writing version check cache to /Library/Caches/mullvad-vpn/version-info.json
[2021-10-11 22:01:57.575][mullvad_daemon::management_interface][DEBUG] Broadcasting new app version info
[2021-10-11 22:01:57.872][mullvad_daemon::geoip][DEBUG] Unable to fetch IPv6 GeoIP location: Hyper connect error: failed to lookup address information: nodename nor servname provided, or not known

Strange... The Mullvad app never asks for my sudo password, and launching the daemon without sudo fails. I wonder if the problems are related?

kevholder commented 3 years ago

I tried launching the daemon with launchctl load /Library/LaunchDaemons/net.mullvad.daemon.plist and here are system logs:

/Library/LaunchDaemons $ tail -f /var/log/system.log | grep mullvad
Oct 11 22:32:32 mbp com.apple.xpc.launchd[1] (net.mullvad.daemon[4963]): Service could not initialize: 20G165: xpcproxy + 23787 [839][42C436DB-45AD-3EE7-AD04-D60F7FCAA4AD]: 0xd
Oct 11 22:32:32 mbp com.apple.xpc.launchd[1] (net.mullvad.daemon[4963]): Service exited with abnormal code: 78
Oct 11 22:32:32 mbp com.apple.xpc.launchd[1] (net.mullvad.daemon): Service only ran for 0 seconds. Pushing respawn out by 10 seconds.

kevholder commented 3 years ago

Strangely, deleting /Library/LaunchDaemons/net.mullvad.daemon.plist and re-installing mullvad seemed to fix the issue.
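For anyone hitting the same wall, a minimal sketch of that repair, assuming the standard install paths seen in this thread (it defaults to a dry run that only prints the commands; set `APPLY=1` to actually execute them):

```shell
#!/bin/sh
# Sketch of the repair that worked above: unload the stale service
# definition, delete it, then reinstall so the installer writes a
# fresh plist. Dry run by default; set APPLY=1 to really run it.
PLIST=/Library/LaunchDaemons/net.mullvad.daemon.plist

run() { if [ -n "$APPLY" ]; then "$@"; else echo "+ $*"; fi; }

run sudo launchctl unload "$PLIST"   # stop launchd from respawning it
run sudo rm -f "$PLIST"              # remove the (possibly corrupt) plist
echo "Now reinstall Mullvad VPN so the installer writes a fresh plist."
```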

pinkisemils commented 3 years ago

This is strange. I guess you no longer have the old /Library/LaunchDaemons/net.mullvad.daemon.plist. Since this is the only time we've seen this happen, I don't think it's necessarily related to the daemon code; rather, something fishy with the service file. Maybe it was corrupted?

pinkisemils commented 3 years ago

Closing this issue as we cannot reproduce it.

rakleed commented 2 years ago

@pinkisemils if a user deletes /Library/LaunchDaemons/net.mullvad.daemon.plist, restarts macOS, and launches Mullvad, they get stuck at "Connecting to Mullvad system service". Why not suggest reinstalling the launch daemon at this stage? Because there is no error message, a user can't tell what is wrong.

A user can delete the launch daemon by accident, or via CleanMyMac X or App Cleaner & Uninstaller while trying to set up startup programs.
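A quick sketch for narrowing down which piece is missing when the app hangs at that screen (the plist and socket paths are the ones that appear elsewhere in this thread; adjust if your install differs):

```shell
#!/bin/sh
# Diagnostic sketch: check the pieces the GUI needs to reach the daemon.
PLIST=/Library/LaunchDaemons/net.mullvad.daemon.plist
SOCKET=/var/run/mullvad-vpn

check() { [ -e "$2" ] && echo "$1: present" || echo "$1: MISSING"; }

check "service plist" "$PLIST"     # gone => reinstall the app
check "daemon socket" "$SOCKET"    # gone => daemon is not running
pgrep -x mullvad-daemon >/dev/null 2>&1 \
  && echo "daemon: running" || echo "daemon: not running"
```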

JasperMeurs commented 1 year ago

I'm experiencing the same issue right now. Is there any news on this?

ryenski commented 1 year ago

I am also having this problem. Manually starting the daemon as described above works. Removing the .plist and reinstalling Mullvad will cause the daemon to run until the machine is restarted.

[screenshot]

ryenski commented 1 year ago

After reinstalling Mullvad and restarting, the Mullvad daemon is not running:

❯ ps ax | grep mullvad-daemon
 3437 s002  S+     0:00.00 grep --color=auto --exclude-dir=.bzr --exclude-dir=CVS --exclude-dir=.git --exclude-dir=.hg --exclude-dir=.svn --exclude-dir=.idea --exclude-dir=.tox mullvad-daemon

Daemon log is empty.

❯ cat /var/log/mullvad-vpn/daemon.log

❯

Manually starting the daemon...

❯ sudo /Applications/Mullvad\ VPN.app/Contents/Resources/mullvad-daemon -v
[2022-12-09 13:52:07.917][mullvad_daemon::version][INFO] Starting mullvad-daemon - 2022.5 2022-10-14
[2022-12-09 13:52:07.918][mullvad_daemon][INFO] Logging to /var/log/mullvad-vpn
[2022-12-09 13:52:07.920][mullvad_daemon::rpc_uniqueness_check][DEBUG] Failed to locate/connect to another daemon instance, assuming there isn't one
[2022-12-09 13:52:07.920][mullvad_daemon][INFO] Management interface listening on /var/run/mullvad-vpn
[2022-12-09 13:52:07.923][mullvad_api::address_cache][DEBUG] Loading API addresses from /Library/Caches/mullvad-vpn/api-ip-address.txt
[2022-12-09 13:52:07.924][mullvad_api::address_cache][DEBUG] Using API address: [REDACTED]:443
[2022-12-09 13:52:07.924][mullvad_api::availability][DEBUG] Suspending API requests
[2022-12-09 13:52:07.926][mullvad_daemon::settings][INFO] Loading settings from /etc/mullvad-vpn/settings.json
[2022-12-09 13:52:07.927][mullvad_daemon::version_check][DEBUG] Loading version check cache from /Library/Caches/mullvad-vpn/version-info.json
[2022-12-09 13:52:07.928][mullvad_relay_selector][DEBUG] Reading relays from /Library/Caches/mullvad-vpn/relays.json
[2022-12-09 13:52:07.936][mullvad_relay_selector][DEBUG] Reading relays from /Applications/Mullvad VPN.app/Contents/Resources/relays.json
[2022-12-09 13:52:07.943][mullvad_relay_selector][INFO] Initialized with 897 cached relays from 2022-12-09 11:12:49.201
[2022-12-09 13:52:07.968][mullvad_api::availability][DEBUG] Pausing background API requests
[2022-12-09 13:52:07.968][mullvad_daemon::account_history][INFO] Opening account history file in /etc/mullvad-vpn/account-history.json
[2022-12-09 13:52:07.969][mullvad_daemon][INFO] Automatically connecting since auto-connect is turned on
[2022-12-09 13:52:07.969][mullvad_api::access][DEBUG] Fetching access token for an account
[2022-12-09 13:52:07.983][talpid_core::offline::imp][DEBUG] Assuming non-tunnel default route exists due to Node { ip: Some([REDACTED]), device: Some("en0") }
[2022-12-09 13:52:07.986][talpid_core::offline::imp][DEBUG] Assuming non-tunnel default route exists due to Node { ip: Some([REDACTED]), device: Some("en0") }
[2022-12-09 13:52:07.986][talpid_core::dns][INFO] Resetting DNS
[2022-12-09 13:52:07.999][mullvad_api::availability][DEBUG] Unsuspending API requests
[2022-12-09 13:52:07.999][mullvad_api::availability][DEBUG] Resuming background API requests
[2022-12-09 13:52:08.000][mullvad_relay_selector][INFO] Selected relay [REDACTED] at [REDACTED]
[2022-12-09 13:52:08.000][mullvad_relay_selector][DEBUG] Relay matched on highest preference for retry attempt 0
[2022-12-09 13:52:08.000][talpid_core::firewall][INFO] Applying firewall policy: Connecting to [REDACTED]:18980/UDP, Allowing LAN, interface: none. Allowing endpoint [REDACTED]:443/TCP
[2022-12-09 13:52:08.005][mullvad_daemon][DEBUG] New tunnel state: Connecting { endpoint: TunnelEndpoint { endpoint: Endpoint { address: [REDACTED]:18980, protocol: Udp }, tunnel_type: Wireguard, quantum_resistant: false, proxy: None, obfuscation: None, entry_endpoint: None }, location: Some(GeoIpLocation { ipv4: None, ipv6: None, country: "[REDACTED]", city: Some("[REDACTED]"), latitude: [REDACTED], longitude: -[REDACTED], mullvad_exit_ip: true, hostname: Some("[REDACTED]"), bridge_hostname: None, entry_hostname: None, obfuscator_hostname: None }) }
[2022-12-09 13:52:08.020][talpid_core::firewall][INFO] Applying firewall policy: Connecting to [REDACTED]:18980/UDP over "utun5" (ip: [REDACTED], v4 gw: [REDACTED], v6 gw: Some([REDACTED]::1), allowed in-tunnel traffic: All), Allowing LAN. Allowing endpoint [REDACTED]:443/TCP
add net [REDACTED]: gateway utun5
add host [REDACTED]::1/128: gateway utun5
add net [REDACTED]
add net 8000::/1: gateway utun5
add net ::/1: gateway utun5
add net 128.0.0.0: gateway utun5
add net 0.0.0.0: gateway utun5
[2022-12-09 13:52:08.322][talpid_core::firewall][INFO] Applying firewall policy: Connected to [REDACTED]:18980/UDP over "utun5" (ip: [REDACTED], v4 gw: [REDACTED], v6 gw: Some([REDACTED]::1)), Allowing LAN
[2022-12-09 13:52:08.323][talpid_core::dns][INFO] Setting DNS servers to [REDACTED]
[2022-12-09 13:52:08.324][mullvad_daemon][DEBUG] New tunnel state: Connected { endpoint: TunnelEndpoint { endpoint: Endpoint { address: [REDACTED]:18980, protocol: Udp }, tunnel_type: Wireguard, quantum_resistant: false, proxy: None, obfuscation: None, entry_endpoint: None }, location: Some(GeoIpLocation { ipv4: None, ipv6: None, country: "[REDACTED]", city: Some("[REDACTED]"), latitude: [REDACTED], longitude: -[REDACTED], mullvad_exit_ip: true, hostname: Some("[REDACTED]"), bridge_hostname: None, entry_hostname: None, obfuscator_hostname: None }) }
[2022-12-09 13:52:08.377][mullvad_daemon::management_interface][DEBUG] is_performing_post_upgrade
[2022-12-09 13:52:08.381][mullvad_daemon::management_interface][DEBUG] get_account_history
[2022-12-09 13:52:08.383][mullvad_daemon::management_interface][DEBUG] get_tunnel_state
[2022-12-09 13:52:08.388][mullvad_daemon::management_interface][DEBUG] get_device
[2022-12-09 13:52:08.393][mullvad_daemon::management_interface][DEBUG] get_account_history
[2022-12-09 13:52:08.393][mullvad_daemon::management_interface][DEBUG] update_device
[2022-12-09 13:52:08.393][mullvad_daemon::management_interface][DEBUG] get_account_data
[2022-12-09 13:52:08.393][mullvad_daemon::management_interface][DEBUG] get_settings
[2022-12-09 13:52:08.393][mullvad_api::access][DEBUG] Fetching access token for an account
[2022-12-09 13:52:08.393][mullvad_api::access][DEBUG] Fetching access token for an account
[2022-12-09 13:52:08.394][mullvad_daemon::management_interface][DEBUG] get_current_location
[2022-12-09 13:52:08.399][mullvad_daemon::management_interface][DEBUG] get_relay_locations
[2022-12-09 13:52:08.440][mullvad_daemon::management_interface][DEBUG] get_current_version
[2022-12-09 13:52:08.442][mullvad_daemon::management_interface][DEBUG] connect_tunnel
[2022-12-09 13:52:08.442][mullvad_daemon::management_interface][DEBUG] get_version_info
[2022-12-09 13:52:08.849][mullvad_api::rest][ERROR] Error: HTTP request failed
Caused by: Hyper error
Caused by: error trying to connect: failed to lookup address information: nodename nor servname provided, or not known
Caused by: failed to lookup address information: nodename nor servname provided, or not known
[2022-12-09 13:52:09.277][mullvad_relay_selector::updater][DEBUG] Relay list is up-to-date
[2022-12-09 13:52:09.441][mullvad_api::rest][ERROR] Unexpected HTTP status code 429 Too Many Requests, expected codes [200 OK]
[2022-12-09 13:52:09.442][mullvad_api::access][ERROR] Error: Failed to obtain access token
Caused by: Unexpected response status code 429 Too Many Requests - THROTTLED
[2022-12-09 13:52:09.442][mullvad_daemon::management_interface][ERROR] Unable to get account data from API: Error: Unexpected response status code 429 Too Many Requests - THROTTLED
[2022-12-09 13:52:09.803][mullvad_daemon::geoip][DEBUG] Unable to fetch IPv6 GeoIP location: Hyper connect error: failed to lookup address information: nodename nor servname provided, or not known
[2022-12-09 13:52:09.804][mullvad_daemon::device][DEBUG] The current device is still valid
[2022-12-09 13:52:17.453][mullvad_daemon::management_interface][DEBUG] get_account_data
[2022-12-09 13:52:17.924][talpid_core::offline::imp][DEBUG] Assuming non-tunnel default route exists due to Node { ip: Some([REDACTED]), device: Some("en0") }

Contents of daemon.log after manually starting daemon...

❯ sudo tail -n 1000 -f daemon.log
[2022-12-09 13:52:07.918][mullvad_daemon::version][INFO] Starting mullvad-daemon - 2022.5 2022-10-14
[2022-12-09 13:52:07.918][mullvad_daemon][INFO] Logging to /var/log/mullvad-vpn
[2022-12-09 13:52:07.920][mullvad_daemon::rpc_uniqueness_check][DEBUG] Failed to locate/connect to another daemon instance, assuming there isn't one
[2022-12-09 13:52:07.920][mullvad_daemon][INFO] Management interface listening on /var/run/mullvad-vpn
[2022-12-09 13:52:07.923][mullvad_api::address_cache][DEBUG] Loading API addresses from /Library/Caches/mullvad-vpn/api-ip-address.txt
[2022-12-09 13:52:07.924][mullvad_api::address_cache][DEBUG] Using API address: [REDACTED]:443
[2022-12-09 13:52:07.924][mullvad_api::availability][DEBUG] Suspending API requests
[2022-12-09 13:52:07.926][mullvad_daemon::settings][INFO] Loading settings from /etc/mullvad-vpn/settings.json
[2022-12-09 13:52:07.927][mullvad_daemon::version_check][DEBUG] Loading version check cache from /Library/Caches/mullvad-vpn/version-info.json
[2022-12-09 13:52:07.928][mullvad_relay_selector][DEBUG] Reading relays from /Library/Caches/mullvad-vpn/relays.json
[2022-12-09 13:52:07.936][mullvad_relay_selector][DEBUG] Reading relays from /Applications/Mullvad VPN.app/Contents/Resources/relays.json
[2022-12-09 13:52:07.943][mullvad_relay_selector][INFO] Initialized with 897 cached relays from 2022-12-09 11:12:49.201
[2022-12-09 13:52:07.968][mullvad_api::availability][DEBUG] Pausing background API requests
[2022-12-09 13:52:07.968][mullvad_daemon::account_history][INFO] Opening account history file in /etc/mullvad-vpn/account-history.json
[2022-12-09 13:52:07.969][mullvad_daemon][INFO] Automatically connecting since auto-connect is turned on
[2022-12-09 13:52:07.969][mullvad_api::access][DEBUG] Fetching access token for an account
[2022-12-09 13:52:07.983][talpid_core::offline::imp][DEBUG] Assuming non-tunnel default route exists due to Node { ip: Some([REDACTED]), device: Some("en0") }
[2022-12-09 13:52:07.986][talpid_core::offline::imp][DEBUG] Assuming non-tunnel default route exists due to Node { ip: Some([REDACTED]), device: Some("en0") }
[2022-12-09 13:52:07.987][talpid_core::dns][INFO] Resetting DNS
[2022-12-09 13:52:07.999][mullvad_api::availability][DEBUG] Unsuspending API requests
[2022-12-09 13:52:07.999][mullvad_api::availability][DEBUG] Resuming background API requests
[2022-12-09 13:52:08.000][mullvad_relay_selector][INFO] Selected relay [REDACTED] at [REDACTED]
[2022-12-09 13:52:08.000][mullvad_relay_selector][DEBUG] Relay matched on highest preference for retry attempt 0
[2022-12-09 13:52:08.000][talpid_core::firewall][INFO] Applying firewall policy: Connecting to [REDACTED]:18980/UDP, Allowing LAN, interface: none. Allowing endpoint [REDACTED]:443/TCP
[2022-12-09 13:52:08.005][mullvad_daemon][DEBUG] New tunnel state: Connecting { endpoint: TunnelEndpoint { endpoint: Endpoint { address: [REDACTED]:18980, protocol: Udp }, tunnel_type: Wireguard, quantum_resistant: false, proxy: None, obfuscation: None, entry_endpoint: None }, location: Some(GeoIpLocation { ipv4: None, ipv6: None, country: "USA", city: Some("[REDACTED]"), latitude: [REDACTED], longitude: -[REDACTED], mullvad_exit_ip: true, hostname: Some("[REDACTED]"), bridge_hostname: None, entry_hostname: None, obfuscator_hostname: None }) }
[2022-12-09 13:52:08.020][talpid_core::firewall][INFO] Applying firewall policy: Connecting to [REDACTED]:18980/UDP over "utun5" (ip: [REDACTED],[REDACTED]::2e:3ad6, v4 gw: [REDACTED], v6 gw: Some([REDACTED]::1), allowed in-tunnel traffic: All), Allowing LAN. Allowing endpoint [REDACTED]:443/TCP
[2022-12-09 13:52:08.322][talpid_core::firewall][INFO] Applying firewall policy: Connected to [REDACTED]:18980/UDP over "utun5" (ip: [REDACTED],[REDACTED]::2e:3ad6, v4 gw: [REDACTED], v6 gw: Some([REDACTED]::1)), Allowing LAN
[2022-12-09 13:52:08.323][talpid_core::dns][INFO] Setting DNS servers to [REDACTED]
[2022-12-09 13:52:08.324][mullvad_daemon][DEBUG] New tunnel state: Connected { endpoint: TunnelEndpoint { endpoint: Endpoint { address: [REDACTED]:18980, protocol: Udp }, tunnel_type: Wireguard, quantum_resistant: false, proxy: None, obfuscation: None, entry_endpoint: None }, location: Some(GeoIpLocation { ipv4: None, ipv6: None, country: "[REDACTED]", city: Some("[REDACTED]"), latitude: [REDACTED], longitude: -[REDACTED], mullvad_exit_ip: true, hostname: Some("[REDACTED]"), bridge_hostname: None, entry_hostname: None, obfuscator_hostname: None }) }
[2022-12-09 13:52:08.377][mullvad_daemon::management_interface][DEBUG] is_performing_post_upgrade
[2022-12-09 13:52:08.381][mullvad_daemon::management_interface][DEBUG] get_account_history
[2022-12-09 13:52:08.383][mullvad_daemon::management_interface][DEBUG] get_tunnel_state
[2022-12-09 13:52:08.388][mullvad_daemon::management_interface][DEBUG] get_device
[2022-12-09 13:52:08.393][mullvad_daemon::management_interface][DEBUG] get_account_history
[2022-12-09 13:52:08.393][mullvad_daemon::management_interface][DEBUG] update_device
[2022-12-09 13:52:08.393][mullvad_daemon::management_interface][DEBUG] get_account_data
[2022-12-09 13:52:08.393][mullvad_daemon::management_interface][DEBUG] get_settings
[2022-12-09 13:52:08.393][mullvad_api::access][DEBUG] Fetching access token for an account
[2022-12-09 13:52:08.393][mullvad_api::access][DEBUG] Fetching access token for an account
[2022-12-09 13:52:08.394][mullvad_daemon::management_interface][DEBUG] get_current_location
[2022-12-09 13:52:08.399][mullvad_daemon::management_interface][DEBUG] get_relay_locations
[2022-12-09 13:52:08.440][mullvad_daemon::management_interface][DEBUG] get_current_version
[2022-12-09 13:52:08.442][mullvad_daemon::management_interface][DEBUG] connect_tunnel
[2022-12-09 13:52:08.442][mullvad_daemon::management_interface][DEBUG] get_version_info
[2022-12-09 13:52:08.849][mullvad_api::rest][ERROR] Error: HTTP request failed
Caused by: Hyper error
Caused by: error trying to connect: failed to lookup address information: nodename nor servname provided, or not known
Caused by: failed to lookup address information: nodename nor servname provided, or not known
[2022-12-09 13:52:09.277][mullvad_relay_selector::updater][DEBUG] Relay list is up-to-date
[2022-12-09 13:52:09.442][mullvad_api::rest][ERROR] Unexpected HTTP status code 429 Too Many Requests, expected codes [200 OK]
[2022-12-09 13:52:09.442][mullvad_api::access][ERROR] Error: Failed to obtain access token
Caused by: Unexpected response status code 429 Too Many Requests - THROTTLED
[2022-12-09 13:52:09.442][mullvad_daemon::management_interface][ERROR] Unable to get account data from API: Error: Unexpected response status code 429 Too Many Requests - THROTTLED
[2022-12-09 13:52:09.803][mullvad_daemon::geoip][DEBUG] Unable to fetch IPv6 GeoIP location: Hyper connect error: failed to lookup address information: nodename nor servname provided, or not known
[2022-12-09 13:52:09.804][mullvad_daemon::device][DEBUG] The current device is still valid
[2022-12-09 13:52:17.453][mullvad_daemon::management_interface][DEBUG] get_account_data
[2022-12-09 13:52:17.924][talpid_core::offline::imp][DEBUG] Assuming non-tunnel default route exists due to Node { ip: Some([REDACTED]), device: Some("en0") }

System profile

❯ system_profiler SPSoftwareDataType SPHardwareDataType
Software:

    System Software Overview:

      System Version: macOS 13.0.1 (22A400)
      Kernel Version: Darwin 22.1.0
      Boot Volume: Macintosh HD
      Boot Mode: Normal
      Secure Virtual Memory: Enabled
      System Integrity Protection: Enabled
      Time since boot: 20 minutes, 2 seconds

Hardware:

    Hardware Overview:

      Model Name: MacBook Pro
      Model Identifier: MacBookPro16,1
      Processor Name: 8-Core Intel Core i9
      Processor Speed: 2.3 GHz
      Number of Processors: 1
      Total Number of Cores: 8
      L2 Cache (per Core): 256 KB
      L3 Cache: 16 MB
      Hyper-Threading Technology: Enabled
      Memory: 16 GB
      System Firmware Version: 1916.40.8.0.0 (iBridge: 20.16.420.0.0,0)
      OS Loader Version: 564.40.4~27
      Activation Lock Status: Enabled

krstp commented 1 year ago

Same issue here. Reinstalling multiple times did not help, but running the daemon from the command line works. I'll try to pinpoint it in more detail later. Thanks for the tips.

dlon commented 1 year ago

If you're having this issue after updating to macOS 13, the solution is to go to the Login Items panel (System Settings > General > Login Items) and enable the item labeled Amagicom.

fregante commented 1 year ago

I think this issue also started for me after updating to 13.2.1. The daemon is enabled, and I cleanly uninstalled and reinstalled 2023.1 a few times and restarted my computer as well. I get this error popup after it tries to connect for a while:

[screenshot]