Hello, the IPv6 check for my Woiden VPS is failing.
In some cases COMMAND can be omitted and ping.pe will figure it out automatically.
For example, when a query contains just one IP or DOMAIN, then omitted 'ping' command will be implied. If there is a single semicolon in the query, 'tcp' command will be implied etc.
Additionally, you can use domain name to indicate COMMAND:
ping.pe/SUBJECT – assumed command: ping
ping6.ping.pe/SUBJECT – assumed command: ping6
chart.ping.pe/SUBJECT – assumed command: chart
chart6.ping.pe/SUBJECT – assumed command: chart6
tcp.ping.pe/SUBJECT – assumed command: tcp
tcp6.ping.pe/SUBJECT – assumed command: tcp6
port.ping.pe/SUBJECT – assumed command: tcp
dig.ping.pe/SUBJECT – assumed command: dig
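The mapping above is mechanical: the subdomain names the command and the path is the SUBJECT. A small shell sketch of that rule (the URLs are only printed here, nothing is requested):

```shell
# Build a ping.pe query URL from a command and a subject, per the list above.
make_url() {
  cmd="$1"; subject="$2"
  if [ "$cmd" = "ping" ]; then
    echo "ping.pe/$subject"          # a bare domain implies 'ping'
  else
    echo "$cmd.ping.pe/$subject"     # otherwise the subdomain names the command
  fi
}

make_url ping 1.1.1.1
make_url tcp6 "[2a01:4f8:202:1002:1111:1111:3152:0001]:443"
```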
I don't understand what is going on here.
Please post exactly what you entered in the box on ping.pe.
You can also take a screenshot and paste it here with Ctrl+V.
2a01:4f8:202:1002:1111:1111:3152:0001
Query: tcp6 [2a01:4f8:202:1002:1111:1111:3152:0001]:443
China-based test nodes are disabled due to DDoS attacks. Will be re-enabled once attacks stop.
Query: tcp6 [2a01:4f8:202:1002:1111:1111:3152:0001]:80
Basically every node gets through, but I still can't use it.
Here is the Caddyfile configuration file.
The third-party TCP test gets through. But from your own network environment, can you connect to [2a01:4f8:202:1002:1111:1111:3152:0001]:443?
You can test on your own VPS whether your proxy configuration has any problems: https://zelikk.blogspot.com/2022/12/naive-proxy-troubleshoot.html
root@1444685184woiden:~# wget -Nhttps://github.com/klzgrad/naiveproxy/releases/download/v108.0.5359.94-1/naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz
GNU Wget 1.21, a non-interactive network retriever.
Usage: wget [OPTION]... [URL]...

Mandatory arguments to long options are mandatory for short options too.

Startup:
  -V, --version display the version of Wget and exit
  -h, --help print this help
  -b, --background go to background after startup
  -e, --execute=COMMAND execute a `.wgetrc'-style command

Logging and input file:
  -o, --output-file=FILE log messages to FILE
  -a, --append-output=FILE append messages to FILE
  -d, --debug print lots of debugging information
  -q, --quiet quiet (no output)
  -v, --verbose be verbose (this is the default)
  -nv, --no-verbose turn off verboseness, without being quiet
  --report-speed=TYPE output bandwidth as TYPE. TYPE can be bits
  -i, --input-file=FILE download URLs found in local or external FILE
  -F, --force-html treat input file as HTML
  -B, --base=URL resolves HTML input-file links (-i -F) relative to URL
  --config=FILE specify config file to use
  --no-config do not read any config file
  --rejected-log=FILE log reasons for URL rejection to FILE

Download:
  -t, --tries=NUMBER set number of retries to NUMBER (0 unlimits)
  --retry-connrefused retry even if connection is refused
  --retry-on-http-error=ERRORS comma-separated list of HTTP errors to retry
  -O, --output-document=FILE write documents to FILE
  -nc, --no-clobber skip downloads that would download to existing files (overwriting them)
  --no-netrc don't try to obtain credentials from .netrc
  -c, --continue resume getting a partially-downloaded file
  --start-pos=OFFSET start downloading from zero-based position OFFSET
  --progress=TYPE select progress gauge type
  --show-progress display the progress bar in any verbosity mode
  -N, --timestamping don't re-retrieve files unless newer than local
  --no-if-modified-since don't use conditional if-modified-since get requests in timestamping mode
  --no-use-server-timestamps don't set the local file's timestamp by the one on the server
  -S, --server-response print server response
  --spider don't download anything
  -T, --timeout=SECONDS set all timeout values to SECONDS
  --dns-timeout=SECS set the DNS lookup timeout to SECS
  --connect-timeout=SECS set the connect timeout to SECS
  --read-timeout=SECS set the read timeout to SECS
  -w, --wait=SECONDS wait SECONDS between retrievals (applies if more then 1 URL is to be retrieved)
  --waitretry=SECONDS wait 1..SECONDS between retries of a retrieval (applies if more then 1 URL is to be retrieved)
  --random-wait wait from 0.5*WAIT...1.5*WAIT secs between retrievals (applies if more then 1 URL is to be retrieved)
  --no-proxy explicitly turn off proxy
  -Q, --quota=NUMBER set retrieval quota to NUMBER
  --bind-address=ADDRESS bind to ADDRESS (hostname or IP) on local host
  --limit-rate=RATE limit download rate to RATE
  --no-dns-cache disable caching DNS lookups
  --restrict-file-names=OS restrict chars in file names to ones OS allows
  --ignore-case ignore case when matching files/directories
  -4, --inet4-only connect only to IPv4 addresses
  -6, --inet6-only connect only to IPv6 addresses
  --prefer-family=FAMILY connect first to addresses of specified family, one of IPv6, IPv4, or none
  --user=USER set both ftp and http user to USER
  --password=PASS set both ftp and http password to PASS
  --ask-password prompt for passwords
  --use-askpass=COMMAND specify credential handler for requesting username and password. If no COMMAND is specified the WGET_ASKPASS or the SSH_ASKPASS environment variable is used.
  --no-iri turn off IRI support
  --local-encoding=ENC use ENC as the local encoding for IRIs
  --remote-encoding=ENC use ENC as the default remote encoding
  --unlink remove file before clobber
  --xattr turn on storage of metadata in extended file attributes

Directories:
  -nd, --no-directories don't create directories
  -x, --force-directories force creation of directories
  -nH, --no-host-directories don't create host directories
  --protocol-directories use protocol name in directories
  -P, --directory-prefix=PREFIX save files to PREFIX/..
  --cut-dirs=NUMBER ignore NUMBER remote directory components

HTTP options:
  --http-user=USER set http user to USER
  --http-password=PASS set http password to PASS
  --no-cache disallow server-cached data
  --default-page=NAME change the default page name (normally this is 'index.html'.)
  -E, --adjust-extension save HTML/CSS documents with proper extensions
  --ignore-length ignore 'Content-Length' header field
  --header=STRING insert STRING among the headers
  --compression=TYPE choose compression, one of auto, gzip and none. (default: none)
  --max-redirect maximum redirections allowed per page
  --proxy-user=USER set USER as proxy username
  --proxy-password=PASS set PASS as proxy password
  --referer=URL include 'Referer: URL' header in HTTP request
  --save-headers save the HTTP headers to file
  -U, --user-agent=AGENT identify as AGENT instead of Wget/VERSION
  --no-http-keep-alive disable HTTP keep-alive (persistent connections)
  --no-cookies don't use cookies
  --load-cookies=FILE load cookies from FILE before session
  --save-cookies=FILE save cookies to FILE after session
  --keep-session-cookies load and save session (non-permanent) cookies
  --post-data=STRING use the POST method; send STRING as the data
  --post-file=FILE use the POST method; send contents of FILE
  --method=HTTPMethod use method "HTTPMethod" in the request
  --body-data=STRING send STRING as data. --method MUST be set
  --body-file=FILE send contents of FILE. --method MUST be set
  --content-disposition honor the Content-Disposition header when choosing local file names (EXPERIMENTAL)
  --content-on-error output the received content on server errors
  --auth-no-challenge send Basic HTTP authentication information without first waiting for the server's challenge

HTTPS (SSL/TLS) options:
  --secure-protocol=PR choose secure protocol, one of auto, SSLv2, SSLv3, TLSv1, TLSv1_1, TLSv1_2 and PFS
  --https-only only follow secure HTTPS links
  --no-check-certificate don't validate the server's certificate
  --certificate=FILE client certificate file
  --certificate-type=TYPE client certificate type, PEM or DER
  --private-key=FILE private key file
  --private-key-type=TYPE private key type, PEM or DER
  --ca-certificate=FILE file with the bundle of CAs
  --ca-directory=DIR directory where hash list of CAs is stored
  --crl-file=FILE file with bundle of CRLs
  --pinnedpubkey=FILE/HASHES Public key (PEM/DER) file, or any number of base64 encoded sha256 hashes preceded by 'sha256//' and separated by ';', to verify peer against

HSTS options:
  --no-hsts disable HSTS
  --hsts-file path of HSTS database (will override default)

FTP options:
  --ftp-user=USER set ftp user to USER
  --ftp-password=PASS set ftp password to PASS
  --no-remove-listing don't remove '.listing' files
  --no-glob turn off FTP file name globbing
  --no-passive-ftp disable the "passive" transfer mode
  --preserve-permissions preserve remote file permissions
  --retr-symlinks when recursing, get linked-to files (not dir)

FTPS options:
  --ftps-implicit use implicit FTPS (default port is 990)
  --ftps-resume-ssl resume the SSL/TLS session started in the control connection when opening a data connection
  --ftps-clear-data-connection cipher the control channel only; all the data will be in plaintext
  --ftps-fallback-to-ftp fall back to FTP if FTPS is not supported in the target server

WARC options:
  --warc-file=FILENAME save request/response data to a .warc.gz file
  --warc-header=STRING insert STRING into the warcinfo record
  --warc-max-size=NUMBER set maximum size of WARC files to NUMBER
  --warc-cdx write CDX index files
  --warc-dedup=FILENAME do not store records listed in this CDX file
  --no-warc-compression do not compress WARC files with GZIP
  --no-warc-digests do not calculate SHA1 digests
  --no-warc-keep-log do not store the log file in a WARC record
  --warc-tempdir=DIRECTORY location for temporary files created by the WARC writer

Recursive download:
  -r, --recursive specify recursive download
  -l, --level=NUMBER maximum recursion depth (inf or 0 for infinite)
  --delete-after delete files locally after downloading them
  -k, --convert-links make links in downloaded HTML or CSS point to local files
  --convert-file-only convert the file part of the URLs only (usually known as the basename)
  --backups=N before writing file X, rotate up to N backup files
  -K, --backup-converted before converting file X, back up as X.orig
  -m, --mirror shortcut for -N -r -l inf --no-remove-listing
  -p, --page-requisites get all images, etc. needed to display HTML page
  --strict-comments turn on strict (SGML) handling of HTML comments

Recursive accept/reject:
  -A, --accept=LIST comma-separated list of accepted extensions
  -R, --reject=LIST comma-separated list of rejected extensions
  --accept-regex=REGEX regex matching accepted URLs
  --reject-regex=REGEX regex matching rejected URLs
  --regex-type=TYPE regex type (posix|pcre)
  -D, --domains=LIST comma-separated list of accepted domains
  --exclude-domains=LIST comma-separated list of rejected domains
  --follow-ftp follow FTP links from HTML documents
  --follow-tags=LIST comma-separated list of followed HTML tags
  --ignore-tags=LIST comma-separated list of ignored HTML tags
  -H, --span-hosts go to foreign hosts when recursive
  -L, --relative follow relative links only
  -I, --include-directories=LIST list of allowed directories
  --trust-server-names use the name specified by the redirection URL's last component
  -X, --exclude-directories=LIST list of excluded directories
  -np, --no-parent don't ascend to the parent directory

Email bug reports, questions, discussions to bug-wget@gnu.org and/or open issues at https://savannah.gnu.org/bugs/?func=additem&group=wget.
root@1444685184woiden:~# wget -N https://github.com/klzgrad/naiveproxy/releases/download/v108.0.5359.94-1/naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz
--2022-12-07 16:09:45--  https://github.com/klzgrad/naiveproxy/releases/download/v108.0.5359.94-1/naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz
Resolving github.com (github.com)... failed: Temporary failure in name resolution.
wget: unable to resolve host address 'github.com'
The installation failed.
root@1444685184woiden:~# tar -xf naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz
tar: naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz: Cannot open: No such file or directory
tar: Error is not recoverable: exiting now
root@1444685184woiden:~# tar -xf https://github.com/klzgrad/naiveproxy/releases/download/v108.0.5359.94-1/naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz
tar: Cannot connect to https: resolve failed
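As the errors above show, tar can only read a local archive; it cannot fetch a URL. A minimal runnable sketch of the intended create/extract round trip, using a throwaway archive (all file names below are stand-ins, not the real release):

```shell
set -e
cd "$(mktemp -d)"

# Stand-in for the downloaded release tree (hypothetical contents).
mkdir naiveproxy-demo
echo "placeholder" > naiveproxy-demo/naive

tar -cJf naiveproxy-demo.tar.xz naiveproxy-demo   # pack it as a .tar.xz (needs xz installed)
rm -r naiveproxy-demo

tar -xf naiveproxy-demo.tar.xz                    # works: the archive is a local file
ls naiveproxy-demo                                # the extracted directory reappears
```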
I only have IPv4 here, no IPv6. I have already enabled WARP IPv4, and it still can't connect.
You can't reach github.com, but you can download the file yourself and then upload it to the VPS.
That doesn't seem to work:
root@1444685184woiden:~# wget -N '/Users/lokou/Downloads/naiveproxy-v108.0.5359.94-1-linux-x64'
/Users/lokou/Downloads/naiveproxy-v108.0.5359.94-1-linux-x64: Scheme missing.
root@1444685184woiden:~# tar -xf naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz
root@1444685184woiden:~# cd
root@1444685184woiden:~# ls
_v2ray_vlessurl  menu.sh  naiveproxy-v108.0.5359.94-1-linux-x64  naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz
I took my downloaded, still-unextracted file and uploaded it to the VPS directly through Termius.
That's how it looks now.
I don't know whether anything is wrong.
The file is on the VPS now, so just keep following the tutorial and extract it.
It's already extracted:
root@1444685184woiden:# tar -xf naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz
root@1444685184woiden:# cd
root@1444685184woiden:~# ls
v2ray_vless_url  menu.sh  naiveproxy-v108.0.5359.94-1-linux-x64  naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz
This is what it looks like.
It seems a bit different from the directory you described.
You can run
ls -l
to check. You will see that naiveproxy-v108.0.5359.94-1-linux-x64 is a directory; cd into it and take a look.
I went into mine:
root@1444685184woiden:~# ls -l
total 2628
-rw-r--r-- 1 root root   17885 Nov 28 00:12 _v2ray_vlessurl
-rw-r--r-- 1 root root  122511 Dec  2 18:32 menu.sh
drwxr-xr-x 2 1001 122     4096 Dec  3 03:59 naiveproxy-v108.0.5359.94-1-linux-x64
-rw-r--r-- 1 root root 2535808 Dec  8 00:53 naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz
root@1444685184woiden:~# cd
root@1444685184woiden:~# cd naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz
-bash: cd: naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz: Not a directory
root@1444685184woiden:~# cd naiveproxy-v108.0.5359.94-1-linux-x64
root@1444685184woiden:~/naiveproxy-v108.0.5359.94-1-linux-x64# cd naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz
-bash: cd: naiveproxy-v108.0.5359.94-1-linux-x64.tar.xz: No such file or directory
root@1444685184woiden:~/naiveproxy-v108.0.5359.94-1-linux-x64#
This is what I get.
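The "Not a directory" message in that transcript is bash behaving correctly: the .tar.xz is a regular file, and cd only enters directories. A quick runnable illustration with throwaway names:

```shell
cd "$(mktemp -d)"
touch archive.tar.xz      # a regular file, like the downloaded release archive
mkdir extracted           # a directory, like the unpacked naiveproxy tree

cd archive.tar.xz 2>/dev/null || echo "cd refused: archive.tar.xz is a file"
cd extracted && echo "cd succeeded: extracted is a directory"
```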
Run
ls
to check which files are there, then just keep following the tutorial.
Is it like this?
Yes, those files are all there. Keep following the tutorial.
I'm using Shadowrocket; I don't have a client.
The naive program you are looking at is the client. On your own VPS, use the client to connect to your server and test whether it actually works: https://zelikk.blogspot.com/2022/12/naive-proxy-troubleshoot.html
Do I copy the configuration content from step 3 into the SSH terminal and run it?
Or do I need to find the config.json file?
{ "listen": "socks://127.0.0.1: 10811 ", "proxy": "https:// naive-username: naive-password@ naive-domain: naive-port", "log": "" }
Edit the config.json file and change it to this content. You will find that config.json already uses this format, so compare and modify it.
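One non-interactive way to make that kind of edit, sketched against a stand-in config.json (the file contents and credentials below are made up; the real file ships with the naiveproxy release):

```shell
cd "$(mktemp -d)"

# Stand-in template, mimicking the shape of the client config in this thread.
cat > config.json <<'EOF'
{
  "listen": "socks://127.0.0.1:1080",
  "proxy": "https://user:pass@example.net",
  "log": ""
}
EOF

# Point "proxy" at your own server (placeholder credentials/domain/port).
sed -i 's|"proxy": ".*"|"proxy": "https://alice:secret@proxy.example.com:443"|' config.json
grep '"proxy"' config.json
```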
Where is config.json?
Don't tell me I have to go download a config.json file from GitHub and edit it myself?
It's right there in the directory you are in.
When you learn to drive you still have to watch the road yourself and use your own judgment. If the instructor leaves out one small detail, are you completely lost?
How do I get into it?
I got in from here.
But it seems nothing is written in it.
As for how to edit a file's contents, please Google it and teach yourself; there are plenty of ways.
{ "listen": "socks://127.0.0.1: 10811 ", "proxy": "https:// naive-username: naive-password@ naive-domain: naive-port", "log": "" }
Do I just paste this in and then change the domain and port?
Some of the spaces in there must not be written like that. I suggest you look at the contents of the original config.json file.
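For reference, the same template from this thread with the stray spaces removed; the username, password, domain, and port are placeholders that must be replaced with your own values:

```json
{
  "listen": "socks://127.0.0.1:10811",
  "proxy": "https://naive-username:naive-password@naive-domain:naive-port",
  "log": ""
}
```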