corporate-gadfly / Tunlr-Clone


403 Forbidden #8

Closed liammac closed 11 years ago

liammac commented 11 years ago

I followed the instructions exactly as described, but I seem to be getting 403 Forbidden errors when trying to access sites like Hulu or Fox.com. However, I can reach non-video sites like whatismyip.net via the Squid proxy. Is there a new step required that I'm missing?

corporate-gadfly commented 11 years ago

I have hulu.com working with the described setup. You might have to download a tool like Fiddler2 to figure out what communication is happening when you access Hulu or Fox. In addition, keep an eye on your BIND and Squid logs to see whether the traffic is actually appearing there at all.
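A quick way to watch both logs at once; the paths below are assumptions based on a stock Debian/Ubuntu squid3 package and a common BIND query-log setup, so adjust them to match your install:

```shell
# Watch the Squid access log live (path matches the --with-logdir
# shown by squid3 -v on Debian/Ubuntu; adjust if yours differs):
tail -f /var/log/squid3/access.log

# Turn on BIND query logging, then watch it. The query.log path is
# an assumption -- it depends on the logging channel in named.conf:
rndc querylog on
tail -f /var/log/named/query.log
```

If a request to hulu.com shows up in the BIND log but never in the Squid log, the problem is between DNS and the proxy (usually the port-80 redirect), not Squid itself.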

Hope that helps.

liammac commented 11 years ago

DNS is working. However, I'm not certain about Squid. I'm using Squid 3.1.19:

```
$ squid3 -v
Squid Cache: Version 3.1.19
configure options: '--build=x86_64-linux-gnu' '--prefix=/usr' '--includedir=${prefix}/include' '--mandir=${prefix}/share/man' '--infodir=${prefix}/share/info' '--sysconfdir=/etc' '--localstatedir=/var' '--libexecdir=${prefix}/lib/squid3' '--srcdir=.' '--disable-maintainer-mode' '--disable-dependency-tracking' '--disable-silent-rules' '--datadir=/usr/share/squid3' '--sysconfdir=/etc/squid3' '--mandir=/usr/share/man' '--with-cppunit-basedir=/usr' '--enable-inline' '--enable-async-io=8' '--enable-storeio=ufs,aufs,diskd' '--enable-removal-policies=lru,heap' '--enable-delay-pools' '--enable-cache-digests' '--enable-underscores' '--enable-icap-client' '--enable-follow-x-forwarded-for' '--enable-auth=basic,digest,ntlm,negotiate' '--enable-basic-auth-helpers=LDAP,MSNT,NCSA,PAM,SASL,SMB,YP,DB,POP3,getpwnam,squid_radius_auth,multi-domain-NTLM' '--enable-ntlm-auth-helpers=smb_lm,' '--enable-digest-auth-helpers=ldap,password' '--enable-negotiate-auth-helpers=squid_kerb_auth' '--enable-external-acl-helpers=ip_user,ldap_group,session,unix_group,wbinfo_group' '--enable-arp-acl' '--enable-esi' '--enable-zph-qos' '--enable-wccpv2' '--disable-translation' '--with-logdir=/var/log/squid3' '--with-pidfile=/var/run/squid3.pid' '--with-filedescriptors=65536' '--with-large-files' '--with-default-user=proxy' '--enable-linux-netfilter' 'build_alias=x86_64-linux-gnu' 'CFLAGS=-g -O2 -fPIE -fstack-protector --param=ssp-buffer-size=4 -Wformat -Wformat-security -Werror=format-security' 'LDFLAGS=-Wl,-Bsymbolic-functions -fPIE -pie -Wl,-z,relro -Wl,-z,now' 'CPPFLAGS=-D_FORTIFY_SOURCE=2' 'CXXFLAGS=-g -O2 -fPIE -fstack-protector --param=ssp-buffer-size=4 -Wformat -Wformat-security -Werror=format-security' --with-squid=/build/buildd/squid3-3.1.19
```

I'm not seeing anything in the logs for requests to sites like Hulu or Fox. Just to reiterate: I'm using the configs from the main Tunlr-Clone git page (no divergence except IP addresses). Any suggestions on what to look for?
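One thing worth double-checking before digging into Squid is that the spoofed zone really answers with the proxy's address. A minimal sketch, assuming `192.0.2.10` is a placeholder for the DNS/proxy box's real IP:

```shell
# Ask the Tunlr-style DNS server directly what it returns for hulu.com.
# 192.0.2.10 is a placeholder -- substitute your own server's IP:
dig +short hulu.com @192.0.2.10
# The answer should be the proxy box's own IP, not Hulu's real
# public address; if you see a real Hulu IP, the zone override in
# BIND is not taking effect.
```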

corporate-gadfly commented 11 years ago

> I'm not seeing anything in the logs for requests to sites like hulu or fox. Just to reiterate I'm using the configs from the main Tunlr-Clone git page (no divergence except IP addresses) Any suggestions on what to look for?

So, your DNS is resolving hulu.com to the IP address of your Squid box? After that, you should check the iptables rules, so that any request to port 80 gets redirected to whatever port Squid is listening on. Unfortunately, my iptables fu is not very strong.
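For what it's worth, the usual transparent-proxy redirect looks something like the sketch below. The port 3128 is Squid's default `http_port` and is an assumption here; match it to whatever your squid.conf actually uses:

```shell
# Redirect inbound TCP port-80 traffic arriving at this box to
# Squid's listening port (3128 is Squid's default; change it to
# match the http_port line in your squid.conf):
iptables -t nat -A PREROUTING -p tcp --dport 80 -j REDIRECT --to-ports 3128

# List the NAT rules with packet counters, so you can see whether
# the rule is actually matching any traffic:
iptables -t nat -L PREROUTING -n -v
```

If the packet counter on the REDIRECT rule stays at zero while you browse, the requests are never reaching the box on port 80, which would explain the empty Squid logs.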

Once you have determined that DNS is working, you can use a tool like curl to see whether the Squid box is getting hit. One other thing with Hulu: in my experience you should also have a redirect for huluim.com.
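A hedged curl sketch for that check; `203.0.113.5` is a placeholder for the Squid box's IP, and `--resolve` just forces the hostname to that address without touching DNS:

```shell
# Send a request for hulu.com straight at the proxy box
# (203.0.113.5 is a placeholder for your Squid host's IP), then
# check whether the request appears in Squid's access log:
curl -v --resolve hulu.com:80:203.0.113.5 http://hulu.com/
```

If this request shows up in the Squid log while normal browser traffic does not, the proxy side is fine and the problem is the DNS answer or the port-80 redirect in front of it.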

Hopefully someone else will chime in too.