xrma / crawler4j

Automatically exported from code.google.com/p/crawler4j

Fatal transport error when using a proxy #268

Open GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?
1. I downloaded the example code (crawler, controller, etc.) from the project home
2. I set up the classpath and compiled the classes
3. I used the seed URL: http://www.nduoa.com/
4. I am using an HTTP proxy
5. I am using a custom user agent: 

What is the expected output? What do you see instead?
The crawler should crawl the URL and fetch the data. Instead I see:
FATAL: Fatal transport error: http://myproxy.com while fetching http://www.nduoa.com/ (link found in doc #0)
Non success status for link: http://www.nduoa.com/, status code: 1005, description: Fatal transport error

What version of the product are you using?
Latest (3.5)

Please provide any additional information below.
When I crawl without using a proxy, this error does not occur.
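
For reference, this is roughly how the proxy is configured (a minimal sketch against crawler4j 3.5's CrawlConfig proxy setters; the proxy host, port, user agent, storage folder, and the MyCrawler class are placeholders, with MyCrawler standing in for the WebCrawler subclass from the example code):

```java
import edu.uci.ics.crawler4j.crawler.CrawlConfig;
import edu.uci.ics.crawler4j.crawler.CrawlController;
import edu.uci.ics.crawler4j.fetcher.PageFetcher;
import edu.uci.ics.crawler4j.robotstxt.RobotstxtConfig;
import edu.uci.ics.crawler4j.robotstxt.RobotstxtServer;

public class ProxyCrawlController {
    public static void main(String[] args) throws Exception {
        CrawlConfig config = new CrawlConfig();
        config.setCrawlStorageFolder("/tmp/crawler4j");   // placeholder storage folder

        // Proxy settings (placeholder host/port)
        config.setProxyHost("myproxy.com");
        config.setProxyPort(8080);
        // If the proxy requires authentication:
        // config.setProxyUsername("user");
        // config.setProxyPassword("pass");

        // Custom user agent (placeholder value)
        config.setUserAgentString("MyCrawler/1.0");

        PageFetcher pageFetcher = new PageFetcher(config);
        RobotstxtConfig robotstxtConfig = new RobotstxtConfig();
        RobotstxtServer robotstxtServer = new RobotstxtServer(robotstxtConfig, pageFetcher);

        CrawlController controller = new CrawlController(config, pageFetcher, robotstxtServer);
        controller.addSeed("http://www.nduoa.com/");
        controller.start(MyCrawler.class, 1);  // MyCrawler = the WebCrawler subclass from the examples
    }
}
```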

Please help me!

Original issue reported on code.google.com by maren.su...@gmail.com on 8 Jul 2014 at 8:45

GoogleCodeExporter commented 9 years ago
ERROR [main] Fatal transport error: Connection to http://en.wikipedia.org refused while fetching http://en.wikipedia.org/robots.txt (link found in doc #0)

Same issue. I'm not able to crawl any address.
Please help me.
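
If it helps narrow this down: a quick check that the proxy itself is reachable from the machine, outside crawler4j (a minimal sketch using plain java.net; proxy host and port are placeholders):

```java
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.net.URL;

public class ProxyCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder proxy host/port; replace with the actual proxy.
        Proxy proxy = new Proxy(Proxy.Type.HTTP, new InetSocketAddress("myproxy.com", 8080));
        HttpURLConnection conn =
                (HttpURLConnection) new URL("http://en.wikipedia.org/robots.txt").openConnection(proxy);
        conn.setConnectTimeout(10_000);
        System.out.println("HTTP status via proxy: " + conn.getResponseCode());
        conn.disconnect();
    }
}
```

If this also fails with a connection refused error, the problem is the proxy or network setup rather than crawler4j.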

Original comment by Nitheshk...@gmail.com on 23 Feb 2015 at 11:35