What steps will reproduce the problem?
1. Try to dump all 404 crawl errors from a website with the following code:
static int DumpCrawlErrors(int startFrom)
{
    WebmasterToolsService service = new WebmasterToolsService("exampleCo-exampleApp-1");
    service.setUserCredentials("myuser", "mypassword");

    // The site URL must be URL-encoded before being embedded in the feed URI.
    string url = HttpUtility.UrlEncode("http://myDomain.com/");
    string feedUri = Utilities.EncodeSlugHeader("https://www.google.com/webmasters/tools/feeds/" + url + "/crawlissues/?start-index=" + startFrom + "&max-results=100");
    CrawlIssuesQuery feedQuery = new CrawlIssuesQuery(feedUri);
    CrawlIssuesFeed feed = service.Query(feedQuery);

    StringBuilder results = new StringBuilder();
    foreach (CrawlIssuesEntry crawlIssuesEntry in feed.Entries)
    {
        // ExtensionElements[2] holds the crawled URL for this issue.
        string crawlUrl = ((XmlExtension)crawlIssuesEntry.ExtensionElements[2]).Node.InnerText;
        if (crawlIssuesEntry.IssueType == "not-found")
        {
            results.AppendLine(string.Format("{0}\t{1}\t{2}\t{3}",
                crawlUrl, crawlIssuesEntry.IssueType, crawlIssuesEntry.IssueDetail, crawlIssuesEntry.LinkedFrom));
        }
    }

    System.IO.File.AppendAllText("result.log", results.ToString());
    return feed.Entries.Count;
}
What is the expected output?
I would expect the crawlIssuesEntry.LinkedFrom property to be an
IEnumerable&lt;string&gt; rather than a single string, since the same 404 page is
referenced by more than one page.
What do you see instead?
The crawlIssuesEntry.LinkedFrom property holds only a single value, even though
the Webmaster Tools web interface clearly shows that the 404 page is referenced
by a number of other pages.
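As a possible workaround while the property is single-valued, one could scan the entry's raw extension elements and collect every referring URL. This is an untested sketch: the element name "linked-from" is an assumption about the feed XML, not a documented guarantee, and GetLinkedFrom is a hypothetical helper name.

```csharp
// Workaround sketch (assumption: each referring page appears as its own
// extension element named "linked-from" in the entry XML).
static IEnumerable<string> GetLinkedFrom(CrawlIssuesEntry entry)
{
    foreach (IExtensionElementFactory ext in entry.ExtensionElements)
    {
        // The library exposes unrecognized feed XML as XmlExtension nodes,
        // as the cast in the repro code above already relies on.
        XmlExtension xml = ext as XmlExtension;
        if (xml != null && xml.Node.LocalName == "linked-from")
        {
            yield return xml.Node.InnerText;
        }
    }
}
```

If the feed really does carry one element per referring page, this would recover all of them even though LinkedFrom itself surfaces only one.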
What version of the product are you using? On what operating system?
Version v2.0.50727 of Google.GData.WebmasterTools on Windows 7 SP1 64-bit
Original issue reported on code.google.com by rajivkum...@gmail.com on 15 Jan 2014 at 9:46