Sitemaps discovered during the crawl work fine because they're tagged with the correct handler; robots.txt and sitemap files discovered via the pre-crawl CLI command, however, aren't tagged correctly and therefore aren't parsed correctly.
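A minimal sketch of the handler-dispatch idea behind this bug (every name below is hypothetical, not the project's actual API): a queued URL only gets parsed correctly if it's enqueued with the handler that knows its format, which is what the pre-crawl path fails to do.

```python
import re
from dataclasses import dataclass
from typing import Callable

# Hypothetical handlers for illustration; the real project's names differ.
def parse_sitemap(body: str) -> list[str]:
    # Extract <loc> entries from a sitemap (simplified).
    return re.findall(r"<loc>(.*?)</loc>", body)

def parse_html(body: str) -> list[str]:
    # Extract href targets from an HTML page (simplified).
    return re.findall(r'href="(.*?)"', body)

@dataclass
class QueuedUrl:
    url: str
    handler: Callable[[str], list[str]]  # decides how the response body is parsed

# In-crawl discovery tags the sitemap with the right handler...
in_crawl = QueuedUrl("https://example.com/sitemap.xml", handler=parse_sitemap)

# ...while the pre-crawl path enqueues the same URL with the default
# HTML handler, so its sitemap/robots content is never parsed as such.
pre_crawl = QueuedUrl("https://example.com/sitemap.xml", handler=parse_html)
```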
Sitemap and robots.txt handling are being reworked; as of 0.9.0, sitemap and robots files are not auto-discovered during crawls, which makes this issue moot.