dfabulich / sitemapgen4j

SitemapGen4j is a library to generate XML sitemaps in Java.
Apache License 2.0

sitemap index for sitemap size > 10MB #17

Open mariodeci opened 9 years ago

mariodeci commented 9 years ago

Is it possible to force the creation of a new sitemap file if the current sitemap file size is greater than 10 MB?

dmlux commented 1 year ago

I ran into the same problem and solved it by writing the sitemap contents to strings and checking the length of those strings. If any of the generated lists exceeds 10 MB, I decrease the maximum number of URLs and start over. This can take some time, but if you can test locally with your production data, you can find a better starting value for maxUrls, which speeds up the while loop in production. In our case it took at most two or three tries to find a good value for maxUrls.

private final Map<String, Pair<Date, String>> sitemaps = new HashMap<>();
private final AtomicInteger maxUrls = new AtomicInteger(50000);
private void generateGoogleImageSitemapContents(String fileNamePrefix, Function<Integer, GoogleImageSitemapGenerator> sitemapCreator) {
    // True if any of the generated sitemap strings exceeds 10 MB (10^7 bytes)
    Predicate<GoogleImageSitemapGenerator> exceeds10Mb = googleImageSitemapGenerator -> googleImageSitemapGenerator.writeAsStrings()
        .stream()
        .anyMatch(sitemapXml -> sitemapXml.getBytes(US_ASCII).length > 1e+7);

    GoogleImageSitemapGenerator generator = sitemapCreator.apply(maxUrls.get());
    if (generator == null) {
        return;
    }
    // Halve maxUrls and regenerate until every sitemap fits under 10 MB
    while (exceeds10Mb.test(generator) && maxUrls.get() > 0) {
        generator = sitemapCreator.apply(maxUrls.addAndGet(-maxUrls.get() / 2));
    }

    // All lists should now be smaller than 10 MB; store them for later
    AtomicInteger counter = new AtomicInteger(0);
    generator.writeAsStrings()
        .forEach(sitemapContent -> sitemaps.put(fileNamePrefix + counter.getAndIncrement() + ".xml", Pair.of(new Date(), sitemapContent)));
}

Later you can use this method like this:

generateGoogleImageSitemapContents("changes_daily_with_images_", maxUrls -> {
    try {
        // baseDir is null because the sitemaps are written to strings instead of files
        GoogleImageSitemapGenerator imageGenerator = GoogleImageSitemapGenerator.builder(baseUrl, null)
            .dateFormat(new W3CDateFormat(MINUTE))
            .maxUrls(maxUrls)
            .build();

        addProducts(imageGenerator);
        addCategories(imageGenerator);

        return imageGenerator;
    } catch (MalformedURLException e) {
        throw new RuntimeException(e);
    }
});

In my case I store the sitemaps in memory and deliver them on request. The Pair of date and sitemap content is used to add a <lastmod> tag for each list. The lists are created before they are queried, so they are ready when the crawler requests them. This way it is also possible to update the lists in the background and use Google's ping service to inform the crawler about changes.
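
Since this issue is about building a sitemap index once the 10 MB limit forces a split, here is a minimal sketch of how the stored file names could be tied together with the library's SitemapIndexGenerator. The writeSitemapIndex helper, the temp file, and the assumption that each sitemap is served at baseUrl + "/" + fileName are mine and not part of the library, so adjust it to your own serving layer.

import com.redfin.sitemapgenerator.SitemapIndexGenerator;
import com.redfin.sitemapgenerator.SitemapIndexUrl;

import org.apache.commons.lang3.tuple.Pair;

import java.io.File;
import java.io.IOException;
import java.net.URL;
import java.util.Date;
import java.util.Map;

// Hypothetical helper: writes a sitemap index that points at every stored sitemap,
// using each entry's date as <lastmod>
private File writeSitemapIndex(String baseUrl, Map<String, Pair<Date, String>> sitemaps) {
    try {
        File indexFile = File.createTempFile("sitemap_index", ".xml");
        SitemapIndexGenerator indexGenerator = new SitemapIndexGenerator(baseUrl, indexFile);
        for (Map.Entry<String, Pair<Date, String>> entry : sitemaps.entrySet()) {
            // Assumes each stored sitemap is reachable at baseUrl + "/" + fileName
            URL sitemapUrl = new URL(baseUrl + "/" + entry.getKey());
            indexGenerator.addUrl(new SitemapIndexUrl(sitemapUrl, entry.getValue().getLeft()));
        }
        indexGenerator.write();
        return indexFile;
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}

The generated index file can then be read back into memory and served the same way as the individual sitemaps.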