This system works great, so I guess this is more of a feature request than a bug. On an extremely active site like the one I am programming, http://ewao.com, there are literally millions of people connecting and sharing articles, so Facebook is constantly scraping the site (unless a page is cached). That causes Facebook's crawler to get blocked and become unable to scrape pages. If there is any way to whitelist domains or IPs, I would love to know how.
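To make the request concrete, here is a minimal sketch of the kind of check I have in mind, run before the blocking logic. This is only an illustration, not the tool's actual API: the user-agent substrings are the ones Facebook's crawler sends, but the IP range shown is just an example placeholder (the real ranges would need to come from Facebook's ASN, AS32934).

```python
import ipaddress

# Hypothetical whitelist. The user-agent substrings are real Facebook
# crawler identifiers; the IP network below is an example only — real
# ranges should be looked up from Facebook's ASN (AS32934).
WHITELISTED_AGENTS = ("facebookexternalhit", "facebookcatalog")
WHITELISTED_NETWORKS = [ipaddress.ip_network("31.13.24.0/21")]  # example range

def is_whitelisted(user_agent: str, remote_ip: str) -> bool:
    """Return True if this request should skip blocking/rate limiting."""
    if any(agent in user_agent.lower() for agent in WHITELISTED_AGENTS):
        return True
    ip = ipaddress.ip_address(remote_ip)
    return any(ip in net for net in WHITELISTED_NETWORKS)
```

So a request with a `facebookexternalhit/1.1` user agent, or one originating from a whitelisted network, would bypass the block entirely.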