Simple data export and feeds from FA.
Check out the documentation for a full list of functionality.
The file lib/faexport/scraper.rb
contains all the code required to access data from FA.
This API was originally developed by boothale, but after he had been missing and unresponsive to emails for many months, deer-spangle forked the project and now maintains it.
To use endpoints that require a login cookie, or to run your own copy of the API, you will need to generate a valid FA cookie string. A valid FA cookie string looks like this:

```
b=3a485360-d203-4a38-97e8-4ff7cdfa244c; a=b1b985c4-d73e-492a-a830-ad238a3693ef
```
The a and b cookie values can be obtained from your browser's storage inspector while on any FA page. On Firefox, open the storage inspector by pressing Shift+F9; on Chrome, open the developer tools with F12, then select the "Application" tab, followed by "Cookies".
You may want to do this in a private browsing session, as logging out of your account will invalidate the cookie and break the scraper.
This cookie must be for an account that is set to view the site in classic mode. Modern style cannot be parsed by this API.
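As a quick sanity check, a cookie string in the format above can be split back into its a and b values. This is just an illustrative sketch, not part of the API itself:

```python
# Sketch: split an FA cookie string of the form "b=...; a=..." into its parts.
def parse_fa_cookie(cookie_string):
    values = {}
    for part in cookie_string.split(";"):
        name, _, value = part.strip().partition("=")
        values[name] = value
    return values

cookie = "b=3a485360-d203-4a38-97e8-4ff7cdfa244c; a=b1b985c4-d73e-492a-a830-ad238a3693ef"
parts = parse_fa_cookie(cookie)
# parts["a"] and parts["b"] now hold the two UUID values.
```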
To authenticate with the API, provide that string in the FA_COOKIE header (note: an HTTP header, not a cookie).
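As an illustrative sketch, here is how that header might be attached to a request using Python's standard library; the .json endpoint path is assumed by analogy with the RSS feed path used elsewhere in this document:

```python
import urllib.request

# The full cookie string goes in the FA_COOKIE header -- an HTTP header,
# not a Cookie header. The values here are the example ones from above.
FA_COOKIE = "b=3a485360-d203-4a38-97e8-4ff7cdfa244c; a=b1b985c4-d73e-492a-a830-ad238a3693ef"

req = urllib.request.Request(
    "https://faexport.spangle.org.uk/notifications/submissions.json",
    headers={"FA_COOKIE": FA_COOKIE},
)
# urllib.request.urlopen(req) would then perform the authenticated request.
```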
Another option, particularly for the authenticated RSS feeds, is to use HTTP basic auth, with the username "ab" and, as the password, the a and b cookie values joined by a semicolon. Like so:

```
https://ab:b1b985c4-d73e-492a-a830-ad238a3693ef;3a485360-d203-4a38-97e8-4ff7cdfa244c@faexport.spangle.org.uk/notifications/submissions.rss
```
Do not share such RSS feed links with others, as your FA cookies are encoded into the URL and could be used to access your account.
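For illustration, the basic-auth feed URL can be assembled from the two cookie values like this (the values are the example ones from above):

```python
# Sketch: build the basic-auth RSS URL from the individual cookie values.
# The username is the literal string "ab"; the password is the a value and
# the b value joined by a semicolon.
a = "b1b985c4-d73e-492a-a830-ad238a3693ef"
b = "3a485360-d203-4a38-97e8-4ff7cdfa244c"
url = f"https://ab:{a};{b}@faexport.spangle.org.uk/notifications/submissions.rss"
```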
If you simply run:

```shell
make install
make run
```

it should install the required packages and then run the server, though it may warn about a missing FA_COOKIE environment variable.
You can customise the FA_COOKIE value and PORT by passing them like so:

```shell
make FA_COOKIE="b\=...\;a\=..." PORT=9292 run
```
For ease of development, you can avoid specifying the FurAffinity cookie as an environment variable by creating a file named settings.yml in the root directory containing a valid FA cookie:

```yaml
cookie: "b=3a485360-d203-4a38-97e8-4ff7cdfa244c; a=b1b985c4-d73e-492a-a830-ad238a3693ef"
```
This application is also available as a Docker image, so you don't need to install Ruby, Bundler, and other dependencies yourself. The image is available on Docker Hub here: https://hub.docker.com/r/deerspangle/furaffinity-api
To deploy a Redis image and the FurAffinity API container, linked together, you can run:

```shell
FA_COOKIE="b\=...\;a\=..." docker-compose up
```

or simply:

```shell
make FA_COOKIE="b\=...\;a\=..." deploy
```
It will default to being exposed on port 80, but you can customise this by passing the PORT environment variable:

```shell
make FA_COOKIE="b\=...\;a\=..." PORT=9292 deploy
```
If Cloudflare protection is active, you can launch a pair of Cloudflare bypass containers alongside the API rather easily:

```shell
make FA_COOKIE="b\=...\;a\=..." deploy_bypass
```
This application can also be run on Heroku; just add an instance of Heroku Data for Redis® for caching. Rather than uploading settings.yml, set the FA_COOKIE environment variable to the cookie string you gathered from FA.
A number of metrics are exposed at /metrics, which can be used for observability. Metrics are available for the deployed version, error rates, request/response times, and usage patterns across endpoints and format types.
Metrics are grouped into API metrics and scraper metrics. Scraper metrics are prefixed with "faexport_scraper", API endpoint metrics are prefixed with "faexport_endpoint", and all others are prefixed with just "faexport".
The Prometheus metrics endpoint can be secured with basic auth by setting the PROMETHEUS_PASS environment variable. This sets the password for the /metrics endpoint, with a blank username. The environment variable can be passed to locally running instances, to Docker, or to docker-compose.
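As a sketch of what a scraper of the /metrics endpoint would send, basic auth with a blank username encodes the credential string ":&lt;password&gt;" in base64; the password value here is purely illustrative:

```python
import base64

# Sketch: build the Authorization header for the secured /metrics endpoint,
# assuming PROMETHEUS_PASS was set to "hunter2" (illustrative value only).
# The username is blank, so the credential string is just ":<password>".
password = "hunter2"
token = base64.b64encode(f":{password}".encode()).decode()
auth_header = f"Basic {token}"
```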