This component is responsible for importing the data.
It loads podcast feed URLs from a file, normalizes them, and then updates them in a search engine. Currently only Azure Search is supported.
Read more about it here: Podcast Feed Loader on the wiki
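At a high level the flow is: read feed URLs from a file, normalize each URL, then hand the result to the configured search provider. Below is a minimal sketch of that flow; the function and file names are hypothetical and only illustrate the idea, not the project's actual modules in `app/`.

```ts
// Hypothetical sketch of the load -> normalize -> index flow described above;
// function and file names are illustrative, not the project's actual API.
import { promises as fs } from "fs";

// Read one feed URL per line from a plain-text file.
async function loadFeedUrls(path: string): Promise<string[]> {
  const raw = await fs.readFile(path, "utf8");
  return raw
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.length > 0);
}

// Normalize a URL: force https, lower-case the host, drop a trailing slash.
function normalizeFeedUrl(url: string): string {
  const parsed = new URL(url);
  parsed.protocol = "https:";
  parsed.hostname = parsed.hostname.toLowerCase();
  return parsed.toString().replace(/\/$/, "");
}

async function run(): Promise<void> {
  const urls = await loadFeedUrls("feeds.txt"); // hypothetical input file
  const feeds = urls.map(normalizeFeedUrl);
  // The configured search provider (currently Azure Search) receives `feeds` here.
  console.log(`Prepared ${feeds.length} feeds for indexing`);
}

run().catch(console.error);
```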
# test docker environment bindings before creating containers
docker-compose config
# start local development
docker-compose up -d
# build and start containers
docker-compose build && docker-compose up    # add -d to run detached if the console is not needed
To stop the container:
# to stop
docker-compose stop
We recommend running this with Docker, but you can also do things the old-fashioned way:
# Set environment variables (see below)
cd app
npm install
npm start
There are a couple of environment variables to set in order to actually update a search engine. Currently the only supported engine is Azure Search.
You can run the project without these settings; it won't actually update anything, but it's a good test.
SEARCH_PROVIDER="Azure"
AZURE_SEARCH_INDEX_NAME="podcasts"
AZURE_SEARCH_ENDPOINT="https://podcasts.search.windows.net"
AZURE_SEARCH_ADMIN_API_KEY="YOUR_SECRET_HERE"
AZURE_SEARCH_API_VERSION="2017-11-11-Preview"
SEARCH_OLDEST_INDEX_DATE="2018-12-29"
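For reference, these values map onto the Azure Search REST API: the endpoint, index name, and API version form the request URL, and the admin key is sent in the `api-key` header. The sketch below shows one way the loader might consume them; the actual code may use a client library instead of raw HTTP, and the function name is hypothetical.

```ts
// Sketch of how the settings above could be consumed (assumes Node 18+ for fetch).
// The URL shape and body follow the Azure Search "add/update documents" REST API.
const {
  SEARCH_PROVIDER,
  AZURE_SEARCH_ENDPOINT,
  AZURE_SEARCH_INDEX_NAME,
  AZURE_SEARCH_ADMIN_API_KEY,
  AZURE_SEARCH_API_VERSION,
} = process.env;

async function uploadDocuments(docs: object[]): Promise<void> {
  if (SEARCH_PROVIDER !== "Azure") {
    // Without the settings the loader still runs; it just skips the index update.
    console.log("No search provider configured, skipping index update");
    return;
  }
  const url =
    `${AZURE_SEARCH_ENDPOINT}/indexes/${AZURE_SEARCH_INDEX_NAME}/docs/index` +
    `?api-version=${AZURE_SEARCH_API_VERSION}`;
  const response = await fetch(url, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "api-key": AZURE_SEARCH_ADMIN_API_KEY ?? "",
    },
    body: JSON.stringify({
      value: docs.map((doc) => ({ "@search.action": "mergeOrUpload", ...doc })),
    }),
  });
  if (!response.ok) {
    throw new Error(`Azure Search returned ${response.status}`);
  }
}
```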