SKS Menu Scraper is a tool that automatically fetches and parses canteen menu information, such as dish names, portion sizes, and prices. The project saves the scraped data to a database and provides a RESTful API for accessing menu items.
The SKS Menu Scraper also includes a wrapper around an external API that tracks the number of active canteen users.
The API is available at https://sks-api.topwr.solvro.pl. The following endpoints are currently available:
/api/v1/meals
```shell
curl -X GET https://sks-api.topwr.solvro.pl/api/v1/meals
```

```json
[
  {
    "id": 85,
    "name": "Napój z soku jabłkowo-wiśniowego",
    "category": "DRINK",
    "size": "200ml",
    "price": "2.50",
    "createdAt": "2024-11-08T11:53:38.644+00:00",
    "updatedAt": "2024-11-08T11:53:38.644+00:00"
  },
  {
    "id": 84,
    "name": "Ziemniaki z koperkiem",
    "category": "SIDE_DISH",
    "size": "250g",
    "price": "4.50",
    "createdAt": "2024-11-08T11:53:38.644+00:00",
    "updatedAt": "2024-11-08T11:53:38.644+00:00"
  },
  {
    "id": 82,
    "name": "Pałki drobiowe w ciescie crazy",
    "category": "MEAT_DISH",
    "size": "250g",
    "price": "15.00",
    "createdAt": "2024-11-08T11:53:38.642+00:00",
    "updatedAt": "2024-11-08T11:53:38.642+00:00"
  }
]
```
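The fields in the sample response above map naturally onto a small client-side type. The sketch below is our own (the `Meal` interface and `mealsByCategory` helper are not part of the project); it assumes only the field names and types visible in the sample, including the detail that `price` is serialized as a string:

```typescript
// Shape of one item in the /api/v1/meals response (as seen in the sample above).
interface Meal {
  id: number;
  name: string;
  category: string; // e.g. "DRINK", "SIDE_DISH", "MEAT_DISH"
  size: string;     // portion size with unit, e.g. "200ml" or "250g"
  price: string;    // price serialized as a string, e.g. "2.50"
  createdAt: string;
  updatedAt: string;
}

// Parse a raw JSON body and return the meals of one category,
// cheapest first. `price` is a string, so it must be converted
// before numeric comparison.
function mealsByCategory(body: string, category: string): Meal[] {
  const meals: Meal[] = JSON.parse(body);
  return meals
    .filter((m) => m.category === category)
    .sort((a, b) => parseFloat(a.price) - parseFloat(b.price));
}
```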
/api/v1/sks-users/current
```shell
curl -X GET https://sks-api.topwr.solvro.pl/api/v1/sks-users/current
```

```json
{
  "activeUsers": 1,
  "movingAverage21": 49,
  "externalTimestamp": "2024-11-11T13:40:00.000+00:00",
  "createdAt": "2024-11-10T23:00:00.116+00:00",
  "updatedAt": "2024-11-11T13:42:01.017+00:00",
  "trend": "STABLE",
  "isResultRecent": true,
  "nextUpdateTimestamp": "2024-11-11T13:47:31.017+00:00"
}
```
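The `nextUpdateTimestamp` field lets a client decide how long to wait before polling again. A minimal sketch, assuming the field names from the sample response (the `SksUsersCurrent` type and `msUntilNextUpdate` helper are our own, not part of the API):

```typescript
// Shape of the /api/v1/sks-users/current response (from the sample above).
interface SksUsersCurrent {
  activeUsers: number;
  movingAverage21: number;
  externalTimestamp: string;
  createdAt: string;
  updatedAt: string;
  trend: string; // e.g. "STABLE"
  isResultRecent: boolean;
  nextUpdateTimestamp: string;
}

// Milliseconds to wait before polling again, based on the
// server-provided nextUpdateTimestamp (clamped to never be negative).
function msUntilNextUpdate(res: SksUsersCurrent, now: Date = new Date()): number {
  return Math.max(0, Date.parse(res.nextUpdateTimestamp) - now.getTime());
}
```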
/api/v1/sks-users/today
```shell
curl -X GET https://sks-api.topwr.solvro.pl/api/v1/sks-users/today
```

```json
[
  {
    "activeUsers": 1,
    "movingAverage21": 49,
    "externalTimestamp": "2024-11-11T13:40:00.000+00:00",
    "createdAt": "2024-11-10T23:00:00.116+00:00",
    "updatedAt": "2024-11-11T13:42:01.017+00:00"
  },
  "{...}"
]
```
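Since this endpoint returns a list of samples for the whole day, a client can aggregate them, for example to find the day's peak occupancy. A small sketch under the same assumptions as above (the `SksUsersSample` type and `peakActiveUsers` helper are hypothetical, not part of the API):

```typescript
// One sample from the /api/v1/sks-users/today response.
interface SksUsersSample {
  activeUsers: number;
  movingAverage21: number;
  externalTimestamp: string;
  createdAt: string;
  updatedAt: string;
}

// Highest activeUsers value seen today (0 for an empty list).
function peakActiveUsers(samples: SksUsersSample[]): number {
  return samples.reduce((max, s) => Math.max(max, s.activeUsers), 0);
}
```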
Clone the repository:
```shell
git clone https://github.com/Solvro/backend-topwr-sks.git
cd backend-topwr-sks
```
Install the required dependencies:
```shell
npm install
```
Set up the PostgreSQL database: configure the environment variables in a .env file, using the .env.example template, with your PostgreSQL credentials and database name.
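For orientation, a .env might look roughly like the sketch below. These variable names follow common AdonisJS/Lucid conventions and are an assumption; the authoritative list of variables is in the project's .env.example.

```
# Hypothetical sketch — check .env.example for the real variable names.
DB_HOST=127.0.0.1
DB_PORT=5432
DB_USER=postgres
DB_PASSWORD=secret
DB_DATABASE=sks
```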
Run migrations to create the database schema:
```shell
node ace migration:run
```
Run the scheduler for the scraper:
```shell
node ace scheduler:run
# or
node ace scheduler:work
```
Alternatively, run the scraping scripts once, individually:
```shell
node ace scrape:menu
node ace scrape:users
```
Start the development server:
```shell
npm run dev
```
Access the data using:
```shell
curl -X GET http://localhost:3333/api/v1/meals
```
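From application code, the same endpoints can be queried with the built-in `fetch` of Node 18+. A minimal sketch; the `SKS_API_URL` variable and the `apiUrl`/`getMeals` helpers are our own, not part of the project:

```typescript
// Base URL is configurable; default to the local dev server.
const BASE_URL = process.env.SKS_API_URL ?? "http://localhost:3333";

// Build the full URL for a given API path.
function apiUrl(path: string): string {
  return `${BASE_URL}/api/v1/${path}`;
}

// Fetch and decode the meals list (Node 18+ ships a global fetch).
async function getMeals(): Promise<unknown[]> {
  const res = await fetch(apiUrl("meals"));
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return (await res.json()) as unknown[];
}
```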