MIT License
![Job Tracker Screen Shot](images/screenshot.png)

Frontend hosted on S3 and backend on EC2.

Job Tracker App

A full-stack job tracking application built with React (frontend) and Django (backend). This application helps users manage and track their job applications.

Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Features
  5. Roadmap
  6. Contact

About The Project

The Job Tracker App is a comprehensive, full-stack application designed to help users efficiently manage and track their job applications. Built with a React frontend and a Django backend, the app simplifies the process of organizing and monitoring job applications by providing an intuitive interface and essential tools for creating, updating, deleting, and organizing job listings.

The application originally included a web scraping feature to automatically fetch job details from job platforms, but this feature does not work on some sites because they block scraping requests. The app still provides a smooth user experience through manual entry and bulk import options.

This project is designed for job seekers who want a straightforward yet powerful tool to manage their job search, with a focus on a user-friendly interface and modern web technologies.

(back to top)

Built With

React · TypeScript · Django · Django REST Framework · PostgreSQL · Docker · Selenium · BeautifulSoup4

(back to top)

Getting Started

To get a local copy up and running, follow these steps.

Prerequisites

Before you begin, ensure you have met the following requirements:

For Web Scraping Feature

Installation

  1. Clone the repo

    git clone https://github.com/YiSamYan/Job-Tracker-App.git
  2. Configure Environment Variables
     a. Configure the .env.docker file to fit your setup, as well as the .env file in the \backend\backend folder
     b. Add OPEN_AI_KEY to .env.docker to use the scraping tool

  3. Set up your venv

    python -m venv venv
    venv\Scripts\activate

To deactivate, just run: deactivate

  4. Set Up the Frontend (locally)
     a. Install frontend dependencies and start the dev server

    npm install
    npm start
  5. Set Up the Backend (locally)
     a. Install backend dependencies, run migrations, and start the server

    pip install -r requirements.txt
    python manage.py makemigrations
    python manage.py migrate
    python manage.py runserver
  6. Run the Cypress tests

    cd frontend
    npx cypress run
  7. Run the Application with Docker
     a. Build and start the containers

    docker-compose up --build

(back to top)

Features

Jobs can be added manually or bulk-imported as a JSON list, for example:

[
  {
    "title": "Software Engineer",
    "company": "Tech Solutions",
    "status": "applied",
    "description": "Develop and maintain software applications.",
    "requirements": "3+ years experience with JavaScript and Python."
  },
  {
    "title": "Backend Developer",
    "company": "Innovative Corp",
    "status": "interviewing",
    "description": "Responsible for server-side development.",
    "requirements": "Experience with Node.js and Express."
  },
  { ... }
]
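Entries in the bulk-import list can be sanity-checked before saving. A minimal sketch of such a check, assuming the required fields from the JobForm (Title, Company, Status, Description) — the helper name and exact rules are illustrative, not part of the app:

```python
import json

# Fields the JobForm marks as required; "requirements" is optional.
REQUIRED_FIELDS = {"title", "company", "status", "description"}

def validate_jobs(raw: str) -> list[dict]:
    """Parse a bulk-import JSON string and verify each job entry."""
    jobs = json.loads(raw)
    if not isinstance(jobs, list):
        raise ValueError("Bulk import must be a JSON list of job objects")
    for i, job in enumerate(jobs):
        missing = REQUIRED_FIELDS - job.keys()
        if missing:
            raise ValueError(f"Job {i} is missing fields: {sorted(missing)}")
    return jobs

sample = '''[
  {"title": "Software Engineer", "company": "Tech Solutions",
   "status": "applied", "description": "Develop and maintain software."}
]'''
jobs = validate_jobs(sample)
```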

(back to top)

Usage

JobList — view all saved jobs; jobs can be edited, deleted, searched, and filtered.

JobForm — enter job information into the fields as desired (Title, Company, Status, and Description are required). There is also an option to import a provided list of jobs.

Web Scraping Feature — automatically pulls job details (title, company, description) from a job posting URL. (This will not work on sites that block scraping requests.)

  1. Selenium renders the dynamic content of the job posting page.
  2. BeautifulSoup parses the rendered HTML and extracts the relevant job information.
  3. The extracted text is passed to ChatGPT to help organize the information.
  4. The response from ChatGPT is used to populate the corresponding form inputs.
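The parsing step can be sketched as follows. The HTML structure and CSS selectors here are illustrative assumptions (real job sites vary); the Selenium rendering and ChatGPT steps are omitted, so a static sample stands in for the rendered page:

```python
from bs4 import BeautifulSoup

# In the app this HTML would come from Selenium's driver.page_source
# after the page finishes rendering; here we use a static sample.
rendered_html = """
<html><body>
  <h1 class="job-title">Software Engineer</h1>
  <div class="company-name">Example Corp</div>
  <div class="job-description">We build developer tools.</div>
</body></html>
"""

def extract_job(html: str) -> dict:
    """Parse rendered HTML and pull out job fields (selectors are assumptions)."""
    soup = BeautifulSoup(html, "html.parser")
    def pick(sel: str) -> str:
        node = soup.select_one(sel)
        return node.get_text(strip=True) if node else ""
    return {
        "title": pick(".job-title"),
        "company": pick(".company-name"),
        "description": pick(".job-description"),
    }

job = extract_job(rendered_html)
```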

To use the web scraping feature:

  1. Enter a valid job posting URL from Indeed or LinkedIn.
  2. Click "Scrape Job Details" to autofill job fields.
  3. Submit the form to save the job to the database.

Example POST Request for Scraping (API Endpoint)

You can also test the scraping feature using Postman or cURL.

POST request to /api/scrape-job/:

{
  "url": "https://yisamyan.github.io/Job-Tracker-Scraping/"
}

Response:

{
  "title": "Software Engineer",
  "company": "Example Corp",
  "description": "We are looking for a software engineer with 3+ years of experience...",
  "requirements": "",
  "created_at": "09/06/2009 14:53:31",
  "updated_at": "10/09/2009 18:43:11"
}
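The same request can be sent from Python using only the standard library. A minimal sketch — the host and port below assume a local dev server and should be adjusted to your deployment:

```python
import json
import urllib.request

# Assumed local dev server address; adjust to your deployment.
API_URL = "http://localhost:8000/api/scrape-job/"

def build_scrape_request(job_url: str) -> urllib.request.Request:
    """Build a JSON POST request for the /api/scrape-job/ endpoint."""
    payload = json.dumps({"url": job_url}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_scrape_request("https://yisamyan.github.io/Job-Tracker-Scraping/")
# With the backend running:
# with urllib.request.urlopen(req) as resp:
#     job = json.load(resp)
```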

(back to top)

Roadmap

(back to top)

Contact

LinkedIn

Yi Yan - yi.sam.yan@gmail.com

Project Link: https://github.com/YiSamYan/Job-Tracker-App

(back to top)