APOLLO-1: Online Toxicity Detection

Python 3.7+ TensorFlow 2.1 License: MIT


Read more about the project in our blog.

Online Toxicity Detection

Project Apollo consists of a series of projects aimed at applying Deep Learning to various applications. This work presents the first of them, APOLLO-1: an application that detects toxicity in online conversations.



How to run

  1. Clone the repo: git clone https://github.com/kumar-shridhar/Online-Toxicity-Detection.git
  2. Make sure you have Anaconda installed. If not, check here.
  3. Install all the requirements from the conda YAML file: conda env create -f environment_{os}.yml, where {os} is either Windows or Linux.
  4. Download the saved model from here.
  5. Unzip the model and save it in the Online-Toxicity-Detection/apollo/inference folder.
  6. Run the commands:
    • cd Online-Toxicity-Detection
    • python apollo/Frontend/app.py
  7. Go to the link shown in the console, provide a YouTube URL, and adjust the sensitivity and the number of comments. The results are displayed as a chart.
  8. You can export a .csv file of the final results by clicking on export results.
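The steps above can be condensed into a single shell session. This is a sketch, not a verified script: the environment name `apollo` is an assumption (check the `name:` field in the YAML), and the model must still be downloaded manually from the link in step 4.

```shell
# Clone the repository and enter it
git clone https://github.com/kumar-shridhar/Online-Toxicity-Detection.git
cd Online-Toxicity-Detection

# Create the conda environment from the YAML file
# (use environment_Windows.yml on Windows)
conda env create -f environment_Linux.yml
conda activate apollo   # environment name is an assumption; check the YAML

# Unzip the downloaded saved model into apollo/inference/
# (download link in step 4; archive name here is illustrative)
# unzip model.zip -d apollo/inference/

# Launch the frontend, then open the URL printed to the console
python apollo/Frontend/app.py
```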

Contact

Feel free to contact the authors in case of any issues.

Naveed Akram, Ritu Yadav, Venkatesh Iyer, Sadique Adnan Siddiqui, Ashutosh Mishra, Kumar Shridhar