# Agri Doctor: Agronomy Advice API

## Project Description
This project develops robust APIs for Agri Doctor, an agronomy application. The Agri Doctor API suite is designed to support Indian farmers by providing tailored agricultural advice using public data and AI/ML models. It assists with disease prediction, seed and fertilizer recommendations, nutrition advice, and yield-improvement insights, helping farmers make data-driven decisions for better crop health and decreased losses.
Key Features:
- **Crop and Disease Prediction endpoint**: predicts crop disease from the crop type and the image uploaded by the farmer.
- **Personalized Agri-Advice API**: tailored recommendations based on crop type, crop disease, and current weather conditions.
## Prerequisites

Before running the project, ensure you have the following tools installed:

- Python 3 and pip
- Ollama (see the Ollama Setup section below)
- Postman (optional, for testing the APIs)
## Installation

### Steps
1. Clone the repository:

   ```bash
   git clone https://github.com/Protean-Samagra/Agri-Doctor
   cd Agri-Doctor
   ```
2. Install the required dependencies (`flask`, `tensorflow`, and `requests` are also needed by the model loading and API calls described below):

   ```bash
   pip install flask flask_restful flask_swagger_ui
   pip install tensorflow requests
   ```
3. Update the necessary URLs in the code:

   - Swagger UI URL:

     ```python
     API_URL = 'http://127.0.0.1:5000/swagger.json'  # Localhost example
     ```

   - Weather API key: update your `apikey` in `params` and `params2`:

     ```python
     params2 = {'apikey': 'your_weather_api_key'}
     params = {'apikey': 'your_weather_api_key'}
     ```

   - Machine learning API URL: update the URL for the ML service:

     ```python
     url = "http://localhost:11434/api/generate"  # Update if necessary
     ```

   - Machine learning model path: update the location where the model is stored. It currently points to `C:/Users/ITBS/Desktop/disease_detection.keras`; change it to wherever your `disease_detection.keras` file is stored.
## Methodology

1. **Setting up the Flask application**
   - Start by creating a Flask app, which acts as a web server to handle the different HTTP requests.
   - Set up Swagger UI to provide interactive documentation for your API using Flask-RESTful and flask_swagger_ui.
2. **City API**
   - The /city endpoint calls an external weather service API (AccuWeather) to retrieve information about the city (in this case, 'Bangalore').
   - The city_url points to the API for fetching city data based on the provided city name.
   - You pass an API key and the city name as parameters (params2), then extract the location_key from the response JSON, which is necessary for retrieving weather information.
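The city lookup can be sketched as below; the City Search URL and the `q` parameter follow AccuWeather's public Locations API, and `get_location_key` is a hypothetical helper name:

```python
# Sketch of the /city lookup step described above (helper names are assumptions).
import requests

city_url = "http://dataservice.accuweather.com/locations/v1/cities/search"

def build_city_params(apikey, city):
    # AccuWeather expects the key as `apikey` and the city name as `q`
    return {'apikey': apikey, 'q': city}

def get_location_key(apikey, city='Bangalore'):
    resp = requests.get(city_url, params=build_city_params(apikey, city))
    resp.raise_for_status()
    # The first match carries the location key needed by the weather API
    return resp.json()[0]['Key']
```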
3. **Weather API**
   - The /weather endpoint uses the location_key from the city API to build the resource URL for the weather API.
   - It calls the AccuWeather API again, using location_key to get real-time weather data, including conditions such as weather type, temperature, and units.
   - This weather data is converted from JSON to a string and stored in the weather_prompt variable; this prompt is passed on later for further processing.
4. **Machine learning model integration**
   - The /processing endpoint allows the user to upload an image file representing the crop, along with a language preference.
   - The uploaded image is processed using preprocess_image, which resizes and normalizes the image to make it suitable for the ML model.
   - A pre-trained TensorFlow model (loaded from your local machine) is used to predict the crop's disease based on the image.
   - The model predicts the disease by outputting a class index that maps into a list of diseases (disease_class_names). The predicted class name is split into crop_type and disease_type.
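The preprocessing and class-name split can be sketched as follows; the input size and the `Crop___Disease` label format (PlantVillage-style) are assumptions, not taken from the repository:

```python
# Sketch of the image preprocessing and prediction post-processing.
import numpy as np

IMG_SIZE = (224, 224)  # assumed model input size

def preprocess_image(pixels):
    # Normalise pixel values to [0, 1] and add a batch dimension
    arr = np.asarray(pixels, dtype=np.float32) / 255.0
    return np.expand_dims(arr, axis=0)

def split_prediction(class_name):
    # disease_class_names entries are assumed to look like "Tomato___Late_blight"
    crop_type, disease_type = class_name.split('___')
    return crop_type, disease_type

# The model itself would be loaded and queried roughly like this:
# model = tf.keras.models.load_model("C:/Users/ITBS/Desktop/disease_detection.keras")
# class_index = int(np.argmax(model.predict(batch)))
# crop_type, disease_type = split_prediction(disease_class_names[class_index])
```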
5. **Creating a prompt**
   - The weather_prompt (containing the weather information), together with the predicted crop_type and disease_type, is used to generate a prompt.
   - The prompt is designed to simulate asking a plant pathologist for advice, including recommendations for fertilizer, chemicals, organic treatments, and practices tailored to the weather conditions.
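An illustrative version of the prompt construction; the exact wording in app.py may differ, and `build_prompt` is a hypothetical helper name:

```python
# Illustrative prompt builder combining the prediction and the weather data.
def build_prompt(crop_type, disease_type, weather_prompt):
    return (
        f"Act as a plant pathologist. My {crop_type} crop shows {disease_type}. "
        f"Current weather conditions: {weather_prompt}. "
        "Recommend a fertilizer, chemical and organic treatments, and farming "
        "practices tailored to these weather conditions."
    )
```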
6. **Passing the prompt to Ollama**
   - You send the generated prompt to an AI model running on the Ollama server via a POST request.
   - Various models can be used: in English, the best responses were observed with "llama3.1"; for Indic languages, with "qwen2:1.5b".
   - The response from the model contains the professional advice based on the crop and disease condition, which is returned to the user.
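The Ollama call can be sketched as below; the `/api/generate` endpoint, the `model`/`prompt`/`stream` fields, and the `"response"` key are standard Ollama API shapes, while `ask_ollama` is a hypothetical helper name:

```python
# Sketch of the POST request to the local Ollama server.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="llama3.1"):
    # stream=False makes Ollama return a single JSON object
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt, model="llama3.1"):
    resp = requests.post(OLLAMA_URL, json=build_payload(prompt, model))
    resp.raise_for_status()
    # The generated advice is carried in the "response" field
    return resp.json()["response"]
```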
## Machine Learning Model

- Run the model notebook (disease_detection.ipynb).
- Save it as a .keras file, or use the previously saved file.
- Use the local path of the .keras file in the app.py file.
## Usage

1. Start the Flask application:

   ```bash
   flask run
   ```

2. Test the APIs in Postman.

3. Access the Swagger UI for API documentation at:

   ```
   http://127.0.0.1:5000/swagger
   ```
## Ollama Setup

1. Install Ollama by following the official installation instructions.

2. Pull the required model:

   ```bash
   ollama pull qwen2:1.5b
   ```

3. If the default port is busy, set a different one:

   ```bash
   set OLLAMA_HOST=127.0.0.1:11435
   ```

4. Start the Ollama server:

   ```bash
   ollama serve
   ```

5. Ensure the Flask application points to the correct Ollama server port (change the port here if you changed OLLAMA_HOST):

   ```python
   url = "http://localhost:11434/api/generate"
   ```