miguelgrinberg / Flask-SocketIO-Chat

A simple chat application that demonstrates how to structure a Flask-SocketIO application.
http://blog.miguelgrinberg.com/post/easy-websockets-with-flask-and-gevent
MIT License

Socket handlers not being triggered even after I've imported them #37

Closed Curiouspaul1 closed 1 month ago

Curiouspaul1 commented 2 years ago

Hi @miguelgrinberg, hope you're doing great! Paul here (from the geoalchemy issue on flask-migrate a few weeks ago, big fan!). I am having some issues getting my socketio handlers to work. I noticed that a common cause you've addressed in the past is "not importing them into your blueprint", so I have made sure to do that. My project structure is a lot like the one in this chat example, only I have more blueprints inside a parent folder. It's like this:

projectfolder/
   - core/
       | - __init__.py 
       | - bookings (blueprint)/
          | - __init__.py
          | - events.py
          | - views.py
       | - other blueprint
       | - other blueprint
   - wsgi.py

Here's what I have in the bookings blueprint's __init__.py file:

# flake8: noqa

from flask import Blueprint

bookings = Blueprint(
    "bookings", __name__, url_prefix="/bookings",
    template_folder='templates', static_folder='static'
)

from . import views, events

I have imported the events as you can see here.

In the bookings events.py file I have added the following handlers:

from core import socketio
from flask_socketio import send, emit
from flask import request
from core import db
from models.utils import Socket
from tasks.push_booking_to_queue import pbq
from extensions import redis_
import pygeohash as pgh

@socketio.on('location_update', namespace='/artisan')
def update_location(data):
    # update artisan location on redis
    redis_.geoadd(
        name="artisan_pos",
        values=(data['lat'], data['lon'], data['artisan_id'])
    )

#  option one
@socketio.on('order_updates')
def get_updates(data):
    # add client to special room
    room = request.sid
    # continually query celery for updates
    while True:
        task = pbq.AsyncResult(data['task_id'])
        if task.state == 'SUCCESS':
            result = task.info.get('match_profile')
            send(result, to=room)
            break

@socketio.on('accept_offer')
def offer(data):
    room = request.sid
    # notify client of update
    # order =
    emit('message', data, to=room)
    print(data)

@socketio.on('connect')
def connect():
    emit('welcome', 'welcome!', broadcast=True)
    print('someone connected')

Here I anticipate that at least the connect handler should work, but none of the handlers are being triggered.
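
An aside on the order_updates handler above: a tight while True loop never yields control, so under eventlet or gevent it can monopolize the server once it fires. A minimal sketch of the same polling idea that yields between checks, reusing the socketio, request, send and pbq names from the snippet, would be:

# Sketch only: same polling logic as get_updates above, but yielding
# between checks so the async worker is not blocked by a busy loop.
@socketio.on('order_updates')
def get_updates(data):
    room = request.sid
    while True:
        task = pbq.AsyncResult(data['task_id'])
        if task.state == 'SUCCESS':
            send(task.info.get('match_profile'), to=room)
            break
        socketio.sleep(1)  # cooperative yield instead of spinning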

Here's the app factory in core/__init__.py

from flask import Flask
from config import config_options, DevConfig
from flask_sqlalchemy import SQLAlchemy
from flask_marshmallow import Marshmallow
from flask_migrate import Migrate
from flask_socketio import SocketIO
from celery import Celery
from flask_cors import CORS

# instantiate extensions
db, ma = SQLAlchemy(), Marshmallow()
socketio = SocketIO()
migrate = Migrate(include_schemas=True)
celery = Celery(__name__, broker=DevConfig.CELERY_BROKER_URL)
cors = CORS()

#  app factory
def create_app(config_name):
    app = Flask(__name__)

    # configure application
    app.config.from_object(config_options[config_name])

    # link extensions to app instance
    db.init_app(app)
    ma.init_app(app)
    migrate.init_app(app, db)
    celery.conf.update(app.config)
    cors.init_app(app)

    # register blueprints
    from .bookings import bookings
    from .payments import payments
    from .ratings import ratings
    from .security import security
    from .user import user

    app.register_blueprint(bookings)
    app.register_blueprint(payments)
    app.register_blueprint(ratings)
    app.register_blueprint(security)
    app.register_blueprint(user, url_prefix='/user')

    # socketio.init_app(app, async_mode="eventlet", engineio_logger=True)
    socketio.init_app(app)
    return app

And finally the wsgi.py file which runs the application:

# flake8: noqa

from json import load
from core import create_app, socketio
from dotenv import load_dotenv
from models.user_models import *
from models.base import *
from models.documents import *
from models.address import *
from models.bookings import *
from models.ratings import *
from models.location import *
from models.payments import *
from models.utils import *
import os

load_dotenv()

app = create_app(os.getenv('FLASK_CONFIG') or 'default')

@app.shell_context_processor
def make_shell_context():
    return dict(app=app, role=Role)

if __name__ == "__main__":
    # socket.run(app, host="0.0.0.0", port=5000)
    socketio.run(app)

I don't know if this is a consequence of putting all my blueprints in a package, because I see that your chat app example only has one blueprint, which is on the same level as the chat.py that runs the application. Hopefully this is descriptive enough; I look forward to hearing from you, thank you!

miguelgrinberg commented 2 years ago

See the troubleshooting section of the Flask-SocketIO docs to learn how to enable detailed logs. The logs might have some clues about the problem; I can't really tell from the code alone.
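
For reference, one way to turn those logs on, assuming the socketio instance from the app factory above, is to pass the two documented logger flags when initializing the extension:

# Sketch: enable verbose Socket.IO and Engine.IO logging while debugging.
socketio.init_app(app, logger=True, engineio_logger=True)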

Curiouspaul1 commented 2 years ago

Will do just that and let you know how it goes, thanks a lot.

Curiouspaul1 commented 2 years ago

[screenshot: server log with detailed logging enabled]

Okay, so I just did that. It turns out I had actually enabled this before, but I turned it off. I think the logs show that my client is able to connect, but for some reason the handlers don't trigger (not sure).

miguelgrinberg commented 2 years ago

There is no indication of problems in this log. I think the client isn't sending anything.

Curiouspaul1 commented 2 years ago

I can verify that the client works, because I tried interacting with your chat app example using the same client (Postman) and it worked fine.

miguelgrinberg commented 2 years ago

All I can tell you is that if the client was sending anything, then the log would show it. Even if it is complete garbage, the log would show that the client is sending. Your log does not show any indication of data received.

Curiouspaul1 commented 2 years ago

Hmmm, I see. That's troubling; I'll have to do some more digging.

Curiouspaul1 commented 2 years ago

if the client was sending anything, then the log would show it

What about the connect event? That should be triggered automatically, since the client can at least connect; the handler for "connect" should fire even before anything is sent by the client. What do you think?

miguelgrinberg commented 2 years ago

The connect event is the one that reads Received packet MESSAGE data 0. There are a few in your log. You can see that the server responds with a sid value.

Curiouspaul1 commented 2 years ago

Hi Miguel, hope you're doing great. I wanted to give you an update on the issue I'm having. I decided to build a mirror of the application progressively, instead of building the whole thing at once (this mostly involved copying and pasting files), and to test each part as I added it to the new mirror. It started out well, and eventually, right after I imported some celery tasks into the blueprint, the handlers stopped working, and I was able to reproduce this behavior. I took a step backward and realised that it worked when I removed the celery task imports from the views and events files. I don't know why this might be happening, but I'm at least glad I was able to figure out what was going on. I'll show my folder structure again, along with how I defined these celery tasks and how I used them in the blueprint, so you can see them; perhaps I made a blunder with the way I used them.

projectfolder/
   - core/
       | - __init__.py 
       | - bookings (blueprint)/
          | - __init__.py
          | - events.py
          | - views.py
       | - other blueprint
       | - other blueprint
   - tasks/
     | - __init__.py
     | - task1.py
     | - task2.py
   - wsgi.py

This is just an addition to the folder structure I sent at the beginning. It's mostly the same; I just omitted some unnecessary files and subfolders, but now that the tasks folder is important I am updating it so you can get the full context.

In tasks/__init__.py:

#!/usr/bin/env python

# flake8: noqa

import os
from core import celery, create_app, socketio

app = create_app(os.getenv('FLASK_CONFIG') or 'default')
app.app_context().push()

from .second_task import stask

In task1.py (this is an alias for the actual file name, btw):

from . import celery
from extensions import redis_

@celery.task(bind=True, name='push_booking_to_queue.pbq')
def pbq(self, booking_details):
    # does some stuff
    print('yes')
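
For the order_updates handler shown earlier to find task.info.get('match_profile') once the task reaches SUCCESS, pbq would presumably return a dict with that key (for a succeeded Celery task, AsyncResult.info is the task's return value). A sketch with a made-up payload:

# Sketch only, illustrative payload:
@celery.task(bind=True, name='push_booking_to_queue.pbq')
def pbq(self, booking_details):
    # ... match the booking against available artisans ...
    return {'match_profile': {'booking_id': booking_details.get('booking_id')}}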

So back in the bookings blueprint (in its views.py and events.py) I import these tasks like so:

In bookings/views.py:

from tasks.task1 import pbq

@bookings.route('/', methods=['POST'])
# @login_required
# @permission_required(Permission.service_request)
def create_booking():
    data = request.get_json(force=True)
    new_order = Booking(params=data)
    # new_order.user = current_user
    db.session.add(new_order)
    db.session.commit()
    data['booking_id'] = new_order.booking_id
    init_task = pbq.apply_async([data])
    return {
        'status': 'success',
        'msg': 'booking created successfully',
        'data': {
            'task_id': init_task.id
        }
    }

In bookings/events.py:

from tasks.task1 import pbq

@socketio.on('order_updates')
def get_updates(data):
    # add client to special room
    room = request.sid
    # continually query celery for updates
    while True:
        task = pbq.AsyncResult(data['task_id'])
        if task.state == 'SUCCESS':
            result = task.info.get('match_profile')
            send(result, to=room)
            break

Perhaps this is coming from the app context that celery is pushing, or something (I have no idea, tbh).
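
One thing the snippets above do show: importing anything from the tasks package runs tasks/__init__.py, which calls create_app() (and therefore socketio.init_app()) a second time and pushes an app context, purely as an import side effect. Whether or not that turns out to be the cause, a sketch of keeping the worker-only bootstrap out of the package import, using a hypothetical celery_worker.py entry point, might look like this:

# Sketch, not the author's code: tasks/__init__.py keeps only side-effect-free
# imports, and the worker bootstrap moves to its own entry point.

# tasks/__init__.py
# (empty, or just:  from .task1 import pbq)

# celery_worker.py -- run only by the Celery worker,
# e.g.  celery -A celery_worker.celery worker
import os
from core import celery, create_app

app = create_app(os.getenv('FLASK_CONFIG') or 'default')
app.app_context().push()

With that layout, task1.py would import celery directly from core (from core import celery) rather than from the tasks package.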

miguelgrinberg commented 2 years ago

@Curiouspaul1 I'm going to repeat my observation from before. The client isn't sending the events that you are expecting. I don't see why the tasks can influence what the client sends, but that is what you should be looking for, as the problem is not in the server.

Please go ahead and review the log screenshot that you pasted above once again. Note the line near the top that reads:

received event "joined" from ...

This is proof that your application is able to receive events. This event is sent by the application in this repository. At the time you captured the log, you must have had an instance of this application running in your browser, and it was trying to connect.

Then look at the two connections that follow in your log. Both connect successfully, but they don't send any events.

Curiouspaul1 commented 2 years ago

Hi Miguel, hope you're doing well. I think you misunderstood me, possibly because I didn't add the new logs from my last test to my previous message, but what I'm trying to tell you is that the application works whenever I remove the import from the tasks module in my events file. For more context, here's the behavior of the application as I have come to see it:

[screenshot]

from tasks.push_booking_to_queue import pbq

Note: the tasks package contains more than one module, and importing any of these modules inside the events module results in the same behavior.

[screenshot]

So I can assure you that this behavior is caused by importing these tasks. What could be the cause of this behavior? Is it because another app instance exists inside the tasks folder?

miguelgrinberg commented 2 years ago

So I can assure you that this behavior is caused by importing these tasks. What could be the cause of this behavior? Is it because another app instance exists inside the tasks folder?

I have no way to know, sorry. Here I have to believe you, because you haven't provided enough logs/information to confirm that this statement is accurate. Assuming you are correct, then the only way to find out is to debug the module that you are importing.
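
In that spirit, a minimal way to exercise just the suspect import outside the web app, using a hypothetical debug_import.py at the project root, is:

# debug_import.py -- sketch only. Importing the task here reproduces whatever
# tasks/__init__.py does at import time (create_app(), app_context().push())
# without the web server in the way; temporary print() calls at the top of
# tasks/__init__.py will show when those side effects run.
from tasks.task1 import pbq

print('imported pbq:', pbq)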