instructure / canvas-lms

The open LMS by Instructure, Inc.
https://github.com/instructure/canvas-lms/wiki
GNU Affero General Public License v3.0

Error during step 3 of LTI 1.3 launch "The launch has either expired or already been consumed" #2050

Open drscottlobo opened 2 years ago

drscottlobo commented 2 years ago

Summary:

I'm receiving an error at step 3 of the LTI 1.3 launch flow with my tool running on an Express server. Canvas is running at http://localhost:3000 and my LTI tool is running with a self-signed certificate at https://localhost:8000. The OIDC login successfully reaches the tool, and I then redirect back to my local Canvas instance at http://localhost:3000/api/lti/authorize_redirect with the authentication request parameters. Instead of receiving an id_token (JWT) back from Canvas, my auth route gets a JSON response with the error 'The launch has either expired or already been consumed'. I'm having trouble figuring out what's going wrong, as that error is not very informative. Has anyone encountered this one?
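For context, this is roughly what my tool expects at step 3, i.e. at the redirect_uri that Canvas posts the result back to. It's a simplified sketch, assuming Express with express-session and the jsonwebtoken package; `router` and the /lti3/auth path correspond to the login route shown further down.

```javascript
// Simplified sketch of the tool-side redirect_uri handler (step 3).
// Assumes Express + express-session; `jsonwebtoken` is only used to decode here.
const jwt = require('jsonwebtoken');

router.post('/lti3/auth', async (req, res) => {
    // Canvas is asked for response_mode=form_post, so the result arrives as a POST body.
    const { id_token, state } = req.body;

    // The state must match the value generated during the OIDC login step (see the route below).
    if (!req.session.login_response || state !== req.session.login_response.state) {
        return res.status(401).send('state mismatch');
    }

    // Decode the launch claims; signature verification against Canvas's JWKS is a separate step.
    const decoded = jwt.decode(id_token, { complete: true });
    return res.json(decoded ? decoded.payload : null);
});
```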

Additional notes:

Here is my oidc route:

```javascript
// Requires express, express-session (for req.session), uuid, and Node's url module.
const express = require('express');
const url = require('url');
const { v4: uuidv4 } = require('uuid');

const router = express.Router();

// isValidLogin is my own validation helper, defined elsewhere.
router.post('/openid', async (req, res) => {
    // Save the OIDC login initiation request from Canvas.
    req.session.login_request = req.body;
    const errors = isValidLogin(req);

    if (errors.length === 0) {
        //// TODO check for platform in DB
        const response = {
            scope: 'openid',
            response_type: 'id_token',
            client_id: req.body.client_id,
            redirect_uri: 'https://localhost:8000/lti3/auth', // local LTI app
            login_hint: req.body.login_hint,
            state: uuidv4(),
            response_mode: 'form_post',
            nonce: uuidv4(),
            prompt: 'none',
        };
        if (req.body.hasOwnProperty('lti_message_hint')) {
            response.lti_message_hint = req.body.lti_message_hint;
        }
        // Keep the state/nonce so they can be checked when the id_token comes back.
        req.session.login_response = response;
        const redirectUrl = url.format({
            pathname: 'http://localhost:3000/api/lti/authorize_redirect', // local Canvas instance
            query: response,
        });
        return res.redirect(redirectUrl);
    }

    // Reject invalid login initiation requests.
    return res.status(400).json({ errors });
});
```

And here is the JSON for the JWK on my LTI's side:

{
  "keys": [
    {
      "kty": "RSA",
      "kid": "TTO0gkzQ2naFR-pB96xtlWUm8hCztcQBm-X-Krm0F8E",
      "use": "sig",
      "alg": "RS256",
      "e": "AQAB",
      "n": "pTfYEdOhIwgBQrF-Y6CkhoLr7YrJU4Od7jfFAFa1pvq-wSXQXNFza6Tn5AapKYQ47DRr9aRVd4OlK7LCyFqupBm02yhVctR0sqyb9YXLlh8pjkqyP_wMQGk7mdZjA2fUiNjuD5FtyxMXUjdCA-SzuHq42CWUF66vh8Ca0M4tG73j03GWJnyJ_QscxSOes3OGXrMbkdUf49kACB32VYHRIVsIbWKbxosYV5vaqOyNFRJNPbTaRh5AnvJ3TYzPCVb1jnWVriuibKs75c0_R21eckPWxacpDsTawyNVZpT6UhEdF3EbqAaUXI8pORp4roZkTtvat69SgyngRtDbGieopw"
    }
  ]
}
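For comparison, a JWK like this can be produced from an RSA key pair with Node's built-in crypto module. This is just an illustrative sketch; the PEM path and kid below are placeholders, not what my app actually uses.

```javascript
// Sketch: export a public RSA key as a JWK like the one above (Node 16+).
// The PEM path and kid are placeholders for illustration.
const crypto = require('crypto');
const fs = require('fs');

const publicKey = crypto.createPublicKey(fs.readFileSync('./keys/public.pem'));
const jwk = publicKey.export({ format: 'jwk' }); // -> { kty: 'RSA', n: '...', e: 'AQAB' }

// These extra fields (use/alg/kid) match the JWK shown above.
const jwks = {
    keys: [{ ...jwk, use: 'sig', alg: 'RS256', kid: 'example-key-id' }],
};
console.log(JSON.stringify(jwks, null, 2));
```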

Also, the local instance of Canvas doesn't seem to use the same format for its keys as canvas.instructure.com does:

{
  "keys": [
    { "kty": "RSA", "e": "AQAB", "n": "uX1MpfEMQCBUMcj0sBYI-iFaG5Nodp3C6OlN8uY60fa5zSBd83-iIL3n_qzZ8VCluuTLfB7rrV_tiX727XIEqQ", "kid": "2018-05-18T22:33:20Z" },
    { "kty": "RSA", "e": "AQAB", "n": "uX1MpfEMQCBUMcj0sBYI-iFaG5Nodp3C6OlN8uY60fa5zSBd83-iIL3n_qzZ8VCluuTLfB7rrV_tiX727XIEqQ", "kid": "2018-06-18T22:33:20Z" },
    { "kty": "RSA", "e": "AQAB", "n": "uX1MpfEMQCBUMcj0sBYI-iFaG5Nodp3C6OlN8uY60fa5zSBd83-iIL3n_qzZ8VCluuTLfB7rrV_tiX727XIEqQ", "kid": "2018-07-18T22:33:20Z" }
  ]
}
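Either way, the tool eventually has to verify the id_token signature against whichever Canvas key matches the kid in the token header. Here is a sketch of that check using the jose package; the /api/lti/security/jwks path and the issuer value are assumptions based on a default install, so treat both as things to verify.

```javascript
// Sketch: verify the Canvas id_token against the platform JWKS with the `jose` package.
// The JWKS URL and issuer are assumptions for a default local install; adjust as needed.
const { createRemoteJWKSet, jwtVerify } = require('jose');

const canvasKeySet = createRemoteJWKSet(
    new URL('http://localhost:3000/api/lti/security/jwks')
);

async function verifyIdToken(idToken, clientId) {
    // jwtVerify selects the JWKS entry whose kid matches the id_token header.
    const { payload } = await jwtVerify(idToken, canvasKeySet, {
        audience: clientId, // the tool's client_id from the developer key
        issuer: 'https://canvas.instructure.com', // Canvas's usual LTI issuer; check config/security.yml locally
    });
    return payload;
}
```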

Local instance call stack:

```
[CANVAS_ERRORS] EXCEPTION LOG Lti::IMS::InvalidLaunch (no payload found in cache):
/Users/lobomacpro/Sites/canvas/app/controllers/lti/ims/authentication_controller.rb:155:in `cached_launch_with_nonce'
/Users/lobomacpro/Sites/canvas/app/controllers/lti/ims/authentication_controller.rb:163:in `id_token'
/Users/lobomacpro/Sites/canvas/app/controllers/lti/ims/authentication_controller.rb:118:in `validate_launch_eligibility!'
/Users/lobomacpro/Sites/canvas/app/controllers/lti/ims/authentication_controller.rb:77:in `authorize'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_controller/metal/basic_implicit_render.rb:6:in `send_action'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/abstract_controller/base.rb:195:in `process_action'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_controller/metal/rendering.rb:30:in `process_action'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/abstract_controller/callbacks.rb:42:in `block in process_action'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/activesupport-6.0.4.1/lib/active_support/callbacks.rb:112:in `block in run_callbacks'
/Users/lobomacpro/Sites/canvas/app/controllers/application_controller.rb:2816:in `manage_live_events_context'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/activesupport-6.0.4.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/Users/lobomacpro/Sites/canvas/app/controllers/application_controller.rb:650:in `compute_http_cost'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/activesupport-6.0.4.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/inst_statsd-2.2.0/lib/inst_statsd/statsd.rb:97:in `batch'
/Users/lobomacpro/Sites/canvas/app/controllers/application_controller.rb:645:in `batch_statsd'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/activesupport-6.0.4.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/Users/lobomacpro/Sites/canvas/lib/temp_cache.rb:30:in `enable'
/Users/lobomacpro/Sites/canvas/app/controllers/application_controller.rb:641:in `enable_request_cache'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/activesupport-6.0.4.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/Users/lobomacpro/Sites/canvas/app/controllers/application_controller.rb:635:in `set_locale'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/activesupport-6.0.4.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/activesupport-6.0.4.1/lib/active_support/callbacks.rb:139:in `run_callbacks'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/abstract_controller/callbacks.rb:41:in `process_action'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_controller/metal/rescue.rb:22:in `process_action'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_controller/metal/instrumentation.rb:33:in `block in process_action'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/activesupport-6.0.4.1/lib/active_support/notifications.rb:180:in `block in instrument'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/activesupport-6.0.4.1/lib/active_support/notifications/instrumenter.rb:24:in `instrument'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/activesupport-6.0.4.1/lib/active_support/notifications.rb:180:in `instrument'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_controller/metal/instrumentation.rb:32:in `process_action'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_controller/metal/params_wrapper.rb:245:in `process_action'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/activerecord-6.0.4.1/lib/active_record/railties/controller_runtime.rb:27:in `process_action'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/abstract_controller/base.rb:136:in `process'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionview-6.0.4.1/lib/action_view/rendering.rb:39:in `process'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_controller/metal.rb:190:in `dispatch'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_controller/metal.rb:254:in `dispatch'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/routing/route_set.rb:50:in `dispatch'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/routing/route_set.rb:33:in `serve'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/journey/router.rb:49:in `block in serve'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/journey/router.rb:32:in `each'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/journey/router.rb:32:in `serve'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/routing/route_set.rb:834:in `call'
/Users/lobomacpro/Sites/canvas/gems/plugins/respondus_soap_endpoint/lib/respondus_soap_endpoint/middleware.rb:80:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/rack-brotli-1.0.0/lib/rack/brotli/deflater.rb:34:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/rack-2.2.3/lib/rack/deflater.rb:44:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/rack-2.2.3/lib/rack/chunked.rb:98:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/rack-2.2.3/lib/rack/tempfile_reaper.rb:15:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/rack-2.2.3/lib/rack/etag.rb:27:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/rack-2.2.3/lib/rack/conditional_get.rb:27:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/rack-2.2.3/lib/rack/head.rb:12:in `call'
/Users/lobomacpro/Sites/canvas/app/middleware/request_throttle.rb:66:in `block in call'
/Users/lobomacpro/Sites/canvas/app/middleware/request_throttle.rb:324:in `reserve_capacity'
/Users/lobomacpro/Sites/canvas/app/middleware/request_throttle.rb:61:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/http/content_security_policy.rb:18:in `call'
/Users/lobomacpro/Sites/canvas/gems/request_context/lib/request_context/session.rb:28:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/rack-2.2.3/lib/rack/session/abstract/id.rb:266:in `context'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/rack-2.2.3/lib/rack/session/abstract/id.rb:260:in `call'
/Users/lobomacpro/Sites/canvas/app/middleware/load_account.rb:49:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/middleware/cookies.rb:654:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/middleware/callbacks.rb:27:in `block in call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/activesupport-6.0.4.1/lib/active_support/callbacks.rb:101:in `run_callbacks'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/middleware/callbacks.rb:26:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/middleware/actionable_exceptions.rb:18:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/middleware/debug_exceptions.rb:32:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/middleware/show_exceptions.rb:33:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/railties-6.0.4.1/lib/rails/rack/logger.rb:37:in `call_app'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/railties-6.0.4.1/lib/rails/rack/logger.rb:28:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/middleware/remote_ip.rb:81:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/request_store-1.5.0/lib/request_store/middleware.rb:19:in `call'
/Users/lobomacpro/Sites/canvas/gems/request_context/lib/request_context/generator.rb:71:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/rack-2.2.3/lib/rack/method_override.rb:24:in `call'
/Users/lobomacpro/Sites/canvas/app/middleware/prevent_non_multipart_parse.rb:35:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/rack-2.2.3/lib/rack/runtime.rb:22:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/activesupport-6.0.4.1/lib/active_support/cache/strategy/local_cache_middleware.rb:29:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/middleware/executor.rb:14:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/middleware/static.rb:126:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/rack-2.2.3/lib/rack/sendfile.rb:110:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/actionpack-6.0.4.1/lib/action_dispatch/middleware/host_authorization.rb:82:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/railties-6.0.4.1/lib/rails/engine.rb:527:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/railties-6.0.4.1/lib/rails/railtie.rb:190:in `public_send'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/railties-6.0.4.1/lib/rails/railtie.rb:190:in `method_missing'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/puma-5.2.2/lib/puma/configuration.rb:248:in `call'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/puma-5.2.2/lib/puma/request.rb:76:in `block in handle_request'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/puma-5.2.2/lib/puma/thread_pool.rb:337:in `with_force_shutdown'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/puma-5.2.2/lib/puma/request.rb:75:in `handle_request'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/puma-5.2.2/lib/puma/server.rb:431:in `process_client'
/Users/lobomacpro/.rvm/gems/ruby-2.7.2/gems/puma-5.2.2/lib/puma/thread_pool.rb:145:in `block in spawn_thread'
CONTEXT: {:extra=>{:request_id=>"a40065a4-2d98-4b35-a380-370f6266e549", :session_id=>"9a9134713243a868e38a0553d1b40007", :meta_headers=>""}, :tags=>{:type=>"lti"}}
```

Would really appreciate some help!

Thanks.

ymaerschalck commented 2 years ago

Any word on this? We are experiencing the same error message.

drscottlobo commented 2 years ago

Hi ymaerschalck! No word on exactly what the issue is, but my suspicion is that it has something to do with the quickstart version not having the Redis cache set up; I couldn't get it working even after following the directions here: https://github.com/instructure/canvas-lms/wiki/Quick-Start#performance-tweaks
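For anyone else hitting this on the quickstart, those directions essentially come down to pointing Canvas at Redis for caching. Here is a rough sketch of what I understand the two config files to look like; the file names and keys are from that wiki page, so double-check them against the current version (and note that this variant still didn't fix the error for me):

```yaml
# config/cache_store.yml (sketch based on the Quick Start wiki; verify against the current page)
development:
  cache_store: redis_store

# config/redis.yml (sketch; assumes a Redis server running on localhost)
development:
  servers:
    - redis://localhost
```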

But this might help: I was able to get the Docker version of Canvas up and running using these instructions, with a few additional tweaks from a more experienced developer, and that version of Canvas did NOT throw the error(!). I'm guessing that's because it has the Redis cache (and who knows what else) already configured correctly to send back the correct JWTs.

I'm on a Mac so here's the changes I had to make for the Docker setup to work:

  1. Download the Canvas repo and Docker

  2. Install Docker Desktop and run it, and make sure Preferences > Resources > Memory is set to 8 GB or higher

  3. Try running the setup script from the Terminal. If it doesn't recognize Docker (it couldn't find it on mine), this hack from my friend worked: comment out line 52 of docker_desktop_setup.sh like this: # exit 1

  4. Now you should be able to successfully run the setup script from the repo directory: ./script/docker_dev_setup.sh (note: it takes forever to build)

  5. Then run 'mutagen-compose up -d' and visit http://canvas.docker in your browser; this also took a good 30 seconds to a minute to load the page the first time. But this version should work for LTI 1.3 launches!

Hope this works for you; it was a serious week of hair-pulling trying to get it to work!

ymaerschalck commented 2 years ago

Thx @drscottlobo, I'll give this a try. Did you also try this on a remote server, or only on your local Mac? We're developing a platform that acts as an external app in Canvas, but for demo purposes we want to be able to run a version of Canvas on a remote server. For more info: I'm running the version on the stable/2022-01-19 branch because I had some other issues while installing. In one of the issues here I found that this branch didn't have that problem, while the prod branch does. Do you know which branch you are using? Is it a specific release or just the master branch?

drscottlobo commented 2 years ago

I believe I am running the master branch from less than a month ago.

I did get a Bitnami package working on Google Cloud through Bitnami's cloud integrations, but I think the package on Bitnami is built from the quickstart (and was pretty old), so if I remember correctly it had the exact same error I was getting originally.

ymaerschalck commented 2 years ago

OK, I also tried the Bitnami package on AWS and got it running successfully. Comparing it to our own hosted version, we were able to correct everything there by installing the latest version of Redis and fixing the Redis and cache config files for Canvas. The only remaining error was then fixed by implementing the fix from https://community.canvaslms.com/t5/Canvas-Developers-Group/Canvas-LTI-1-3-Error-Unknown-Key-Type/m-p/390285/highlight/true#M6345

Now we've got a working LTI 1.3 implementation.

jtong99 commented 2 years ago

@drscottlobo Hi, I am facing this issue as well. Have you found any solutions? Thanks.

mrg-ml commented 2 years ago

> fixing the redis and cache config files for canvas

How would someone determine the proper config files? Shouldn't Bitnami be configuring this?