Closed dmolin closed 2 years ago
I should add:
My LTI code is running on my machine (localhost, local IP 192.168.1.25:3010) and Moodle is running in a VirtualBox VM (IP 192.168.1.239). They're on the same network and can see each other without any issue. I'm accessing both via http (so, no https).
This is the tool as configured in Moodle
Another interesting bit that might help in diagnosing what's happening here:
I tried changing the Public key type in Moodle from "Keyset URL" to "RSA Key" and pasting in the public key generated by the LTI client (the one I get by calling platform.platformPublicKey(); that is, the one in the form "---BEGIN PUBLIC KEY---.....").
I tried this because I wanted to rule out any issue with the LMS trying to reach out to the tool for the keys.
The result is that the LMS modal doesn't hang anymore, but it errors out immediately, this time with:
"Exception - Expired token"
But I've no idea how to interpret this. I can see that the JSON Web Token is passed to the Moodle library (I printed it out), but it looks like the public key I provide as "RSA Key" is not the correct one to decode the JWT that is sent back to Moodle. Am I setting that information wrong?
I also tried pasting the JWT that ltijs sends back as the DeepLinkingResponse into jwt.io. I can clearly see the JWT is decoded correctly, but it indeed reports the signature as invalid when I paste in the public key in PEM format, so my take is that the public key I'm providing to the LMS is the wrong one, but I've no idea where to get the right one from at this point.
So, I've made some progress:
It seems that what I have to paste into the "Public key" field (in Moodle, at least) is the public key returned by my tool endpoint /lti/keys for the platform I'm using (my /lti/keys returns an array of keys, one per configured platform).
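Since /lti/keys returns an array of keys, a quick way to grab the one for a given platform is to match on its kid. A tiny illustrative helper (assuming the standard JWKS shape, `{ "keys": [...] }`):

```javascript
// Pick the JWK whose "kid" matches from a JWKS document
// (either { keys: [...] } or a bare array of keys).
function findKeyByKid(keyset, kid) {
  const keys = Array.isArray(keyset) ? keyset : keyset.keys || [];
  return keys.find((k) => k.kid === kid) || null;
}
```

The kid to look for is the one in the header of the JWT your tool sends back (visible when you decode it on jwt.io).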
So, if I paste that public key (in the form { "kty": "RSA", "kid": "....", "alg": "RS256", "use": "sig" }) I get a different result. No more "Expired token"!
Though, this is what I get now:
The code raising the error is in both JWK.php and JWT.php in Moodle, and it seems to be where the RSA key is checked or where the message signature is verified:
(JWK.php)

```php
switch ($jwk['kty']) {
    case 'RSA':
        if (!empty($jwk['d'])) {
            throw new UnexpectedValueException('RSA private keys are not supported');
        }
        if (!isset($jwk['n']) || !isset($jwk['e'])) {
            throw new UnexpectedValueException('RSA keys must contain values for both "n" and "e"');
        }

        $pem = self::createPemFromModulusAndExponent($jwk['n'], $jwk['e']);
        $publicKey = \openssl_pkey_get_public($pem);
        if (false === $publicKey) {
            throw new DomainException(
                'OpenSSL error: ' . \openssl_error_string()
            );
        }
        return new Key($publicKey, $jwk['alg']);
```
(JWT.php)

```php
/**
 * Verify a signature with the message, key and method. Not all methods
 * are symmetric, so we must have a separate verify and sign method.
 *
 * @param string $msg The original message (header and body)
 * @param string $signature The original signature
 * @param string|resource $key For HS*, a string key works. for RS*, must be a resource of an openssl public key
 * @param string $alg The algorithm
 *
 * @return bool
 *
 * @throws DomainException Invalid Algorithm, bad key, or OpenSSL failure
 */
private static function verify($msg, $signature, $key, $alg)
{
    if (empty(static::$supported_algs[$alg])) {
        throw new DomainException('Algorithm not supported');
    }

    list($function, $algorithm) = static::$supported_algs[$alg];
    switch ($function) {
        case 'openssl':
            $success = \openssl_verify($msg, $signature, $key, $algorithm);
            if ($success === 1) {
                return true;
            } elseif ($success === 0) {
                return false;
            }
            // returns 1 on success, 0 on failure, -1 on error.
            throw new DomainException(
                'OpenSSL error: ' . \openssl_error_string()
            );
        case 'sodium_crypto':
            if (!function_exists('sodium_crypto_sign_verify_detached')) {
                throw new DomainException('libsodium is not available');
            }
            try {
                // The last non-empty line is used as the key.
                $lines = array_filter(explode("\n", $key));
                $key = base64_decode(end($lines));
                return sodium_crypto_sign_verify_detached($signature, $msg, $key);
            } catch (Exception $e) {
                throw new DomainException($e->getMessage(), 0, $e);
            }
        case 'hash_hmac':
        default:
            $hash = \hash_hmac($algorithm, $msg, $key, true);
            return self::constantTimeEquals($signature, $hash);
    }
}
```
I tend to believe it's the one in JWT.php, since we're sending a message back to the LMS with an encoded JWT and the platform is presumably trying to "verify" that message (and failing to do so).
I have a very similar setup to yours (except I'm running Moodle in a Docker container) and it seems to be working for me. My deep link code is very similar to yours and I'm using Keyset URL as the public key type:
Does the JWT that's passed to http://192.168.1.239/mod/lti/contentitem_return.php?course=2&id=2&sesskey=v3sFBYMDCU validate on jwt.io? Also, are you sure your Moodle instance has access to http://192.168.1.25:3010?
Hey @johnnyoshika, thanks so much for your contribution! I'm really getting desperate here :(
So, the JWT passed back to the LMS to the contentitem_return.php page validates correctly on jwt.io using either the RSA key or the equivalent PEM format public key.
The weird thing is that if I use the public key URL (instead of the RSA key) the LMS "hangs" when I submit the form... from the look of it, it would seem the LMS is unable to access my keys, but that should be impossible: both the LMS and my LTI tool are on the same network!
I'm running Moodle in a VM (with a bridged network, so it's just one more IP on my local network): Moodle is running on 192.168.1.239 and my LTI tool on 192.168.1.25; the two machines can see each other (I also keep an ssh session open between them to check the logs).
The only thing that comes to mind is that my local LTI tool runs only on http (no https), but I cannot see how this would be a problem, since I'm also accessing Moodle (running in the VM) via http (so they share the same insecure protocol).
I think I already tried switching to https (using ngrok) with no difference in results...
That is very strange. I'm stumped as to why you're encountering this problem.
@johnnyoshika can you please confirm whether your lti.setup() code looks similar to mine? Do you also use the "dynReg" section? (Same question for the devMode and cookies sections.) Just wondering if there's any difference there that might affect the communication between the LMS and the tool further down the road...
Also, this is what I send back to the LMS (I'm just decoding the JWT sent back in the self-posted form):
Do you see anything missing/misplaced there that could trigger an error from the LMS?
@johnnyoshika OK, so I found what the problem was (and you were totally right)! Stupid me: my local dev laptop had its firewall configured and turned on (doh!), so it was blocking packets on the LTI tool's port. Now that I've whitelisted connections on that port too, the LMS doesn't block anymore. Though, as soon as the form is submitted, I'm back at square one with this error:
The token CANNOT be expired; the "exp" field in the payload is correctly set to a time in the future. I've also tried (for testing) changing the code in the ltijs library to set the expiration to 299 seconds (the allowed maximum is 300) and I still get the same exact error... :(
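One way to sanity-check this locally is to decode the payload and compare exp against a given clock. A small illustrative helper (the name is mine) makes the failure mode visible: a token that is valid against your clock can still be "expired" for a platform whose clock runs ahead:

```javascript
// Decode a JWT payload without verifying the signature and report
// whether its "exp" claim has passed for a given clock (in seconds).
function isTokenExpired(token, nowSeconds = Math.floor(Date.now() / 1000)) {
  const payload = JSON.parse(
    Buffer.from(token.split(".")[1], "base64url").toString("utf8")
  );
  return typeof payload.exp === "number" && payload.exp <= nowSeconds;
}
```

Passing nowSeconds shifted one hour forward simulates what a platform with a skewed clock would see.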
Found the issue: the Bitnami VM I downloaded for Moodle (https://bitnami.com/stack/moodle/virtual-machine) was configured with the wrong time AND timezone! It was consistently one hour ahead (and with no NTP service configured), so of course any payload I sent back ended up being detected as expired. After a bit of fiddling I now have it set up correctly and YES, my form is now submitted successfully! Thanks so much for the previous help and assistance @johnnyoshika!
One more question if I may:
I've noticed that when I launch the tool I've set up with deep linking, the launch payload I receive is lacking some information that I normally get with a (non-deep-linked) launch, namely:
Do you notice that too?
Edit: Ah, found it: that information is controlled by checkboxes in the "Privacy" section when configuring the external tool in the course.
I'll close this ticket. 99% of it was down to network/VM configuration issues, so there's really no issue anymore. Thanks again to @johnnyoshika for the great help (and for offering me a chance to discuss this with someone! Sometimes just voicing things aloud with someone else makes all the difference!)
@dmolin Glad to hear that you solved it!
@dmolin You opening this issue is a godsend for me, because I am running the moodle/bitnami container as well!
I have been trying to get into LTI and have also tried the example LTI tool from dmitry-viskov (https://github.com/dmitry-viskov/pylti1.3-django-example) in conjunction with moodle, but I have also run into this issue you had:
I have also tried to use the RSA Key option instead of the URL and entered the tool's public key into the field, but that only got me as far as the deep linking step: I could choose options provided by the tool (in the case of the previously linked example tool: game difficulties), but after choosing an option it just throws the above-mentioned error.
Have you done anything else besides switching to RSA Key and using the public key of the LTI tool? Am I right to assume that you indeed used the LTI tool's public key? Is the public key of the tool indeed what you get by calling platform.platformPublicKey()? I am just a little confused, since I thought the platform is the tool consumer (Moodle), not the tool provider.
Regarding these endpoints you used:
Are these provided by the tool? If so, where can I get them? Or do you have to implement those endpoints yourself? I was not able to get a whole lot out of the documentation*, the ltijs example server implementation, or the example client app. I am not even sure whether the ltijs-demo-server implementation is supposed to be an example LTI tool, or what ltijs-demo-client is for. In any case, I ran both, and clicking around in the UI did nothing for me, because the LTI key was not specified? (What is that anyway? I thought it was just a random string you use as a secret to sign with, so leaving the "LTI-KEY" string unchanged should work, no?)
*(I am pretty sure that's just my fault, so if you happen to know any other resources or the sections in question, I would highly appreciate it! Even better if you know any ltijs applications I can have a look at!!)
Edit: How were you able to change the time for the moodle/bitnami container, btw?
Any help from anyone is appreciated!
Hey @phiduo !
So, the good news is that ltijs actually works, and it works pretty well! I'm currently using the "Keyset URL" option (so I'm not pasting any RSA public key when configuring the tool in the LMS).
But first, a bit of context:
What you're doing here is configuring "your" application as an "LMS Tool Provider" (that is, the application you're building is the one "providing" some external functionality to the LMS you're integrating with, Moodle in this case). The LMS in this context acts as the "Tool Consumer" (or, as we refer to it, a "Platform").
The first step you follow when using LTIJS as a library is the lti.setup() function call. When doing this, you also provide a "SIGN KEY", which is nothing more than a sequence of characters that the LTIJS library will use to sign any packet exchanged with the LMS platform (my guess is that this key is used as a seed when internally generating the private and public keys for your tool, something LTIJS takes care of on your behalf).
Example:

```javascript
lti.setup("myverysecretkey000", {
  url: "DB-Connection-URL"
}, {
  // options
  appRoute: "/lti/launch",
  loginRoute: "/lti/login",
  keysetRoute: "/lti/keys",
  cookies: {
    secure: false, // set to true if the testing platform is in a different domain and https is being used
    sameSite: "None", // set to "None" if the testing platform is in a different domain and https is being used
  },
  devMode: true, // set to false when in production and using https
});
```
When you call the lti.deploy() function, LTIJS will automatically listen for incoming launch requests; the URLs on which LTIJS accepts incoming requests are the ones you specified in the lti.setup() call (appRoute, loginRoute, keysetRoute). You can choose whatever path you like for those URLs (personally I like keeping them all under a common path, like "/lti/..."). Those paths will then be used by LTIJS as the URL endpoints where the complex IMS launch protocol and message exchange take place.
When configuring your tool in the LMS (Administration -> Plugins -> External Tool configuration) you can either specify a Keyset URL or an RSA Key. The main difference is that if you configure an RSA Key, it's your responsibility (as a developer) to provide that information to the LMS administrator. Using the Keyset URL is really the best option, since that URL is automatically queried by the LMS whenever it needs to decode any incoming message from your tool (like the Deep Linking response you send back, if you use that functionality).
If you use the Keyset URL (recommended!), just provide the URL you chose and put into your lti.setup() call (the value you used for "keysetRoute"). That URL needs to be reachable from the LMS, and you can easily check that this is the case: if your tool is running on http://192.168.1.99:3000, you can just open a new browser tab, hit http://192.168.1.99:3000/lti/keys, and you should see the JSON response with the public key(s) that LTIJS has created for you. This is the URL the LMS will hit to retrieve them too.
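To go one step further than eyeballing the JSON in the browser, here's a rough shape-check for what the keyset endpoint should return (a sketch, assuming the standard JWKS form with a "keys" array; the function name is mine):

```javascript
// Return true if a /lti/keys response body parses as a usable JWKS:
// a non-empty "keys" array where every RSA entry carries "kid", "n" and "e".
function looksLikeJwks(body) {
  let jwks;
  try {
    jwks = JSON.parse(body);
  } catch {
    return false;
  }
  if (!Array.isArray(jwks.keys) || jwks.keys.length === 0) return false;
  return jwks.keys.every(
    (k) => k.kty !== "RSA" || Boolean(k.kid && k.n && k.e)
  );
}
```

Those are the fields Moodle's JWK.php needs (it throws if "n" or "e" is missing).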
If you use the RSA Key method instead, you need to paste the key corresponding to the platform you're using (from what I've seen, the correct one is the JSON key you get from hitting the /lti/keys endpoint on your tool's Express server). There should be no reason for you to use this method, though: the Keyset URL should just work; if it doesn't, you're likely having issues at the network level on your side (that was my case, because of my damn firewall!).
That took a bit, since I'm unfamiliar with Debian distros (I'm on Arch). I'm on holiday right now, so I don't have the VM with me, but if I remember correctly, these are the steps I followed:
Hopefully this should work. If for some reason it doesn't, just google "timedatectl Debian (or Ubuntu)" and I think you'll find info on how to use it. As a last resort, you can also run sudo timedatectl set-time
Hope this helps!
Hi @dmolin, I appreciate you going out of your way to give me this wonderful explanation! For the past few hours I have tried my best to get this thing running with your advice, and so far I have managed to output the initial (successful?) authentication request to the console. However, there seems to be a problem, as Moodle cannot find the localhost:3000 root endpoint where my LTI tool is. TLDR below!
I think I have figured out the cause of this issue, but it seems I have hit a wall once again. As far as I understand, the Moodle container, as well as the associated MariaDB container, are running in their own Docker network, with MariaDB on its default port 3306 and Moodle on the default ports 8080 (and 443 for https). Since my LTI tool is running on the host's localhost and Moodle is running in its own Docker network, internally, Moodle would not be able to connect to the LTI tool, right?
My setup is as follows:

- Moodle: localhost:80 on the host's localhost (localhost:8080 in its Docker network)
- MariaDB: localhost:3306 in its Docker network
- LTI tool: localhost:3000 on the host's localhost

So I believe the reason why Moodle cannot proceed any further is that it cannot find localhost:3000 - and that's because internally, for Moodle, nothing is running on localhost:3000.
The reason why content selection worked through deeplinking in the example I mentioned previously is that the browser can actually access the host network, while moodle cannot.
I have therefore thought about two solutions, of which the first one does obviously not work:
(I took your screenshot again, to better illustrate where to apply that).
But that obviously does not work since docker commands don't work in moodle.
```yaml
version: '2'
services:
  mariadb:
    image: docker.io/bitnami/mariadb:10.6
    environment:
      # ALLOW_EMPTY_PASSWORD is recommended only for development.
      - ALLOW_EMPTY_PASSWORD=yes
      - MARIADB_USER=bn_moodle
      - MARIADB_DATABASE=bitnami_moodle
      - MARIADB_CHARACTER_SET=utf8mb4
      - MARIADB_COLLATE=utf8mb4_unicode_ci
    volumes:
      - 'mariadb_data:/bitnami/mariadb'
    network_mode: "host" # <---- To make it run in the host's network
  moodle:
    image: docker.io/bitnami/moodle:4
    environment:
      - MOODLE_DATABASE_HOST=mariadb
      - MOODLE_DATABASE_PORT_NUMBER=3306
      - MOODLE_DATABASE_USER=bn_moodle
      - MOODLE_DATABASE_NAME=bitnami_moodle
      # ALLOW_EMPTY_PASSWORD is recommended only for development.
      - ALLOW_EMPTY_PASSWORD=yes
    volumes:
      - 'moodle_data:/bitnami/moodle'
      - 'moodledata_data:/bitnami/moodledata'
    depends_on:
      - mariadb
    network_mode: "host" # <---- To make it run in the host's network
volumes:
  mariadb_data:
    driver: local
  moodle_data:
    driver: local
  moodledata_data:
    driver: local
```
After those changes I just ran docker-compose up -d, and while both containers are running, I can only access MariaDB on localhost:3306, but not Moodle - neither on localhost:8080 nor on localhost:443. I am not sure why it is not running on those ports, or on which port it now runs (I had to remove the port mappings, btw, because network_mode and ports are mutually exclusive).
TLDR:
Moodle cannot access localhost:3000, where my tool is running. I assume it's because Moodle is running in its own Docker network and not in the host network. For Moodle, internally, there is nothing on localhost:3000, because the tool is running on the host network, outside of the Docker network.
I sure hope my assumptions are not too far-fetched. Have you been running the bitnami/moodle container inside the Docker network or in the host network? If you're running the Moodle container inside the Docker network, how did you manage to make the communication between the Docker network and the host network work? (take a shot every time you read "work", haha)
@dmolin Excellent explanation!
@phiduo Are you using the docker container settings from here? https://github.com/bitnami/bitnami-docker-moodle If so, I have the same setup as you. I'm pretty much using this docker-compose.yml verbatim but just an older version.
You're encountering the same network problem I had. Moodle inside the container was unable to reach the host (the LTI Tool Provider). This is also the same problem that @dmolin encountered, although I think @dmolin is running Moodle in a VM and not in a container. To solve my problem, I used ngrok to expose my LTI Tool Provider externally so anyone (including Moodle inside my container) can access it. In your case you need to expose the application running on port 3000, so something like this will do:
ngrok http 3000 --hostname my-lti-server.ngrok.io
Now you can use https://my-lti-server.ngrok.io everywhere you previously used localhost:3000 when registering your LTI tool with Moodle. You'll also want to upgrade ngrok to a paid plan so that you can reserve custom URLs; it would be a pain to have a dynamic URL that changes every time you run your application. You can do this in other ways, but ngrok just made it easy for me.
Once you get this all sorted, I highly recommend setting up dynamic registration, so registering your tool with Moodle is much easier. You'll essentially be able to paste https://my-lti-server.ngrok.io/register into this textbox, click Add LTI Advantage, and be done:
@johnnyoshika thanks for chiming in! I knew you were using Docker and that you were the right person to provide advice on this issue (and yes, as you correctly said, I've been running Moodle through the Bitnami VM - so in VirtualBox - so it was easier for me to make Moodle reach my own server, since the VM - in bridged mode - was just another node on my local network).
Ngrok should make things work for sure.
I haven't tried yet using dynamic registration @johnnyoshika, but I'll definitely give it a go!
Hi @johnnyoshika, I have actually come up with the same idea to use ngrok and everything seems to work now! I am kicking myself for trying to fix the LTI tools for the past few days - but it turns out it was just a connection issue! Thanks a lot for the suggestion nonetheless, it feels good to know that somebody else also came to the ngrok conclusion. I haven't gotten to it, but I will definitely try dynamic registration as well, appreciate it @johnnyoshika!
While I was typing this comment I noticed @dmolin has also tuned in, so while I have you knowledgeable guys here, I was hoping you could give me some advice on the following. Hopefully this is not too off-topic.
First a tiny bit of background: I am currently building a middleware that should connect a remotely hosted engine (just spitting out numbers) to an LMS. This middleware should implement the UI to use those values provided by the engine. This UI should then be "integrated" into the LMS via LTI.
LMS <--> (My Middleware) <--> engine
Since LTIJS is capable of running an Express server, would you recommend putting the middleware functionalities onto that LTIJS Express server that is already provided?
LMS <--> (LTIJS including My Middleware) <--> engine
Or would you recommend having the LTIJS Express server function separately?
LMS <--> LTIJS <--> My Middleware <--> engine
I have also read that LTIJS can run in serverless mode, so it can itself function as middleware*. Then I could theoretically do this (the opposite of 1.):
LMS <--> (My Middleware including LTIJS) <--> engine
*I hope that's what's meant by middleware here:
What approach do you think is the way to go?
@phiduo It depends on what your middleware is doing. I like to have all the LTI services (including running LTIJS) wrapped up in their own isolated middleware. This is also the approach that @Cvmcosta (the author of LTIJS) recommended to me. That way, the LTI middleware only has access to the LTIJS database. I described my architecture in these comments:
https://github.com/examind-ai/ltijs-firestore/issues/3#issuecomment-1198368085 https://github.com/examind-ai/ltijs-firestore/issues/3#issuecomment-1199955523
Although my comments are in the context of using Firestore as the database, the general idea applies to any database including MongoDB, which is the default database for LTIJS.
@phiduo So, the way I see it, you're still acting as a Tool Provider from the point of view of the LMS. You're just proxying the remote system you want to provide an interface to toward the LMS, but you're still the one performing the LTI "dance" with the LMS.
I second @johnnyoshika in having LTIJS do its own thing and manage its own endpoints and DB in isolation. In my scenario, as an example, our platform runs on MeteorJS and we use LTIJS to allow LMSes to access it; the end result is that our system runs on its own port with its own endpoints and, on top of that, we have LTIJS running its own Express server with its own LTI-only endpoints (and a separate DB).
I personally don't recommend running in serverless mode; I tried it and the experience was not pleasant. If you do, you'll have to take care of all the processing LTIJS automatically does when running its own Express server (like using a cookieParser() with an encryption key, to name one, and processing/validating the ltik token, etc.), so I prefer to have LTIJS do its own thing and then "route" requests toward my own platform if and when necessary.
You should be able to just have LTIJS do its job, launching your middleware as a Tool (with its own UI), and then have your code reach out to the remote system/engine you're proxying.
@johnnyoshika apologies for bothering you again, but I just wanted to ask if you've tried using deep linking with Canvas... Now, the code should be absolutely the same on "our" part (that is, ours as the Tool developer): I get the launch call with a message type of "LtiDeepLinkingRequest"; I present my own content selection window; I post my response to the LMS platform via the auto-submitting form generated by ltijs.
Everything works perfectly in Moodle, but it doesn't seem to be working with Canvas, and I'm not sure why.
When I submit my LtiDeepLinkingResponse to Canvas (it gets sent to http://192.168.1.249/courses/1/deep_linking_response?context_module_id=2&modal=true&placement=link_selection, where my local Canvas LMS is running), what I see is that Canvas responds with a "302" (redirect) to the "root" URL of the LMS (http://192.168.1.249:80 in my case) and I'm presented with a Canvas "login" page! This does not happen with Moodle: as soon as Moodle gets my reply, the selection window closes and Moodle acknowledges my response by properly configuring the LTI plugin. With Canvas, instead, the modal selection window remains open and I see the Canvas login page popping up...
This happens only in Canvas, with the same exact code that runs perfectly when interfacing with Moodle, so I do wonder if there's something I need to do when configuring the tool (the LTI Developer Key) in Canvas that I'm not doing. Does anyone here have any experience with deep linking + Canvas?
@Cvmcosta did you happen to try working with Deep Linking and Canvas?
For anyone who stumbles onto this issue: the problem seems to occur when using Blink-based browsers (Chrome, Chromium, Vivaldi, Edge) with self-hosted versions of Canvas. The browser is probably enforcing some kind of security policy and prevents Canvas from sending cookies that are relevant to the deep linking flow (making it fail as a result).
Using a browser like Firefox makes everything work.
I'm self-hosting a Canvas instance using this Bitnami GCP image: https://bitnami.com/stack/canvaslms/cloud/google
I'm not having any trouble with deep linking using a Blink-based browser. I did have to update this file, however:
/opt/bitnami/canvaslms/config/dynamic_settings.yml
I added this to the production section. I copied the values from the development section, which was already populated with the appropriate values:
```yaml
store:
  canvas:
    lti-keys:
      # these are all the same JWK but with different kid
      # to generate a new key, run the following in a Canvas console:
      #
      # key = OpenSSL::PKey::RSA.generate(2048)
      # key.public_key.to_jwk(kid: Time.now.utc.iso8601).to_json
      jwk-past.json: "{redacted}"
      jwk-present.json: "{redacted}"
      jwk-future.json: "{redacted}"
```
Without those changes, attempting to deep link would result in this error:
```json
{"errors":[{"message":"An error occurred.","error_code":"internal_server_error"}],"error_report_id":3}
```
Interesting: different Bitnami image, different issue. Thanks for letting me know, I might try that one then.
I've used both Canvas versions 2021.9.15-1 and 2022.7.20-142 and didn't have trouble with Blink-based browsers in either of them. I had to make the dynamic_settings.yml change in both, however.
@dmolin Same issue: working fine on Firefox, but it didn't work on Google Chrome. Used the AWS Bitnami setup. Have you found anything?
Yeah, Firefox seemed to be the only browser that worked, so I stuck with it. Chrome does indeed not work.
What's the problem with Chrome? Which LMS? We test with self hosted Bitnami Canvas instance running on GCP using LTIJS almost daily and we don't have any trouble with it. Our clients also use Canvas' cloud solution to use our LTI tool (which we heavily lean on LTIJS) and that works well as well.
> @dmolin Same issue working fine on Firefox. But didn't work on Google Chrome. Used AWS Bitnami setup. Have you found something?
@bhavintce If I remember correctly, my issue was only because I wasn't using secure https protocols on both ends. As far as I can remember, both my Canvas instance and my local environment (the "tool" I was implementing) had to be running on valid https endpoints; I had to use two separate tunneling tools at the time (ngrok and pktriot) to get that working, but then everything worked without issues.
> What's the problem with Chrome? Which LMS? We test with a self-hosted Bitnami Canvas instance running on GCP using LTIJS almost daily and we don't have any trouble with it. Our clients also use Canvas' cloud solution to use our LTI tool (which we heavily lean on LTIJS) and that works well as well.
I have created an external tool with deep linking using LTIJS. When I open this external tool in Firefox and click the submit button, the iframe closes and the URL and name are passed back; it works fine. But in the Google Chrome browser the iframe opens properly, yet when I click the submit button inside the iframe, the Canvas setup page opens instead.
For development purposes it's okay, but do you face the same issue in production mode on the Google Chrome browser or not?
I've written a little LTI tool client to test out ltijs.
So far I'm able to set up the tool, register the platform and perform a basic launch. So far so good.
The main problem I'm having is in handling deep linking: I want the LMS (Moodle or Canvas) to be able to query my tool for a content selection, but I can't seem to get that working. I see the deep linking flow starting correctly (my lti.onDeepLinking() gets called and I can show a selection page), but when I submit back the deep linking result, nothing happens; the LMS seems to "hang" for a bit and then errors out.
This is my code for the tool:
When I configure the tool in the LMS (Moodle in this case) and click the "Select content" button to start the deep linking flow, I correctly receive the "onDeepLinking()" callback and show the selection page:
When I click on the content I want to select, I POST to my own "/deeplink" endpoint (see the code snippet above), and that generates the self-submitting form that goes back to the LMS (it posts to http://192.168.1.239/mod/lti/contentitem_return.php?course=2&id=2&sesskey=v3sFBYMDCU, passing the encoded JWT param in the form data).
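For reference, the payload behind that self-submitting form is just an array of content items that ltijs signs into the JWT. A minimal sketch of such an item (the type/title/url/custom fields follow the IMS Deep Linking spec; the helper name is illustrative):

```javascript
// Build a minimal "ltiResourceLink" content item for an
// LtiDeepLinkingResponse, optionally carrying custom parameters.
function buildResourceLinkItem(title, url, custom = {}) {
  const item = { type: "ltiResourceLink", title, url };
  if (Object.keys(custom).length > 0) item.custom = custom;
  return item;
}
```

ltijs then wraps items like this into the signed JWT and generates the auto-submitting form for you.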
That's where things end... The Moodle selection modal remains white for a while (while the POST is still "pending" in the Network tab of my browser), and then after a bit I get this error:
Any advice on how to fix this, @Cvmcosta? Any help would be hugely appreciated; I've been stuck on this for a few days now and I'm starting to lose hope.