@technomancy just made very good suggestions! Thanks, everyone, for your patience in explaining things.
This is good; it works in Jeejah again. But I still ran into a problem: `M-x monroe RET` gives a buffer named `*monroe: nil*` rather than `*monroe: 7888*`, and the same happens with `M-x monroe RET localhost:7777`. I think this is because Jeejah does not create a `.nrepl-port` file.
I think it makes sense to read the port from the file, but there needs to be a way to identify the connections without that file; otherwise all connections that don't have a file will be confused with each other even though they are on different ports.
I've added a fix that checks whether `monroe-locate-running-nrepl-host` provides a port, and names the buffer accordingly. If it doesn't, the name falls back to `monroe` or `monroe-connection`.
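Roughly, the naming idea is something like this (just a sketch with a made-up helper name, not the actual code in the commit):

```elisp
;; Sketch only: name the REPL buffer from host and port when a port is
;; known, otherwise fall back to a plain name.  The helper name is
;; hypothetical, not something monroe itself defines.
(defun my-monroe-repl-buffer-name (host port)
  "Return a unique REPL buffer name for HOST and PORT.
PORT may be nil when nothing could be detected (e.g. no .nrepl-port)."
  (if port
      (format "*monroe: %s:%s*" (or host "localhost") port)
    "*monroe-connection*"))

;; (my-monroe-repl-buffer-name "localhost" "7777") ;=> "*monroe: localhost:7777*"
;; (my-monroe-repl-buffer-name nil nil)            ;=> "*monroe-connection*"
```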
Maybe we can fall back to project folder instead of host+port when .nrepl-port doesn't provide one?
> Maybe we can fall back to project folder instead of host+port when .nrepl-port doesn't provide one?
The problem with this approach is that it goes back to the initial PR commit: dependence on the project folder. The idea is to keep it agnostic of the project (its location or setup); reading .nrepl-port is just sugar so the user doesn't have to manually type the host and port.
Maybe I got lost in the comments (and sorry if I overlooked something), but what is wrong with this approach:
Now, the only way to make connections unique is the host and port combination. If the host or port is not specified, monroe already assumes (and should assume) localhost and 7888 as the default values.
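In other words, the .nrepl-port sugar amounts to something like this (a rough sketch with hypothetical names, assuming the file just contains the port number; not monroe's actual implementation):

```elisp
;; Sketch only: pick host and port, preferring a .nrepl-port file in the
;; current directory and falling back to monroe's assumed defaults.
(require 'subr-x)  ; for `string-trim'

(defun my-monroe-detect-host-and-port (&optional dir)
  "Return a \"host:port\" string for DIR (defaults to `default-directory').
Reads .nrepl-port when present, otherwise assumes localhost:7888."
  (let* ((dir (or dir default-directory))
         (port-file (expand-file-name ".nrepl-port" dir))
         (port (if (file-exists-p port-file)
                   (string-trim
                    (with-temp-buffer
                      (insert-file-contents port-file)
                      (buffer-string)))
                 "7888")))
    (concat "localhost:" port)))
```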
I understand the idea of keeping it project-agnostic. But how do you decide which of two connections to send your code to if you don't have the .nrepl-port?
It would be awful to have to type the correct host:port every time you want to send code. Can we attach buffers to the correct connection? Something like a list of buffers that you hook to some connection?
I thought and tested a lot, read your suggestions, and did my best to reach this: https://github.com/sanel/monroe/pull/33/commits/7e0e3816af739e643b26b1d54b3bb9ad541460a8
I've tested opening two `lein repl :headless` instances, and calling `monroe-nrepl-server-start` for two projects (to extract the port from the .nrepl-port file).
I think this is working now! Please help me test it!
Super stoked to see this - recently I've been working on multiple Clojure services at the same time, and the current one-connection limitation made it somewhat painful.
Just adding my 2c here (Sorry!) and a couple of questions:
I've created scratch buffer behavior, but for Clojure (you can see it here). Will there be a way of linking it to a particular monroe connection? Essentially this:
> It would be awful to have to type the correct host:port every time you want to send code. Can we attach buffers to the correct connection? Something like a list of buffers that you hook to some connection?
Also, I'm happy to contribute the scratch buffer to Monroe once the multiple connection support is merged in and if there's any interest.
Although I'm not sure if at that point we're rebuilding CIDER 😉
I just tested it with multiple REPLs without .nrepl-port and it looks great. Nice job!
> It would be awful to have to type the correct host:port every time you want to send code. Can we attach buffers to the correct connection? Something like a list of buffers that you hook to some connection?
I agree it's tricky to correlate them. One solution could be: the first time you run `monroe-eval-*` from a `monroe-interaction-mode` buffer which is not inside a directory containing a port file, you could ask the user which connection to use (ideally with completion that shows all open connections) and store it in a buffer-local variable to use next time.
But that could be done in a follow-up pull request since this one is already getting pretty long.
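Something along these lines, for example (a rough sketch with made-up names, assuming the connection buffers all share a `*monroe` prefix; not code from this PR):

```elisp
;; Sketch only: remember, per buffer, which monroe connection to use,
;; asking once with completion over the currently open connections.
(require 'seq)

(defvar-local my-monroe-buffer-connection nil
  "Name of the monroe REPL buffer this buffer sends its code to.")

(defun my-monroe-connection-buffers ()
  "Return the names of all live buffers that look like monroe connections."
  (seq-filter (lambda (name) (string-prefix-p "*monroe" name))
              (mapcar #'buffer-name (buffer-list))))

(defun my-monroe-choose-connection ()
  "Return the connection for the current buffer, prompting the first time."
  (or my-monroe-buffer-connection
      (setq my-monroe-buffer-connection
            (completing-read "Send code to connection: "
                             (my-monroe-connection-buffers)
                             nil t))))
```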
I just rebased all the commits into one.
Did you try opening a few with .nrepl-port too?
Just tried with emacs -Q and it seems to work properly on GNU Emacs 27.0.90.
@sanel can you try it again?
Thanks!
I tried this with simultaneous connections to Jeejah and Leiningen and it seems to work fine; nice work.
Is there anything more I can do to help with this?
LGTM :+1: I will merge this and test over the weekend.
Thank you @bbuccianti for your hard work, @technomancy for testing and all others for contributions. Especially for your patience :)
Seems to be working fine!
Maybe we can improve something, so let me know!