Closed jordwalke closed 10 years ago
On Saturday, 3 May 2014 at 09:59, Jordan W wrote:
I don't tend to see a lot of github repos that include the OPAM dependencies configuration directly in the source repo, which seems to make locally developing harder.
opam is a rather new thing and many people haven't realized yet and/or don't have the time to migrate to the interesting new workflows it allows. Unfortunately the development of tools like oasis2opam
will certainly slow down this migration even more.
Could someone kindly show me how to use OPAM as the way to locally develop many modules that depend on each other?

The easiest for now is for you to make a local repo with a dev-package (i.e. its `url` file is a git:// URL) for each of the modules you are developing. In the future all this should become much simpler, since it will be allowed (see https://github.com/ocaml/opam/pull/1335) to pin a package without it existing in any repo, using the OPAM metadata found in the pin source to create a package.
Best,
Daniel
Okay, thanks for the information. I'll explain my user experience and what prompted these questions.
I wanted to build an OCaml project. Here are the thoughts that went through my head.
And then I hit this uncomfortable wall where it seemed to be difficult for the following reasons.
`brew` (and I'm guessing `ocamlbrew`), `npm`, and many other package managers that I've used didn't make me feel like a ton of cruft would litter my filesystem. So maybe OPAM just needs to explain what's going on and where things are stored. How do we guarantee that packages clean up after themselves? (`npm` effectively guarantees that dependencies and artifacts are cleaned up, because it's difficult to even create system artifacts outside the `node_modules` directory in the first place.) This kind of made me apprehensive about using OPAM, so I'd like to learn more before I jump to conclusions.
I'm just curious, have the maintainers looked at npm? It's a totally simple package manager that actually has nothing to do with JS at all. It's become a hugely popular workflow, and already has support for corporate deployments etc. If not using some pieces of that infrastructure, have we looked at it for inspiration for OPAM at all? Whatever makes it easier for people to get started, I say!
Any thoughts here? Has anyone had any experience with CommonJS?
There's a lot to like about npm's mode of operation, but also good reasons why it's often not practical to use for OCaml (primarily due to the compilation burden of tracking all the dependencies carefully). However, it's possible to build the exact workflow you describe using OPAM trunk, and we should certainly do a better job of describing workflows more precisely. Here's an (untested) attempt to explain how to use OPAM in-place, with no files installed outside of the root:

1. Create a project directory and `cd` into it.
2. Set the `OPAMROOT` environment variable to `$PWD/.opam`. This will ensure that all OPAM state for that session is within the current subdirectory.
3. Clone `opam-repository` into a subdirectory as well, ensuring a stable package root. This is optional.
4. Run `opam init -a`, optionally specifying the `opam-repository`. This gives you a working package set, all installed within `.opam`.
5. Write an `opam` file describing your current project and its dependencies.
6. Run `opam pin .`, and this local `opam` file will create a new package in your universe.
7. Run `opam install --depends-only` and it should satisfy dependencies.
8. Prefix commands with `opam config exec`; it will ensure that the right `PATH` variables are set to use the local OPAM.

All of this can be automated and hidden from the user in a Makefile. However, for day-to-day use I find it easier to just use a global `~/.opam`, but I can definitely see how this would be a useful workflow as well. In particular, if you added `opam-admin make` into the mix, it would download local archives and guarantee that it works offline.
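The steps above can be sketched as a shell session. This is untested and matches the step names as written; flag spellings have changed across opam versions (for example, later opam spells the dependency flag `--deps-only`), and `myproject` is just a placeholder name:

```shell
# Self-contained OPAM sandbox: nothing is installed outside this directory
mkdir myproject && cd myproject

# Keep all OPAM state for this shell session under ./.opam
export OPAMROOT="$PWD/.opam"

# Initialise a package universe inside .opam (-a applies setup non-interactively)
opam init -a

# With an `opam` file describing the project in place:
opam pin .                     # register the local project as a package
opam install --depends-only    # install just its dependencies

# Run builds inside the sandboxed environment
opam config exec -- make
```

Since `OPAMROOT` is only exported for the session, nothing outside the project directory is touched; deleting the directory removes every trace of the sandbox.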
My workflow with OPAM:
@avsm It would be great to have a documented flow like this (maybe with some added helper opam commands built in to automate some of it). It might even be nice to encourage this as the default mode. I feel like all of this should be possible with very few keystrokes.
> Create a Makefile that sets the `OPAMROOT`
I find that strange for some reason. The package that I'm developing will eventually be shared with other people (via OPAM, hopefully). Other OPAM users will depend on it. It seems like this doesn't belong in my Makefile - rather, it should be an OPAM-wide setting to store all dependencies (or at least symlink them) from the project root. Would there be issues if someone else depends on my package (with my Makefile) and they *don't* set the OPAM root to be the project dir? (Not sure.) Either way, it doesn't seem like it should be a concern of *my* project; rather, it should be a concern of the command line tools.
Related question: If all deps are installed in the project dir, will it still use a global package cache so the project doesn't have to download redundant packages?
When starting to develop using OPAM, I feel like the very first thing anyone should do is make a `.opam` file that describes the dependencies their project wants to pull in. Most of the docs I've found that discuss this are in the "Creating a Package" section. For simplicity, should we encourage any other way? I feel like it should be right in the getting-started section. When I make a new project, I want to know that if I `git clone` it to any new computer, it will automatically pull in everything, even if it's just a toy project.
I hope this is received as constructive feedback. I would just like it to be extremely easy for people to try out/develop with OCaml so that a wider audience gets to appreciate OCaml. Mirroring the development flow/dependency manager according to one of the world's most popular tools seems like one of the easiest ways to make progress.
This is very useful feedback indeed, please keep it coming. Delayed replies are simply due to a combination of vacation and other deadlines for opam developers :)
Quick response: you can substitute the OPAMROOT with a careful use of switches, but certain things (like pins) may be global across switches (need to confirm that). You can set the OPAMSWITCH env variable in the Makefile to just build everything under a custom switch and not touch global packages in the default system switch.
Making an opam file in the project is the central workflow in Daniel's topkg infrastructure. See my avsm/ocaml-cohttp#new-build branch for an experiment to turn cohttp to using it. Once that works well, I'll be confident in writing down the workflow.
I'll have to give `topkg` a try. Thanks.
@jordwalke Sorry for not replying earlier. I wanted to thank you for the very useful input, which arrived just at the right moment, while we were precisely discussing the workflow.
Now nearing release 1.2, here is a preliminary document detailing the preferred workflow. Hopefully things should be much smoother than they were before. You'll need the trunk version of OPAM to try it.
Addressing two of your concerns above:
All packages are installed within `~/.opam/<switch>`, and I haven't seen packages mistakenly writing outside of it. So while it's possible that some packages may leave artifacts when uninstalled, filesystem clutter is still very restricted, and reinstalling the opam switch is enough to get rid of it, so there is no longer-term problem. We still have plans to improve on this, though.

Thanks again for the valuable input!
Thanks, I'd love to try out the new workflow.
I'm still finding OPAM's method of storing packages confusing. Here's what I've been dealing with tonight: OPAM failed to remove `js_of_ocaml`, so I went and force-removed it from `~/.opam/version/`. That didn't work, as `opam list` still shows that it's installed. How? Furthermore, `opam install js_of_ocaml` is supposed to install the "latest version" according to the documents, but it doesn't install the latest version (2.3) - it keeps my 1.4 version around. I thought a full removal would work, but as I've mentioned, the full removal doesn't seem to work either.
Supposedly, ocamlfind can get out of sync with OPAM, and you have to go force-deleting a module from both systems, but that really supports my stance that there shouldn't have to be a sophisticated program that finds all the possible places your libraries could exist on your file system (ocamlfind) - libraries should just be symlinked in the current directory of whatever particular project you are building. I believe `ocamlbuild` would easily work with that system, as it has a recursive mode. Also, everyone knows how to debug symlinks: you just follow them and stop when you see a broken link! I'd really like to see OPAM take inspiration from `npm` - which, contrary to popular belief, is not a JS module system - it's a resource installer.
I'm attaching the error message that started this whole thing:
I feel like there should be no reason why removal of OPAM packages should ever fail. And this isn't really that helpful of a message.
Oh, there's this file called "installed" - let's see if manually removing the "js_of_ocaml" entry there makes it think it's actually uninstalled. That worked! (But I'm not sure why)
Could OPAM be made to work completely based on file system links to make this kind of troubleshooting much more intuitive?
> I feel like there should be no reason why removal of OPAM packages should ever fail. And this isn't really that helpful of a message.
Indeed. This is a bug. Did you get this with opam master? What is the return code? All failure cases, error messages, etc. have been thoroughly reviewed since 1.1.
> OPAM failed to remove js_of_ocaml, so I went and force removed it from ~/.opam/version/. That didn't work as opam list still shows that it's installed. How?
I don't know of a package manager that can perfectly handle the user manually removing files from its installation; in case it gets out of sync, you can force OPAM to change its internal state without making modifications to the file system with `opam install|remove --fake`, but that's generally not recommended.
> Furthermore, opam install js_of_ocaml is supposed to install the "latest version" according to the documents, but it doesn't install the latest version (2.3) - it keeps my 1.4 version around.
You can simply ask to install the specific version with `opam install js_of_ocaml.2.3`. OPAM will otherwise attempt to get the highest version while optimising some other criteria, like the number of packages removed and the number of changes: there may be some incompatibility with your installation, in which case the above command should explain the cause.
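For instance (a sketch, using the versions mentioned in the discussion above):

```shell
# Ask for one exact version instead of letting the solver choose;
# if it cannot be satisfied, the reported conflict names the
# installed package that is holding the older version in place
opam install js_of_ocaml.2.3
```

If that command fails, the constraint it prints is the starting point for figuring out which dependency pins you to 1.4.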
> Could OPAM be made to work completely based on file system links to make this kind of troubleshooting much more intuitive?
I don't think this is a realistic change for the short term: opam handles a centralised installation (or several, using switches) that may contain the compiler, libraries, resources, binaries, editor modes, etc.; symlinking, besides not being very portable, is an operation local to a given project, and of quite different scope.
Also, OPAM was written to make packaging easy, so that adoption could be quick, and so it conformed to the most common way libs were used. The lack of sync between ocamlfind and OPAM, though, is indeed annoying, and rest assured that there are long-running goals to better integrate ocamlfind.
I think I'm one version behind 1.1. I'll upgrade and let you know if I run into any more issues. Good to hear the bugs are getting worked out!
> I don't know of a package manager that can perfectly handle the user manually removing files from its installation
This is exactly how `npm` works. The `npm install` command simply looks at the dependencies in your `./package.json` file and reconciles them with what is on your disk in your current directory's `./node_modules/`. By default, `npm install` only installs files locally within the current working directory, into `./node_modules`. If you want a "global installation" that pollutes your `PATH`, you have to go out of your way by adding the `-g` flag, which is usually only used to install applications, not libraries/dependencies.
Of course, `npm` might use symlinks to link to a cached package version somewhere instead of actually copying the code into `./node_modules/`, but that's an implementation detail/optimization. Deleting the symlink is just like deleting the actual folder contents. Build systems and module loaders don't differentiate between symlinks and actual directories. You can always delete a dependency by deleting a particular subdirectory/symlink such as `./node_modules/deleteThisDependency`. Then if you run `npm install` again, it simply installs only the packages that are missing from the file system (the one you just deleted).
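A minimal illustration of that reconciliation behaviour (a sketch; `myapp` and `lodash` are just example names):

```shell
cd myapp                      # any project with a package.json

npm install                   # install everything package.json lists
rm -rf node_modules/lodash    # hand-delete one installed dependency
npm install                   # only the missing package is reinstalled
```

There is no hidden registry of "what is installed" to get out of sync: the filesystem under `./node_modules` *is* the installation state.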
Keep in mind, `npm` (the service, which is notoriously flaky) is very different from `npm` (the file dependency installer), and I think OPAM would benefit greatly from mimicking the workflow. (Again, `npm` has almost nothing to do with JavaScript, so it helps to keep an open mind.) Your `npm` `package.json` can even point all of the required dependencies at any `git` URL, so you don't need any dependency on npm (the service). Depending on a custom git URL is trivial: you just list the full git URL instead of the package version.
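For example, a hypothetical `package.json` (the project name, dependency name, and URL are all made up) that takes one dependency straight from a git repository instead of the registry:

```shell
# Write a minimal package.json whose single dependency is a git URL,
# bypassing the npm registry entirely (names and URL are examples)
cat > package.json <<'EOF'
{
  "name": "my-project",
  "version": "1.0.0",
  "dependencies": {
    "myDep": "git://github.com/someuser/myDep.git"
  }
}
EOF
```

Running `npm install` against a file like this would clone that repository into `./node_modules/myDep`, with no registry involved.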
> You can simply ask to install the specific version with `opam install js_of_ocaml.2.3`. OPAM will otherwise attempt to get the highest version, while optimising some other criteria like number of packages removed and number of changes: there may be some incompatibility with your installation, in this case the above command should explain the cause.
I really like those optimizations, especially when operating over a low-bandwidth connection - perhaps this could be hinted at in the docs? Also, it would be nice to demonstrate installing a *particular* version right there in the basic usage guide.
> I don't think this is a realistic change for the short term: opam handles a centralised installation (or several, using switches), that may contain the compiler, libraries, resources, binaries, editor modes, etc.; symlinking, besides not being very portable, is an operation local to a given project, and of quite different scope.
I must say I really like the OPAM switch, but I feel that a particular OCaml compiler version is just as much a dependency of a project as the actual libraries that it depends on, and could also be specified in the "opam.json" file (or whatever is equivalent). But I have a feeling all of this could be accomplished through symlinks (though I'm happy to be proven wrong).
> Also, OPAM was written to make packaging easy, so that adoption could be quick, and so it conformed to the most common way libs were used. The lack of sync between ocamlfind and OPAM, though, is indeed annoying, and rest assured that there are long-running goals to better integrate ocamlfind.
I see. The current approach probably makes sense if the goal was to make it easier for all of the people with existing projects in OCaml. Maybe it's also worth thinking about a separate (or even complementary) OPAM flow for the masses who aren't yet using OCaml at all, but are likely to appreciate it. What would make it easiest for them to try it out and share new software with each other? Who is the best choice for a target audience? If I had to guess, the best audience to target is the massive number of people who (for one reason or another) already have experience with `npm`. There are many reasons for this.
- The `npm` flow puts external dependencies and local development dependencies on the same playing field, so it's very easy to use it for local productivity.
- Local `./node_modules` installation and a local-directory `./package.json` discourage global dependencies and eliminate frustrating errors when your system does not happen to have all the required dependencies installed.
- `npm`'s development/debugging flow is fairly intuitive (just follow/delete symlinks!).
- `npm` is hugely popular and familiar, considering that it's relatively new among package managers. Even if the `npm` development flow were terrible, its popularity might trump any other reason not to mimic it.

I have to say, OPAM has finally made trying/configuring OCaml pleasant! OCaml (the ecosystem) has come a long way from the last time I tried it two years ago, and I believe OPAM can be credited with much of that. But could we also consider a more streamlined `npm`-style dependency (and, more importantly, developer) flow, if only to appeal to the much larger set of people who don't currently have existing OCaml projects?
Hi Jordan,
It's my understanding that `npm` encourages libraries to depend on specific versions of dependencies and allows those dependencies to be included/symlinked differently for different libraries in the same application. That is, dependencies A and B may each depend on C, but A depends on C.1 and B depends on C.2. Is this accurate?
How do you see this style of divergent transitive dependencies working with OCaml's type system? Particularly, I'm interested in how we might link multiple divergent deps into the same application. We need a way to deal with type aliases from grandparent deps into parent deps and then conflicts due to divergence. There are also module namespace issues that would need to be addressed.
I have considered writing analyses over packages' exposed signatures to determine if dependencies contribute types to a package's interface or only encapsulated functionality. Due to the aforementioned module namespacing issues among others, I haven't worked any more in this direction.
Given these type safety guarantees that `npm`-style packaging would seem to violate, I would encourage you to think of an opam switch as a consistent package universe akin to a single `npm` include. This universe is then shared among as many or as few packages as you wish. Packages which share the universe can safely interoperate. Packages which live in different switches may only interoperate through application interfaces (commands, services, etc).
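In practice, moving between universes looks something like this. A sketch only: the CLI shown is from the opam 1.x era under discussion (later opam spells the first command `opam switch create`), and `lwt` is just an example package:

```shell
# Each switch pairs one compiler with one consistent package universe
opam switch 4.01.0         # create/select a universe built on OCaml 4.01.0
eval $(opam config env)    # point PATH and friends at that universe
opam install lwt           # installs into this switch only
```

Packages installed in one switch are invisible to every other switch, which is what gives each universe its consistency guarantee.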
What workflow concepts from `npm` do you see as portable into this static type discipline? I think lightweight switch clones with branching package universes could be useful, but the CLI may be hard to get right.
I don't particularly find value in the ability to simultaneously have two versions of the same package. I'm sure for larger projects it's necessary to solve this problem, but TBPH, I can't even reason about what that means (when dependencies store state and you have two versions of the same dependency, the state is not kept in sync). `npm` provides the `shrinkwrap` command to lock down these conflicts. Each module can specify a range of versions that it can accept as a dependency. Here's my limited understanding: I believe that if a single-version solution to the constraints exists, then only that module will be used project-wide. If not, multiple versions will be included (which is the confusing part). For OPAM, we could mimic the development workflow of `npm` but just fail if there is no single-version solution, instead of including multiple versions.
The primary workflow concepts that I'm suggesting be borrowed into OPAM are not necessarily related to the versioning (though you might take a look at shrinkwrap etc). Here are the main benefits that I appreciate in `npm`:

1. Ubiquity (everyone knows it).
2. Every project has a `package.json` file that lists its dependencies. It's culturally not okay to just rely on things being installed globally into the system.
3. Project dependencies are installed into the current directory (in a `./node_modules/` subdir).
4. These dependencies are trivial to debug because they're just symlinks/folders.
5. `npm install` simply reconciles what's on disk with what's listed as dependencies in the `package.json`. You can delete things and `npm install` just reinstalls them, etc.
6. At the end of the day, your project is already in the form of a self-contained module, because the same `package.json` file that declares your dependencies also specifies that your project can be depended on, and under what name, etc. There's nothing "new to learn" when you want to create a new package - you've been creating a package all along, since the moment you listed your project's dependencies!
7. The setup instructions for every project look exactly like this: `git clone http://github.com.." && npm install`
8. No README ever needs to say "make sure you've globally installed X, Y, and Z".
9. `npm link` - this is the command that lets you say "don't fetch the package, go look at my other local copy of it because I'm developing it and I want to test my changes".

Thanks again for the input. I'm closing this for 1.2 as we now have a clear workflow, but this is by no means the end of the discussion on dependencies/ocamlfind integration and improvements on that front.
> 1. Ubiquity (everyone knows it).

I once had the pleasure of using `npm` and I can't say I enjoyed the experience. Perhaps it grows on you.
> 2. Every project has a package.json file that lists its dependencies. It's culturally not okay to just rely on things being installed globally into the system.

Every `npm` project may do that. Not every JavaScript project uses `npm`. Not every OCaml project uses opam. Opam offers both in-package metadata and out-of-band metadata for packaging projects that existed before opam or whose authors/maintainers don't want to support opam. Centralized metadata makes development of the opam tool and syntax more agile as well.

I do think that global installation is not ideal for every use case. With switches and related workflow improvements, OCaml and ocamlfind are moving away from global installation.
> 3. Project dependencies are installed into the current directory (in a ./node_modules/ subdir).

I'm not sure how feasible or desirable this is for OCaml. In particular, I'd personally rather have a consistent location for my current universe of packages, named by the present opam switch, which guarantees that the universe is consistent. If I have multiple packages under development, I'd like to see their deps and users in a single place. How do you see the `npm` way as better than the current system?
> 4. These dependencies are trivial to debug because they're just symlinks/folders.

I don't have much difficulty debugging opam packages that are pinned. Perhaps I'm missing the crucial benefit of having my package manager muck about inside my source tree.
> 5. `npm install` simply reconciles what's on disk with what's in the `package.json` listed as dependencies. You can delete things and `npm install` just reinstalls them etc.

This is a nice property that results from a distinct lack of build systems or staged-anything in JavaScript-land. It would be nice to have a build system for OCaml that would allow one to achieve similar results. For now, however, we need to support all build systems, which necessitates opam executing commands that packagers have specified as "build" and "install". On the flip side, one can write quite complicated command-line tools in OCaml which can be executed without waiting for seconds of JIT compilation. I could see a case for something like `opam repair`, which removes and re-installs a package and checks dependent cmi files for matching checksums. This would short-circuit some of the rebuild-the-world issue when a package needs to be fixed.

In practice, however, I'm not familiar with any scenario where an installed package suddenly has files go missing. When does this happen for you? If you change any interfaces in dependencies, all dependents will need to be rebuilt to ensure safety. Maybe you're looking for `opam upgrade` or `opam reinstall`?
> 6. At the end of the day, your project is already in the form of a self contained module because the same package.json file that declares your dependencies, also specifies that your project can be depended on and under what name etc. There's nothing "new to learn" when you want to create a new package - you've been creating a package all along, since the moment you listed your project's dependencies!

We are moving in this direction with the recent pinning features of opam. The thing that is always "new to learn" is how to specify the package's metadata in either `package.json` or `opam` files. Is there a component of opam packaging that you find particularly unnecessary/confusing/difficult/new-to-learn? Making new opam packages has seemed fairly easy to me and should be getting quite a bit easier with new GitHub workflow integration coming online later this summer.
> 7. The setup instructions for every project look exactly like this: `git clone http://github.com.." && npm install`

With opam: `git clone http://github.com.." && opam pin name . && opam install name`

Crucially, this only works for packages that have been directly written to use opam. In the general case of a packaged project, the installation is even easier: `opam install name`
> 8. No README ever needs to say "make sure you've globally installed X, Y, and Z".

I'm not sure that not documenting your dependencies (especially in the face of diamond deps) is a good idea. Opam is not a requirement to use OCaml or any OCaml library (except for those that use opam-as-a-library, of course).
> 9. `npm link` - this is the command that lets you say "don't fetch the package, go look at my other local copy of it because I'm developing it and I want to test my changes".

`opam pin` - this is the command that lets you say "don't fetch the package, go look at my other local copy of it because I'm developing it and I want to test my changes".
Overall, it sounds like there are definitely some usability improvements that could be made to opam, and some that have already been made and will go out in 1.2. Some of your suggestions seem JavaScript-specific, and documentation for developing with opam could definitely be improved to surface the most useful subcommands like `pin` and `reinstall`.
I'd recommend trying opam for development and then filing issues with specific suggestions for specific pain points as you encounter them. Alternately, picking a single use case and then detailing all of the pain points in the process of achieving your use case would also be helpful.
Thanks for your input!
> git clone http://github.com.." && opam pin name . && opam install name

It can actually even be simplified to `git clone http://github.com.." && opam pin name .`; you will now be prompted for install automatically.
> Crucially, this only works for packages that have been directly written to use opam. In the general case of a packaged project, the installation is even easier: opam install name

And if you want to have the source at hand and be able to hack on it, which seems to be one of the points of npm's workflow, you can also use the brand new:

`opam source name --dev --pin`

`opam source` downloads the source, `--dev` selects the version-controlled upstream rather than the release archive, and `--pin` locally pins the package to the local path containing the downloaded source.
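So a hack-on-a-dependency session might look like this (a sketch; `lwt` is just an example package name):

```shell
# Clone the package's development sources and pin them in one step
opam source lwt --dev --pin

# Edit the downloaded checkout as needed, then rebuild from the pin
opam install lwt
```

Subsequent `opam install lwt` runs build from the local checkout until the pin is removed.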
I'm having a hard time understanding how `pin`ning is the same thing as `npm install`, but I haven't used pinning in OPAM yet. Can you help me out?
Here's what the docs say:
> `opam pin <package> </local/path>`
>
> This command will use the content of </local/path> to compile <package>. This means that the next time you will do opam install <package>, the compilation process will be using a mirror of </local/path> instead of downloading the archive.
This sounds very different from the `npm install` flow I described, and still sounds very global. This makes it sound like now that you've 'pinned' this local package, it is the global default for that package name whenever you do `opam install x`. The `npm` default is that nothing is global unless you specify that it should be (and it's rare that people do that). `npm install` doesn't install anything globally - it only installs the dependencies of the local `package.json` into (yep, you guessed it) the local directory. As a result of executing `npm install`, nothing outside of your directory can even perceive that you executed that command (and that's a good thing!)
If I'm misreading the description of `opam pin`, please disregard the following:

The fact that there is a `package.json` file (which lists dependencies and has a name/version field) makes this directory a package. There's no command to execute to 'turn it into a package'. You can simply push this to a GitHub URL, and anyone can start depending on it. Now, since most people don't have a git server running locally, you can do something similar to `opam pin` called `npm link`. But even that command has a nice pattern that discourages global pollution: you `cd` into the directory of the specific depender, and do `npm link ~/myProject/` to make sure that it uses your local copy of `myProject`. Note that this doesn't affect anyone else globally.
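Concretely (a sketch; both paths are made-up examples):

```shell
# From the one project that should consume the local copy:
cd ~/someAppUsingMyProject

# Symlink ./node_modules/myProject to the local checkout;
# no other project on the machine sees this change
npm link ~/myProject
```

Only this depender resolves `myProject` to the working copy; every other project keeps using whatever it already has installed.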
To clarify my example: `npm install` does download the source of your current project's dependencies (not the source of the project itself - that should already be present). It downloads exactly what was specified in the `package.json` file that you cloned. In my example, all that I'm cloning is the `package.json` and some files that depend on the dependencies in `package.json`. Together, these files make a "package". This is what I cloned:
```
myRepo/
├── package.json       # Lists dependencies (either npm package names or git urls)
└── myProjectsCode.js  # Depends on the packages listed in package.json
```
This is a fully self-contained package. It only has one code file. That code depends on a bunch of other packages that must be downloaded, as specified in `package.json`. When you clone this project, you're not cloning all of the dependencies. Those still need to be downloaded.
When you `npm install` (after cloning), you're downloading all the dependencies into the local directory. You're still not polluting the global namespace with this project's name, or any other project's name. Here's what it looks like after you `npm install` in this local directory. Again, nothing outside of your directory can even perceive that this has happened:
```
myRepo/
├── package.json       # Lists dependencies (either npm package names or git urls)
├── myProjectsCode.js  # Code is just considered static files
└── node_modules/      # All the downloaded (or symlinked-to-cache) dependencies
```
I'm not saying OPAM can't do this - I'm saying that npms
choice to make this the canonical (and documented) flow is a good decision and OPAM should consider the same. Even that workflow you linked to (for 1.2) uses opam pin
which seems to have perceivable side effects (which is what I'm really trying to address here). Instead, can it start off with the equivalent of making an isolated project that doesn't make itself accessible to other projects on the file system, but still depends on external packages, similar to the npm
flow I mentioned?
Then, can it describe how to make a local project available to other local projects without polluting the global namespace?
Yes, npm
has its issues. None of the issues I've experienced had anything to do with the features/concepts I'm advocating for here. FWIW, npm
package.json
files can also include a build
statement.
Every npm project may do that. Not every JavaScript project uses npm. Not every OCaml project uses opam. Opam offers both in-package metadata and out-of-band metadata for packaging projects that existed before opam or whose authors/maintainers don't want to support opam. Centralized metadata makes development of the opam tool and syntax more agile as well.
I think retaining that ability is a great feature. But whether or not it should be the default for new packages that we build and encouraged in the workflow is a totally different question, correct?
Sorry to hijack the discussion, but I'm still unclear how to apply npm
principles to a world where dependencies need (i) to be compiled and (ii) to be precise. In JavaScript, it's pretty easy to switch a dependency from one version to another -- in OCaml you can't: all the compiled objects in your project (and in its dependencies) need to be the same at compile time and at link time, otherwise the compiler complains about object checksum mismatches. So that means that:
Also, how does package quality work with npm
? Do you have a way to check that a given set of packages work well together? What happens if someone has to fix some package metadata while the maintainer of the offending package is not responsive? And finally, can you specify version constraints in npm
-- something really necessary for statically typed compiled languages -- and if yes, how does it resolve the package constraints?
(I think) the confusion arises from two different use cases: library and application development. In the case of app development (e.g. Pfff), I would like to simply git clone git://... && opam install
and have all the right dependencies required to develop the app available locally (e.g. in $pwd/.opam
). In the case of libraries, this makes less sense since there's a universe of dependencies it has to interact with to be useful.
For the app case, I'd like to point back to my original comment on the topic: https://github.com/ocaml/opam/issues/1372#issuecomment-43403176
It is possible to automate this with a wrapper script and a custom OPAMROOT
, with the only downside being that it won't inherit any global settings from any other OPAM installation (most notably, remotes).
For libraries, I shared @samoht's doubts about how this could ever work in a statically typed world where dependencies need to be precisely tracked, and version conflicts are essential to prevent coinstallability issues. You can always treat libraries like the app case of course, but this would be very, very slow since there would be no caching between common dependencies.
It's obvious that there's some documentation improvements that need to happen here for 1.2 in terms of recommending workflows, so I'm reopening this issue.
@avsm Treating libraries like the app case seems much cleaner to me. When you say you couldn't cache anything - do you mean that compilation couldn't be cached, or downloading the resources? Surely, downloading resources could be cached.
BTW: npm
allows specifying a valid range of versions that your project can depend on. So if multiple libraries in your project specify compatible ranges, only one version is selected for your entire project. Again, through sym links.
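For concreteness, here is a hedged sketch of what such ranges look like in a package.json (the package names are made up): "^1.2.0" accepts any compatible 1.x release at or above 1.2.0, and an explicit range can be given too.

```shell
# Writing a minimal package.json with semver ranges (names are made up).
cat > /tmp/package.json <<'EOF'
{
  "name": "myApp",
  "version": "0.0.1",
  "dependencies": {
    "some-lib": "^1.2.0",
    "other-lib": ">=2.1.0 <3.0.0"
  }
}
EOF
cat /tmp/package.json
```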
Regarding the flow you cited - yes, I saw that, and I'm not saying that OPAM can't mimic the npm
flow - I'm suggesting that it be given first-class (perhaps even default) support in the OPAM command line API and documentation. Why can't this be the default? The common case is that someone is going to build an application that uses libraries (though I still wanted to learn why libraries couldn't efficiently be developed in this same way - but that's beside the point).
Here is the more detailed description of the pinning command on 1.2:
DESCRIPTION
This command allows local customisation of the packages in a given
switch. A package can be pinned to a specific upstream version, to a
path containing its source, to a version-controlled location or to an
URL. An `opam' file found at the root of the pinned source will
override the package's opam file from the repository, an `opam'
directory will override all its metadata.
It would be feasible to add a command or plugin that gets the local opam file from a project and does the equivalent of opam install --deps-only
to install the deps globally, then runs the build instructions from your local directory.
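With the 1.2 pinning behaviour quoted above in mind, such a command could reduce to something like the following sketch (the -n flag and --deps-only are taken from the 1.2 CLI; the project name and path are made up):

```shell
cd ~/src/myproject
opam pin add myproject . -n         # register the local metadata, don't build yet
opam install myproject --deps-only  # install only the dependencies (globally)
make                                # build from the local directory as usual
```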
If I understand correctly, to have those dependencies installed locally, you would do the rough equivalent of:
cd <project>
export OPAMROOT=$PWD
opam init --comp=<suitable OCaml version>
opam install --deps-only <project>
Pretty sure nobody is going to want to recompile, or even re-extract, the compiler and full dependencies for each package. And then you do it again for any other package that depends on this one?
Also, it may be a "cultural" difference, but when there are lots of inter-dependent packages, I think that it's actually much easier to reason about a single, global universe, rather than a partial universe for each of them.
On 10 Jul 2014, at 09:55, Jordan W notifications@github.com wrote:
@avsm Treating libraries like the app case seems much cleaner to me. When you say you couldn't cache anything - do you mean that compilation couldn't be cached, or downloading the resources? Surely, downloading resources could be cached.
I think that's been explained several times in the thread now, notably Thomas' reply just before mine. Every universe of dependencies must be precisely tracked, since if I compiled A, B and C with some revisions, I can't just silently swap out a new revision of B or C without breaking A or B in OCaml (or Haskell or most statically typed languages with type erasure at compile time). OCaml will actively reject this with an 'inconsistent assumptions over the interface' error. BTW: npm allows specifying a valid range of versions that your project can depend on. So if multiple libraries in your project specify compatible ranges, only one version is selected for your entire project. Again, through sym links.
And so does OPAM, but the one that was actually selected for a particular compilation tree must remain the one selected for all other compilation trees that share compiled dependencies, or you have inconsistent assumptions. Regarding the flow you cited - yes, I saw that, and I don't deny that OPAM can't mimic the npm flow - I'm suggesting that it be given first class (perhaps even default) support in the OPAM command line API and documentation. Why can't this be the default? The common case is that someone is going to build an application that uses libraries (though I still wanted to learn why libraries couldn't efficiently be developed in this same way - but that's besides the point).
The common case is most certainly not someone building an application that uses libraries. Most people use OPAM to help develop OCaml libraries at present, with the applications being quite OPAM-free (they just call ocamlfind to find their dependencies and don't particularly care where they come from).
I agree that having an opam-local
script for app development would be useful, but not as the default (at least until our build systems speed up 10x, which should happen soon :-)
-anil
I still think opam is missing some kind of lightweight switch and/or the ability to be able to live with different versions of the package in the same universe.
When I made the incompatible release of react
it was kind of painful since I needed it for one of my project but then js_of_ocaml
could not compile with that incompatible version, so I'd sometimes reinstall things (was too lazy to create a switch). In fact I was wondering whether such a hack could work: install to ($SWITCH/lib/$NAME/$VERSION
etc.) and register one ocamlfind
name per version with the bare name symlinking to the latest one.
On 10 Jul 2014, at 10:57, Daniel Bünzli notifications@github.com wrote:
I still think opam is missing some kind of lightweight switch and/or the ability to be able to live with different versions of the package in the same universe.
When I made the incompatible release of react it was kind of painful since I needed it for one of my projects but then js_of_ocaml could not compile with that incompatible version, so I'd sometimes reinstall things (was too lazy to create a switch). In fact I was wondering whether such a hack could work: install to ($SWITCH/lib/$NAME/$VERSION etc.) and register one ocamlfind name per version with the bare name symlinking to the latest one.
Did you try a switch against your system compiler? That requires no compiler building at all, and is instantaneous.
opam switch -A system react-1.0
.
I agree that simultaneous packages would be nice to have, but the cost/benefit isn't favourable since it would require fixing ocamlfind to understand this notion as well.
-anil
The common case is most certainly not someone building an application that uses libraries. Most people use OPAM to help develop OCaml libraries at present, with the applications being quite OPAM-free (they just call ocamlfind to find their dependencies and don't particularly care where they come from).
That's kind of the point I was trying to make. I wasn't trying to describe the current state of OPAM usage, but, more aspirationally, to project where it could be if it helped streamline the building of apps. In the end state, where OPAM modeled dependencies of all kinds (apps and libraries), I feel that there would be more apps than libraries - more people trying things out than publishing their work for others to use.
I won't comment on the universe dependency resolution stuff.
Didn't try the switch against the system compiler. But then if the system compiler is not the OCaml version you want it is still painful.
Re fixing ocamlfind, isn't it possible to cope with the current system? How again does an ocamlfind package get its name? From the directory in lib
? If that's the case, just install to $SWITCH/lib/$NAME-$VERSION
; you can now use the ocamlfind name $NAME-$VERSION
to link against the version you want (and again, just symlink $NAME to the latest $NAME-$VERSION). Hackish, but could work.
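To make the proposed layout concrete, here is a pure-shell mock-up of it (no opam or ocamlfind involved; the react versions are just example names):

```shell
# Mock-up of the hack: each version installed under $SWITCH/lib/$NAME-$VERSION,
# with the bare $NAME symlinked to the latest one. Names are illustrative.
set -e
SWITCH=$(mktemp -d)                      # stands in for an opam switch prefix
mkdir -p "$SWITCH/lib/react-0.9.4" "$SWITCH/lib/react-1.0.0"
ln -sfn react-1.0.0 "$SWITCH/lib/react"  # bare name -> latest version
readlink "$SWITCH/lib/react"
```

A build could then link against the versioned name react-1.0.0 explicitly, or against react for "latest", as suggested above.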
On 10 Jul 2014, at 11:11, Jordan W notifications@github.com wrote:
That's kind of the point I was trying to make. I wasn't trying to describe the current state of OPAM usage, but more aspirationally, project to where it would be if it helped streamline building of apps. In the end state where OPAM modeled dependencies of all kinds (apps and libraries), I feel that there would be more apps than libraries - more people trying things out than publishing their work for others to use.
That's an excellent aspiration, but (much like Facebook's growth through the years) it needs to be staged carefully so as not to sabotage the workflow of the early adopters who provide the baseline libraries that the applications actually use.
For now, supporting the application workflow explicitly via an opam-local
script that automates the OPAMROOT
and opam init
steps would let us test the waters for this approach, and not really require any changes to the core of OPAM. This should especially make it easy to compile the likes of Pfff (see #409 for @zmagg's request) without understanding the details of OPAM.
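A minimal dry-run sketch of what such an opam-local wrapper could look like (the name, the compiler version, and the exact flags are assumptions on my part; the echos make it print the commands instead of running them):

```shell
# Hypothetical `opam-local`: keep all opam state under $PWD/.opam so nothing
# leaks into the global ~/.opam. The `echo`s are placeholders; remove them
# to actually invoke opam.
opam_local() {
  export OPAMROOT="$PWD/.opam"
  echo "OPAMROOT=$OPAMROOT"
  echo "opam init --comp=4.01.0"      # per project: slow, builds a compiler
  echo "opam install --deps-only ."   # then install just this project's deps
}
opam_local
```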
I'm curious, though, about why @dbuenzli thinks that system switches aren't lightweight enough before going down this path.
-anil
@jordwalke it's perfectly possible to use opam to build apps (and 1.2 will be great with that respect, as you will be able to create a package by just pinning a package opam pin add myapp $REPO
). IIUC the only thing you are really missing is truly isolated universes. I'm sure that over time there will be improvements in that area. Maybe you also need to open up a little bit to the way opam operates, and not necessarily try to frame it in terms of what you already know.
On 10 Jul 2014, at 11:13, Daniel Bünzli notifications@github.com wrote:
Didn't try the switch against the system compiler. But then if the system compiler is not the OCaml version you want it is still painful.
One step at a time! That needs a relocatable compiler patch, but the common case for me anyway is to use the latest stable compiler, which is typically my system compiler. If the
system
switch workflow is ok, then it's a compiler change to sort out the case of a non-system compiler. Re fixing ocamlfind, isn't it possible to cope with the current system? How again does an ocamlfind package get its name? From the directory in lib? If that's the case, just install to $SWITCH/lib/$NAME-$VERSION; you can now use the ocamlfind name $NAME-$VERSION to link against the version you want (and again, just symlink $NAME to the latest $NAME-$VERSION). Hackish, but could work.
Build systems need to know about this. Perhaps a job for assemblage...
-anil
@avsm Two things. First, a switch is a stateful operation on your whole environment... In user interface speak it puts you in a mode, and modes are bad in general as they lead to mode errors (your brain thinks you are in a mode but in fact you are not).
Second you also end up having to mirror basic config stuff in the different switches and need to keep them up to date (e.g. installing ocp-index
,ocp-indent
, opam update && opam upgrade) which altogether is a rather painful bureaucratic business if your only goal is to be able to live with two different versions of a single package.
Le jeudi, 10 juillet 2014 à 11:21, Anil Madhavapeddy a écrit :
One step at a time! That needs a relocatable compiler patch, but the common case for me anyway is to use the latest stable compiler, which is typically my system compiler. If the
system
switch workflow is ok, then it's a compiler change to sort out the case of a non-system compiler.
In any case I don't think lightweight switches are the way to go, for the reason I mentioned before (too much bookkeeping).
Build systems need to know about this. Perhaps a job for assemblage... Not really: if build systems simply generate a .install file, opam can actually take care of that. Socially, that would entice users to use build systems that generate .install files (look, you'll be able to have two versions of the same library installed at the same time…)
Daniel
On 10 Jul 2014, at 11:30, Daniel Bünzli notifications@github.com wrote:
Le jeudi, 10 juillet 2014 à 11:21, Anil Madhavapeddy a écrit :
One step at a time! That needs a relocatable compiler patch, but the common case for me anyway is to use the latest stable compiler, which is typically my system compiler. If the
system
switch workflow is ok, then it's a compiler change to sort out the case of a non-system compiler. In any case I don't think lightweight switches are the way to go, for the reason I mentioned before (too much bookkeeping).
The real issue there is that we are mixing up host tools (ocp-indent, merlin) with local developments. That definitely needs to be addressed somehow. Less global state per switch is the goal...
Build systems need to know about this. Perhaps a job for assemblage... Not really: if build systems simply generate a .install file, opam can actually take care of that. Socially, that would entice users to use build systems that generate .install files (look, you'll be able to have two versions of the same library installed at the same time…)
Yes, good point.
-anil
The real issue there is that we are mixing up host tools (ocp-indent, merlin) with local developments. That definitely needs to be addressed somehow. Less global state per switch is the goal...
Build-deps and multi-switch packages are getting us there. Getting something together for build deps is my last unfinished goal for 1.2, and I've added a dummy package flag that could be used to help transition for the latter, in due time.
It's obvious that there's some documentation improvements that need to happen here for 1.2 in terms of recommending workflows, so I'm reopening this issue.
Could you be more specific on what areas you feel are lacking in https://github.com/AltGr/opam-wiki/blob/1.2/Packaging.md ? Or should this be better documented somewhere else ?
On 10 Jul 2014, at 12:37, Louis Gesbert notifications@github.com wrote:
It's obvious that there's some documentation improvements that need to happen here for 1.2 in terms of recommending workflows, so I'm reopening this issue.
Could you be more specific on what areas you feel are lacking in https://github.com/AltGr/opam-wiki/blob/1.2/Packaging.md ? Or should this be better documented somewhere else ?
That reads very well to me! The only thing I might add is a TL;DR at the top summarising the essential commands in one quick list.
Good idea, I'll add that. Note that the file is not published yet, so as not to confuse people using the current release, but it's intended to be at http://opam.ocaml.org/doc/Packaging.html.
The other doc pages need refreshing/reorganisation too (developing and packaging should probably be a single page now), and I'll probably rewrite most of the FAQ; but that can probably be done during the beta.
Le jeudi, 10 juillet 2014 à 14:14, Anil Madhavapeddy a écrit :
That reads very well to me! The only thing I might add is a TL;DR at the top summarising the essential commands in one quick list.
Workflow-wise I would also say that I find the idea of making an opam directory a little bit ill-advised; a toplevel opam
file should be enough. The descr
file should be autogenerated in some way (e.g. from a convention in README.md, see e.g. [1] for my packages) and the url
file as well. I think we want, as much as possible, to avoid people committing version names, tarball locations, checksums and whatnot to their repos, so this howto should be somehow exemplary.
Best,
Daniel
[1] https://github.com/dbuenzli/pkgopkg/blob/master/bin/pkg-opam-descr
@dbuenzli Maybe the behaviour should be slightly changed then, as the presence of an opam
directory overrides the whole repository metadata, while a file only overrides the opam file, keeping e.g. the files/
subdirectory from the repo. The url
file is, obviously, always ignored since the package is pinned.
Indeed, my tutorial mixes up local packaging and full packages on the repo a little; I chose the simplest way to get there, but the "publishing" step is expected to be much enhanced with the upcoming platform effort and opam publish
.
For now, supporting the application workflow explicitly via an opam-local script that automates the OPAMROOT and opam init steps would let us test the waters for this approach, and not really require any changes to the core of OPAM.
Yes, this is all I'm asking for and I understand that not all of the problems are going to be solved in one step.
Getting fine-grained versions without polluting globals sounds like a hard problem to solve. Longer term, maybe there are some tradeoffs that could be made to achieve perfect local sandboxing (slower initial compile times etc. - which I'd be willing to accept in some cases). I'll save those requests for a later time. I mainly brought this up to confirm that the OPAM community views globalness as a concession required to achieve performance - not a value.
For now, just implementing and documenting the opam-local
workflow would be great and I'll happily test it out immediately. (Request: By default, can opam-local install
not pin
the local project globally? I understand if you need to make the dependencies available globally (for the reasons you cited), but not the local package whose dependencies you're installing. The user could optionally pin the local project globally, with a follow-up opam pin
command.) Is my understanding correct?
Didn't mean to close the issue, thanks.
I want to chime in, since I come from the JVM world, and I too have some issues with the way OPAM handles dependencies globally by default.
The package manager I am most familiar with is sbt, for Scala. The way it works is very similar to NPM - the only minor difference being that the JVM natively handles a classpath which is a forest instead of a tree, so one can just add all dependencies to the classpath, rather than using symlinks.
Now, there are issues when your project requires libraries A and B, which in turn depend on different versions C1 and C2 of a library C. In such cases, you can decide which version, if any, works for both and use that one in your project. This is easy because linking on the JVM is done at runtime, so neither A nor B needs to commit to C1 or C2 at compile time.
If I understand correctly, this is not really possible in OCaml, due to the fact that linking is done at compile time. What is then the obstruction to A linking to C1 and B linking to C2 and including both as dependencies of your project?
Moreover, I tried to follow the whole thread, but I am not sure what the general consensus is at the end. This is also because I am new to Opam and OCaml and I couldn't follow some parts of it.
Say I want to start a new project with a single dependency that I do not want to install globally. For the sake of example, let me assume that dependency is Jane Street core, which should be rather common. What configuration do I need in this case?
@andreaferretti:
What is then the obstruction to
A
linking toC1
andB
linking toC2
and including both as dependencies of your project?
I'd claim that supporting the use case of A
/B
's dependencies on C
when that dependency can't be resolved to a specific version that satisfies both is not well defined. It makes sense that OPAM
would not support it right now, and npm shrinkwrap
is the command I always run after npm install
to make sure that one single version is installed.
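Spelled out, that sequence is just (assuming an npm of that era, where shrinkwrap was a separate step):

```shell
cd ~/myApp
npm install      # resolve the version ranges into ./node_modules
npm shrinkwrap   # freeze the exact resolved versions in npm-shrinkwrap.json
```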
That being said, I think including two different versions of C
might be a good idea, but someone would have to define what that means and eliminate all the potential errors that could occur when there is state stored in C
and users of C
expect that the state is shared across anyone that depends on "C".
Hello,
I would like to see how people use OPAM to develop locally. Here's how the ultra-popular resource installer NPM handles development/installation workflows. I was curious if it's possible to achieve the same.
Your git repo for the package you're creating has all the source within it:
The package.json describes all of the dependencies.
When you cd into the directory and execute npm install, all of the dependencies are automatically installed into the subdirectory node_modules (a temporary directory - you can delete it and it will be regenerated).
Packages aren't allowed to implement anything like make install and install crazy artifacts across the system that they forget to uninstall. Everything is installed locally to that particular directory. There's a system cache that is the store of truth for all module versions globally, but it's just a cache and things in node_modules just symlink to it.
If you want to develop locally, you just create a structure like the above and describe your dependencies in the package.json, and then do an npm install from within that directory - it installs all the dependencies locally so you don't need to worry about cluttering some distant directories. If you want to locally develop two packages - myPackageOne which depends on another locally developed package myPackageTwo - then you just create two separate directories and tell the global system that "hey, you should know about myPackageTwo".
Then you tell myPackageOne that it should actually point to the local version of myPackageTwo.
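Those two steps map onto two commands (a sketch; myPackageOne and myPackageTwo are the hypothetical packages from the example):

```shell
cd ~/myPackageTwo
npm link               # "hey, global system, you should know about myPackageTwo"
cd ~/myPackageOne
npm link myPackageTwo  # point at the local version via a symlink in node_modules
```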
I don't tend to see a lot of github repos that include the OPAM dependencies configuration directly in the source repo, which seems to make locally developing harder. Don't people usually just want to git clone, then cd, and do something like opam installAllDependenciesWithoutClutteringMySystem?
I also notice that packages seem to be encouraged to do whatever system manipulation they want as long as they clean up. Does this ever cause problems?
Could someone kindly show me how to use OPAM as the way to locally develop many modules that depend on each other? Could this be made prominent in the official docs? I'd like to use OPAM not just as a way to install artifacts into my system, but as a core part of my development workflow.
Thanks a ton for the help!