numenta / nupic.core-legacy

Implementation of core NuPIC algorithms in C++ (under construction)
http://numenta.org
GNU Affero General Public License v3.0

nupic.core is installable as a standalone library #77

Closed oxtopus closed 9 years ago

oxtopus commented 10 years ago

Right now, the numenta/nupic and numenta/nupic.core repositories are tightly coupled and have an integrated build process. It'd be nice if nupic.core were installable to a common location and reusable independently of the git repositories. Even better would be binary releases that don't require a git checkout.

This presumably means having separate build and install commands:

mkdir -p $NUPIC_CORE/build/scripts
cd $NUPIC_CORE/build/scripts
cmake $NUPIC_CORE/src
make -j3
make install

utensil commented 10 years ago

+1

Related to numenta/nupic#805 and part of nupic.core Extraction Plan: Step 2: Prepare for nupic.core release.

utensil commented 10 years ago

Ideally, this could be implemented like the following:

In CMake:

  1. search for a location specified by a user-wide env var or config file, as I proposed in #817; this provides a developer's override
  2. if not found, search the user-wide standard location
  3. if not found, search the system-wide standard location
  4. if not found, check out the corresponding version of nupic.core, build it, and install it to a standard location
  5. if a nupic.core installation was found by the above, the header include path and library path will be set automatically and printed
  6. building the binding is then as easy as:
cmake && make && make install

and setup.py calls the above, as discussed in numenta/nupic#805 and being implemented by @david-ragazzi in numenta/nupic#809.

The above would improve the pip install experience for building nupic.
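The lookup order above could be sketched roughly like this (all paths and names here are illustrative assumptions, not the actual build scripts):

```shell
#!/bin/sh
# Sketch of the proposed lookup order: a developer override via an env var,
# then a user-wide prefix, then a system-wide prefix. The env var name
# NUPIC_CORE_DIR and the header path are hypothetical.
find_nupic_core() {
  for prefix in "$NUPIC_CORE_DIR" "$HOME/.local" "/usr/local"; do
    # A prefix "contains" nupic.core if its headers were installed under it.
    if [ -n "$prefix" ] && [ -d "$prefix/include/nupic" ]; then
      echo "$prefix"
      return 0
    fi
  done
  # Step 4 would check out, build, and install nupic.core here instead.
  return 1
}
```

CMake would then derive the include and library paths from the prefix this returns (step 5) and print them.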

oxtopus commented 10 years ago

:+1: with one minor consideration. I believe @subutai has mentioned this, too.

The cmake && make && make install cycle should install to a system-level location by default, say, /usr/local/lib. Alternate installation paths can be specified as CLI arguments to CMake, and it's up to the user to use sudo where appropriate. This is the pattern repeated over and over with configure + make.
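Under that convention, the two cases would look something like this (the prefix values are just examples):

```shell
# Default: installs under /usr/local, so `make install` needs sudo.
cmake $NUPIC_CORE/src
make -j3
sudo make install

# Alternate prefix via a CLI arg to CMake; no sudo needed for a user dir.
cmake -DCMAKE_INSTALL_PREFIX=$HOME/nupic.core $NUPIC_CORE/src
make -j3
make install
```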

utensil commented 10 years ago

The cmake && make && make install cycle should install to a system-level location by default, say, /usr/local/lib. Alternate installation paths can be specified as CLI arguments to CMake, and it's up to the user to use sudo where appropriate. This is the pattern repeated over and over with configure + make.

Thanks for pointing that out. I was confused about the standard pattern (because in node.js, npm install -g without sudo installs to ~/.npm, and otherwise system-wide), and my previous comment should be updated accordingly:

running make install for nupic.core would install it to a system-wide standard place, including headers and the .a library

That said, I still believe node.js's approach adds some value, but let's follow the configure + make convention here.
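With headers and a static library installed to a standard prefix, a downstream C++ build could then link against it along these lines (the header directory and library name are assumptions for illustration, not the project's actual install layout):

```shell
# Hypothetical: compile a program against a system-wide nupic.core install,
# assuming headers under /usr/local/include/nupic and /usr/local/lib/libnupic_core.a.
g++ -std=c++11 -I/usr/local/include my_app.cpp \
    -L/usr/local/lib -lnupic_core -o my_app
```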

rhyolight commented 10 years ago

:+1:

rcrowder commented 10 years ago

On Windows we have made progress with packaging and deploying to NuGet. I'm working towards documenting the deployment and use of the deployed package soon. Ahmed has done the grunt work of the shared library support and packaging.

An example can be seen here: https://ci.appveyor.com/project/kandeel/nupic-core-393/build/artifacts

david-ragazzi commented 10 years ago

Hi guys, since Windows supports pip, what's the point of using NuGet to get packages? Is the objective to have all packages handled by pip? If so, I think using two package managers could increase complexity with little gain.

rcrowder commented 10 years ago

I agree. It would be nice to have one package manager. The task of supporting non-VC x86 compilers could push the Windows build back towards Travis, and towards a common packaging and deployment solution.

NuGet is nice in that it integrates into Visual Studio nicely. Last night it was a breeze to set up a new project, include the core NuGet package, link, debug, etc.

The only reason in my mind at the moment is that ease of use on Windows through Visual Studio. Is pip commonplace on Windows? (I guess if you develop Python on Windows it should be.) Food for thought.

david-ragazzi commented 10 years ago

I agree. It would be nice to have one package manager. The task of supporting non-VC x86 compilers could push the Windows build back towards Travis, and towards a common packaging and deployment solution. NuGet is nice in that it integrates into Visual Studio nicely. Last night it was a breeze to set up a new project, include the core NuGet package, link, debug, etc.

I also like NuGet (my background is .NET), but as far as I know NuGet is dependent on .NET IDEs (VS, MonoDevelop, SharpDevelop, etc.: http://docs.nuget.org/docs/start-here/nuget-faq), while pip is the default cross-platform package manager, in addition to being supported by Travis as you said. I think that when Travis finally supports Windows, if you adopt pip right now, we will avoid the rework of porting from NuGet to pip later. Furthermore, the NuPIC community is already accustomed to using pip, which could help with any future issues.

The only reason in my mind at the moment is that ease of use on Windows through Visual Studio. Is pip commonplace on Windows? (I guess if you develop Python on Windows it should be.) Food for thought.

Yeah, pip works well on Windows, and newer versions of Python already ship with pip!

utensil commented 10 years ago

IMHO, pip may bring everyone nupic but not nupic.core. We would need to distribute nupic.core through all major package managers (yum, apt-get, etc., including NuGet) once "nupic.core is installable as a standalone library", because it's a C++ project, not a Python one.

utensil commented 10 years ago

fpm looks promising for the purpose of distributing nupic.core to *nix-style package managers. I found it at https://github.com/showcases/package-managers .
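For instance, fpm can turn a plain directory tree into a distro package; a hypothetical invocation (the package name, version, and staging path here are placeholders) might look like:

```shell
# Turn a staged install tree into a .deb (swap -t rpm for an RPM).
#   -s dir       : source is a plain directory
#   --prefix     : where the files land on the target system
#   -C ./install : change into the staged install tree first
fpm -s dir -t deb -n nupic-core -v 0.1.0 --prefix /usr/local -C ./install .
```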

breznak commented 9 years ago

"..promising for the purpose of distribute nupic.core to *nix-style package managers..."

I don't think we need to put much work into maintaining binary packages for unix/linux. Typically users install from their enabled repositories, so it's the package maintainer's job to provide a compiled package (in the format needed for the distro). Of course, we can put some (rpm/deb) packages somewhere on the site (+ cookies if done automatically after a build).

rhyolight commented 9 years ago

As long as it decreases installation time.



breznak commented 9 years ago

As long as it decreases installation time.

I'm 100% :+1: for nupic.core as a binary. I'm just suggesting we don't have to worry about the distribution format (just provide a .tar.gz).

oxtopus commented 9 years ago

@breznak I believe we can do exactly as you suggest with https://github.com/numenta/nupic.core/pull/228, by tarring up the directory specified by -DCMAKE_INSTALL_PREFIX after make install.

Should be trivial to add support for rpm and other packaging formats, too.
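A minimal sketch of that tarball step, assuming a staged install prefix (paths are illustrative, and the `mkdir` stands in for what `make install` would populate; PR #228 may differ in detail):

```shell
#!/bin/sh
# Stage an install tree, then tar it up for distribution.
PREFIX="$PWD/install"                          # would come from -DCMAKE_INSTALL_PREFIX
mkdir -p "$PREFIX/lib" "$PREFIX/include/nupic" # stand-in for `make install` output
tar -czf nupic_core.tar.gz -C "$PREFIX" .
```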