
Build fails on Debian sid (CMake or Boost issue?) #25


mhatta commented 3 years ago

The build of the "attention" module fails on the current Debian sid:

(snip)
-- Configuring done
CMake Error at opencog/attention/CMakeLists.txt:8 (ADD_LIBRARY):
  Target "attention" links to target "Boost::system" but the target was not
  found.  Perhaps a find_package() call is missing for an IMPORTED target, or
  an ALIAS target is missing?

CMake Error at opencog/attention/CMakeLists.txt:8 (ADD_LIBRARY):
  Target "attention" links to target "Boost::filesystem" but the target was
  not found.  Perhaps a find_package() call is missing for an IMPORTED
  target, or an ALIAS target is missing?
(snip)

Modifying FIND_PACKAGE in CMakeLists.txt to something like FIND_PACKAGE(Boost REQUIRED COMPONENTS system filesystem) fixes this, but I'm not sure this is the right way to go. The current version of CMake in sid is 3.16.3; Boost is 1.71.0.3.
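For concreteness, here is a minimal sketch of that workaround (the source list and file names are hypothetical; the real opencog/attention/CMakeLists.txt is larger):

# Hypothetical, trimmed-down CMakeLists.txt illustrating the workaround.
# Requesting the components explicitly makes CMake define the
# Boost::system and Boost::filesystem IMPORTED targets that the
# link step below refers to.
CMAKE_MINIMUM_REQUIRED(VERSION 3.16)
PROJECT(attention)

FIND_PACKAGE(Boost REQUIRED COMPONENTS system filesystem)

ADD_LIBRARY(attention SHARED
    AttentionModule.cc          # hypothetical source file
)

TARGET_LINK_LIBRARIES(attention
    Boost::system
    Boost::filesystem
)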

linas commented 3 years ago

There are some other recent bug reports similar to this; they were fixed by moving all the Python code into a Python module.

-- are you using the latest atomspace?
-- did you do a clean build, i.e. rm -r build; mkdir build; cd build; cmake ..?
-- are there any old, stale libraries in the library search path that are accidentally being linked?

The problem is, of course, that nothing there actually uses Boost. The only thing that does use it is Python, and attention does not link directly to Python.

ngeiswei commented 3 years ago

Has anybody made progress on that? I'm having the same problem on Manjaro.

linas commented 3 years ago

@ngeiswei last time we talked, you said the problem went away after the refactoring of the Python code ... since attention no longer links to Python, and thus no longer links to Boost, I can't imagine how this problem can arise, unless there is some left-over cruft in the build directory or the install directories...

See this comment: https://github.com/opencog/opencog/issues/3653#issuecomment-664773134 from which I quote: "I had this problem on Manjaro, opencog/atomspace#2738 fixed it".

linas commented 3 years ago

The Python refactoring that eliminated the Boost dependencies was this one: opencog/atomspace#2738

mjsduncan commented 3 years ago

I have the same problem on Ubuntu 20.04 with the current pull (master c7a557a), and had to use @mhatta's fix described above for CMake to work.

ngeiswei commented 3 years ago

Just so you know, I have pushed

https://github.com/opencog/attention/tree/workaround-issue-25

with @mhatta's workaround until this gets fixed (I'm willing to bite the bullet, but I just have more important/interesting things to do ATM). I'll keep it rebased onto master for the time being.

linas commented 3 years ago

This was patched just now in pull req #28. All it really does is work around a Python bug. We need someone who understands Python to maintain the Python code.

(This is a Python bug because only Python uses those Boost components; attention does not actually use them, so the proposed fix shouldn't be needed ...)
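If that diagnosis is right, the longer-term fix would run the other way: instead of adding a FIND_PACKAGE call, drop the unused Boost targets from the link line. A sketch, using the same hypothetical layout as above (the file and variable names are illustrative, not the actual file contents):

# Sketch of the alternative fix implied above: since libattention itself
# uses no Boost code, the Boost targets can simply be removed from the
# link interface, making the FIND_PACKAGE(Boost ...) workaround unnecessary.
ADD_LIBRARY(attention SHARED
    AttentionModule.cc              # hypothetical source file
)

TARGET_LINK_LIBRARIES(attention
    ${ATOMSPACE_LIBRARIES}          # illustrative real dependency
    # Boost::system and Boost::filesystem removed: not used by attention
)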