rmcantin / bayesopt

BayesOpt: A toolbox for bayesian optimization, experimental design and stochastic bandits.
GNU Affero General Public License v3.0

Segfault for low discrete parameter space #27

Closed tfachmann closed 2 years ago

tfachmann commented 4 years ago

I cannot get the DiscreteModel to run without a segmentation fault (the continuous model works fine).

It might be connected to the discrete parameter space. The error can be reproduced by changing the number of discrete points (line 80) in examples/bo_disc.cpp:

const size_t nPoints = 1000;  // large space - works fine
const size_t nPoints = 10;    // small space - segfaults
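For context, here is a minimal self-contained sketch of the same setup (this is not a copy of bo_disc.cpp; the class name, test objective, and grid construction are illustrative, and the header path / constructor signature may differ slightly between BayesOpt versions):

#include <vector>
#include <boost/numeric/ublas/vector.hpp>
#include "bayesopt/bayesopt.hpp"   // C++ API: DiscreteModel, vectord, vecOfvec

// Toy objective over a discrete candidate set.
class ToyDisc : public bayesopt::DiscreteModel
{
public:
  ToyDisc(const vecOfvec& validSet, bopt_params par)
    : DiscreteModel(validSet, par) {}

  double evaluateSample(const vectord& x)
  { return boost::numeric::ublas::norm_2(x); }   // any cheap test function

};

int main()
{
  const size_t dim = 2;
  const size_t nPoints = 10;                     // small space - segfaults

  // Candidate set: nPoints points spread along the diagonal of [0,1]^dim.
  vecOfvec xSet;
  for (size_t i = 0; i < nPoints; ++i)
  {
    vectord x(dim);
    for (size_t d = 0; d < dim; ++d) x(d) = double(i) / (nPoints - 1);
    xSet.push_back(x);
  }

  bopt_params par = initialize_parameters_to_default();
  par.n_init_samples = 20;                       // as in the original example

  ToyDisc opt(xSet, par);
  vectord best(dim);
  opt.optimize(best);    // crashes inside DiscreteModel::generateInitialPoints
  return 0;
}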

The first output of valgrind:

==64059== Memcheck, a memory error detector
==64059== Copyright (C) 2002-2015, and GNU GPL'd, by Julian Seward et al.
==64059== Using Valgrind-3.12.0 and LibVEX; rerun with -h for copyright info
==64059== Command: ./bin/bo_disc
==64059==
Running C++ interface
- 12:30:51.665538 INFO: Expected 6 hyperparameters. Replicating parameters and prior.
- 12:30:51.728427 INFO: Using default parameters for criteria.
==64059== Invalid read of size 8
==64059==    at 0x43D25A: bayesopt::DiscreteModel::generateInitialPoints(boost::numeric::ublas::matrix<double, boost::numeric::ublas::basic_row_major<unsigned long, long>, boost::numeric::ublas::unbounded_array<double, std::allocator<double> > >&) (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==    by 0x442E43: bayesopt::BayesOptBase::initializeOptimization() (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==    by 0x44383C: bayesopt::BayesOptBase::optimize(boost::numeric::ublas::vector<double, boost::numeric::ublas::unbounded_array<double, std::allocator<double> > >&) (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==    by 0x43528C: main (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==  Address 0x5a94bd8 is 8 bytes after a block of size 240 alloc'd
==64059==    at 0x4C2A6F0: operator new(unsigned long) (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so)
==64059==    by 0x43D355: bayesopt::DiscreteModel::generateInitialPoints(boost::numeric::ublas::matrix<double, boost::numeric::ublas::basic_row_major<unsigned long, long>, boost::numeric::ublas::unbounded_array<double, std::allocator<double> > >&) (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==    by 0x442E43: bayesopt::BayesOptBase::initializeOptimization() (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==    by 0x44383C: bayesopt::BayesOptBase::optimize(boost::numeric::ublas::vector<double, boost::numeric::ublas::unbounded_array<double, std::allocator<double> > >&) (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==    by 0x43528C: main (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
==64059==

Compiler used: gcc / g++ 4.8.5

rmcantin commented 4 years ago

generateInitialPoints assumes that the number of possible discrete values is larger than the number of points used for initialization. In that example, par.n_init_samples = 20; so nPoints should be at least 20.
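Until that assumption is checked inside the library, a possible caller-side workaround (a sketch, using the bopt_params / n_init_samples names already mentioned in this thread) is to clamp the initial design to the size of the candidate set:

bopt_params par = initialize_parameters_to_default();
const size_t nPoints = 10;
if (par.n_init_samples > nPoints)
  par.n_init_samples = nPoints;   // never request more initial samples than candidates
// ... then build the xSet of nPoints candidates and construct the DiscreteModel with par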

If you can evaluate all the possible points in the first iteration, what is the point of running an optimization?