glm-tools / pyglmnet

Python implementation of elastic-net regularized generalized linear models
http://glm-tools.github.io/pyglmnet/
MIT License

weird effect in verbose #56

Closed hugoguh closed 8 years ago

hugoguh commented 8 years ago

If I instantiate the same model again, the verbose output adds one repetition: [screenshot: 2016-05-08, 1:39 pm]

It adds one more repetition every time I instantiate a model, and it doesn't even have to be the same model again.

jasmainak commented 8 years ago

hmm ... can you share a few lines of code to reproduce the problem?

hugoguh commented 8 years ago

Sure! Let's see it with only one reg_lambda. First, just like in the README, except setting a single reg_lambda:

import numpy as np
import scipy.sparse as sps
from sklearn.preprocessing import StandardScaler
from pyglmnet import GLM

# instantiate a generalized linear model
model = GLM(distr='poisson', verbose=True, alpha=0.05, reg_lambda=0.1)

n_samples, n_features = 10000, 100

# coefficients
beta0 = np.random.normal(0.0, 1.0, 1)
beta = sps.rand(n_features, 1, 0.1)
beta = np.array(beta.todense())

# training data
Xr = np.random.normal(0.0, 1.0, [n_samples, n_features])
yr = model.simulate(beta0, beta, Xr)

# testing data
Xt = np.random.normal(0.0, 1.0, [n_samples, n_features])
yt = model.simulate(beta0, beta, Xt)

# fit Generalized Linear Model
scaler = StandardScaler().fit(Xr)

Now fit:

model.fit(scaler.transform(Xr), yr)

Here's the output; it looks good except for that extra space ;) [screenshot: 2016-05-09, 11:02 am]

Now instantiate another model (or re-instantiate the same one):

model_another = GLM(distr='poisson', verbose=True, alpha=0.05, reg_lambda=0.1)

Now fit either model:

model.fit(scaler.transform(Xr), yr)

Here's the output; it repeats everything twice: [screenshot: 2016-05-09, 11:06 am]

It looks like the number of repetitions equals the number of models instantiated so far.
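That symptom (one extra repetition per instantiation) matches a common Python logging pitfall: attaching a new handler to a shared logger on every constructor call. Here is a minimal sketch of that failure mode, under the assumption that pyglmnet's verbose machinery works this way; the logger name and helper functions below are hypothetical, not pyglmnet's actual API:

```python
import logging

# Hypothetical demo logger -- NOT pyglmnet's actual logger name.
logger = logging.getLogger('pyglmnet_demo')
logger.setLevel(logging.INFO)

def instantiate_model():
    # What a constructor might do unconditionally on every call:
    # attach a fresh StreamHandler to the shared module-level logger.
    # Each log record is then emitted once per handler, i.e. once per
    # model instantiated so far.
    logger.addHandler(logging.StreamHandler())

instantiate_model()          # 1 handler: each message printed once
instantiate_model()          # 2 handlers: each message printed twice
logger.info('Lambda: 0.1')   # this line now appears twice on stderr

# The usual fix: only attach a handler if none is present yet.
def instantiate_model_fixed():
    if not logger.handlers:
        logger.addHandler(logging.StreamHandler())

instantiate_model_fixed()
assert len(logger.handlers) == 2  # no extra handler was added
```

If this is indeed the cause, guarding the handler registration (or configuring the handler once at module import) would make the verbose output independent of how many models have been created.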

hugoguh commented 8 years ago

While you're at it, also remove that extra spacing after dL/L:... :)