sandialabs / pyttb

Python Tensor Toolbox
https://pyttb.readthedocs.io
BSD 2-Clause "Simplified" License

`gcp_opt` with `StochasticSolver`s doesn't report iteration information #251

Closed jeremy-myers closed 1 year ago

jeremy-myers commented 1 year ago

Minimal working example

# (import paths assumed from pyttb's gcp module layout)
import pyttb as ttb
from pyttb.gcp.handles import Objectives
from pyttb.gcp.optimizers import Adam, Adagrad, SGD

X = ttb.tenones((2, 2))
X[0, 1] = 0.0
X[1, 0] = 0.0
rank = 2

# Select Gaussian objective
objective = Objectives.GAUSSIAN

# try not setting printitn (default = 1 in both Adam constructor and gcp_opt)
optimizer = Adam(max_iters=3)
result_adam, _, info_adam = ttb.gcp_opt(
    data=X, rank=rank, objective=objective, optimizer=optimizer
)

# try setting printitn in gcp_opt()
optimizer = Adam(max_iters=3)
result_adam, _, info_adam = ttb.gcp_opt(
    data=X, rank=rank, objective=objective, optimizer=optimizer, printitn=1
)

# try setting printitn in Adam constructor
optimizer = Adam(max_iters=3, printitn=1)
result_adam, _, info_adam = ttb.gcp_opt(
    data=X, rank=rank, objective=objective, optimizer=optimizer
)

# try both
optimizer = Adam(max_iters=3, printitn=1)
result_adam, _, info_adam = ttb.gcp_opt(
    data=X, rank=rank, objective=objective, optimizer=optimizer, printitn=1
)

# try SGD
optimizer = SGD(max_iters=3, printitn=1)
result_sgd, _, info_sgd = ttb.gcp_opt(
    data=X, rank=rank, objective=objective, optimizer=optimizer, printitn=1
)

# try Adagrad
optimizer = Adagrad(max_iters=3, printitn=1)
result_adagrad, _, info_adagrad = ttb.gcp_opt(
    data=X, rank=rank, objective=objective, optimizer=optimizer, printitn=1
)

## <--- no output
ntjohnson1 commented 1 year ago

To clarify: are you not getting iteration information, or ANY printed information? If the first, it looks like I created the msg content but didn't log it here: https://github.com/sandialabs/pyttb/blob/8c9bdbba6706e6f91094009dee53068bc4e96dee/pyttb/gcp/optimizers.py#L200 so just adding a log call similar to the one at the end of the function should resolve this globally for the stochastic optimizers.
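For reference, a minimal sketch of the kind of fix being suggested: build the per-iteration message and actually pass it to the logger, gated on printitn. Function name, message format, and logger name here are illustrative, not pyttb's actual internals:

```python
import logging
from typing import Optional

def log_epoch(epoch: int, f_est: float, printitn: int) -> Optional[str]:
    """Sketch of the missing per-iteration report in the stochastic solvers.

    Returns the message that was logged, or None if reporting is suppressed.
    Names and message format are hypothetical.
    """
    if printitn <= 0 or epoch % printitn != 0:
        return None
    msg = f"Epoch {epoch}: f-est = {f_est:e}"
    # Previously the message was built but never logged; this is the fix.
    logging.getLogger("pyttb").info(msg)
    return msg
```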

jeremy-myers commented 1 year ago

> To clarify: are you not getting iteration information, or ANY printed information? If the first, it looks like I created the msg content but didn't log it here:
>
> https://github.com/sandialabs/pyttb/blob/8c9bdbba6706e6f91094009dee53068bc4e96dee/pyttb/gcp/optimizers.py#L200
>
> so just adding a log call similar to the one at the end of the function should resolve this globally for the stochastic optimizers.

The first. I won't be able to get to it until next week, so someone else is welcome to fix this and close the issue.