segalinc opened this issue 4 years ago
This may be related to issue #229, which I posted earlier. It seems to be something related to the preprocessing of the input, but I still can't say exactly what it is.
I am also using all libs in (seemingly) the right versions, yet the assertion error persists: image_T_only produces NaNs.
Yeah, it loses its usability if you can only play with it in their environment.
Yes, I tried to hack the whole thing apart; even after pinning the dependency versions of numpy and tensorflow it does not work. It still gets NaNs as outputs.
I think one path to explore is to see what make_vis_T and the show function are doing. It seems a bit crazy to me that matching versions can't guarantee it works, but there may be more dependencies in the backend that we are not aware of...
Please, if you manage to solve it, report back. Thank you.
For me it is more a matter of not being able to load some of the attributes in some modules, or to visualize the objects correctly, plus assertion errors as well. If I had the time I would re-implement what I need, but I am lacking that at the moment.
Keep me posted as well, thank you.
Which code are you trying to execute? Are you able to go through the tutorial, but outside the Colab env?
I am playing around with the texture 3D synthesis. I can run it on Colab, but outside it's a mess.
I think this is the case for the other notebooks too. The basic tutorial does the same thing (and it is the basics! It's not as hard as 3D textures). Maybe there is some preprocessing that adapts the output to the _img_url function that helps display in Colab, and this preprocessing is breaking other things.
I ask the maintainers and the code author to try to revise it for Jupyter notebooks on a completely fresh installation, and see what happens...
that would be really cool!!
I would start with the following strategy: try to execute in eager mode, or try to get rid of every TensorFlow function that preprocesses the image. Since you have to start a session to get the tensor, the preprocessing only takes place when you run the whole thing.
Then see what is going on with the numbers in numpy; once you have solved the NaN problem, go back to the tf functions.
But I still don't know if this is going to work. Best regards!
Try to put this before all the code you are executing: tf.enable_eager_execution() and tf.enable_resource_variables().
Then go on hacking and inspecting the variable values...
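A minimal sketch of that suggestion, assuming the TF 1.x API (both calls exist there). They have to run before any graph is built or any model is imported, and whether lucid's graph-based renderer still works in eager mode is an open question, so this is only for isolating the preprocessing steps:

import tensorflow as tf

# Must run before importing the model or building any ops.
tf.enable_eager_execution()
tf.enable_resource_variables()

# From here on tensors evaluate immediately, so intermediate preprocessing
# values can be inspected as plain numpy arrays via .numpy().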
Or you can simply create a session with the graph in the context (the assumption is that it has already been built by the Keras model instance), generate an init op and run it, then eval the image, but that seems like a Homeric workaround.
You can use something like this to try to debug the code while the devs are too busy to attend to this bug.
import matplotlib.pyplot as plt
import tensorflow as tf

def debug_info(tensor):
    # Evaluate `tensor` in a fresh session on the default graph and display it.
    with tf.Session(graph=tf.get_default_graph()) as sess:
        sess.run(tf.global_variables_initializer())
        t = tensor.eval()
        print(t.shape)
        print(t)
        # imshow expects floats in [0, 1], so no *255 scaling here
        plt.imshow(t[0, ...])
        plt.show()
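A hypothetical usage, assuming model and T come from render.make_vis_T on the default graph (note that debug_info re-initializes the variables, so this shows the freshly initialized visualization input, which is enough to check shapes and catch NaNs at the start):

T = render.make_vis_T(model, "mixed4a_pre_relu:476")
debug_info(T("input"))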
The thing I am wondering might be causing the bug is not the preprocessing, since that showed numerical matrices, but objectives.as_objective could be breaking the path to the layer or the tensor, returning something other than the tensor it should return. But I can't look into this right now. See you later.
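A quick, hedged way to test that hypothesis in isolation; this assumes as_objective lives in lucid.optvis.objectives and that the resulting Objective is callable with the T lookup function returned by render.make_vis_T:

from lucid.optvis import objectives

# Parse the "layer:channel" string into an objective object.
obj = objectives.as_objective("mixed4a_pre_relu:476")
print(type(obj))  # expected: an Objective instance, not a raw string
# obj(T) should then return the scalar loss tensor for that layer/channel,
# where T is the tensor-lookup function from render.make_vis_T.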
Thanks for reaching out @cristinasegalin and @Uiuran .
Lucid itself does not depend on colab, but code in the notebooks often relies on colab features to create visualizations of lucid's output. I personally use lucid outside of colab on a day to day basis, as do many of my colleagues.
In order to say more, I'd need you to report specific failures with debugging details.
Please keep in mind that Lucid is research code. It's maintained by researchers actively engaged in interpretability research. We share it with the community in the hope that it is helpful, but we don't have the capacity to provide detailed user support or debugging.
Linus method. Thank you C. Olah.
One of the problems is that the show function returns NaN. However, the make_t_image and transform_f evaluations return numeric values, which makes me wonder whether the problem is in objectives.as_objective, which seems to translate the layer name passed to the function and return a Tensor.
As I mentioned earlier, the first try of render_vis in the tutorial notebook returns NaNs, and the same happens in @cristinasegalin's notebook. This makes me think the problem may be the same.
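A hedged sketch to localize where the NaNs first appear, assuming render_vis accepts the thresholds and verbose arguments; with verbose=False the show()/serialize_array path (where the assertion fires) is skipped:

import numpy as np

imgs = render.render_vis(model, "mixed4a_pre_relu:476", thresholds=(64,), verbose=False)
# True here means the optimization itself produces NaNs,
# not the display/serialization path.
print(np.isnan(imgs[0]).any())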
You use it on a daily basis, but which parameters did you change to use it in your Jupyter? (Or do you use the Google Colab version anyway? Unfortunately that code is not open.)
Thank you again. I am trying the following (this is mentioned in the render_vis docstring):
with tf.get_default_graph().as_default() as graph, tf.Session() as sess:
    T = render.make_vis_T(model, "mixed4a:300")
    tf.global_variables_initializer().run()  # initialize_all_variables is deprecated
    for i in range(500):
        if i % 10 == 0:
            print(i)
        sess.run(T("vis_op"))  # run one optimization step
    print(T("input").eval()[0].shape)
    plt.imshow(T("input").eval()[0])  # imshow expects floats in [0, 1], so no *255
    plt.show()
But something seems not to be working, since I only get gray images or NaN (the algorithm is non-deterministic somewhere too...).
I can't get the code to work, neither the tutorial notebook itself nor this simpler (or "more boring", as the docstring puts it) version. As Colah said, there may be numerical stability issues, but that never happened to me when I was working on the deep dream code, so I will go back to that code, and as soon as this works outside Colab I will come back to contribute.
An aside: the mere fact that something works in the cloud but not outside of it, even with the same dependencies (apparently...), is bizarre (of course, that is why we are investigating). It means there is something out there that is a dependency of Lucid.
Another option would be for the authors to contribute scripts that work outside Colab, to show what kind of source code they think does the job in any environment, or... to surface the extra dependencies that are not visible now.
I am really eager to see this work, because doing art with the old original deep dream is pretty tedious; I am having to structure a lot of code instead of spending that time editing images. Also, the old deep dream has a lot of gimmicky magic such as those random tiles and Laplacians; slowly I want to see it made understandable and usable by digital artists. I know you are authors of the old Deep Dream too, so I hope we can work on this together.
Regards.
Sorry for the late response, I had to work on other stuff. I also took the time to run all the Colab notebooks outside Colab, and they actually work except for a few parts where I think there are some discrepancies between Python 2 and Python 3 (so I changed a few things to adjust) or that use Svelte, which I haven't set up. So I wasn't fully able to run the Building Blocks and Activation Atlas notebooks because of that, but all the others actually work outside, in a Jupyter notebook. I guess, since there are some dependencies on IPython, some of the functions may not work in a plain Python environment.
The ones that don't work for me, and that I am interested in, are the texture synthesis and 3D style transfer. When I run them and try to visualize the results, nothing is shown. I think it is a problem with OpenGL, as I get an error when trying to define renderer = glrenderer.MeshRenderer((512, 512)) or trying to visualize show.textured_mesh.
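A minimal isolation test for that, assuming the notebook imports the renderer from lucid.misc.gl; if the import or the constructor already raises here, the problem is the OpenGL/EGL setup rather than the notebook itself:

from lucid.misc.gl import glrenderer

# Constructing the renderer is enough to exercise OpenGL context creation.
renderer = glrenderer.MeshRenderer((512, 512))
print(renderer)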
Can you post your configuration, your lucid version, and the versions of the dependencies you are running? (I was not able to run either the 3D texture notebook or the basics.) Maybe if I fix my dependencies this could be turned into a commit to the lucid install setup, and then you could get help from more hands.
Was there ever any resolution on this issue? I can't seem to get Lucid to run the code from the first tutorial outside of Colab.
import lucid.modelzoo.vision_models as models
import lucid.optvis.render as render

model = models.InceptionV1()
model.load_graphdef()
_ = render.render_vis(model, "mixed4a_pre_relu:476")
gives...
File "/home/emassey3/anaconda3/envs/tf1/lib/python3.7/site-packages/lucid/misc/io/serialize_array.py", line 51, in _normalize_array assert not np.isnan(array).any() AssertionError
I am trying to run one of the ipynb notebooks outside Colab, in a Jupyter notebook, but the visualizations are not shown and some models don't work.
I am using the latest version of lucid, which I think is 0.3.8, and tensorflow-gpu 1.14.0.