Closed: farazk86 closed this issue 2 years ago
@farazk86 thank you for submitting the issue. With the information you provided, I was able to recreate the issue on my end.
Hi,
Thanks for verifying the bug. Is there any update on this? Is the problem related to TensorFlow 2? Would going back to an earlier version of Ludwig fix this?
@farazk86 re: the sampled_softmax issue. We are still looking at it. In the meantime, you should be able to use the regular softmax cross entropy loss function.
I am still getting the same error with softmax_cross_entropy :(
Below is my model file:
```yaml
training:
  epochs: 800
  early_stop: 60
  batch_size: 128
  # dropout_rate: 0.3
input_features:
  - name: nhl
    type: text
    level: word
    encoder: t5
    reduce_output: null
    trainable: true
output_features:
  - name: volpiano
    type: text
    level: word
    decoder: generator
    attention: bahdanau
    loss:
      type: softmax_cross_entropy
```
error:
ValueError: Invalid reduction dimension 2 for input with 2 dimensions. for '{{node ecd/text_output_feature/Max}} = Max[T=DT_FLOAT, Tidx=DT_INT32, keep_dims=false](ecd/text_output_feature/Abs_1, ecd/text_output_feature/Max/reduction_indices)' with input shapes: [128,512], [] and with computed input tensors: input[1] = <2>.
Please advise. :(
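For context, the ValueError above is TensorFlow complaining that a max-reduction is being applied along axis 2 of a tensor that only has 2 dimensions (shape [128, 512]). The same class of error can be reproduced with a plain NumPy array; this is an illustration of the error, not Ludwig code:

```python
import numpy as np

logits = np.zeros((128, 512))  # same rank as the tensor in the traceback

# Reducing over axis 1 is valid for a 2-D array...
print(np.max(logits, axis=1).shape)  # (128,)

# ...but axis 2 does not exist, which raises the analogous error.
# (NumPy's AxisError is a subclass of ValueError.)
try:
    np.max(logits, axis=2)
except ValueError as exc:
    print(exc)  # e.g. "axis 2 is out of bounds for array of dimension 2"
```

So somewhere in the text output feature's loss path, a reduction over a third (token or vocabulary) axis is being applied to a tensor that has already been collapsed to [batch, hidden].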
Hi @jimthompson5802, I admire this library and I would like to contribute.
I think this line is causing the error. I tried commenting it out while keeping the loss type softmax_cross_entropy, and it worked. I am not sure whether that could have undesirable effects.
If you think I can work on this issue, please let me know.
@jenishah thank you for the positive feedback on Ludwig.
Re: the issue you are reporting. Please provide the config.yaml and, if possible, a small dataset so I can see whether I can reproduce the error.
Thanks for identifying the cause of this error :)
Did you find any effect of commenting this line out on your final result?
@farazk86 @jenishah I forgot to mention that a PR was recently merged that fixed the sampled_softmax issue. This fix will be in the next release of Ludwig. In the meantime, if you want to see if the fix addresses this issue, just uninstall Ludwig and reinstall from Ludwig's main branch using this command:
pip install git+https://github.com/ludwig-ai/ludwig.git
Let me know if this helped or not.
Hi Jim,
Thanks for the reply. I was excited that the problem was resolved, so I uninstalled the previous Ludwig and installed from the location above as mentioned, but unfortunately I am getting the same error as before on the same dataset from my first post :(
ValueError: Invalid reduction dimension 2 for input with 2 dimensions. for '{{node ecd/text_output_feature/Max}} = Max[T=DT_FLOAT, Tidx=DT_INT32, keep_dims=false](ecd/text_output_feature/Abs_1, ecd/text_output_feature/Max/reduction_indices)' with input shapes: [128,512], [] and with computed input tensors: input[1] = <2>.
I tried with both sampled_softmax_cross_entropy and softmax_cross_entropy.
After installing from the git link, my Ludwig version is 0.4.dev0.
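As an aside, a quick way to confirm which build is actually active after a reinstall is to query the package metadata from the same interpreter that runs the training job. This is a small helper sketch using only the standard library; the package name "ludwig" is the one from this thread:

```python
from importlib import metadata
from typing import Optional

def installed_version(package: str) -> Optional[str]:
    """Return the installed version of *package*, or None if it is absent."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return None

# After installing from the main branch, this should report a dev build
# such as "0.4.dev0"; None means the package is not visible to this
# interpreter (a common cause of "the fix didn't take effect").
print(installed_version("ludwig"))
```

Checking this helps rule out the case where pip installed the new build into a different environment than the one actually being used.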
@farazk86 thank you for the feedback. Looks like there may be another subtle issue still left over from the latest PR. Let me take a look at it.
I was able to run the code with that, but I did not get a chance to look deeply enough to verify whether it is safe to add a flag to skip that part.
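The "flag to skip that part" idea could look roughly like the sketch below. This is purely illustrative with hypothetical names, not Ludwig's actual code: the point is that guarding the extra reduction behind a flag (or, better, a rank check) avoids reducing over an axis that no longer exists:

```python
import numpy as np

def reduce_token_errors(abs_errors: np.ndarray,
                        reduce_over_vocab: bool = True) -> np.ndarray:
    """Toy stand-in for the reduction that fails in the traceback.

    `abs_errors` may arrive as [batch, tokens, vocab] or, as in the bug
    report, already collapsed to [batch, hidden].  Only apply the
    axis-2 reduction when that axis actually exists.
    """
    if reduce_over_vocab and abs_errors.ndim == 3:
        return abs_errors.max(axis=2)  # safe: axis 2 exists
    return abs_errors                  # skip the problematic reduction

print(reduce_token_errors(np.ones((128, 512))).shape)      # (128, 512)
print(reduce_token_errors(np.ones((128, 512, 32))).shape)  # (128, 512)
```

Whether skipping the reduction is semantically correct for the loss computation is exactly the open question raised above, so a rank check alone may only mask the underlying shape mismatch.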
Hi @jenishah, did you manage to run the code with the line you referred to earlier commented out, or did your installation from GitHub run on my data without any modification to the code?
Thanks
I ran on your data and your yaml file, but I commented out the line that I mentioned. Also, it worked only with the softmax_cross_entropy loss.
We are looking into this issue anyway, will update soon.
Have there been any updates on the reduce input error? I am also experiencing the same issue and receiving the same error reported above:
ValueError: Invalid reduction dimension 2 for input with 2 dimensions. for '{{node ecd/text_output_feature/Max}} = Max[T=DT_FLOAT, Tidx=DT_INT32, keep_dims=false](ecd/text_output_feature/Abs_1, ecd/text_output_feature/Max/reduction_indices)' with input shapes: [64,512], [] and with computed input tensors: input[1] = <2>.
Thanks in advance for your help!
@Losbal just to confirm, this still happens in the latest release v0.4.1? What if you tried using the current master (v0.5-dev0)? We have done a big rework on master so it's very likely that the issue was solved as a side effect of it. If not @justinxzhao could potentially help.
@w4nderlust Yes, I was using v0.4.1 but I just installed v0.5 on a separate cluster to test and it seems to have resolved the issue. Thanks for your help! (N.B. I mentioned a different KeyError after switching to v0.5 in an earlier version of this reply but was able to resolve it, so please disregard it if you caught it pre-edit).
Thanks for the update @Losbal. Glad you aren't running into any errors on 0.5. Closing, though feel free to reopen if you run into any issues.
@farazk86 @Losbal sorry it took so long to solve the issue, glad we managed to do it though! :)
Describe the bug
Hi, I previously used Ludwig when it was using the TensorFlow 1.x backend, and I created a machine translation project with it. But now, after updating Ludwig to the latest version, I can no longer run the same project. I am basically following the configuration mentioned in the example for translation: https://ludwig-ai.github.io/ludwig-docs/examples/#machine-translation
Below is my model_definition.yaml:
my terminal command:
This is the error that is produced:
Here are a few lines from the
training_file.csv
Environment (please complete the following information):