Hi, I want to freeze the model to run a unit test. When I run the command
"g2p-seq2seq --model_dir model_folder_bre --freeze"
I get this error:
AssertionError: transformer/parallel_0_5/transformer/body/decoder/layer_0/self_attention/multihead_attention/dot_product_attention/Softmax is not in graph.
I don't know how to fix this bug. I'd be grateful for your help.