When trying to script TabNet for portability, there are compilation errors in the Sparsemax and Entmax15 functions:
Could not export Python function call 'Entmax15Function'. Remove calls to Python functions before export. Did you forget to add @script or @script_method annotation? If this is a nn.ModuleList, add it to __constants__:
This is a known issue in PyTorch with script compilation of torch.autograd.Function, which appears to have been triaged with no intention to fix: pytorch/pytorch#22329
What is the current behavior?
Compilation with torch.jit.script will fail.
If the current behavior is a bug, please provide the steps to reproduce.
Do you absolutely need TorchScript for your production environment? If you have Python installed in production, then you can install pytorch-tabnet and just use the save/load framework.
Otherwise, please don't hesitate to open a PR with an updated version of Entmax and Sparsemax that is scriptable.
Expected behavior
Should compile with torch.jit.script or torch.jit.trace.
Other relevant information:
tabnet version: 3.1.1 & 4.0
python version: 3.8+
Suggested fix
Include alternative implementations of Sparsemax and Entmax15 that do not use
torch.autograd.Function
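One way to do this is to express the sparsemax forward pass entirely in differentiable tensor ops, so autograd derives the backward pass automatically and no custom torch.autograd.Function is needed. A sketch of this idea (not the library's implementation; last dimension only):

```python
import torch

@torch.jit.script
def sparsemax(z: torch.Tensor) -> torch.Tensor:
    """Sparsemax over the last dimension, built from scriptable ops only."""
    z_sorted, _ = torch.sort(z, dim=-1, descending=True)
    cumsum = z_sorted.cumsum(dim=-1)
    d = z.size(-1)
    k = torch.arange(1, d + 1, device=z.device, dtype=z.dtype)
    # Support set: positions where 1 + k * z_sorted > cumulative sum
    support = k * z_sorted > cumsum - 1.0
    k_max = support.to(z.dtype).sum(dim=-1, keepdim=True)
    # Threshold tau such that the clipped output sums to 1
    tau = (torch.gather(cumsum, -1, k_max.long() - 1) - 1.0) / k_max
    return torch.clamp(z - tau, min=0.0)
```

Because every op here has a defined derivative, gradients flow without a hand-written backward. The trade-off is that the library's custom backward for sparsemax is cheaper than what autograd composes from these primitives, so this sketch may be slower or use more memory in training.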