Conversion for Padding related to dynamic input shape failed #1921
Comments
Has this issue been fixed? It seems like the error remains.
I am facing the same issue here.
I'm also running into this issue! Same thing: I have a padding scheme that relies directly on the input shape, and the conversion does not seem to handle it.
I've applied a fix in a PR, based on 7.0b2. Until it is reviewed and merged, I recommend it as a temporary workaround. Test snippet, which contains 2 different dynamic shape dims:

Output:
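The snippet itself isn't reproduced above; a minimal sketch of what such a test might look like (hypothetical model and dimension choices, assuming PyTorch as the source framework) could be:

```python
import torch
import torch.nn.functional as F

class TwoDynamicDims(torch.nn.Module):
    """Hypothetical model: padding amounts are computed from two
    independent runtime dimensions (height and width)."""
    def forward(self, x):
        h, w = x.shape[-2], x.shape[-1]
        # Pad the last two dims up to even sizes; the pad amounts
        # depend on the dynamic input shape, which is the pattern
        # this issue is about.
        return F.pad(x, (0, w % 2, 0, h % 2))

model = TwoDynamicDims().eval()
out = model(torch.zeros(1, 1, 5, 7))
print(tuple(out.shape))  # (1, 1, 6, 8)
```

In eager PyTorch this runs fine; the failure only appears when converting the model with both dimensions marked as flexible.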
🐞Describing the bug
If a model contains a padding op whose pad amounts are derived from a dynamic input shape, the conversion fails.
Stack Trace
To Reproduce
My locally installed coremltools has no changes.

System environment (please complete the following information):
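For reference, a minimal repro along these lines (a sketch with an assumed pad-to-multiple-of-4 scheme, not the reporter's original code) would be:

```python
import torch
import torch.nn.functional as F

class DynamicPad(torch.nn.Module):
    """Hypothetical repro: pad the last dim up to a multiple of 4,
    so the pad amount depends on the runtime input shape."""
    def forward(self, x):
        pad = (4 - x.shape[-1] % 4) % 4
        return F.pad(x, (0, pad))

model = DynamicPad().eval()
traced = torch.jit.trace(model, torch.zeros(1, 3, 7))
out = traced(torch.zeros(1, 3, 7))
print(tuple(out.shape))  # (1, 3, 8)

# Converting the traced model with a flexible last dimension is the
# step that fails, e.g. (assumed usage, not run here):
#   import coremltools as ct
#   ct.convert(traced,
#              inputs=[ct.TensorType(shape=(1, 3, ct.RangeDim(1, 64)))])
```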
Additional context
A more meaningful example of this use case comes from Self-Attention with Relative Position Representations, where the padding is driven by a window_size defined in the network, regardless of the input. Also, I'm not sure whether this is a bug or expected behavior (i.e., a known unsupported case).