[PyTorch] torch.listconstruct causing issue for other ops #1926
In the proposed fix to issue #1303, we are trying to gather symbols ...
In reproducing issue #1921, the ... (a sketch of such a reproduction is below)
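The original snippet did not survive here, so purely as an illustration (not the exact model from #1921, and with names and shapes of my own choosing): a reproduction of this pattern pads with amounts computed from the input's shape, so the traced graph contains a `prim::ListConstruct` whose elements become symbolic once the input shape is made flexible.

```python
import torch
import torch.nn.functional as F
import coremltools as ct

class PadFromShape(torch.nn.Module):
    def forward(self, x):
        # The padding amount is derived from the input shape; tracing records
        # this as aten::size arithmetic feeding a prim::ListConstruct.
        diff = x.shape[-1] - x.shape[-2]
        return F.pad(x, (0, diff))

traced = torch.jit.trace(PadFromShape().eval(), torch.rand(1, 3, 8, 16))

# With flexible spatial dims, the ListConstruct elements are symbolic at
# conversion time, which is the situation this issue is about.
mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(shape=(1, 3, ct.RangeDim(1, 64), ct.RangeDim(1, 64)))],
)
```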
Yes, for the #1921 case (for op 'pad'):

If we want to fix it by following the solution to #1303 (#1922), we basically want: ...

To be specific, in the #1921 case, if we hard-code the solution for it and modify the op, then #1921 is confirmed fixed (temporarily). A sketch of that hard-coded workaround is given below.
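To show what "hard-coded" means here, this is only a sketch of the idea, not the actual patch; the helper name and the substitute value are mine. Any padding entry that is still a symbol at conversion time gets replaced by a fixed integer that we know holds for the model at hand:

```python
import sympy  # coremltools propagates symbolic dimensions as sympy expressions

def hardcode_symbolic_pads(pad_values, assumed_value):
    """Temporary workaround sketch: swap every symbolic padding amount for a
    fixed integer so the downstream CoreML pad op only sees constants."""
    return [
        assumed_value if isinstance(v, sympy.Basic) else int(v)
        for v in pad_values
    ]

# e.g. a gathered padding list [0, is1 - is2] becomes [0, 8] if we decide to
# assume the deployed input is always 8 wider than it is tall.
print(hardcode_symbolic_pads([0, sympy.Symbol("is1") - sympy.Symbol("is2")], 8))
```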
However, I have some doubts before proposing a general fix. In the above case, the padding value is hard-coded.

My question: how can we relate from node '9'.inputs all the way back to input x's symbolic value, with the data structure: ...

Edit: I've noticed ...
Hi @xorange, thanks for looking into this issue! About relating a symbol to input symbols, you should be able to simply compare whether those symbols are the same: we propagate symbols using ...

As for the fix, I have several thoughts that might be easier: ...
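The rest of this reply is truncated above, but the comparison it describes can be done directly on the propagated values: coremltools represents symbolic dimensions as sympy symbols, so "the same symbol" is plain equality. A minimal sketch (the import locations reflect my reading of the code base and may differ between versions):

```python
from coremltools.converters.mil.mil import get_new_symbol
from coremltools.converters.mil.mil.types.symbolic import is_symbolic

# Pretend input x was declared with a flexible last dimension.
s0 = get_new_symbol()
x_shape = (1, 3, 8, s0)

# A value gathered from a ListConstruct input somewhere downstream.
gathered = x_shape[-1]

assert is_symbolic(gathered)
# Relating it back to input x is just checking symbol equality.
assert gathered == x_shape[3]
```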
Quoting from #2050:

Based on #2037, and another net structure I have on hand that shares a similar root cause, it is clear that only targeting the op, for example ... or ...
Thanks for the reply! I'll look into it.

Yes, this should be a cleaner way for ...

I think we both agree that a generalized fix is what we want here, because I've already come across several cases that ... Could you share some functions that rely on a "list of symbols" for me to design against? Let me see if I can cover those, or learn the current design better (because clearly I'm missing something here).
Unfortunately I cannot tell off the top of my head 😞 We could try to modify ...
Any progress on this issue?
None on my end. I'm having trouble balancing work and spare time for this, and it takes a lot to digest the whole design. No progress will come from me before 2024 Q4 at the earliest, sorry.
This is the root cause of many issues. When a symbolic shape is involved in torch.listconstruct, instead of building a CoreML tensor, we simply return the list as is. A sketch of the fix direction this implies is given after the appendices below.

Appendix 1: Issues Sharing the Same Root Cause

Appendix 2: Ops Impacted by the Root Cause
torch.GroupNorm
torch.pad
torch.index_put
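As a sketch of that fix direction only (not the actual coremltools implementation), the list elements could be assembled into a single 1-D MIL tensor whenever any of them is only known symbolically, instead of handing a plain Python list to the downstream op. The helper below assumes it runs while the MIL function is being built and that every element is a scalar Var; the function name is mine:

```python
from coremltools.converters.mil import Builder as mb

def listconstruct_as_tensor(element_vars):
    """Sketch: turn the (possibly symbolic) elements of a torch.listconstruct
    into one 1-D tensor instead of returning the raw Python list."""
    if all(v.val is not None for v in element_vars):
        # Every element is a compile-time constant: today's plain list is fine.
        return [v.val for v in element_vars]
    # At least one element is symbolic, so build the tensor at runtime:
    # make each scalar rank-1, then concatenate along axis 0.
    pieces = [mb.expand_dims(x=v, axes=[0]) for v in element_vars]
    return mb.concat(values=pieces, axis=0)
```

Ops such as pad, GroupNorm, and index_put would then receive a real tensor whose values are resolved at runtime, rather than a list that still contains symbols.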