Feature suggestion: Support ONNX models? #242
Not a bad idea, but do projects like these not already cover this use-case?
They don't really work unless it's a super simple CNN.
I use ONNX Runtime as an inference engine, but it is quite bloated. I was thinking something like frugally-deep, being a simple header-only library, might be a good alternative for drag'n'drop inference in C++ projects.
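For context, "drag'n'drop" inference with frugally-deep itself is roughly this much code (adapted from its README; the model file name is just an example):

```cpp
// Load a model previously converted with frugally-deep's convert_model.py
// and run a single prediction on it. Nothing external to link against.
#include <fdeep/fdeep.hpp>
#include <iostream>

int main()
{
    const auto model = fdeep::load_model("fdeep_model.json");
    const auto result = model.predict(
        {fdeep::tensor(fdeep::tensor_shape(static_cast<std::size_t>(4)),
                       std::vector<float>{1, 2, 3, 4})});
    std::cout << fdeep::show_tensors(result) << std::endl;
}
```

Something equivalently small, but fed by an ONNX file, would be the goal.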
The only thing is that ONNX is heavily featured. So if you did start supporting it, you might get a flood of ONNX-related issues, something like "doesn't support ... in ONNX".
So I'm willing to bet frugally-deep would rapidly become an ONNX library more so than a Keras library. But that's not a bad thing. That would please the PyTorch community greatly.
I guess there's a reason for that. 😬 Emulating all the tiny idiosyncrasies of Keras in frugally-deep was already a ridiculous time investment. My gut feeling is that trying to support ONNX too might be a huge can of worms. But I'll look into it nonetheless. 🙂
That should be ok. We (the frugally-deep devs) can choose which features we want to support and which we don't, I guess. 😉
I guess you might have to go by ONNX opset rather than choosing specific features to support, for example opset 10 or lower. Then everyone is on the same page as to what is supported. I'm glad you think this is an ok idea.
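As a rough sketch of what pinning an opset could look like on the loading side (this assumes ONNX protobuf headers, e.g. onnx.pb.h, generated from the official onnx.proto with protoc; the file name and the limit of 10 are just placeholders):

```cpp
// Reject models whose default-domain opset is newer than a pinned maximum.
#include <cstdint>
#include <fstream>
#include <iostream>
#include "onnx.pb.h"

bool opset_supported(const onnx::ModelProto& model, std::int64_t max_opset)
{
    for (const auto& opset : model.opset_import())
    {
        // The default ai.onnx domain is encoded as the empty string.
        if (opset.domain().empty() && opset.version() > max_opset)
            return false;
    }
    return true;
}

int main()
{
    onnx::ModelProto model;
    std::ifstream in("model.onnx", std::ios::binary);
    if (!model.ParseFromIstream(&in))
    {
        std::cerr << "Failed to parse model.onnx\n";
        return 1;
    }
    std::cout << (opset_supported(model, 10) ? "supported" : "opset too new") << "\n";
}
```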
Maybe if you supported ONNX, you would get a lot of PRs from the PyTorch/ONNX community. So this could be an investment. :)
Are you interested in giving a POC implementation a try?
I would love to, but I would have to ask my bosses for some time to contribute, which they would probably not agree to. I understand if you pooh-pooh the idea given that I would likely not contribute.
I have no idea how long it would take to convert, say, ONNX opset 10 to the frugally-deep JSON format. I could investigate and maybe do it in a few weekends, but my partner might castrate me.
I understand open source isn't a free service. So I maintain this is just an idea. You are welcome to say no and close the issue.
Ok, I've familiarized myself a bit more with ONNX, and I really like the concept. Also, I believe your suggestion to support it is a good idea. It might help many users and broaden the potential audience for frugally-deep. 👍 ONNX even has good documentation on how to implement a new backend, on opsets, and on how to test one's backend implementation. However, at least right now, it seems far from trivial to adjust frugally-deep in that direction. To me, it seems it's not just "converting ONNX opset […]"
This sounds promising. It's currently a real pain to run inference on ONNX models on mobile platforms or more niche hardware, which big libraries like onnxruntime do not support. Libraries like https://github.com/alibaba/MNN and https://github.com/Tencent/ncnn are designed to make inference as simple as possible (from a library-dependency perspective), but they don't support ONNX particularly well. So embracing the whole header-only ethos for ONNX would be a big leap forward for a lot of people, I think.
In case I start such a totally fresh project, I feel tempted, however, to use Rust instead of C++. 😉
Nooooooooooooo
That will throw portability out the window.
Well, from a personal point of view, I code exclusively in C++, so a Rust library isn't particularly useful unless Rust/C++ interoperability is extremely simple, which I doubt it is. I think you will reach a wider community in C++ than you will in Rust. A lot of languages have bindings to C and C++, not necessarily to Rust. So writing in C++ will please more than just the C++ community. The whole ethos of a header-only C++ library solves a LOT of problems in the embedded world, particularly if your code base is mainly C/C++. If you tell users to bind their code to Rust code, they will find it easier to link to a cumbersome library like onnxruntime. So, from my point of view, writing it in Rust won't gain as much traction.

Also, I don't think Rust solves that many problems. It's too similar to C++. The compiler is slow. It's not faster. If someone really wants to write safe code without engaging their brain, I would choose Go, or V whenever a stable release comes out. So please, please stick to C++.
Thanks a lot for the explanation. 👍
For me, languages are more like tools that can be switched. I don't consider myself a "C++ programmer", just a software developer who currently happens to use C++ for that project. 🧑🏭
There are some promising projects. Most of them, however, focus on calling C++ from Rust. But I think there are also tools for doing it the other way around.
Right now, probably yes. Rust, however, can be compiled to use C calling conventions; the ABI is then similar, and when calling into the pre-compiled library from Python, etc., you ideally won't even notice the difference.
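Purely as an illustration of that idea (every name here is hypothetical, nothing from a real project), the C++ side of calling into a Rust cdylib over the C ABI could be as small as:

```cpp
// Declaration of a function exported from a hypothetical Rust cdylib built
// with `#[no_mangle] pub extern "C" fn predict(...)`. To the C++ compiler
// and linker it looks like any other C library.
#include <cstddef>
#include <iostream>
#include <vector>

extern "C" float predict(const float* input, std::size_t len);

int main()
{
    const std::vector<float> input{1.0f, 2.0f, 3.0f, 4.0f};
    std::cout << predict(input.data(), input.size()) << "\n";
}
```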
Yeah, header files, in general, are quite a C/C++-specific thing. When designing a language, I'd never choose this complication, and I think most modern languages don't.
Maybe, let's hope Rust gains even more traction. Also, when I'm choosing which non-profit projects to work on in my free time, traction is not the main criterion. I'm more interested in solving a particular problem I currently have while developing something else, but especially in having fun and learning what interests me. If there is a big gap in the library market, i.e., a header-only C++ library for ONNX scoring, somebody might fill it at some point. Maybe when I annoy you too much, it might be you in a rage-coding mood? 😉
It targets the same use cases, but the language is very different, I'd say. For me, the Rust-to-C++ relationship is similar to Kotlin-to-Java. Kotlin targets the same use-cases as Java does, but it's much more modern, safer, less verbose, more fun, and more productive to work with. No programming language will dominate forever. Change is the only constant. 😁
Well, you still have to use your brain, especially when learning about the ownership system. Rust, in that sense, feels a bit similar to Haskell for me. You don't debug Rust code, but the Rust compiler debugs your brain. 🧠
Go? The language made for people who are not capable of understanding a brilliant language? 😛 I've not used V yet, but it looks nice. However, the surrounding ecosystem seems very small.
I'll not move frugally-deep over to a different language. But for new projects: meh.
Oh:
That does look good. But it is in Rust... I mean, I could look at calling Rust from C++, but it's not worth the effort for me.
Just noticed that the situation with ARM environments isn't looking great. Most of them build, but almost none are properly tested. Weirdly, MIPS and PowerPC seem to be supported better. I guess Mozilla aren't too fussed.
How about supporting ONNX in frugally-deep? You could have a protobuf importer for ONNX models, or add a tool which converts ONNX to the JSON format you use? Just a thought. A header-only ONNX inference engine would be very, very useful.
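Purely as a sketch of the converter-tool idea (assuming onnx.pb.h generated from the official onnx.proto, plus nlohmann/json, which frugally-deep already uses; the output layout below is made up and not frugally-deep's actual JSON schema):

```cpp
// onnx2json: dump the node structure of an ONNX graph to a JSON file.
// A real converter would also have to handle initializers (weights),
// tensor shapes, and node attributes.
#include <fstream>
#include <iostream>
#include <string>
#include <vector>
#include <nlohmann/json.hpp>
#include "onnx.pb.h"

int main(int argc, char** argv)
{
    if (argc != 3)
    {
        std::cerr << "usage: onnx2json input.onnx output.json\n";
        return 1;
    }
    onnx::ModelProto model;
    std::ifstream in(argv[1], std::ios::binary);
    if (!model.ParseFromIstream(&in))
    {
        std::cerr << "Could not parse " << argv[1] << "\n";
        return 1;
    }
    nlohmann::json doc;
    for (const auto& node : model.graph().node())
    {
        nlohmann::json n;
        n["op_type"] = node.op_type(); // e.g. "Conv", "Relu", "Gemm"
        n["name"] = node.name();
        n["inputs"] = std::vector<std::string>(node.input().begin(), node.input().end());
        n["outputs"] = std::vector<std::string>(node.output().begin(), node.output().end());
        doc["nodes"].push_back(n);
    }
    std::ofstream(argv[2]) << doc.dump(2);
}
```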