
Feature suggestion: Support ONNX models? #242

Open
pfeatherstone opened this issue Aug 15, 2020 · 24 comments

Comments

@pfeatherstone

How about supporting ONNX in frugally-deep? You could have a protobuf importer for ONNX models, or add a tool which converts ONNX to the JSON format you use. Just a thought. A header-only ONNX inference engine would be very, very useful.

@Dobiasd
Owner

Dobiasd commented Aug 15, 2020

Not a bad idea, but do projects like these not already cover this use-case?

(model.onnx -> onnx2keras -> model.h5 -> frugally-deep/keras_export/convert_model.py -> model.json -> #include <fdeep.hpp>)
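
The last step of that pipeline would then look roughly like this on the C++ side (a minimal sketch based on the frugally-deep README; the input shape and values are placeholders, and the exact tensor API may differ between fdeep versions):

```cpp
// Load a model previously converted with keras_export/convert_model.py
// and run a single prediction on a dummy 1D input of length 4.
#include <fdeep/fdeep.hpp>
#include <iostream>
#include <vector>

int main()
{
    const auto model = fdeep::load_model("model.json");
    const auto result = model.predict(
        {fdeep::tensor(fdeep::tensor_shape(static_cast<std::size_t>(4)),
                       std::vector<float>{1.0f, 2.0f, 3.0f, 4.0f})});
    std::cout << fdeep::show_tensors(result) << std::endl;
}
```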

@pfeatherstone
Author

They don’t really work unless it’s a super simple CNN.

@pfeatherstone
Author

I use ONNX Runtime as an inference engine, but it is quite bloated. I was thinking something like frugally-deep, which is a simple header-only library, might be a good alternative for drag-and-drop inference in C++ projects.

@pfeatherstone
Author

The only thing is that ONNX has a lot of features. So if you did start supporting it, you might get a flood of ONNX-related issues, something like “doesn’t support ... in ONNX”.

@pfeatherstone
Author

So I’m willing to bet frugally-deep would rapidly become an ONNX library more so than a Keras library. But that’s not a bad thing. It would please the PyTorch community greatly.

@Dobiasd
Owner

Dobiasd commented Aug 15, 2020

They don’t really work unless it’s a super simple CNN.

I guess there's a reason for that. 😬

Emulating all the tiny idiosyncrasies of Keras in frugally-deep was already a ridiculous time investment. My gut feeling is that trying to support ONNX too might be a huge can of worms. But I'll look into it nonetheless. 🙂


The only thing is that ONNX has a lot of features. So if you did start supporting it, you might get a flood of ONNX-related issues, something like “doesn’t support ... in ONNX”.

That should be OK. We (the frugally-deep devs) can choose which features we want to support and which we don't, I guess. 😉

@pfeatherstone
Author

I guess you might have to go by ONNX opset rather than choosing specific features to support, for example opset 10 or lower. Then everyone is on the same page as to what is supported. I’m glad you think this is an OK idea.
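
For what it's worth, the opset an ONNX file targets is stored in the model's opset_import field, so a converter could check it up front and bail out early. A minimal sketch using the protobuf classes generated from onnx.proto; the header name and build setup are assumptions and depend on how ONNX is installed:

```cpp
// Print the opset version(s) declared by an ONNX model file.
#include <onnx/onnx_pb.h>
#include <fstream>
#include <iostream>
#include <string>

int main()
{
    std::ifstream file("model.onnx", std::ios::binary);
    onnx::ModelProto model;
    if (!model.ParseFromIstream(&file))
    {
        std::cerr << "Failed to parse model.onnx\n";
        return 1;
    }
    for (const auto& opset : model.opset_import())
    {
        // An empty domain string means the default "ai.onnx" operator set.
        const std::string domain =
            opset.domain().empty() ? std::string("ai.onnx") : opset.domain();
        std::cout << domain << ": opset " << opset.version() << "\n";
    }
}
```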

@pfeatherstone
Author

Maybe if you supported ONNX, you would get a lot of PRs from the PyTorch/ONNX community. So this could be an investment. :)

@Dobiasd
Owner

Dobiasd commented Aug 15, 2020

Are you interested in giving the implementation of a POC a try?

@pfeatherstone
Author

I would love to, but I would have to ask my bosses for some time to contribute, which they would probably not agree to. I understand if you poo-poo the idea given that I would likely not be able to contribute.

@pfeatherstone
Author

pfeatherstone commented Aug 15, 2020

I have no idea how long it would take to convert, say, ONNX opset 10 to the frugally-deep JSON format. I could investigate and maybe do it in a few weekends, but my partner might castrate me.

@pfeatherstone
Author

I understand open source isn’t a free service. So I maintain this is just an idea. You are welcome to say no and close the issue.

@Dobiasd
Owner

Dobiasd commented Aug 17, 2020

Ok, I've familiarized myself a bit more with ONNX, and I really like the concept. Also, I believe your suggestion to support it is a good idea. It might help many users and broaden the potential audience for frugally-deep. 👍

ONNX even has good documentation on how to implement a new backend, on opsets, and even on how to test one's backend implementation.

However, at least right now, it seems far from trivial to adjust frugally-deep in that direction. To me, it seems it's not just "converting ONNX opset n to the fdeep-JSON format", but also changing and adding a lot of internal things. If we wanted to do this, it might even make sense to support only ONNX. Maybe this would even be a new project that just re-uses the knowledge gained by implementing frugally-deep. Currently, I don't think I have enough free time on my hands to start such an initiative. But I'll keep it in mind.
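
To give an idea of the scope: a converter or backend would essentially have to walk the ONNX graph and map every node's op_type onto an existing fdeep layer (or reject the model). A very rough sketch of that idea, using the protobuf classes generated from onnx.proto; convert_node is a hypothetical placeholder, not an existing frugally-deep function:

```cpp
// Hypothetical skeleton of an ONNX graph walk. convert_node() is made up;
// it stands in for "emit the matching fdeep layer or fail loudly".
#include <onnx/onnx_pb.h>
#include <stdexcept>

void convert_node(const onnx::NodeProto& node)
{
    // Only a handful of ops as an example; real coverage is the hard part.
    if (node.op_type() == "Conv" || node.op_type() == "Relu" ||
        node.op_type() == "Add")
    {
        // ... translate attributes/weights into the fdeep JSON layer format ...
    }
    else
    {
        throw std::runtime_error("unsupported ONNX op: " + node.op_type());
    }
}

void convert_graph(const onnx::ModelProto& model)
{
    // The ONNX spec requires the node list to be topologically sorted.
    for (const auto& node : model.graph().node())
    {
        convert_node(node);
    }
}
```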

@pfeatherstone
Author

This sounds promising. It's currently a real pain to run ONNX models on mobile platforms or more niche hardware, which big libraries like onnxruntime do not support. Libraries like https://github.com/alibaba/MNN and https://github.com/Tencent/ncnn are designed to make inference as simple as possible (from a library-dependency perspective), but they don't support ONNX particularly well. So embracing the whole header-only ethos for ONNX would be a big leap forward for a lot of people, I think.

@Dobiasd
Owner

Dobiasd commented Aug 20, 2020

In case I start such a totally fresh project, however, I feel tempted to use Rust instead of C++. 😉

@pfeatherstone
Author

Nooooooooooooo

@pfeatherstone
Author

That will throw portability out the window.

@Dobiasd
Owner

Dobiasd commented Aug 20, 2020

rustc also supports more than one platform. 😉
But I guess there are devices out there that one can target with C++ but not with Rust. Could you give a concrete example that you see as problematic?

@pfeatherstone
Author

Well, from a personal point of view, I code exclusively in C++, so a Rust library isn't particularly useful unless Rust/C++ interoperability is extremely simple, which I doubt it is. I think you will reach a wider community in C++ than you will in Rust. A lot of languages have bindings to C and C++, not necessarily to Rust. So writing in C++ will please more than just the C++ community. The whole ethos of a header-only C++ library solves a LOT of problems in the embedded world, particularly if your code base is mainly C/C++. If you tell users to bind their code to Rust code, they will find it easier to link to a cumbersome library like onnxruntime. So, from my point of view, writing it in Rust won't gain as much traction. Also, I don't think Rust solves that many problems. It's too similar to C++. The compiler is slow. It's not faster. If someone really wants to write safe code without engaging their brain, I would choose Go, or V whenever a stable release comes out. So please, please stick to C++.

@Dobiasd
Owner

Dobiasd commented Aug 21, 2020

Thanks a lot for the explanation. 👍

Well, from a personal point of view, I code exclusively in C++

For me, languages are more like tools that can be switched. I don't consider myself a "C++ programmer", just a software developer who currently happens to use C++ for that project. 🧑‍🏭

so a Rust library isn't particularly useful unless Rust/C++ interoperability is extremely simple

There are some promising projects. Most of them, however, focus on calling C++ from Rust. But I think there are also projects for doing it the other way around.

So writing in C++ will please more than just the C++ community.

Right now, probably yes. Rust, however, can be compiled to expose C calling conventions; the ABI then looks like plain C, and when calling into the pre-compiled library from Python etc., you ideally won't even notice the difference.
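
As a rough illustration of what that could look like from the C++ side: the Rust crate would be built as a cdylib exposing #[no_mangle] pub extern "C" functions, and the C++ program just declares and calls them. The function and library names below are made up:

```cpp
// Hypothetical C++ caller of a Rust inference library with a C ABI.
#include <cstddef>
#include <iostream>
#include <vector>

extern "C"
{
    // Implemented in Rust; returns 0 on success and writes into `output`.
    int onnx_score(const float* input, std::size_t input_len,
                   float* output, std::size_t output_len);
}

int main()
{
    const std::vector<float> input{1.0f, 2.0f, 3.0f, 4.0f};
    std::vector<float> output(10, 0.0f);
    if (onnx_score(input.data(), input.size(), output.data(), output.size()) == 0)
    {
        std::cout << "first output value: " << output.front() << "\n";
    }
}
```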

The whole ethos of a header-only C++ library solves a LOT of problems in the embedded world, particularly if your code base is mainly C/C++.

Yeah, header files are, in general, a quite C/C++-specific thing. When designing a language, I'd never choose this complication, and I think most modern languages don't.

If you tell users to bind their code to Rust code, they will find it easier to link to a cumbersome library like onnxruntime. So, from my point of view, writing it in Rust won't gain as much traction.

Maybe; let's hope Rust gains even more traction. Also, when I'm choosing which non-profit projects to work on in my free time, traction is not the main reason. I'm more interested in solving a particular problem I currently have while developing something else, and especially in having fun and learning what interests me. If there is a big gap in the library market, i.e., a C++ header-only library for ONNX scoring, somebody might fill it at some point. Maybe, when I annoy you too much, it might be you in a rage-coding mood? 😉

Also, I don't think Rust solves that many problems. It's too similar to C++.

It targets the same use cases, but the language is very different, I'd say. For me, the Rust-to-C++ relationship is similar to Kotlin-to-Java: Kotlin targets the same use cases as Java does, but it's much more modern, safer, less verbose, more fun, and more productive to work with. No programming language will dominate forever. Change is the only constant. 😁

If someone really wants to write safe code without engaging their brain

Well, you still have to use your brain, especially when learning about the ownership system. Rust, in that sense, feels a bit similar to Haskell to me. You don't debug Rust code, but the Rust compiler debugs your brain. 🧠

I would choose Go, or V whenever a stable release comes out.

Go? The language made for people who are not capable of understanding a brilliant language? 😛

I've not used V yet, but it looks nice. However, the surrounding ecosystem seems very small.

So please, please stick to C++.

I won't move frugally-deep over to a different language. But for new projects: Meh.

@Dobiasd
Owner

Dobiasd commented Aug 21, 2020

Oh:
https://github.com/snipsco/tract
Nice! 🙂

@pfeatherstone
Author

That does look good. But it is in Rust... I mean, I could look at calling Rust from C++, but it's not worth the effort for me.

@pfeatherstone
Author

rustc also supports more than one platform. 😉
But I guess there are devices out there that one can target with C++ but not with Rust. Could you give a concrete example that you see as problematic?

I just noticed that the situation with ARM targets isn't looking great: most of them build, but almost none are properly tested. Weirdly, MIPS and PowerPC seem to be better supported. I guess Mozilla aren't too fussed.

@pfeatherstone
Author

the Rust compiler debugs your brain

I'm in the process of replacing as many raw pointers as possible in my code base with smart pointers, particularly std::unique_ptr. I have to say, the C++ compiler is debugging my brain now.
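
For example, the mechanical part of that change typically looks like this (a trivial sketch; Model is just a placeholder type):

```cpp
#include <memory>

struct Model { /* placeholder for some resource-owning type */ };

void before()
{
    Model* model = new Model(); // manual ownership: easy to leak on early return
    // ... use *model ...
    delete model;
}

void after()
{
    auto model = std::make_unique<Model>(); // freed automatically at scope exit
    // ... use *model ...
}
```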
