Wheels for linux #138

NumPy, SciPy, etc. now provide wheels for Linux using manylinux, which makes installing them much easier. It would be nice if dlib did so too, as the compilation does take quite a while.

manylinux: https://github.com/pypa/manylinux
Example project using manylinux: https://github.com/pypa/python-manylinux-demo |
Cool. You setting this up? :) |
From my understanding of this, there isn't really anything to set up per se. So, it's an extra step before making a release. Note: again, I've never really tried it myself. This is from looking at other projects. |
I'm not seeing these instructions. Or any instructions really.
|
The instructions I've gotten are from here: https://github.com/pypa/python-manylinux-demo/blob/master/.travis.yml So, (untested) commands would be like:

$ docker pull quay.io/pypa/manylinux1_x86_64
# mount the current directory as /io so the container can see the package;
# the /io paths below rely on this
$ docker run -it -v $(pwd):/io quay.io/pypa/manylinux1_x86_64 /bin/bash
# Now... inside the docker image:
# Install dependencies of dlib into the docker here
$ pip install numpy
$ yum install cmake boost-devel
# Compile wheels for various python versions
$ for PYBIN in /opt/python/*/bin; do
${PYBIN}/pip install -r /io/dev-requirements.txt
${PYBIN}/pip wheel <path_to_package> -w wheelhouse/
done
# Bundle external shared libraries into the wheels (if any?)
$ for whl in wheelhouse/*.whl; do
auditwheel repair "$whl" -w /io/wheelhouse/
done
# Now the wheels should be present in /io/wheelhouse. Install and test them.

Here's the build-wheels.sh found in the example project: https://github.com/pypa/python-manylinux-demo/blob/master/travis/build-wheels.sh

PS: I've also made an issue in manylinux to give these "raw" instructions, to make it easier for users that do not want Travis: pypa/manylinux#73 |
Cool.

I'm not super motivated to worry about this right now though. If you want to figure out how to automate this whole thing with some convenient script I'll do it for the upcoming release. But if not I'm not going to do this right now.
|
@davisking making an automated script is a good idea. I'm a little busy this week though. A few questions: |
I'm training a convnet to classify imagenet images and I'm basically just waiting for the training to complete. It takes over a week, but it's been running a while and should finish in a few days. That's the only thing holding up the next release. So the next release will probably be next weekend.

I do most stuff on Ubuntu.
|
I can help set up wheel building - but I don't know how to run the tests on the Python code - can you give me any pointers? |
@matthew-brett Awesome to see you here :) And thanks for looking into this! I can't seem to find tests for the Python bindings either. @davisking are there any o.O? |
Yeah, there aren't any python tests :/

dtest tests the C++ library. But the python bindings I just test by running the examples, which is less than ideal.
|
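In the meantime, a smoke test in that spirit can be as simple as running one of the bundled examples against a sample image (paths below follow dlib's source layout; verify them in your checkout):

# from the top of the dlib source tree: crude smoke test in lieu of a
# real python test suite
$ python python_examples/face_detector.py examples/faces/2007_007763.jpg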
There's a draft of the manylinux build here: https://github.com/MacPython/dlib-wheels

These are building here: https://travis-ci.org/MacPython/dlib-wheels/jobs/141487558

But the problem is that I don't know anything about boost, and so I'm doing a huge brute force install of the boost libraries: https://github.com/MacPython/dlib-wheels/blob/master/config.sh#L29

That big build is proving very slow indeed. Can you give any hints on what I need to install from boost in order for the dlib Python wheel to compile? |
You only need the boost-python part of boost, which is quick to compile. Everything else is unnecessary. There are instructions for compiling boost-python in the cmake file that builds dlib's module.
|
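For reference, a minimal sketch of building just Boost.Python from an unpacked boost source tree (version and prefix are placeholders; this assumes the python on PATH is the one you want to bind against):

$ cd boost_1_61_0                        # whatever boost version you unpacked
# restrict the build to the python library; everything else is skipped
$ ./bootstrap.sh --with-libraries=python
$ ./b2 install --prefix=/usr/local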
Am I right that dlib requires boost-python version exactly equal to 1.41.0? https://github.com/davisking/dlib/blob/master/dlib/cmake_utils/add_python_module#L65 EDIT - no - this is a minimum dependency - see below. |
Sorry - scratch that - I see that 1.41.0 is a specification of a minimum dependency.

Watching the installs I can get working, I see that you have an optional dependency on a cblas implementation such as OpenBLAS. There's some code with "numpy" but it looks like you don't use the standard numpy headers you'd normally get with numpy's get_include(). I can build the dlib code without error without numpy installed.
I assume that the default install doesn't use any non-generic CPU features such as AVX? I'm asking because the wheel should not depend on the CPU features of the host that it was built on. |
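A sketch of pinning that down at configure time, assuming dlib's cmake options USE_SSE4_INSTRUCTIONS and USE_AVX_INSTRUCTIONS (check the option names in your dlib version) and an out-of-source build of tools/python:

$ mkdir build && cd build
# turn off CPU-specific instruction sets so the binary runs on any x86-64
# host; SSE2 is the x86-64 baseline and stays on
$ cmake ../tools/python -DUSE_SSE4_INSTRUCTIONS=OFF -DUSE_AVX_INSTRUCTIONS=OFF
$ cmake --build . --config Release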
Boost error finding the Python include path for Python 3:

Obscure error with no informative error message on Python 2.7:
|
Sadly CMake is an ugly beast for building Python extensions. |
@matthew-brett The boost building issue with py34 was something I had gotten a while back too: travis-ci/travis-ci#6099

Is it possible to use a conda environment? That simplified things quite a bit for me in Travis, and that was what I had done finally (and am still doing - https://github.com/AbdealiJK/file-metadata/blob/master/.travis.yml#L49) |
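A rough sketch of that conda-based approach (the miniconda URL and package set follow common Travis setups of the time and are assumptions, not this repo's config):

$ wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh -O miniconda.sh
$ bash miniconda.sh -b -p $HOME/miniconda
$ export PATH="$HOME/miniconda/bin:$PATH"
# prebuilt python + boost, avoiding the slow boost-from-source step
$ conda create -y -n dlib-build python=3.5 boost
$ source activate dlib-build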
@matthew-brett Also, about the dlib issue - it seems the next thing is to find the Boost.Python library. My output is:
|
I didn't do anything non-portable to interface with numpy as far as I'm aware. It should work with any version. The python bindings will use SSE4 by default, but not AVX.

There is a lot of variation in how boost.python is installed. It's a frequent user complaint that they get errors from mixing boost.python versions with the wrong version of python, or it's installed in some weird place. I don't really think this is a CMake problem. How else would we compile these modules? You need some reliable way to find the version of boost.python and python libs that match the version of python the user is going to be running. If that could be solved then the setup.py could pass that information to cmake and it should be fine.

For a point of comparison, I made MITIE and it uses CMake to build python extensions and no one ever has any problems with it. But it doesn't use boost.python. Instead I wrote a C API which uses ctypes from python. So there is no dependency on python libs or boost or anything when compiling the shared object. However, writing a C API and binding it through ctypes is a huge pain in the ass. It's only reasonable to do with a small API.

Is there some way to find boost.python and python libs in setup.py and verify that they are compiled together correctly? I think that's the heart of the issue. |
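A crude way to check whether an installed boost.python matches the python you intend to run, which is the mismatch described above (the library path is an illustrative assumption):

# ask the target interpreter where its headers live
$ python -c "import sysconfig; print(sysconfig.get_paths()['include'])"
# see which libpython (if any) the installed boost_python was linked
# against; a mismatch here is the classic source of the errors above
$ ldd /usr/lib/x86_64-linux-gnu/libboost_python.so | grep -i python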
The maximum guaranteed on a 64-bit install is SSE2, so, for a binary wheel on OSX or Linux, I guess that should be the setting - otherwise the wheels will segfault on older systems.

Sorry - yes - I should have said only that cmake and Python distutils are two very different systems. I think that's why the bug with finding the Python 3 include files has not been fixed for a while in the boost-python cmake configuration. It's not too hard to work round that bug, I guess just adding the Python include files manually to the CFLAGS.

I'm afraid I'm really inexperienced with cmake, and very experienced (sadly) with distutils, so I wasn't sure how to debug the failure to find or use the boost-python installation. I suppose I'd have to somehow find the implementation of find_package for boost-python in cmake, but I wasn't sure where to look. |
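For what it's worth, a sketch of the usual cmake-side workaround rather than touching CFLAGS; PYTHON_INCLUDE_DIR is the cache variable read by cmake's stock FindPythonLibs, though whether dlib's scripts honor it is an assumption to verify:

$ PY_INC=$(python3 -c "import sysconfig; print(sysconfig.get_paths()['include'])")
# hand cmake the include directory of the exact interpreter we target
$ cmake ../tools/python -DPYTHON_INCLUDE_DIR="$PY_INC"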
The main cmake scripts for the python bindings are in this file: https://github.com/davisking/dlib/blob/master/dlib/cmake_utils/add_python_module

In general, in cmake there are a few ways to help it find files. There are a few functions like find_path, find_library, find_program and so forth. Generally you can also just have lists of places to look like "python often installed here".

Don't mess with CFLAGS though. If you look at add_python_module you will get the idea. Also, I moved the file a few commits ago; in dlib v19.0 it's right in the dlib folder.
|
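As a concrete illustration of those hint mechanisms, a sketch using the standard FindBoost hint variables (the /opt/boost path and the tools/python source directory are placeholders):

# BOOST_ROOT points find_package(Boost) at a specific install;
# Boost_NO_SYSTEM_PATHS stops it from silently picking up a system copy
$ cmake ../tools/python \
    -DBOOST_ROOT=/opt/boost \
    -DBoost_NO_SYSTEM_PATHS=ON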
I added both of you as collaborators on the repo, in case you'd be interested in fixing up the builds somehow. |
@matthew-brett do you have an update on this? Otherwise I'll take a stab at it. |
No, sorry, I got lost in the wilds of cmake / distutils. Please do go ahead and have a stab - let me know if you think I can help. I know a lot about distutils and the manylinux / OSX wheel building machinery. |
So I got cmake 2.8.12 and boost 1.64.0 installed from source on a manylinux docker image. However, now I'm quite stuck. @matthew-brett any thoughts? For reference:
|
The problem is that the manylinux docker images deliberately do not have the libpython libraries, because the default install of Python on Debian / Ubuntu does not have these libraries either. However, the default install does have all the necessary symbols linked, just not via the libpython library. So, to fix this, I'm afraid you somehow have to tell the building procedure not to look for libpython but assume the relevant symbols are present at runtime. |
Hmm, so the linking is done in dlib/dlib/cmake_utils/add_python_module (line 129 at commit 664ef39). I have no clue how to tell that part not to look for libpython. @davisking do you have an idea? |
No idea. I'm pretty sure you need libpython.
|
You definitely do not need libpython at run time, because the Python binary has all the symbols you need. So the error here is the assumption by the build process that libpython is necessary at build time. I know that VTK does allow undefined Python symbols at build time - here's a suggestive reference: https://gitlab.kitware.com/vtk/vtk/commit/50d088ab9cbd4bc0e0215cbe6bdfdea9a392ca4b |
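To make that concrete: on Linux the linker allows unresolved symbols in shared objects by default, so the trick is simply to omit libpython from the link line. A generic sketch (example.cpp is a hypothetical extension source, not part of dlib):

$ g++ -shared -fPIC example.cpp -o example.so \
      -I"$(python3 -c 'import sysconfig; print(sysconfig.get_paths()["include"])')"
# note: no -lpythonX.Y above; the missing symbols resolve from the
# interpreter binary when the module is imported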
Huh, well, modify the cmake script to not link to it and see if it works out. |
Are dlib wheels for Linux available yet? |
No. Someone should set this up :) |
Although I should also mention that part of the reason I haven't done this, aside from being busy, is because the current setup will compile dlib appropriately for your system. Like do you have a GPU or AVX instructions? It will detect those and use them if available. That has to be done at compile time. A precompiled binary would have to disable all that stuff and so be slower for many users. |
BTW, tensorflow solves this problem by providing multiple wheels: tensorflow with cpu-only support and tensorflow-gpu with gpu support. The user has to know there are multiple variants and decide between them, though. |
I'm installing the face_recognition library, which depends on dlib, on a resource-constrained machine, and it is taking a whole day just to build dlib. |
Yes, that would be fine. Someone should set that up. A whole day is kinda ridiculous. Where did it spend most of the time? On the very last linking step? |
python_dlib -> compiling image.cpp and image2.cpp. The instance has only 512 MB and it is hitting swap. Maybe I should try allocating more memory temporarily and just build it for now 😄 |
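For anyone hitting the same wall, a quick sketch of adding temporary swap on a small Linux instance (size and path are illustrative):

# give the compiler headroom on a 512 MB machine with a temporary swap file
$ sudo fallocate -l 2G /swapfile
$ sudo chmod 600 /swapfile
$ sudo mkswap /swapfile
$ sudo swapon /swapfile
# ... build dlib, then clean up ...
$ sudo swapoff /swapfile && sudo rm /swapfile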
Ok. Well, you do need enough RAM. No getting around that. You should try splitting those files into 4 files. That would reduce the amount of RAM required per file.

You can also uncomment the set(PYBIND11_LTO_CXX_FLAGS "") in dlib/tools/python/CMakeLists.txt and it will turn off link time optimization, which makes the build a lot faster and doesn't seem to materially impact runtime speed. I will likely just disable LTO in the future because of how long it makes the compile process.

Anyway, if you want to try those things out and submit a PR if it fixes the compile times, that would be sweet :) |
Warning: this issue has been inactive for 61 days and will be automatically closed on 2018-09-07 if there is no further activity. If you are waiting for a response but haven't received one it's likely your question is somehow inappropriate. E.g. you didn't follow the issue submission instructions, or your question is easily answerable by reading the FAQ, dlib's documentation, or a Google search. |
Notice: this issue has been closed because it has been inactive for 65 days. You may reopen this issue if it has been closed in error. |
Is there any update on this? |