info singularity repo build #96
Hi Kristina,

James said you were doing something along those lines. Sounds like it'll be useful :) I don't mind if you'd like to restructure things to make them easier to reuse in your pipeline. It's been a while since I've written anything new in Nextflow, but I know there's some new syntax for dealing with imports/workflows, and I think the new modules syntax would probably be good to adopt.

Regarding Docker etc.: the current setup is really all based on conda.

RE: "A possible solution would be to create an independent container"
RE: "we manage through individual containers"

I've found in the past that dealing with dozens of individual containers (e.g. in PanAnn/TE) was kind of a pain. Happy to talk more through any ideas you have.

All the best,

PS. Apologies for the essay :)
Hi Darcy,

Thank you for getting back to me! This is the way our tools are structured: a workflow-subworkflow-module organization that relies on a container download from each process module. I don't know if that was a feature of nf when you coded it, but it's quite handy since it's handled automatically by the pipeline. A process structured like this downloads and uses a container inside the pipeline:
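(For illustration, a minimal sketch of what such a process module might look like; the tool and the Biocontainers image tag here are hypothetical examples, not a literal module from our pipeline:)

```nextflow
// Sketch: the `container` directive tells Nextflow which image to use;
// when run with -profile docker or -profile singularity, the engine
// pulls the image automatically before the task executes.
// The image tag below is illustrative only.
process HMMER_HMMSCAN {
    container 'quay.io/biocontainers/hmmer:3.3.2--h87f3376_2'

    input:
    path proteins
    path hmm_db

    output:
    path "${proteins.baseName}.domtbl", emit: domtbl

    script:
    """
    hmmscan --domtblout ${proteins.baseName}.domtbl ${hmm_db} ${proteins}
    """
}
```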
You don't need to download it manually, and it works well in complex pipelines, without the requirement to download the steps individually. We also tried to use conda, but it was slow and painful. As for your suggestions, I think this one looks the most promising:
Would that work if I had the licensed software somewhere locally and pointed to its path? We already use SignalP in this way in one of our pipelines and it works OK. What do you think? If that won't work, I may need to do some coding on predector, which I could add to a separate branch here, though I am unsure how that would be updated/merged if there were new releases. The best option would be to use it as it is and call it from our pipeline, but I will only know whether that can work after trying the solution above.
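(For concreteness, the local-path idea could be exposed as a pipeline parameter; `signalp6_path` is a hypothetical name here, not an existing Predector option:)

```nextflow
// nextflow.config (sketch, assumed parameter name)
params {
    // Absolute path to a locally installed, licensed SignalP 6 directory.
    // If left null, the pipeline could skip the SignalP-dependent steps.
    signalp6_path = null
}
```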
Quick reply:

RE: "container download from each process module"
Yeah, this was always a feature.

RE: "Conda, but it was slow and painful"

RE: "Would that work if I had the licensed software somewhere locally and I pointed to their path?"
My thinking is that rather than installing inside the container, I just untar the package and modify it in a process, and then we can pass the folder to later processes. That way it can still run inside controlled containers but maybe reduce some initial friction. The only real issue is managing the dependencies, e.g. SignalP6 will still need some kind of environment to provide PyTorch etc. I think some of them don't work without running the installer, so it would need some workarounds to set things up properly. I often feel like writing signal peptide prediction tools myself just to spite the DTU and their frustrating license requirements.

RE: "I am unsure how that would be updated/merged if there were new releases."
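(A rough sketch of that untar-in-a-process idea; the process name and staging details are assumptions, not Predector code:)

```nextflow
// Sketch: unpack and patch the user-supplied licensed tarball once,
// then pass the resulting folder to downstream processes that run
// inside a generic container providing the runtime deps (PyTorch etc.).
process PREPARE_SIGNALP6 {
    input:
    path signalp_tarball

    output:
    path 'signalp6', emit: signalp_dir

    script:
    """
    mkdir signalp6
    tar -xzf ${signalp_tarball} --strip-components=1 -C signalp6
    # any required patching/setup of the package would happen here
    """
}
```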
Hi Darcy,
Thank you for maintaining Predector. We plan to incorporate it into our MycoProcessor pipeline at the CCDM and make it reusable as a subworkflow under a broader toolkit. Unfortunately, Predector as it stands is unsuitable for integration into an existing subworkflow, especially regarding the installation of software, which we manage through individual containers.
Also, the current Predector container creation uses Docker as a base and builds Singularity through that. A possible solution would be to create an independent container, excluding the proprietary software, and share it on DockerHub. We could then incorporate it into our pipeline more easily.
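(One way to read the "independent container" suggestion: a base image holding only the open-source dependencies, with the proprietary tools supplied separately at runtime. A minimal, hypothetical sketch; the package list is illustrative, not the actual Predector dependency set:)

```dockerfile
# Hypothetical base image: open-source dependencies only, no DTU software.
FROM condaforge/mambaforge:latest

# Illustrative package list -- not the real Predector environment spec.
RUN mamba install -y -c conda-forge -c bioconda \
        hmmer pfam_scan && \
    mamba clean -afy
```

Singularity users could then pull such an image straight from DockerHub with `singularity pull docker://<repo>/<image>`, with no local Docker daemon required.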
But if you have a suggestion or another idea, I would be happy to hear it!
Cheers
Kristina