Assuming that you have `python3` (and `virtualenv`) installed, the fastest
way to set up a development environment (or a sample deployment)
is via virtualenv:

```shell
git clone https://github.com/dandi/dandi-cli \
    && cd dandi-cli \
    && virtualenv --system-site-packages --python=python3 venvs/dev3 \
    && source venvs/dev3/bin/activate \
    && pip install -e .[test]
```
Install the pre-commit dependency with `pip install pre-commit`, then, in the
source directory, run:

```shell
pre-commit install
```
You can run all tests locally with `tox` (installable via `pip install tox`):

```shell
tox -e py3
```
To check linting and typing of your changes, you can also run `tox` with the
`lint` and `typing` environments:

```shell
tox -e lint,typing
```
The dandi-archive repository provides a docker-compose recipe for
establishing a local instance of a fresh dandi-archive. See
DEVELOPMENT.md:Docker for the instructions. In a new instance, you would need
to generate a new API key to be used by the `dandi` client for upload etc.
Relevant `dandi` client commands (such as `upload`) are aware of such an
instance as `dandi-api-local-docker-tests`. See the note below on the
`DANDI_DEVEL` environment variable, which is needed in order to expose the
development command line options.
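The env-var gating described above can be sketched with a stdlib `argparse` parser. This is only an illustration of the pattern, not dandi's actual implementation, and the `--dandi-instance` option name here is an assumption made for the example:

```python
import argparse
import os


def build_parser():
    """Build a parser whose developer-only options are registered only
    when DANDI_DEVEL is set to a non-empty value (illustrative sketch)."""
    parser = argparse.ArgumentParser(prog="demo")
    if os.environ.get("DANDI_DEVEL"):
        # Hypothetical developer-only option, hidden from --help otherwise.
        parser.add_argument("--dandi-instance", default="dandi")
    return parser


os.environ.pop("DANDI_DEVEL", None)
print("--dandi-instance" in build_parser().format_help())  # option hidden
os.environ["DANDI_DEVEL"] = "1"
print("--dandi-instance" in build_parser().format_help())  # option exposed
```

Because the option is simply never registered when the variable is unset, it disappears from both `--help` output and argument parsing, which matches the "otherwise hidden" behavior described below.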
- `DANDI_DEVEL` -- enables otherwise hidden command line options, such as
  explicit specification of the dandi-api instance. All those options are
  otherwise hidden from the user-visible (`--help`) interface unless this env
  variable is set to a non-empty value.
- `DANDI_API_KEY` -- avoids using keyrings, thus making it possible to
  "temporarily" use another account etc. for the "API" version of the server.
- `DANDI_LOG_LEVEL` -- set the log level. By default `INFO`; should be an int
  (`10` - `DEBUG`).
- `DANDI_CACHE` -- controls persistent cache handling. Known values are
  `clear` (clear the cache) and `ignore` (ignore it). Note that for the
  metadata cache we use only the released portion of `dandi.__version__` as a
  token. If handling of metadata has changed while developing, set this env
  var to `clear` to have the cache `clear()`ed before use.
- `DANDI_INSTANCEHOST` -- defaults to `localhost`. Points to the host/IP
  which hosts a local instance of dandiarchive.
- `DANDI_TESTS_PERSIST_DOCKER_COMPOSE` -- when set, the tests will reuse the
  same Docker containers across test runs instead of creating & destroying a
  new set on each run. Set this environment variable to `0` to cause the
  containers to be destroyed at the end of the next run.
- `DANDI_TESTS_PULL_DOCKER_COMPOSE` -- when set to an empty string or `0`,
  the tests will not pull the latest needed Docker images at the start of a
  run if older versions of the images are already present.
- `DANDI_TESTS_NO_VCR` -- when set, the use of vcrpy to play back captured
  HTTP requests during testing will be disabled.
- `DANDI_DEVEL_INSTRUMENT_REQUESTS_SUPERLEN` -- when set, the `upload()`
  function will patch `requests` to log the results of calls to
  `requests.utils.super_len()`.
- `DANDI_DOWNLOAD_AGGRESSIVE_RETRY` -- when set, makes `download()` retry
  very aggressively: it will keep trying as long as at least some bytes are
  downloaded on each attempt. Typically not needed; needing it could be a
  sign of network issues.
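The numeric-level and cache-mode conventions above can be sketched as follows. These helper names are hypothetical, written only to illustrate how the `DANDI_LOG_LEVEL` and `DANDI_CACHE` values are interpreted, and are not dandi's actual code:

```python
import logging
import os


def resolve_log_level(default=logging.INFO):
    """Interpret DANDI_LOG_LEVEL as an int (10 == DEBUG), falling back
    to INFO when the variable is unset. Illustrative helper only."""
    value = os.environ.get("DANDI_LOG_LEVEL")
    if value is None:
        return default
    return int(value)


def resolve_cache_mode():
    """Validate DANDI_CACHE against its known values, 'clear' and 'ignore'.
    Returns None when the variable is unset."""
    value = os.environ.get("DANDI_CACHE", "").lower()
    if value in ("clear", "ignore", ""):
        return value or None
    raise ValueError(f"Unknown DANDI_CACHE value: {value!r}")


os.environ["DANDI_LOG_LEVEL"] = "10"
print(resolve_log_level() == logging.DEBUG)  # 10 maps to DEBUG
```

Passing the standard numeric levels (`10` for `DEBUG`, `20` for `INFO`, etc.) keeps the variable compatible with Python's `logging` module.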
The Sourcegraph browser extension can be used to view code coverage
information as follows:

1. Install the Sourcegraph browser extension in your browser (Chrome or
   Firefox only).
2. Sign up for a Sourcegraph account if you don't already have one. You must
   be signed in to Sourcegraph for the remaining steps.
3. Enable the Codecov Sourcegraph extension.
4. On GitHub, when viewing a dandi-cli source file (either on a branch or in
   a pull request diff), there will be a "Coverage: X%" button at the top of
   the source listing. Pressing this button toggles highlighting of source
   lines based on whether they are covered by tests.
New releases of dandi-cli are created via a GitHub Actions workflow built
around `auto`. Whenever a pull request with the "release" label is merged,
`auto` updates the changelog based on the pull requests since the last
release, commits the results, tags the new commit with the next version
number, and creates a GitHub release for the tag. This in turn triggers a job
that builds an sdist & wheel for the project and uploads them to PyPI.

The section that `auto` adds to the changelog on a new release consists of
the titles of all pull requests merged into master since the previous
release, organized by label. `auto` recognizes the following PR labels:
- `major` — for changes corresponding to an increase in the major version
  component
- `minor` — for changes corresponding to an increase in the minor version
  component
- `patch` — for changes corresponding to an increase in the patch/micro
  version component; this is the default label for unlabelled PRs
- `internal` — for changes only affecting the internal API
- `documentation` — for changes only affecting the documentation
- `tests` — for changes to tests
- `dependencies` — for updates to dependency versions
- `performance` — for performance improvements
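The version-bump semantics of the `major`/`minor`/`patch` labels can be sketched as a small function. This is only an illustration of the labeling scheme described above; `auto` itself computes the next version, and treating the remaining labels as patch-level is a simplifying assumption of this sketch:

```python
def next_version(current, label):
    """Compute the next release version from a PR label, mirroring the
    label semantics above (illustrative; auto does this for real)."""
    major, minor, patch = map(int, current.split("."))
    if label == "major":
        return f"{major + 1}.0.0"
    if label == "minor":
        return f"{major}.{minor + 1}.0"
    # "patch" is the default for unlabelled PRs; for simplicity this sketch
    # treats the remaining labels (internal, documentation, ...) the same way.
    return f"{major}.{minor}.{patch + 1}"


print(next_version("0.14.2", "minor"))  # 0.15.0
```

Note how a `minor` or `major` bump resets the lower-order components to zero, as in standard semantic versioning.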