scripts: ci: Detect API-breaking changes in the PRs
This script checks a PR's changes to header files and DTS
bindings that may affect compatibility of the public API. This
reduces the risk of accidentally breaking the API. A workflow that
runs on each PR adds a comment with an analysis summary.

Signed-off-by: Dominik Kilian <[email protected]>
doki-nordic committed May 15, 2024
1 parent cd66e53 commit 27ca4f0
Showing 22 changed files with 2,288 additions and 0 deletions.
129 changes: 129 additions & 0 deletions .github/workflows/api-check.yml
@@ -0,0 +1,129 @@
name: API Check

on:
  pull_request:
    branches:
      - main
  workflow_dispatch:
    inputs:
      new_commit:
        type: string
        required: true
        description: New Commit
      old_commit:
        type: string
        required: true
        description: Old Commit

jobs:
  build:
    runs-on: ubuntu-latest
    concurrency:
      group: ${{ github.workflow }}-${{ github.ref }}
      cancel-in-progress: true
    steps:
      - name: Checkout sources
        uses: nordicbuilder/action-checkout-west-update@main
        with:
          git-fetch-depth: 0
          west-update-args: ''

      - name: cache-pip
        uses: actions/cache@v3
        with:
          path: ~/.cache/pip
          key: ${{ runner.os }}-doc-pip

      - name: Git rebase
        if: github.event_name == 'pull_request'
        env:
          BASE_REF: ${{ github.base_ref }}
        working-directory: ncs/nrf
        run: |
          git remote -v
          git branch
          git rebase origin/${BASE_REF}
          # debug
          git log --pretty=oneline -n 5

      - name: Install packages
        run: |
          sudo apt update
          sudo apt-get install -y ninja-build mscgen plantuml
          sudo snap install yq
          DOXYGEN_VERSION=$(yq ".doxygen.version" ./ncs/nrf/scripts/tools-versions-linux.yml)
          wget --no-verbose "https://github.com/doxygen/doxygen/releases/download/Release_${DOXYGEN_VERSION//./_}/doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz"
          tar xf doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz
          echo "${PWD}/doxygen-${DOXYGEN_VERSION}/bin" >> $GITHUB_PATH
          cp -r ncs/nrf/scripts/ci/api_check .

      - name: Install Python dependencies
        working-directory: ncs
        run: |
          sudo pip3 install -U setuptools wheel pip
          pip3 install -r nrf/doc/requirements.txt
          pip3 install -r ../api_check/requirements.txt

      - name: West zephyr-export
        working-directory: ncs
        run: |
          west zephyr-export

      - name: Checkout new commit and west update
        if: github.event_name == 'workflow_dispatch'
        working-directory: ncs/nrf
        run: |
          git checkout ${{ github.event.inputs.new_commit }}
          west update

      - name: Collect data from new commit
        working-directory: ncs/nrf
        run: |
          source ../zephyr/zephyr-env.sh
          echo =========== NEW COMMIT ===========
          git log -n 1
          cmake -GNinja -Bdoc/_build -Sdoc
          python3 ../../api_check/utils/interrupt_on.py "syncing doxygen output" ninja -C doc/_build nrf
          python3 ../../api_check/headers doc/_build/nrf/doxygen/xml --save-input ../../headers-new.pkl
          python3 ../../api_check/dts -n - --save-input ../../dts-new.pkl
          rm -Rf doc/_build

      - name: Checkout old commit and west update
        working-directory: ncs/nrf
        run: |
          # Exactly one of these expands to a non-empty ref: old_commit on
          # workflow_dispatch, base_ref on pull_request.
          git checkout ${{ github.event.inputs.old_commit }}${{ github.base_ref }}
          cd ..
          west update

      - name: Collect data from old commit
        working-directory: ncs/nrf
        run: |
          source ../zephyr/zephyr-env.sh
          echo =========== OLD COMMIT ===========
          git log -n 1
          cmake -GNinja -Bdoc/_build -Sdoc
          python3 ../../api_check/utils/interrupt_on.py "syncing doxygen output" ninja -C doc/_build nrf
          python3 ../../api_check/headers doc/_build/nrf/doxygen/xml --save-input ../../headers-old.pkl
          python3 ../../api_check/dts -n - --save-input ../../dts-old.pkl

      - name: Check
        working-directory: ncs/nrf
        run: |
          python3 ../../api_check/headers --format github --resolve-paths . --relative-to . --save-stats ../../headers-stats.json ../../headers-new.pkl ../../headers-old.pkl || true
          python3 ../../api_check/dts --format github --relative-to . --save-stats ../../dts-stats.json -n ../../dts-new.pkl -o ../../dts-old.pkl || true
          echo Headers stats
          cat ../../headers-stats.json || true
          echo DTS stats
          cat ../../dts-stats.json || true

      - name: Update PR
        if: github.event_name == 'pull_request'
        working-directory: ncs/nrf
        env:
          PR_NUMBER: ${{ github.event.number }}
          GITHUB_ACTOR: ${{ github.actor }}
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GITHUB_REPO: ${{ github.repository }}
          GITHUB_RUN_ID: ${{ github.run_id }}
        run: |
          python3 ../../api_check/pr ../../headers-stats.json ../../dts-stats.json
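
Besides the pull_request trigger, the workflow_dispatch trigger above can be fired against two arbitrary commits through the GitHub REST API. A minimal sketch, assuming the repository slug nrfconnect/sdk-nrf and a token with workflow-dispatch permission in GITHUB_TOKEN; the SHAs below are this commit and its parent:

# Minimal sketch (assumptions noted above): trigger the "API Check"
# workflow for two commits via the documented workflow_dispatch endpoint.
import os
import requests

resp = requests.post(
    'https://api.github.com/repos/nrfconnect/sdk-nrf/actions/workflows/api-check.yml/dispatches',
    headers={
        'Authorization': f'Bearer {os.environ["GITHUB_TOKEN"]}',
        'Accept': 'application/vnd.github+json',
    },
    json={'ref': 'main', 'inputs': {'new_commit': '27ca4f0', 'old_commit': 'cd66e53'}},
    timeout=30,
)
resp.raise_for_status()  # GitHub returns 204 No Content on success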
2 changes: 2 additions & 0 deletions scripts/ci/api_check/dts/__main__.py
@@ -0,0 +1,2 @@
from main import main

main()
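
This __main__.py is what lets the checker run as a plain directory, which is how the workflow above invokes it (python3 ../../api_check/dts ...). An equivalent explicit form, for illustration only; the path is an assumption about the checkout layout:

# Illustrative sketch: running a directory that contains __main__.py is
# what `python3 scripts/ci/api_check/dts ...` does under the hood.
import runpy

runpy.run_path('scripts/ci/api_check/dts', run_name='__main__')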
66 changes: 66 additions & 0 deletions scripts/ci/api_check/dts/args.py
@@ -0,0 +1,66 @@
# Copyright (c) 2024 Nordic Semiconductor ASA
#
# SPDX-License-Identifier: LicenseRef-Nordic-5-Clause

import sys
import argparse
from pathlib import Path


class ArgsClass:
    new: 'list[list[str]]'
    old: 'list[list[str]]|None'
    format: str
    relative_to: 'Path | None'
    save_stats: 'Path | None'
    save_input: 'Path | None'
    save_old_input: 'Path | None'
    dump_json: 'Path | None'


def parse_args() -> ArgsClass:
    parser = argparse.ArgumentParser(add_help=False, allow_abbrev=False,

Check failure on line 22 in scripts/ci/api_check/dts/args.py

View workflow job for this annotation

GitHub Actions / Run compliance checks on patch series (PR)

E9901

scripts/ci/api_check/dts/args.py:22 Argument parser with abbreviations is disallowed (argument-parser-with-abbreviations)
                                     description='Detect DTS binding changes.')
    parser.add_argument('-n', '--new', nargs='+', action='append', required=True,
                        help='List of directories to search for the new DTS bindings. ' +
                             'Pass "-" to use the "ZEPHYR_BASE" environment variable and ' +
                             'search the default directories.')
    parser.add_argument('-o', '--old', nargs='+', action='append',
                        help='List of directories to search for the old DTS bindings. ' +
                             'Pass "-" to use the "ZEPHYR_BASE" environment variable and ' +
                             'search the default directories. Skip this option if you only ' +
                             'want to pre-parse the input with "--save-input".')
    parser.add_argument('--format', choices=('text', 'github'), default='text',
                        help='Output format. Default is "text".')
    parser.add_argument('--relative-to', type=Path,
                        help='Show relative paths in messages.')
    parser.add_argument('--save-stats', type=Path,
                        help='Save statistics to a JSON file.')
    parser.add_argument('--save-input', metavar='FILE', type=Path,
                        help='Pre-parse and save the new input to a file. The file format may ' +
                             'change from version to version, so always use the same version ' +
                             'of this tool for one file.')
    parser.add_argument('--save-old-input', metavar='FILE', type=Path,
                        help='Pre-parse and save the old input to a file.')
    parser.add_argument('--dump-json', metavar='FILE', type=Path,
                        help='Dump input data to a JSON file (only for debug purposes).')
    parser.add_argument('--help', action='help',
                        help='Show this help and exit.')
    args: ArgsClass = parser.parse_args()

    if (args.old is None) and (args.save_input is None):
        parser.print_usage()
        print('error: at least one of the following arguments is required: --old, --save-input',
              file=sys.stderr)
        sys.exit(2)

    args.relative_to = args.relative_to.absolute() if args.relative_to else None

    return args


args: ArgsClass = parse_args()


if __name__ == '__main__':
    import json
    print(json.dumps(args.__dict__, indent=4, default=str))
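
The --save-input/--old split enables the two-phase flow the workflow uses: pre-parse each revision into a pickle, then compare the pickles. A sketch of that flow driven from Python, mirroring the workflow's "Collect data" and "Check" steps; paths are illustrative:

# Sketch of the two-phase flow (illustrative paths; ZEPHYR_BASE must be
# set for "-n -", as the workflow does by sourcing zephyr-env.sh).
import subprocess

# Phase 1: pre-parse the bindings of one revision into a pickle.
subprocess.run(['python3', 'scripts/ci/api_check/dts',
                '-n', '-', '--save-input', 'dts-new.pkl'], check=True)

# Phase 2: diff two cached snapshots and emit GitHub-style annotations.
subprocess.run(['python3', 'scripts/ci/api_check/dts',
                '--format', 'github', '--save-stats', 'dts-stats.json',
                '-n', 'dts-new.pkl', '-o', 'dts-old.pkl'], check=True)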

110 changes: 110 additions & 0 deletions scripts/ci/api_check/dts/bindings_parser.py
@@ -0,0 +1,110 @@
# Copyright (c) 2024 Nordic Semiconductor ASA
#
# SPDX-License-Identifier: LicenseRef-Nordic-5-Clause

import sys
import pickle
from pathlib import Path
from utils import devicetree_sources, warning

if devicetree_sources:
    sys.path.insert(0, devicetree_sources)

from devicetree import edtlib


class ParseResult:
    bindings: 'list[Binding]'
    binding_by_name: 'dict[str, Binding]'

    def __init__(self):
        self.bindings = []
        self.binding_by_name = {}


class Property:
    name: str
    type: str
    description: str
    enum: 'set[str]'
    const: 'str | None'
    default: 'str | None'
    deprecated: bool
    required: bool
    specifier_space: str

    def __init__(self, prop: edtlib.PropertySpec):
        self.name = prop.name
        self.type = prop.type or ''
        self.description = prop.description or ''
        self.enum = {str(x) for x in (prop.enum or [])}
        self.const = str(prop.const) if prop.const else None
        self.default = str(prop.default) if prop.default else None
        self.deprecated = prop.deprecated or False
        self.required = prop.required or False
        self.specifier_space = str(prop.specifier_space or '')


class Binding:
    path: str
    name: str
    description: str
    cells: str
    buses: str
    properties: 'dict[str, Property]'

    def __init__(self, binding: edtlib.Binding, file: Path):
        self.path = str(file)
        self.name = binding.compatible or self.path
        if binding.on_bus is not None:
            self.name += '@' + binding.on_bus
        self.description = binding.description or ''
        cells_array = [
            f'{name}={";".join(value)}' for name, value in (binding.specifier2cells or {}).items()
        ]
        cells_array.sort()
        self.cells = '&'.join(cells_array)
        busses_array = list(binding.buses or [])
        busses_array.sort()
        self.buses = ';'.join(busses_array)
        self.properties = {}
        for key, value in (binding.prop2specs or {}).items():
            prop = Property(value)
            self.properties[key] = prop


def get_binding_files(bindings_dirs: 'list[Path]') -> 'list[Path]':
    binding_files = []
    for bindings_dir in bindings_dirs:
        if not bindings_dir.is_dir():
            raise FileNotFoundError(f'Bindings directory "{bindings_dir}" not found.')
        for file in bindings_dir.glob('**/*.yaml'):
            binding_files.append(file)
        for file in bindings_dir.glob('**/*.yml'):
            binding_files.append(file)
    return binding_files


def parse_bindings(dirs_or_pickle: 'list[Path]|Path') -> ParseResult:
    result = ParseResult()
    if isinstance(dirs_or_pickle, list):
        yaml_files = get_binding_files(dirs_or_pickle)
        fname2path: 'dict[str, str]' = {
            path.name: str(path) for path in yaml_files
        }
        for binding_file in yaml_files:
            try:
                binding = Binding(edtlib.Binding(str(binding_file), fname2path, None, False, False), binding_file)
                if binding.name in result.binding_by_name:
                    warning(f'Repeating binding {binding.name}: {binding.path} '
                            f'{result.binding_by_name[binding.name].path}')
                result.bindings.append(binding)
                result.binding_by_name[binding.name] = binding
            except edtlib.EDTError as err:
                warning(err)
    else:
        with open(dirs_or_pickle, 'rb') as fd:
            result = pickle.load(fd)
    return result


def save_bindings(parse_result: ParseResult, file: Path):
    with open(file, 'wb') as fd:
        pickle.dump(parse_result, fd)
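
Together, parse_bindings and save_bindings provide the snapshot-and-compare primitives the checker builds on. A hypothetical usage sketch, assuming it runs alongside this module so the imports resolve; the paths and the name-set diff are illustrative, not the commit's actual comparison logic:

# Hypothetical usage: snapshot the current bindings, reload an older
# snapshot, and report compatibles that disappeared.
from pathlib import Path

new = parse_bindings([Path('dts/bindings')])   # parse a directory tree
save_bindings(new, Path('dts-new.pkl'))        # cache for later comparison
old = parse_bindings(Path('dts-old.pkl'))      # load a cached snapshot

for name in sorted(set(old.binding_by_name) - set(new.binding_by_name)):
    print(f'Binding removed: {name}')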
