dirsql-0.1.0.tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- dirsql-0.1.0/.claude/CLAUDE.md +119 -0
- dirsql-0.1.0/.github/workflows/minor-release.yml +14 -0
- dirsql-0.1.0/.github/workflows/patch-release.yml +45 -0
- dirsql-0.1.0/.github/workflows/pr-monitor.yml +16 -0
- dirsql-0.1.0/.github/workflows/publish.yml +300 -0
- dirsql-0.1.0/.github/workflows/python-lint.yml +35 -0
- dirsql-0.1.0/.github/workflows/python-test.yml +45 -0
- dirsql-0.1.0/.github/workflows/rust-test.yml +41 -0
- dirsql-0.1.0/.gitignore +8 -0
- dirsql-0.1.0/Cargo.lock +922 -0
- dirsql-0.1.0/Cargo.toml +27 -0
- dirsql-0.1.0/LICENSE +21 -0
- dirsql-0.1.0/PKG-INFO +15 -0
- dirsql-0.1.0/SUMMARY.md +62 -0
- dirsql-0.1.0/index.js +5 -0
- dirsql-0.1.0/justfile +53 -0
- dirsql-0.1.0/package.json +8 -0
- dirsql-0.1.0/pyproject.toml +27 -0
- dirsql-0.1.0/src/db.rs +312 -0
- dirsql-0.1.0/src/differ.rs +372 -0
- dirsql-0.1.0/src/lib.rs +15 -0
- dirsql-0.1.0/src/matcher.rs +116 -0
- dirsql-0.1.0/src/scanner.rs +100 -0
- dirsql-0.1.0/src/watcher.rs +227 -0
--- /dev/null
+++ dirsql-0.1.0/.claude/CLAUDE.md
@@ -0,0 +1,119 @@
+# dirsql Development
+
+## Scratch Files
+
+Write scratch/temporary files to `/tmp` instead of asking permission. Use unique filenames to avoid collisions with other sessions.
+
+## Workflow
+
+- Work in git worktrees under `.worktrees/` folder
+- **NEVER commit directly to main** - always create a PR
+- One PR per bead. Beads should be concise and small -- as small as possible while still being useful
+- Use `bd` (Beads) for task tracking: `bd list`, `bd show <id>`, `bd ready`
+
+### Git Worktrees
+
+**ALL work happens in git worktrees.** Never edit files in the root repo directory. Never commit outside a worktree.
+
+#### Creating a Worktree
+
+```bash
+git worktree add .worktrees/my-feature -b feat/my-feature
+cd .worktrees/my-feature
+```
+
+#### Removing a Worktree
+
+**DANGER: removing a worktree while your shell CWD is inside it permanently breaks the shell.** The ONLY safe procedure:
+
+```bash
+# Step 1: Move CWD to the root repo FIRST (not optional)
+cd /home/duncan/work/code/projects/dirsql
+
+# Step 2: Now remove the worktree
+git worktree remove .worktrees/my-feature
+```
+
+**Do NOT skip step 1. Do NOT substitute `git -C` for `cd`.**
+
+### Beads Workflow
+
+**Lifecycle:**
+1. **Claim it FIRST**: `bd update <id> --claim` before any work
+2. **Create worktree and branch**
+3. **Link the PR**: `bd update <id> --external-ref "gh-<pr-number>"` after creating the PR
+4. **Close**: `bd close <id>` immediately after the PR is merged
+
+### Subagent Workflow
+
+New work on beads should be done via subagents in isolated worktrees. Each subagent:
+1. Creates a worktree and branch for its bead
+2. Does the implementation work (red/green TDD)
+3. Pushes the branch and opens a PR
+4. Monitors the PR and proactively resolves:
+   - CI failures
+   - GPG signing complaints
+   - Merge conflicts
+5. Continues monitoring until the PR is in a mergeable state
+
+### Orchestrator Responsibilities
+
+The orchestrator (main Claude session) must proactively:
+1. **Monitor all open PRs** -- don't wait for the user to report failures. Check CI status after agent completion and on an ongoing basis.
+2. **Fix CI failures** on open PRs immediately, either directly or by dispatching a fix agent.
+3. **Handle post-merge cleanup** as soon as a PR merges (pull main, remove worktree, delete branch, close bead).
+4. **Keep the user informed** of PR status without being asked.
+5. **Use foreground monitoring** when waiting on CI and there's no other work to do. Background monitoring causes the conversation to go silent -- use it only when there's genuinely parallel work to perform.
+
+### Post-Merge Cleanup
+
+After a PR merges, the agent (or orchestrator) must:
+1. Pull main in the **root repo**: `git -C /home/duncan/work/code/projects/dirsql pull origin main`
+2. **Move CWD to root repo first** (CRITICAL -- never remove a worktree from inside it): `cd /home/duncan/work/code/projects/dirsql`
+3. Remove the worktree: `git worktree remove .worktrees/<name>`
+4. Delete the local branch: `git branch -d <branch-name>`
+5. **Verify the bead is addressed** by the merged PR, then close it: `bd close <id>`
+
+## Testing
+
+### Red/Green Development
+
+Follow **red/green** (test-first) methodology:
+
+1. **Write the test first** -- it must capture the desired behavior
+2. **Run it and confirm it fails (RED)** -- do NOT proceed until the test turns red reliably. A test that passes before implementation proves nothing.
+3. **Make the minimal change to pass (GREEN)** -- only then write the implementation
+4. Refactor if needed, keeping tests green
+
+### TDD Order: Outside-In
+
+Tests are written **before** implementation, starting from the outermost layer:
+
+1. **Integration test first** -- proves the feature works from the consumer's perspective
+2. **Unit tests** -- written as you implement each module
+
+A feature is not done until integration tests pass and cover the new functionality.
+
+### When to Write What
+
+**Does the commit change the public-facing API?**
+- Yes -> **integration test required**, plus unit tests as you go
+- No -> Check if adequate integration coverage already exists:
+  - Adequate -> unit tests only
+  - Gaps -> add the missing integration tests, plus unit tests
+
+**Always write unit tests.** The question is whether you also need integration tests.
+
+### Test Locations
+
+- **Unit tests**: Colocated with source
+  - Python: `foo.py` -> `foo_test.py` in same directory
+  - Rust: inline `#[cfg(test)]` module at bottom of each source file
+- **Integration tests**: `tests/integration/` -- test the Python SDK layer, mock third-party deps (SQLite, LLM calls). Heavy use of pytest fixtures. Run in CI.
+- **E2E tests**: `tests/e2e/` -- real filesystem, real SQLite, real LLM calls, no mocks. Heavy use of pytest fixtures. **NOT run in CI** (eventual LLM calls make them non-free). Run locally by Claude after significant code changes.
+
+### E2E Test Policy
+
+E2E tests are your primary feedback mechanism. Run them liberally after significant changes -- they catch issues that integration tests miss because integration tests mock out SQLite and (eventually) LLM calls. But do NOT add them to CI workflows. They are a local development tool.
+
+See skillet or karat for examples of test organization, fixtures, and pytest-describe patterns.
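The Post-Merge Cleanup sequence in the CLAUDE.md hunk above is order-sensitive, so it is worth seeing as one piece. A minimal dry-run sketch (the repo path comes from the document; the worktree name, branch, and bead id arguments are illustrative placeholders) that prints the commands in the mandated order rather than executing them:

```shell
# Dry-run: print the post-merge cleanup commands in the required order.
# Arguments: root repo path, worktree name, branch name, bead id (all illustrative).
post_merge_cleanup() {
  local root=$1 name=$2 branch=$3 bead=$4
  printf '%s\n' \
    "git -C $root pull origin main" \
    "cd $root" \
    "git worktree remove .worktrees/$name" \
    "git branch -d $branch" \
    "bd close $bead"
}

post_merge_cleanup /home/duncan/work/code/projects/dirsql my-feature feat/my-feature ds-1
```

The `cd` before `git worktree remove` is the step the document warns must never be skipped or replaced with `git -C`.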
--- /dev/null
+++ dirsql-0.1.0/.github/workflows/patch-release.yml
@@ -0,0 +1,45 @@
+name: Patch Release
+
+on:
+  schedule:
+    # Run at 2:00 AM UTC every day
+    - cron: '0 2 * * *'
+  push:
+    branches: [main]
+  workflow_dispatch:
+
+jobs:
+  check:
+    runs-on: ubuntu-latest
+    outputs:
+      should_release: ${{ steps.decide.outputs.should_release }}
+    steps:
+      - id: decide
+        env:
+          EVENT: ${{ github.event_name }}
+          STRATEGY: ${{ vars.RELEASE_STRATEGY }}
+          COMMIT_MSG: ${{ github.event.head_commit.message }}
+        run: |
+          if [ "$EVENT" = "workflow_dispatch" ]; then
+            echo "should_release=true" >> "$GITHUB_OUTPUT"
+          elif [ "$EVENT" = "schedule" ] && [ "$STRATEGY" != "immediate" ]; then
+            echo "should_release=true" >> "$GITHUB_OUTPUT"
+          elif [ "$EVENT" = "push" ] && [ "$STRATEGY" = "immediate" ]; then
+            case "$COMMIT_MSG" in
+              *'[no-release]'*) echo "should_release=false" >> "$GITHUB_OUTPUT" ;;
+              *) echo "should_release=true" >> "$GITHUB_OUTPUT" ;;
+            esac
+          else
+            echo "should_release=false" >> "$GITHUB_OUTPUT"
+          fi
+
+  release:
+    needs: check
+    if: needs.check.outputs.should_release == 'true'
+    uses: ./.github/workflows/publish.yml
+    with:
+      bump_type: patch
+    secrets: inherit
+    permissions:
+      contents: write
+      id-token: write
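The `check` job's branching above can be replicated as a plain shell function for local reasoning. A sketch, assuming the same three inputs as the workflow (event name, `RELEASE_STRATEGY`, commit message); any strategy value other than `immediate` behaves as scheduled-only, matching the workflow:

```shell
# Mirror of the patch-release check job's decision logic:
# - workflow_dispatch always releases
# - schedule releases unless the strategy is "immediate"
# - push releases only under "immediate", unless the commit says [no-release]
should_release() {
  local event=$1 strategy=$2 commit_msg=${3:-}
  if [ "$event" = "workflow_dispatch" ]; then
    echo true
  elif [ "$event" = "schedule" ] && [ "$strategy" != "immediate" ]; then
    echo true
  elif [ "$event" = "push" ] && [ "$strategy" = "immediate" ]; then
    case "$commit_msg" in
      *'[no-release]'*) echo false ;;
      *) echo true ;;
    esac
  else
    echo false
  fi
}
```

Note the two modes are mutually exclusive: with `RELEASE_STRATEGY=immediate` the nightly schedule is a no-op, and without it pushes never release directly.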
--- /dev/null
+++ dirsql-0.1.0/.github/workflows/publish.yml
@@ -0,0 +1,300 @@
+name: Publish Release
+
+on:
+  workflow_call:
+    inputs:
+      bump_type:
+        description: 'Version bump type: patch or minor'
+        required: true
+        type: string
+
+jobs:
+  check-python:
+    runs-on: ubuntu-latest
+    outputs:
+      has_python: ${{ steps.check.outputs.has_python }}
+    steps:
+      - uses: actions/checkout@v6
+      - name: Check for pyproject.toml
+        id: check
+        run: |
+          if [ -f pyproject.toml ]; then
+            echo "has_python=true" >> $GITHUB_OUTPUT
+          else
+            echo "has_python=false" >> $GITHUB_OUTPUT
+          fi
+
+  build:
+    needs: check-python
+    if: needs.check-python.outputs.has_python == 'true'
+    runs-on: ${{ matrix.os }}
+    strategy:
+      fail-fast: false
+      matrix:
+        include:
+          - os: ubuntu-latest
+            target: x86_64-unknown-linux-gnu
+          # aarch64-unknown-linux-gnu disabled: cross-compilation fails in manylinux container
+          # TODO: re-enable once bundled SQLite cross-compilation is sorted out
+          - os: macos-latest
+            target: x86_64-apple-darwin
+          - os: macos-latest
+            target: aarch64-apple-darwin
+          - os: windows-latest
+            target: x86_64-pc-windows-msvc
+    steps:
+      - uses: actions/checkout@v6
+        with:
+          fetch-depth: 0
+
+      - uses: PyO3/maturin-action@v1
+        with:
+          target: ${{ matrix.target }}
+          args: --release --out dist
+          manylinux: auto
+
+      - uses: actions/upload-artifact@v4
+        with:
+          name: wheels-${{ matrix.target }}
+          path: dist
+
+  sdist:
+    needs: check-python
+    if: needs.check-python.outputs.has_python == 'true'
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v6
+        with:
+          fetch-depth: 0
+
+      - uses: PyO3/maturin-action@v1
+        with:
+          command: sdist
+          args: --out dist
+
+      - uses: actions/upload-artifact@v4
+        with:
+          name: sdist
+          path: dist
+
+  tag:
+    runs-on: ubuntu-latest
+    permissions:
+      contents: write
+    outputs:
+      created: ${{ steps.tag.outputs.created }}
+      new_version: ${{ steps.version.outputs.new_version }}
+    steps:
+      - uses: actions/checkout@v6
+        with:
+          fetch-depth: 0
+
+      - name: Get current version
+        id: current
+        run: |
+          latest_tag=$(git tag --sort=-v:refname | grep -E '^v[0-9]' | head -n1)
+
+          if [ -z "$latest_tag" ]; then
+            echo "No tags found"
+            echo "version=0.0.0" >> $GITHUB_OUTPUT
+            echo "has_tags=false" >> $GITHUB_OUTPUT
+          else
+            echo "Latest tag: $latest_tag"
+            echo "version=${latest_tag#v}" >> $GITHUB_OUTPUT
+            echo "has_tags=true" >> $GITHUB_OUTPUT
+
+            commits_since_tag=$(git rev-list ${latest_tag}..HEAD --count)
+            echo "commits_since_tag=$commits_since_tag" >> $GITHUB_OUTPUT
+          fi
+
+      - name: Check for changes (patch only)
+        if: inputs.bump_type == 'patch' && steps.current.outputs.has_tags == 'true'
+        id: check_changes
+        run: |
+          if [ "${{ steps.current.outputs.commits_since_tag }}" -eq 0 ]; then
+            echo "No new commits since last tag, skipping release"
+            echo "should_release=false" >> $GITHUB_OUTPUT
+          else
+            echo "Found ${{ steps.current.outputs.commits_since_tag }} commits since last tag"
+            echo "should_release=true" >> $GITHUB_OUTPUT
+          fi
+
+      - name: Calculate new version
+        id: version
+        run: |
+          current="${{ steps.current.outputs.version }}"
+          IFS='.' read -r major minor patch <<< "$current"
+
+          if [ "${{ inputs.bump_type }}" == "minor" ]; then
+            new_version="${major}.$((minor + 1)).0"
+          else
+            new_version="${major}.${minor}.$((patch + 1))"
+          fi
+
+          echo "new_version=$new_version" >> $GITHUB_OUTPUT
+          echo "New version will be: $new_version"
+
+      - name: Check if tag exists
+        id: tag_check
+        run: |
+          if git ls-remote --tags origin | grep -q "refs/tags/v${{ steps.version.outputs.new_version }}$"; then
+            echo "Tag v${{ steps.version.outputs.new_version }} already exists on remote"
+            echo "exists=true" >> $GITHUB_OUTPUT
+          else
+            echo "exists=false" >> $GITHUB_OUTPUT
+          fi
+
+      - name: Determine if release should proceed
+        id: should_release
+        run: |
+          if [ "${{ steps.tag_check.outputs.exists }}" == "true" ]; then
+            echo "proceed=false" >> $GITHUB_OUTPUT
+            exit 0
+          fi
+
+          if [ "${{ inputs.bump_type }}" == "patch" ] && [ "${{ steps.current.outputs.has_tags }}" == "true" ]; then
+            if [ "${{ steps.check_changes.outputs.should_release }}" == "false" ]; then
+              echo "proceed=false" >> $GITHUB_OUTPUT
+              exit 0
+            fi
+          fi
+
+          echo "proceed=true" >> $GITHUB_OUTPUT
+
+      - name: Create and push tag
+        if: steps.should_release.outputs.proceed == 'true'
+        id: tag
+        run: |
+          git config user.name "github-actions[bot]"
+          git config user.email "github-actions[bot]@users.noreply.github.com"
+          git tag -a "v${{ steps.version.outputs.new_version }}" -m "Release v${{ steps.version.outputs.new_version }}"
+          git push origin "v${{ steps.version.outputs.new_version }}"
+          echo "created=true" >> $GITHUB_OUTPUT
+
+  publish-pypi:
+    needs: [tag, build, sdist]
+    if: always() && needs.tag.outputs.created == 'true' && needs.sdist.result == 'success'
+    runs-on: ubuntu-latest
+    permissions:
+      id-token: write
+    environment: release
+    steps:
+      - uses: actions/download-artifact@v4
+        with:
+          pattern: wheels-*
+          merge-multiple: true
+          path: dist
+
+      - uses: actions/download-artifact@v4
+        with:
+          name: sdist
+          path: dist
+
+      - name: Publish to PyPI
+        uses: pypa/gh-action-pypi-publish@release/v1
+        with:
+          attestations: false
+
+  publish-crates:
+    needs: tag
+    if: needs.tag.outputs.created == 'true'
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v6
+        with:
+          fetch-depth: 0
+
+      - name: Install Rust
+        uses: dtolnay/rust-toolchain@stable
+
+      - name: Update Cargo.toml version
+        run: |
+          sed -i 's/^version = ".*"/version = "${{ needs.tag.outputs.new_version }}"/' Cargo.toml
+
+      - name: Publish to crates.io
+        env:
+          CARGO_REGISTRY_TOKEN: ${{ secrets.CARGO_REGISTRY_TOKEN }}
+        run: |
+          for attempt in 1 2 3; do
+            echo "Attempt $attempt of 3"
+            if cargo publish --allow-dirty; then
+              echo "Published successfully"
+              break
+            fi
+            if [ "$attempt" -lt 3 ]; then
+              echo "Publish failed, retrying in 15s..."
+              sleep 15
+            else
+              echo "All attempts failed"
+              exit 1
+            fi
+          done
+
+  publish-npm:
+    needs: tag
+    if: needs.tag.outputs.created == 'true'
+    runs-on: ubuntu-latest
+    permissions:
+      contents: read
+      id-token: write
+    steps:
+      - uses: actions/checkout@v6
+
+      - name: Check if package.json exists
+        id: check
+        run: |
+          if [ -f package.json ]; then
+            echo "exists=true" >> $GITHUB_OUTPUT
+          else
+            echo "exists=false" >> $GITHUB_OUTPUT
+          fi
+
+      - name: Setup Node.js
+        if: steps.check.outputs.exists == 'true'
+        uses: actions/setup-node@v4
+        with:
+          node-version: '20'
+          registry-url: https://registry.npmjs.org
+
+      - name: Publish to npm
+        if: steps.check.outputs.exists == 'true'
+        run: npm publish --provenance --access public
+
+  github-release:
+    needs: [tag, publish-pypi, publish-crates, publish-npm]
+    if: always() && needs.tag.outputs.created == 'true'
+    runs-on: ubuntu-latest
+    permissions:
+      contents: write
+    steps:
+      - uses: actions/checkout@v6
+        with:
+          fetch-depth: 0
+
+      - name: Create GitHub Release
+        env:
+          GH_TOKEN: ${{ github.token }}
+        run: |
+          gh release create "v${{ needs.tag.outputs.new_version }}" \
+            --repo ${{ github.repository }} \
+            --title "v${{ needs.tag.outputs.new_version }}" \
+            --generate-notes
+
+  rollback:
+    needs: [tag, publish-pypi, publish-crates, publish-npm]
+    if: |
+      always() &&
+      needs.tag.outputs.created == 'true' &&
+      needs.publish-pypi.result != 'success' &&
+      needs.publish-crates.result != 'success' &&
+      needs.publish-npm.result != 'success'
+    runs-on: ubuntu-latest
+    permissions:
+      contents: write
+    steps:
+      - uses: actions/checkout@v6
+        with:
+          fetch-depth: 0
+
+      - name: Rollback tag on failure
+        run: git push --delete origin "v${{ needs.tag.outputs.new_version }}"
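The tag job's "Calculate new version" step above reduces to simple semver arithmetic. Extracted as a standalone function (a sketch mirroring the workflow's bash, which assumes tags of the form vMAJOR.MINOR.PATCH):

```shell
# Bump a MAJOR.MINOR.PATCH version the way the publish workflow does:
# a minor bump increments minor and resets patch to 0;
# anything else is treated as a patch bump of the last field.
bump_version() {
  local current=$1 bump_type=$2 major minor patch
  IFS='.' read -r major minor patch <<< "$current"
  if [ "$bump_type" = "minor" ]; then
    echo "${major}.$((minor + 1)).0"
  else
    echo "${major}.${minor}.$((patch + 1))"
  fi
}
```

With no tags present the workflow seeds `version=0.0.0`, so the first patch release produces v0.0.1 and the first minor release v0.1.0.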
--- /dev/null
+++ dirsql-0.1.0/.github/workflows/python-lint.yml
@@ -0,0 +1,35 @@
+name: Python Lint
+
+on:
+  push:
+    branches: [main]
+    paths:
+      - '**.py'
+      - 'pyproject.toml'
+      - 'uv.lock'
+  pull_request:
+    paths:
+      - '**.py'
+      - 'pyproject.toml'
+      - 'uv.lock'
+
+jobs:
+  lint:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v6
+
+      - name: Install uv
+        uses: astral-sh/setup-uv@v7
+
+      - name: Install just
+        uses: extractions/setup-just@v2
+
+      - name: Install dependencies
+        run: uv sync --extra dev
+
+      - name: Lint
+        run: uv run just lint
+
+      - name: Format check
+        run: uv run just format-check
--- /dev/null
+++ dirsql-0.1.0/.github/workflows/python-test.yml
@@ -0,0 +1,45 @@
+name: Python Test
+
+on:
+  push:
+    branches: [main]
+    paths:
+      - '**.py'
+      - 'pyproject.toml'
+      - 'uv.lock'
+      - 'tests/**'
+  pull_request:
+    paths:
+      - '**.py'
+      - 'pyproject.toml'
+      - 'uv.lock'
+      - 'tests/**'
+
+jobs:
+  test:
+    runs-on: ubuntu-latest
+    strategy:
+      matrix:
+        python-version: ["3.12", "3.13"]
+    steps:
+      - uses: actions/checkout@v6
+
+      - name: Install Rust
+        uses: dtolnay/rust-toolchain@stable
+
+      - name: Install uv
+        uses: astral-sh/setup-uv@v7
+        with:
+          python-version: ${{ matrix.python-version }}
+
+      - name: Install just
+        uses: extractions/setup-just@v2
+
+      - name: Install dependencies
+        run: uv sync --extra dev
+
+      - name: Build Rust extension
+        run: uv run maturin develop
+
+      - name: Run tests
+        run: uv run just test-ci
--- /dev/null
+++ dirsql-0.1.0/.github/workflows/rust-test.yml
@@ -0,0 +1,41 @@
+name: Rust Test
+
+on:
+  push:
+    branches: [main]
+    paths:
+      - '**.rs'
+      - 'Cargo.toml'
+      - 'Cargo.lock'
+  pull_request:
+    paths:
+      - '**.rs'
+      - 'Cargo.toml'
+      - 'Cargo.lock'
+
+jobs:
+  test:
+    runs-on: ubuntu-latest
+    steps:
+      - uses: actions/checkout@v6
+
+      - name: Install Rust
+        uses: dtolnay/rust-toolchain@stable
+
+      - name: Cache cargo
+        uses: actions/cache@v4
+        with:
+          path: |
+            ~/.cargo/registry
+            ~/.cargo/git
+            target
+          key: ${{ runner.os }}-cargo-${{ hashFiles('Cargo.lock') }}
+
+      - name: Run tests
+        run: cargo test
+
+      - name: Clippy
+        run: cargo clippy -- -D warnings
+
+      - name: Format check
+        run: cargo fmt -- --check