Mirror of https://github.com/GeWuYou/GFramework.git (synced 2026-05-14 14:58:58 +08:00)

# GFramework Skills

The public entry points are currently `gframework-doc-refresh`, `gframework-batch-boot`, and `gframework-multi-agent-batch`.

## Public entry points

### `gframework-doc-refresh`

Drives documentation refresh by source module, instead of splitting entry points by type such as `guide`, `tutorial`, or `api`.

When to use:

- Refreshing a module's landing page
- Checking that topic pages are consistent with the source code, tests, and README
- Deciding whether an API reference or tutorial needs to be added
- Bringing in consumer wiring from `ai-libs/` as supplementary evidence when the adoption path is unclear

Recommended invocation:

```bash
/gframework-doc-refresh <module>
```

Examples:

```bash
/gframework-doc-refresh Core
/gframework-doc-refresh Godot.SourceGenerators
/gframework-doc-refresh Cqrs
```

### `gframework-batch-boot`

Builds on `gframework-boot` to automatically drive repeatable, batchable tasks forward, without a human re-triggering each round.

When to use:

- Analyzer warning reduction
- Large-scale test structure consolidation
- Per-module documentation refresh waves
- Any multi-batch task with a clear stop condition

Recommended invocation:

```bash
/gframework-batch-boot <task-or-stop-condition>
```

Batch threshold shorthand:

```bash
/gframework-batch-boot 75
/gframework-batch-boot 75 2000
```

- A single number means "stop when all commits on the current branch approach that many changed files relative to remote `origin/main`"
- Two numbers mean "`<files> OR <lines>` changed on the current branch relative to remote `origin/main`", in the fixed order `<files> <lines>`
- Avoid writing `/gframework-batch-boot 75 | 2000`, since `|` reads like a shell pipe; if a user writes it anyway, interpret it with OR semantics and normalize it to the pipe-free form in follow-up explanations
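
The OR semantics of the two-number form can be sketched as a small stop-condition check. This is illustrative only: `should_stop` and the git commands in the comments are assumptions for the sketch, not part of the skill itself.

```bash
# Sketch: OR-semantics stop check for "<files> <lines>" thresholds.
# In a real run the two inputs could come from, e.g.:
#   files: git diff --name-only origin/main...HEAD | wc -l
#   lines: git diff --numstat origin/main...HEAD | awk '{n += $1 + $2} END {print n}'
should_stop() {
  local files="$1" lines="$2" max_files="${3:-75}" max_lines="${4:-2000}"
  # Stop when EITHER threshold is reached (OR semantics).
  [ "$files" -ge "$max_files" ] || [ "$lines" -ge "$max_lines" ]
}

should_stop 80 100 && echo "stop"      # 80 files >= 75: stop
should_stop 10 2500 && echo "stop"     # 2500 lines >= 2000: stop
should_stop 10 100 || echo "continue"  # neither threshold reached
```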

Examples:

```bash
/gframework-batch-boot 75
/gframework-batch-boot 75 2000
/gframework-batch-boot continue analyzer warning reduction until branch diff vs origin/main approaches 75 files
/gframework-batch-boot keep refactoring repetitive source-generator tests in bounded batches
```

### `gframework-multi-agent-batch`

Use this entry point when the user wants the main agent to split the task, dispatch non-conflicting subagent slices, track progress, maintain `ai-plan`, verify results, and keep driving the work forward.

When to use:

- A complex task can clearly be split into multiple non-conflicting write surfaces
- The main agent needs to keep reviewing and integrating, rather than handing execution entirely to a single worker
- Delegated scope, verification results, and the next resume point need to be written back to `ai-plan`
- The task must still respect branch diff, context budget, and reviewability boundaries

Recommended invocation:

```bash
/gframework-multi-agent-batch <task>
```

Examples:

```bash
/gframework-multi-agent-batch continue the current cqrs optimization by delegating non-conflicting benchmark and runtime slices
/gframework-multi-agent-batch coordinate parallel subagents, keep ai-plan updated, and stop when reviewability starts to degrade
```

## Shared resources

- `_shared/DOCUMENTATION_STANDARDS.md`
  - Unified documentation rules, evidence order, and validation requirements
- `_shared/module-map.json`
  - Machine-readable module mapping table
- `_shared/module-config.sh`
  - Lightweight shell helper functions
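
As an illustration of what `module-config.sh` provides, here is a trimmed excerpt of its normalization logic together with example calls (excerpt only; see the script for the full alias table):

```bash
# Trimmed excerpt of normalize_module from _shared/module-config.sh:
# lower-case the input, unify separators to "-", then map known aliases.
normalize_module() {
  local INPUT
  INPUT="$(echo "$1" | tr '[:upper:]' '[:lower:]' | tr ' ' '-' | tr '_' '-')"
  case "$INPUT" in
    core|core-runtime|runtime-core|core-module) echo "Core" ;;
    cqrs|mediator|cqrs-module) echo "Cqrs" ;;
    godot.sourcegenerators|godot-source-generators|godot-generators) echo "Godot.SourceGenerators" ;;
    *) return 1 ;;
  esac
}

normalize_module "Core Runtime"    # → Core
normalize_module "mediator"        # → Cqrs
normalize_module "unknown" || echo "not a module"
```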

## Internal resources

`gframework-doc-refresh/` contains:

- `references/`
  - Module selection, evidence order, output strategy
- `templates/`
  - Landing page, topic page, API reference, and tutorial templates
- `scripts/`
  - Module scanning and documentation validation scripts

The old `vitepress-*` skills are no longer kept as parallel public entry points.

# GFramework Documentation Standards

This file keeps only the writing and validation rules that hold stably across modules; it no longer maintains fixed page inventories, which drift out of date easily.

The mapping from modules to source code, tests, READMEs, `docs/zh-CN` sections, and `ai-libs/` reference entry points is defined authoritatively by
`.agents/skills/_shared/module-map.json`.

## Evidence order

Use the following order to decide what documentation should be written, deleted, or kept:

1. Source code, public XML docs, `*.csproj`
2. Corresponding tests and snapshots
3. The module `README.md`
4. Current `docs/zh-CN` pages
5. Verified consumer projects under `ai-libs/`
6. Archived docs, consulted only when the evidence above cannot explain current behavior

Do not treat copying old documents into each other as an "update".

## Module-driven rules

- Normalize the input to a source module first, then decide whether the work lands in a landing page, topic page, API reference, tutorial, or validation only.
- If the user gives a docs-section name rather than a source module name, map it back to a module first; if it remains ambiguous, offer only a normalization suggestion and do not generate documentation.
- Docs sections are derived output, not the primary input source.

## `ai-libs/` usage boundaries

`ai-libs/` is used only to supplement consumer-side evidence:

- Verifying real integration directory layouts
- Looking up minimal wiring and extension-point assembly
- Providing end-to-end examples for the adoption path

Do not use `ai-libs/` to override the following facts:

- Public API contracts
- The support scope of the current version
- Source generator diagnostics and generation semantics

If `ai-libs/` conflicts with the current source code or tests, the current repository implementation wins, and the documentation should state the migration or compatibility boundary explicitly.

## Public documentation boundaries

- `README.md` and `docs/**` are for framework users, not governance operators.
- Do not write internal governance information such as inventories, coverage baselines, resume points, batch thresholds, review threads, or pending audit waves into public pages.
- XML docs, READMEs, tests, and `ai-libs/` evidence may drive documentation decisions, but public pages should only output what readers actually need:
  - Module boundaries
  - The minimal integration path
  - A recommended reading order
  - Entry-point hints for source / XML / API
- If governance baselines, inventory results, or follow-up governance plans really must be kept, write them into `ai-plan/**` or other contributor-only records.
- When a page needs to reference XML docs, say which types, namespaces, or contracts to look at first and why; do not state coverage counts, inventory dates, or covered/not-covered status.

## Markdown rules

### Generics and HTML escaping

Generic types and XML tags appearing outside code blocks must be escaped:

- `List<T>`
- `Result<TValue, TError>`
- `<summary>`
- `<param>`

### Frontmatter

Every document must include valid frontmatter:

```yaml
---
title: Document title
description: One or two sentences describing what problem this page solves
---
```
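
A minimal sketch of such a check (illustrative only; the authoritative check is `validate-frontmatter.sh` under `gframework-doc-refresh/scripts/`):

```bash
# Sketch: a page passes if line 1 is exactly "---" and a closing "---"
# appears within the next few lines.
has_frontmatter() {
  [ "$(sed -n '1p' "$1")" = "---" ] && sed -n '2,20p' "$1" | grep -q '^---$'
}

printf -- '---\ntitle: Demo\ndescription: Demo page\n---\n# Body\n' > /tmp/ok.md
printf -- '# No frontmatter\n' > /tmp/bad.md
has_frontmatter /tmp/ok.md && echo "ok"
has_frontmatter /tmp/bad.md || echo "missing frontmatter"
```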

### Code blocks

- Always tag the language, e.g. `csharp`, `bash`, `json`
- Examples keep only the minimal path that is traceable to the current implementation
- Where needed, add Chinese comments explaining the integration rationale or boundaries; do not pile up comments unrelated to keeping the code in sync

### Links

- Only link to pages that actually exist in the current repository
- Prefer the `/zh-CN/...` form for in-site links
- If the docs site cannot link outside the `docs/` root, do not turn the repository README into an in-site link
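
A rough sketch of what such a link check can look like (illustrative only; the real check is `validate-links.sh`, and this sketch assumes in-site links are written with a `.md` suffix):

```bash
# Sketch: flag /zh-CN/... links that do not resolve to a file under docs/.
check_links() {
  grep -o '](/zh-CN/[^)]*)' "$1" | sed 's/^](//; s/)$//' | while read -r link; do
    [ -e "docs${link}" ] || echo "missing: $link"
  done
}

# Hypothetical demo layout: one existing target, one dangling link.
mkdir -p docs/zh-CN/core
touch docs/zh-CN/core/index.md
printf '[core](/zh-CN/core/index.md) [gone](/zh-CN/missing.md)\n' > page.md
check_links page.md   # → missing: /zh-CN/missing.md
```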

## Output priority

Decide outputs in the following order:

1. First fix module READMEs, landing pages, and the adoption path
2. Then fix stale topic pages
3. Then fill in API references
4. Tutorials come last

## User-page checkpoints

- After reading a page, a user should know how to adopt the framework and which entry points to look at, not which round the current governance batch has reached.
- If a passage loses its value once dates, counts, baselines, and governance jargon are removed, it most likely does not belong in public documentation.
- Tables should express "when to read what, and which problem it solves", not "how far the current inventory has covered".

## Validation checklist

- [ ] Frontmatter is valid
- [ ] All code blocks carry language tags
- [ ] Generics and XML tags are escaped
- [ ] In-site links resolve
- [ ] Examples match the current implementation
- [ ] `ai-libs/` is used only as consumer integration reference and does not override source contracts

## Validation tools

Reuse the validation scripts under `gframework-doc-refresh/scripts/`:

- `validate-frontmatter.sh`
- `validate-links.sh`
- `validate-code-blocks.sh`
- `validate-all.sh`

For site-level validation, run:

```bash
cd docs && bun run build
```
#!/bin/bash
# Shared module configuration.
# The machine-readable mapping is defined by .agents/skills/_shared/module-map.json.

normalize_module() {
  local INPUT
  INPUT="$(echo "$1" | tr '[:upper:]' '[:lower:]' | tr ' ' '-' | tr '_' '-')"

  case "$INPUT" in
    core|core-runtime|runtime-core|core-module)
      echo "Core"
      ;;
    core.abstractions|core-abstractions)
      echo "Core.Abstractions"
      ;;
    core.sourcegenerators|core-source-generators|core-sourcegenerators)
      echo "Core.SourceGenerators"
      ;;
    core.sourcegenerators.abstractions|core-source-generators-abstractions)
      echo "Core.SourceGenerators.Abstractions"
      ;;
    game|game-runtime|runtime-game|game-module)
      echo "Game"
      ;;
    game.abstractions|game-abstractions)
      echo "Game.Abstractions"
      ;;
    game.sourcegenerators|game-source-generators)
      echo "Game.SourceGenerators"
      ;;
    godot|godot-runtime|runtime-godot|godot-module)
      echo "Godot"
      ;;
    godot.sourcegenerators|godot-source-generators|godot-generators)
      echo "Godot.SourceGenerators"
      ;;
    godot.sourcegenerators.abstractions|godot-source-generators-abstractions)
      echo "Godot.SourceGenerators.Abstractions"
      ;;
    cqrs|mediator|cqrs-module)
      echo "Cqrs"
      ;;
    cqrs.abstractions|cqrs-abstractions)
      echo "Cqrs.Abstractions"
      ;;
    cqrs.sourcegenerators|cqrs-source-generators)
      echo "Cqrs.SourceGenerators"
      ;;
    ecs|ecs.arch|ecs-arch)
      echo "Ecs.Arch"
      ;;
    ecs.arch.abstractions|ecs-arch-abstractions)
      echo "Ecs.Arch.Abstractions"
      ;;
    sourcegenerators.common|source-generators-common)
      echo "SourceGenerators.Common"
      ;;
    *)
      return 1
      ;;
  esac
}

get_all_modules() {
  cat <<'EOF'
Core
Core.Abstractions
Core.SourceGenerators
Core.SourceGenerators.Abstractions
Game
Game.Abstractions
Game.SourceGenerators
Godot
Godot.SourceGenerators
Godot.SourceGenerators.Abstractions
Cqrs
Cqrs.Abstractions
Cqrs.SourceGenerators
Ecs.Arch
Ecs.Arch.Abstractions
SourceGenerators.Common
EOF
}

is_valid_module() {
  normalize_module "$1" >/dev/null 2>&1
}

get_source_dirs() {
  local MODULE
  MODULE="$(normalize_module "$1")" || return 1

  case "$MODULE" in
    Core)
      echo "GFramework.Core"
      ;;
    Core.Abstractions)
      echo "GFramework.Core.Abstractions"
      ;;
    Core.SourceGenerators)
      echo "GFramework.Core.SourceGenerators"
      ;;
    Core.SourceGenerators.Abstractions)
      echo "GFramework.Core.SourceGenerators.Abstractions"
      ;;
    Game)
      echo "GFramework.Game"
      ;;
    Game.Abstractions)
      echo "GFramework.Game.Abstractions"
      ;;
    Game.SourceGenerators)
      echo "GFramework.Game.SourceGenerators"
      ;;
    Godot)
      echo "GFramework.Godot"
      ;;
    Godot.SourceGenerators)
      echo "GFramework.Godot.SourceGenerators"
      ;;
    Godot.SourceGenerators.Abstractions)
      echo "GFramework.Godot.SourceGenerators.Abstractions"
      ;;
    Cqrs)
      echo "GFramework.Cqrs"
      ;;
    Cqrs.Abstractions)
      echo "GFramework.Cqrs.Abstractions"
      ;;
    Cqrs.SourceGenerators)
      echo "GFramework.Cqrs.SourceGenerators"
      ;;
    Ecs.Arch)
      echo "GFramework.Ecs.Arch"
      ;;
    Ecs.Arch.Abstractions)
      echo "GFramework.Ecs.Arch.Abstractions"
      ;;
    SourceGenerators.Common)
      echo "GFramework.SourceGenerators.Common"
      ;;
  esac
}

get_test_projects() {
  local MODULE
  MODULE="$(normalize_module "$1")" || return 1

  case "$MODULE" in
    Core|Core.Abstractions)
      echo "GFramework.Core.Tests/GFramework.Core.Tests.csproj"
      ;;
    Core.SourceGenerators|Core.SourceGenerators.Abstractions|Game.SourceGenerators|Cqrs.SourceGenerators|SourceGenerators.Common)
      echo "GFramework.SourceGenerators.Tests/GFramework.SourceGenerators.Tests.csproj"
      ;;
    Game|Game.Abstractions)
      echo "GFramework.Game.Tests/GFramework.Game.Tests.csproj"
      ;;
    Godot)
      echo "GFramework.Godot.Tests/GFramework.Godot.Tests.csproj"
      ;;
    Godot.SourceGenerators|Godot.SourceGenerators.Abstractions)
      echo "GFramework.Godot.SourceGenerators.Tests/GFramework.Godot.SourceGenerators.Tests.csproj"
      ;;
    Cqrs|Cqrs.Abstractions)
      echo "GFramework.Cqrs.Tests/GFramework.Cqrs.Tests.csproj"
      ;;
    Ecs.Arch|Ecs.Arch.Abstractions)
      echo "GFramework.Ecs.Arch.Tests/GFramework.Ecs.Arch.Tests.csproj"
      ;;
  esac
}

get_readme_paths() {
  local MODULE
  MODULE="$(normalize_module "$1")" || return 1

  case "$MODULE" in
    Core)
      echo "GFramework.Core/README.md"
      ;;
    Core.Abstractions)
      echo "GFramework.Core.Abstractions/README.md"
      ;;
    Core.SourceGenerators)
      echo "GFramework.Core.SourceGenerators/README.md"
      ;;
    Core.SourceGenerators.Abstractions)
      echo "GFramework.Core.SourceGenerators.Abstractions/README.md"
      ;;
    Game)
      echo "GFramework.Game/README.md"
      ;;
    Game.Abstractions)
      echo "GFramework.Game.Abstractions/README.md"
      ;;
    Game.SourceGenerators)
      echo "GFramework.Game.SourceGenerators/README.md"
      ;;
    Godot)
      echo "GFramework.Godot/README.md"
      ;;
    Godot.SourceGenerators)
      echo "GFramework.Godot.SourceGenerators/README.md"
      ;;
    Godot.SourceGenerators.Abstractions)
      echo "GFramework.Godot.SourceGenerators.Abstractions/README.md"
      ;;
    Cqrs)
      echo "GFramework.Cqrs/README.md"
      ;;
    Cqrs.Abstractions)
      echo "GFramework.Cqrs.Abstractions/README.md"
      ;;
    Cqrs.SourceGenerators)
      echo "GFramework.Cqrs.SourceGenerators/README.md"
      ;;
    Ecs.Arch)
      echo "GFramework.Ecs.Arch/README.md"
      ;;
    Ecs.Arch.Abstractions)
      echo "GFramework.Ecs.Arch.Abstractions/README.md"
      ;;
    SourceGenerators.Common)
      echo "GFramework.SourceGenerators.Common/README.md"
      ;;
    *)
      return 1
      ;;
  esac
}

infer_module_from_namespace() {
  local NAMESPACE="$1"

  if [[ "$NAMESPACE" == GFramework.Core.SourceGenerators.Abstractions* ]]; then
    echo "Core.SourceGenerators.Abstractions"
  elif [[ "$NAMESPACE" == GFramework.Core.SourceGenerators* ]]; then
    echo "Core.SourceGenerators"
  elif [[ "$NAMESPACE" == GFramework.Core.Abstractions* ]]; then
    echo "Core.Abstractions"
  elif [[ "$NAMESPACE" == GFramework.Core* ]]; then
    echo "Core"
  elif [[ "$NAMESPACE" == GFramework.Game.SourceGenerators* ]]; then
    echo "Game.SourceGenerators"
  elif [[ "$NAMESPACE" == GFramework.Game.Abstractions* ]]; then
    echo "Game.Abstractions"
  elif [[ "$NAMESPACE" == GFramework.Game* ]]; then
    echo "Game"
  elif [[ "$NAMESPACE" == GFramework.Godot.SourceGenerators.Abstractions* ]]; then
    echo "Godot.SourceGenerators.Abstractions"
  elif [[ "$NAMESPACE" == GFramework.Godot.SourceGenerators* ]]; then
    echo "Godot.SourceGenerators"
  elif [[ "$NAMESPACE" == GFramework.Godot* ]]; then
    echo "Godot"
  elif [[ "$NAMESPACE" == GFramework.Cqrs.SourceGenerators* ]]; then
    echo "Cqrs.SourceGenerators"
  elif [[ "$NAMESPACE" == GFramework.Cqrs.Abstractions* ]]; then
    echo "Cqrs.Abstractions"
  elif [[ "$NAMESPACE" == GFramework.Cqrs* ]]; then
    echo "Cqrs"
  elif [[ "$NAMESPACE" == GFramework.Ecs.Arch.Abstractions* ]]; then
    echo "Ecs.Arch.Abstractions"
  elif [[ "$NAMESPACE" == GFramework.Ecs.Arch* ]]; then
    echo "Ecs.Arch"
  elif [[ "$NAMESPACE" == GFramework.SourceGenerators.Common* ]]; then
    echo "SourceGenerators.Common"
  else
    return 1
  fi
}
{
  "version": 1,
  "description": "Canonical documentation refresh module map for GFramework skills.",
  "modules": {
    "Core": {
      "aliases": ["core", "core-runtime", "runtime-core", "core module"],
      "source_paths": ["GFramework.Core"],
      "project_file": "GFramework.Core/GFramework.Core.csproj",
      "test_projects": ["GFramework.Core.Tests/GFramework.Core.Tests.csproj"],
      "readme_paths": ["GFramework.Core/README.md"],
      "docs": {
        "landing": ["docs/zh-CN/core/index.md"],
        "topics": [
          "docs/zh-CN/core/architecture.md",
          "docs/zh-CN/core/context.md",
          "docs/zh-CN/core/lifecycle.md",
          "docs/zh-CN/core/events.md",
          "docs/zh-CN/core/property.md",
          "docs/zh-CN/core/logging.md",
          "docs/zh-CN/core/state-management.md",
          "docs/zh-CN/core/coroutine.md"
        ],
        "fallback": [
          "docs/zh-CN/getting-started/quick-start.md",
          "docs/zh-CN/api-reference/index.md"
        ]
      },
      "ai_libs": {
        "paths": [
          "ai-libs/CoreGrid/CoreGrid.csproj",
          "ai-libs/CoreGrid/global",
          "ai-libs/CoreGrid/docs"
        ],
        "search_hints": [
          "rg -n \"Architecture|RegisterModel|RegisterSystem|BindableProperty|EventBus\" ai-libs/CoreGrid"
        ]
      }
    },
    "Core.Abstractions": {
      "aliases": ["core.abstractions", "core-abstractions", "core abstractions"],
      "source_paths": ["GFramework.Core.Abstractions"],
      "project_file": "GFramework.Core.Abstractions/GFramework.Core.Abstractions.csproj",
      "test_projects": ["GFramework.Core.Tests/GFramework.Core.Tests.csproj"],
      "readme_paths": ["GFramework.Core.Abstractions/README.md"],
      "docs": {
        "landing": ["docs/zh-CN/abstractions/core-abstractions.md"],
        "topics": [
          "docs/zh-CN/core/index.md",
          "docs/zh-CN/core/architecture.md",
          "docs/zh-CN/core/context.md"
        ],
        "fallback": ["docs/zh-CN/api-reference/index.md"]
      },
      "ai_libs": {
        "paths": ["ai-libs/CoreGrid/CoreGrid.csproj"],
        "search_hints": [
          "rg -n \"GFramework\\.Core\\.Abstractions|IArchitecture|IModel|ISystem\" ai-libs/CoreGrid"
        ]
      }
    },
    "Core.SourceGenerators": {
      "aliases": [
        "core.sourcegenerators",
        "core-sourcegenerators",
        "core-source-generators",
        "core source generators"
      ],
      "source_paths": ["GFramework.Core.SourceGenerators"],
      "project_file": "GFramework.Core.SourceGenerators/GFramework.Core.SourceGenerators.csproj",
      "test_projects": ["GFramework.SourceGenerators.Tests/GFramework.SourceGenerators.Tests.csproj"],
      "readme_paths": ["GFramework.Core.SourceGenerators/README.md"],
      "docs": {
        "landing": ["docs/zh-CN/source-generators/index.md"],
        "topics": [
          "docs/zh-CN/source-generators/context-aware-generator.md",
          "docs/zh-CN/source-generators/context-get-generator.md",
          "docs/zh-CN/source-generators/priority-generator.md",
          "docs/zh-CN/source-generators/logging-generator.md"
        ],
        "fallback": ["docs/zh-CN/api-reference/index.md"]
      },
      "ai_libs": {
        "paths": ["ai-libs/CoreGrid/global", "ai-libs/CoreGrid/scripts"],
        "search_hints": [
          "rg -n \"ContextAware|ContextGet|Priority|GeneratedLogger|Log\" ai-libs/CoreGrid"
        ]
      }
    },
    "Core.SourceGenerators.Abstractions": {
      "aliases": [
        "core.sourcegenerators.abstractions",
        "core-source-generators-abstractions",
        "core source generators abstractions"
      ],
      "source_paths": ["GFramework.Core.SourceGenerators.Abstractions"],
      "project_file": "GFramework.Core.SourceGenerators.Abstractions/GFramework.Core.SourceGenerators.Abstractions.csproj",
      "test_projects": ["GFramework.SourceGenerators.Tests/GFramework.SourceGenerators.Tests.csproj"],
      "readme_paths": [],
      "docs": {
        "landing": ["docs/zh-CN/source-generators/index.md"],
        "topics": [
          "docs/zh-CN/source-generators/context-aware-generator.md",
          "docs/zh-CN/source-generators/context-get-generator.md",
          "docs/zh-CN/source-generators/priority-generator.md"
        ],
        "fallback": ["docs/zh-CN/api-reference/index.md"]
      },
      "ai_libs": {
        "paths": ["ai-libs/CoreGrid/global"],
        "search_hints": [
          "rg -n \"GFramework\\.Core\\.SourceGenerators\\.Abstractions|\\[ContextAware\\]|\\[Priority\\]\" ai-libs/CoreGrid"
        ]
      }
    },
    "Game": {
      "aliases": ["game", "game-runtime", "runtime-game", "game module"],
      "source_paths": ["GFramework.Game"],
      "project_file": "GFramework.Game/GFramework.Game.csproj",
      "test_projects": ["GFramework.Game.Tests/GFramework.Game.Tests.csproj"],
      "readme_paths": ["GFramework.Game/README.md"],
      "docs": {
        "landing": ["docs/zh-CN/game/index.md"],
        "topics": [
          "docs/zh-CN/game/scene.md",
          "docs/zh-CN/game/ui.md",
          "docs/zh-CN/game/data.md",
          "docs/zh-CN/game/storage.md",
          "docs/zh-CN/game/serialization.md",
          "docs/zh-CN/game/setting.md",
          "docs/zh-CN/game/config-system.md"
        ],
        "fallback": [
          "docs/zh-CN/getting-started/quick-start.md",
          "docs/zh-CN/api-reference/index.md"
        ]
      },
      "ai_libs": {
        "paths": [
          "ai-libs/CoreGrid/global",
          "ai-libs/CoreGrid/scenes",
          "ai-libs/CoreGrid/addons"
        ],
        "search_hints": [
          "rg -n \"SceneRouter|UiRouter|Setting|Storage|Serialization|Config\" ai-libs/CoreGrid"
        ]
      }
    },
    "Game.Abstractions": {
      "aliases": ["game.abstractions", "game-abstractions", "game abstractions"],
      "source_paths": ["GFramework.Game.Abstractions"],
      "project_file": "GFramework.Game.Abstractions/GFramework.Game.Abstractions.csproj",
      "test_projects": ["GFramework.Game.Tests/GFramework.Game.Tests.csproj"],
      "readme_paths": ["GFramework.Game.Abstractions/README.md"],
      "docs": {
        "landing": ["docs/zh-CN/abstractions/game-abstractions.md"],
        "topics": [
          "docs/zh-CN/game/index.md",
          "docs/zh-CN/game/scene.md",
          "docs/zh-CN/game/ui.md"
        ],
        "fallback": ["docs/zh-CN/api-reference/index.md"]
      },
      "ai_libs": {
        "paths": ["ai-libs/CoreGrid/global", "ai-libs/CoreGrid/scenes"],
        "search_hints": [
          "rg -n \"ISceneFactory|ISceneRoot|UiInteractionProfile|IUiRoot\" ai-libs/CoreGrid"
        ]
      }
    },
    "Game.SourceGenerators": {
      "aliases": [
        "game.sourcegenerators",
        "game-source-generators",
        "game source generators"
      ],
      "source_paths": ["GFramework.Game.SourceGenerators"],
      "project_file": "GFramework.Game.SourceGenerators/GFramework.Game.SourceGenerators.csproj",
      "test_projects": ["GFramework.SourceGenerators.Tests/GFramework.SourceGenerators.Tests.csproj"],
      "readme_paths": ["GFramework.Game.SourceGenerators/README.md"],
      "docs": {
        "landing": ["docs/zh-CN/source-generators/index.md"],
        "topics": [
          "docs/zh-CN/source-generators/auto-scene-generator.md",
          "docs/zh-CN/source-generators/auto-ui-page-generator.md"
        ],
        "fallback": ["docs/zh-CN/game/index.md"]
      },
      "ai_libs": {
        "paths": ["ai-libs/CoreGrid/scenes", "ai-libs/CoreGrid/addons"],
        "search_hints": [
          "rg -n \"AutoScene|AutoUiPage|SceneRouter|UiRouter\" ai-libs/CoreGrid"
        ]
      }
    },
    "Godot": {
      "aliases": ["godot", "godot-runtime", "runtime-godot", "godot module"],
      "source_paths": ["GFramework.Godot"],
      "project_file": "GFramework.Godot/GFramework.Godot.csproj",
      "test_projects": ["GFramework.Godot.Tests/GFramework.Godot.Tests.csproj"],
      "readme_paths": ["GFramework.Godot/README.md"],
      "docs": {
        "landing": ["docs/zh-CN/godot/index.md"],
        "topics": [
          "docs/zh-CN/godot/architecture.md",
          "docs/zh-CN/godot/scene.md",
          "docs/zh-CN/godot/ui.md",
          "docs/zh-CN/godot/storage.md",
          "docs/zh-CN/godot/setting.md",
          "docs/zh-CN/godot/signal.md"
        ],
        "fallback": [
          "docs/zh-CN/source-generators/index.md",
          "docs/zh-CN/api-reference/index.md"
        ]
      },
      "ai_libs": {
        "paths": [
          "ai-libs/CoreGrid/project.godot",
          "ai-libs/CoreGrid/global",
          "ai-libs/CoreGrid/scenes"
        ],
        "search_hints": [
          "rg -n \"AutoLoad|InputActions|Node|Signal|PackedScene|Godot\" ai-libs/CoreGrid"
        ]
      }
    },
    "Godot.SourceGenerators": {
      "aliases": [
        "godot.sourcegenerators",
        "godot-source-generators",
        "godot source generators",
        "godot generators"
      ],
      "source_paths": ["GFramework.Godot.SourceGenerators"],
      "project_file": "GFramework.Godot.SourceGenerators/GFramework.Godot.SourceGenerators.csproj",
      "test_projects": ["GFramework.Godot.SourceGenerators.Tests/GFramework.Godot.SourceGenerators.Tests.csproj"],
      "readme_paths": ["GFramework.Godot.SourceGenerators/README.md"],
      "docs": {
        "landing": ["docs/zh-CN/source-generators/index.md"],
        "topics": [
          "docs/zh-CN/source-generators/godot-project-generator.md",
          "docs/zh-CN/source-generators/get-node-generator.md",
          "docs/zh-CN/source-generators/bind-node-signal-generator.md",
          "docs/zh-CN/source-generators/auto-register-exported-collections-generator.md"
        ],
        "fallback": ["docs/zh-CN/godot/index.md"]
      },
      "ai_libs": {
        "paths": [
          "ai-libs/CoreGrid/project.godot",
          "ai-libs/CoreGrid/global",
          "ai-libs/CoreGrid/scenes"
        ],
        "search_hints": [
          "rg -n \"GetNode|BindNodeSignal|AutoRegisterExported|project\\.godot|AutoLoad\" ai-libs/CoreGrid"
        ]
      }
    },
    "Godot.SourceGenerators.Abstractions": {
      "aliases": [
        "godot.sourcegenerators.abstractions",
        "godot-source-generators-abstractions",
        "godot source generators abstractions"
      ],
      "source_paths": ["GFramework.Godot.SourceGenerators.Abstractions"],
      "project_file": "GFramework.Godot.SourceGenerators.Abstractions/GFramework.Godot.SourceGenerators.Abstractions.csproj",
      "test_projects": ["GFramework.Godot.SourceGenerators.Tests/GFramework.Godot.SourceGenerators.Tests.csproj"],
      "readme_paths": [],
      "docs": {
        "landing": ["docs/zh-CN/source-generators/index.md"],
        "topics": [
          "docs/zh-CN/source-generators/godot-project-generator.md",
          "docs/zh-CN/source-generators/get-node-generator.md",
          "docs/zh-CN/source-generators/bind-node-signal-generator.md"
        ],
        "fallback": ["docs/zh-CN/godot/index.md"]
      },
      "ai_libs": {
        "paths": ["ai-libs/CoreGrid/project.godot", "ai-libs/CoreGrid/global"],
        "search_hints": [
          "rg -n \"GFramework\\.Godot\\.SourceGenerators\\.Abstractions|GetNode|BindNodeSignal\" ai-libs/CoreGrid"
        ]
      }
    },
    "Cqrs": {
      "aliases": ["cqrs", "mediator", "cqrs module"],
      "source_paths": ["GFramework.Cqrs"],
      "project_file": "GFramework.Cqrs/GFramework.Cqrs.csproj",
      "test_projects": ["GFramework.Cqrs.Tests/GFramework.Cqrs.Tests.csproj"],
      "readme_paths": ["GFramework.Cqrs/README.md"],
      "docs": {
        "landing": ["docs/zh-CN/core/cqrs.md"],
        "topics": [
          "docs/zh-CN/core/command.md",
          "docs/zh-CN/core/query.md",
          "docs/zh-CN/core/cqrs.md"
        ],
        "fallback": [
          "docs/zh-CN/core/index.md",
          "docs/zh-CN/api-reference/index.md"
        ]
      },
      "ai_libs": {
        "paths": ["ai-libs/CoreGrid/global", "ai-libs/CoreGrid/scripts"],
        "search_hints": [
          "rg -n \"CommandHandler|QueryHandler|RegisterCqrs|PipelineBehavior\" ai-libs/CoreGrid"
        ]
      }
    },
    "Cqrs.Abstractions": {
      "aliases": ["cqrs.abstractions", "cqrs-abstractions", "cqrs abstractions"],
      "source_paths": ["GFramework.Cqrs.Abstractions"],
      "project_file": "GFramework.Cqrs.Abstractions/GFramework.Cqrs.Abstractions.csproj",
      "test_projects": ["GFramework.Cqrs.Tests/GFramework.Cqrs.Tests.csproj"],
      "readme_paths": ["GFramework.Cqrs.Abstractions/README.md"],
      "docs": {
        "landing": ["docs/zh-CN/core/cqrs.md"],
        "topics": [
          "docs/zh-CN/core/command.md",
          "docs/zh-CN/core/query.md"
        ],
        "fallback": ["docs/zh-CN/api-reference/index.md"]
      },
      "ai_libs": {
        "paths": ["ai-libs/CoreGrid/global"],
        "search_hints": [
          "rg -n \"GFramework\\.Cqrs\\.Abstractions|ICommand|IQuery|IRequest\" ai-libs/CoreGrid"
        ]
      }
    },
    "Cqrs.SourceGenerators": {
      "aliases": [
        "cqrs.sourcegenerators",
        "cqrs-source-generators",
        "cqrs source generators"
      ],
      "source_paths": ["GFramework.Cqrs.SourceGenerators"],
      "project_file": "GFramework.Cqrs.SourceGenerators/GFramework.Cqrs.SourceGenerators.csproj",
      "test_projects": ["GFramework.SourceGenerators.Tests/GFramework.SourceGenerators.Tests.csproj"],
      "readme_paths": ["GFramework.Cqrs.SourceGenerators/README.md"],
      "docs": {
        "landing": ["docs/zh-CN/source-generators/index.md"],
        "topics": [],
        "fallback": [
          "docs/zh-CN/core/cqrs.md",
          "docs/zh-CN/api-reference/index.md"
        ]
      },
      "ai_libs": {
        "paths": ["ai-libs/CoreGrid/global"],
        "search_hints": [
          "rg -n \"GFramework\\.Cqrs\\.SourceGenerators|RequestHandler|PipelineBehavior\" ai-libs/CoreGrid"
        ]
      }
    },
    "Ecs.Arch": {
      "aliases": ["ecs.arch", "ecs-arch", "ecs arch", "ecs"],
      "source_paths": ["GFramework.Ecs.Arch"],
      "project_file": "GFramework.Ecs.Arch/GFramework.Ecs.Arch.csproj",
      "test_projects": ["GFramework.Ecs.Arch.Tests/GFramework.Ecs.Arch.Tests.csproj"],
      "readme_paths": ["GFramework.Ecs.Arch/README.md"],
      "docs": {
        "landing": ["docs/zh-CN/ecs/index.md"],
        "topics": ["docs/zh-CN/ecs/arch.md"],
        "fallback": [
          "docs/zh-CN/core/index.md",
          "docs/zh-CN/api-reference/index.md"
        ]
      },
      "ai_libs": {
        "paths": ["ai-libs/CoreGrid/scripts", "ai-libs/CoreGrid/global"],
        "search_hints": [
          "rg -n \"Arch\\.Core|World|SystemGroup|QueryDescription\" ai-libs/CoreGrid"
        ]
      }
    },
    "Ecs.Arch.Abstractions": {
      "aliases": [
        "ecs.arch.abstractions",
        "ecs-arch-abstractions",
        "ecs arch abstractions"
      ],
      "source_paths": ["GFramework.Ecs.Arch.Abstractions"],
      "project_file": "GFramework.Ecs.Arch.Abstractions/GFramework.Ecs.Arch.Abstractions.csproj",
      "test_projects": ["GFramework.Ecs.Arch.Tests/GFramework.Ecs.Arch.Tests.csproj"],
      "readme_paths": [],
      "docs": {
        "landing": ["docs/zh-CN/ecs/index.md"],
        "topics": ["docs/zh-CN/ecs/arch.md"],
        "fallback": ["docs/zh-CN/api-reference/index.md"]
      },
      "ai_libs": {
        "paths": ["ai-libs/CoreGrid/scripts"],
        "search_hints": [
          "rg -n \"GFramework\\.Ecs\\.Arch\\.Abstractions|IArchSystem|IArchModel\" ai-libs/CoreGrid"
        ]
      }
    },
    "SourceGenerators.Common": {
      "aliases": [
        "sourcegenerators.common",
        "source-generators-common",
        "source generators common
|
|
||||||
],
|
|
||||||
"source_paths": ["GFramework.SourceGenerators.Common"],
|
|
||||||
"project_file": "GFramework.SourceGenerators.Common/GFramework.SourceGenerators.Common.csproj",
|
|
||||||
"test_projects": ["GFramework.SourceGenerators.Tests/GFramework.SourceGenerators.Tests.csproj"],
|
|
||||||
"readme_paths": [],
|
|
||||||
"docs": {
|
|
||||||
"landing": ["docs/zh-CN/source-generators/index.md"],
|
|
||||||
"topics": [],
|
|
||||||
"fallback": ["docs/zh-CN/api-reference/index.md"]
|
|
||||||
},
|
|
||||||
"ai_libs": {
|
|
||||||
"paths": [],
|
|
||||||
"search_hints": []
|
|
||||||
}
|
|
||||||
}
|
|
||||||
},
|
|
||||||
"docs_section_aliases": {
|
|
||||||
"core": ["Core"],
|
|
||||||
"abstractions": ["Core.Abstractions", "Game.Abstractions", "Ecs.Arch.Abstractions"],
|
|
||||||
"game": ["Game"],
|
|
||||||
"godot": ["Godot"],
|
|
||||||
"cqrs": ["Cqrs"],
|
|
||||||
"ecs": ["Ecs.Arch"],
|
|
||||||
"source-generators": [
|
|
||||||
"Core.SourceGenerators",
|
|
||||||
"Game.SourceGenerators",
|
|
||||||
"Cqrs.SourceGenerators",
|
|
||||||
"Godot.SourceGenerators"
|
|
||||||
]
|
|
||||||
}
|
|
||||||
}
|
|
||||||
@ -1,121 +0,0 @@
#!/bin/bash
# Helper script for generating code examples
# Usage: generate-examples.sh <type> <class-name> [namespace]

set -e

TYPE="$1"        # class/interface/enum
CLASS_NAME="$2"
NAMESPACE="$3"

if [ -z "$TYPE" ] || [ -z "$CLASS_NAME" ]; then
    echo "Usage: $0 <type> <class-name> [namespace]"
    echo "Types: class, interface, enum"
    exit 1
fi

echo "=========================================="
echo "Code Example Generation Guide"
echo "=========================================="
echo "Type: $TYPE"
echo "Class: $CLASS_NAME"
if [ -n "$NAMESPACE" ]; then
    echo "Namespace: $NAMESPACE"
fi
echo ""

# Provide example-generation guidance by type
case "$TYPE" in
    class)
        echo "## Class example suggestions"
        echo ""
        echo "1. Basic usage:"
        echo "   - Instantiate the object"
        echo "   - Call the main methods"
        echo "   - Access public properties"
        echo ""
        echo "2. Common scenarios:"
        echo "   - Real-world use cases"
        echo "   - Integration with other components"
        echo ""
        echo "3. Advanced usage (where applicable):"
        echo "   - Complex configuration"
        echo "   - Extension and customization"
        echo ""
        echo "Example template:"
        echo '```csharp'
        echo "// Create an instance"
        echo "var instance = new $CLASS_NAME();"
        echo ""
        echo "// Use the main functionality"
        echo "instance.MainMethod();"
        echo '```'
        ;;
    interface)
        echo "## Interface example suggestions"
        echo ""
        echo "1. Implementing the interface:"
        echo "   - Show how to implement it"
        echo "   - Implement all required members"
        echo ""
        echo "2. Consuming the interface:"
        echo "   - Use instances through the interface type"
        echo "   - Dependency-injection scenarios"
        echo ""
        echo "Example template:"
        echo '```csharp'
        echo "// Implement the interface"
        echo "public class My$CLASS_NAME : $CLASS_NAME"
        echo "{"
        echo "    // Implement the members"
        echo "}"
        echo ""
        echo "// Use the interface"
        echo "$CLASS_NAME instance = new My$CLASS_NAME();"
        echo '```'
        ;;
    enum)
        echo "## Enum example suggestions"
        echo ""
        echo "1. Basic usage:"
        echo "   - Assign enum values"
        echo "   - Compare values"
        echo ""
        echo "2. Switch statements:"
        echo "   - Branch on the enum value"
        echo ""
        echo "3. Enum conversion:"
        echo "   - String to enum"
        echo "   - Enum to integer"
        echo ""
        echo "Example template:"
        echo '```csharp'
        echo "// Use an enum value"
        echo "var value = $CLASS_NAME.SomeValue;"
        echo ""
        echo "// Switch statement"
        echo "switch (value)"
        echo "{"
        echo "    case $CLASS_NAME.Value1:"
        echo "        // Handle this case"
        echo "        break;"
        echo "    case $CLASS_NAME.Value2:"
        echo "        // Handle this case"
        echo "        break;"
        echo "}"
        echo '```'
        ;;
    *)
        echo "Error: unsupported type: $TYPE"
        exit 1
        ;;
esac

echo ""
echo "Notes:"
echo "- Follow the project's naming conventions"
echo "- Include the necessary using directives"
echo "- Make sure the example code compiles and runs"
echo "- Match the code style of existing tutorials"

exit 0
@ -1,48 +0,0 @@
#!/bin/bash
# Parse C# XML documentation comments
# Usage: parse-csharp-xml.sh <path-to-C#-file>

set -e

FILE_PATH="$1"

if [ -z "$FILE_PATH" ]; then
    echo "Usage: $0 <path-to-C#-file>"
    exit 1
fi

if [ ! -f "$FILE_PATH" ]; then
    echo "Error: file not found: $FILE_PATH"
    exit 1
fi

echo "Parsing C# XML documentation comments: $FILE_PATH"

# Extract <summary> content
echo "=== Summary ==="
grep -A 5 "/// <summary>" "$FILE_PATH" | grep "///" | sed 's/.*\/\/\/[[:space:]]*//' | sed 's/<summary>//g' | sed 's/<\/summary>//g' || echo "No summary found"

# Extract <param> content
echo ""
echo "=== Parameters ==="
grep "/// <param" "$FILE_PATH" | sed 's/.*\/\/\/[[:space:]]*//' || echo "No param found"

# Extract <returns> content
echo ""
echo "=== Returns ==="
grep "/// <returns>" "$FILE_PATH" | sed 's/.*\/\/\/[[:space:]]*//' | sed 's/<returns>//g' | sed 's/<\/returns>//g' || echo "No returns found"

# Extract <exception> content
echo ""
echo "=== Exceptions ==="
grep "/// <exception" "$FILE_PATH" | sed 's/.*\/\/\/[[:space:]]*//' || echo "No exception found"

# Extract <example> content
echo ""
echo "=== Examples ==="
grep -A 10 "/// <example>" "$FILE_PATH" | grep "///" | sed 's/.*\/\/\/[[:space:]]*//' || echo "No example found"

echo ""
echo "Parsing complete"

exit 0
@ -1,57 +0,0 @@
#!/bin/bash
# Update the VitePress sidebar configuration
# Usage: update-vitepress-nav.sh <doc-path> <doc-title>

set -e

DOC_PATH="$1"
DOC_TITLE="$2"
CONFIG_FILE="docs/.vitepress/config.mts"

if [ -z "$DOC_PATH" ] || [ -z "$DOC_TITLE" ]; then
    echo "Usage: $0 <doc-path> <doc-title>"
    echo "Example: $0 /zh-CN/api-reference/core/architecture Architecture"
    exit 1
fi

if [ ! -f "$CONFIG_FILE" ]; then
    echo "Error: VitePress config file not found: $CONFIG_FILE"
    exit 1
fi

# Extract the module name (core/game/godot/source-generators)
MODULE=$(echo "$DOC_PATH" | grep -oP '(?<=/zh-CN/)[^/]+' | head -1)

if [ -z "$MODULE" ]; then
    echo "Error: cannot extract a module name from the path: $DOC_PATH"
    exit 1
fi

echo "=========================================="
echo "VitePress Navigation Config Update"
echo "=========================================="
echo "Module: $MODULE"
echo "Path: $DOC_PATH"
echo "Title: $DOC_TITLE"
echo ""

# Check whether the path already exists in the config file
if grep -q "link: '$DOC_PATH'" "$CONFIG_FILE"; then
    echo "✓ This document already exists in the navigation config"
    exit 0
fi

echo "Note: a new entry needs to be added to the VitePress config"
echo ""
echo "Suggested config entry:"
echo "{ text: '$DOC_TITLE', link: '$DOC_PATH' }"
echo ""
echo "Update the config in one of the following ways:"
echo "1. Let the AI edit $CONFIG_FILE directly with the Edit tool"
echo "2. Manually edit the config file and add the entry above to the matching sidebar section"
echo ""

# Print the related sidebar config path
echo "Related sidebar config path: '/zh-CN/$MODULE/'"

exit 0
@ -1,207 +0,0 @@
---
name: gframework-batch-boot
description: Repository-specific bulk-task workflow for the GFramework repo. Use when Codex should start from the normal GFramework boot context and then continue a repetitive or large-scope task in automatic batches without waiting for manual round-by-round prompts, especially for analyzer warning cleanup, repetitive test refactors, documentation waves, or similar multi-file work with an explicit stop condition such as changed-file count, warning count, or timebox.
---

# GFramework Batch Boot

## Overview

Use this skill when `gframework-boot` is necessary but not sufficient because the task should keep advancing in bounded
batches until a clear stop condition is met.

Treat `AGENTS.md` as the source of truth. This skill extends `gframework-boot`; it does not replace it.
If the task's defining requirement is that the main agent must keep acting as dispatcher, reviewer, `ai-plan` owner,
and final integrator for multiple parallel workers, prefer `gframework-multi-agent-batch` and use this skill's stop
condition guidance as a secondary reference.

Context budget is a first-class stop signal. Do not keep batching merely because a file-count threshold still has
headroom if the active conversation, loaded repo artifacts, validation output, and pending recovery updates suggest the
agent is approaching its safe working-context limit.

## Startup Workflow

1. Execute the normal `gframework-boot` startup sequence first:
   - read `AGENTS.md`
   - read `.ai/environment/tools.ai.yaml`
   - read `ai-plan/public/README.md`
   - read the mapped active topic `todos/` and `traces/`
2. Classify the task as a batch candidate only if all of the following are true:
   - the work is repetitive, sliceable, or likely to require multiple similar iterations
   - each batch can be given an explicit ownership boundary
   - a stop condition can be measured locally
   - the task does not primarily need the orchestration-heavy main-agent workflow captured by `gframework-multi-agent-batch`
3. Before any delegation, define the batch objective in one sentence:
   - warning family reduction
   - repeated test refactor pattern
   - module-by-module documentation refresh
   - other repetitive multi-file cleanup
4. Before the first implementation batch, estimate whether the current task is likely to stay below roughly 80% of the
   agent's safe working-context budget through one more full batch cycle:
   - include already loaded `AGENTS.md`, skills, `ai-plan` files, recent command output, active diffs, and expected validation output
   - if another batch would probably push the conversation near the limit, plan to stop after the current batch even if
     branch-size thresholds still have room

## Baseline Selection

When the stop condition depends on branch size or changed-file count, choose the baseline carefully.

1. Prefer the freshest remote-tracking reference that already exists locally:
   - `origin/main`
   - or the mapped upstream base branch for the current topic
2. Do not default to local `main` when `refs/heads/main` is behind `refs/remotes/origin/main`.
3. If both local and remote-tracking refs exist, report:
   - ref name
   - short SHA
   - committer date
4. If only a local branch exists, state that the baseline may be stale before using it.
5. When the task is tied to a PR or topic branch rather than `main`, prefer that explicit upstream comparison target over
   a generic `main`.
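The baseline report in steps 1–4 above can be sketched as a small shell loop. This is a minimal sketch, assuming `git` is on the `PATH`; the throwaway repository it creates stands in for the real checkout, and in practice you would point the same `for-each-ref` call at the actual `refs/heads/main` and `refs/remotes/origin/main` of the working repo.

```shell
# Report each candidate baseline ref with its short SHA and committer date,
# or flag it as missing, so the freshest remote-tracking ref can be chosen.
set -eu

# Throwaway repo standing in for the real checkout (illustrative only).
repo=$(mktemp -d)
git -C "$repo" init -q
git -C "$repo" symbolic-ref HEAD refs/heads/main
git -C "$repo" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "baseline demo commit"

report=""
for ref in refs/heads/main refs/remotes/origin/main; do
    if git -C "$repo" show-ref --verify --quiet "$ref"; then
        # Ref exists locally: report name, short SHA, committer date.
        line=$(git -C "$repo" for-each-ref \
            --format='%(refname:short) %(objectname:short) %(committerdate:iso8601)' \
            "$ref")
    else
        # Ref is absent: say so instead of silently falling back.
        line="missing: $ref"
    fi
    report="${report}${line}
"
done
printf '%s' "$report"
```

In the demo repo this prints one `main <sha> <date>` line and flags `refs/remotes/origin/main` as missing, which is exactly the "only a local branch exists, so the baseline may be stale" case from step 4.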
For changed-file limits, measure branch-wide scope against the chosen baseline, not just the current working tree:

- use `git diff --name-only <baseline>...HEAD`
- do not confuse branch diff size with `git status --short`

For changed-line limits, also measure branch-wide scope against the chosen baseline:

- prefer `git diff --numstat <baseline>...HEAD`
- treat "changed lines" as `added + deleted` summed across the branch diff
- do not use working-tree-only line counts as a substitute for branch-wide scope
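Both measurements above can be computed in a few lines of shell. This is a minimal sketch, assuming `git` and `awk` are available; it builds a tiny throwaway repo with a known topic-branch diff so the numbers are checkable, and in the real repo `baseline` would be `origin/main` or the mapped upstream rather than the local `main` used here.

```shell
set -eu

# Throwaway repo with a known diff: base commit on main, then a topic branch.
repo=$(mktemp -d)
git -C "$repo" init -q
git -C "$repo" symbolic-ref HEAD refs/heads/main
printf 'a\n' > "$repo/one.txt"
git -C "$repo" add one.txt
git -C "$repo" -c user.name=demo -c user.email=demo@example.com commit -q -m base

# Topic branch: one modified file (+1 line) and one new file (+2 lines).
git -C "$repo" checkout -q -b topic
printf 'a\nb\n' > "$repo/one.txt"
printf 'x\ny\n' > "$repo/two.txt"
git -C "$repo" add one.txt two.txt
git -C "$repo" -c user.name=demo -c user.email=demo@example.com commit -q -m work

baseline=main  # in the real repo: origin/main or the mapped upstream

# Changed files: branch-wide, via --name-only against the merge base.
changed_files=$(git -C "$repo" diff --name-only "${baseline}...HEAD" | wc -l)

# Changed lines: added + deleted summed across the branch diff via --numstat.
changed_lines=$(git -C "$repo" diff --numstat "${baseline}...HEAD" \
    | awk '{ added += $1; deleted += $2 } END { print added + deleted }')

echo "files=$changed_files lines=$changed_lines"
```

For the demo diff this reports 2 changed files and 3 changed lines; note the triple-dot `...` range, which compares against the merge base rather than the baseline tip.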
For shorthand numeric thresholds, use a fixed default baseline:

- compare the current branch's cumulative diff against remote `origin/main`
- include all commits reachable from `HEAD` that are not already in `origin/main`
- do not reinterpret shorthand thresholds as "this batch only" or "current unstaged changes only"
- only use another baseline when the user explicitly names it in the prompt

## Stop Conditions

Choose one primary stop condition before the first batch and restate it to the user.

When the user does not explicitly override the priority order, use:

1. context-budget safety
2. semantic batch boundary / reviewability
3. the user-requested local metric such as files, lines, warnings, or time

Common stop conditions:

- the next batch would likely push the agent above roughly 80% of its safe working-context budget
- branch diff vs baseline approaches a file-count threshold
- warnings-only build reaches a target count
- a specific hotspot list is exhausted
- a timebox or validation budget is reached

If multiple stop conditions exist, rank them and treat one as primary.

Treat file-count or line-count thresholds as coarse repository-scope signals, not as a proxy for AI context health.
When they disagree with context-budget safety, context-budget safety wins.

## Shorthand Stop-Condition Syntax

`gframework-batch-boot` may be invoked with shorthand numeric thresholds when the user clearly wants a branch-size stop
condition instead of a long natural-language prompt.

Interpret shorthand as follows:

- `$gframework-batch-boot 75`
  - means: stop when the current branch's cumulative diff vs remote `origin/main` approaches `75` changed files
- `$gframework-batch-boot 75 2000`
  - means: stop when the current branch's cumulative diff vs remote `origin/main` approaches `75` changed files OR
    `2000` changed lines
  - default positional meaning is `<files> <lines>`
- `$gframework-batch-boot 75 | 2000`
  - may be interpreted as the same OR shorthand in plain-language chat
  - when restating, planning, or documenting the command, normalize it to `$gframework-batch-boot 75 2000`
  - prefer the no-pipe form because `|` is easy to confuse with a shell pipeline

When shorthand is used:

- report the resolved thresholds explicitly before the first batch
- report that the baseline is remote `origin/main`, unless the user explicitly overrides it
- if two numeric thresholds are present, treat file count as the default primary metric for status reporting unless the
  user says otherwise
- stop when either threshold is reached or exceeded, even if the other threshold still has headroom
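The OR semantics above can be sketched as a tiny decision helper. The `should_stop` function below is an illustrative assumption for this sketch, not a shipped GFramework tool: it takes the current file and line metrics plus the shorthand thresholds (`<files>` required, `<lines>` optional) and reports whether either threshold has been reached.

```shell
set -eu

# should_stop <files_now> <lines_now> <files_limit> [lines_limit]
# Returns 0 (stop) when either declared threshold is reached or exceeded.
should_stop() {
    files_now=$1; lines_now=$2; files_limit=$3; lines_limit=${4:-}
    if [ "$files_now" -ge "$files_limit" ]; then
        return 0
    fi
    # The line threshold only exists in the two-number shorthand form.
    if [ -n "$lines_limit" ] && [ "$lines_now" -ge "$lines_limit" ]; then
        return 0
    fi
    return 1
}

# `/gframework-batch-boot 75`: files-only threshold.
should_stop 80 500 75 && echo "stop: file threshold reached"

# `/gframework-batch-boot 75 2000`: stop when EITHER threshold is hit.
should_stop 40 2500 75 2000 && echo "stop: line threshold reached"
should_stop 40 500 75 2000 || echo "continue: both thresholds have headroom"
```

Note that a batch continues only when every declared threshold still has headroom, which matches the "stop when either threshold is reached or exceeded" rule above.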
## Batch Loop

1. Inspect the current state before the first batch:
   - current branch and active topic
   - selected baseline
   - current stop-condition metric
   - current context-budget posture and whether one more batch is safe
   - next candidate slices
2. Keep the critical path local.
3. Delegate only bounded slices with explicit ownership:
   - one file
   - one warning family within one project
   - one module documentation wave
4. For each worker batch, specify:
   - objective
   - owned files or subsystem
   - required validation commands
   - output format
   - reminder that other agents may be editing the repo
5. While workers run, use the main thread for non-overlapping tasks:
   - queue the next candidate slice
   - inspect the next hotspot
   - recompute branch size or warning distribution
6. After each completed batch:
   - integrate or verify the result
   - rerun the required validation
   - recompute the primary stop-condition metric
   - reassess whether one more batch would likely push the agent near or beyond roughly 80% context usage
   - decide immediately whether to continue or stop
7. Do not require the user to manually trigger every round unless:
   - the next slice is ambiguous
   - a validation failure changes strategy
   - the batch objective conflicts with the active topic

## Task Tracking

For multi-batch work, keep recovery artifacts current.

- Update the active `ai-plan/public/<topic>/todos/` document when a meaningful batch lands.
- Update the matching `traces/` document with:
  - accepted delegated scope
  - validation milestones
  - current stop-condition metric
  - next recommended batch
- Keep the active recovery point concise; archive detailed history when it starts to sprawl.

## Delegation Defaults

- Prefer `worker` subagents for independent write slices.
- Prefer `explorer` subagents for read-only hotspot ranking or next-batch discovery.
- Keep each worker ownership boundary disjoint.
- Avoid launching a new batch when the expected write set would push the branch beyond the declared threshold without a
  deliberate decision.

## Completion

Stop the loop when any of the following becomes true:

- the next batch would likely push the agent near or beyond roughly 80% of its safe working-context budget
- the primary stop condition has been reached or exceeded
- the remaining slices are no longer low-risk
- validation failures indicate the task is no longer repetitive
- the branch has grown large enough that reviewability would materially degrade

When stopping, report:

- whether context budget was the deciding factor
- which baseline was used
- the exact metric value at stop time
- completed batches
- remaining candidate batches
- whether further work should continue in a new turn or after rebasing/fetching

## Example Triggers

- `Use $gframework-batch-boot 75 to keep reducing analyzer warnings until the branch diff vs baseline approaches 75 files.`
- `Use $gframework-batch-boot 75 2000 to keep reducing warnings until the branch diff approaches 75 files or 2000 changed lines.`
- `Use $gframework-batch-boot and keep reducing analyzer warnings until the branch diff vs origin/main approaches 75 files.`
- `Use $gframework-batch-boot to continue this repetitive test refactor in bounded batches until the warning count drops below 10.`
- `Use $gframework-batch-boot and refresh module docs in waves without asking me to trigger every round.`
@ -1,4 +0,0 @@
interface:
  display_name: "GFramework Batch Boot"
  short_description: "Run boot, then iterate bounded bulk batches"
  default_prompt: "Use $gframework-batch-boot to start from the normal GFramework boot context and continue the current repetitive task in automatic bounded batches until the declared stop condition is reached."
@ -1,92 +0,0 @@
---
name: gframework-boot
description: Repository-specific boot workflow for the GFramework repo. Use when Codex needs to start or resume work in this repository from short prompts such as "boot", "continue", "read AGENTS", or "start the next step"; when the user expects Codex to first read AGENTS.md, .ai/environment/tools.ai.yaml, and public ai-plan tracking files; or when Codex should assess task complexity, decide whether explorer or worker subagents are warranted, and then proceed under the repository's workflow rules.
---

# GFramework Boot

## Overview

Use this skill to bootstrap work in the GFramework repository with minimal user prompting.
Treat `AGENTS.md` as the source of truth. Use this skill to enforce a startup sequence, not to replace repository rules.
If the task clearly requires the main agent to keep coordinating multiple parallel subagents while maintaining
`ai-plan` and reviewing each result, switch to `gframework-multi-agent-batch` after the boot context is established.

## Startup Workflow

1. Read `AGENTS.md` before choosing tools, planning edits, or delegating work.
2. Read `.ai/environment/tools.ai.yaml` to confirm the preferred local toolchain.
3. Read `ai-plan/public/README.md` before asking the user for missing context.
4. If `ai-plan/public/README.md` maps the current branch or worktree to active topics, inspect those topics'
   `todos/` and `traces/` directories in listed priority order.
5. If no mapping exists, scan `ai-plan/public/<topic>/todos/` and `ai-plan/public/<topic>/traces/` across active
   topics, and ignore `ai-plan/public/archive/` unless the user explicitly asks for historical context.
6. Treat `ai-plan/public/<topic>/archive/` as secondary context even for active topics; only read it when the active
   todo/trace files point there or when the user explicitly asks for historical detail.
7. If `ai-plan/private/<branch-or-worktree>/` exists and is relevant, treat it as private recovery context for the
   current worktree only and do not assume it should be committed.
8. Classify the task state:
   - `new`: no matching recovery document exists, or the user is clearly starting fresh work
   - `resume`: a matching todo or trace exists and the user is continuing that thread
   - `recovery`: prior work looks partial, interrupted, or ambiguous and the next safe recovery point must be reconstructed
9. Choose the best matching `ai-plan` artifacts:
   - Prefer topics explicitly mapped from `ai-plan/public/README.md`
   - Prefer path names or headings that match the user's task wording
   - Break ties by most recently updated trace or todo
   - If ambiguity would materially change implementation, summarize the candidates and ask one concise question
10. Classify the task complexity before deciding on subagents:
    - `simple`: one concern, one file or module, no parallel discovery required
    - `medium`: a small number of modules, some read-only exploration helpful, critical path still easy to keep local
    - `complex`: cross-module design, migration, large refactor, or work likely to exceed one context window
11. Estimate the current context-budget posture before substantive execution:
    - account for loaded startup artifacts, active `ai-plan` files, visible diffs, open validation output, and likely next-step output volume
    - if the task already appears near roughly 80% of a safe working-context budget, prefer closing the current batch,
      refreshing recovery artifacts, and stopping at the next natural semantic boundary instead of starting a fresh broad slice
12. Apply the delegation policy from `AGENTS.md`:
    - Keep the critical path local
    - Use `explorer` with `gpt-5.1-codex-mini` for narrow read-only questions, tracing, inventory, and comparisons
    - Use `worker` with `gpt-5.4` only for bounded implementation tasks with explicit ownership
    - Do not delegate purely for ceremony; delegate only when it materially shortens the task or controls context growth
    - If the user explicitly wants the main agent to keep orchestrating multiple workers through several review/integration
      cycles, prefer `gframework-multi-agent-batch` over ad-hoc delegation
13. Before editing files, tell the user what you read, how you classified the task, whether subagents will be used,
    and the first implementation step.
14. Proceed with execution, validation, and documentation updates required by `AGENTS.md`.

## Task Tracking

For multi-step, cross-module, or interruption-prone work, maintain the repository recovery artifacts instead of keeping state only in chat.

- Update `ai-plan/public/README.md` whenever the active topic set or worktree mapping changes.
- Update the active public document under `ai-plan/public/<topic>/todos/` with completed work, validation results,
  risks, and the next recovery point.
- Update the matching public trace under `ai-plan/public/<topic>/traces/` with key decisions, delegated scope, and the
  immediate next step.
- Keep the active todo/trace files concise enough for `boot` to use as default entrypoints. When completed, validated
  stages start piling up, move their detailed history into `ai-plan/public/<topic>/archive/` and leave archive
  pointers in the active files.
- Move stage-complete artifacts into `ai-plan/public/<topic>/archive/`, and move completed topics into
  `ai-plan/public/archive/<topic>/` so `boot` does not keep reloading stale context.
- Keep worktree-private scratch recovery files under `ai-plan/private/` and do not treat them as commit targets.
- Never write secrets, machine-specific paths, or other sensitive environment details into any `ai-plan/**` artifact.
- If the task is clearly complex and no recovery artifact exists yet, create one before substantive edits.

## Recovery Heuristics

- If the user says `next step`, `continue`, `继续`, or similar resume language, read `ai-plan/public/README.md`
  first, then search the mapped active topics before scanning the broader public area.
- If the current branch and the mapped active topics describe the same feature area, prefer resuming those topics first.
- If the repository state suggests in-flight work but no recovery document matches, reconstruct the safest next step from code, tests, and Git state before asking the user for clarification.
- If the current turn already carries heavy recovery context, broad diffs, or long validation output, prefer a
  recovery-point update and a clean stop over starting another large slice just because the code task itself remains open.

## Example Triggers

- `boot`
- `Use $gframework-boot and continue the current task`
- `Read AGENTS and public ai-plan, then start the next step`
- `继续当前任务,先看 AGENTS.md 和 public ai-plan`

## References

Read `references/startup-artifacts.md` when you need a quick reminder of the repository entrypoints, task-state heuristics, or delegation defaults without re-reading the entire skill.
@ -1,4 +0,0 @@
interface:
  display_name: "GFramework Boot"
  short_description: "Bootstrap GFramework repository tasks"
  default_prompt: "Use $gframework-boot to start or resume work in this GFramework repository."
@ -1,38 +0,0 @@
# Startup Artifacts

## Required Reads

- `AGENTS.md`
- `.ai/environment/tools.ai.yaml`
- `ai-plan/public/README.md`
- the selected `ai-plan/public/<topic>/todos/` directories
- the selected `ai-plan/public/<topic>/traces/` directories

## AI-Plan Selection Heuristics

- Match the current branch or worktree against `ai-plan/public/README.md` first.
- If the index maps the current worktree to topics, inspect those topics in listed order before scanning anything else.
- Match the user's wording against public todo and trace file names next.
- Prefer the newest matching trace when several candidates describe the same feature area.
- If one file records a clearer recovery point than a newer but vague file, prefer the clearer recovery point.
- Ignore `ai-plan/public/archive/**` unless the user explicitly requests historical recovery context.
- Even inside an active topic, prefer the root `todos/` and `traces/` entry files first; only read `archive/` when the
  active files point there or when the user asks for historical detail.
- If a matching `ai-plan/private/<branch-or-worktree>/` directory exists, use it only as private context for the current worktree.

## Complexity Defaults

- `simple`: keep everything local, no subagent
- `medium`: keep design local, optionally use one `explorer` for parallel read-only discovery
- `complex`: keep architecture and integration local, delegate only bounded non-blocking subtasks

## Model Defaults

- `explorer`: `gpt-5.1-codex-mini`
- `worker`: `gpt-5.4`

## Startup Summary Template

Use a short update before execution:

`Read AGENTS.md, the environment inventory, ai-plan/public/README.md, and the relevant public ai-plan artifacts. This looks like a <task-state> <complexity> task. I will <delegate-or-not> and start with <first-step>.`
@ -1,218 +0,0 @@
|
|||||||
---
|
|
||||||
name: gframework-doc-refresh
|
|
||||||
description: "Refresh or reassess GFramework documentation for a source module such as Core, Game, Godot, Cqrs, Ecs.Arch, or their generator/abstraction packages. Use this when the user asks to update module docs, re-evaluate landing pages, fix outdated topic pages, refresh API reference coverage, verify adoption paths against source/tests/README, or compare current docs with ai-libs consumer wiring. Recommended command: /gframework-doc-refresh <module>."
|
|
||||||
---
|
|
||||||
|
|
||||||
# Purpose
|
|
||||||
|
|
||||||
Use this skill to refresh GFramework documentation from source-first evidence.
|
|
||||||
|
|
||||||
The public entry is module-driven, not doc-type-driven:
|
|
||||||
|
|
||||||
- Input: a source module or a resolvable docs section alias
|
|
||||||
- Output: the minimal documentation update set needed for that module
|
|
||||||
- Evidence: code, tests, README, current docs, then `ai-libs/`
|
|
||||||
|
|
||||||
Do not start by deciding “this is an API doc task” or “this is a tutorial task”.
|
|
||||||
Decide that only after the module scan.
|
|
||||||
|
|
||||||
# Triggers
|
|
||||||
|
|
||||||
Use this skill when the user asks things like:
|
|
||||||
|
|
||||||
- `refresh docs for Core`
|
|
||||||
- `update Game module docs`
|
|
||||||
- `根据 Godot 模块源码刷新文档`
|
|
||||||
- `重新评估 Cqrs 模块文档并更新`
|
|
||||||
- `核对 Godot.SourceGenerators 的文档状态`
|
|
||||||
- `看看 source-generators 栏目哪些页面已经失真`
|
|
||||||
|
|
||||||
Recommended command form:
|
|
||||||
|
|
||||||
```bash
|
|
||||||
/gframework-doc-refresh <module>
|
|
||||||
```
|
|
||||||
|
|
||||||
# Supported Modules
|
|
||||||
|
|
||||||
Canonical module names:
|
|
||||||
|
|
||||||
- `Core`
|
|
||||||
- `Core.Abstractions`
|
|
||||||
- `Core.SourceGenerators`
|
|
||||||
- `Core.SourceGenerators.Abstractions`
|
|
||||||
- `Game`
|
|
||||||
- `Game.Abstractions`
|
|
||||||
- `Game.SourceGenerators`
|
|
||||||
- `Godot`
|
|
||||||
- `Godot.SourceGenerators`
|
|
||||||
- `Godot.SourceGenerators.Abstractions`
|
|
||||||
- `Cqrs`
|
|
||||||
- `Cqrs.Abstractions`
|
|
||||||
- `Cqrs.SourceGenerators`
|
|
||||||
- `Ecs.Arch`
|
|
||||||
- `Ecs.Arch.Abstractions`
|
|
||||||
- `SourceGenerators.Common`
|
|
||||||
|
|
||||||
The canonical mapping lives in `.agents/skills/_shared/module-map.json`.
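As an illustration of how that map is consumed, the sketch below resolves a raw user input against a miniature, invented map. The key names (`modules`, `aliases`, `docs_section_aliases`) are assumptions taken from the way `scan_module_evidence.py` reads the file; the sample entries are not the real map.

```python
# Illustrative sketch only: the real schema lives in module-map.json, and the
# sample entries below are invented for demonstration.
MODULE_MAP = {
    "modules": {
        "Core.Abstractions": {"aliases": ["core-abstractions"]},
        "Godot.SourceGenerators": {"aliases": ["godot generators"]},
        "Ecs.Arch": {"aliases": ["ecs"]},
    },
    "docs_section_aliases": {
        # A docs section may map to several modules, which is ambiguous.
        "source-generators": ["Core.SourceGenerators", "Godot.SourceGenerators"],
    },
}


def normalize_key(value: str) -> str:
    """Lower-case and unify separators, mirroring the scan script."""
    return value.strip().lower().replace("_", "-").replace(" ", "-")


def resolve(raw: str) -> dict:
    normalized = normalize_key(raw)
    # Canonical names and their aliases resolve directly.
    for name, config in MODULE_MAP["modules"].items():
        names = {normalize_key(name)} | {normalize_key(a) for a in config["aliases"]}
        if normalized in names:
            return {"status": "ok", "module": name}
    # Docs section names resolve only when they map to exactly one module.
    candidates = MODULE_MAP["docs_section_aliases"].get(normalized, [])
    if len(candidates) == 1:
        return {"status": "ok", "module": candidates[0]}
    if candidates:
        return {"status": "ambiguous", "candidates": candidates}
    return {"status": "unknown", "candidates": []}
```

The ambiguous branch is the one that matters operationally: it is what forces the skill to stop and ask instead of guessing a module.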

If the user supplies a docs section name:

- resolve it back to a source module first
- if it maps to multiple modules, stop at normalization guidance and do not draft docs yet

# Workflow

## 1. Normalize the input

Run:

```bash
python3 .agents/skills/gframework-doc-refresh/scripts/scan_module_evidence.py <module>
```

The script normalizes aliases, reports ambiguity, and prints the module's evidence surface.

If the result is ambiguous:

- return the candidate modules
- ask the user to pick the intended source module
- do not continue into document generation
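The branching above can be sketched as a small dispatcher over the script's `--json` output. This is a sketch under assumptions: the payload shapes (`status` of `"ok"`, `"ambiguous"`, or `"unknown"`, plus `module` or `candidates`) mirror what the scan script emits, but the function itself is illustrative, not part of the skill.

```python
# Sketch of the post-normalization branching, assuming the JSON shapes emitted
# by scan_module_evidence.py --json: success carries "status": "ok" and a
# "module"; failures carry "status" ("ambiguous" or "unknown") and "candidates".
def next_action(resolution: dict) -> str:
    status = resolution.get("status")
    if status == "ok":
        return f"scan evidence for {resolution['module']}"
    if status == "ambiguous":
        options = ", ".join(resolution.get("candidates", []))
        return f"ask the user to choose one of: {options}"
    # "unknown" or anything unexpected: stop rather than guess a module.
    return "report that no module matched and stop"
```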

## 2. Scan the evidence surface

For the resolved module, inspect:

- source directories
- `*.csproj`
- relevant test projects
- sibling `README.md`
- mapped `docs/zh-CN` landing pages and topic pages
- optional `ai-libs/` consumer evidence when needed

Always confirm the actual files in the repository.
Do not assume the mapping is enough on its own.

## 3. Decide whether `ai-libs/` is needed

Use `ai-libs/` when:

- adoption path is unclear from source and README alone
- extension points need a real consumer wiring example
- current docs have concepts but lack an end-to-end integration path

Do not rely on `ai-libs/` for:

- public API contract definitions
- generator diagnostics or semantic guarantees
- claims about what the current version officially supports

If `ai-libs/` conflicts with current source or tests, keep source/tests as the contract and document the migration boundary.

## 4. Judge the documentation state

Classify the module into one or more of these states:

- missing landing page
- stale landing page
- stale topic page
- missing or stale API reference coverage
- stale tutorial/example
- validation-only

Base this on evidence, not on the previous docs shape.

## 5. Choose the output set

Always prioritize:

1. README / landing page / adoption path
2. topic pages
3. API reference
4. tutorials

If the module only needs validation or relinking, do not generate extra pages.

## 6. Draft or update docs

Load only the template that matches the output you selected:

- `templates/module-landing.md`
- `templates/topic-refresh.md`
- `templates/api-reference.md`
- `templates/tutorial.md`

Keep examples minimal, current, and traceable to source or tests.

## 7. Validate

Run the internal validators as needed:

```bash
bash .agents/skills/gframework-doc-refresh/scripts/validate-all.sh <file-or-directory>
```

For site-level confirmation after doc edits:

```bash
cd docs && bun run build
```

# Evidence Order

Use this exact priority:

1. source code, XML docs, `*.csproj`
2. tests and snapshots
3. module `README.md`
4. current `docs/zh-CN` pages
5. verified `ai-libs/` consumers
6. archived docs only as fallback context

# Output Rules

- Prefer correcting the adoption path over expanding page count.
- Do not copy wording from outdated docs just to keep page volume.
- Public docs must stay reader-facing. Do not write inventory, coverage baseline, recovery-point, batch-metric, review backlog, or audit-wave wording into `README.md` or `docs/**`.
- Use neutral, destination-first section names and link labels. Do not expose raw filenames or paths such as `game/index.md`, `README.md`, or `../core/cqrs.md` as visible reader-facing labels when a semantic label is available.
- Do not use rhetorical or conversational headings in public docs, such as “你真正会用到的公开入口”、“先理解包关系” or “想看……转到……”. Prefer direct labels such as “公开入口”、“模块与包关系” and “相关主题”.
- Keep public docs out of internal product-decision tone. Do not publish repository-governance wording such as “当前阶段的结论”、“不建议立即启动” or audience-maintainer tradeoff discussions unless the page itself is a public adoption guide and the wording has been rewritten as reader-facing suitability guidance.
- If XML or audit evidence is relevant, translate it into reader guidance such as “which types to inspect first” or “which entry points define the contract”, instead of exposing counts, dates, or governance status.
- Escape generics outside code blocks.
- Keep internal links real and current.
- Mark code blocks with explicit languages.
- Use the smallest example that demonstrates the current contract.
- Consumer examples may align with `ai-libs/`, but must not exceed the current module contract.

# Validation

Use the shared standards in `.agents/skills/_shared/DOCUMENTATION_STANDARDS.md`.

When this skill changes public docs, prefer:

1. focused validator on touched pages
2. `cd docs && bun run build`

When this skill changes the skill system itself:

1. validate `SKILL.md` frontmatter exists
2. run the module scan script for representative modules
3. confirm obsolete `vitepress-*` public entries are gone

# References

Read these only when needed:

- `.agents/skills/_shared/DOCUMENTATION_STANDARDS.md`
- `.agents/skills/_shared/module-map.json`
- `references/module-selection.md`
- `references/evidence-and-ai-libs.md`
- `references/output-strategy.md`

@ -1,4 +0,0 @@
interface:
display_name: "GFramework Doc Refresh"
short_description: "Refresh module docs from code-first evidence"
default_prompt: "Use $gframework-doc-refresh to refresh a GFramework module's docs from source, tests, README, current docs, and ai-libs evidence."

@ -1,35 +0,0 @@
# Evidence And `ai-libs`

The evidence order is fixed:

1. source code, XML docs, `*.csproj`
2. tests and snapshots
3. module README
4. current `docs/zh-CN`
5. `ai-libs/`
6. archived docs

## When To Use `ai-libs`

Use `ai-libs/` to answer questions like:

- How is this extension point wired in a real project?
- What does the minimal project layout look like?
- Which project-side files need to exist for this module to work end to end?

## When Not To Use `ai-libs`

Do not use `ai-libs/` as the primary source for:

- public API semantics
- exact generator output guarantees
- supported package matrix
- diagnostics behavior

## Conflict Rule

If `ai-libs/` drifts from the current repo:

- trust source and tests
- mention the drift as a compatibility or migration note
- do not document old consumer behavior as if it were still the contract

@ -1,29 +0,0 @@
# Module Selection

Use `.agents/skills/_shared/module-map.json` as the canonical source for:

- supported modules
- aliases
- source paths
- test projects
- README paths
- docs landing/topic/fallback pages
- `ai-libs/` reference roots

Selection rules:

1. Prefer explicit canonical module names.
2. Resolve docs section aliases back to source modules before scanning docs.
3. If an alias maps to multiple modules, stop and return the candidate list.
4. If a module has no dedicated docs section, fall back to the nearest existing landing page or API index instead of inventing a fake section.

Representative ambiguous inputs:

- `source-generators` -> likely one of `Core.SourceGenerators`, `Game.SourceGenerators`, `Cqrs.SourceGenerators`, `Godot.SourceGenerators`
- `abstractions` -> likely one of `Core.Abstractions`, `Game.Abstractions`, `Ecs.Arch.Abstractions`

Representative resolvable aliases:

- `core-abstractions` -> `Core.Abstractions`
- `godot generators` -> `Godot.SourceGenerators`
- `ecs` -> `Ecs.Arch`

@ -1,38 +0,0 @@
# Output Strategy

The module scan determines the document type.

Use this priority:

1. fix README / landing page / adoption path
2. fix stale topic pages
3. add or refresh API reference coverage
4. add or refresh tutorials

## Landing Page Checklist

- module purpose
- package relationship
- minimum adoption path
- real entry points
- next-reading links

## Topic Page Checklist

- current role
- public entry points
- minimum example
- compatibility or migration boundary
- related pages

## API Reference Checklist

- only for types or members that materially help consumers
- grounded in XML docs and source
- no speculative examples

## Tutorial Checklist

- only after the landing path is accurate
- keep the scenario traceable to source/tests or `ai-libs/`
- explain why each step exists, not just the code shape

@ -1,226 +0,0 @@
#!/usr/bin/env python3
"""Normalize a GFramework docs module input and report its evidence surface."""

from __future__ import annotations

import argparse
import json
import sys
from pathlib import Path
from typing import Any


SCRIPT_DIR = Path(__file__).resolve().parent
REPO_ROOT = SCRIPT_DIR.parents[3]
MODULE_MAP_PATH = REPO_ROOT / ".agents/skills/_shared/module-map.json"


def load_module_map() -> dict[str, Any]:
    return json.loads(MODULE_MAP_PATH.read_text(encoding="utf-8"))


def normalize_key(value: str) -> str:
    return value.strip().lower().replace("_", "-").replace(" ", "-")


def resolve_module(raw_input: str, module_map: dict[str, Any]) -> dict[str, Any]:
    modules = module_map["modules"]
    docs_section_aliases = module_map.get("docs_section_aliases", {})
    normalized = normalize_key(raw_input)

    for canonical_name in modules:
        if normalize_key(canonical_name) == normalized:
            return {"status": "ok", "module": canonical_name, "reason": "canonical"}

    for canonical_name, config in modules.items():
        aliases = config.get("aliases", [])
        if normalized in {normalize_key(alias) for alias in aliases}:
            return {"status": "ok", "module": canonical_name, "reason": "alias"}

    if normalized in docs_section_aliases:
        candidates = docs_section_aliases[normalized]
        if len(candidates) == 1:
            return {"status": "ok", "module": candidates[0], "reason": "docs_section"}
        return {
            "status": "ambiguous",
            "reason": "docs_section",
            "input": raw_input,
            "candidates": candidates,
        }

    fuzzy = [
        canonical_name
        for canonical_name in modules
        if normalized in normalize_key(canonical_name) or normalize_key(canonical_name) in normalized
    ]
    if fuzzy:
        return {"status": "unknown", "reason": "closest_match", "input": raw_input, "candidates": fuzzy}

    return {"status": "unknown", "reason": "no_match", "input": raw_input, "candidates": []}


def collect_path_state(paths: list[str]) -> list[dict[str, Any]]:
    states: list[dict[str, Any]] = []
    for relative_path in paths:
        absolute_path = REPO_ROOT / relative_path
        states.append(
            {
                "path": relative_path,
                "exists": absolute_path.exists(),
                "kind": "dir" if absolute_path.is_dir() else "file",
            }
        )
    return states


def assess_docs(module_config: dict[str, Any]) -> list[str]:
    docs_config = module_config["docs"]
    landing = collect_path_state(docs_config.get("landing", []))
    topics = collect_path_state(docs_config.get("topics", []))
    assessment: list[str] = []

    if landing and not any(item["exists"] for item in landing):
        assessment.append("landing_missing")
    elif landing:
        assessment.append("landing_present")

    if not topics:
        assessment.append("topic_docs_not_mapped")
    else:
        existing_topics = sum(1 for item in topics if item["exists"])
        if existing_topics == 0:
            assessment.append("topic_docs_missing")
        elif existing_topics < len(topics):
            assessment.append("topic_docs_partial")
        else:
            assessment.append("topic_docs_present")

    return assessment


def build_report(module_name: str, module_config: dict[str, Any]) -> dict[str, Any]:
    source_paths = collect_path_state(module_config.get("source_paths", []))
    test_projects = collect_path_state(module_config.get("test_projects", []))
    readmes = collect_path_state(module_config.get("readme_paths", []))
    docs_config = module_config["docs"]
    ai_libs = module_config.get("ai_libs", {})

    report = {
        "status": "ok",
        "module": module_name,
        "source_paths": source_paths,
        "project_file": collect_path_state([module_config["project_file"]])[0],
        "test_projects": test_projects,
        "readme_paths": readmes,
        "docs": {
            "landing": collect_path_state(docs_config.get("landing", [])),
            "topics": collect_path_state(docs_config.get("topics", [])),
            "fallback": collect_path_state(docs_config.get("fallback", [])),
        },
        "ai_libs": {
            "paths": collect_path_state(ai_libs.get("paths", [])),
            "search_hints": ai_libs.get("search_hints", []),
        },
        "assessment": assess_docs(module_config),
    }

    if readmes and not any(item["exists"] for item in readmes):
        report["assessment"].append("readme_missing")

    if test_projects and not any(item["exists"] for item in test_projects):
        report["assessment"].append("tests_missing")

    if not ai_libs.get("paths"):
        report["assessment"].append("ai_libs_optional")

    if not docs_config.get("topics"):
        report["assessment"].append("fallback_docs_only")

    return report


def print_text_report(report: dict[str, Any]) -> None:
    if report["status"] != "ok":
        print(json.dumps(report, ensure_ascii=False, indent=2))
        return

    print(f"module: {report['module']}")
    print("assessment:")
    for item in report["assessment"]:
        print(f"  - {item}")

    print("source:")
    for item in report["source_paths"]:
        print(f"  - {'OK' if item['exists'] else 'MISS'} {item['path']}")

    project_file = report["project_file"]
    print(f"project: {'OK' if project_file['exists'] else 'MISS'} {project_file['path']}")

    print("tests:")
    for item in report["test_projects"]:
        print(f"  - {'OK' if item['exists'] else 'MISS'} {item['path']}")

    print("readme:")
    if report["readme_paths"]:
        for item in report["readme_paths"]:
            print(f"  - {'OK' if item['exists'] else 'MISS'} {item['path']}")
    else:
        print(" - none mapped")

    print("docs landing:")
    for item in report["docs"]["landing"]:
        print(f"  - {'OK' if item['exists'] else 'MISS'} {item['path']}")

    print("docs topics:")
    if report["docs"]["topics"]:
        for item in report["docs"]["topics"]:
            print(f"  - {'OK' if item['exists'] else 'MISS'} {item['path']}")
    else:
        print(" - none mapped")

    print("docs fallback:")
    for item in report["docs"]["fallback"]:
        print(f"  - {'OK' if item['exists'] else 'MISS'} {item['path']}")

    print("ai-libs:")
    if report["ai_libs"]["paths"]:
        for item in report["ai_libs"]["paths"]:
            print(f"  - {'OK' if item['exists'] else 'MISS'} {item['path']}")
    else:
        print(" - none mapped")

    if report["ai_libs"]["search_hints"]:
        print("ai-libs search hints:")
        for item in report["ai_libs"]["search_hints"]:
            print(f"  - {item}")


def main() -> int:
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument("module", help="Canonical module name, alias, or docs section name.")
    parser.add_argument("--json", action="store_true", help="Emit JSON instead of text.")
    args = parser.parse_args()

    module_map = load_module_map()
    resolution = resolve_module(args.module, module_map)

    if resolution["status"] != "ok":
        print(json.dumps(resolution, ensure_ascii=False, indent=2))
        return 1

    report = build_report(resolution["module"], module_map["modules"][resolution["module"]])
    report["resolution"] = resolution

    if args.json:
        print(json.dumps(report, ensure_ascii=False, indent=2))
    else:
        print_text_report(report)

    return 0


if __name__ == "__main__":
    sys.exit(main())

@ -1,67 +0,0 @@
#!/bin/bash
# 运行统一文档校验脚本集合。

set -e

TARGET="$1"
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

if [ -z "$TARGET" ]; then
    echo "用法: $0 <文件或目录路径>"
    exit 1
fi

if [ ! -e "$TARGET" ]; then
    echo "错误: 路径不存在: $TARGET"
    exit 1
fi

if [ -f "$TARGET" ]; then
    FILES=("$TARGET")
else
    mapfile -t FILES < <(find "$TARGET" -type f -name "*.md" | sort)
fi

if [ ${#FILES[@]} -eq 0 ]; then
    echo "未找到 Markdown 文件"
    exit 0
fi

TOTAL_ERRORS=0
FAILED_FILES=0

for FILE in "${FILES[@]}"; do
    FILE_ERRORS=0

    echo "验证: $FILE"

    if ! bash "$SCRIPT_DIR/validate-frontmatter.sh" "$FILE"; then
        FILE_ERRORS=$((FILE_ERRORS + 1))
    fi

    if ! bash "$SCRIPT_DIR/validate-links.sh" "$FILE"; then
        FILE_ERRORS=$((FILE_ERRORS + 1))
    fi

    if ! bash "$SCRIPT_DIR/validate-code-blocks.sh" "$FILE"; then
        FILE_ERRORS=$((FILE_ERRORS + 1))
    fi

    if [ $FILE_ERRORS -eq 0 ]; then
        echo "✓ $FILE"
    else
        echo "✗ $FILE"
        FAILED_FILES=$((FAILED_FILES + 1))
    fi

    TOTAL_ERRORS=$((TOTAL_ERRORS + FILE_ERRORS))
    echo ""
done

if [ $TOTAL_ERRORS -eq 0 ]; then
    echo "✓ 所有验证通过"
    exit 0
fi

echo "✗ 验证失败:$FAILED_FILES 个文件存在问题"
exit 1

@ -1,62 +0,0 @@
#!/bin/bash
# 验证 Markdown 代码块是否闭合并带有语言标记。

set -e

FILE="$1"

if [ -z "$FILE" ]; then
    echo "用法: $0 <文件路径>"
    exit 1
fi

if [ ! -f "$FILE" ]; then
    echo "错误: 文件不存在: $FILE"
    exit 1
fi

ERROR_COUNT=0
WARNING_COUNT=0
CODE_FENCE_COUNT=$(grep -c '^```' "$FILE" || true)

if [ $((CODE_FENCE_COUNT % 2)) -ne 0 ]; then
    echo "✗ 错误: 存在未闭合的代码块"
    ERROR_COUNT=$((ERROR_COUNT + 1))
fi

LINE_NUMBER=0
IN_CODE_BLOCK=0
while IFS= read -r LINE || [ -n "$LINE" ]; do
    LINE_NUMBER=$((LINE_NUMBER + 1))

    if [[ "$LINE" =~ ^\`\`\`(cs|c#|C#)$ ]]; then
        echo "⚠ 警告: 第 $LINE_NUMBER 行使用了非标准 C# 标记,建议改为 csharp"
        WARNING_COUNT=$((WARNING_COUNT + 1))
    fi

    if [[ "$LINE" =~ ^\`\`\` ]]; then
        if [ "$IN_CODE_BLOCK" -eq 0 ]; then
            if [[ "$LINE" == '```' ]]; then
                echo "⚠ 警告: 第 $LINE_NUMBER 行的代码块缺少语言标记"
                WARNING_COUNT=$((WARNING_COUNT + 1))
            fi

            IN_CODE_BLOCK=1
        else
            IN_CODE_BLOCK=0
        fi
    fi
done < "$FILE"

if [ $ERROR_COUNT -eq 0 ] && [ $WARNING_COUNT -eq 0 ]; then
    echo "✓ 代码块验证通过"
    exit 0
fi

if [ $ERROR_COUNT -eq 0 ]; then
    echo "⚠ 代码块验证通过,但有 $WARNING_COUNT 个警告"
    exit 0
fi

echo "✗ 代码块验证失败($ERROR_COUNT 个错误,$WARNING_COUNT 个警告)"
exit 1

@ -1,40 +0,0 @@
#!/bin/bash
# 验证 Markdown frontmatter。

set -e

FILE="$1"

if [ -z "$FILE" ]; then
    echo "用法: $0 <文件路径>"
    exit 1
fi

if [ ! -f "$FILE" ]; then
    echo "错误: 文件不存在: $FILE"
    exit 1
fi

if ! head -n 5 "$FILE" | grep -q "^---$"; then
    echo "✗ 错误: 文件缺少 frontmatter"
    exit 1
fi

FRONTMATTER=$(sed -n '/^---$/,/^---$/p' "$FILE" | sed '1d;$d')

if [ -z "$FRONTMATTER" ]; then
    echo "✗ 错误: frontmatter 为空"
    exit 1
fi

if ! echo "$FRONTMATTER" | grep -q "^title:"; then
    echo "✗ 错误: 缺少必需字段: title"
    exit 1
fi

if ! echo "$FRONTMATTER" | grep -q "^description:"; then
    echo "✗ 错误: 缺少必需字段: description"
    exit 1
fi

echo "✓ Frontmatter 验证通过"

@ -1,65 +0,0 @@
#!/bin/bash
# 验证 Markdown 内部链接是否指向当前仓库中的真实页面。

set -e

FILE="$1"

if [ -z "$FILE" ]; then
    echo "用法: $0 <文件路径>"
    exit 1
fi

if [ ! -f "$FILE" ]; then
    echo "错误: 文件不存在: $FILE"
    exit 1
fi

FILE_DIR=$(dirname "$FILE")
LINKS=$(grep -oP '\[([^\]]+)\]\(([^)]+)\)' "$FILE" | grep -oP '\(([^)]+)\)' | sed 's/[()]//g' || true)

if [ -z "$LINKS" ]; then
    echo "✓ 未找到需要验证的链接"
    exit 0
fi

ERROR_COUNT=0

while IFS= read -r LINK; do
    if [[ "$LINK" =~ ^https?:// ]] || [[ "$LINK" =~ ^mailto: ]] || [[ "$LINK" =~ ^# ]]; then
        continue
    fi

    LINK_PATH=$(echo "$LINK" | sed 's/#.*//')

    if [ -z "$LINK_PATH" ]; then
        continue
    fi

    if [[ "$LINK_PATH" =~ ^/ ]]; then
        TARGET="docs$LINK_PATH"
        if [[ ! "$TARGET" =~ \.[A-Za-z0-9]+$ ]]; then
            TARGET="$TARGET.md"
        fi
    elif [[ "$LINK_PATH" =~ ^\. ]]; then
        TARGET="$FILE_DIR/$LINK_PATH"
    else
        TARGET="$FILE_DIR/$LINK_PATH"
    fi

    TARGET=$(realpath -m "$TARGET" 2>/dev/null || echo "$TARGET")

    if [ ! -f "$TARGET" ] && [ ! -d "$TARGET" ]; then
        echo "✗ 损坏的链接: $LINK"
        echo "  目标不存在: $TARGET"
        ERROR_COUNT=$((ERROR_COUNT + 1))
    fi
done <<< "$LINKS"

if [ $ERROR_COUNT -eq 0 ]; then
    echo "✓ 链接验证通过"
    exit 0
fi

echo "✗ 共发现 $ERROR_COUNT 个损坏链接"
exit 1

@ -1,27 +0,0 @@
---
title: {{API_TITLE}}
description: {{API_DESCRIPTION}}
outline: deep
---

# {{API_TITLE}}

## 概述

{{API_OVERVIEW}}

## 适用范围

{{API_SCOPE}}

## 关键成员

{{KEY_MEMBERS}}

## 最小示例

{{MINIMUM_EXAMPLE}}

## 相关类型

{{RELATED_TYPES}}

@ -1,30 +0,0 @@
---
title: {{MODULE_TITLE}}
description: {{MODULE_DESCRIPTION}}
---

# {{MODULE_TITLE}}

## 模块定位

{{MODULE_POSITIONING}}

## 包关系

{{PACKAGE_RELATIONSHIP}}

## 最小接入路径

{{MINIMUM_ADOPTION_PATH}}

## 关键入口

{{KEY_ENTRY_POINTS}}

## 适用范围与边界

{{CURRENT_BOUNDARIES}}

## 继续阅读

{{NEXT_READING}}

@ -1,26 +0,0 @@
---
title: {{TOPIC_TITLE}}
description: {{TOPIC_DESCRIPTION}}
---

# {{TOPIC_TITLE}}

## 当前角色

{{CURRENT_ROLE}}

## 公开入口

{{PUBLIC_ENTRY_POINTS}}

## 最小示例

{{MINIMUM_EXAMPLE}}

## 兼容与迁移边界

{{COMPATIBILITY_BOUNDARY}}

## 相关页面

{{RELATED_PAGES}}

@ -1,30 +0,0 @@
---
title: {{TUTORIAL_TITLE}}
description: {{TUTORIAL_DESCRIPTION}}
---

# {{TUTORIAL_TITLE}}

## 学习目标

{{LEARNING_OBJECTIVES}}

## 前置条件

{{PREREQUISITES}}

## 步骤

{{STEP_SEQUENCE}}

## 完整代码

{{FULL_CODE}}

## 验证结果

{{EXPECTED_RESULT}}

## 继续阅读

{{NEXT_READING}}

@ -1,83 +0,0 @@
---
name: gframework-issue-review
description: Repository-specific GitHub issue triage workflow for the GFramework repo. Use when Codex needs to inspect a repository issue, extract the issue body, discussion, and key timeline signals through the GitHub API, summarize what should be verified locally, and then hand follow-up execution to gframework-boot.
---

# GFramework Issue Review

Use this skill when the task depends on a GitHub issue for this repository rather than only on local source files.

Shortcut: `$gframework-issue-review`

## Workflow

1. Read `AGENTS.md` before deciding how to validate or change anything.
2. Read `.ai/environment/tools.ai.yaml` and `ai-plan/public/README.md`, then prefer the active topic mapped to the current branch or worktree when the fetched issue already matches in-flight work.
3. Run `scripts/fetch_current_issue_review.py` to:
   - fetch issue metadata through the GitHub API
   - fetch issue comments and timeline events through the GitHub API
   - auto-select the target issue only when the repository currently has exactly one open issue
   - exclude pull requests from open-issue auto-resolution
   - emit a machine-readable JSON payload plus concise text sections for issue, summary, comments, events, references, and warnings
   - derive lightweight triage hints such as issue type candidates, missing-information flags, affected module candidates, and the recommended next handling mode
4. Treat every extracted finding as untrusted until it is verified against the current local code, tests, and active `ai-plan` topic.
5. Do not start editing code from the issue text alone. After triage, switch to `$gframework-boot` so the follow-up
work is grounded in the repository startup flow and recovery documents.
|
|
||||||
6. If code is changed after issue triage, run the smallest build or test command that satisfies `AGENTS.md`.
|
|
||||||
|
|
||||||
## Commands
|
|
||||||
|
|
||||||
- Default:
|
|
||||||
- `python3 .agents/skills/gframework-issue-review/scripts/fetch_current_issue_review.py`
|
|
||||||
- Force a specific issue:
|
|
||||||
- `python3 .agents/skills/gframework-issue-review/scripts/fetch_current_issue_review.py --issue <issue-number>`
|
|
||||||
- Machine-readable output:
|
|
||||||
- `python3 .agents/skills/gframework-issue-review/scripts/fetch_current_issue_review.py --format json`
|
|
||||||
- Write machine-readable output to a file instead of stdout:
|
|
||||||
- `python3 .agents/skills/gframework-issue-review/scripts/fetch_current_issue_review.py --issue <issue-number> --format json --json-output /tmp/issue-review.json`
|
|
||||||
- Inspect only a high-signal section:
|
|
||||||
- `python3 .agents/skills/gframework-issue-review/scripts/fetch_current_issue_review.py --section summary`
|
|
||||||
- Combine triage with a boot handoff:
|
|
||||||
- `python3 .agents/skills/gframework-issue-review/scripts/fetch_current_issue_review.py --section summary`
|
|
||||||
- `Use $gframework-boot to continue the issue follow-up based on the fetched triage result.`
|
|
||||||
|
|
||||||
## Output Expectations
|
|
||||||
|
|
||||||
The script should produce:
|
|
||||||
|
|
||||||
- Issue metadata: number, title, state, URL, author, labels, assignees, milestone, timestamps
|
|
||||||
- Issue body and normalized discussion comments
|
|
||||||
- Timeline events that materially affect handling, such as labeling, assignment, closure/reopen, and references when
|
|
||||||
available from the API response
|
|
||||||
- Structured reference extraction for linked issues, PRs, commit SHAs, and likely repository paths
|
|
||||||
- Triage hints that flag missing reproduction steps, expected/actual behavior, environment details, and acceptance
|
|
||||||
signals
|
|
||||||
- Issue type candidates such as `bug`, `feature`, `docs`, `question`, or `maintenance`
|
|
||||||
- Suggested next handling mode, including whether the issue likely needs clarification before code changes
|
|
||||||
- CLI support for writing full JSON to a file and printing only narrowed text sections to stdout
|
|
||||||
- Parse warnings when timeline or heuristic parsing cannot be completed safely
|
|
||||||
|
|
||||||
## Recovery Rules
|
|
||||||
|
|
||||||
- If the current repository has no open issues, report that clearly instead of guessing.
|
|
||||||
- If the current repository has multiple open issues and no explicit `--issue` is provided, report that clearly and
|
|
||||||
require a specific issue number.
|
|
||||||
- If GitHub access fails because of proxy configuration, rerun the fetch with proxy variables removed.
|
|
||||||
- Prefer GitHub API results over HTML scraping.
|
|
||||||
- Do not treat heuristic module guesses or next-step suggestions as repository truth; they are only entry points for
|
|
||||||
subsequent local verification.
|
|
||||||
- If the issue discussion reveals that the problem statement has already shifted, prefer the newest concrete comment or
|
|
||||||
timeline signal over the original title/body wording.
|
|
||||||
- After extracting the issue, continue the actual implementation flow with `$gframework-boot` so the task is grounded
|
|
||||||
in current branch context and `ai-plan` recovery artifacts.
|
|
||||||
|
|
||||||
## Example Triggers
|
|
||||||
|
|
||||||
- `Use $gframework-issue-review on the current repository issue`
|
|
||||||
- `Check the open GitHub issue and summarize what should be verified locally`
|
|
||||||
- `Inspect issue <issue-number> and tell me whether this looks like bug triage or a feature request`
|
|
||||||
- `先用 $gframework-issue-review 看当前 open issue,再用 $gframework-boot 继续`
|
|
||||||
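The JSON payload described in the Output Expectations section can be consumed downstream before the boot handoff. A minimal sketch of that consumption follows; the key names (`triage_hints`, `information_flags`, `next_action`) are assumptions inferred from the expectations above, not a documented schema:

```python
import json

# Hypothetical payload shaped like the Output Expectations above; the exact
# key names are assumptions, not a documented schema.
payload = json.loads("""
{
  "issue": {"number": 42, "title": "Example", "state": "OPEN"},
  "triage_hints": {
    "issue_type_candidates": ["bug"],
    "information_flags": {"needs_clarification": true},
    "next_action": "clarify-issue-before-code"
  },
  "warnings": []
}
""")

hints = payload["triage_hints"]
if hints["information_flags"]["needs_clarification"]:
    # The skill's recovery rules say not to edit code from the issue text alone.
    print(f"Issue #{payload['issue']['number']}: ask for clarification first")
else:
    print(f"Issue #{payload['issue']['number']}: hand off to $gframework-boot ({hints['next_action']})")
```

In practice the payload would come from `--format json --json-output <file>` rather than an inline literal.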
@ -1,4 +0,0 @@
interface:
  display_name: "GFramework Issue Review"
  short_description: "Inspect the current repository issue and triage next steps"
  default_prompt: "Use $gframework-issue-review to inspect the current repository issue through the GitHub API, summarize the issue body, discussion, and key timeline signals, highlight what must be verified locally, and then hand follow-up execution to $gframework-boot."
@ -1,858 +0,0 @@
#!/usr/bin/env python3
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

"""
Fetch the current GFramework GitHub issue and extract the signals needed for
local follow-up work without relying on the gh CLI.
"""

from __future__ import annotations

import argparse
import json
import os
import re
import shutil
import subprocess
import sys
import urllib.error
import urllib.request
from pathlib import Path
from typing import Any

OWNER = "GeWuYou"
REPO = "GFramework"
WORKTREE_ROOT_DIRECTORY_NAME = "GFramework-WorkTree"
GIT_ENVIRONMENT_KEY = "GFRAMEWORK_WINDOWS_GIT"
GIT_DIR_ENVIRONMENT_KEY = "GFRAMEWORK_GIT_DIR"
WORK_TREE_ENVIRONMENT_KEY = "GFRAMEWORK_WORK_TREE"
REQUEST_TIMEOUT_ENVIRONMENT_KEY = "GFRAMEWORK_ISSUE_REVIEW_TIMEOUT_SECONDS"
GITHUB_TOKEN_ENVIRONMENT_KEYS = ("GFRAMEWORK_GITHUB_TOKEN", "GITHUB_TOKEN", "GH_TOKEN")
PROXY_ENVIRONMENT_KEYS = ("http_proxy", "https_proxy", "HTTP_PROXY", "HTTPS_PROXY", "ALL_PROXY", "all_proxy")
DEFAULT_REQUEST_TIMEOUT_SECONDS = 60
USER_AGENT = "codex-gframework-issue-review"
DISPLAY_SECTION_CHOICES = (
    "issue",
    "summary",
    "comments",
    "events",
    "references",
    "warnings",
)
ISSUE_TYPE_CANDIDATES = ("bug", "feature", "docs", "question", "maintenance")
ACTIVE_TOPIC_KEYWORDS: dict[str, tuple[str, ...]] = {
    "ai-first-config-system": ("config", "configuration", "gameconfig", "settings"),
    "coroutine-optimization": ("coroutine", "yield", "await", "scheduler"),
    "cqrs-rewrite": ("cqrs", "command", "query", "eventbus", "event bus"),
    "data-repository-persistence": ("repository", "serialization", "persistence", "data", "settings"),
    "runtime-generator-boundary": ("source generator", "generator", "attribute", "packaging"),
    "semantic-release-versioning": ("release", "version", "semantic-release", "tag", "publish"),
    "documentation-full-coverage-governance": ("docs", "documentation", "readme", "vitepress", "api reference"),
}
ACTUAL_BEHAVIOR_PATTERNS = (
    "actual",
    "currently",
    "instead",
    "but",
    "error",
    "exception",
    "fails",
    "failed",
    "wrong",
)
EXPECTED_BEHAVIOR_PATTERNS = (
    "expected",
    "should",
    "want",
    "would like",
    "needs to",
)
REPRODUCTION_PATTERNS = (
    "steps to reproduce",
    "reproduce",
    "reproduction",
    "how to reproduce",
    "minimal example",
    "sample",
    "demo",
)
ENVIRONMENT_PATTERNS = (
    "windows",
    "linux",
    "macos",
    "wsl",
    "godot",
    ".net",
    "sdk",
    "version",
    "environment",
)
ACCEPTANCE_PATTERNS = (
    "acceptance",
    "done when",
    "definition of done",
    "verified by",
    "test plan",
)
FILE_PATH_PATTERN = re.compile(r"\b(?:[A-Za-z0-9_.-]+/)+[A-Za-z0-9_.-]+\b")
ISSUE_REFERENCE_PATTERN = re.compile(r"(?:^|\s)#(\d+)\b")
COMMIT_REFERENCE_PATTERN = re.compile(r"\b[0-9a-f]{7,40}\b")
LINE_BREAK_NORMALIZER = re.compile(r"\n{3,}")


def resolve_git_command() -> str:
    """Resolve the git executable to use for this repository."""
    candidates = [
        os.environ.get(GIT_ENVIRONMENT_KEY),
        "git.exe",
        "git",
    ]

    for candidate in candidates:
        if not candidate:
            continue

        if os.path.isabs(candidate):
            if os.path.exists(candidate):
                return candidate
            continue

        resolved_candidate = shutil.which(candidate)
        if resolved_candidate:
            return resolved_candidate

    raise RuntimeError(f"No usable git executable found. Set {GIT_ENVIRONMENT_KEY} to override it.")


def find_repository_root(start_path: Path) -> Path | None:
    """Locate the repository root by walking parent directories for repo markers."""
    for candidate in (start_path, *start_path.parents):
        if (candidate / "AGENTS.md").exists() and (candidate / ".ai/environment/tools.ai.yaml").exists():
            return candidate

    return None


def resolve_worktree_git_dir(repository_root: Path) -> Path | None:
    """Resolve the main-repository worktree gitdir for this WSL worktree layout."""
    if repository_root.parent.name != WORKTREE_ROOT_DIRECTORY_NAME:
        return None

    primary_repository_root = repository_root.parent.parent / REPO
    candidate_git_dir = primary_repository_root / ".git" / "worktrees" / repository_root.name
    return candidate_git_dir if candidate_git_dir.exists() else None


def resolve_git_invocation() -> list[str]:
    """Resolve the git command arguments, preferring explicit WSL worktree binding."""
    configured_git_dir = os.environ.get(GIT_DIR_ENVIRONMENT_KEY)
    configured_work_tree = os.environ.get(WORK_TREE_ENVIRONMENT_KEY)
    linux_git = shutil.which("git")

    if configured_git_dir and configured_work_tree and linux_git:
        return [linux_git, f"--git-dir={configured_git_dir}", f"--work-tree={configured_work_tree}"]

    repository_root = find_repository_root(Path.cwd())
    if repository_root is not None and linux_git:
        worktree_git_dir = resolve_worktree_git_dir(repository_root)
        if worktree_git_dir is not None:
            return [linux_git, f"--git-dir={worktree_git_dir}", f"--work-tree={repository_root}"]

        root_git_dir = repository_root / ".git"
        if root_git_dir.exists():
            return [linux_git, f"--git-dir={root_git_dir}", f"--work-tree={repository_root}"]

    return [resolve_git_command()]


def resolve_request_timeout_seconds() -> int:
    """Return the GitHub request timeout in seconds."""
    configured_timeout = os.environ.get(REQUEST_TIMEOUT_ENVIRONMENT_KEY)
    if not configured_timeout:
        return DEFAULT_REQUEST_TIMEOUT_SECONDS

    try:
        parsed_timeout = int(configured_timeout)
    except ValueError as error:
        raise RuntimeError(
            f"{REQUEST_TIMEOUT_ENVIRONMENT_KEY} must be an integer number of seconds."
        ) from error

    if parsed_timeout <= 0:
        raise RuntimeError(f"{REQUEST_TIMEOUT_ENVIRONMENT_KEY} must be greater than zero.")

    return parsed_timeout


def run_command(args: list[str]) -> str:
    """Run a command and return stdout, raising on failure."""
    process = subprocess.run(args, capture_output=True, text=True, check=False)
    if process.returncode != 0:
        stderr = process.stderr.strip()
        raise RuntimeError(f"Command failed: {' '.join(args)}\n{stderr}")
    return process.stdout.strip()


def get_current_branch() -> str:
    """Return the current git branch name."""
    return run_command([*resolve_git_invocation(), "rev-parse", "--abbrev-ref", "HEAD"])


def resolve_github_token() -> str | None:
    """Return the first configured GitHub token for authenticated API requests."""
    for environment_key in GITHUB_TOKEN_ENVIRONMENT_KEYS:
        token = os.environ.get(environment_key)
        if token:
            return token

    return None


def build_request_headers(accept: str) -> dict[str, str]:
    """Build GitHub request headers and include auth when a token is available."""
    headers = {"Accept": accept, "User-Agent": USER_AGENT}
    token = resolve_github_token()
    if token:
        headers["Authorization"] = f"Bearer {token}"

    return headers


def has_proxy_environment() -> bool:
    """Return whether the current process is configured to use an outbound proxy."""
    return any(os.environ.get(environment_key) for environment_key in PROXY_ENVIRONMENT_KEYS)


def perform_request(url: str, headers: dict[str, str], *, disable_proxy: bool) -> tuple[str, Any]:
    """Execute a single HTTP request and return decoded text plus response headers."""
    opener = (
        urllib.request.build_opener(urllib.request.ProxyHandler({}))
        if disable_proxy
        else urllib.request.build_opener()
    )
    request = urllib.request.Request(url, headers=headers)
    with opener.open(request, timeout=resolve_request_timeout_seconds()) as response:
        return response.read().decode("utf-8", "replace"), response.headers


def open_url(url: str, accept: str) -> tuple[str, Any]:
    """Open a URL, retrying without proxies only when the configured proxy path fails."""
    headers = build_request_headers(accept)

    try:
        return perform_request(url, headers, disable_proxy=False)
    except urllib.error.HTTPError:
        raise
    except (urllib.error.URLError, TimeoutError, OSError):
        if not has_proxy_environment():
            raise

    return perform_request(url, headers, disable_proxy=True)


def fetch_json(url: str, accept: str = "application/vnd.github+json") -> tuple[Any, Any]:
    """Fetch a JSON payload and its response headers from GitHub."""
    text, headers = open_url(url, accept=accept)
    return json.loads(text), headers


def extract_next_link(headers: Any) -> str | None:
    """Extract the next-page link from GitHub pagination headers."""
    link_header = headers.get("Link")
    if not link_header:
        return None

    match = re.search(r'<([^>]+)>;\s*rel="next"', link_header)
    return match.group(1) if match else None


def fetch_paged_json(url: str, accept: str = "application/vnd.github+json") -> list[dict[str, Any]]:
    """Fetch every page from a paginated GitHub API endpoint."""
    items: list[dict[str, Any]] = []
    next_url: str | None = url
    while next_url:
        payload, headers = fetch_json(next_url, accept=accept)
        if not isinstance(payload, list):
            raise RuntimeError(f"Expected list payload from GitHub API, got {type(payload).__name__}.")

        items.extend(payload)
        next_url = extract_next_link(headers)

    return items


def collapse_whitespace(text: str) -> str:
    """Collapse repeated whitespace into single spaces while preserving paragraph intent."""
    normalized = text.replace("\r\n", "\n").replace("\r", "\n")
    normalized = LINE_BREAK_NORMALIZER.sub("\n\n", normalized)
    normalized = re.sub(r"[ \t]+", " ", normalized)
    normalized = re.sub(r" *\n *", "\n", normalized)
    return normalized.strip()


def truncate_text(text: str, max_length: int) -> str:
    """Collapse whitespace and truncate long text for CLI display."""
    collapsed = collapse_whitespace(text)
    if max_length <= 0 or len(collapsed) <= max_length:
        return collapsed

    return collapsed[: max_length - 3].rstrip() + "..."


def filter_open_issue_candidates(items: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Filter GitHub issue list responses down to non-PR issue items."""
    return [item for item in items if not item.get("pull_request")]


def select_single_open_issue_number(items: list[dict[str, Any]]) -> int:
    """Resolve the target issue number when the repository has exactly one open issue."""
    issues = filter_open_issue_candidates(items)
    if not issues:
        raise RuntimeError("No open GitHub issues found for this repository. Pass --issue <number> to inspect one.")

    if len(issues) > 1:
        numbers = ", ".join(str(item.get("number")) for item in issues[:5])
        suffix = "" if len(issues) <= 5 else ", ..."
        raise RuntimeError(
            "Multiple open GitHub issues found for this repository "
            f"({len(issues)} total: {numbers}{suffix}). Pass --issue <number> to inspect one."
        )

    return int(issues[0]["number"])


def resolve_issue_number(issue_number: int | None) -> tuple[int, str]:
    """Resolve the issue number, auto-selecting only when exactly one open issue exists."""
    if issue_number is not None:
        return issue_number, "explicit"

    open_items = fetch_paged_json(f"https://api.github.com/repos/{OWNER}/{REPO}/issues?state=open&per_page=100")
    return select_single_open_issue_number(open_items), "auto-single-open-issue"


def fetch_issue_metadata(issue_number: int) -> dict[str, Any]:
    """Fetch normalized metadata for a GitHub issue."""
    payload, _ = fetch_json(f"https://api.github.com/repos/{OWNER}/{REPO}/issues/{issue_number}")
    if not isinstance(payload, dict):
        raise RuntimeError("Failed to fetch GitHub issue metadata.")

    if payload.get("pull_request"):
        raise RuntimeError(f"Item #{issue_number} is a pull request, not a plain issue.")

    labels = []
    for label in payload.get("labels", []):
        if isinstance(label, dict) and label.get("name"):
            labels.append(str(label["name"]))

    assignees = []
    for assignee in payload.get("assignees", []):
        login = assignee.get("login")
        if login:
            assignees.append(str(login))

    milestone_title = None
    milestone = payload.get("milestone")
    if isinstance(milestone, dict) and milestone.get("title"):
        milestone_title = str(milestone["title"])

    return {
        "number": int(payload["number"]),
        "title": str(payload["title"]),
        "state": str(payload["state"]).upper(),
        "url": str(payload["html_url"]),
        "author": str(payload.get("user", {}).get("login") or ""),
        "created_at": str(payload.get("created_at") or ""),
        "updated_at": str(payload.get("updated_at") or ""),
        "labels": labels,
        "assignees": assignees,
        "milestone": milestone_title,
        "body": str(payload.get("body") or ""),
    }


def fetch_issue_comments(issue_number: int) -> list[dict[str, Any]]:
    """Fetch issue comments for the selected issue."""
    return fetch_paged_json(f"https://api.github.com/repos/{OWNER}/{REPO}/issues/{issue_number}/comments?per_page=100")


def fetch_issue_timeline(issue_number: int) -> list[dict[str, Any]]:
    """Fetch issue timeline events when GitHub exposes them to the current client."""
    return fetch_paged_json(
        f"https://api.github.com/repos/{OWNER}/{REPO}/issues/{issue_number}/timeline?per_page=100",
        accept="application/vnd.github+json",
    )


def normalize_comment(comment: dict[str, Any]) -> dict[str, Any]:
    """Normalize an issue comment for structured output."""
    return {
        "id": int(comment.get("id") or 0),
        "author": str(comment.get("user", {}).get("login") or ""),
        "created_at": str(comment.get("created_at") or ""),
        "updated_at": str(comment.get("updated_at") or ""),
        "body": str(comment.get("body") or ""),
    }


def normalize_timeline_event(event: dict[str, Any]) -> dict[str, Any]:
    """Normalize the GitHub timeline event fields used by triage output."""
    actor = str(event.get("actor", {}).get("login") or "")
    created_at = str(event.get("created_at") or event.get("submitted_at") or "")
    event_type = str(event.get("event") or event.get("__typename") or "unknown")
    label_name = ""
    assignee = ""
    source_issue_number: int | None = None
    source_issue_url = ""
    commit_id = ""

    label = event.get("label")
    if isinstance(label, dict) and label.get("name"):
        label_name = str(label["name"])

    assignee_payload = event.get("assignee")
    if isinstance(assignee_payload, dict) and assignee_payload.get("login"):
        assignee = str(assignee_payload["login"])

    source = event.get("source")
    if isinstance(source, dict):
        issue_payload = source.get("issue")
        if isinstance(issue_payload, dict):
            if issue_payload.get("number"):
                source_issue_number = int(issue_payload["number"])
            if issue_payload.get("html_url"):
                source_issue_url = str(issue_payload["html_url"])

    commit_id_value = event.get("commit_id")
    if isinstance(commit_id_value, str):
        commit_id = commit_id_value

    return {
        "event": event_type,
        "actor": actor,
        "created_at": created_at,
        "label": label_name,
        "assignee": assignee,
        "commit_id": commit_id,
        "source_issue_number": source_issue_number,
        "source_issue_url": source_issue_url,
    }


def gather_text_blocks(issue: dict[str, Any], comments: list[dict[str, Any]]) -> list[str]:
    """Return the issue body plus discussion comment bodies for heuristic parsing."""
    blocks = [issue.get("body", "")]
    blocks.extend(comment.get("body", "") for comment in comments)
    return [block for block in blocks if block]


def has_any_pattern(text_blocks: list[str], patterns: tuple[str, ...]) -> bool:
    """Return whether any normalized text block contains any requested pattern."""
    lowered_blocks = [collapse_whitespace(block).lower() for block in text_blocks]
    return any(pattern in block for block in lowered_blocks for pattern in patterns)


def choose_issue_type_candidates(issue: dict[str, Any], text_blocks: list[str]) -> list[str]:
    """Infer lightweight issue-type candidates from labels and discussion text."""
    labels = [label.lower() for label in issue.get("labels", [])]
    text = "\n".join(text_blocks).lower()
    candidates: list[str] = []

    if any(label in {"bug", "regression"} for label in labels) or "bug" in text or "error" in text or "fails" in text:
        candidates.append("bug")
    if any(label in {"feature", "enhancement"} for label in labels) or "feature" in text or "support" in text:
        candidates.append("feature")
    if any(label in {"documentation", "docs"} for label in labels) or "documentation" in text or "readme" in text:
        candidates.append("docs")
    if any(label in {"question", "help wanted"} for label in labels) or "?" in issue.get("title", ""):
        candidates.append("question")
    if any(label in {"chore", "maintenance", "refactor"} for label in labels) or "cleanup" in text or "refactor" in text:
        candidates.append("maintenance")

    if not candidates:
        candidates.append("question" if issue.get("body", "").strip().endswith("?") else "bug")

    ordered_candidates: list[str] = []
    for candidate in ISSUE_TYPE_CANDIDATES:
        if candidate in candidates:
            ordered_candidates.append(candidate)

    return ordered_candidates


def extract_references_from_text(text: str) -> dict[str, list[str]]:
    """Extract issue, commit, and file-path references from one text block."""
    issue_numbers = sorted({match.group(1) for match in ISSUE_REFERENCE_PATTERN.finditer(text)}, key=int)
    commit_shas = sorted({match.group(0) for match in COMMIT_REFERENCE_PATTERN.finditer(text)})
    file_paths = sorted({match.group(0) for match in FILE_PATH_PATTERN.finditer(text)})

    return {
        "issues": [f"#{number}" for number in issue_numbers],
        "commit_shas": commit_shas,
        "file_paths": file_paths,
    }


def merge_reference_values(values: list[dict[str, list[str]]]) -> dict[str, list[str]]:
    """Merge extracted reference lists while preserving sorted unique output."""
    merged: dict[str, set[str]] = {"issues": set(), "commit_shas": set(), "file_paths": set()}
    for value in values:
        for key in merged:
            merged[key].update(value.get(key, []))

    return {
        "issues": sorted(merged["issues"], key=lambda item: int(item[1:])),
        "commit_shas": sorted(merged["commit_shas"]),
        "file_paths": sorted(merged["file_paths"]),
    }


def build_references(issue: dict[str, Any], comments: list[dict[str, Any]], events: list[dict[str, Any]]) -> dict[str, Any]:
    """Build structured references from issue text and timeline context."""
    extracted = [extract_references_from_text(issue.get("body", ""))]
    extracted.extend(extract_references_from_text(comment.get("body", "")) for comment in comments)
    merged = merge_reference_values(extracted)
    referenced_by_timeline = sorted(
        {
            f"#{event['source_issue_number']}"
            for event in events
            if event.get("source_issue_number") is not None
        },
        key=lambda item: int(item[1:]),
    )

    pull_request_references = sorted(
        {
            issue_reference
            for issue_reference in merged["issues"]
            if issue_reference != f"#{issue['number']}"
        },
        key=lambda item: int(item[1:]),
    )

    return {
        "issues": merged["issues"],
        "pull_requests_or_issues": pull_request_references,
        "commit_shas": merged["commit_shas"],
        "file_paths": merged["file_paths"],
        "timeline_cross_references": referenced_by_timeline,
    }


def build_information_flags(
    issue: dict[str, Any],
    comments: list[dict[str, Any]],
    issue_type_candidates: list[str],
) -> dict[str, bool]:
    """Derive missing-information and readiness flags with issue-type-aware heuristics."""
    text_blocks = gather_text_blocks(issue, comments)
    has_reproduction_steps = has_any_pattern(text_blocks, REPRODUCTION_PATTERNS)
    has_expected_behavior = has_any_pattern(text_blocks, EXPECTED_BEHAVIOR_PATTERNS)
    has_actual_behavior = has_any_pattern(text_blocks, ACTUAL_BEHAVIOR_PATTERNS)
    has_environment_details = has_any_pattern(text_blocks, ENVIRONMENT_PATTERNS)
    has_acceptance_signals = has_any_pattern(text_blocks, ACCEPTANCE_PATTERNS)
    primary_issue_type = issue_type_candidates[0] if issue_type_candidates else "bug"

    if primary_issue_type == "bug":
        needs_clarification = not (
            (has_actual_behavior and (has_reproduction_steps or has_environment_details))
            or has_acceptance_signals
        )
    elif primary_issue_type in {"feature", "docs"}:
        needs_clarification = not (has_expected_behavior or has_acceptance_signals)
    elif primary_issue_type == "maintenance":
        needs_clarification = not (has_expected_behavior or has_actual_behavior or has_acceptance_signals)
    else:
        needs_clarification = not (has_expected_behavior or has_actual_behavior or has_acceptance_signals)

    return {
        "has_reproduction_steps": has_reproduction_steps,
        "has_expected_behavior": has_expected_behavior,
        "has_actual_behavior": has_actual_behavior,
        "has_environment_details": has_environment_details,
        "has_acceptance_signals": has_acceptance_signals,
        "needs_clarification": needs_clarification,
    }


def choose_affected_topics(issue: dict[str, Any], comments: list[dict[str, Any]]) -> list[str]:
    """Map the issue discussion to likely active topics when obvious keyword matches exist."""
    text = "\n".join(gather_text_blocks(issue, comments)).lower()
    matches: list[str] = []
    for topic, keywords in ACTIVE_TOPIC_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            matches.append(topic)

    return matches


def choose_next_action(
    information_flags: dict[str, bool],
    issue_type_candidates: list[str],
    affected_topics: list[str],
) -> str:
    """Choose the next handling mode for boot handoff."""
    if information_flags["needs_clarification"]:
        return "clarify-issue-before-code"
    if affected_topics:
        return "resume-existing-topic-with-boot"
    if "docs" in issue_type_candidates and issue_type_candidates[0] == "docs":
        return "start-new-docs-topic-with-boot"
    return "start-new-topic-with-boot"


def build_triage_hints(issue: dict[str, Any], comments: list[dict[str, Any]]) -> dict[str, Any]:
    """Build lightweight, reviewable triage hints for boot follow-up."""
    text_blocks = gather_text_blocks(issue, comments)
    issue_type_candidates = choose_issue_type_candidates(issue, text_blocks)
    information_flags = build_information_flags(issue, comments, issue_type_candidates)
    affected_topics = choose_affected_topics(issue, comments)
    next_action = choose_next_action(information_flags, issue_type_candidates, affected_topics)
|
|
||||||
return {
|
|
||||||
"issue_type_candidates": issue_type_candidates,
|
|
||||||
"information_flags": information_flags,
|
|
||||||
"affected_active_topics": affected_topics,
|
|
||||||
"next_action": next_action,
|
|
||||||
"boot_handoff": {
|
|
||||||
"recommended_skill": "gframework-boot",
|
|
||||||
"mode": "resume" if affected_topics else "new",
|
|
||||||
"notes": (
|
|
||||||
"Use gframework-boot to verify the issue against local code and active ai-plan topics."
|
|
||||||
if not information_flags["needs_clarification"]
|
|
||||||
else "Use gframework-boot to record a clarification-first task before changing code."
|
|
||||||
),
|
|
||||||
},
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
def build_result(issue_number: int, branch: str, resolution_mode: str) -> dict[str, Any]:
|
|
||||||
"""Build the full issue review payload for the selected issue."""
|
|
||||||
parse_warnings: list[str] = []
|
|
||||||
issue = fetch_issue_metadata(issue_number)
|
|
||||||
raw_comments = fetch_issue_comments(issue_number)
|
|
||||||
comments = [normalize_comment(comment) for comment in raw_comments]
|
|
||||||
|
|
||||||
events: list[dict[str, Any]] = []
|
|
||||||
try:
|
|
||||||
raw_events = fetch_issue_timeline(issue_number)
|
|
||||||
events = [normalize_timeline_event(event) for event in raw_events]
|
|
||||||
except Exception as error: # noqa: BLE001
|
|
||||||
parse_warnings.append(f"Issue timeline could not be fetched or parsed: {error}")
|
|
||||||
|
|
||||||
references = build_references(issue, comments, events)
|
|
||||||
triage_hints = build_triage_hints(issue, comments)
|
|
||||||
|
|
||||||
return {
|
|
||||||
"issue": {
|
|
||||||
**issue,
|
|
||||||
"resolved_from_branch": branch,
|
|
||||||
"resolution_mode": resolution_mode,
|
|
||||||
},
|
|
||||||
"discussion": {
|
|
||||||
"comment_count": len(comments),
|
|
||||||
"comments": comments,
|
|
||||||
},
|
|
||||||
"events": {
|
|
||||||
"count": len(events),
|
|
||||||
"items": events,
|
|
||||||
},
|
|
||||||
"references": references,
|
|
||||||
"triage_hints": triage_hints,
|
|
||||||
"parse_warnings": parse_warnings,
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
def write_json_output(result: dict[str, Any], output_path: str) -> str:
|
|
||||||
"""Write the full JSON result to disk and return the destination path."""
|
|
||||||
destination_path = Path(output_path).expanduser()
|
|
||||||
destination_path.parent.mkdir(parents=True, exist_ok=True)
|
|
||||||
destination_path.write_text(json.dumps(result, ensure_ascii=False, indent=2), encoding="utf-8")
|
|
||||||
return str(destination_path)
|
|
||||||
|
|
||||||
|
|
||||||
def summarize_events(events: list[dict[str, Any]]) -> list[str]:
|
|
||||||
"""Convert normalized events into concise text lines."""
|
|
||||||
lines: list[str] = []
|
|
||||||
for event in events:
|
|
||||||
summary = f"- {event['event']}"
|
|
||||||
details: list[str] = []
|
|
||||||
if event.get("actor"):
|
|
||||||
details.append(f"actor={event['actor']}")
|
|
||||||
if event.get("label"):
|
|
||||||
details.append(f"label={event['label']}")
|
|
||||||
if event.get("assignee"):
|
|
||||||
details.append(f"assignee={event['assignee']}")
|
|
||||||
if event.get("source_issue_number") is not None:
|
|
||||||
details.append(f"source_issue=#{event['source_issue_number']}")
|
|
||||||
if event.get("commit_id"):
|
|
||||||
details.append(f"commit={event['commit_id'][:12]}")
|
|
||||||
if event.get("created_at"):
|
|
||||||
details.append(f"at={event['created_at']}")
|
|
||||||
if details:
|
|
||||||
summary += " (" + ", ".join(details) + ")"
|
|
||||||
lines.append(summary)
|
|
||||||
return lines
|
|
||||||
|
|
||||||
|
|
||||||
def format_text(
|
|
||||||
result: dict[str, Any],
|
|
||||||
*,
|
|
||||||
sections: list[str] | None = None,
|
|
||||||
max_description_length: int = 400,
|
|
||||||
json_output_path: str | None = None,
|
|
||||||
) -> str:
|
|
||||||
"""Format the result payload into concise text output."""
|
|
||||||
lines: list[str] = []
|
|
||||||
selected_sections = set(sections or DISPLAY_SECTION_CHOICES)
|
|
||||||
issue = result["issue"]
|
|
||||||
triage_hints = result["triage_hints"]
|
|
||||||
discussion = result["discussion"]
|
|
||||||
events = result["events"]
|
|
||||||
references = result["references"]
|
|
||||||
|
|
||||||
if "issue" in selected_sections:
|
|
||||||
lines.append(f"Issue #{issue['number']}: {issue['title']}")
|
|
||||||
lines.append(f"State: {issue['state']}")
|
|
||||||
lines.append(f"Author: {issue['author']}")
|
|
||||||
lines.append(f"Labels: {', '.join(issue['labels']) if issue['labels'] else '(none)'}")
|
|
||||||
lines.append(f"Assignees: {', '.join(issue['assignees']) if issue['assignees'] else '(none)'}")
|
|
||||||
lines.append(f"Milestone: {issue['milestone'] or '(none)'}")
|
|
||||||
lines.append(f"Created: {issue['created_at']}")
|
|
||||||
lines.append(f"Updated: {issue['updated_at']}")
|
|
||||||
lines.append(f"Resolved from branch: {issue['resolved_from_branch'] or '(not branch-based)'}")
|
|
||||||
lines.append(f"Resolution mode: {issue['resolution_mode']}")
|
|
||||||
lines.append(f"URL: {issue['url']}")
|
|
||||||
if issue["body"]:
|
|
||||||
lines.append("Body:")
|
|
||||||
lines.append(truncate_text(issue["body"], max_description_length))
|
|
||||||
|
|
||||||
if "summary" in selected_sections:
|
|
||||||
lines.append("")
|
|
||||||
lines.append("Triage summary:")
|
|
||||||
lines.append("- Issue type candidates: " + ", ".join(triage_hints["issue_type_candidates"]))
|
|
||||||
information_flags = triage_hints["information_flags"]
|
|
||||||
lines.append(
|
|
||||||
"- Information flags: "
|
|
||||||
+ ", ".join(
|
|
||||||
[
|
|
||||||
f"repro={'yes' if information_flags['has_reproduction_steps'] else 'no'}",
|
|
||||||
f"expected={'yes' if information_flags['has_expected_behavior'] else 'no'}",
|
|
||||||
f"actual={'yes' if information_flags['has_actual_behavior'] else 'no'}",
|
|
||||||
f"environment={'yes' if information_flags['has_environment_details'] else 'no'}",
|
|
||||||
f"acceptance={'yes' if information_flags['has_acceptance_signals'] else 'no'}",
|
|
||||||
f"needs_clarification={'yes' if information_flags['needs_clarification'] else 'no'}",
|
|
||||||
]
|
|
||||||
)
|
|
||||||
)
|
|
||||||
lines.append(
|
|
||||||
"- Affected active topics: "
|
|
||||||
+ (", ".join(triage_hints["affected_active_topics"]) if triage_hints["affected_active_topics"] else "(none)")
|
|
||||||
)
|
|
||||||
lines.append(f"- Next action: {triage_hints['next_action']}")
|
|
||||||
lines.append(f"- Boot handoff: {triage_hints['boot_handoff']['notes']}")
|
|
||||||
|
|
||||||
if "comments" in selected_sections:
|
|
||||||
lines.append("")
|
|
||||||
lines.append(f"Discussion comments: {discussion['comment_count']}")
|
|
||||||
for comment in discussion["comments"]:
|
|
||||||
lines.append(f"- {comment['author']} at {comment['created_at']}")
|
|
||||||
lines.append(f" {truncate_text(comment['body'], max_description_length)}")
|
|
||||||
|
|
||||||
if "events" in selected_sections:
|
|
||||||
lines.append("")
|
|
||||||
lines.append(f"Timeline events: {events['count']}")
|
|
||||||
lines.extend(summarize_events(events["items"]))
|
|
||||||
|
|
||||||
if "references" in selected_sections:
|
|
||||||
lines.append("")
|
|
||||||
lines.append("References:")
|
|
||||||
lines.append("- Mentioned issues: " + (", ".join(references["issues"]) if references["issues"] else "(none)"))
|
|
||||||
lines.append(
|
|
||||||
"- Cross references: "
|
|
||||||
+ (
|
|
||||||
", ".join(references["timeline_cross_references"])
|
|
||||||
if references["timeline_cross_references"]
|
|
||||||
else "(none)"
|
|
||||||
)
|
|
||||||
)
|
|
||||||
lines.append(
|
|
||||||
"- Related issue/PR mentions: "
|
|
||||||
+ (
|
|
||||||
", ".join(references["pull_requests_or_issues"])
|
|
||||||
if references["pull_requests_or_issues"]
|
|
||||||
else "(none)"
|
|
||||||
)
|
|
||||||
)
|
|
||||||
lines.append("- Commit SHAs: " + (", ".join(references["commit_shas"]) if references["commit_shas"] else "(none)"))
|
|
||||||
lines.append("- File paths: " + (", ".join(references["file_paths"]) if references["file_paths"] else "(none)"))
|
|
||||||
|
|
||||||
if result["parse_warnings"] and "warnings" in selected_sections:
|
|
||||||
lines.append("")
|
|
||||||
lines.append("Warnings:")
|
|
||||||
for warning in result["parse_warnings"]:
|
|
||||||
lines.append(f"- {truncate_text(warning, max_description_length)}")
|
|
||||||
|
|
||||||
if json_output_path:
|
|
||||||
lines.append("")
|
|
||||||
lines.append(f"Full JSON written to: {json_output_path}")
|
|
||||||
|
|
||||||
return "\n".join(lines)
|
|
||||||
|
|
||||||
|
|
||||||
def parse_args() -> argparse.Namespace:
|
|
||||||
"""Parse CLI arguments."""
|
|
||||||
parser = argparse.ArgumentParser()
|
|
||||||
parser.add_argument("--branch", help="Override the current branch name.")
|
|
||||||
parser.add_argument("--issue", type=int, help="Fetch a specific issue number instead of auto-selecting one.")
|
|
||||||
parser.add_argument("--format", choices=("text", "json"), default="text")
|
|
||||||
parser.add_argument(
|
|
||||||
"--json-output",
|
|
||||||
help="Write the full JSON result to a file. When used with --format text, stdout stays concise and points to the file.",
|
|
||||||
)
|
|
||||||
parser.add_argument(
|
|
||||||
"--section",
|
|
||||||
action="append",
|
|
||||||
choices=DISPLAY_SECTION_CHOICES,
|
|
||||||
help="Limit text output to specific sections. Can be passed multiple times.",
|
|
||||||
)
|
|
||||||
parser.add_argument(
|
|
||||||
"--max-description-length",
|
|
||||||
type=int,
|
|
||||||
default=400,
|
|
||||||
help="Truncate long text bodies in text output to this many characters.",
|
|
||||||
)
|
|
||||||
return parser.parse_args()
|
|
||||||
|
|
||||||
|
|
||||||
def main() -> None:
|
|
||||||
"""Run the CLI entry point."""
|
|
||||||
args = parse_args()
|
|
||||||
branch = args.branch or get_current_branch()
|
|
||||||
issue_number, resolution_mode = resolve_issue_number(args.issue)
|
|
||||||
result = build_result(issue_number, branch, resolution_mode)
|
|
||||||
|
|
||||||
json_output_path: str | None = None
|
|
||||||
if args.json_output:
|
|
||||||
json_output_path = write_json_output(result, args.json_output)
|
|
||||||
|
|
||||||
if args.format == "json":
|
|
||||||
print(json.dumps(result, ensure_ascii=False, indent=2))
|
|
||||||
return
|
|
||||||
|
|
||||||
print(
|
|
||||||
format_text(
|
|
||||||
result,
|
|
||||||
sections=args.section,
|
|
||||||
max_description_length=args.max_description_length,
|
|
||||||
json_output_path=json_output_path,
|
|
||||||
)
|
|
||||||
)
|
|
||||||
|
|
||||||
|
|
||||||
if __name__ == "__main__":
|
|
||||||
try:
|
|
||||||
main()
|
|
||||||
except Exception as error: # noqa: BLE001
|
|
||||||
print(str(error), file=sys.stderr)
|
|
||||||
sys.exit(1)
|
|
||||||
@ -1,94 +0,0 @@
#!/usr/bin/env python3
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

"""Regression tests for the GFramework issue review fetch helper."""

from __future__ import annotations

import importlib.util
from pathlib import Path
import unittest


SCRIPT_PATH = Path(__file__).with_name("fetch_current_issue_review.py")
MODULE_SPEC = importlib.util.spec_from_file_location("fetch_current_issue_review", SCRIPT_PATH)
if MODULE_SPEC is None or MODULE_SPEC.loader is None:
    raise RuntimeError(f"Unable to load module from {SCRIPT_PATH}.")

MODULE = importlib.util.module_from_spec(MODULE_SPEC)
MODULE_SPEC.loader.exec_module(MODULE)


class SelectSingleOpenIssueNumberTests(unittest.TestCase):
    """Cover auto-resolution rules for open GitHub issues."""

    def test_select_single_open_issue_number_filters_pull_requests(self) -> None:
        """Pull requests in the issues API must not block the single-open-issue path."""
        selected = MODULE.select_single_open_issue_number(
            [
                {"number": 10, "pull_request": {"url": "https://example.test/pr/10"}},
                {"number": 11},
            ]
        )

        self.assertEqual(selected, 11)

    def test_select_single_open_issue_number_rejects_multiple_plain_issues(self) -> None:
        """Auto-resolution must stop when more than one plain issue is open."""
        with self.assertRaisesRegex(RuntimeError, "Multiple open GitHub issues found"):
            MODULE.select_single_open_issue_number([{"number": 11}, {"number": 12}])


class ExtractReferencesFromTextTests(unittest.TestCase):
    """Cover lightweight reference extraction used by the text and JSON output."""

    def test_extract_references_from_text_finds_issue_commit_and_path_mentions(self) -> None:
        """The helper should retain the high-signal references needed for follow-up triage."""
        references = MODULE.extract_references_from_text(
            "See #123, commit abcdef1234567890, and GFramework.Core/Systems/Runner.cs for the failing path."
        )

        self.assertEqual(references["issues"], ["#123"])
        self.assertEqual(references["commit_shas"], ["abcdef1234567890"])
        self.assertEqual(references["file_paths"], ["GFramework.Core/Systems/Runner.cs"])


class BuildTriageHintsTests(unittest.TestCase):
    """Cover next-action classification for non-bug issue flows."""

    def test_build_triage_hints_routes_docs_issue_to_docs_topic_without_bug_style_clarification(self) -> None:
        """Docs issues with a clear requested change should not be forced through bug-style clarification."""
        triage_hints = MODULE.build_triage_hints(
            {
                "title": "Update documentation landing page",
                "labels": ["docs"],
                "body": "The guide should explain the landing-page layout for new contributors.",
            },
            [],
        )

        self.assertEqual(triage_hints["issue_type_candidates"][0], "docs")
        self.assertEqual(triage_hints["affected_active_topics"], [])
        self.assertFalse(triage_hints["information_flags"]["needs_clarification"])
        self.assertEqual(triage_hints["next_action"], "start-new-docs-topic-with-boot")

    def test_build_triage_hints_routes_feature_issue_to_new_topic_when_request_is_clear(self) -> None:
        """Feature requests with explicit desired behavior should stay actionable without fake bug repro gates."""
        triage_hints = MODULE.build_triage_hints(
            {
                "title": "Support release note previews",
                "labels": ["enhancement"],
                "body": "The workflow should support previewing generated notes before completion.",
            },
            [],
        )

        self.assertEqual(triage_hints["issue_type_candidates"][0], "feature")
        self.assertEqual(triage_hints["affected_active_topics"], [])
        self.assertFalse(triage_hints["information_flags"]["needs_clarification"])
        self.assertEqual(triage_hints["next_action"], "start-new-topic-with-boot")


if __name__ == "__main__":
    unittest.main()
@ -1,114 +0,0 @@
---
name: gframework-multi-agent-batch
description: Repository-specific multi-agent orchestration workflow for the GFramework repo. Use when the main agent should keep coordinating multiple parallel subagents, maintain ai-plan recovery artifacts, review subagent results, and continue bounded multi-agent waves until reviewability, context budget, or branch-diff limits say to stop.
---

# GFramework Multi-Agent Batch

## Overview

Use this skill when `gframework-boot` has already established repository context, and the task now benefits from the
main agent acting as the persistent coordinator for multiple parallel subagents.

Treat `AGENTS.md` as the source of truth. This skill expands the repository's multi-agent coordination rules; it does
not replace them.

This skill is for orchestration-heavy work, not for every task that merely happens to use one subagent. Prefer it when
the main agent must keep splitting bounded write slices, monitoring progress, updating `ai-plan`, validating accepted
results, and deciding whether another delegation wave is still safe.

## Use When

Adopt this workflow only when all of the following are true:

1. The task is complex enough that multiple parallel slices materially shorten the work.
2. The candidate write sets can be kept disjoint.
3. The main agent still needs to own review, validation, integration, and `ai-plan` updates.
4. Another wave is still likely to fit the branch-diff, context-budget, and reviewability budget.

Prefer `gframework-batch-boot` instead when the task is mainly repetitive bulk progress with a single obvious slice
pattern and little need for continuous multi-worker orchestration.

## Startup Workflow

1. Execute the normal `gframework-boot` startup sequence first:
   - read `AGENTS.md`
   - read `.ai/environment/tools.ai.yaml`
   - read `ai-plan/public/README.md`
   - read the mapped active topic `todos/` and `traces/`
2. Confirm that the active topic and current branch still match the work you are about to delegate.
3. Define the current wave in one sentence:
   - benchmark-host alignment
   - runtime hotspot reduction
   - documentation synchronization
   - other bounded multi-slice work
4. Identify the critical path and keep it local.
5. Split only the non-blocking work into disjoint ownership slices.
6. Estimate whether one more delegation wave is still safe:
   - include current branch diff vs baseline
   - loaded `ai-plan` context
   - expected validation output
   - expected integration overhead

## Worker Design Rules

For each `worker` subagent, specify:

- the concrete objective
- the exact owned files or subsystem
- files or areas the worker must not touch
- required validation commands
- expected output format
- a reminder that other agents may be editing the repo

Prefer `explorer` subagents when the result is read-only ranking, tracing, or candidate discovery.

Do not launch two workers whose write sets overlap unless the overlap is trivial and the main agent has already decided
how to serialize or reconcile that overlap.
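
The worker brief above can be sketched as a small data structure together with the disjoint-write-set check. This is an illustrative sketch only: the field names and the `WorkerBrief` type are assumptions for this example, not a schema the repository defines.

```python
from dataclasses import dataclass, field


@dataclass
class WorkerBrief:
    """Illustrative worker brief; field names are assumptions, not a repo-defined schema."""

    objective: str                                           # the concrete objective
    owned_paths: set[str] = field(default_factory=set)       # exact owned files or subsystem
    forbidden_paths: set[str] = field(default_factory=set)   # files the worker must not touch
    validation_commands: list[str] = field(default_factory=list)
    output_format: str = "summary + changed files + validation results"
    shared_repo_note: str = "Other agents may be editing this repository concurrently."


def write_sets_overlap(a: WorkerBrief, b: WorkerBrief) -> set[str]:
    """Return overlapping owned paths; a non-empty result means serialize or reconcile first."""
    return a.owned_paths & b.owned_paths


worker_a = WorkerBrief(objective="align benchmark hosts", owned_paths={"benchmarks/hosts.md"})
worker_b = WorkerBrief(objective="sync docs", owned_paths={"docs/index.md", "benchmarks/hosts.md"})
print(write_sets_overlap(worker_a, worker_b))  # {'benchmarks/hosts.md'}
```

The overlap check is the mechanical half of the rule; deciding whether a non-empty overlap is "trivial" stays a main-agent judgment call.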

## Main-Agent Loop

While workers run, the main agent should only do non-overlapping work:

- inspect the next candidate slices
- recompute branch-diff and context-budget posture
- review finished worker output
- queue follow-up validation
- keep `ai-plan/public/**` current when accepted scope or next steps change

After each completed worker task:

1. Review the reported ownership, validation, and changed files.
2. Confirm the worker stayed inside its boundary.
3. Run or rerun the required validation locally if the slice is accepted.
4. Record accepted delegated scope, validation milestones, and the next recovery point in the active `ai-plan` files.
5. Reassess whether another wave is still reviewable and safe.

## Stop Conditions

Stop the current multi-agent wave when any of the following becomes true:

- the next wave would likely push the main agent near or beyond a safe context budget
- the remaining work no longer splits into clean disjoint ownership slices
- branch diff vs baseline is approaching the current reviewability budget
- integrating another worker would degrade clarity more than it would save time
- validation failures show that the next step belongs on the critical path and should stay local

If a branch-size threshold is also in play, treat it as a coarse repository-scope signal, not the sole decision rule.

## Task Tracking

When this workflow is active, the main agent must keep the active `ai-plan` topic current with:

- delegated scope that has been accepted
- validation results
- current branch-diff posture if it affects stop decisions
- the next recommended resume step

The main agent should keep active entries concise enough that `boot` can still recover the current wave quickly.

## Example Triggers

- `Use $gframework-multi-agent-batch to coordinate non-conflicting subagents for this complex CQRS task.`
- `Keep delegating bounded parallel slices, update ai-plan, and verify each worker result before continuing.`
- `Run a multi-agent wave where the main agent owns review, validation, and integration.`
@ -1,4 +0,0 @@
interface:
  display_name: "GFramework Multi-Agent Batch"
  short_description: "Coordinate bounded parallel subagents with ai-plan tracking"
  default_prompt: "Use $gframework-multi-agent-batch to coordinate multiple bounded parallel subagents in this GFramework repository while the main agent owns ai-plan updates, validation, review, and integration."
@ -1,92 +0,0 @@
---
|
|
||||||
name: gframework-pr-review
|
|
||||||
description: Repository-specific GitHub PR review workflow for the GFramework repo. Use when Codex needs to inspect the GitHub pull request for the current branch, extract AI review findings from CodeRabbit, greptile-apps, or gemini-code-assist, read failed checks, MegaLinter warnings, or failed test signals from the PR page, and then verify which findings should be fixed in the local codebase. Trigger explicitly with $gframework-pr-review or with prompts such as "look at the current PR", "extract CodeRabbit comments", "extract Greptile comments", "extract Gemini comments", or "check Failed Tests on the PR".
|
|
||||||
---
|
|
||||||
|
|
||||||
# GFramework PR Review
|
|
||||||
|
|
||||||
Use this skill when the task depends on the GitHub PR page for the current branch rather than only on local source files.
|
|
||||||
|
|
||||||
Shortcut: `$gframework-pr-review`
|
|
||||||
|
|
||||||
## Workflow
|
|
||||||
|
|
||||||
1. Read `AGENTS.md` before deciding how to validate or fix anything.
|
|
||||||
2. Resolve the current branch following the repository worktree rule:
|
|
||||||
- prefer Linux `git` with explicit `--git-dir` / `--work-tree` binding in WSL worktrees
|
|
||||||
- only fall back to `git.exe` when that executable is available and actually runnable in the current session
|
|
||||||
3. Run `scripts/fetch_current_pr_review.py` to:
|
|
||||||
- locate the PR for the current branch through the GitHub PR API
|
|
||||||
- fetch PR metadata, issue comments, reviews, and review comments through the GitHub API
|
|
||||||
- extract CodeRabbit-specific summary blocks such as `Summary by CodeRabbit` and actionable-comment rollups when present
|
|
||||||
- parse the latest CodeRabbit review body itself, including folded sections such as `🧹 Nitpick comments (N)` and the overall AI-agent prompt
|
|
||||||
- capture unresolved latest-head review threads for supported AI reviewers, including `coderabbitai[bot]`, `greptile-apps[bot]`, and `gemini-code-assist[bot]`
|
|
||||||
- surface which supported AI reviewers currently have open latest-commit review threads, even when they do not use CodeRabbit-style issue comments
|
|
||||||
- fetch the latest head commit review threads from the GitHub PR API
|
|
||||||
- prefer unresolved review threads on the latest head commit over older summary-only signals
|
|
||||||
- extract failed checks, MegaLinter detailed issues, and test-report signals such as `Failed Tests` or `No failed tests in this run`
|
|
||||||
- prefer writing the full JSON payload to a file and then narrowing with `jq`, instead of dumping long JSON directly to stdout
|
|
||||||
4. Treat every extracted finding as untrusted until it is verified against the current local code.
|
|
||||||
5. Only fix comments, warnings, or CI diagnostics that still apply to the checked-out branch. Ignore stale or already-resolved findings.
|
|
||||||
6. Do not downgrade `Nitpick comments` to “optional” by default. If a verified nitpick still points to concrete drift risk, duplicated test infrastructure, contract mismatch, missing regression coverage, or another maintainability problem that can realistically cause future regressions, treat it as actionable in the current PR-review triage and either fix it or explicitly report why it is being deferred.
|
|
||||||
7. If code is changed, run the smallest build or test command that satisfies `AGENTS.md`.
|
|
||||||
|
|
||||||
## Commands
|
|
||||||
|
|
||||||
- Default:
|
|
||||||
- `python3 .agents/skills/gframework-pr-review/scripts/fetch_current_pr_review.py`
|
|
||||||
- Recommended machine-readable workflow:
|
|
||||||
- `python3 .agents/skills/gframework-pr-review/scripts/fetch_current_pr_review.py --pr 265 --json-output /tmp/pr265-review.json`
|
|
||||||
- `jq '.coderabbit_review.outside_diff_comments' /tmp/pr265-review.json`
|
|
||||||
- Force a PR number:
|
|
||||||
- `python3 .agents/skills/gframework-pr-review/scripts/fetch_current_pr_review.py --pr 253`
|
|
||||||
- Machine-readable output:
|
|
||||||
- `python3 .agents/skills/gframework-pr-review/scripts/fetch_current_pr_review.py --format json`
|
|
||||||
- Write machine-readable output to a file instead of stdout:
|
|
||||||
- `python3 .agents/skills/gframework-pr-review/scripts/fetch_current_pr_review.py --pr 253 --format json --json-output /tmp/pr253-review.json`
|
|
||||||
- Inspect only a high-signal section:
|
|
||||||
- `python3 .agents/skills/gframework-pr-review/scripts/fetch_current_pr_review.py --pr 253 --section outside-diff`
|
|
||||||
- Narrow text output to one path fragment:
|
|
||||||
- `python3 .agents/skills/gframework-pr-review/scripts/fetch_current_pr_review.py --pr 253 --section outside-diff --path GFramework.Core/Events/Event.cs`
|
|
||||||
|
|
||||||
## Output Expectations
|
|
||||||
|
|
||||||
The script should produce:
|
|
||||||
|
|
||||||
- PR metadata: number, title, state, branch, URL
|
|
||||||
- Supported AI reviewer summary, including latest reviews and open-thread counts for `coderabbitai[bot]`, `greptile-apps[bot]`, and `gemini-code-assist[bot]`
|
|
||||||
- CodeRabbit summary block from issue comments when available
|
|
||||||
- Folded latest-review sections such as `Nitpick comments (N)` when CodeRabbit puts them in the review body instead of issue comments
|
|
||||||
- Parsed latest head-review threads, with unresolved threads clearly separated
|
|
||||||
- Latest head commit review metadata and review threads
|
|
||||||
- Unresolved latest-commit review threads after reply-thread folding
|
|
||||||
- Pre-merge failed checks, if present
|
|
||||||
- Latest MegaLinter status and any detailed issues posted by `github-actions[bot]`
|
|
||||||
- Test summary, including failed-test signals when present
|
|
||||||
- Detailed failed-test rows from GitHub Test Reporter / CTRF comments when the PR comment includes `Name` / `Failure Message` content
|
|
||||||
- CLI support for writing full JSON to a file and printing only narrowed text sections to stdout
|
|
||||||
- Parse warnings only when both the primary API source and the intended fallback signal are unavailable
|
|
||||||
|
|
||||||
## Recovery Rules
|
|
||||||
|
|
||||||
- If the current branch has no matching public PR, report that clearly instead of guessing.
- If GitHub access fails because of proxy configuration, rerun the fetch with proxy variables removed.
- If the current WSL session resolves `git.exe` but cannot execute it cleanly, keep using the explicit Linux worktree binding instead of retrying Windows Git.
- Prefer GitHub API results over PR HTML. The PR HTML page is now a fallback/debugging source, not the primary source of truth.
- If the summary block and the latest head review threads disagree, trust the latest unresolved head-review threads and treat older summary findings as stale until re-verified locally.
- Do not assume every AI reviewer behaves like CodeRabbit. `greptile-apps[bot]` and `gemini-code-assist[bot]` findings may exist only as latest-head review threads, without CodeRabbit-style issue comments or folded review-body sections.
- Treat GitHub Actions comments with `Success with warnings` as actionable review input when they include concrete linter diagnostics such as `MegaLinter` detailed issues; do not skip them just because the parent check is green.
- Do not assume all CodeRabbit findings live in issue comments. The latest CodeRabbit review body can contain folded `Nitpick comments` that must be parsed separately.
- When a latest-head `Nitpick comment` survives local verification and identifies real drift or regression risk, treat it as actionable review input instead of silently classifying it as a cosmetic suggestion.
- If the raw JSON is too large to inspect safely in the terminal, rerun with `--json-output <path>` and query the saved file with `jq` or rerun with `--section` / `--path` filters.
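The last recovery rule above can be sketched with the repository's own jq-fallback convention (python3 when `jq` is unavailable). The `{"threads": [...]}` shape and the temp-file path below are illustrative assumptions for this sketch, not the helper's real output schema:

```python
import json
import tempfile
from pathlib import Path

# Illustrative review dump, as if saved via --json-output.
# The {"threads": [...]} shape is an assumption for this sketch,
# not the helper's real output schema.
review = {
    "threads": [
        {"path": "src/A.cs", "resolved": False},
        {"path": "src/B.cs", "resolved": True},
    ]
}
dump = Path(tempfile.gettempdir()) / "pr-review.json"
dump.write_text(json.dumps(review))

# Narrow to unresolved thread paths instead of printing the whole file,
# mirroring: jq '.threads[] | select(.resolved == false) | .path'
data = json.loads(dump.read_text())
unresolved = [t["path"] for t in data["threads"] if not t["resolved"]]
print(unresolved)  # -> ['src/A.cs']
```

The point of the narrowed query is the same as the `--section` / `--path` filters: keep the full JSON on disk and only surface the slice under review.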
## Example Triggers

- 'fix pr review'
- 'Use FPR'
- `Use $gframework-pr-review on the current branch`
- `Check the current PR and extract CodeRabbit suggestions`
- `Check the current PR and extract Greptile suggestions`
- `Check the current PR and extract Gemini Code Assist suggestions`
- `Look for Failed Tests on the PR page`
- `先用 $gframework-pr-review 看当前分支 PR`

@ -1,4 +0,0 @@
interface:
  display_name: "GFramework PR Review"
  short_description: "Inspect the current PR and AI review findings"
  default_prompt: "Use $gframework-pr-review to inspect the current branch PR through the GitHub API, prioritize unresolved review threads on the latest head commit from supported AI reviewers such as CodeRabbit and greptile-apps, and summarize failed checks or failed tests."
File diff suppressed because it is too large
@ -1,53 +0,0 @@
#!/usr/bin/env python3
"""Regression tests for the GFramework PR review fetch helper."""

from __future__ import annotations

import importlib.util
from pathlib import Path
import unittest


SCRIPT_PATH = Path(__file__).with_name("fetch_current_pr_review.py")
MODULE_SPEC = importlib.util.spec_from_file_location("fetch_current_pr_review", SCRIPT_PATH)
if MODULE_SPEC is None or MODULE_SPEC.loader is None:
    raise RuntimeError(f"Unable to load module from {SCRIPT_PATH}.")

MODULE = importlib.util.module_from_spec(MODULE_SPEC)
MODULE_SPEC.loader.exec_module(MODULE)


class ParseFailedTestDetailsTests(unittest.TestCase):
    """Cover failed-test table parsing edge cases for CTRF comments."""

    def test_parse_failed_test_details_ignores_trailing_columns(self) -> None:
        """Extra columns should not prevent extracting the name and failure message."""
        block = """
### ❌ **Some tests failed!**
<table>
<tbody>
<tr>
<td>❌ RegisterMigration_During_Cache_Rebuild_Should_Not_Leave_Stale_Type_Cache</td>
<td><pre>Expected: False\nBut was: True</pre></td>
<td>failed</td>
<td>35.3s</td>
</tr>
</tbody>
</table>
"""

        details = MODULE.parse_failed_test_details(block)

        self.assertEqual(
            details,
            [
                {
                    "name": "RegisterMigration_During_Cache_Rebuild_Should_Not_Leave_Stale_Type_Cache",
                    "failure_message": "Expected: False\nBut was: True",
                }
            ],
        )


if __name__ == "__main__":
    unittest.main()
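The diff for `fetch_current_pr_review.py` itself is suppressed above as too large, so here is a minimal sketch of what a `parse_failed_test_details` helper compatible with this test might look like. This is an illustration under assumed behavior, not the actual implementation:

```python
import html
import re


def parse_failed_test_details(block: str) -> list[dict[str, str]]:
    """Extract test names and failure messages from a CTRF-style HTML table.

    Sketch only: assumes the first <td> holds the name (with a leading
    failure marker) and the second holds the failure message in a <pre>.
    Trailing columns such as status and duration are ignored.
    """
    details: list[dict[str, str]] = []
    for row in re.findall(r"<tr>(.*?)</tr>", block, flags=re.DOTALL):
        cells = re.findall(r"<td>(.*?)</td>", row, flags=re.DOTALL)
        if len(cells) < 2:
            continue
        name = cells[0].replace("❌", "").strip()
        message = re.sub(r"</?pre>", "", cells[1]).strip()
        details.append({"name": name, "failure_message": html.unescape(message)})
    return details
```

Keeping the extraction keyed on the first two cells is what makes extra columns harmless, which is exactly the edge case the regression test above pins down.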
@ -1,62 +0,0 @@
schema_version: 1
generated_at_utc: "2026-03-21T04:47:58Z"
generated_from: ".ai/environment/tools.raw.yaml"
generator: "scripts/generate-ai-environment.py"
platform:
  family: "wsl-linux"
  os: "Linux"
  distro: "Ubuntu 24.04.4 LTS"
  shell: "bash"
capabilities:
  dotnet: true
  python: true
  node: true
  bun: true
  docker: true
  fast_search: true
  json_cli: true
tool_selection:
  search:
    preferred: "rg"
    fallback: "grep"
    use_for: "Repository text search."
  json:
    preferred: "jq"
    fallback: "python3"
    use_for: "Inspecting or transforming JSON command output."
  shell:
    preferred: "bash"
    fallback: "sh"
    use_for: "Repository shell scripts and command execution."
  scripting:
    preferred: "python3"
    fallback: "bash"
    use_for: "Non-trivial local automation and helper scripts."
  docs_package_manager:
    preferred: "bun"
    fallback: "npm"
    use_for: "Installing and previewing the docs site."
  build_and_test:
    preferred: "dotnet"
    fallback: "unavailable"
    use_for: "Build, test, restore, and solution validation."
python:
  available: true
  helper_packages:
    requests: true
    rich: true
    openai: false
    tiktoken: false
    pydantic: false
    pytest: false
preferences:
  prefer_project_listed_tools: true
  prefer_python_for_non_trivial_automation: true
  avoid_unlisted_system_tools: true
rules:
  - "Use rg instead of grep for repository search when rg is available."
  - "Use jq for JSON inspection; fall back to python3 if jq is unavailable."
  - "Prefer python3 over complex bash for non-trivial scripting when python3 is available."
  - "Use bun for docs preview workflows when bun is available; otherwise fall back to npm."
  - "Use dotnet for repository build and test workflows."
  - "Do not assume unrelated system tools are part of the supported project environment."
@ -1,89 +0,0 @@
schema_version: 1
generated_at_utc: "2026-03-21T04:47:28Z"
generator: "scripts/collect-dev-environment.sh"

platform:
  os: "Linux"
  distro: "Ubuntu 24.04.4 LTS"
  version: "24.04"
  kernel: "5.15.167.4-microsoft-standard-WSL2"
  wsl: true
  wsl_version: "2.4.13"
  shell: "bash"

required_runtimes:
  dotnet:
    installed: true
    version: "10.0.104"
    path: "/usr/bin/dotnet"
    purpose: "Builds and tests the GFramework solution."
  python3:
    installed: true
    version: "Python 3.12.3"
    path: "/usr/bin/python3"
    purpose: "Runs local automation and environment collection scripts."
  node:
    installed: true
    version: "v20.20.1"
    path: "/usr/bin/node"
    purpose: "Provides the JavaScript runtime used by docs tooling."
  bun:
    installed: true
    version: "1.3.10"
    path: "/root/.bun/bin/bun"
    purpose: "Installs and previews the VitePress documentation site."

required_tools:
  git:
    installed: true
    version: "git version 2.43.0"
    path: "/usr/bin/git"
    purpose: "Source control and patch review."
  bash:
    installed: true
    version: "GNU bash, version 5.2.21(1)-release (x86_64-pc-linux-gnu)"
    path: "/usr/bin/bash"
    purpose: "Executes repository scripts and shell automation."
  rg:
    installed: true
    version: "ripgrep 15.1.0 (rev af60c2de9d)"
    path: "/root/.bun/install/global/node_modules/@openai/codex-linux-x64/vendor/x86_64-unknown-linux-musl/path/rg"
    purpose: "Fast text search across the repository."
  jq:
    installed: true
    version: "jq-1.7"
    path: "/usr/bin/jq"
    purpose: "Inspecting and transforming JSON outputs."

project_tools:
  docker:
    installed: true
    version: "Docker version 29.2.1, build a5c7197"
    path: "/usr/bin/docker"
    purpose: "Runs MegaLinter and other containerized validation tools."

python_packages:
  requests:
    installed: true
    version: "2.31.0"
    purpose: "Simple HTTP calls in local helper scripts."
  rich:
    installed: true
    version: "13.7.1"
    purpose: "Readable CLI output for local Python helpers."
  openai:
    installed: false
    version: "not-installed"
    purpose: "Optional scripted access to OpenAI APIs."
  tiktoken:
    installed: false
    version: "not-installed"
    purpose: "Optional token counting for prompt and context inspection."
  pydantic:
    installed: false
    version: "not-installed"
    purpose: "Optional typed config and schema validation for helper scripts."
  pytest:
    installed: false
    version: "not-installed"
    purpose: "Optional lightweight testing for Python helper scripts."
@ -1,26 +0,0 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

# yaml-language-server: $schema=https://coderabbit.ai/integrations/schema.v2.json
language: "zh-CN"
early_access: false

reviews:
  profile: "chill"
  request_changes_workflow: true # 有问题时可以直接 request changes
  high_level_summary: true # PR 总体总结
  review_status: true # review 结果状态
  review_details: true # 展示具体问题
  poem: false # 关闭诗歌(基本没人用)
  tools:
    github-checks:
      enabled: true
      timeout_ms: 900000
  auto_review:
    enabled: true
    drafts: false # draft PR 不 review
    base_branches:
      - refactor/cqrs-architecture-decoupling

chat:
  auto_reply: true
@ -1,13 +0,0 @@
{
  "version": 1,
  "isRoot": true,
  "tools": {
    "dotnetctrfjsonreporter": {
      "version": "0.0.7",
      "commands": [
        "DotnetCtrfJsonReporter"
      ],
      "rollForward": false
    }
  }
}
@ -1,18 +0,0 @@
root = true

[*]
charset = utf-8
end_of_line = lf
insert_final_newline = true

[*.sln]
end_of_line = crlf

[*.bat]
end_of_line = crlf

[*.cmd]
end_of_line = crlf

[*.ps1]
end_of_line = crlf
.feluda.yaml (14 lines)
@ -1,14 +0,0 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

license_overrides:
  NETStandard.Library: MIT
  Microsoft.NETCore.Platforms: MIT
  System.Buffers: MIT
  System.Memory: MIT
  System.Numerics.Vectors: MIT
  System.Threading.Tasks.Extensions: MIT
  System.ComponentModel.Composition: MIT
  System.Security.Cryptography.ProtectedData: MIT
  System.Security.Permissions: MIT
  Microsoft.VisualStudio.Validation: MIT
.gitattributes (35 lines, vendored)
@ -1,35 +0,0 @@
# Keep repository text normalized to LF unless a file format is known to require CRLF.
* text=auto eol=lf

# Solution and Windows-native scripts are more interoperable when they keep CRLF in the working tree.
*.sln text eol=crlf
*.bat text eol=crlf
*.cmd text eol=crlf
*.ps1 text eol=crlf

# Source, config, scripts, and documentation stay LF across WSL and Windows editors.
*.sh text eol=lf
*.cs text eol=lf
*.csproj text eol=lf
*.props text eol=lf
*.targets text eol=lf
*.json text eol=lf
*.yml text eol=lf
*.yaml text eol=lf
*.md text eol=lf
*.ts text eol=lf
*.js text eol=lf
*.mts text eol=lf
*.vue text eol=lf
*.css text eol=lf

# Common binary assets should never be line-normalized.
*.png binary
*.jpg binary
*.jpeg binary
*.gif binary
*.ico binary
*.zip binary
*.dll binary
*.so binary
*.pdb binary
.github/ISSUE_TEMPLATE/01-bug-report.yml (125 lines, vendored)
@ -1,125 +0,0 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: "Bug Report / 缺陷报告"
description: "Report a reproducible defect in GFramework. / 报告可稳定复现的 GFramework 缺陷。"
title: "[Bug]: "
body:
  - type: markdown
    attributes:
      value: |
        Thanks for taking the time to report a bug.

        感谢你提交缺陷报告。提交前请先搜索已有 Issue,并尽量提供最小复现信息。
  - type: checkboxes
    id: checks
    attributes:
      label: "Pre-Submission Checks / 提交前检查"
      description: "Please confirm the following items before submitting. / 提交前请确认以下事项。"
      options:
        - label: "I searched existing issues and did not find a duplicate. / 我已搜索现有 Issue,未发现重复问题。"
          required: true
        - label: "I checked the relevant README or docs pages first. / 我已先阅读相关 README 或文档。"
          required: true
        - label: "I can describe a reproducible scenario or provide a minimal repro. / 我可以描述稳定复现场景或提供最小复现。"
          required: true
  - type: dropdown
    id: module
    attributes:
      label: "Affected Module / 影响模块"
      description: "Choose the module that best matches the problem. / 请选择最符合问题范围的模块。"
      options:
        - "GFramework.Core"
        - "GFramework.Core.Abstractions"
        - "GFramework.Game"
        - "GFramework.Game.Abstractions"
        - "GFramework.Godot"
        - "GFramework.SourceGenerators"
        - "GFramework.Godot.SourceGenerators"
        - "Docs / 文档"
        - "Build / CI / Packaging"
        - "Unknown / Not sure / 不确定"
    validations:
      required: true
  - type: input
    id: version
    attributes:
      label: "Package or Commit Version / 包版本或提交版本"
      description: "Example: NuGet version, commit SHA, or branch. / 例如 NuGet 版本、提交 SHA 或分支。"
      placeholder: "e.g. GeWuYou.GFramework.Core 1.2.3 / main@abc1234"
    validations:
      required: true
  - type: textarea
    id: summary
    attributes:
      label: "Bug Summary / 问题概述"
      description: "Describe the defect in one or two paragraphs. / 用 1-2 段简要描述问题。"
      placeholder: "What is broken, and when does it happen? / 具体哪里出错,什么情况下出现?"
    validations:
      required: true
  - type: textarea
    id: steps
    attributes:
      label: "Steps To Reproduce / 复现步骤"
      description: "Provide a deterministic repro whenever possible. / 尽量提供可稳定复现的步骤。"
      placeholder: |
        1. ...
        2. ...
        3. ...
    validations:
      required: true
  - type: textarea
    id: expected
    attributes:
      label: "Expected Behavior / 预期行为"
      description: "What should happen instead? / 正常情况下应该发生什么?"
    validations:
      required: true
  - type: textarea
    id: actual
    attributes:
      label: "Actual Behavior / 实际行为"
      description: "What actually happens? Include exception text if available. / 实际发生了什么?如有异常请附上。"
    validations:
      required: true
  - type: textarea
    id: repro
    attributes:
      label: "Minimal Repro / 最小复现"
      description: "Share a repository, gist, code snippet, or explain why a minimal repro is not yet available. / 提供仓库、gist、代码片段,或说明暂时无法提供的原因。"
      placeholder: |
        Please provide one of the following:
        - A GitHub repository or sample project
        - A gist or focused code snippet
        - Or explain why a minimal repro is not yet available

        请提供以下任一内容:
        - GitHub 仓库或示例项目
        - Gist 或聚焦代码片段
        - 或说明暂时无法提供最小复现的原因
      render: shell
    validations:
      required: true
  - type: textarea
    id: logs
    attributes:
      label: "Logs and Screenshots / 日志与截图"
      description: "Paste relevant logs, stack traces, or attach screenshots. / 粘贴相关日志、堆栈,或补充截图。"
      render: shell
  - type: textarea
    id: environment
    attributes:
      label: "Environment / 环境信息"
      description: "List the environment details that matter for reproduction. / 请列出与复现相关的环境信息。"
      placeholder: |
        - OS:
        - .NET SDK / Runtime:
        - Godot version (if applicable):
        - IDE / Build tool:
    validations:
      required: true
  - type: textarea
    id: impact
    attributes:
      label: "Impact and Scope / 影响范围"
      description: "Explain whether this blocks adoption, breaks compatibility, or affects only a narrow scenario. / 说明该问题是否阻塞使用、破坏兼容性,还是仅影响较窄场景。"
.github/ISSUE_TEMPLATE/02-feature-request.yml (83 lines, vendored)
@ -1,83 +0,0 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: "Feature Request / 功能建议"
description: "Suggest a new capability or an API improvement. / 提出新能力或 API 改进建议。"
title: "[Feature]: "
body:
  - type: markdown
    attributes:
      value: |
        Use this form for feature proposals, API improvements, and workflow enhancements.

        该模板适用于新功能、API 改进和工作流优化建议。请优先描述问题和动机,而不只是直接给出实现方案。
  - type: checkboxes
    id: checks
    attributes:
      label: "Pre-Submission Checks / 提交前检查"
      description: "Please confirm the following items before submitting. / 提交前请确认以下事项。"
      options:
        - label: "I searched existing issues and did not find the same request. / 我已搜索现有 Issue,未发现相同建议。"
          required: true
        - label: "I checked the relevant docs, examples, or current APIs first. / 我已先检查相关文档、示例或现有 API。"
          required: true
        - label: "I can explain the user problem or workflow gap this request solves. / 我可以说明该建议要解决的用户问题或工作流缺口。"
          required: true
  - type: dropdown
    id: module
    attributes:
      label: "Target Module / 目标模块"
      description: "Choose the module that should own this capability. / 请选择最适合承载该能力的模块。"
      options:
        - "GFramework.Core"
        - "GFramework.Core.Abstractions"
        - "GFramework.Game"
        - "GFramework.Game.Abstractions"
        - "GFramework.Godot"
        - "GFramework.SourceGenerators"
        - "GFramework.Godot.SourceGenerators"
        - "Docs / 文档"
        - "Build / CI / Packaging"
        - "Cross-cutting / 跨模块"
        - "Unknown / Not sure / 不确定"
    validations:
      required: true
  - type: textarea
    id: problem
    attributes:
      label: "Problem Statement / 问题背景"
      description: "What problem are you facing today? / 你当前遇到的核心问题是什么?"
      placeholder: "Describe the workflow pain, limitation, or missing capability. / 描述当前流程痛点、限制或缺失能力。"
    validations:
      required: true
  - type: textarea
    id: proposal
    attributes:
      label: "Proposed Solution / 建议方案"
      description: "Describe the behavior, API shape, or user experience you want. / 描述你期望的行为、API 形态或使用体验。"
      placeholder: "What should GFramework provide? / 希望 GFramework 提供什么?"
    validations:
      required: true
  - type: textarea
    id: use-cases
    attributes:
      label: "Use Cases / 使用场景"
      description: "Show the practical scenarios this would unlock or simplify. / 说明该能力能解决或简化哪些实际场景。"
    validations:
      required: true
  - type: textarea
    id: api-sketch
    attributes:
      label: "API or Design Sketch / API 或设计草图"
      description: "Optional but helpful: provide pseudocode, API examples, or a rough design. / 可选但强烈建议:补充伪代码、API 示例或设计草图。"
      render: csharp
  - type: textarea
    id: alternatives
    attributes:
      label: "Alternatives Considered / 已考虑的替代方案"
      description: "Describe current workarounds or alternatives and why they are insufficient. / 描述现有替代方案或绕过方式,以及为什么不足。"
  - type: textarea
    id: compatibility
    attributes:
      label: "Compatibility and Migration Impact / 兼容性与迁移影响"
      description: "State whether this needs breaking changes, opt-in behavior, or migration notes. / 说明该建议是否涉及破坏性变更、显式开关或迁移说明。"
.github/ISSUE_TEMPLATE/03-documentation.yml (64 lines, vendored)
@ -1,64 +0,0 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: "Documentation / 文档改进"
description: "Report missing, outdated, or unclear documentation. / 报告缺失、过期或不清晰的文档。"
title: "[Docs]: "
body:
  - type: markdown
    attributes:
      value: |
        Documentation issues are product issues in this repository.

        文档问题同样是产品问题。请尽量指出具体页面、段落和建议修正方向,方便快速处理。
  - type: checkboxes
    id: checks
    attributes:
      label: "Pre-Submission Checks / 提交前检查"
      description: "Please confirm the following items before submitting. / 提交前请确认以下事项。"
      options:
        - label: "I searched existing issues and did not find the same documentation problem. / 我已搜索现有 Issue,未发现相同文档问题。"
          required: true
        - label: "I checked the latest docs site or repository docs pages first. / 我已先检查最新文档站点或仓库文档页面。"
          required: true
  - type: input
    id: page
    attributes:
      label: "Document Path or URL / 文档路径或链接"
      description: "Provide the file path or docs URL if you know it. / 如果知道,请提供文档文件路径或页面链接。"
      placeholder: "e.g. docs/zh-CN/core/architecture.md"
    validations:
      required: true
  - type: dropdown
    id: doc-issue-type
    attributes:
      label: "Issue Type / 问题类型"
      description: "Choose the primary documentation problem. / 请选择主要问题类型。"
      options:
        - "Missing content / 缺少内容"
        - "Outdated content / 内容过期"
        - "Incorrect content / 内容错误"
        - "Unclear explanation / 说明不清晰"
        - "Missing example / 缺少示例"
        - "Translation issue / 翻译问题"
    validations:
      required: true
  - type: textarea
    id: current-problem
    attributes:
      label: "Current Problem / 当前问题"
      description: "Describe what is confusing, wrong, or missing. / 说明当前哪里令人困惑、错误或缺失。"
    validations:
      required: true
  - type: textarea
    id: expected-docs
    attributes:
      label: "Expected Improvement / 期望改进"
      description: "Describe the improvement you expect. / 说明你期望如何改进。"
    validations:
      required: true
  - type: textarea
    id: references
    attributes:
      label: "Related Code or References / 相关代码或参考资料"
      description: "Link related source files, PRs, issues, or external references if helpful. / 如有帮助,请附上相关源码、PR、Issue 或外部参考资料。"
.github/ISSUE_TEMPLATE/04-question.yml (69 lines, vendored)
@ -1,69 +0,0 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: "Question / 使用咨询"
description: "Ask for guidance about usage, behavior, or adoption. / 询问用法、行为或接入方式。"
title: "[Question]: "
body:
  - type: markdown
    attributes:
      value: |
        Use this form when your question is specific to GFramework behavior, APIs, or adoption guidance.

        如果你的问题与 GFramework 的行为、API 或接入方式直接相关,请使用此模板。一般咨询请先查看 README、贡献指南与 docs。
  - type: checkboxes
    id: checks
    attributes:
      label: "Pre-Submission Checks / 提交前检查"
      description: "Please confirm the following items before submitting. / 提交前请确认以下事项。"
      options:
        - label: "I searched existing issues and read the relevant docs first. / 我已先搜索现有 Issue 并阅读相关文档。"
          required: true
        - label: "This is not a private support request or unrelated general programming question. / 这不是私有支持请求,也不是与本项目无关的泛编程问题。"
          required: true
  - type: dropdown
    id: topic
    attributes:
      label: "Topic Area / 主题领域"
      description: "Choose the area closest to your question. / 请选择最接近问题的主题。"
      options:
        - "Architecture / 架构"
        - "Core APIs / Core API"
        - "Game Module / Game 模块"
        - "Godot Integration / Godot 集成"
        - "Source Generators / 源生成器"
        - "Build / Packaging / 构建与打包"
        - "Docs / 文档"
        - "Other / 其他"
    validations:
      required: true
  - type: textarea
    id: goal
    attributes:
      label: "What Are You Trying To Do? / 你想实现什么?"
      description: "Explain your goal before describing the problem. / 请先说明你的目标,再描述遇到的问题。"
      placeholder: "I want to... / 我想要……"
    validations:
      required: true
  - type: textarea
    id: current-attempt
    attributes:
      label: "Current Attempt / 当前尝试"
      description: "Show what you already tried, including code, docs, or configuration. / 说明你已经尝试过什么,包括代码、文档或配置。"
      render: csharp
    validations:
      required: true
  - type: textarea
    id: question
    attributes:
      label: "Specific Question / 具体问题"
      description: "Ask the narrowest question that would unblock you. / 提出能真正帮你解阻的最小问题。"
    validations:
      required: true
  - type: textarea
    id: environment
    attributes:
      label: "Relevant Environment / 相关环境"
      description: "Include the framework version, runtime, engine version, or project context. If not applicable, write N/A. / 请补充框架版本、运行时、引擎版本或项目上下文;如不适用请填写 N/A。"
    validations:
      required: true
.github/ISSUE_TEMPLATE/config.yml (14 lines, vendored)
@ -1,14 +0,0 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

blank_issues_enabled: false
contact_links:
  - name: "Search Existing Issues / 搜索现有 Issues"
    url: "https://github.com/GeWuYou/GFramework/issues?q=is%3Aissue"
    about: "Check whether your topic has already been reported or discussed. / 先确认是否已有相同问题或讨论。"
  - name: "Read Contribution Guide / 阅读贡献指南"
    url: "https://github.com/GeWuYou/GFramework/blob/main/docs/zh-CN/contributing.md"
    about: "Review issue and pull request expectations before submitting. / 提交前先阅读 Issue 与 PR 的协作约定。"
  - name: "Browse Documentation / 查看文档"
    url: "https://github.com/GeWuYou/GFramework/tree/main/docs/zh-CN"
    about: "Read docs, tutorials, and troubleshooting pages first. / 先查看文档、教程与排障页面。"
.github/actions/validate-pat/action.yml (69 lines, vendored)
@ -1,69 +0,0 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: Validate PAT
description: Validate that the release PAT can access the repository and push tags.

inputs:
  pat-token:
    description: Personal access token used by semantic-release.
    required: true
  repo-api-url:
    description: GitHub repository API URL, for example https://api.github.com/repos/owner/repo.
    required: true
  repository:
    description: Repository slug used in error messages.
    required: true
  missing-token-message:
    description: Error message emitted when the PAT is absent.
    required: true

runs:
  using: composite
  steps:
    - name: Validate PAT can push
      shell: bash
      env:
        PAT_TOKEN: ${{ inputs.pat-token }}
        REPO_API_URL: ${{ inputs.repo-api-url }}
        REPOSITORY: ${{ inputs.repository }}
        MISSING_TOKEN_MESSAGE: ${{ inputs.missing-token-message }}
      run: |
        if [ -z "${PAT_TOKEN}" ]; then
          echo "::error::${MISSING_TOKEN_MESSAGE}"
          exit 1
        fi

        response_file="$(mktemp)"
        trap 'rm -f "${response_file}"' EXIT

        status_code="$(
          curl -sS -o "${response_file}" -w "%{http_code}" \
            -H "Authorization: Bearer ${PAT_TOKEN}" \
            -H "Accept: application/vnd.github+json" \
            -H "X-GitHub-Api-Version: 2022-11-28" \
            "${REPO_API_URL}"
        )"

        case "${status_code}" in
          200)
            # The repository endpoint returns 200 for read-only tokens as well.
            # semantic-release still performs a remote push probe, so require push permission here.
            push_ok="$(jq -r '.permissions.push // false' "${response_file}")"
            if [ "${push_ok}" != "true" ]; then
              echo "::error::PAT_TOKEN can read ${REPOSITORY} but lacks push permission. semantic-release requires contents:write."
              cat "${response_file}"
              exit 1
            fi
            ;;
          401|403)
            echo "::error::PAT_TOKEN is invalid or lacks access to ${REPOSITORY} (HTTP ${status_code})."
            cat "${response_file}"
            exit 1
            ;;
          *)
            echo "::error::Failed to validate PAT_TOKEN against ${REPO_API_URL} (HTTP ${status_code})."
            cat "${response_file}"
            exit 1
            ;;
        esac
99
.github/cliff.toml
vendored
99
.github/cliff.toml
vendored
@ -1,99 +0,0 @@
|
|||||||
[remote.github]
owner = "GeWuYou"
repo = "GFramework"

[changelog]
header = ""

body = """
{%- macro remote_url() -%}
https://github.com/{{ remote.github.owner }}/{{ remote.github.repo }}
{%- endmacro -%}

{% macro has_release_highlight(commit) -%}
{%- set highlighted = false -%}
{%- if commit.remote and commit.remote.pr_labels -%}
{%- for label in commit.remote.pr_labels -%}
{%- if label == "release-highlight" or label == "highlight" -%}
{%- set highlighted = true -%}
{%- endif -%}
{%- endfor -%}
{%- endif -%}
{%- if not highlighted and commit.footers -%}
{%- for footer in commit.footers -%}
{%- if footer.token == "Release-Highlight" and footer.value | trim == "true" -%}
{%- set highlighted = true -%}
{%- endif -%}
{%- endfor -%}
{%- endif -%}
{{ highlighted }}
{%- endmacro %}

{% macro print_commit(commit) -%}
- {{ commit.message | split(pat="\n") | first | trim | upper_first }}{% if commit.remote and commit.remote.username %} by @{{ commit.remote.username }}{% elif commit.author.name %} by {{ commit.author.name }}{% endif %}{% if commit.remote and commit.remote.pr_number %} in [#{{ commit.remote.pr_number }}]({{ self::remote_url() }}/pull/{{ commit.remote.pr_number }}){% endif %}
{%- endmacro %}

{% if version -%}
## {{ version }} ({{ timestamp | date(format="%Y-%m-%d") }})
{% else -%}
## 未发布
{% endif %}

{% set highlights = commits | filter(attribute="breaking", value=true) %}
{% for commit in commits -%}
{% if self::has_release_highlight(commit=commit) == "true" -%}
{% set_global highlights = highlights | concat(with=commit) -%}
{% endif -%}
{% endfor -%}

{% if highlights | length > 0 -%}
## 重点条目
{% for commit in highlights -%}
{{ self::print_commit(commit=commit) }}
{% endfor %}

{% endif -%}

{% if commits | length > 0 -%}
## What's Changed

{% for group, commits in commits | group_by(attribute="group") -%}
### {{ group | striptags | trim }}
{% for commit in commits -%}
{{ self::print_commit(commit=commit) }}
{% endfor %}

{% endfor -%}
{% endif -%}

{% if previous and previous.version and version -%}
Full Changelog: [{{ previous.version }}...{{ version }}]({{ self::remote_url() }}/compare/{{ previous.version }}...{{ version }})
{% endif -%}
"""

footer = ""

[git]
conventional_commits = true
filter_unconventional = true
split_commits = false
protect_breaking_commits = false
sort_commits = "oldest"

commit_parsers = [
  { message = ".*\\[skip changelog\\].*", skip = true },
  { body = ".*\\[skip changelog\\].*", skip = true },
  { message = "^feat", group = "<!-- 0 -->✨ 新功能" },
  { message = "^fix", group = "<!-- 1 -->🐛 Bug 修复" },
  { message = "^perf", group = "<!-- 2 -->⚡ 优化" },
  { message = "^refactor", group = "<!-- 2 -->⚡ 优化" },
  { message = "^docs", group = "<!-- 3 -->📝 文档/其他" },
  { message = "^test", group = "<!-- 3 -->📝 文档/其他" },
  { message = "^chore", group = "<!-- 3 -->📝 文档/其他" },
  { message = "^build", group = "<!-- 3 -->📝 文档/其他" },
  { message = "^ci", group = "<!-- 3 -->📝 文档/其他" },
  { message = "^style", group = "<!-- 3 -->📝 文档/其他" }
]

[git.github]
commits = true
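The `print_commit` macro in the cliff.toml above renders one Markdown bullet per commit. A rough shell stand-in for its output format is sketched below; every field value is hypothetical, and the `upper_first` filter is left out for brevity.

```bash
# Approximate the Markdown line produced by cliff.toml's print_commit macro
# for a commit that has a remote username and an associated PR number.
remote_url="https://github.com/GeWuYou/GFramework"
subject="Feat: add highlight filtering"
username="GeWuYou"
pr_number=123

line="- ${subject} by @${username} in [#${pr_number}](${remote_url}/pull/${pr_number})"
echo "${line}"
```

When a commit has no PR number, the macro simply omits the trailing `in [#…]` link, which matches the conditional blocks in the template.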
22  .github/dependabot.yml  vendored
@ -1,22 +0,0 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

version: 2
updates:
  # ===== NuGet 依赖(所有项目)=====
  - package-ecosystem: "nuget"
    directory: "/"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 5
    ignore:
      # 忽略所有依赖的大版本升级(框架项目非常建议)
      - dependency-name: "*"
        update-types:
          - "version-update:semver-major"

  # ===== GitHub Actions =====
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
271  .github/workflows/auto-tag.yml  vendored
@ -1,187 +1,112 @@
main version:

# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: Semantic Release Version and Tag

on:
  workflow_dispatch:

concurrency:
  group: semantic-release-main
  cancel-in-progress: false

jobs:
  preview:
    if: >
      github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: read
    outputs:
      published: ${{ steps.semantic_release.outputs.new_release_published }}
      last_tag: ${{ steps.semantic_release.outputs.last_release_git_tag }}
      next_version: ${{ steps.semantic_release.outputs.new_release_version }}
      next_tag: ${{ steps.semantic_release.outputs.new_release_git_tag }}
    steps:
      - name: Checkout code
        uses: actions/checkout@v6
        with:
          fetch-depth: 0
          persist-credentials: false
          ref: ${{ github.sha }}

      # semantic-release 在 dry-run 中仍会执行一次 git push --dry-run 权限探测。
      # 这里提前要求与正式 release 相同的 PAT,避免 github-actions[bot] 因只读上下文触发误导性的 403。
      - name: Validate PAT token
        uses: ./.github/actions/validate-pat
        with:
          pat-token: ${{ secrets.PAT_TOKEN }}
          repo-api-url: ${{ github.api_url }}/repos/${{ github.repository }}
          repository: ${{ github.repository }}
          missing-token-message: PAT_TOKEN is required because semantic-release preview performs a git push --dry-run permission check.

      # preview 始终先运行,用于给当前 SHA 生成待发布版本预览。
      - name: Semantic release preview
        id: semantic_release
        uses: cycjimmy/semantic-release-action@v6
        with:
          dry_run: true
          ci: false
          extra_plugins: |
            conventional-changelog-conventionalcommits@9.1.0
        env:
          GITHUB_TOKEN: ${{ secrets.PAT_TOKEN }}

      - name: Show preview result
        run: |
          echo "published=${{ steps.semantic_release.outputs.new_release_published }}"
          echo "last_tag=${{ steps.semantic_release.outputs.last_release_git_tag }}"
          echo "next_version=${{ steps.semantic_release.outputs.new_release_version }}"
          echo "next_tag=${{ steps.semantic_release.outputs.new_release_git_tag }}"

      - name: Generate preview release notes
        if: ${{ steps.semantic_release.outputs.new_release_published == 'true' }}
        id: cliff_preview
        uses: orhun/git-cliff-action@v4
        with:
          config: .github/cliff.toml
          args: >-
            -vv --unreleased --strip header
            --tag "${{ steps.semantic_release.outputs.new_release_git_tag }}"
        env:
          OUTPUT: PREVIEW_RELEASE_NOTES.md
          GITHUB_REPO: ${{ github.repository }}
          GITHUB_TOKEN: ${{ github.token }}

      - name: Write preview summary
        env:
          RELEASE_PUBLISHED: ${{ steps.semantic_release.outputs.new_release_published }}
          CLIFF_RELEASE_NOTES: ${{ steps.cliff_preview.outputs.content }}
        run: |
          {
            echo "## Release Preview"
            echo
            echo "- Commit: \`${{ github.sha }}\`"
            echo "- Release needed: \`${{ steps.semantic_release.outputs.new_release_published }}\`"
            echo "- Last tag: \`${{ steps.semantic_release.outputs.last_release_git_tag }}\`"
            echo "- Next version: \`${{ steps.semantic_release.outputs.new_release_version }}\`"
            echo "- Next tag: \`${{ steps.semantic_release.outputs.new_release_git_tag }}\`"
            echo "- Preview auth: uses \`PAT_TOKEN\` because semantic-release dry-run still performs a remote push permission probe."
            echo "- Snapshot semantics: this preview is pinned to dispatch SHA \`${{ github.sha }}\`; commits added to \`main\` after the run starts are not included."
            if [ "${RELEASE_PUBLISHED}" = "true" ] && [ -n "${CLIFF_RELEASE_NOTES}" ]; then
              echo
              echo "### 候选发布说明"
              echo
              printf '%s\n' "${CLIFF_RELEASE_NOTES}"
            fi
            echo
            echo "If the version looks correct, approve the \`release-approval\` environment to continue."
          } >> "${GITHUB_STEP_SUMMARY}"

  release:
    if: >
      github.ref == 'refs/heads/main' &&
      needs.preview.result == 'success' &&
      needs.preview.outputs.published == 'true'
    needs:
      - preview
    runs-on: ubuntu-latest
    permissions:
      contents: write
      pull-requests: read
    environment:
      name: release-approval
    steps:
      - name: Checkout code
        uses: actions/checkout@v6
        with:
          fetch-depth: 0
          persist-credentials: false
          ref: ${{ github.sha }}

      - name: Validate PAT token
        uses: ./.github/actions/validate-pat
        with:
          pat-token: ${{ secrets.PAT_TOKEN }}
          repo-api-url: ${{ github.api_url }}/repos/${{ github.repository }}
          repository: ${{ github.repository }}
          missing-token-message: PAT_TOKEN is required because a tag created with GITHUB_TOKEN will not trigger publish.yml.

      - name: Semantic release
        id: semantic_release
        uses: cycjimmy/semantic-release-action@v6
        with:
          dry_run: false
          extra_plugins: |
            conventional-changelog-conventionalcommits@9.1.0
        env:
          GITHUB_TOKEN: ${{ secrets.PAT_TOKEN }}

      - name: Show release result
        run: |
          echo "published=${{ steps.semantic_release.outputs.new_release_published }}"
          echo "preview_last_tag=${{ needs.preview.outputs.last_tag }}"
          echo "preview_next_version=${{ needs.preview.outputs.next_version }}"
          echo "preview_next_tag=${{ needs.preview.outputs.next_tag }}"
          echo "last_tag=${{ steps.semantic_release.outputs.last_release_git_tag }}"
          echo "next_version=${{ steps.semantic_release.outputs.new_release_version }}"
          echo "next_tag=${{ steps.semantic_release.outputs.new_release_git_tag }}"

      - name: Generate published release notes
        if: ${{ steps.semantic_release.outputs.new_release_published == 'true' }}
        id: cliff_release
        uses: orhun/git-cliff-action@v4
        with:
          config: .github/cliff.toml
          args: >-
            -vv --latest --strip header
        env:
          OUTPUT: PUBLISHED_RELEASE_NOTES.md
          GITHUB_REPO: ${{ github.repository }}
          GITHUB_TOKEN: ${{ github.token }}

      - name: Write release summary
        env:
          RELEASE_PUBLISHED: ${{ steps.semantic_release.outputs.new_release_published }}
          CLIFF_RELEASE_NOTES: ${{ steps.cliff_release.outputs.content }}
        run: |
          {
            echo "## Release Publish"
            echo
            echo "- Commit: \`${{ github.sha }}\`"
            echo "- Preview last tag: \`${{ needs.preview.outputs.last_tag }}\`"
            echo "- Preview next version: \`${{ needs.preview.outputs.next_version }}\`"
            echo "- Preview next tag: \`${{ needs.preview.outputs.next_tag }}\`"
            echo "- Published: \`${{ steps.semantic_release.outputs.new_release_published }}\`"
            echo "- Last tag: \`${{ steps.semantic_release.outputs.last_release_git_tag }}\`"
            echo "- Next version: \`${{ steps.semantic_release.outputs.new_release_version }}\`"
            echo "- Next tag: \`${{ steps.semantic_release.outputs.new_release_git_tag }}\`"
            echo "- Snapshot semantics: this publish run still uses dispatch SHA \`${{ github.sha }}\`; commits added to \`main\` after the preview started are excluded."
            if [ "${RELEASE_PUBLISHED}" = "true" ] && [ -n "${CLIFF_RELEASE_NOTES}" ]; then
              echo
              echo "### 已发布说明"
              echo
              printf '%s\n' "${CLIFF_RELEASE_NOTES}"
            fi
          } >> "${GITHUB_STEP_SUMMARY}"

v0.0.48 version:

name: Auto Increment Version and Tag

# 工作流触发条件配置
# 当向 main 或 master 分支推送代码时触发
# 或者当针对 main 或 master 的 PR 被合并关闭时触发
on:
  push:
    branches: [ main, master ]
  pull_request:
    branches: [ main, master ]
    types: [ closed ]

jobs:
  auto-tag:
    name: Auto Increment Version and Create Tag
    runs-on: ubuntu-latest
    # 条件判断:仅在推送事件或已合并的 PR 关闭事件中执行
    if: github.event_name == 'push' || (github.event_name == 'pull_request' && github.event.action == 'closed' && github.event.pull_request.merged == true)
    permissions:
      contents: write
    steps:
      # 步骤一:检出仓库代码
      # 使用 actions/checkout 动作获取完整的 Git 历史用于查找已有标签
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          lfs: true
          fetch-depth: 0 # 获取全部历史记录以查找现有标签
          persist-credentials: false

      # 步骤二:检查是否需要跳过打标签操作
      # 根据最新提交信息决定是否继续后续流程
      - name: Check for skip keyword
        id: check_skip
        run: |
          # 检查最近一次提交信息是否包含跳过关键词
          LAST_COMMIT_MSG=$(git log -1 --pretty=format:"%B")
          if [[ "$LAST_COMMIT_MSG" == *"[skip release]"* ]] || [[ "$LAST_COMMIT_MSG" == *"[no tag]"* ]]; then
            echo "skip_tag=true" >> $GITHUB_OUTPUT
            echo "Skipping tag creation due to skip keyword in commit message"
          else
            echo "skip_tag=false" >> $GITHUB_OUTPUT
            echo "No skip keyword found, proceeding with tag creation"
          fi
          echo "Last commit message: $LAST_COMMIT_MSG"

      # 步骤三:计算下一个版本号(若未被跳过)
      # 自动解析当前最新标签并递增修订号生成新的语义化版本号
      - name: Get next version
        id: get_next_version
        if: steps.check_skip.outputs.skip_tag == 'false'
        run: |
          # 获取最新的标签版本号,如果没有标签则默认为 0.0.0
          LATEST_TAG=$(git describe --tags --abbrev=0 2>/dev/null || echo "v0.0.0")

          # 移除可能存在的 v 前缀
          VERSION_NUM=${LATEST_TAG#v}

          # 解析主版本号、次版本号和修订号
          MAJOR=$(echo $VERSION_NUM | cut -d. -f1)
          MINOR=$(echo $VERSION_NUM | cut -d. -f2)
          PATCH=$(echo $VERSION_NUM | cut -d. -f3)

          # 递增修订号
          PATCH=$((PATCH + 1))

          # 构造新版本号
          NEW_VERSION="$MAJOR.$MINOR.$PATCH"
          NEW_TAG="v$NEW_VERSION"

          echo "latest_tag=$LATEST_TAG"
          echo "new_version=$NEW_VERSION"
          echo "new_tag=$NEW_TAG"

          echo "latest_tag=$LATEST_TAG" >> $GITHUB_OUTPUT
          echo "new_version=$NEW_VERSION" >> $GITHUB_OUTPUT
          echo "new_tag=$NEW_TAG" >> $GITHUB_OUTPUT

      # 步骤四:创建并推送新标签到远程仓库(若未被跳过)
      # 使用个人访问令牌(PAT)进行身份验证完成推送操作
      - name: Create tag and push (using PAT)
        if: steps.check_skip.outputs.skip_tag == 'false'
        env:
          PAT: ${{ secrets.PAT_TOKEN }}
          REPO: ${{ github.repository }}
          NEW_TAG: ${{ steps.get_next_version.outputs.new_tag }}
        run: |
          set -e
          git config --local user.email "action@github.com"
          git config --local user.name "GitHub Action"

          echo "Creating annotated tag $NEW_TAG"
          git tag -a "$NEW_TAG" -m "Auto-generated tag: $NEW_TAG"

          # 推送单个 tag,使用 PAT 作为 HTTPS token
          echo "Pushing tag $NEW_TAG to origin using PAT"
          git push "https://x-access-token:${PAT}@github.com/${REPO}.git" "refs/tags/${NEW_TAG}"

      # 步骤五:输出本次成功创建的新版本相关信息(若未被跳过)
      - name: Print version info
        if: steps.check_skip.outputs.skip_tag == 'false'
        run: |
          echo "Previous tag was: ${{ steps.get_next_version.outputs.latest_tag }}"
          echo "New tag created: ${{ steps.get_next_version.outputs.new_tag }}"
          echo "Version number: ${{ steps.get_next_version.outputs.new_version }}"

      # 步骤六:输出跳过原因信息(如果检测到了跳过关键字)
      - name: Print skip info
        if: steps.check_skip.outputs.skip_tag == 'true'
        run: |
          echo "Tag creation skipped due to commit message containing skip keyword"
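The old v0.0.48 workflow's version bump is a pure patch increment. It can be reproduced standalone, assuming a hypothetical latest tag of `v0.0.48` in place of the `git describe` lookup:

```bash
# Standalone sketch of the old auto-tag bump: strip the "v" prefix, split the
# semver fields, and increment only the patch number.
LATEST_TAG="v0.0.48"
VERSION_NUM=${LATEST_TAG#v}                    # remove leading "v" -> 0.0.48
MAJOR=$(echo "$VERSION_NUM" | cut -d. -f1)
MINOR=$(echo "$VERSION_NUM" | cut -d. -f2)
PATCH=$(echo "$VERSION_NUM" | cut -d. -f3)
PATCH=$((PATCH + 1))                           # patch-only increment
NEW_TAG="v$MAJOR.$MINOR.$PATCH"
echo "$NEW_TAG"                                # prints v0.0.49
```

This is also why the repository moved to semantic-release on `main`: the patch-only bump ignores commit semantics, so `feat` and breaking changes could never raise the minor or major version.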
71  .github/workflows/benchmark.yml  vendored
@ -1,71 +0,0 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: Benchmark

on:
  workflow_dispatch:
    inputs:
      benchmark_filter:
        description: '可选的 BenchmarkDotNet 过滤器;留空时仅执行 benchmark 项目 Release build'
        required: false
        default: ''
        type: string

permissions:
  contents: read

jobs:
  benchmark:
    name: Benchmark Build Or Run
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v6
        with:
          fetch-depth: 0

      - name: Setup .NET 10
        uses: actions/setup-dotnet@v5
        with:
          dotnet-version: 10.0.x

      - name: Cache NuGet packages
        uses: actions/cache@v5
        with:
          path: |
            ~/.nuget/packages
            ~/.local/share/NuGet
          key: ${{ runner.os }}-nuget-benchmarks-${{ hashFiles('GFramework.Cqrs.Benchmarks/*.csproj', 'GFramework.Cqrs/*.csproj', 'GFramework.Cqrs.Abstractions/*.csproj', 'GFramework.Core/*.csproj', 'GFramework.Core.Abstractions/*.csproj', '**/nuget.config') }}

      - name: Restore benchmark project
        run: dotnet restore GFramework.Cqrs.Benchmarks/GFramework.Cqrs.Benchmarks.csproj

      - name: Build benchmark project
        run: dotnet build GFramework.Cqrs.Benchmarks/GFramework.Cqrs.Benchmarks.csproj -c Release --no-restore

      - name: Report build-only mode
        if: ${{ inputs.benchmark_filter == '' }}
        run: |
          echo "No benchmark filter provided."
          echo "Workflow completed after validating the benchmark project build."

      - name: Run filtered benchmarks
        if: ${{ inputs.benchmark_filter != '' }}
        env:
          BENCHMARK_FILTER: ${{ inputs.benchmark_filter }}
        run: |
          set -euo pipefail
          dotnet run --project GFramework.Cqrs.Benchmarks/GFramework.Cqrs.Benchmarks.csproj -c Release --no-build -- \
            --filter "$BENCHMARK_FILTER"

      - name: Upload BenchmarkDotNet artifacts
        if: ${{ always() && inputs.benchmark_filter != '' }}
        uses: actions/upload-artifact@v7
        with:
          name: benchmark-artifacts
          path: |
            BenchmarkDotNet.Artifacts/**
            GFramework.Cqrs.Benchmarks/bin/Release/net10.0/BenchmarkDotNet.Artifacts/**
          if-no-files-found: ignore
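The benchmark workflow branches on whether `benchmark_filter` is empty: an empty input stops after a Release build, a non-empty one runs the filtered benchmarks. A minimal sketch of that decision, with the `dotnet` invocations replaced by echoed strings and `run_benchmarks` a hypothetical helper name:

```bash
# Mirror the workflow's build-only vs. filtered-run branch on benchmark_filter.
run_benchmarks() {
  local filter="$1"
  if [ -z "$filter" ]; then
    echo "build-only"                # empty filter: stop after Release build
  else
    echo "run --filter $filter"     # non-empty: pass filter to BenchmarkDotNet
  fi
}

run_benchmarks ""
run_benchmarks "*CqrsDispatch*"
```

Keeping the empty-filter path as a plain build check lets the workflow double as a cheap compile validation for the benchmark project without burning runner time on full benchmark runs.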
304  .github/workflows/ci.yml  vendored
@ -1,304 +0,0 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

# CI/CD工作流配置:构建和测试.NET项目
# 该工作流仅在创建或更新面向任意分支的 pull request 时触发
name: CI - Build & Test

on:
  pull_request:
    branches: [ '**' ]

permissions:
  contents: read
  pull-requests: write
  security-events: write

jobs:
  # 代码质量检查 job(并行执行,不阻塞构建)
  code-quality:
    name: Code Quality & Security
    runs-on: ubuntu-latest

    steps:
      # 检出源代码,设置fetch-depth为0以获取完整的git历史
      - name: Checkout code
        uses: actions/checkout@v6
        with:
          fetch-depth: 0

      # 校验C#命名空间与源码目录是否符合命名规范
      - name: Validate C# naming
        run: bash scripts/validate-csharp-naming.sh

      # 校验仓库维护源码是否包含 Apache-2.0 文件头声明
      - name: Validate license headers
        run: python3 scripts/license-header.py --check

      - name: Validate runtime-generator boundaries
        run: python3 scripts/validate-runtime-generator-boundaries.py

      # 缓存MegaLinter
      - name: Cache MegaLinter
        uses: actions/cache@v5
        with:
          path: ~/.cache/megalinter
          key: ${{ runner.os }}-megalinter-v9
          restore-keys: |
            ${{ runner.os }}-megalinter-

      # MegaLinter扫描步骤
      # 执行代码质量检查和安全扫描,生成SARIF格式报告
      - name: MegaLinter
        uses: oxsecurity/megalinter@v9.4.0
        continue-on-error: true
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          FAIL_ON_ERROR: ${{ github.ref == 'refs/heads/main' }}

      # 上传SARIF格式的安全和代码质量问题报告到GitHub安全中心
      - name: Upload SARIF
        uses: github/codeql-action/upload-sarif@v4
        with:
          sarif_file: megalinter-reports/sarif

      # 缓存TruffleHog
      - name: Cache TruffleHog
        uses: actions/cache@v5
        with:
          path: ~/.cache/trufflehog
          key: ${{ runner.os }}-trufflehog

      # TruffleHog OSS 扫描步骤
      # 使用 TruffleHog 工具扫描代码库中的敏感信息泄露,如API密钥、密码等
      # 该步骤会比较基础分支和当前提交之间的差异,检测新增内容中是否包含敏感数据
      - name: TruffleHog OSS
        uses: trufflesecurity/trufflehog@v3.95.2
        with:
          # 扫描路径,. 表示扫描整个仓库
          path: .
          # 基础提交哈希,用于与当前提交进行比较
          base: ${{ github.event.pull_request.base.sha }}
          # 当前提交哈希,作为扫描的目标版本
          head: ${{ github.event.pull_request.head.sha }}

  # 构建和测试 job(并行执行)
  build-and-test:
    name: Build and Test
    runs-on: ubuntu-latest

    steps:
      # 检出源代码
      - name: Checkout code
        uses: actions/checkout@v6
        with:
          fetch-depth: 0

      # 安装和配置.NET SDK版本
      - name: Setup .NET 8
        uses: actions/setup-dotnet@v5
        with:
          dotnet-version: 8.0.x

      - name: Setup .NET 9
        uses: actions/setup-dotnet@v5
        with:
          dotnet-version: 9.0.x

      - name: Setup .NET 10
        uses: actions/setup-dotnet@v5
        with:
          dotnet-version: 10.0.x

      # 配置NuGet包缓存以加速后续构建
      - name: Cache NuGet packages
        uses: actions/cache@v5
        with:
          path: |
            ~/.nuget/packages
            ~/.local/share/NuGet
          key: ${{ runner.os }}-nuget-${{ hashFiles('**/*.csproj', '**/nuget.config') }}

      # 配置.NET本地工具缓存以加速后续构建
      - name: Cache dotnet tools
        uses: actions/cache@v5
        with:
          path: ~/.dotnet/tools
          key: ${{ runner.os }}-dotnet-tools-${{ hashFiles('.config/dotnet-tools.json') }}

      # 执行NuGet包恢复操作
      - name: Restore
        run: dotnet restore GFramework.sln
      # 恢复.NET本地工具
      - name: Restore .NET tools
        run: dotnet tool restore

      - name: Setup Node.js 20
        uses: actions/setup-node@v6
        with:
          node-version: 20

      - name: Setup Bun
        uses: oven-sh/setup-bun@v2
        with:
          bun-version: 1.2.15

      - name: Install config tool dependencies
        working-directory: tools/gframework-config-tool
        run: bun install

      - name: Run config tool tests
        working-directory: tools/gframework-config-tool
        run: bun run test

      # 构建项目,使用Release配置且跳过恢复步骤
      - name: Build
        run: dotnet build GFramework.sln -c Release --no-restore

      - name: Pack published modules
        run: |
          rm -rf ./packages
          dotnet pack GFramework.sln \
            -c Release \
            --no-build \
            --no-restore \
            -o ./packages \
            -p:IncludeSymbols=false

      - name: Validate packed modules
        run: bash scripts/validate-packed-modules.sh ./packages

      # 运行单元测试,输出TRX格式结果到TestResults目录
      # 顺序执行各测试项目,避免并发 dotnet test 进程导致“TRX 全绿但 step 仍返回失败”的假红状态
      - name: Test All Projects
        id: test_all_projects
        run: |
          set -euo pipefail
          mkdir -p TestResults

          test_projects=(
            "GFramework.Core.Tests/GFramework.Core.Tests.csproj:core"
            "GFramework.Game.Tests/GFramework.Game.Tests.csproj:game"
            "GFramework.SourceGenerators.Tests/GFramework.SourceGenerators.Tests.csproj:sg"
            "GFramework.Cqrs.Tests/GFramework.Cqrs.Tests.csproj:cqrs"
            "GFramework.Ecs.Arch.Tests/GFramework.Ecs.Arch.Tests.csproj:ecs-arch"
            "GFramework.Godot.Tests/GFramework.Godot.Tests.csproj:godot"
            "GFramework.Godot.SourceGenerators.Tests/GFramework.Godot.SourceGenerators.Tests.csproj:godot-sg"
          )

          failed=0
          failed_projects=()
          failed_log_paths=()

          for entry in "${test_projects[@]}"; do
            project="${entry%%:*}"
            name="${entry##*:}"
            log_path="TestResults/${name}.console.log"

            echo "::group::dotnet test $project"
            if ! dotnet test "$project" \
              -c Release \
              --no-build \
              --logger "trx;LogFileName=${name}.trx" \
              --results-directory TestResults \
              2>&1 | tee "$log_path"; then
              failed=1
              failed_projects+=("$project")
              failed_log_paths+=("$log_path")
              echo "::error title=Test project failed::$project returned a non-zero exit code."
            fi
            echo "::endgroup::"
          done

          if [ "$failed" -eq 1 ]; then
            printf 'Failed test projects:\n'
            printf '  %s\n' "${failed_projects[@]}"
          fi

          {
            echo "failed=$failed"
            echo "failed_projects<<EOF"
            if [ "${#failed_projects[@]}" -gt 0 ]; then
              printf '%s\n' "${failed_projects[@]}"
            fi
            echo "EOF"
            echo "failed_log_paths<<EOF"
            if [ "${#failed_log_paths[@]}" -gt 0 ]; then
              printf '%s\n' "${failed_log_paths[@]}"
            fi
            echo "EOF"
          } >> "$GITHUB_OUTPUT"

      - name: Generate CTRF report
        run: |
          mkdir -p ctrf

          for trx in TestResults/*.trx; do
            name=$(basename "$trx" .trx)
            echo "Processing $trx -> ctrf/$name.json"

            dotnet tool run DotnetCtrfJsonReporter \
              -p "$trx" \
              -t nunit \
              -d ctrf \
              -f "$name.json"
          done

      - name: Run GFramework.Godot.Tests Diagnostics
        if: always() && contains(steps.test_all_projects.outputs.failed_projects, 'GFramework.Godot.Tests/GFramework.Godot.Tests.csproj')
        continue-on-error: true
        run: |
          mkdir -p TestResults
          dotnet test GFramework.Godot.Tests/GFramework.Godot.Tests.csproj \
            -c Release \
            --no-build \
            --blame-crash \
            --diag TestResults/godot-testhost-diag.log \
            --logger "trx;LogFileName=godot-diagnostic.trx" \
            --results-directory TestResults \
            2>&1 | tee TestResults/godot-diagnostic.console.log

      # 生成并发布测试报告,无论测试成功或失败都会执行
      - name: Test Report
        uses: dorny/test-reporter@v3
        if: always()
        with:
          name: .NET Test Results
          path: TestResults/*.trx
          reporter: dotnet-trx
      - name: Publish Test Report
        uses: ctrf-io/github-test-reporter@v1
        with:
          report-path: './ctrf/*.json'
          github-report: true
          pull-request-report: ${{ github.event.pull_request.head.repo.full_name == github.repository }}
          summary-delta-report: true
          insights-report: true
          flaky-rate-report: true
          fail-rate-report: true
          slowest-report: true
          upload-artifact: true
          fetch-previous-results: true
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        if: always()

      - name: Fail if any test project failed
        if: always() && steps.test_all_projects.outputs.failed == '1'
        env:
          FAILED_PROJECTS: ${{ steps.test_all_projects.outputs.failed_projects }}
          FAILED_LOG_PATHS: ${{ steps.test_all_projects.outputs.failed_log_paths }}
        run: |
          echo "The following test projects returned non-zero exit codes:"
          printf '%s\n' "$FAILED_PROJECTS"
          echo
          echo "Captured dotnet test output:"
          while IFS= read -r log_path; do
            if [ -n "$log_path" ] && [ -f "$log_path" ]; then
              echo "--- BEGIN $log_path ---"
              cat "$log_path"
              echo "--- END $log_path ---"
            fi
          done <<< "$FAILED_LOG_PATHS"
          exit 1
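The "Test All Projects" step above runs projects sequentially and aggregates failures instead of aborting on the first one. The core of that pattern can be sketched in isolation; `fake_test` is a hypothetical stand-in for `dotnet test`, and a flat string stands in for the workflow's bash arrays:

```bash
# Sequentially run each "project", record failures, and report at the end
# rather than stopping on the first non-zero exit code.
fake_test() { test "$1" != "bad"; }   # hypothetical: the project named "bad" fails

failed=0
failed_projects=""
for project in good bad also-good; do
  if ! fake_test "$project"; then
    failed=1
    failed_projects="${failed_projects}${project} "
  fi
done
echo "failed=$failed"
echo "failed_projects=${failed_projects}"
```

Deferring the `exit 1` to a final step (as the workflow does) means every project's TRX and console log is still produced and uploaded even when an early project fails.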
53  .github/workflows/codeql.yml  vendored
@ -1,53 +0,0 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

# GitHub Actions工作流配置:CodeQL静态代码分析
# 该工作流用于对C#项目进行安全漏洞和代码质量分析
name: "CodeQL"

# 触发事件配置
# 在以下情况下触发工作流:
# 1. 针对任意分支的拉取请求时
# 2. 每天凌晨2点执行一次
on:
  pull_request:
    branches: [ '**' ]
  schedule:
    - cron: '0 2 * * *'

jobs:
  # 分析任务配置
  # 对C#代码进行静态分析扫描
  analyze:
    name: Analyze (C#)
    runs-on: ubuntu-latest
    permissions:
      security-events: write
      contents: read

    steps:
      - name: Checkout repository
        uses: actions/checkout@v6
        with:
          fetch-depth: 0

      # 设置.NET运行时环境
      # 配置.NET 8.0.x版本支持
      - name: Setup .NET
        uses: actions/setup-dotnet@v5
        with:
          dotnet-version: |
            8.0.x

      # 初始化CodeQL分析环境
      # 配置C#语言支持并启用自动构建模式
      - name: Initialize CodeQL
        uses: github/codeql-action/init@v4
        with:
          languages: csharp
          build-mode: autobuild

      # 执行CodeQL代码分析
      # 运行静态分析并生成结果报告
      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@v4
128  .github/workflows/license-compliance.yml  vendored
@ -1,128 +0,0 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: License Compliance (Feluda)

on:
  push:
    tags:
      - '*'

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: false

permissions:
  contents: write

jobs:
  compliance:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v6

      # 使用Feluda许可证扫描器检查项目依赖的许可证合规性
      # 配置参数:
      # - project-license: 设置项目许可证为Apache-2.0
      # - fail-on-restrictive: 发现限制性许可证时失败
      # - fail-on-incompatible: 发现不兼容许可证时失败
      # - update-badge: 自动更新许可证徽章
      - name: Feluda License Scanner
        uses: anistark/feluda@v1.12.0
        with:
          project-license: 'Apache-2.0'
          fail-on-restrictive: false
          fail-on-incompatible: false
          update-badge: startsWith(github.ref, 'refs/tags/v')
      - name: Feluda License Scanner Incompatible Licenses
        run: |
          feluda --incompatible

      # 生成合规性文件(NOTICE / THIRD_PARTY_LICENSES)
      - name: Generate compliance files
        run: |
          echo "1" | feluda generate
          echo "2" | feluda generate

      # 生成 SBOM(SPDX + CycloneDX)
      - name: Generate SBOM
        run: |
          feluda sbom spdx --output sbom.spdx.json
          feluda sbom cyclonedx --output sbom.cyclonedx.json

      # 校验 SBOM
      - name: Validate SBOM files
        run: |
          feluda sbom validate sbom.spdx.json --output sbom-spdx-validation.txt
          feluda sbom validate sbom.cyclonedx.json --output sbom-cyclonedx-validation.txt

      # 上传合规产物到 GitHub Actions 工件存储
      # 此步骤将指定的合规文件打包并上传为工件,供后续流程使用
      # 参数说明:
      #   name: 步骤名称,用于标识该操作
      #   uses: 指定使用的 GitHub Action,此处为上传工件的官方动作
      #   with: 配置上传的具体内容
      #     name: 工件名称,用于标识上传的文件集合
      #     path: 指定需要上传的文件路径列表(支持多行格式)
      #     third-party-licenses/**: 手工维护的参考源码许可证原文
      - name: Upload compliance artifacts
        uses: actions/upload-artifact@v7
        with:
          name: license-compliance
          path: |
            NOTICE
            THIRD_PARTY_LICENSES.md
            third-party-licenses/**
            sbom.spdx.json
            sbom.cyclonedx.json
            sbom-spdx-validation.txt
            sbom-cyclonedx-validation.txt

      # 将合规文件打包为 ZIP 压缩包
      # 此步骤通过 zip 命令将多个合规文件压缩为一个 ZIP 文件,便于分发或存档
||||||
# 压缩包中包含以下文件:
|
|
||||||
# - NOTICE: 项目声明文件
|
|
||||||
# - THIRD_PARTY_LICENSES.md: 第三方许可证列表
|
|
||||||
# - third-party-licenses/: 手工维护的参考源码许可证原文
|
|
||||||
# - sbom.spdx.json: SPDX 格式的软件物料清单
|
|
||||||
# - sbom.cyclonedx.json: CycloneDX 格式的软件物料清单
|
|
||||||
# - sbom-spdx-validation.txt: SPDX 格式验证结果
|
|
||||||
# - sbom-cyclonedx-validation.txt: CycloneDX 格式验证结果
|
|
||||||
- name: Package compliance bundle
|
|
||||||
run: |
|
|
||||||
zip -r license-compliance.zip \
|
|
||||||
NOTICE \
|
|
||||||
THIRD_PARTY_LICENSES.md \
|
|
||||||
third-party-licenses \
|
|
||||||
sbom.spdx.json \
|
|
||||||
sbom.cyclonedx.json \
|
|
||||||
sbom-spdx-validation.txt \
|
|
||||||
sbom-cyclonedx-validation.txt
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
# 将合规产物上传至 GitHub Release
|
|
||||||
# 此步骤将指定的合规文件附加到当前标签对应的 GitHub Release 中
|
|
||||||
# 参数说明:
|
|
||||||
# name: 步骤名称,用于标识该操作
|
|
||||||
# uses: 指定使用的 GitHub Action,此处为发布 Release 的第三方动作
|
|
||||||
# with: 配置发布的具体信息
|
|
||||||
# tag_name: 指定 Release 对应的 Git 标签名
|
|
||||||
# files: 指定需要附加到 Release 的文件路径列表(支持多行格式)
|
|
||||||
# env: 设置环境变量
|
|
||||||
# GITHUB_TOKEN: GitHub 访问令牌,用于授权发布操作
|
|
||||||
- name: Upload compliance assets to GitHub Release
|
|
||||||
uses: softprops/action-gh-release@v3
|
|
||||||
with:
|
|
||||||
tag_name: ${{ github.ref_name }}
|
|
||||||
files: |
|
|
||||||
NOTICE
|
|
||||||
THIRD_PARTY_LICENSES.md
|
|
||||||
sbom.spdx.json
|
|
||||||
sbom.cyclonedx.json
|
|
||||||
sbom-spdx-validation.txt
|
|
||||||
sbom-cyclonedx-validation.txt
|
|
||||||
license-compliance.zip
|
|
||||||
env:
|
|
||||||
GITHUB_TOKEN: ${{ github.token }}
|
|
||||||
**.github/workflows/license-header-fix.yml** — deleted (54 lines, `@@ -1,54 +0,0 @@`):

```yaml
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

# Maintainer-triggered workflow that fixes missing Apache-2.0 file headers.
name: License Header Fix

on:
  workflow_dispatch:
    inputs:
      base_branch:
        description: Branch to fix and target with the generated pull request.
        required: true
        default: main

permissions:
  contents: write
  pull-requests: write

jobs:
  fix-license-headers:
    name: Create license header fix PR
    runs-on: ubuntu-latest

    steps:
      - name: Checkout target branch
        uses: actions/checkout@v6
        with:
          ref: ${{ inputs.base_branch }}

      - name: Add missing license headers
        run: python3 scripts/license-header.py --fix

      - name: Create pull request
        uses: peter-evans/create-pull-request@v8
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          base: ${{ inputs.base_branch }}
          branch: chore/license-headers-${{ github.run_id }}
          delete-branch: true
          commit-message: |
            chore(license): add missing Apache-2.0 file headers

            - Add missing license declarations to source files
            - Update the header-governance check results
          title: "chore(license): add missing Apache-2.0 file headers"
          body: |
            ## Summary

            - Add the missing Apache-2.0 headers to repository-maintained source and config files
            - Generated with `scripts/license-header.py --fix`

            ## Validation

            - `python3 scripts/license-header.py --check`
```
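The fix step above delegates to `scripts/license-header.py`, which is not part of this diff. As a hypothetical sketch of what such a line-based header checker might look like (the function names and the exact matching rule are assumptions, not the real script):

```python
# Hypothetical sketch of a header checker in the spirit of
# scripts/license-header.py (the real script is not shown in this diff).
# It verifies that a file starts with the Apache-2.0 header lines and can
# prepend them when missing (the --fix behavior).

HEADER = [
    "# Copyright (c) 2025-2026 GeWuYou",
    "# SPDX-License-Identifier: Apache-2.0",
]


def has_header(text: str) -> bool:
    """Return True if the content starts with the expected header lines."""
    lines = text.splitlines()
    return lines[: len(HEADER)] == HEADER


def add_header(text: str) -> str:
    """Prepend the header when missing; otherwise return the text unchanged."""
    if has_header(text):
        return text
    return "\n".join(HEADER) + "\n\n" + text


good = (
    "# Copyright (c) 2025-2026 GeWuYou\n"
    "# SPDX-License-Identifier: Apache-2.0\n\nprint('hi')\n"
)
bad = "print('hi')\n"
print(has_header(good))             # True
print(has_header(bad))              # False
print(has_header(add_header(bad)))  # True
```

A real implementation would also need per-extension comment syntax (`//`, `<!-- -->`, etc.), which this sketch ignores.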
**.github/workflows/publish-docs.yml** — deleted (110 lines, `@@ -1,110 +0,0 @@`):

```yaml
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

# Workflow: Publish Docs
# Builds the documentation and deploys it to GitHub Pages on tag push or manual trigger.
name: Publish Docs

# Triggers: pushing a tag that starts with 'v', or a manual run from the GitHub UI
on:
  push:
    tags:
      - 'v*'
  workflow_dispatch:

# Permissions required by the workflow:
# - contents: read (read repository contents)
# - pages: write (write to GitHub Pages)
# - id-token: write (issue the OIDC identity token)
permissions:
  contents: read
  pages: write
  id-token: write

# Concurrency: use the "pages" group and never cancel in-progress runs
concurrency:
  group: pages
  cancel-in-progress: false

jobs:
  build-and-deploy:
    # Run only for stable release tags (no prerelease suffix) or manual triggers
    if: |
      (startsWith(github.ref, 'refs/tags/v') && !contains(github.ref, '-'))
      || github.event_name == 'workflow_dispatch'

    runs-on: ubuntu-latest

    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}

    steps:
      - name: Checkout
        uses: actions/checkout@v6

      # Initialize deployment metadata per the official GitHub Pages flow.
      - name: Configure GitHub Pages
        uses: actions/configure-pages@v6

      # Install the Bun runtime
      - name: Setup Bun
        uses: oven-sh/setup-bun@v2
        with:
          bun-version: latest

      - name: Install Dependencies
        working-directory: docs
        run: bun install

      - name: Build VitePress
        working-directory: docs
        run: bun run build

      # Generate LLM index files
      - name: Make docs LLM ready
        uses: demodrive-ai/llms-txt-action@v1
        with:
          docs_dir: docs/.vitepress/dist
          sitemap_path: sitemap.xml
          skip_llms_txt: 'false'
          skip_llms_full_txt: 'false'
          skip_md_files: 'false'

      # Validate the LLM index artifacts before upload, so we never deploy a
      # Pages site where the step "succeeded" but the files are missing.
      - name: Verify LLM artifacts
        run: |
          test -f docs/.vitepress/dist/sitemap.xml
          test -f docs/.vitepress/dist/llms.txt
          test -f docs/.vitepress/dist/llms-full.txt

          md_count="$(find docs/.vitepress/dist -type f -name '*.md' | wc -l)"
          if [ "$md_count" -eq 0 ]; then
            echo "Expected llms-txt-action to generate page-level markdown files, but none were found."
            exit 1
          fi

          echo "Generated $md_count markdown files for LLM ingestion."

      # Upload the build output as the Pages deployment artifact
      - name: Upload Pages Artifact
        uses: actions/upload-pages-artifact@v5
        with:
          path: docs/.vitepress/dist

      # Deploy the documentation to GitHub Pages
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v5
```
**.github/workflows/publish-vscode-extension.yml** — deleted (97 lines, `@@ -1,97 +0,0 @@`):

```yaml
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: Publish VS Code Extension

on:
  workflow_dispatch:
    inputs:
      version:
        description: Extension version to publish, for example 0.1.0. Leave empty to use package.json or the pushed tag.
        required: false
        type: string
      publish_to_marketplace:
        description: Publish to the Visual Studio Marketplace after packaging.
        required: true
        type: boolean
        default: true
  push:
    tags:
      - 'gframework-config-tool-v*'

permissions:
  contents: read

jobs:
  publish:
    name: Package And Publish Marketplace Extension
    runs-on: ubuntu-latest

    defaults:
      run:
        working-directory: tools/gframework-config-tool

    steps:
      - name: Checkout repository
        uses: actions/checkout@v6
        with:
          fetch-depth: 0

      - name: Setup Node.js 20
        uses: actions/setup-node@v6
        with:
          node-version: 20

      - name: Setup Bun
        uses: oven-sh/setup-bun@v2
        with:
          bun-version: 1.2.15

      - name: Determine extension version
        id: version
        shell: bash
        run: |
          set -euo pipefail

          PACKAGE_VERSION=$(node -p "require('./package.json').version")
          VERSION="${PACKAGE_VERSION}"

          if [[ "${GITHUB_REF:-}" == refs/tags/gframework-config-tool-v* ]]; then
            VERSION="${GITHUB_REF#refs/tags/gframework-config-tool-v}"
          elif [[ -n "${{ inputs.version || '' }}" ]]; then
            VERSION="${{ inputs.version }}"
          fi

          echo "Resolved extension version: ${VERSION}"
          echo "version=${VERSION}" >> "${GITHUB_OUTPUT}"

      - name: Install extension dependencies
        run: bun install

      - name: Synchronize package.json version
        shell: bash
        run: |
          set -euo pipefail
          node -e "const fs=require('fs'); const path='package.json'; const data=JSON.parse(fs.readFileSync(path,'utf8')); data.version='${{ steps.version.outputs.version }}'; fs.writeFileSync(path, JSON.stringify(data, null, 2) + '\n');"

      - name: Run extension tests
        run: bun run test

      - name: Package VSIX
        run: |
          set -euo pipefail
          mkdir -p ../../artifacts
          bun run package:vsix -- --out "../../artifacts/gframework-config-tool-${{ steps.version.outputs.version }}.vsix"

      - name: Upload VSIX artifact
        uses: actions/upload-artifact@v7
        with:
          name: gframework-config-tool-vsix
          path: artifacts/gframework-config-tool-${{ steps.version.outputs.version }}.vsix
          if-no-files-found: error

      - name: Publish to Visual Studio Marketplace
        if: github.event_name == 'push' || inputs.publish_to_marketplace
        env:
          VSCE_PAT: ${{ secrets.VSCE_PAT }}
        run: bun run publish:marketplace
```
**.github/workflows/publish.yml** — 249-line diff, shown here as a unified diff from `main` (`-`, the multi-job pipeline) to `v0.0.48` (`+`, the earlier single-job version):

```diff
@@ -1,64 +1,50 @@
 # Copyright (c) 2025-2026 GeWuYou
 # SPDX-License-Identifier: Apache-2.0
 
-# Release workflow (NuGet + GitHub Packages + GitHub Release)
-#
-# On tag push: build and pack once, publish the same artifacts to NuGet.org
-# and GitHub Packages in parallel, then create a GitHub Release.
-# Trigger: any tag push (e.g. v1.0.0 or 1.0.0)
-# Permissions: contents, packages, and OIDC id-token
-name: Publish (NuGet + GitHub Packages + GitHub Release)
+name: Publish (NuGet + GitHub Release)
 
+# Trigger: on tag push (e.g. v1.0.0 or 1.0.0)
 on:
   push:
     tags:
       - '*'
 
-concurrency:
-  group: ${{ github.workflow }}-${{ github.ref }}
-  cancel-in-progress: false
-
+# Top-level permissions: allow creating releases, writing packages, and OIDC id-token
 permissions:
   contents: write
   packages: write
   id-token: write
 
 jobs:
-  build-pack:
-    name: Build And Pack
+  build-and-publish:
     runs-on: ubuntu-latest
 
     permissions:
-      contents: read
-      packages: read
       id-token: write
-
-    outputs:
-      package_version: ${{ steps.tag_version.outputs.version }}
+      contents: write
+      packages: write
 
     steps:
       - name: Checkout repository (at tag)
-        uses: actions/checkout@v6
+        uses: actions/checkout@v4
         with:
           fetch-depth: 0
           persist-credentials: true
 
       - name: Setup .NET
-        uses: actions/setup-dotnet@v5
+        uses: actions/setup-dotnet@v4
         with:
-          dotnet-version: 10.0.x
+          dotnet-version: 9.0.x
 
-      - name: Cache NuGet packages
-        uses: actions/cache@v5
-        with:
-          path: ~/.nuget/packages
-          key: ${{ runner.os }}-nuget-${{ hashFiles('**/*.csproj') }}
+      - name: Install unzip (for reading .nuspec from .nupkg)
+        run: sudo apt-get update && sudo apt-get install -y unzip
 
       - name: Restore dependencies
-        run: dotnet restore GFramework.sln
+        run: dotnet restore
+
+      - name: Build
+        run: dotnet build -c Release --no-restore -p:DebugType=portable
+
+      - name: Test
+        run: dotnet test --no-build -c Release --verbosity normal
 
-      # Extract the tag version from the GitHub ref:
-      # strip the refs/tags/ prefix, then the v/V prefix.
       - name: Determine tag version
         id: tag_version
         run: |
@@ -68,68 +54,23 @@ jobs:
           VERSION=${TAG#v}
           VERSION=${VERSION#V}
           echo "tag='$TAG' -> version='$VERSION'"
-          echo "version=$VERSION" >> "$GITHUB_OUTPUT"
+          echo "version=$VERSION" >> $GITHUB_OUTPUT
 
       - name: Pack (use tag version)
         run: |
           set -e
           echo "Packing with version=${{ steps.tag_version.outputs.version }}"
-          dotnet pack GFramework.sln \
-            -c Release \
-            --no-restore \
-            -o ./packages \
-            -p:PackageVersion=${{ steps.tag_version.outputs.version }} \
-            -p:IncludeSymbols=false
-
-      - name: Validate packed modules
-        run: bash scripts/validate-packed-modules.sh ./packages
-
-      - name: Validate runtime-generator package boundaries
-        run: python3 scripts/validate-runtime-generator-boundaries.py --package-dir ./packages
+          dotnet pack -c Release -o ./packages -p:PackageVersion=${{ steps.tag_version.outputs.version }} -p:IncludeSymbols=false
 
       - name: Show packages
         run: ls -la ./packages || true
 
-      # Upload the nupkg artifacts so multiple publish jobs can reuse them
-      # without packing twice.
-      - name: Upload package artifacts
-        uses: actions/upload-artifact@v7
-        with:
-          name: packages
-          path: ./packages/*.nupkg
-
-  publish-nuget:
-    name: Publish To NuGet.org
-    runs-on: ubuntu-latest
-    needs: build-pack
-
-    permissions:
-      contents: read
-      packages: read
-      id-token: write
-
-    steps:
-      - name: Setup .NET
-        uses: actions/setup-dotnet@v5
-        with:
-          dotnet-version: 10.0.x
-
-      - name: Download package artifacts
-        uses: actions/download-artifact@v8
-        with:
-          name: packages
-          path: ./packages
-
-      - name: Show downloaded packages
-        run: ls -la ./packages || true
-
       - name: NuGet login (OIDC → temporary API key)
         id: nuget_login
         uses: NuGet/login@v1
         with:
-          user: ${{ secrets.NUGET_USER }}
+          user: ${{ secrets.NUGET_USER }}  # recommended: keep the username in secrets
 
-      # Push all generated packages to nuget.org, authenticating with the
-      # temporary API key and skipping duplicate uploads.
       - name: Push all packages to nuget.org
         env:
           NUGET_API_KEY: ${{ steps.nuget_login.outputs.NUGET_API_KEY }}
@@ -150,118 +91,58 @@ jobs:
           echo "No packages found to push."
         fi
 
-  publish-github-packages:
-    name: Publish To GitHub Packages
-    runs-on: ubuntu-latest
-    needs: build-pack
-
-    permissions:
-      contents: read
-      packages: write
-
-    steps:
-      - name: Setup .NET
-        uses: actions/setup-dotnet@v5
-        with:
-          dotnet-version: 10.0.x
-
-      - name: Download package artifacts
-        uses: actions/download-artifact@v8
-        with:
-          name: packages
-          path: ./packages
-
-      - name: Show downloaded packages
-        run: ls -la ./packages || true
-
-      # Configure the GitHub Packages NuGet source with the built-in GITHUB_TOKEN.
-      - name: Configure GitHub Packages source
+      - name: Get Version and First Package Path
+        id: get_version
         run: |
           set -e
-          dotnet nuget add source "https://nuget.pkg.github.com/${{ github.repository_owner }}/index.json" \
-            --name github \
-            --username "${{ github.repository_owner }}" \
-            --password "${{ github.token }}" \
-            --store-password-in-clear-text
-
-      - name: Push all packages to GitHub Packages
-        run: |
-          set -e
-          pushed_any=false
-          for PKG in ./packages/*.nupkg; do
-            [ -f "$PKG" ] || continue
-            pushed_any=true
-            echo "Pushing $PKG to GitHub Packages..."
-            dotnet nuget push "$PKG" \
-              --source github \
-              --skip-duplicate
-          done
-          if [ "$pushed_any" = false ]; then
-            echo "No packages found to push."
+          PACKAGE_FILE=$(find ./packages -name "*.nupkg" | head -n 1 || true)
+          if [ -z "$PACKAGE_FILE" ]; then
+            echo "No .nupkg file found in ./packages"
+            exit 1
           fi
+          # Read the .nuspec inside the .nupkg (a zip) and extract <version>
+          VERSION=$(unzip -p "$PACKAGE_FILE" '*.nuspec' 2>/dev/null | sed -n 's:.*<version>\(.*\)</version>.*:\1:p' | head -n1)
+          if [ -z "$VERSION" ]; then
+            echo "Failed to parse version from $PACKAGE_FILE"
+            exit 1
+          fi
+          BASENAME=$(basename "$PACKAGE_FILE")
+          echo "package_file=$PACKAGE_FILE" >> $GITHUB_OUTPUT
+          echo "package_basename=$BASENAME" >> $GITHUB_OUTPUT
+          echo "version=$VERSION" >> $GITHUB_OUTPUT
 
-  create-release:
-    name: Create GitHub Release
-    runs-on: ubuntu-latest
-    needs:
-      - build-pack
-      - publish-nuget
-      - publish-github-packages
-    if: ${{ always() && needs.build-pack.result == 'success' }}
-
-    permissions:
-      contents: write
-      packages: read
-      pull-requests: read
-
-    steps:
-      - name: Checkout repository (at tag)
-        uses: actions/checkout@v6
-        with:
-          fetch-depth: 0
-          persist-credentials: true
-
-      - name: Download package artifacts
-        uses: actions/download-artifact@v8
-        with:
-          name: packages
-          path: ./packages
-
-      - name: Generate release notes
-        id: cliff_release
-        uses: orhun/git-cliff-action@v4
-        with:
-          config: .github/cliff.toml
-          args: >-
-            -vv --latest --strip header
+      - name: Create GitHub Release
+        id: create_release
+        uses: actions/create-release@v1
         env:
-          OUTPUT: RELEASE_NOTES.md
-          GITHUB_REPO: ${{ github.repository }}
-          GITHUB_TOKEN: ${{ github.token }}
-
-      # Create the Release even if either package feed failed to publish.
-      # Compliance artifacts are produced by a separate workflow; this publish
-      # flow no longer assumes those files are available in the same run.
-      - name: Create GitHub Release and Upload Assets
-        uses: softprops/action-gh-release@v3
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
         with:
-          name: "Release ${{ github.ref_name }}"
-          body_path: RELEASE_NOTES.md
+          tag_name: ${{ github.ref_name }}
+          release_name: "Release ${{ github.ref_name }}"
+          body: "Release created by CI for tag ${{ github.ref_name }} (package version ${{ steps.get_version.outputs.version }})"
           draft: false
           prerelease: false
-          files: |
-            ./packages/*.nupkg
-        env:
-          GITHUB_TOKEN: ${{ github.token }}
 
-      - name: Write publish summary
+      - name: Upload all .nupkg to Release (curl)
         env:
-          CLIFF_RELEASE_NOTES: ${{ steps.cliff_release.outputs.content }}
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+          UPLOAD_URL_TEMPLATE: ${{ steps.create_release.outputs.upload_url }}
         run: |
-          {
-            echo "## GitHub Release"
-            echo
-            echo "- Tag: \`${{ github.ref_name }}\`"
-            echo "- Package version: \`${{ needs.build-pack.outputs.package_version }}\`"
-            echo
-            printf '%s\n' "${CLIFF_RELEASE_NOTES}"
-          } >> "${GITHUB_STEP_SUMMARY}"
+          set -e
+          # upload_url from create-release is like:
+          # https://uploads.github.com/repos/OWNER/REPO/releases/ID/assets{?name,label}
+          # strip the template part "{?name,label}"
+          UPLOAD_URL="${UPLOAD_URL_TEMPLATE%\{*}"
+          echo "Upload base URL: $UPLOAD_URL"
+
+          for package_file in ./packages/*.nupkg; do
+            if [ -f "$package_file" ]; then
+              basename=$(basename "$package_file")
+              echo "Uploading $basename to release..."
+              curl --fail -sS -X POST \
+                -H "Authorization: Bearer $GITHUB_TOKEN" \
+                -H "Content-Type: application/octet-stream" \
+                --data-binary @"$package_file" \
+                "$UPLOAD_URL?name=$basename"
+              echo "Uploaded $basename"
+            fi
+          done
```
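Both sides of the diff above lean on two POSIX-shell parameter expansions: stripping the `refs/tags/` and `v`/`V` prefixes from the tag, and (on the `v0.0.48` side) trimming the `{?name,label}` URI template from `upload_url`. A standalone sketch of both, with illustrative placeholder values:

```shell
# Parameter-expansion tricks from the workflow above, shown standalone.

# 1) Tag -> package version: strip the refs/tags/ prefix, then a leading v or V.
REF="refs/tags/v1.2.3"            # placeholder value for illustration
TAG="${REF#refs/tags/}"           # -> v1.2.3
VERSION="${TAG#v}"
VERSION="${VERSION#V}"            # -> 1.2.3
echo "$VERSION"

# 2) Release upload URL: drop the "{?name,label}" template suffix that
# actions/create-release appends to upload_url. %\{* removes the shortest
# suffix starting at a literal '{'.
UPLOAD_URL_TEMPLATE="https://uploads.github.com/repos/o/r/releases/1/assets{?name,label}"
UPLOAD_URL="${UPLOAD_URL_TEMPLATE%\{*}"
echo "$UPLOAD_URL"
```

`${TAG#v}` removes at most one leading `v`, so a bare `1.2.3` tag passes through unchanged, which is why the workflow accepts both `v1.0.0` and `1.0.0`.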
**.gitignore** — 24-line diff, shown from `main` (`-`) to `v0.0.48` (`+`):

```diff
@@ -3,27 +3,3 @@ obj/
 /packages/
 riderModule.iml
 /_ReSharper.Caches/
-GFramework.sln.DotSettings.user
-.idea/
-dotnet-home/
-scripts/__pycache__/
-# ai
-opencode.json
-.claude/settings.local.json
-.claude/settings.json
-.omc/
-docs/.omc/
-docs/.vitepress/cache/
-ai-plan/*
-!ai-plan/README.md
-!ai-plan/public/
-ai-plan/public/*
-!ai-plan/public/README.md
-!ai-plan/public/**/
-!ai-plan/public/**/*.md
-ai-plan/private/
-ai-libs/
-.codex
-# tool
-.venv/
-BenchmarkDotNet.Artifacts/
```
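The removed `ai-plan/` rules use a deliberate layering: `ai-plan/*` ignores the directory's contents (not the directory itself), so the `!` patterns can re-include specific files — Git cannot re-include files inside a directory that is itself ignored. A minimal demonstration in a throwaway repository (requires `git` on `PATH`):

```shell
# Demonstrate the ignore/allowlist layering removed above:
# ignore everything under ai-plan/ except README.md.
tmp="$(mktemp -d)"
cd "$tmp"
git init -q .
printf 'ai-plan/*\n!ai-plan/README.md\n' > .gitignore
mkdir ai-plan
touch ai-plan/plan.md ai-plan/README.md

# check-ignore exits 0 for ignored paths, 1 for tracked ones.
git check-ignore -q ai-plan/plan.md && echo "plan.md ignored"
git check-ignore -q ai-plan/README.md || echo "README.md not ignored"
```

Had the first line been `ai-plan/` instead of `ai-plan/*`, the `!ai-plan/README.md` negation would have had no effect — which also explains the `!ai-plan/public/**/` entry re-including subdirectories.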
Deleted allowlist configuration (15 lines, `@@ -1,15 +0,0 @@`; the filename is not shown in this capture — the schema matches a Gitleaks-style allowlist):

```toml
# Allowlist for fake/test/demo secrets only
# DO NOT add real credentials here
[allowlist]
description = "Ignore test/demo secrets"

paths = [
  "docs/.*",
  ".*Test.*\\.json",
  ".*Development.*"
]

regexes = [
  "FAKE_.*_KEY",
  "TEST_.*_TOKEN"
]
```
**.idea/.idea.GFramework/.idea/icon.png** — generated binary file, deleted (previously 280 KiB); binary content not shown.
Deleted linter configuration (91 lines, `@@ -1,91 +0,0 @@`; the filename is not shown in this capture — the keys follow the MegaLinter configuration schema):

```yaml
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

# Configuration for the code-quality checks: globally excluded directories,
# enabled/disabled linters, and per-language settings.

APPLY_FIXES: none
FAIL_ON_ERROR: false

# ========================
# Globally excluded directories
# Directories that should never be linted
# ========================
EXCLUDED_DIRECTORIES:
  - bin
  - obj
  - packages
  - node_modules
  - TestResults
  - .git
  - .vs
  - .vscode

# ========================
# Disable noisy linters to avoid a flood of irrelevant warnings
# ========================
DISABLE:
  - COPYPASTE
  - SPELL
  - MARKDOWN

# ========================
# Core linters to run
# ========================
ENABLE_LINTERS:
  - CSHARP_DOTNET_FORMAT
  - CSHARP_ROSLYN_ANALYZERS
  - YAML
  - JSON
  - GITHUB_ACTIONS
  - REPOSITORY_GITLEAKS
  - REPOSITORY_TRUFFLEHOG

# ========================
# C# format check: arguments and severity
# ========================
CSHARP_DOTNET_FORMAT_ARGUMENTS:
  # The repository root contains both GFramework.sln and GFramework.csproj;
  # pin the workspace explicitly so dotnet format does not abort in CI on
  # ambiguous auto-detection.
  - "GFramework.sln"
  - "--severity"
  - "info"
  - "--verify-no-changes"

# ========================
# YAML include/exclude filter patterns
# ========================
YAML_YAMLLINT_FILTER_REGEX_INCLUDE: '.*\.(ya?ml)$'
YAML_YAMLLINT_FILTER_REGEX_EXCLUDE: '.*/.github/.*'

# ========================
# JSON include filter pattern
# ========================
JSON_JSONLINT_FILTER_REGEX_INCLUDE: '.*\.json$'

# ========================
# GitHub Actions workflow check: do not report actionlint findings as errors
# ========================
ACTION_ACTIONLINT_DISABLE_ERRORS: true

# ========================
# Report output formats
# ========================
CONSOLE_REPORTER: true
SARIF_REPORTER: true
GITHUB_COMMENT_REPORTER: true

# ========================
# Performance: parallel execution and elapsed-time reporting
# ========================
PARALLEL: true
SHOW_ELAPSED_TIME: true
VALIDATE_ALL_CODEBASE: false
```
**.releaserc.json** — deleted (129 lines, `@@ -1,129 +0,0 @@`):

```json
{
  "branches": [
    "main"
  ],
  "tagFormat": "v${version}",
  "plugins": [
    [
      "@semantic-release/commit-analyzer",
      {
        "preset": "conventionalcommits",
        "releaseRules": [
          { "breaking": true, "release": "major" },
          { "revert": true, "release": "patch" },
          { "type": "feat", "release": "minor" },
          { "type": "fix", "release": "patch" },
          { "type": "perf", "release": "patch" },
          { "type": "refactor", "release": "patch" },
          { "type": "deps", "release": "patch" },
          { "type": "security", "release": "patch" },
          { "type": "docs", "release": false },
          { "type": "test", "release": false },
          { "type": "chore", "release": false },
          { "type": "build", "release": false },
          { "type": "ci", "release": false },
          { "type": "style", "release": false }
        ],
        "parserOpts": {
          "noteKeywords": ["BREAKING CHANGE", "BREAKING CHANGES"]
        }
      }
    ],
    [
      "@semantic-release/release-notes-generator",
      {
        "preset": "conventionalcommits",
        "presetConfig": {
          "types": [
            { "type": "feat", "section": "Features", "hidden": false },
            { "type": "fix", "section": "Bug Fixes", "hidden": false },
            { "type": "perf", "section": "Performance Improvements", "hidden": false },
            { "type": "refactor", "section": "Refactoring", "hidden": false },
            { "type": "deps", "section": "Dependency Updates", "hidden": false },
            { "type": "security", "section": "Security Fixes", "hidden": false },
            { "type": "revert", "section": "Reverts", "hidden": false }
          ]
        },
        "parserOpts": {
          "noteKeywords": ["BREAKING CHANGE", "BREAKING CHANGES"]
        }
      }
    ]
  ]
}
```
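The `releaseRules` above are evaluated by semantic-release's commit-analyzer: the first matching rule decides the release level, and unmatched types cut no release. A toy re-implementation for illustration only (the real plugin also parses commit messages, scopes, and the `noteKeywords`, none of which this sketch handles):

```python
# Toy model of how the commit-analyzer releaseRules above resolve a release
# level for one parsed commit. First matching rule wins; no match -> no release.

RULES = [
    (lambda c: bool(c.get("breaking")), "major"),
    (lambda c: bool(c.get("revert")), "patch"),
    (lambda c: c.get("type") == "feat", "minor"),
    (lambda c: c.get("type") in {"fix", "perf", "refactor", "deps", "security"},
     "patch"),
]


def release_for(commit):
    """Return 'major', 'minor', 'patch', or None for a parsed commit dict."""
    for matches, release in RULES:
        if matches(commit):
            return release
    # docs, test, chore, build, ci, style all map to "release": false
    return None


print(release_for({"type": "feat"}))                    # minor
print(release_for({"type": "fix"}))                     # patch
print(release_for({"type": "feat", "breaking": True}))  # major
print(release_for({"type": "docs"}))                    # None
```

Listing `breaking` before the type rules mirrors the config's ordering: a breaking `feat` cuts a major release, not a minor one.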
539
AGENTS.md
539
AGENTS.md
@ -1,539 +0,0 @@
|
|||||||
# AGENTS.md

This document is the single source of truth for coding behavior in this repository.

All AI agents and contributors must follow these rules when writing, reviewing, or modifying code in `GFramework`.

## Environment Capability Inventory

- Before choosing runtimes or CLI tools, read `@.ai/environment/tools.ai.yaml`.
- Use `@.ai/environment/tools.raw.yaml` only when you need the full collected facts behind the AI-facing hints.
- Prefer the project-relevant tools listed there instead of assuming every installed system tool is fair game.
- If the real environment differs from the inventory, use the project-relevant installed tool and report the mismatch.
- When working in WSL against this repository's Windows-backed worktree, first prefer Linux `git` with an explicit
  `--git-dir=<repo>/.git/worktrees/<worktree-name>` and `--work-tree=<worktree-root>` binding for every repository
  command. Treat that explicit binding as higher priority than `git.exe`, because it avoids WSL worktree path
  translation mistakes and still works in sessions where Windows `.exe` execution is unavailable.
- If a plain Linux `git` command in WSL fails with a worktree-style “not a git repository” path translation error,
  rerun it with the explicit `--git-dir` / `--work-tree` binding before trying `git.exe`.
- Only prefer Windows Git from WSL (for example `git.exe`) when that executable is both resolvable and executable in the
  current session, and when the explicit Linux `git` binding is unavailable or has already failed.
- If the shell resolves `git.exe` but the current WSL session cannot execute it cleanly (for example
  `Exec format error`), keep using the explicit Linux `git` binding for the rest of the task instead of retrying
  Windows Git.
- If the shell does not currently resolve `git.exe` to the host Windows Git installation and you still need Windows Git
  as a fallback, prepend that installation's command directory to `PATH` and reset shell command hashing for the
  current session before continuing.
- After resolving either strategy, prefer a session-local binding or command wrapper for subsequent Git commands so the
  shell does not silently fall back to the wrong repository context later in the same WSL session.
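A minimal sketch of the explicit `--git-dir` / `--work-tree` binding and session-local wrapper described above. It uses a throwaway repository in place of the real Windows-backed worktree; the paths and worktree name are illustrative only:

```bash
set -e
tmp=$(mktemp -d)
git init -q "$tmp/repo"
git -C "$tmp/repo" -c user.email=dev@example.com -c user.name=dev \
  commit -q --allow-empty -m init
# Create a linked worktree; its git dir lives under .git/worktrees/<name>.
git -C "$tmp/repo" worktree add -q -b docs "$tmp/wt"

# Session-local wrapper: every Git command keeps the explicit binding,
# so the shell cannot silently fall back to the wrong repository context.
g() {
    git --git-dir="$tmp/repo/.git/worktrees/wt" --work-tree="$tmp/wt" "$@"
}

g status --short
g rev-parse --abbrev-ref HEAD
```

The wrapper makes the binding independent of the current directory, which is exactly what avoids WSL path-translation mistakes against a Windows-backed worktree.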

## Git Workflow Rules

- Every completed task MUST pass at least one build validation before it is considered done.
- When the goal is to inspect or reduce warnings printed during project build, contributors MUST establish the warning
  baseline from a non-incremental repository-root build by running `dotnet clean` and then `dotnet build`.
- Contributors MUST NOT treat a repeated incremental `dotnet build` result as authoritative for warning inspection when
  a clean baseline has not been captured in the same round.
- If a direct `dotnet clean`, `dotnet build`, or `dotnet test` command fails inside the agent sandbox with missing
  diagnostics, `Permission denied`, MSBuild pipe/socket errors, or other environment-only noise that does not match a
  normal shell invocation, contributors MUST request permission and rerun the same direct command outside the sandbox
  before concluding that the repository or toolchain is broken.
- For repository truth, contributors MUST prefer the result of the original direct command executed outside the sandbox
  over sandbox-only failures, workaround-heavy variants, or speculative environment flags unless the user explicitly
  asks for a non-default command shape.
- If the task changes multiple projects or shared abstractions, prefer a solution-level or affected-project
  `dotnet build ... -c Release`; otherwise use the smallest build command that still proves the result compiles.
- When a task adds a feature or modifies code, contributors MUST run a Release build for every directly affected
  module/project instead of relying on an unrelated project or solution slice that does not actually compile the
  touched code.
- Warnings reported by those affected-module builds are part of the task scope. Contributors MUST resolve the touched
  module's build warnings in the same change, or stop and explicitly report the exact warning IDs and blocker instead
  of deferring them to a separate long-lived cleanup branch by default.
- If the required build passes and there are task-related staged or unstaged changes, contributors MUST create a Git
  commit automatically instead of leaving the task uncommitted, unless the user explicitly says not to commit.
- Commit messages MUST use Conventional Commits format: `<type>(<scope>): <summary>`.
- The commit `summary` MUST use simplified Chinese and briefly describe the main change.
- The commit `body` MUST use unordered list items, and each item MUST start with a verb such as `新增`、`修复`、`优化`、
  `更新`、`补充`、`重构`.
- Each commit body bullet MUST describe one independent change point; avoid repeated or redundant descriptions.
- Commit `type` MUST reflect release semantics instead of author intent:
  - Use `feat` only for user-facing or consumer-facing capability additions that should raise the next released
    version's `minor` segment.
  - Use `fix` for behavior corrections, `perf` for observable performance improvements, and `refactor` only for
    non-feature code restructuring; these should raise the next released version's `patch` segment.
  - Use `deps` for dependency version updates, dependency lockfile refreshes, and package maintenance that should raise
    the next released version's `patch` segment.
  - Use `security` for vulnerability fixes, dependency security mitigations, and security configuration corrections
    that should raise the next released version's `patch` segment.
  - Use `docs`、`test`、`chore`、`build`、`ci`、`style` for their literal categories; do not encode these changes as
    `feat` just because they feel important. These categories MUST NOT trigger a release.
  - Use `BREAKING CHANGE` in the commit footer or `!` after the type / scope header (for example `feat!:` or
    `feat(core)!:`) when the change should raise the next released version's `major` segment.
- Documentation-only changes MUST NOT use `feat`, including new guides, refreshed examples, navigation updates, and
  adoption notes for existing capabilities. If a commit changes both product behavior and related docs, either split
  the commit or use `feat` only when the code/package behavior is the primary released change.
- Contributors MUST avoid ambiguous scopes such as `feat(docs)` for documentation work. If the change only affects
  docs, prefer `docs(<module-or-area>)`; if it adds a real capability in a docs-related toolchain, use the scope of
  that actual subsystem instead of `docs`.
- Keep technical terms in English when they are established project terms, such as `API`、`Model`、`System`.
- When composing a multi-line commit body from shell commands, contributors MUST NOT rely on Bash `$"..."` quoting for
  newline escapes, because it passes literal `\n` sequences to Git. Use multiple `-m` flags or ANSI-C `$'...'`
  quoting so the commit body contains real line breaks.
- If a new task starts while the current branch is `main`, contributors MUST first try to update local `main` from the
  remote, then create and switch to a dedicated branch before making substantive changes.
- The branch naming rule for a new task branch is `<type>/<topic-or-scope>`, where `<type>` should match the intended
  Conventional Commit category as closely as practical.
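The multi-line commit-body quoting rule above can be sketched as follows. The scope and wording are illustrative, not a real change, and the commit happens in a throwaway repository:

```bash
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q
printf 'demo\n' > demo.txt
git add demo.txt

# Multiple -m flags and ANSI-C $'...' quoting both produce real line breaks;
# Bash $"..." would pass a literal \n sequence through to Git.
git -c user.email=dev@example.com -c user.name=dev commit -q \
  -m "fix(core): 修复注册表并发初始化竞态" \
  -m $'- 修复 Registry 并发初始化竞态\n- 补充相关单元测试'

# Inspect the recorded message: the body contains two real bullet lines.
git log -1 --pretty=%B
```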

## License Header Rules

- Repository-maintained source and configuration files that are supported by `scripts/license-header.py` MUST include
  an Apache-2.0 file header before the task is considered complete.
- When creating or modifying supported files, contributors MUST preserve an existing compliant header or add the SPDX
  header generated by `python3 scripts/license-header.py --fix`.
- Before committing changes that add or modify supported source/configuration files, contributors MUST run
  `python3 scripts/license-header.py --check` and resolve any missing or misplaced headers.
- For files with shebang lines, keep the shebang as the first line and place the license header immediately after it.
- For XML/MSBuild files with an XML declaration, keep the XML declaration as the first node and place the license
  header immediately after it.
- Do not add project license headers to excluded or third-party areas such as `.agents/**`, `ai-libs/**`,
  `third-party-licenses/**`, generated snapshots, binary assets, lock files, and generated build output. Treat
  `scripts/license-header.py` as the authoritative include/exclude policy for this check.
- If CI reports a license-header failure, either fix it locally with `python3 scripts/license-header.py --fix` or, for
  maintainer-owned cleanup, use the manual `License Header Fix` GitHub Actions workflow to create a reviewed repair PR.
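The shebang-ordering rule can be illustrated with a minimal file. The SPDX line here is a generic example; the exact header format is whatever `scripts/license-header.py --fix` generates:

```bash
set -e
tmp=$(mktemp -d); cd "$tmp"

# Shebang stays on line 1; the license header follows immediately after it.
cat > demo.sh <<'EOF'
#!/usr/bin/env bash
# SPDX-License-Identifier: Apache-2.0
echo ok
EOF

head -2 demo.sh
```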

## Repository Boot Skill

- The repository-maintained Codex boot skill lives at `.agents/skills/gframework-boot/`.
- The repository-maintained multi-agent coordination skill lives at `.agents/skills/gframework-multi-agent-batch/`.
- Prefer invoking `$gframework-boot` when the user uses short startup prompts such as `boot`、`continue`、`next step`、
  `按 boot 开始`、`先看 AGENTS`、`继续当前任务`.
- Prefer invoking `$gframework-multi-agent-batch` when the user explicitly wants the main agent to delegate bounded
  parallel work, track subagent progress, maintain `ai-plan`, verify subagent output, and keep coordinating until the
  current multi-agent batch reaches a natural stop boundary.
- The boot skill is a startup convenience layer, not a replacement for this document. If the skill and `AGENTS.md`
  diverge, follow `AGENTS.md` first and update the skill in the same change.
- The boot skill MUST read `AGENTS.md`、`.ai/environment/tools.ai.yaml`、`ai-plan/public/README.md` and the relevant
  active-topic `ai-plan/` artifacts before substantive execution.

## Subagent Usage Rules

- Use subagents only when the task is complex, the context is likely to grow too large, or the work can be split into
  independent parallel subtasks.
- The main agent MUST identify the critical path first. Do not delegate the immediate blocking task if the next local
  step depends on that result.
- Use `explorer` subagents for read-only discovery, comparison, tracing, and narrow codebase questions.
- Use `worker` subagents only for bounded implementation tasks with an explicit file or module ownership boundary.
- Every delegation MUST specify:
  - the concrete objective
  - the expected output format
  - the files or subsystem the subagent owns
  - any constraints about tests, diagnostics, or compatibility
- Subagents are not allowed to revert or overwrite unrelated changes from the user or other agents. They must adapt to
  concurrent work instead of assuming exclusive ownership of the repository.
- Prefer lightweight models such as `gpt-5.1-codex-mini` for narrow exploration, indexing, and comparison tasks.
- Prefer stronger models such as `gpt-5.4` for cross-module design work, non-trivial refactors, and tasks that require
  higher confidence reasoning.
- The main agent remains responsible for reviewing and integrating subagent output. Unreviewed subagent conclusions do
  not count as final results.

### Multi-Agent Coordination Rules

The terms below describe the default guardrails for multi-agent batches and how they affect worker-launch decisions.

- `branch-diff budget`: the maximum acceptable branch diff size in files or lines before another worker wave becomes
  harder to review as a single PR.
- `reviewability budget`: the cumulative complexity limit beyond which accepting more parallel slices would materially
  reduce review quality, even if the raw file count still looks acceptable.
- `context-budget`: the main agent's remaining capacity to track active workers, validation, and integration state
  without losing critical execution context.
- When any of these budgets approaches its safe limit, the main agent SHOULD stop launching more workers and close the
  current wave first.
- `$gframework-multi-agent-batch` contains the fuller workflow and stop-condition guidance for applying these budgets
  in practice.
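The branch-diff posture behind these budgets can be measured with plain Git. A minimal sketch using a throwaway repository; in real use the base would be `origin/main` rather than a local `main`, and the thresholds come from the batch's stop condition:

```bash
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q -b main
git -c user.email=dev@example.com -c user.name=dev commit -q --allow-empty -m init
git checkout -q -b feat/demo
printf 'new line\n' > slice.txt
git add slice.txt
git -c user.email=dev@example.com -c user.name=dev commit -q -m "feat: demo slice"

# Files and lines changed on this branch relative to the base branch.
files=$(git diff --name-only main...HEAD | wc -l | tr -d ' ')
lines=$(git diff --numstat main...HEAD | awk '{sum += $1 + $2} END {print sum + 0}')
echo "files=$files lines=$lines"
```

Comparing those two numbers against the agreed file and line budgets is what decides whether another worker wave still fits in one reviewable PR.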

- Prefer the repository's multi-agent coordination mode when the user explicitly wants the main agent to keep
  orchestrating parallel subagents, or when the work naturally splits into `2+` disjoint write slices that can proceed
  in parallel without blocking the next local step.
- In that mode, the main agent MUST keep ownership of:
  - critical-path selection
  - baseline and stop-condition tracking
  - `ai-plan` updates
  - validation planning and final validation
  - review and acceptance of every subagent result
  - the final integration and completion decision
- Before spawning any `worker` subagent, the main agent MUST:
  - identify the immediate blocking step and keep it local
  - define disjoint file or subsystem ownership for each worker
  - state the required validation commands and expected output format
  - check that the expected write set still fits the current branch-diff and reviewability budget
- While workers run, the main agent MUST avoid overlapping edits and focus on non-conflicting work such as:
  - ranking the next candidate slices
  - reviewing completed worker output
  - recomputing branch-diff and context-budget posture
  - keeping `ai-plan/public/**` recovery artifacts current
- Before accepting a worker result, the main agent MUST confirm:
  - the worker stayed within its owned files or subsystem
  - the reported validation is sufficient for that slice
  - any accepted findings or follow-up scope are recorded in the active `ai-plan` todo or trace when the task is
    complex or multi-step
- Do not continue launching workers merely because a file-count threshold still has room. Stop the current wave when
  ownership boundaries start to overlap, reviewability materially degrades, or the context-budget signal says the main
  agent should close the batch.
- When a complex task uses multiple workers, the main agent SHOULD prefer the public workflow documented by
  `$gframework-multi-agent-batch` unless a more task-specific skill already provides stricter rules.

## Commenting Rules (MUST)

All generated or modified code MUST include clear and meaningful comments where required by the rules below.

### XML Documentation (Required)

- All public, protected, and internal types and members MUST include XML documentation comments (`///`).
- Use `<summary>`, `<param>`, `<returns>`, `<exception>`, and `<remarks>` where applicable.
- Comments must explain intent, contract, and usage constraints instead of restating syntax.
- If a member participates in lifecycle, threading, registration, or disposal behavior, document that behavior
  explicitly.

### Inline Comments

- Add inline comments for:
  - Non-trivial logic
  - Concurrency or threading behavior
  - Performance-sensitive paths
  - Workarounds, compatibility constraints, or edge cases
  - Registration order, lifecycle sequencing, or generated code assumptions
- Avoid obvious comments such as `// increment i`.

### Architecture-Level Comments

- Core framework components such as Architecture, Module, System, Context, Registry, Service Module, and Lifecycle
  types MUST include high-level explanations of:
  - Responsibilities
  - Lifecycle
  - Interaction with other components
  - Why the abstraction exists
  - When to use it instead of alternatives

### Source Generator Comments

- Generated logic and generator pipelines MUST explain:
  - What is generated
  - Why it is generated
  - The semantic assumptions the generator relies on
  - Any diagnostics or fallback behavior

### Complex Logic Requirement

- Methods with non-trivial logic MUST document:
  - The core idea
  - Key decisions
  - Edge case handling, if any

### Quality Rules

- Comments MUST NOT be trivial, redundant, or misleading.
- Prefer explaining `why` and `when`, not just `what`.
- Code should remain understandable without requiring external context.
- Prefer slightly more explanation over too little for framework code.

### Enforcement

- Missing required documentation is a coding standards violation.
- Code that does not meet the documentation rules is considered incomplete.

## Code Style

### Language and Project Settings

- Follow the repository defaults:
  - `ImplicitUsings` disabled
  - `Nullable` enabled
  - `GenerateDocumentationFile` enabled for shipped libraries
  - `LangVersion` is generally `preview` in the main libraries and abstractions
- Do not rely on implicit imports. Declare every required `using` explicitly.
- Write null-safe code that respects nullable annotations instead of suppressing warnings by default.

### Naming and Structure

- Use the namespace pattern `GFramework.{Module}.{Feature}` with PascalCase segments.
- Follow standard C# naming:
  - Types, methods, properties, events, and constants: PascalCase
  - Interfaces: `I` prefix
  - Parameters and locals: camelCase
  - Private fields: `_camelCase`
- Keep abstractions projects free of implementation details and engine-specific dependencies.
- Preserve existing module boundaries. Do not introduce new cross-module dependencies without clear architectural need.
- Framework runtime, abstractions, and meta-package projects MUST NOT reference `*.SourceGenerators*` projects or
  packages, and MUST NOT use source-generator attributes such as `GenerateEnumExtensions` or `ContextAware`. Those
  capabilities are reserved for consumer projects, generator projects, examples explicitly meant to demonstrate
  generator usage, and related tests.

### Formatting

- Use 4 spaces for indentation. Do not use tabs.
- Use Allman braces.
- Keep `using` directives at the top of the file and sort them consistently.
- Separate logical blocks with blank lines when it improves readability.
- Prefer one primary type per file unless the surrounding project already uses a different local pattern.
- Unless there is a clear and documented reason to keep a file large, keep a single source file under roughly 800-1000
  lines.
- If a file grows beyond that range, contributors MUST stop and check whether responsibilities should be split before
  continuing; treating oversized files as the default is considered a design smell.
- Keep line length readable. Around 120 characters is the preferred upper bound.

### C# Conventions

- Prefer explicit, readable code over clever shorthand in framework internals.
- Match existing async patterns and naming conventions (`Async` suffix for asynchronous methods).
- Avoid hidden side effects in property getters, constructors, and registration helpers.
- Preserve deterministic behavior in registries, lifecycle orchestration, and generated outputs.
- When adding analyzers or suppressions, keep them minimal and justify them in code comments if the reason is not
  obvious.

### Analyzer and Validation Expectations

- The repository uses `Meziantou.Analyzer`; treat analyzer feedback as part of the coding standard.
- Treat SonarQube maintainability rules as part of the coding standard as well, especially cognitive complexity and
  oversized parameter list findings.
- When a method approaches analyzer complexity limits, prefer extracting named helper methods by semantic phase
  (parsing, normalization, validation, diagnostics) instead of silencing the warning or doing cosmetic reshuffles.
- When a constructor or method exceeds parameter count limits, choose the refactor that matches the shape of the API:
  use domain-specific value objects or parameter objects for naturally grouped data, and prefer named factory methods
  when the call site is really selecting between different creation modes.
- Do not add suppressions for complexity or parameter-count findings unless the constraint is externally imposed and
  the reason is documented in code comments.
- Naming must remain compatible with `scripts/validate-csharp-naming.sh`.
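An illustrative sketch of the kind of pattern check such a naming script might perform, here for the `_camelCase` private-field rule. The `Sample.cs` content and the regex are examples only; the actual policy lives in `scripts/validate-csharp-naming.sh`:

```bash
set -e
tmp=$(mktemp -d); cd "$tmp"

cat > Sample.cs <<'EOF'
public class Foo
{
    private int _count;
    private int badField;
}
EOF

# Flag private fields whose names do not start with an underscore.
grep -nE '^[[:space:]]*private[[:space:]]+[[:alnum:]_<>,]+[[:space:]]+[a-z]' Sample.cs
```

Here only `badField` is reported; `_count` already follows the convention.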

## Testing Requirements

### Required Coverage

- Every non-trivial feature, bug fix, or behavior change MUST include tests or an explicit justification for why a test
  is not practical.
- Public API changes must be covered by unit or integration tests.
- When a public API defines multiple contract branches, tests MUST cover the meaningful variants, including null,
  empty, default, and filtered inputs when those branches change behavior.
- Regression fixes should include a test that fails before the fix and passes after it.

### Test Organization

- Mirror the source structure in test projects whenever practical.
- Reuse existing architecture test infrastructure when relevant:
  - `ArchitectureTestsBase<T>`
  - `SyncTestArchitecture`
  - `AsyncTestArchitecture`
- Keep tests focused on observable behavior, not implementation trivia.

### Source Generator Tests

- Source generator changes MUST be covered by generator tests.
- Preserve snapshot-based verification patterns already used in the repository.
- When generator behavior changes intentionally, update snapshots together with the implementation.

### Validation Commands

Use the smallest command set that proves the change, then expand if the change is cross-cutting.
If a sandboxed agent run reports environment-specific .NET failures, rerun the same direct command outside the sandbox
and treat that unsandboxed result as authoritative for validation and warning baselines.

```bash
# Check warnings from the default repository build entrypoint
dotnet clean
dotnet build

# Build the full solution
dotnet build GFramework.sln -c Release

# Run all tests
dotnet test GFramework.sln -c Release

# Run a single test project
dotnet test GFramework.Core.Tests -c Release
dotnet test GFramework.Game.Tests -c Release
dotnet test GFramework.SourceGenerators.Tests -c Release
dotnet test GFramework.Ecs.Arch.Tests -c Release

# Run a single NUnit test or test group
dotnet test GFramework.Core.Tests -c Release --filter "FullyQualifiedName~CommandExecutorTests.Execute"

# Validate naming rules used by CI
bash scripts/validate-csharp-naming.sh
```

### Test Execution Expectations

- Run targeted tests for the code you changed whenever possible.
- Run broader solution-level validation for changes that touch shared abstractions, lifecycle behavior, source
  generators, or dependency wiring.
- Do not claim completion if required tests were skipped; state what was not run and why.
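A hedged sketch of turning a clean-build log into a warning baseline, as the warning-inspection rules above require. The log lines below are canned examples; in this repository the input would come from `dotnet clean` followed by `dotnet build`:

```bash
set -e
tmp=$(mktemp -d); cd "$tmp"

# Canned log lines standing in for real `dotnet build` output.
cat > build.log <<'EOF'
/src/A.cs(10,5): warning CA1822: Member 'Run' can be made static
/src/B.cs(3,1): warning MA0004: Use Task.ConfigureAwait
/src/A.cs(12,5): warning CA1822: Member 'Stop' can be made static
EOF

# Warning-ID histogram for the clean-build baseline: count per diagnostic ID.
grep -oE 'warning [A-Z]+[0-9]+' build.log | sort | uniq -c | sort -rn
```

Capturing this histogram once per round makes it easy to prove that a change reduced, rather than merely reshuffled, the warning set.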

## Security Rules

- Validate external or user-controlled input before it reaches file system, serialization, reflection, code
  generation, or process boundaries.
- Do not build command strings, file paths, type names, or generated code from untrusted input without strict
  validation or allow-listing.
- Avoid logging secrets, tokens, credentials, or machine-specific sensitive data.
- Keep source generators deterministic and free of hidden environment or network dependencies.
- Prefer least-privilege behavior for file, process, and environment access.
- Do not introduce unsafe deserialization, broad reflection-based activation, or dynamic code execution unless it is
  explicitly required and tightly constrained.
- When adding caching, pooling, or shared mutable state, document thread-safety assumptions and failure modes.
- Minimize new package dependencies. Add them only when necessary and keep scope narrow.

## Documentation Rules

### Code Documentation

- Any change to public API, lifecycle semantics, module behavior, or extension points MUST update the related XML docs.
- If a framework abstraction changes meaning or intended usage, update the explanatory comments in code as part of the
  same change.

### Documentation Source Of Truth

- Treat source code, `*.csproj`, tests, generated snapshots, and packaging metadata as the primary evidence for
  documentation updates.
- Treat verified reference implementations under `ai-libs/` as a secondary evidence source for real project adoption
  patterns, directory layouts, and end-to-end usage examples.
- Treat existing `README.md` files and `docs/zh-CN/` pages as editable outputs, not authoritative truth.
- If existing documentation conflicts with code or tests, update the documentation to match the implementation instead
  of preserving outdated wording.
- Do not publish example code, setup steps, or package guidance that cannot be traced back to code, tests, or a
  verified consumer project.

### Module README Requirements

- Every user-facing package or module directory that contains a `*.csproj` intended for direct consumption MUST have a
  sibling `README.md`.
- Use the canonical filename `README.md`. Do not introduce new `ReadMe.md` or other filename variants.
- A module README MUST describe:
  - the module's purpose
  - the relationship to adjacent runtime, abstractions, or generator packages
  - the major subdirectories or subsystems the reader is expected to use
  - the minimum adoption path
  - the corresponding `docs/zh-CN/` entry points
- Adding a new top-level module directory without a `README.md` is considered incomplete work.
- If a module's responsibilities, setup, public API surface, generator inputs, or adoption path change, update that
  module's `README.md` in the same change.

### Repository Documentation

- Update the relevant `README.md` or `docs/` page when behavior, setup steps, architecture guidance, or user-facing
  examples change.
- Public documentation under `README.md` and `docs/**` MUST stay reader-facing. Do not publish governance-only content
  such as inventory tables, coverage baselines, review queues, batch metrics, recovery points, trace summaries, or
  “this still needs a later audit wave” notes in those user-facing pages.
- Public documentation MUST use semantic section titles and link labels. Do not surface raw filenames or paths such as
  `README.md`、`game/index.md`、`../core/cqrs.md` as reader-facing navigation text when a meaningful destination label
  is available.
- Public documentation MUST avoid rhetorical, self-referential, or AI-sounding headings and prompts such as
  “你真正会用到的公开入口”、“先理解包关系”、“这个栏目应该回答什么” or “想看……转到……”. Prefer neutral labels such as
  “公开入口”、“模块与包关系”、“栏目覆盖范围” and “相关主题”.
- Public documentation MUST present limitations, suitability, and migration boundaries as adoption guidance for
  readers. Do not publish internal-governance or product-roadmap wording such as “当前阶段的结论”、“不建议立即启动”、
  “仓库当前的主要使用者” or similar maintainer-facing decision records in `README.md` or `docs/**`; that material
  belongs in `ai-plan/**` if it must be tracked.
- Governance-only material such as XML audit snapshots, documentation remediation baselines, backlog status, and
  recovery metadata belongs in `ai-plan/**` or other contributor-only artifacts, not in public docs.
- Treat `ai-libs/` as a read-only third-party source reference area.
- Code under `ai-libs/**` exists for comparison, tracing, design study, and behavior verification; do not modify it
  unless the user explicitly asks to sync or update that third-party snapshot.
- When implementation plans, traces, reviews, or design notes say “reference a third-party project”, prefer the
  repository-local path under `ai-libs/` instead of an unspecified upstream repository.
- If a task depends on observations from `ai-libs/**`, record the referenced path and conclusion in the active plan or
  trace when the work is multi-step or complex, or when an active tracking document already exists, rather than editing
  the third-party reference copy.
- The main documentation site lives under `docs/`, with Chinese content under `docs/zh-CN/`.
- Keep code samples, package names, and command examples aligned with the current repository state.
- Prefer documenting behavior and design intent, not only API surface.
- When a public page references XML docs or API coverage, convert that evidence into reader-facing guidance: explain
  which types, namespaces, or entry points readers should inspect and why, instead of exposing audit counts or
  governance terminology.
- When a feature is added, removed, renamed, or substantially refactored, contributors MUST update or create the
  corresponding user-facing integration documentation in `docs/zh-CN/` in the same change.
- For integration-oriented features such as the AI-First config system, documentation MUST cover:
  - project directory layout and file conventions
  - required project or package wiring
  - minimal working usage example
  - migration or compatibility notes when behavior changes
- If an existing documentation page no longer reflects the current implementation, fixing the code without fixing the
  documentation is considered incomplete work.
- Do not rely on “the code is self-explanatory” for framework features that consumers need to adopt; write the
  adoption path down so future users do not need to rediscover it from source.
- The repository root `README.md` MUST mirror the current top-level documentation taxonomy used by the docs site.
  Do not maintain a second, differently named navigation system in the root README.
- Prefer linking the root `README.md` to section landing pages such as `index.md` instead of deep-linking to a single
  article when the target is intended to be a documentation category.
- If a docs category appears in VitePress navigation or sidebar, it MUST have a real landing page or be removed from
  navigation in the same change.
- When examples are rewritten, preserve only the parts that remain true. Delete or replace speculative examples
  instead of lightly editing them into another inaccurate form.
|
|
||||||
|
|
||||||
### Task Tracking

- `ai-plan/` is split by intent:
  - `ai-plan/public/README.md`: the shared startup index that binds worktrees or branches to active topics and resume entry points
  - `ai-plan/public/<topic>/todos/`: repository-safe recovery documents for an active topic
  - `ai-plan/public/<topic>/traces/`: repository-safe execution traces for an active topic
  - `ai-plan/public/<topic>/archive/`: archived stage-level artifacts that still belong to an active topic; prefer `archive/todos/` and `archive/traces/` when archiving content cut out of the active entry files
  - `ai-plan/public/archive/<topic>/`: completed-topic archives that should not be treated as default boot context
  - `ai-plan/private/`: worktree-private recovery artifacts; keep these untracked and scoped to the current worktree
- Contributors MUST keep committed `ai-plan/public/**` content safe to publish in Git history.
- Never write secrets, tokens, credentials, private keys, machine usernames, home-directory paths, hostnames, IP addresses, proprietary URLs, or other sensitive environment details into any `ai-plan/**` file.
- Never record absolute file-system paths in `ai-plan/**`; use repository-relative paths, branch names, PR numbers, or stable document identifiers instead.
- Use `ai-plan/public/**` only for durable, handoff-safe task state. Put temporary notes, local experiments, or worktree-specific scratch recovery data under `ai-plan/private/`.
- `ai-plan/public/README.md` MUST list only active topics. Do not add `ai-plan/public/archive/**` content to the default boot index.
- When a worktree-to-topic mapping changes, or when a topic becomes active/inactive, contributors MUST update `ai-plan/public/README.md` in the same change.
- When working from a tracked implementation plan, contributors MUST update the corresponding tracking document under `ai-plan/public/<topic>/todos/` in the same change.
- Tracking updates MUST reflect completed work, newly discovered issues, validation results, and the next recommended recovery point.
- Active tracking and trace files are recovery entrypoints, not append-only changelogs. They MUST stay concise enough for `boot` to locate the current recovery point quickly.
- Completing code changes without updating the active tracking document is considered incomplete work.
- For any multi-step refactor, migration, or cross-module task, contributors MUST create or adopt a dedicated recovery document under `ai-plan/public/<topic>/todos/` before making substantive code changes.
- Recovery documents MUST record the current phase, the active recovery point identifier, known risks, and the next recommended resume step so another contributor or subagent can continue the work safely.
- Contributors MUST maintain a matching execution trace under `ai-plan/public/<topic>/traces/` for complex work. The trace should record the current date, key decisions, validation milestones, and the immediate next step.
- When a stage inside an active topic is fully complete, move the finished artifacts into that topic's `archive/` directory instead of leaving every completed step in the default boot path.
- When completed and validated stages begin to accumulate, contributors MUST archive their detailed history out of the active `todos/` and `traces/` entry files in the same change. Keep only the current recovery point, active facts, active risks, immediate next step, and pointers to the relevant archive files in the default boot path.
- When a topic is fully complete, move the entire topic directory under `ai-plan/public/archive/<topic>/` and remove it from `ai-plan/public/README.md` in the same change.
- When a task spans multiple commits or is likely to exceed a single agent context window, update both the recovery document and the trace at each meaningful milestone before pausing or handing work off.
- If subagents are used on a complex task, the main agent MUST capture the delegated scope and any accepted findings in the active recovery document or trace before continuing implementation.

### Documentation Preview

When documentation changes need local preview, use:

```bash
cd docs && bun install && bun run dev
```

## Review Standard

Before considering work complete, confirm:

- Required comments and XML docs are present
- Code follows repository style and naming rules
- Relevant tests were added or updated
- Sensitive or unsafe behavior was not introduced
- User-facing documentation is updated when needed
- Feature adoption docs under `docs/zh-CN/` were added or updated when functionality was added, removed, or refactored

CLAUDE.md
@ -1,144 +0,0 @@
# CLAUDE.md

This file provides project understanding for AI agents working in this repository.

## Project Overview

GFramework is a modular C# framework for game development whose core capabilities are decoupled from any specific engine. The project takes inspiration from QFramework and keeps being refactored around module boundaries, project organization, and extensibility.

## AI Agent Instructions

All coding rules are defined in:

@AGENTS.md

Follow them strictly.

## Module Dependency Graph

```text
GFramework (meta package) ─→ Core + Game
GFramework.Cqrs ─→ Cqrs.Abstractions, Core.Abstractions
GFramework.Core ─→ Core.Abstractions
GFramework.Game ─→ Game.Abstractions, Core, Core.Abstractions
GFramework.Godot ─→ Core, Game, Core.Abstractions, Game.Abstractions
GFramework.Ecs.Arch ─→ Ecs.Arch.Abstractions, Core, Core.Abstractions
GFramework.Core.SourceGenerators ─→ Core.SourceGenerators.Abstractions, SourceGenerators.Common
GFramework.Game.SourceGenerators ─→ SourceGenerators.Common
GFramework.Godot.SourceGenerators ─→ Godot.SourceGenerators.Abstractions, SourceGenerators.Common
GFramework.Cqrs.SourceGenerators ─→ SourceGenerators.Common
```

- **Abstractions projects** (`netstandard2.1`): contain only interface and contract definitions; they carry no runtime implementation logic.
- **Core / Game / Ecs.Arch** (`net8.0;net9.0;net10.0`): the platform-agnostic core implementation layer.
- **Godot**: the Godot engine integration layer, responsible for bridging nodes, scenes, and the engine lifecycle.
- **SourceGenerators family** (`netstandard2.0`/`netstandard2.1`): Roslyn incremental source generators split along Core / Game / Godot / Cqrs lines, plus the shared abstractions/common infrastructure.

## Architecture Pattern

The framework core uses a four-layer `Architecture / Model / System / Utility` structure:

- **IArchitecture**: the top-level container, responsible for lifecycle management, component registration, module installation, and unified service access.
- **IContextAware**: the unified context-access interface; components obtain the architecture context through `SetContext(IArchitectureContext)`.
- **IModel**: the data and state layer, modeling long-lived state and business data.
- **ISystem**: the business-logic layer, responsible for command execution, process orchestration, and rule enforcement.
- **IUtility**: the general-purpose stateless utility layer, reused by the other layers.

The key implementation lives in `GFramework.Core/Architectures/Architecture.cs`, which acts as the overall coordinator wiring together the lifecycle, component registration, and the module system.

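A minimal sketch of how a consumer might wire these four layers together. Only the `RegisterSystem` / `RegisterModel` / `RegisterUtility` entry points come from the `IArchitecture` contract shown in this repository snapshot; the `Architecture<T>` base-class name, the `Init` override, and all concrete type names (`GameApp`, `CounterModel`, `ScoreSystem`, `StorageUtility`) are placeholders, not the framework's actual API.

```csharp
// Hypothetical consumer-side wiring; base-class name and Init signature are assumptions.
public sealed class GameApp : Architecture<GameApp>
{
    protected override void Init()
    {
        RegisterUtility(new StorageUtility()); // stateless helpers first
        RegisterModel(new CounterModel());     // long-lived state next
        RegisterSystem(new ScoreSystem());     // business logic last
    }
}
```

Registering utilities before models and models before systems mirrors the ordering guarantee the lifecycle section below describes.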

## Architecture Details

### Lifecycle

Architecture performs unified lifecycle orchestration; the core phases are:

- `Init`
- `Ready`
- `Destroy`

At the implementation level, the lifecycle is split into finer-grained initialization and destruction phases to guarantee consistent ordering across utilities, models, systems, service modules, and hooks.

### Component Coordination

The framework orchestrates the architecture through independent collaborating components:

- `ArchitectureLifecycle`: manages lifecycle phases, phase transitions, and lifecycle hooks.
- `ArchitectureComponentRegistry`: manages registration and resolution of models, systems, and utilities.
- `ArchitectureModules`: manages module installation, service-module integration, and extension-point registration.

The goal of this split is to reduce the responsibility density of any single core class while keeping the public API stable.

### Context Propagation

`IArchitectureContext` and the related provider types propagate context capabilities between components, so that models, systems, and external extensions can all access architecture services through a unified entry point without coupling to concrete implementation details.

## Key Patterns

### CQRS

Commands and queries are separated, with both synchronous and asynchronous execution. The current version ships its own built-in CQRS runtime, behavior pipeline, and automatic handler registration; the legacy `Mediator` compatibility aliases have been removed from the public API in favor of the unified `Cqrs`-named entry points.
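A request/handler pair under this pattern might look as follows. Only `IRequest<TResponse>` and `SendRequestAsync` appear in this repository snapshot (in the `IArchitectureContext` documentation); the `IRequestHandler<,>` interface shape and the `HandleAsync` signature shown here are assumptions, and `GetScoreQuery` / `GetScoreHandler` are placeholder names.

```csharp
// Hypothetical CQRS request and handler; handler interface shape is assumed.
public sealed record GetScoreQuery(string PlayerId) : IRequest<int>;

public sealed class GetScoreHandler : IRequestHandler<GetScoreQuery, int>
{
    public Task<int> HandleAsync(GetScoreQuery request, CancellationToken ct) =>
        Task.FromResult(0); // look up the real score here
}

// Dispatch through the architecture context:
// int score = await context.SendRequestAsync(new GetScoreQuery("p1"), ct);
```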

### EventBus

A type-safe event bus supporting event publish, subscribe, priorities, filters, and weak-reference subscriptions. It is one of the core pieces of infrastructure for loosely coupled communication between modules.

### BindableProperty

A reactive property model that drives UI or business-layer updates through value-change notifications, well suited to lightweight state synchronization.
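The reactive-property idea can be illustrated with a standalone sketch. This is not the GFramework `BindableProperty` API, whose exact surface is not shown in this snapshot; it only demonstrates the change-notification pattern the section describes.

```csharp
using System;
using System.Collections.Generic;

// Standalone illustration of a reactive property; not the framework's actual type.
public sealed class Bindable<T>
{
    private T _value;
    public event Action<T>? ValueChanged;

    public Bindable(T initial) => _value = initial;

    public T Value
    {
        get => _value;
        set
        {
            // Notify subscribers only on real value changes.
            if (EqualityComparer<T>.Default.Equals(_value, value)) return;
            _value = value;
            ValueChanged?.Invoke(value);
        }
    }
}

// var hp = new Bindable<int>(100);
// hp.ValueChanged += v => Console.WriteLine($"HP: {v}");
// hp.Value = 90; // subscribers observe 90; setting 90 again raises nothing
```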

### Coroutine

A frame-driven coroutine system built on `IYieldInstruction` and scheduler abstractions, supporting common patterns such as waiting for elapsed time, events, and task completion.

### IoC

Dependency injection wraps `Microsoft.Extensions.DependencyInjection` behind `MicrosoftDiContainer` to provide a unified experience for component registration and service resolution.

### Service Modules

The `IServiceModule` pattern registers built-in services such as the EventBus, CommandExecutor, and QueryExecutor with the Architecture. This pattern carries the responsibility of assembling infrastructure capabilities.

## Source Generators

The repository currently contains several kinds of Roslyn incremental source generators:

- `LoggerGenerator` (`[Log]`): generates logger fields and logging helper methods.
- `PriorityGenerator` (`[Priority]`): generates priority-comparison implementations.
- `EnumExtensionsGenerator` (`[GenerateEnumExtensions]`): generates enum extension capabilities.
- `ContextAwareGenerator` (`[ContextAware]`): auto-implements the `IContextAware` boilerplate.
- `CqrsHandlerRegistryGenerator`: generates CQRS handler registrars for consuming assemblies; at runtime the generated output is preferred, falling back to reflection scanning where it cannot cover a case. Non-default assemblies can opt into the same path explicitly via `RegisterCqrsHandlersFromAssembly(...)` / `RegisterCqrsHandlersFromAssemblies(...)`.

These generators aim to reduce repetitive code while keeping the framework-level API consistent and maintainable.

## Module Structure

The repository is organized into abstraction, implementation, integration, and generator layers:

- `GFramework.Core.Abstractions` / `GFramework.Game.Abstractions`: constraining interfaces and shared contracts.
- `GFramework.Cqrs.Abstractions` / `GFramework.Cqrs`: CQRS contracts, runtime, and handler-registration infrastructure.
- `GFramework.Core` / `GFramework.Game`: platform-agnostic implementations.
- `GFramework.Godot`: adapter implementations integrating with the Godot runtime.
- `GFramework.Ecs.Arch`: ECS Architecture extensions.
- `GFramework.Core.SourceGenerators` / `GFramework.Game.SourceGenerators` / `GFramework.Godot.SourceGenerators` / `GFramework.Cqrs.SourceGenerators` plus the related Abstractions/Common projects: code-generation capabilities.

The core design goal of this structure is stable abstractions, replaceable implementations, isolated engine integration, and independently evolvable generator capabilities.

## Documentation Structure

Project documentation lives under `docs/`, with Chinese content under `docs/zh-CN/`. It covers:

- Getting started and installation
- Capabilities of the Core / Game / Godot / ECS modules
- Source generator usage
- Tutorials, best practices, and troubleshooting

The recommended reading order is the root `README.md` and each submodule's `README.md` first, then the topic pages under `docs/`.

## Design Intent

GFramework's design focus is not to pile every capability into a single core class, but to build a game-development foundation fit for long-term evolution through clear module boundaries, composable service registration, stable abstract contracts, and a measured amount of automated source generation.
@ -1,18 +0,0 @@
<!--
Copyright (c) 2025-2026 GeWuYou
SPDX-License-Identifier: Apache-2.0
-->

<Project>
  <!-- Keep repository-wide analyzer behavior consistent while allowing only selected projects to opt into polyfills. -->
  <ItemGroup>
    <PackageReference Include="Meziantou.Analyzer" Version="3.0.72">
      <PrivateAssets>all</PrivateAssets>
      <IncludeAssets>runtime; build; native; contentfiles; analyzers</IncludeAssets>
    </PackageReference>
    <PackageReference Update="Meziantou.Polyfill" Version="1.0.123">
      <PrivateAssets>all</PrivateAssets>
      <IncludeAssets>runtime; build; native; contentfiles; analyzers</IncludeAssets>
    </PackageReference>
  </ItemGroup>
</Project>
@ -1,176 +0,0 @@
<!--
Copyright (c) 2025-2026 GeWuYou
SPDX-License-Identifier: Apache-2.0
-->

<Project>

  <!--
  Generates optional module-level transitive global usings for GFramework runtime packages.
  This logic only takes effect in packable projects that explicitly enable it, and it scans
  source namespaces automatically during build/pack.
  -->
  <UsingTask TaskName="GenerateGFrameworkTransitiveGlobalUsingsProps"
             TaskFactory="RoslynCodeTaskFactory"
             AssemblyFile="$(MSBuildToolsPath)/Microsoft.Build.Tasks.Core.dll">
    <ParameterGroup>
      <SourceFiles ParameterType="Microsoft.Build.Framework.ITaskItem[]" Required="true"/>
      <ExcludedNamespaces ParameterType="Microsoft.Build.Framework.ITaskItem[]"/>
      <ExcludedNamespacePrefixes ParameterType="Microsoft.Build.Framework.ITaskItem[]"/>
      <OutputFile ParameterType="System.String" Required="true"/>
      <NamespaceItemName ParameterType="System.String" Required="true"/>
    </ParameterGroup>
    <Task>
      <Code Type="Fragment"
            Language="cs"><![CDATA[
var discoveredNamespaces = new global::System.Collections.Generic.SortedSet<string>(global::System.StringComparer.Ordinal);
var exactExclusions = new global::System.Collections.Generic.HashSet<string>(global::System.StringComparer.Ordinal);
var prefixExclusions = new global::System.Collections.Generic.List<string>();
var namespacePattern = new global::System.Text.RegularExpressions.Regex(
    @"^\s*namespace\s+([A-Za-z_][A-Za-z0-9_.]*)\s*(?:;|\{)",
    global::System.Text.RegularExpressions.RegexOptions.Compiled);

if (ExcludedNamespaces != null)
{
    foreach (var excludedNamespace in ExcludedNamespaces)
    {
        if (!string.IsNullOrWhiteSpace(excludedNamespace.ItemSpec))
        {
            exactExclusions.Add(excludedNamespace.ItemSpec.Trim());
        }
    }
}

if (ExcludedNamespacePrefixes != null)
{
    foreach (var excludedPrefix in ExcludedNamespacePrefixes)
    {
        if (!string.IsNullOrWhiteSpace(excludedPrefix.ItemSpec))
        {
            prefixExclusions.Add(excludedPrefix.ItemSpec.Trim());
        }
    }
}

foreach (var sourceFile in SourceFiles)
{
    var path = sourceFile.ItemSpec;
    if (!global::System.IO.File.Exists(path))
    {
        continue;
    }

    foreach (var line in global::System.IO.File.ReadLines(path))
    {
        var match = namespacePattern.Match(line);
        if (!match.Success)
        {
            continue;
        }

        var namespaceName = match.Groups[1].Value;
        if (!namespaceName.StartsWith("GFramework.", global::System.StringComparison.Ordinal))
        {
            continue;
        }

        if (exactExclusions.Contains(namespaceName))
        {
            continue;
        }

        var excludedByPrefix = false;
        foreach (var prefix in prefixExclusions)
        {
            if (namespaceName.StartsWith(prefix, global::System.StringComparison.Ordinal))
            {
                excludedByPrefix = true;
                break;
            }
        }

        if (!excludedByPrefix)
        {
            discoveredNamespaces.Add(namespaceName);
        }
    }
}

static string Escape(string value)
{
    return global::System.Security.SecurityElement.Escape(value) ?? value;
}

var directory = global::System.IO.Path.GetDirectoryName(OutputFile);
if (!string.IsNullOrEmpty(directory))
{
    global::System.IO.Directory.CreateDirectory(directory);
}

var builder = new global::System.Text.StringBuilder();
var msbuildPropertyOpen = new string(new[] { '$', '(' });
var msbuildItemOpen = new string(new[] { '@', '(' });
builder.AppendLine("<Project>");
builder.AppendLine(" <!-- This file is generated by GFramework's MSBuild transitive global usings pipeline. -->");
builder.AppendLine(" <!-- EnableGFrameworkGlobalUsings=true enables the transitive global usings from this package. -->");
builder.AppendLine(" <!-- Add <GFrameworkExcludedUsing Include=\"Namespace\" /> to opt out of specific namespaces. -->");
builder.Append(" <ItemGroup Condition=\"'");
builder.Append(msbuildPropertyOpen);
builder.AppendLine("EnableGFrameworkGlobalUsings)' == 'true'\">");

foreach (var namespaceName in discoveredNamespaces)
{
    builder.Append(" <");
    builder.Append(NamespaceItemName);
    builder.Append(" Include=\"");
    builder.Append(Escape(namespaceName));
    builder.AppendLine("\" />");
}

builder.Append(" <");
builder.Append(NamespaceItemName);
builder.Append(" Remove=\"");
builder.Append(msbuildItemOpen);
builder.AppendLine("GFrameworkExcludedUsing)\" />");
builder.Append(" <Using Include=\"");
builder.Append(msbuildItemOpen);
builder.Append(NamespaceItemName);
builder.AppendLine(")\" />");
builder.AppendLine(" </ItemGroup>");
builder.AppendLine("</Project>");

global::System.IO.File.WriteAllText(OutputFile, builder.ToString(), new global::System.Text.UTF8Encoding(false));
Log.LogMessage(global::Microsoft.Build.Framework.MessageImportance.Low,
    $"Generated {discoveredNamespaces.Count} transitive global usings for {OutputFile}.");
]]></Code>
    </Task>
  </UsingTask>

  <PropertyGroup>
    <_GFrameworkTransitiveGlobalUsingsEnabled Condition="'$(EnableGFrameworkPackageTransitiveGlobalUsings)' == 'true' and '$(IsPackable)' != 'false'">true</_GFrameworkTransitiveGlobalUsingsEnabled>
    <_GFrameworkTransitiveGlobalUsingsPrimaryTargetFramework Condition="'$(_GFrameworkTransitiveGlobalUsingsEnabled)' == 'true' and '$(TargetFrameworks)' != ''">$([System.String]::Copy('$(TargetFrameworks)').Split(';')[0])</_GFrameworkTransitiveGlobalUsingsPrimaryTargetFramework>
    <_GFrameworkTransitiveGlobalUsingsGenerationBuild Condition="'$(_GFrameworkTransitiveGlobalUsingsEnabled)' == 'true' and ('$(TargetFrameworks)' == '' or '$(TargetFramework)' == '$(_GFrameworkTransitiveGlobalUsingsPrimaryTargetFramework)')">true</_GFrameworkTransitiveGlobalUsingsGenerationBuild>
    <_GFrameworkTransitiveGlobalUsingsPackageId Condition="'$(_GFrameworkTransitiveGlobalUsingsEnabled)' == 'true' and '$(PackageId)' != ''">$(PackageId)</_GFrameworkTransitiveGlobalUsingsPackageId>
    <_GFrameworkTransitiveGlobalUsingsPackageId Condition="'$(_GFrameworkTransitiveGlobalUsingsEnabled)' == 'true' and '$(_GFrameworkTransitiveGlobalUsingsPackageId)' == ''">$(AssemblyName)</_GFrameworkTransitiveGlobalUsingsPackageId>
    <_GFrameworkTransitiveGlobalUsingsOutputFile Condition="'$(_GFrameworkTransitiveGlobalUsingsEnabled)' == 'true'">$(BaseIntermediateOutputPath)gframework/$(_GFrameworkTransitiveGlobalUsingsPackageId).props</_GFrameworkTransitiveGlobalUsingsOutputFile>
    <_GFrameworkTransitiveGlobalUsingsItemName Condition="'$(_GFrameworkTransitiveGlobalUsingsEnabled)' == 'true'">_$([System.Text.RegularExpressions.Regex]::Replace('$(MSBuildProjectName)', '[^A-Za-z0-9_]', '_'))_TransitiveUsing</_GFrameworkTransitiveGlobalUsingsItemName>
  </PropertyGroup>

  <ItemGroup Condition="'$(_GFrameworkTransitiveGlobalUsingsEnabled)' == 'true'">
    <None Include="$(_GFrameworkTransitiveGlobalUsingsOutputFile)"
          Pack="true"
          PackagePath="buildTransitive"
          Visible="false"/>
  </ItemGroup>

  <Target Name="GenerateGFrameworkModuleTransitiveGlobalUsings"
          Condition="'$(_GFrameworkTransitiveGlobalUsingsGenerationBuild)' == 'true'"
          BeforeTargets="CoreCompile;GenerateNuspec">
    <GenerateGFrameworkTransitiveGlobalUsingsProps
        SourceFiles="@(Compile->'%(FullPath)')"
        ExcludedNamespaces="@(GFrameworkTransitiveUsingExclude)"
        ExcludedNamespacePrefixes="@(GFrameworkTransitiveUsingExcludePrefix)"
        OutputFile="$(_GFrameworkTransitiveGlobalUsingsOutputFile)"
        NamespaceItemName="$(_GFrameworkTransitiveGlobalUsingsItemName)"/>
  </Target>

</Project>
@ -1,45 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

using System.Collections.Concurrent;

namespace GFramework.Core.Abstractions.Architectures;

/// <summary>
/// Architecture module registry, used for automatic registration of external modules.
/// </summary>
public static class ArchitectureModuleRegistry
{
    private static readonly ConcurrentDictionary<string, Func<IServiceModule>> Factories = new(StringComparer.Ordinal);

    /// <summary>
    /// Registers a module factory (idempotent: the same module name is only registered once).
    /// </summary>
    /// <param name="factory">The module factory function.</param>
    public static void Register(Func<IServiceModule> factory)
    {
        // Create a temporary instance to obtain the module name (used for the idempotency check)
        var tempModule = factory();
        var moduleName = tempModule.ModuleName;

        // Idempotent registration: each module name is registered at most once
        Factories.TryAdd(moduleName, factory);
    }

    /// <summary>
    /// Creates instances of all registered modules.
    /// </summary>
    /// <returns>The collection of module instances.</returns>
    public static IEnumerable<IServiceModule> CreateModules()
    {
        return Factories.Values.Select(f => f());
    }

    /// <summary>
    /// Clears the registry (primarily intended for tests).
    /// </summary>
    public static void Clear()
    {
        Factories.Clear();
    }
}
@ -1,27 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

using GFramework.Core.Abstractions.Enums;

namespace GFramework.Core.Abstractions.Architectures;

/// <summary>
/// Represents the data for an architecture phase-change event.
/// This type passes the phase that has just been entered to event subscribers.
/// </summary>
public sealed class ArchitecturePhaseChangedEventArgs : EventArgs
{
    /// <summary>
    /// Initializes a new instance of <see cref="ArchitecturePhaseChangedEventArgs" />.
    /// </summary>
    /// <param name="phase">The architecture phase that has just been entered.</param>
    public ArchitecturePhaseChangedEventArgs(ArchitecturePhase phase)
    {
        Phase = phase;
    }

    /// <summary>
    /// Gets the architecture phase that has just been entered.
    /// </summary>
    public ArchitecturePhase Phase { get; }
}
@ -1,138 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

using System.Reflection;
using GFramework.Core.Abstractions.Lifecycle;
using GFramework.Core.Abstractions.Model;
using GFramework.Core.Abstractions.Systems;
using GFramework.Core.Abstractions.Utility;

namespace GFramework.Core.Abstractions.Architectures;

/// <summary>
/// Architecture interface focused on lifecycle management, including registration and retrieval of systems, models, and utilities.
/// Business operations are provided through ArchitectureRuntime.
/// </summary>
public interface IArchitecture : IAsyncInitializable, IAsyncDestroyable, IInitializable, IDestroyable
{
    /// <summary>
    /// Gets the architecture context.
    /// </summary>
    IArchitectureContext Context { get; }

    /// <summary>
    /// Gets or sets the delegate used to configure the service collection.
    /// </summary>
    /// <value>
    /// A nullable delegate that configures the IServiceCollection instance.
    /// </value>
    Action<IServiceCollection>? Configurator { get; }

    /// <summary>
    /// Registers a system instance with the architecture.
    /// </summary>
    /// <typeparam name="T">The system type; must implement the ISystem interface.</typeparam>
    /// <param name="system">The system instance to register.</param>
    /// <returns>The registered system instance.</returns>
    T RegisterSystem<T>(T system) where T : ISystem;

    /// <summary>
    /// Registers a system type with the architecture.
    /// </summary>
    /// <typeparam name="T">The system type; must implement the ISystem interface.</typeparam>
    /// <param name="onCreated">Callback invoked after the system instance is created; may be null.</param>
    void RegisterSystem<T>(Action<T>? onCreated = null) where T : class, ISystem;

    /// <summary>
    /// Registers a model instance with the architecture.
    /// </summary>
    /// <typeparam name="T">The model type; must implement the IModel interface.</typeparam>
    /// <param name="model">The model instance to register.</param>
    /// <returns>The registered model instance.</returns>
    T RegisterModel<T>(T model) where T : IModel;

    /// <summary>
    /// Registers a model type with the architecture.
    /// </summary>
    /// <typeparam name="T">The model type; must implement the IModel interface.</typeparam>
    /// <param name="onCreated">Callback invoked after the model instance is created; may be null.</param>
    void RegisterModel<T>(Action<T>? onCreated = null) where T : class, IModel;

    /// <summary>
    /// Registers a utility instance with the architecture.
    /// </summary>
    /// <typeparam name="T">The utility type; must implement the IUtility interface.</typeparam>
    /// <param name="utility">The utility instance to register.</param>
    /// <returns>The registered utility instance.</returns>
    T RegisterUtility<T>(T utility) where T : IUtility;


    /// <summary>
    /// Registers a utility type with an optional creation callback.
    /// The callback is invoked when the utility instance is created.
    /// </summary>
    /// <typeparam name="T">The utility type; must be a reference type implementing the IUtility interface.</typeparam>
    /// <param name="onCreated">Callback invoked after the utility instance is created; may be null.</param>
    void RegisterUtility<T>(Action<T>? onCreated = null) where T : class, IUtility;

    /// <summary>
    /// Registers a CQRS request pipeline behavior.
    /// Supports both open generic behavior types implementing <c>IPipelineBehavior&lt;,&gt;</c>
    /// and closed behavior types bound to a single request/response pair.
    /// </summary>
    /// <typeparam name="TBehavior">The behavior type; must be a reference type.</typeparam>
    void RegisterCqrsPipelineBehavior<TBehavior>()
        where TBehavior : class;

    /// <summary>
    /// Registers a CQRS stream request pipeline behavior.
    /// Supports both open generic behavior types implementing <c>IStreamPipelineBehavior&lt;,&gt;</c>
    /// and closed behavior types bound to a single stream request/response pair.
    /// </summary>
    /// <typeparam name="TBehavior">The behavior type; must be a reference type.</typeparam>
    /// <exception cref="InvalidOperationException">The architecture's underlying container has been frozen; no further stream pipeline behaviors can be registered.</exception>
    /// <exception cref="ObjectDisposedException">The architecture's underlying container has been disposed; no further stream pipeline behaviors can be registered.</exception>
    /// <remarks>
    /// This entry point should be called before architecture initialization freezes the container; validation of open generic or closed behavior types is delegated to the underlying container.
    /// </remarks>
    void RegisterCqrsStreamPipelineBehavior<TBehavior>()
        where TBehavior : class;

    /// <summary>
    /// Explicitly registers CQRS handlers from the specified assembly.
    /// Call this entry point during initialization to attach assemblies whose handlers live outside the default architecture assembly, such as module or extension assemblies.
    /// </summary>
    /// <param name="assembly">The assembly containing CQRS handlers or generated registrars.</param>
    /// <exception cref="ArgumentNullException"><paramref name="assembly" /> is <see langword="null" />.</exception>
    /// <exception cref="InvalidOperationException">The architecture's underlying container has been frozen; no further handlers can be registered.</exception>
    void RegisterCqrsHandlersFromAssembly(Assembly assembly);

    /// <summary>
    /// Explicitly registers CQRS handlers from multiple assemblies.
    /// This entry point de-duplicates the assembly collection and is suitable for attaching several extension packages or module assemblies at once.
    /// </summary>
    /// <param name="assemblies">The assemblies to attach.</param>
    /// <exception cref="ArgumentNullException"><paramref name="assemblies" /> is <see langword="null" />.</exception>
    /// <exception cref="InvalidOperationException">The architecture's underlying container has been frozen; no further handlers can be registered.</exception>
    void RegisterCqrsHandlersFromAssemblies(IEnumerable<Assembly> assemblies);

    /// <summary>
    /// Installs an architecture module.
    /// </summary>
    /// <param name="module">The module to install.</param>
    /// <returns>The installed module instance.</returns>
    IArchitectureModule InstallModule(IArchitectureModule module);

    /// <summary>
    /// Registers a lifecycle hook.
    /// </summary>
    /// <param name="hook">The lifecycle hook instance.</param>
    /// <returns>The registered hook instance.</returns>
    IArchitectureLifecycleHook RegisterLifecycleHook(IArchitectureLifecycleHook hook);

    /// <summary>
    /// Asynchronously waits until the architecture is ready.
    /// </summary>
    /// <returns>A task representing the asynchronous wait operation.</returns>
    Task WaitUntilReadyAsync();
}
@ -1,22 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

using GFramework.Core.Abstractions.Properties;

namespace GFramework.Core.Abstractions.Architectures;

/// <summary>
/// Defines the architecture configuration interface, providing configuration of the logger factory, log level, and architecture options.
/// </summary>
public interface IArchitectureConfiguration
{
    /// <summary>
    /// Gets or sets the logger options containing logging-related configuration parameters.
    /// </summary>
    LoggerProperties LoggerProperties { get; set; }

    /// <summary>
    /// Gets or sets the architecture options containing architecture-related configuration parameters.
    /// </summary>
    ArchitectureProperties ArchitectureProperties { get; set; }
}
@@ -1,293 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

using GFramework.Core.Abstractions.Command;
using GFramework.Core.Abstractions.Environment;
using GFramework.Core.Abstractions.Events;
using GFramework.Core.Abstractions.Model;
using GFramework.Core.Abstractions.Query;
using GFramework.Core.Abstractions.Systems;
using GFramework.Core.Abstractions.Utility;
using GFramework.Cqrs.Abstractions.Cqrs;
using ICommand = GFramework.Core.Abstractions.Command.ICommand;

namespace GFramework.Core.Abstractions.Architectures;

/// <summary>
/// Architecture context interface that exposes framework component access, keeps the legacy command/query buses working, and provides the currently recommended CQRS runtime entry points.
/// </summary>
/// <remarks>
/// <para>The legacy <c>GFramework.Core.Abstractions.Command</c> and <c>GFramework.Core.Abstractions.Query</c> contracts keep executing through the original command/query executor paths to stay compatible with existing code.</para>
/// <para>The new <c>GFramework.Cqrs.Abstractions.Cqrs</c> contracts are handled uniformly by the built-in CQRS dispatcher, which supports the request pipeline, notification publishing, and stream requests.</para>
/// <para>New features should prefer <see cref="SendRequestAsync{TResponse}(IRequest{TResponse},CancellationToken)" />, <see cref="SendAsync{TCommand}(TCommand,CancellationToken)" />, and the corresponding CQRS command/query overloads; when migrating legacy code, keep the old entry points at first, then replace them with the CQRS request model incrementally.</para>
/// </remarks>
public interface IArchitectureContext : ICqrsContext
{
    /// <summary>
    /// Gets the service instance of the specified type.
    /// </summary>
    /// <typeparam name="TService">The service type.</typeparam>
    /// <returns>The service instance; an exception is thrown if it does not exist.</returns>
    TService GetService<TService>() where TService : class;

    /// <summary>
    /// Gets all service instances of the specified type.
    /// </summary>
    /// <typeparam name="TService">The service type.</typeparam>
    /// <returns>A list of all matching service instances.</returns>
    IReadOnlyList<TService> GetServices<TService>() where TService : class;

    /// <summary>
    /// Gets the system instance of the specified type.
    /// </summary>
    /// <typeparam name="TSystem">The system type; must implement the ISystem interface.</typeparam>
    /// <returns>The system instance; an exception is thrown if it does not exist.</returns>
    TSystem GetSystem<TSystem>() where TSystem : class, ISystem;

    /// <summary>
    /// Gets all system instances of the specified type.
    /// </summary>
    /// <typeparam name="TSystem">The system type; must implement the ISystem interface.</typeparam>
    /// <returns>A list of all matching system instances.</returns>
    IReadOnlyList<TSystem> GetSystems<TSystem>() where TSystem : class, ISystem;

    /// <summary>
    /// Gets the model instance of the specified type.
    /// </summary>
    /// <typeparam name="TModel">The model type; must implement the IModel interface.</typeparam>
    /// <returns>The model instance; an exception is thrown if it does not exist.</returns>
    TModel GetModel<TModel>() where TModel : class, IModel;

    /// <summary>
    /// Gets all model instances of the specified type.
    /// </summary>
    /// <typeparam name="TModel">The model type; must implement the IModel interface.</typeparam>
    /// <returns>A list of all matching model instances.</returns>
    IReadOnlyList<TModel> GetModels<TModel>() where TModel : class, IModel;

    /// <summary>
    /// Gets the utility instance of the specified type.
    /// </summary>
    /// <typeparam name="TUtility">The utility type; must implement the IUtility interface.</typeparam>
    /// <returns>The utility instance; an exception is thrown if it does not exist.</returns>
    TUtility GetUtility<TUtility>() where TUtility : class, IUtility;

    /// <summary>
    /// Gets all utility instances of the specified type.
    /// </summary>
    /// <typeparam name="TUtility">The utility type; must implement the IUtility interface.</typeparam>
    /// <returns>A list of all matching utility instances.</returns>
    IReadOnlyList<TUtility> GetUtilities<TUtility>() where TUtility : class, IUtility;

    /// <summary>
    /// Gets all service instances of the specified type, ordered by priority.
    /// Services implementing IPrioritized are sorted by priority (smaller values rank higher).
    /// </summary>
    /// <typeparam name="TService">The service type.</typeparam>
    /// <returns>The priority-ordered list of service instances.</returns>
    IReadOnlyList<TService> GetServicesByPriority<TService>() where TService : class;

    /// <summary>
    /// Gets all system instances of the specified type, ordered by priority.
    /// Systems implementing IPrioritized are sorted by priority (smaller values rank higher).
    /// </summary>
    /// <typeparam name="TSystem">The system type; must implement the ISystem interface.</typeparam>
    /// <returns>The priority-ordered list of system instances.</returns>
    IReadOnlyList<TSystem> GetSystemsByPriority<TSystem>() where TSystem : class, ISystem;

    /// <summary>
    /// Gets all model instances of the specified type, ordered by priority.
    /// Models implementing IPrioritized are sorted by priority (smaller values rank higher).
    /// </summary>
    /// <typeparam name="TModel">The model type; must implement the IModel interface.</typeparam>
    /// <returns>The priority-ordered list of model instances.</returns>
    IReadOnlyList<TModel> GetModelsByPriority<TModel>() where TModel : class, IModel;

    /// <summary>
    /// Gets all utility instances of the specified type, ordered by priority.
    /// Utilities implementing IPrioritized are sorted by priority (smaller values rank higher).
    /// </summary>
    /// <typeparam name="TUtility">The utility type; must implement the IUtility interface.</typeparam>
    /// <returns>The priority-ordered list of utility instances.</returns>
    IReadOnlyList<TUtility> GetUtilitiesByPriority<TUtility>() where TUtility : class, IUtility;

    /// <summary>
    /// Sends a legacy command.
    /// </summary>
    /// <param name="command">The legacy command to send.</param>
    void SendCommand(ICommand command);

    /// <summary>
    /// Sends a legacy command that returns a result.
    /// </summary>
    /// <typeparam name="TResult">The command result type.</typeparam>
    /// <param name="command">The legacy command to send.</param>
    /// <returns>The command result.</returns>
    TResult SendCommand<TResult>(ICommand<TResult> command);

    /// <summary>
    /// Sends a new-style CQRS command and returns its result.
    /// </summary>
    /// <typeparam name="TResponse">The command response type.</typeparam>
    /// <param name="command">The CQRS command to send.</param>
    /// <returns>The command result.</returns>
    /// <remarks>
    /// This is the recommended command entry point after migration. Commands without a result should implement <c>IRequest&lt;Unit&gt;</c> and preferably go through <see cref="SendAsync{TCommand}(TCommand,CancellationToken)" />.
    /// </remarks>
    TResponse SendCommand<TResponse>(GFramework.Cqrs.Abstractions.Cqrs.Command.ICommand<TResponse> command);

    /// <summary>
    /// Asynchronously sends a legacy command.
    /// </summary>
    /// <param name="command">The legacy command to send.</param>
    Task SendCommandAsync(IAsyncCommand command);

    /// <summary>
    /// Asynchronously sends a new-style CQRS command and returns its result.
    /// </summary>
    /// <typeparam name="TResponse">The command response type.</typeparam>
    /// <param name="command">The CQRS command to send.</param>
    /// <param name="cancellationToken">The cancellation token.</param>
    /// <returns>A value task containing the command result.</returns>
    ValueTask<TResponse> SendCommandAsync<TResponse>(
        GFramework.Cqrs.Abstractions.Cqrs.Command.ICommand<TResponse> command,
        CancellationToken cancellationToken = default);

    /// <summary>
    /// Asynchronously sends a legacy command that returns a result.
    /// </summary>
    /// <typeparam name="TResult">The command result type.</typeparam>
    /// <param name="command">The legacy command to send.</param>
    /// <returns>The command result.</returns>
    Task<TResult> SendCommandAsync<TResult>(IAsyncCommand<TResult> command);

    /// <summary>
    /// Sends a legacy query.
    /// </summary>
    /// <typeparam name="TResult">The query result type.</typeparam>
    /// <param name="query">The legacy query to send.</param>
    /// <returns>The query result.</returns>
    TResult SendQuery<TResult>(IQuery<TResult> query);

    /// <summary>
    /// Sends a new-style CQRS query and returns its result.
    /// </summary>
    /// <typeparam name="TResponse">The query response type.</typeparam>
    /// <param name="query">The CQRS query to send.</param>
    /// <returns>The query result.</returns>
    /// <remarks>
    /// This is the recommended query entry point after migration. New queries should implement <c>GFramework.Cqrs.Abstractions.Cqrs.Query.IQuery&lt;TResponse&gt;</c>.
    /// </remarks>
    TResponse SendQuery<TResponse>(GFramework.Cqrs.Abstractions.Cqrs.Query.IQuery<TResponse> query);

    /// <summary>
    /// Asynchronously sends a legacy query.
    /// </summary>
    /// <typeparam name="TResult">The query result type.</typeparam>
    /// <param name="query">The legacy asynchronous query to send.</param>
    /// <returns>The query result.</returns>
    Task<TResult> SendQueryAsync<TResult>(IAsyncQuery<TResult> query);

    /// <summary>
    /// Asynchronously sends a new-style CQRS query and returns its result.
    /// </summary>
    /// <typeparam name="TResponse">The query response type.</typeparam>
    /// <param name="query">The CQRS query to send.</param>
    /// <param name="cancellationToken">The cancellation token.</param>
    /// <returns>A value task containing the query result.</returns>
    ValueTask<TResponse> SendQueryAsync<TResponse>(GFramework.Cqrs.Abstractions.Cqrs.Query.IQuery<TResponse> query,
        CancellationToken cancellationToken = default);

    /// <summary>
    /// Sends an event.
    /// </summary>
    /// <typeparam name="TEvent">The event type; must have a parameterless constructor.</typeparam>
    void SendEvent<TEvent>() where TEvent : new();

    /// <summary>
    /// Sends an event with a payload.
    /// </summary>
    /// <typeparam name="TEvent">The event type.</typeparam>
    /// <param name="e">The event instance.</param>
    void SendEvent<TEvent>(TEvent e) where TEvent : class;

    /// <summary>
    /// Registers an event handler.
    /// </summary>
    /// <typeparam name="TEvent">The event type.</typeparam>
    /// <param name="handler">The event handler delegate.</param>
    /// <returns>An unregistration handle.</returns>
    IUnRegister RegisterEvent<TEvent>(Action<TEvent> handler);

    /// <summary>
    /// Unregisters an event listener.
    /// </summary>
    /// <typeparam name="TEvent">The event type.</typeparam>
    /// <param name="onEvent">The event callback to unregister.</param>
    void UnRegisterEvent<TEvent>(Action<TEvent> onEvent);

    /// <summary>
    /// Sends a new-style CQRS request, handling commands and queries uniformly.
    /// </summary>
    /// <remarks>
    /// This is the main entry point of the built-in CQRS runtime. New code should prefer this method or <see cref="SendAsync{TCommand}(TCommand,CancellationToken)" /> to reach the dispatcher.
    /// </remarks>
    ValueTask<TResponse> SendRequestAsync<TResponse>(
        IRequest<TResponse> request,
        CancellationToken cancellationToken = default);

    /// <summary>
    /// Synchronous wrapper for sending a new-style CQRS request.
    /// </summary>
    /// <remarks>
    /// Kept only for compatibility with synchronous call chains; new code should prefer the asynchronous entry points to avoid blocking the current thread.
    /// </remarks>
    TResponse SendRequest<TResponse>(IRequest<TResponse> request);

    /// <summary>
    /// Publishes a new-style CQRS notification.
    /// </summary>
    /// <remarks>
    /// This entry point performs one-to-many notification dispatch. It coexists with the framework-level <c>EventBus</c> event system and is suited to propagating domain notifications around request handling.
    /// </remarks>
    ValueTask PublishAsync<TNotification>(
        TNotification notification,
        CancellationToken cancellationToken = default)
        where TNotification : INotification;

    /// <summary>
    /// Creates a new-style CQRS stream request.
    /// </summary>
    /// <remarks>
    /// Intended for scenarios that lazily produce large result sequences in order. Callers should consume the returned asynchronous sequence instead of falling back to the legacy query bus.
    /// </remarks>
    IAsyncEnumerable<TResponse> CreateStream<TResponse>(
        IStreamRequest<TResponse> request,
        CancellationToken cancellationToken = default);

    // === Convenience methods ===

    /// <summary>
    /// Sends a new-style CQRS command that returns no result.
    /// </summary>
    ValueTask SendAsync<TCommand>(
        TCommand command,
        CancellationToken cancellationToken = default)
        where TCommand : IRequest<Unit>;

    /// <summary>
    /// Sends a new-style CQRS request that returns a result.
    /// </summary>
    ValueTask<TResponse> SendAsync<TResponse>(
        IRequest<TResponse> command,
        CancellationToken cancellationToken = default);

    /// <summary>
    /// Gets the environment object.
    /// </summary>
    /// <returns>The environment instance.</returns>
    IEnvironment GetEnvironment();
}
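The migration guidance in the remarks above can be illustrated with a small consumer sketch. This example is not part of the diff; the request types (`LoadPlayerQuery`, `SavePlayerCommand`) and the `PlayerData` record are invented for illustration:

```csharp
// Hypothetical CQRS request types -- not part of the framework.
public sealed record PlayerData(string Id, int Level);
public sealed record LoadPlayerQuery(string Id) : IRequest<PlayerData>;
public sealed record SavePlayerCommand(PlayerData Data) : IRequest<Unit>;

public static class PlayerFlow
{
    public static async Task RunAsync(IArchitectureContext context, CancellationToken ct)
    {
        // Recommended entry point for requests that return a response.
        PlayerData data = await context.SendRequestAsync(new LoadPlayerQuery("p-1"), ct);

        // Recommended entry point for result-less commands (IRequest<Unit>).
        await context.SendAsync(new SavePlayerCommand(data), ct);
    }
}
```

Legacy `SendCommand`/`SendQuery` calls can coexist with this flow during migration, which is what the dual entry points above are for.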
@@ -1,24 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

namespace GFramework.Core.Abstractions.Architectures;

/// <summary>
/// Architecture context provider interface, used to decouple context acquisition logic.
/// </summary>
public interface IArchitectureContextProvider
{
    /// <summary>
    /// Gets the current architecture context.
    /// </summary>
    /// <returns>The architecture context instance.</returns>
    IArchitectureContext GetContext();

    /// <summary>
    /// Tries to get an architecture context of the specified type.
    /// </summary>
    /// <typeparam name="T">The architecture context type.</typeparam>
    /// <param name="context">The resulting context instance.</param>
    /// <returns>true if the context was obtained; otherwise false.</returns>
    bool TryGetContext<T>(out T? context) where T : class, IArchitectureContext;
}
@@ -1,20 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

using GFramework.Core.Abstractions.Enums;

namespace GFramework.Core.Abstractions.Architectures;

/// <summary>
/// Architecture lifecycle hook interface for running custom logic at different phases of the architecture's lifecycle.
/// Implementations can observe phase changes and access the associated architecture instance.
/// </summary>
public interface IArchitectureLifecycleHook
{
    /// <summary>
    /// Callback invoked when the architecture enters the given phase.
    /// </summary>
    /// <param name="phase">The current architecture phase.</param>
    /// <param name="architecture">The associated architecture instance.</param>
    void OnPhase(ArchitecturePhase phase, IArchitecture architecture);
}
@@ -1,17 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

namespace GFramework.Core.Abstractions.Architectures;

/// <summary>
/// Architecture module interface.
/// Defines the standard method for installing a module into an architecture.
/// </summary>
public interface IArchitectureModule
{
    /// <summary>
    /// Installs this module into the given architecture.
    /// </summary>
    /// <param name="architecture">The target architecture instance to install the module into.</param>
    void Install(IArchitecture architecture);
}
@@ -1,19 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

using GFramework.Core.Abstractions.Enums;

namespace GFramework.Core.Abstractions.Architectures;

/// <summary>
/// Architecture phase listener interface for observing and responding to phase changes in the architecture lifecycle.
/// Implementations can run their own logic whenever the architecture enters a specific phase.
/// </summary>
public interface IArchitecturePhaseListener
{
    /// <summary>
    /// Callback invoked when the architecture enters the given phase.
    /// </summary>
    /// <param name="phase">The architecture phase enum value indicating the current phase.</param>
    void OnArchitecturePhase(ArchitecturePhase phase);
}
@@ -1,52 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

using GFramework.Core.Abstractions.Command;
using GFramework.Core.Abstractions.Events;
using GFramework.Core.Abstractions.Ioc;
using GFramework.Core.Abstractions.Query;
using GFramework.Core.Abstractions.Rule;

namespace GFramework.Core.Abstractions.Architectures;

/// <summary>
/// Architecture services interface defining the service components required by the framework's core architecture.
/// </summary>
public interface IArchitectureServices : IContextAware
{
    /// <summary>
    /// Gets the dependency injection container.
    /// </summary>
    /// <returns>The IIocContainer instance.</returns>
    IIocContainer Container { get; }

    /// <summary>
    /// Gets the event bus.
    /// </summary>
    /// <returns>The IEventBus instance.</returns>
    IEventBus EventBus { get; }

    /// <summary>
    /// Gets the command executor.
    /// </summary>
    /// <returns>The ICommandExecutor instance.</returns>
    ICommandExecutor CommandExecutor { get; }

    /// <summary>
    /// Gets the query executor.
    /// </summary>
    /// <returns>The IQueryExecutor instance.</returns>
    IQueryExecutor QueryExecutor { get; }

    /// <summary>
    /// Gets the asynchronous query executor.
    /// </summary>
    /// <returns>The IAsyncQueryExecutor instance.</returns>
    IAsyncQueryExecutor AsyncQueryExecutor { get; }

    /// <summary>
    /// Gets the service module manager.
    /// </summary>
    /// <returns>The IServiceModuleManager instance.</returns>
    IServiceModuleManager ModuleManager { get; }
}
@@ -1,37 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

using GFramework.Core.Abstractions.Ioc;
using GFramework.Core.Abstractions.Lifecycle;

namespace GFramework.Core.Abstractions.Architectures;

/// <summary>
/// Service module interface defining the basic module contract.
/// All service modules must implement this interface to support registration, initialization, and asynchronous destruction.
/// </summary>
public interface IServiceModule : IInitializable, IAsyncDestroyable
{
    /// <summary>
    /// Gets the unique name of the module.
    /// </summary>
    string ModuleName { get; }

    /// <summary>
    /// Gets the module priority; smaller values mean higher priority.
    /// Controls the registration and initialization order of modules.
    /// </summary>
    int Priority { get; }

    /// <summary>
    /// Gets whether the module is enabled.
    /// Returns true if the module is enabled, false if it is disabled.
    /// </summary>
    bool IsEnabled { get; }

    /// <summary>
    /// Registers the services provided by this module into the dependency injection container.
    /// </summary>
    /// <param name="container">The dependency injection container used to register services.</param>
    void Register(IIocContainer container);
}
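A module implementing the contract above might look like the following sketch. It is hypothetical and not part of the diff: the `AudioService` type and the exact registration method on `IIocContainer` are assumptions, and the inherited `IInitializable`/`IAsyncDestroyable` members are omitted because their signatures live elsewhere:

```csharp
// Hypothetical module -- AudioService and the container API shape are assumptions.
public sealed class AudioModule : IServiceModule
{
    public string ModuleName => "Audio";

    // Smaller values register and initialize earlier (see the Priority docs above).
    public int Priority => 0;

    public bool IsEnabled => true;

    public void Register(IIocContainer container)
    {
        // Contribute this module's services to the container.
        container.Register<AudioService>();
    }

    // IInitializable / IAsyncDestroyable members omitted; their exact
    // signatures are defined in GFramework.Core.Abstractions.Lifecycle.
}
```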
@@ -1,43 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

using GFramework.Core.Abstractions.Ioc;

namespace GFramework.Core.Abstractions.Architectures;

/// <summary>
/// Service module manager interface for managing the service modules within an architecture.
/// </summary>
public interface IServiceModuleManager
{
    /// <summary>
    /// Registers a service module.
    /// </summary>
    /// <param name="module">The service module instance to register.</param>
    void RegisterModule(IServiceModule module);

    /// <summary>
    /// Registers the built-in service modules.
    /// </summary>
    /// <param name="container">The IoC container instance used to resolve dependencies.</param>
    void RegisterBuiltInModules(IIocContainer container);

    /// <summary>
    /// Gets all registered service modules.
    /// </summary>
    /// <returns>A read-only list of service modules.</returns>
    IReadOnlyList<IServiceModule> GetModules();

    /// <summary>
    /// Asynchronously initializes all registered service modules.
    /// </summary>
    /// <param name="asyncMode">Whether to initialize the modules in asynchronous mode.</param>
    /// <returns>A task representing the asynchronous operation.</returns>
    Task InitializeAllAsync(bool asyncMode);

    /// <summary>
    /// Asynchronously destroys all registered service modules.
    /// </summary>
    /// <returns>A value task representing the asynchronous operation.</returns>
    ValueTask DestroyAllAsync();
}
@@ -1,26 +0,0 @@
// Copyright (c) 2026 GeWuYou
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

namespace GFramework.Core.Abstractions.Bases;

/// <summary>
/// Defines a contract for objects that expose a key.
/// </summary>
/// <typeparam name="TKey">The key type.</typeparam>
public interface IHasKey<out TKey>
{
    /// <summary>
    /// Gets the object's key.
    /// </summary>
    TKey Key { get; }
}
@@ -1,22 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

namespace GFramework.Core.Abstractions.Bases;

/// <summary>
/// Represents a key/value pair, defining a generic key/value data contract.
/// </summary>
/// <typeparam name="TKey">The key type.</typeparam>
/// <typeparam name="TValue">The value type.</typeparam>
public interface IKeyValue<out TKey, out TValue>
{
    /// <summary>
    /// Gets the key of the pair.
    /// </summary>
    TKey Key { get; }

    /// <summary>
    /// Gets the value of the pair.
    /// </summary>
    TValue Value { get; }
}
@@ -1,20 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

namespace GFramework.Core.Abstractions.Bases;

/// <summary>
/// Defines an interface for objects with a priority.
/// Smaller values mean higher priority and earlier execution.
/// Used to control the execution order of services, systems, and other components.
/// </summary>
public interface IPrioritized
{
    /// <summary>
    /// Gets the priority value.
    /// Smaller values mean higher priority.
    /// The default priority is 0.
    /// Recommended range: -1000 to 1000.
    /// </summary>
    int Priority { get; }
}
@@ -1,74 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

namespace GFramework.Core.Abstractions.Bases;

/// <summary>
/// Predefined priority group constants.
/// </summary>
/// <remarks>
/// Provides standardized priority values for managing the execution order of systems, services, and other components.
/// Smaller values mean higher priority (negative values indicate high priority).
/// </remarks>
public static class PriorityGroup
{
    /// <summary>
    /// Critical priority - the highest priority, for core systems and infrastructure.
    /// </summary>
    /// <remarks>
    /// Typical uses:
    /// - Logging system
    /// - Configuration management
    /// - IoC container initialization
    /// - Core architecture components
    /// </remarks>
    public const int Critical = -100;

    /// <summary>
    /// High priority - for important but non-core systems.
    /// </summary>
    /// <remarks>
    /// Typical uses:
    /// - Event bus
    /// - Resource manager
    /// - Input system
    /// - Network manager
    /// </remarks>
    public const int High = -50;

    /// <summary>
    /// Normal priority - the default priority.
    /// </summary>
    /// <remarks>
    /// Typical uses:
    /// - Game logic systems
    /// - UI systems
    /// - Audio systems
    /// - Most business logic
    /// </remarks>
    public const int Normal = 0;

    /// <summary>
    /// Low priority - for non-critical systems.
    /// </summary>
    /// <remarks>
    /// Typical uses:
    /// - Statistics
    /// - Debugging tools
    /// - Performance monitoring
    /// - Auxiliary features
    /// </remarks>
    public const int Low = 50;

    /// <summary>
    /// Deferred priority - the lowest priority, for systems whose execution can be postponed.
    /// </summary>
    /// <remarks>
    /// Typical uses:
    /// - Analytics and telemetry
    /// - Background data synchronization
    /// - Cache cleanup
    /// - Non-urgent tasks
    /// </remarks>
    public const int Deferred = 100;
}
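A component would typically return one of these constants from `IPrioritized.Priority`. The following sketch is hypothetical (the `TelemetrySystem` type is invented for illustration):

```csharp
// Hypothetical: a telemetry system that should run after everything else.
public sealed class TelemetrySystem : IPrioritized
{
    // Deferred (= 100) places this after Critical, High, Normal, and Low components.
    public int Priority => PriorityGroup.Deferred;
}
```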
@@ -1,31 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

using GFramework.Core.Abstractions.Rule;

namespace GFramework.Core.Abstractions.Command;

/// <summary>
/// Represents an asynchronous command that returns no result.
/// </summary>
public interface IAsyncCommand : IContextAware
{
    /// <summary>
    /// Executes the command asynchronously.
    /// </summary>
    /// <returns>A task representing the asynchronous operation.</returns>
    Task ExecuteAsync();
}

/// <summary>
/// Represents an asynchronous command that returns a result of the specified type.
/// </summary>
/// <typeparam name="TResult">The type of the command result.</typeparam>
public interface IAsyncCommand<TResult> : IContextAware
{
    /// <summary>
    /// Executes the command asynchronously and returns its result.
    /// </summary>
    /// <returns>A task representing the asynchronous operation, whose result is the command's return value.</returns>
    Task<TResult> ExecuteAsync();
}
@@ -1,39 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

namespace GFramework.Core.Abstractions.Command;

/// <summary>
/// Defines the command executor interface, providing synchronous and asynchronous methods for dispatching and executing commands.
/// </summary>
public interface ICommandExecutor
{
    /// <summary>
    /// Dispatches and executes a command.
    /// </summary>
    /// <param name="command">The command to execute, implementing the ICommand interface.</param>
    public void Send(ICommand command);

    /// <summary>
    /// Dispatches and executes a command that returns a result.
    /// </summary>
    /// <typeparam name="TResult">The type of the command result.</typeparam>
    /// <param name="command">The command to execute, implementing the ICommand&lt;TResult&gt; interface.</param>
    /// <returns>The command result of type TResult.</returns>
    public TResult Send<TResult>(ICommand<TResult> command);

    /// <summary>
    /// Dispatches and asynchronously executes a command.
    /// </summary>
    /// <param name="command">The command to execute, implementing the IAsyncCommand interface.</param>
    /// <returns>A task representing the asynchronous operation.</returns>
    Task SendAsync(IAsyncCommand command);

    /// <summary>
    /// Dispatches and asynchronously executes a command that returns a result.
    /// </summary>
    /// <typeparam name="TResult">The type of the command result.</typeparam>
    /// <param name="command">The command to execute, implementing the IAsyncCommand&lt;TResult&gt; interface.</param>
    /// <returns>A task representing the asynchronous operation, whose result is the command result of type TResult.</returns>
    Task<TResult> SendAsync<TResult>(IAsyncCommand<TResult> command);
}
@@ -1,49 +0,0 @@
// Copyright (c) 2025 GeWuYou
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

using GFramework.Core.Abstractions.Utility;

namespace GFramework.Core.Abstractions.Concurrency;

/// <summary>
/// Asynchronous key-lock manager interface providing fine-grained, per-key locking.
/// </summary>
public interface IAsyncKeyLockManager : IUtility, IDisposable
{
    /// <summary>
    /// Asynchronously acquires the lock for the specified key (recommended).
    /// </summary>
    /// <param name="key">The lock key.</param>
    /// <param name="cancellationToken">The cancellation token.</param>
    /// <returns>A lock handle; release it automatically with await using.</returns>
    ValueTask<IAsyncLockHandle> AcquireLockAsync(string key, CancellationToken cancellationToken = default);

    /// <summary>
    /// Synchronously acquires the lock for the specified key (compatibility method).
    /// </summary>
    /// <param name="key">The lock key.</param>
    /// <returns>A lock handle; release it automatically with using.</returns>
    IAsyncLockHandle AcquireLock(string key);

    /// <summary>
    /// Gets the lock manager's statistics.
    /// </summary>
    /// <returns>A statistics snapshot.</returns>
    LockStatistics GetStatistics();

    /// <summary>
    /// Gets information about the currently active locks (for debugging).
    /// </summary>
    /// <returns>A read-only dictionary from key to lock info.</returns>
    IReadOnlyDictionary<string, LockInfo> GetActiveLocks();
}
@@ -1,30 +0,0 @@
// Copyright (c) 2025 GeWuYou
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

namespace GFramework.Core.Abstractions.Concurrency;

/// <summary>
/// Asynchronous lock handle interface supporting the await using syntax.
/// </summary>
public interface IAsyncLockHandle : IAsyncDisposable, IDisposable
{
    /// <summary>
    /// The lock's key.
    /// </summary>
    string Key { get; }

    /// <summary>
    /// Timestamp when the lock was acquired (Environment.TickCount64).
    /// </summary>
    long AcquiredTicks { get; }
}
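The handle contract above is designed for `await using`, as in this hypothetical consumer sketch (the method and key format are invented for illustration):

```csharp
// Hypothetical: serialize concurrent updates to the same player.
public static async Task UpdatePlayerAsync(
    IAsyncKeyLockManager locks, string playerId, CancellationToken ct)
{
    // One lock per key: callers with the same playerId serialize here.
    await using IAsyncLockHandle handle =
        await locks.AcquireLockAsync($"player:{playerId}", ct);

    // Critical section: handle.Key == $"player:{playerId}".
    // The lock is released automatically when the handle is disposed.
}
```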
@@ -1,43 +0,0 @@
// Copyright (c) 2025 GeWuYou
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

namespace GFramework.Core.Abstractions.Concurrency;

/// <summary>
/// Lock information (for debugging).
/// </summary>
public readonly struct LockInfo
{
    /// <summary>
    /// The lock's key.
    /// </summary>
    public string Key { get; init; }

    /// <summary>
    /// The current reference count.
    /// </summary>
    public int ReferenceCount { get; init; }

    /// <summary>
    /// Last access timestamp (Environment.TickCount64).
    /// </summary>
    public long LastAccessTicks { get; init; }

    /// <summary>
    /// Approximate wait-queue length.
    /// Note: this is an approximate indicator based on SemaphoreSlim.CurrentCount;
    /// when CurrentCount == 0 the lock is held and may have waiters, so 1 is returned;
    /// otherwise 0 is returned. It is not an exact waiter count and is intended for debugging only.
    /// </summary>
    public int WaitingCount { get; init; }
}
@@ -1,43 +0,0 @@
// Copyright (c) 2025 GeWuYou
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

using System.Runtime.InteropServices;

namespace GFramework.Core.Abstractions.Concurrency;

/// <summary>
/// Lock statistics.
/// </summary>
[StructLayout(LayoutKind.Auto)]
public readonly struct LockStatistics
{
    /// <summary>
    /// The number of currently active locks.
    /// </summary>
    public int ActiveLockCount { get; init; }

    /// <summary>
    /// The total number of lock acquisitions.
    /// </summary>
    public int TotalAcquired { get; init; }

    /// <summary>
    /// The total number of lock releases.
    /// </summary>
    public int TotalReleased { get; init; }

    /// <summary>
    /// The total number of locks cleaned up.
    /// </summary>
    public int TotalCleaned { get; init; }
}
@@ -1,102 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

using GFramework.Core.Abstractions.Events;
using GFramework.Core.Abstractions.Utility;

namespace GFramework.Core.Abstractions.Configuration;

/// <summary>
/// Configuration manager interface providing type-safe configuration storage and access.
/// Thread safety: all methods are thread-safe.
/// </summary>
public interface IConfigurationManager : IUtility
{
    /// <summary>
    /// Gets the number of configuration entries.
    /// </summary>
    int Count { get; }

    /// <summary>
    /// Gets the configuration value for the specified key.
    /// </summary>
    /// <typeparam name="T">Configuration value type.</typeparam>
    /// <param name="key">Configuration key.</param>
    /// <returns>The configuration value, or the type's default value if the key does not exist.</returns>
    T? GetConfig<T>(string key);

    /// <summary>
    /// Gets the configuration value for the specified key, or a default value if the key does not exist.
    /// </summary>
    /// <typeparam name="T">Configuration value type.</typeparam>
    /// <param name="key">Configuration key.</param>
    /// <param name="defaultValue">Default value.</param>
    /// <returns>The configuration value, or the default value.</returns>
    T GetConfig<T>(string key, T defaultValue);

    /// <summary>
    /// Sets the configuration value for the specified key.
    /// </summary>
    /// <typeparam name="T">Configuration value type.</typeparam>
    /// <param name="key">Configuration key.</param>
    /// <param name="value">Configuration value.</param>
    void SetConfig<T>(string key, T value);

    /// <summary>
    /// Checks whether a configuration entry exists for the specified key.
    /// </summary>
    /// <param name="key">Configuration key.</param>
    /// <returns>True if it exists; otherwise false.</returns>
    bool HasConfig(string key);

    /// <summary>
    /// Removes the configuration entry for the specified key.
    /// </summary>
    /// <param name="key">Configuration key.</param>
    /// <returns>True if the entry was removed; otherwise false.</returns>
    bool RemoveConfig(string key);

    /// <summary>
    /// Clears all configuration entries.
    /// </summary>
    void Clear();

    /// <summary>
    /// Watches the specified key for configuration changes.
    /// </summary>
    /// <typeparam name="T">Configuration value type.</typeparam>
    /// <param name="key">Configuration key.</param>
    /// <param name="onChange">Callback invoked when the configuration changes; the parameter is the new value.</param>
    /// <returns>An unregistration handle.</returns>
    IUnRegister WatchConfig<T>(string key, Action<T> onChange);

    /// <summary>
    /// Loads configuration from a JSON string.
    /// </summary>
    /// <param name="json">JSON string.</param>
    void LoadFromJson(string json);

    /// <summary>
    /// Saves the configuration as a JSON string.
    /// </summary>
    /// <returns>JSON string.</returns>
    string SaveToJson();

    /// <summary>
    /// Loads configuration from a file.
    /// </summary>
    /// <param name="path">File path.</param>
    void LoadFromFile(string path);

    /// <summary>
    /// Saves the configuration to a file.
    /// </summary>
    /// <param name="path">File path.</param>
    void SaveToFile(string path);

    /// <summary>
    /// Gets all configuration keys.
    /// </summary>
    /// <returns>Collection of configuration keys.</returns>
    IEnumerable<string> GetAllKeys();
}
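A minimal usage sketch of IConfigurationManager (assuming a `configManager` instance obtained from the architecture; the key name is illustrative, and the `UnRegister()` call assumes the conventional method name on IUnRegister):

```csharp
// "audio.volume" is an illustrative key, not a framework constant.
configManager.SetConfig("audio.volume", 0.8f);

// Falls back to 1.0f only when the key is missing.
float volume = configManager.GetConfig("audio.volume", 1.0f);

// React to later changes; keep the returned handle to stop watching.
var unregister = configManager.WatchConfig<float>("audio.volume",
    newValue => Console.WriteLine($"Volume changed to {newValue}"));

configManager.SetConfig("audio.volume", 0.5f); // triggers the callback

unregister.UnRegister(); // assumed unregistration method name
```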
@@ -1,41 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

namespace GFramework.Core.Abstractions.Controller;

/// <summary>
/// Controller marker interface, used to identify controller components.
/// </summary>
/// <remarks>
/// <para>
/// IController is a marker interface that contains no methods or properties.
/// It identifies a class as a controller, which coordinates the interaction
/// between Model, System, and UI.
/// </para>
/// <para>
/// Architecture access: controllers usually need access to the architecture context.
/// Use the [ContextAware] attribute to generate context-access capability automatically:
/// </para>
/// <code>
/// using GFramework.Core.SourceGenerators.Abstractions.Rule;
///
/// [ContextAware]
/// public partial class PlayerController : IController
/// {
///     public void Initialize()
///     {
///         // [ContextAware] implements IContextAware, so the extension methods are available
///         var playerModel = this.GetModel<PlayerModel>();
///         var gameSystem = this.GetSystem<GameSystem>();
///     }
/// }
/// </code>
/// <para>
/// Notes:
/// </para>
/// <list type="bullet">
/// <item>The partial keyword is required.</item>
/// <item>The [ContextAware] attribute automatically implements the IContextAware interface.</item>
/// <item>Extension methods such as this.GetModel() and this.GetSystem() can be used to access the architecture.</item>
/// </list>
/// </remarks>
public interface IController;
@@ -1,31 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

namespace GFramework.Core.Abstractions.Coroutine;

/// <summary>
/// Represents the final completion result of a coroutine.
/// </summary>
public enum CoroutineCompletionStatus
{
    /// <summary>
    /// The scheduler cannot determine the final result for this handle.
    /// This usually means the handle is invalid, or its historical result is no longer available.
    /// </summary>
    Unknown,

    /// <summary>
    /// The coroutine ran to completion.
    /// </summary>
    Completed,

    /// <summary>
    /// The coroutine was terminated externally, cleared, or interrupted by a cancellation token.
    /// </summary>
    Cancelled,

    /// <summary>
    /// The coroutine threw an exception while being advanced.
    /// </summary>
    Faulted
}
@@ -1,32 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

namespace GFramework.Core.Abstractions.Coroutine;

/// <summary>
/// Represents the execution stage the coroutine scheduler is currently in.
/// </summary>
/// <remarks>
/// Some yield instructions carry stage semantics, e.g. <c>WaitForFixedUpdate</c> and <c>WaitForEndOfFrame</c>.
/// The host should provide matching scheduler stages for these semantics; otherwise such waits never complete naturally.
/// </remarks>
public enum CoroutineExecutionStage
{
    /// <summary>
    /// Default update stage.
    /// Ordinary time waits, next-frame waits, and most condition waits advance in this stage.
    /// </summary>
    Update,

    /// <summary>
    /// Fixed update stage.
    /// Only fixed-step-related yield instructions complete in this stage.
    /// </summary>
    FixedUpdate,

    /// <summary>
    /// End-of-frame stage.
    /// Only yield instructions tied to frame-end or deferred execution complete in this stage.
    /// </summary>
    EndOfFrame
}
@@ -1,36 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

namespace GFramework.Core.Abstractions.Coroutine;

/// <summary>
/// Coroutine priority enum.
/// Defines the execution priority of a coroutine; higher-priority coroutines execute first.
/// </summary>
public enum CoroutinePriority
{
    /// <summary>
    /// Lowest priority.
    /// </summary>
    Lowest = 0,

    /// <summary>
    /// Low priority.
    /// </summary>
    Low = 1,

    /// <summary>
    /// Normal priority (default).
    /// </summary>
    Normal = 2,

    /// <summary>
    /// High priority.
    /// </summary>
    High = 3,

    /// <summary>
    /// Highest priority.
    /// </summary>
    Highest = 4
}
@@ -1,30 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

namespace GFramework.Core.Abstractions.Coroutine;

/// <summary>
/// Enum representing the execution state of a coroutine.
/// </summary>
public enum CoroutineState
{
    /// <summary>
    /// The coroutine is running.
    /// </summary>
    Running,

    /// <summary>
    /// The coroutine is paused.
    /// </summary>
    Paused,

    /// <summary>
    /// The coroutine has finished executing.
    /// </summary>
    Completed,

    /// <summary>
    /// The coroutine has been cancelled.
    /// </summary>
    Cancelled
}
@@ -1,71 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

namespace GFramework.Core.Abstractions.Coroutine;

/// <summary>
/// Coroutine statistics interface.
/// Provides performance statistics for coroutine execution.
/// </summary>
public interface ICoroutineStatistics
{
    /// <summary>
    /// Gets the total number of coroutines started.
    /// </summary>
    long TotalStarted { get; }

    /// <summary>
    /// Gets the total number of coroutines completed.
    /// </summary>
    long TotalCompleted { get; }

    /// <summary>
    /// Gets the total number of coroutines that failed.
    /// </summary>
    long TotalFailed { get; }

    /// <summary>
    /// Gets the number of currently active coroutines.
    /// </summary>
    int ActiveCount { get; }

    /// <summary>
    /// Gets the number of currently paused coroutines.
    /// </summary>
    int PausedCount { get; }

    /// <summary>
    /// Gets the average coroutine execution time in milliseconds.
    /// </summary>
    double AverageExecutionTimeMs { get; }

    /// <summary>
    /// Gets the maximum coroutine execution time in milliseconds.
    /// </summary>
    double MaxExecutionTimeMs { get; }

    /// <summary>
    /// Gets the number of coroutines grouped by priority.
    /// </summary>
    /// <param name="priority">Coroutine priority.</param>
    /// <returns>The number of coroutines with the given priority.</returns>
    int GetCountByPriority(CoroutinePriority priority);

    /// <summary>
    /// Gets the number of coroutines grouped by tag.
    /// </summary>
    /// <param name="tag">Coroutine tag.</param>
    /// <returns>The number of coroutines with the given tag.</returns>
    int GetCountByTag(string tag);

    /// <summary>
    /// Resets the statistics.
    /// </summary>
    void Reset();

    /// <summary>
    /// Generates a statistics report.
    /// </summary>
    /// <returns>A formatted statistics report string.</returns>
    string GenerateReport();
}
@@ -1,25 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

namespace GFramework.Core.Abstractions.Coroutine;

/// <summary>
/// Time source interface providing the current time, the time delta, and an update hook.
/// </summary>
public interface ITimeSource
{
    /// <summary>
    /// Gets the current time.
    /// </summary>
    double CurrentTime { get; }

    /// <summary>
    /// Gets the time delta (the elapsed time from the previous frame to the current frame).
    /// </summary>
    double DeltaTime { get; }

    /// <summary>
    /// Updates the state of the time source.
    /// </summary>
    void Update();
}
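A minimal sketch of a real-time ITimeSource backed by System.Diagnostics.Stopwatch (an illustration under the interface contract above; the class name is hypothetical and this is not the framework's actual implementation):

```csharp
using System.Diagnostics;

// Hypothetical wall-clock time source: times are in seconds, and the host
// is expected to call Update() once per frame before reading DeltaTime.
public sealed class StopwatchTimeSource : ITimeSource
{
    private readonly Stopwatch _stopwatch = Stopwatch.StartNew();
    private double _lastTime;

    public double CurrentTime { get; private set; }

    public double DeltaTime { get; private set; }

    public void Update()
    {
        CurrentTime = _stopwatch.Elapsed.TotalSeconds;
        DeltaTime = CurrentTime - _lastTime;
        _lastTime = CurrentTime;
    }
}
```

Abstracting time behind this interface lets a scheduler run against scaled, paused, or fully synthetic clocks in tests without touching coroutine logic.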
@@ -1,21 +0,0 @@
// Copyright (c) 2025-2026 GeWuYou
// SPDX-License-Identifier: Apache-2.0

namespace GFramework.Core.Abstractions.Coroutine;

/// <summary>
/// Defines a yieldable instruction interface used to control asynchronous operations in the coroutine system.
/// </summary>
public interface IYieldInstruction
{
    /// <summary>
    /// Gets whether this yield instruction has finished executing.
    /// </summary>
    bool IsDone { get; }

    /// <summary>
    /// Called by the scheduler every frame to update the state of this yield instruction.
    /// </summary>
    /// <param name="deltaTime">Time elapsed since the previous frame, in seconds.</param>
    void Update(double deltaTime);
}
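A custom wait can be sketched against this interface as follows (illustrative only; the class name is hypothetical, and the framework may ship its own time-based wait instruction):

```csharp
// Hypothetical yield instruction that completes after a fixed duration.
public sealed class WaitSeconds : IYieldInstruction
{
    private double _remaining;

    public WaitSeconds(double seconds) => _remaining = seconds;

    // Done once the accumulated frame deltas have consumed the duration.
    public bool IsDone => _remaining <= 0;

    public void Update(double deltaTime)
    {
        // The scheduler feeds the per-frame delta in seconds; count down until done.
        _remaining -= deltaTime;
    }
}
```

Because the scheduler drives Update with its own deltas, the same instruction works unchanged under scaled or paused time sources.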
Some files were not shown because too many files have changed in this diff.