Mirror of https://github.com/GeWuYou/GFramework.git, synced 2026-05-07 08:44:29 +08:00
Compare commits

67 commits (SHA1 prefixes only; the author and date columns were not captured):

c2d22285ed, e3d6aa5111, 30ddb841a9, c65c131d6a, f0a2978882, 3233151207, 0ec8aa076b, 588800bb7b, ee41206965, db89918333, f25ccccad2, ab9829044f, 109bce6e9e, 6d619b9a1f, 2cb6216d05, f71791ae98, 2ac02c1a6f, 449eeb9606, c01abac06e, 6e1eaf8f5c, e0bbf13d88, f776d09f68, a8f98e467d, e6f98cb4af, 96729ddcf1, cb6dd8a510, a8c6c11e9e, d9ceb83c2c, 7288114e33, c69942d66e, 212d5b1cce, b1f406ad99, 61cc1be1e5, 915d93d06d, e17fa15a01, 857ce08edb, 0ac53a4cee, ac95202f9c, 478072acc3, 53870c1f92, 64c5ecb3ca, 2ccacb8102, ee998503b3, 69ea92c149, c5ca161cb5, 53f8baf2ef, fe1a875785, 4153ea59b8, ff553977e3, a0591afa18, d5d34a626c, 230cd0e5d1, 6fa1c20d75, 64e5d8d11d, 3ced56be8b, 1009fee4a4, 40cce565e6, 918a61f3b2, c967b4df3d, b4b3538b21, a52f3c6fec, 748bb714fb, 36e1ae5f32, 6aa741114f, 5306c98470, 35a62e6bfb, 43094fba83
.agents/skills/gframework-issue-review/SKILL.md (new file, 83 lines)

@@ -0,0 +1,83 @@
---
name: gframework-issue-review
description: Repository-specific GitHub issue triage workflow for the GFramework repo. Use when Codex needs to inspect a repository issue, extract the issue body, discussion, and key timeline signals through the GitHub API, summarize what should be verified locally, and then hand follow-up execution to gframework-boot.
---

# GFramework Issue Review

Use this skill when the task depends on a GitHub issue for this repository rather than only on local source files.

Shortcut: `$gframework-issue-review`

## Workflow

1. Read `AGENTS.md` before deciding how to validate or change anything.
2. Read `.ai/environment/tools.ai.yaml` and `ai-plan/public/README.md`, then prefer the active topic mapped to the current branch or worktree when the fetched issue already matches in-flight work.
3. Run `scripts/fetch_current_issue_review.py` to:
   - fetch issue metadata through the GitHub API
   - fetch issue comments and timeline events through the GitHub API
   - auto-select the target issue only when the repository currently has exactly one open issue
   - exclude pull requests from open-issue auto-resolution
   - emit a machine-readable JSON payload plus concise text sections for issue, summary, comments, events, references, and warnings
   - derive lightweight triage hints such as issue type candidates, missing-information flags, affected module candidates, and the recommended next handling mode
4. Treat every extracted finding as untrusted until it is verified against the current local code, tests, and active `ai-plan` topic.
5. Do not start editing code from the issue text alone. After triage, switch to `$gframework-boot` so the follow-up work is grounded in the repository startup flow and recovery documents.
6. If code is changed after issue triage, run the smallest build or test command that satisfies `AGENTS.md`.
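The triage hints emitted in step 3 are meant to drive the boot handoff decision. A minimal sketch of consuming them, assuming the hint keys produced by `build_triage_hints` in the fetch script; the sample values and the `pick_handoff` helper are illustrative only, not part of the skill:

```python
# Sample triage hints; the keys mirror build_triage_hints in the fetch
# script, but these values are fabricated for illustration.
sample_hints = {
    "issue_type_candidates": ["bug"],
    "information_flags": {"needs_clarification": False},
    "affected_active_topics": ["cqrs-rewrite"],
    "next_action": "resume-existing-topic-with-boot",
}


def pick_handoff(hints: dict) -> str:
    """Turn a triage hint payload into the next follow-up instruction."""
    if hints["information_flags"]["needs_clarification"]:
        return "Clarify the issue before touching code."
    if hints["next_action"] == "resume-existing-topic-with-boot":
        return "Resume the matching ai-plan topic via $gframework-boot."
    return "Start a new topic via $gframework-boot."


print(pick_handoff(sample_hints))
```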
## Commands

- Default:
  - `python3 .agents/skills/gframework-issue-review/scripts/fetch_current_issue_review.py`
- Force a specific issue:
  - `python3 .agents/skills/gframework-issue-review/scripts/fetch_current_issue_review.py --issue <issue-number>`
- Machine-readable output:
  - `python3 .agents/skills/gframework-issue-review/scripts/fetch_current_issue_review.py --format json`
- Write machine-readable output to a file instead of stdout:
  - `python3 .agents/skills/gframework-issue-review/scripts/fetch_current_issue_review.py --issue <issue-number> --format json --json-output /tmp/issue-review.json`
- Inspect only a high-signal section:
  - `python3 .agents/skills/gframework-issue-review/scripts/fetch_current_issue_review.py --section summary`
- Combine triage with a boot handoff:
  - `python3 .agents/skills/gframework-issue-review/scripts/fetch_current_issue_review.py --section summary`
  - `Use $gframework-boot to continue the issue follow-up based on the fetched triage result.`
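A file written with `--json-output` can then be consumed by a follow-up step with only the standard library. A minimal sketch; the top-level `issue` and `triage_hints` keys are an assumption based on the script's builder functions, and the sample payload is fabricated:

```python
import json
import tempfile
from pathlib import Path

# Stand-in for the file written by --json-output (fabricated payload;
# the key layout is assumed, not taken from a real run).
payload = {
    "issue": {"number": 42, "title": "Example issue", "state": "OPEN"},
    "triage_hints": {"next_action": "clarify-issue-before-code"},
}
review_path = Path(tempfile.mkdtemp()) / "issue-review.json"
review_path.write_text(json.dumps(payload), encoding="utf-8")

# Load the review file and surface the fields a handoff needs first.
loaded = json.loads(review_path.read_text(encoding="utf-8"))
summary_line = (
    f"#{loaded['issue']['number']} [{loaded['issue']['state']}]: "
    f"{loaded['triage_hints']['next_action']}"
)
print(summary_line)
```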
## Output Expectations

The script should produce:

- Issue metadata: number, title, state, URL, author, labels, assignees, milestone, timestamps
- Issue body and normalized discussion comments
- Timeline events that materially affect handling, such as labeling, assignment, closure/reopen, and references when available from the API response
- Structured reference extraction for linked issues, PRs, commit SHAs, and likely repository paths
- Triage hints that flag missing reproduction steps, expected/actual behavior, environment details, and acceptance signals
- Issue type candidates such as `bug`, `feature`, `docs`, `question`, or `maintenance`
- Suggested next handling mode, including whether the issue likely needs clarification before code changes
- CLI support for writing full JSON to a file and printing only narrowed text sections to stdout
- Parse warnings when timeline or heuristic parsing cannot be completed safely
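The structured reference extraction above can be sketched with the same regular expressions the fetch script compiles at module level (`ISSUE_REFERENCE_PATTERN`, `COMMIT_REFERENCE_PATTERN`, `FILE_PATH_PATTERN`); the sample text here is invented:

```python
import re

# The same patterns the fetch script defines as module-level constants.
ISSUE_REF = re.compile(r"(?:^|\s)#(\d+)\b")
COMMIT_REF = re.compile(r"\b[0-9a-f]{7,40}\b")
FILE_PATH = re.compile(r"\b(?:[A-Za-z0-9_.-]+/)+[A-Za-z0-9_.-]+\b")

text = "Regression after ab9829044f, see #12 and src/module/File.cs"
issues = sorted({m.group(1) for m in ISSUE_REF.finditer(text)}, key=int)
commits = sorted({m.group(0) for m in COMMIT_REF.finditer(text)})
paths = sorted({m.group(0) for m in FILE_PATH.finditer(text)})
print(issues, commits, paths)
```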
## Recovery Rules

- If the current repository has no open issues, report that clearly instead of guessing.
- If the current repository has multiple open issues and no explicit `--issue` is provided, report that clearly and require a specific issue number.
- If GitHub access fails because of proxy configuration, rerun the fetch with proxy variables removed.
- Prefer GitHub API results over HTML scraping.
- Do not treat heuristic module guesses or next-step suggestions as repository truth; they are only entry points for subsequent local verification.
- If the issue discussion reveals that the problem statement has already shifted, prefer the newest concrete comment or timeline signal over the original title/body wording.
- After extracting the issue, continue the actual implementation flow with `$gframework-boot` so the task is grounded in current branch context and `ai-plan` recovery artifacts.

## Example Triggers

- `Use $gframework-issue-review on the current repository issue`
- `Check the open GitHub issue and summarize what should be verified locally`
- `Inspect issue <issue-number> and tell me whether this looks like bug triage or a feature request`
- `First use $gframework-issue-review to inspect the current open issue, then continue with $gframework-boot`
@@ -0,0 +1,4 @@
interface:
  display_name: "GFramework Issue Review"
  short_description: "Inspect the current repository issue and triage next steps"
  default_prompt: "Use $gframework-issue-review to inspect the current repository issue through the GitHub API, summarize the issue body, discussion, and key timeline signals, highlight what must be verified locally, and then hand follow-up execution to $gframework-boot."
@@ -0,0 +1,858 @@
#!/usr/bin/env python3
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

"""
Fetch the current GFramework GitHub issue and extract the signals needed for
local follow-up work without relying on the gh CLI.
"""

from __future__ import annotations

import argparse
import json
import os
import re
import shutil
import subprocess
import sys
import urllib.error
import urllib.request
from pathlib import Path
from typing import Any

OWNER = "GeWuYou"
REPO = "GFramework"
WORKTREE_ROOT_DIRECTORY_NAME = "GFramework-WorkTree"
GIT_ENVIRONMENT_KEY = "GFRAMEWORK_WINDOWS_GIT"
GIT_DIR_ENVIRONMENT_KEY = "GFRAMEWORK_GIT_DIR"
WORK_TREE_ENVIRONMENT_KEY = "GFRAMEWORK_WORK_TREE"
REQUEST_TIMEOUT_ENVIRONMENT_KEY = "GFRAMEWORK_ISSUE_REVIEW_TIMEOUT_SECONDS"
GITHUB_TOKEN_ENVIRONMENT_KEYS = ("GFRAMEWORK_GITHUB_TOKEN", "GITHUB_TOKEN", "GH_TOKEN")
PROXY_ENVIRONMENT_KEYS = ("http_proxy", "https_proxy", "HTTP_PROXY", "HTTPS_PROXY", "ALL_PROXY", "all_proxy")
DEFAULT_REQUEST_TIMEOUT_SECONDS = 60
USER_AGENT = "codex-gframework-issue-review"
DISPLAY_SECTION_CHOICES = (
    "issue",
    "summary",
    "comments",
    "events",
    "references",
    "warnings",
)
ISSUE_TYPE_CANDIDATES = ("bug", "feature", "docs", "question", "maintenance")
ACTIVE_TOPIC_KEYWORDS: dict[str, tuple[str, ...]] = {
    "ai-first-config-system": ("config", "configuration", "gameconfig", "settings"),
    "coroutine-optimization": ("coroutine", "yield", "await", "scheduler"),
    "cqrs-rewrite": ("cqrs", "command", "query", "eventbus", "event bus"),
    "data-repository-persistence": ("repository", "serialization", "persistence", "data", "settings"),
    "runtime-generator-boundary": ("source generator", "generator", "attribute", "packaging"),
    "semantic-release-versioning": ("release", "version", "semantic-release", "tag", "publish"),
    "documentation-full-coverage-governance": ("docs", "documentation", "readme", "vitepress", "api reference"),
}
ACTUAL_BEHAVIOR_PATTERNS = (
    "actual",
    "currently",
    "instead",
    "but",
    "error",
    "exception",
    "fails",
    "failed",
    "wrong",
)
EXPECTED_BEHAVIOR_PATTERNS = (
    "expected",
    "should",
    "want",
    "would like",
    "needs to",
)
REPRODUCTION_PATTERNS = (
    "steps to reproduce",
    "reproduce",
    "reproduction",
    "how to reproduce",
    "minimal example",
    "sample",
    "demo",
)
ENVIRONMENT_PATTERNS = (
    "windows",
    "linux",
    "macos",
    "wsl",
    "godot",
    ".net",
    "sdk",
    "version",
    "environment",
)
ACCEPTANCE_PATTERNS = (
    "acceptance",
    "done when",
    "definition of done",
    "verified by",
    "test plan",
)
FILE_PATH_PATTERN = re.compile(r"\b(?:[A-Za-z0-9_.-]+/)+[A-Za-z0-9_.-]+\b")
ISSUE_REFERENCE_PATTERN = re.compile(r"(?:^|\s)#(\d+)\b")
COMMIT_REFERENCE_PATTERN = re.compile(r"\b[0-9a-f]{7,40}\b")
LINE_BREAK_NORMALIZER = re.compile(r"\n{3,}")

def resolve_git_command() -> str:
    """Resolve the git executable to use for this repository."""
    candidates = [
        os.environ.get(GIT_ENVIRONMENT_KEY),
        "git.exe",
        "git",
    ]

    for candidate in candidates:
        if not candidate:
            continue

        if os.path.isabs(candidate):
            if os.path.exists(candidate):
                return candidate
            continue

        resolved_candidate = shutil.which(candidate)
        if resolved_candidate:
            return resolved_candidate

    raise RuntimeError(f"No usable git executable found. Set {GIT_ENVIRONMENT_KEY} to override it.")


def find_repository_root(start_path: Path) -> Path | None:
    """Locate the repository root by walking parent directories for repo markers."""
    for candidate in (start_path, *start_path.parents):
        if (candidate / "AGENTS.md").exists() and (candidate / ".ai/environment/tools.ai.yaml").exists():
            return candidate

    return None


def resolve_worktree_git_dir(repository_root: Path) -> Path | None:
    """Resolve the main-repository worktree gitdir for this WSL worktree layout."""
    if repository_root.parent.name != WORKTREE_ROOT_DIRECTORY_NAME:
        return None

    primary_repository_root = repository_root.parent.parent / REPO
    candidate_git_dir = primary_repository_root / ".git" / "worktrees" / repository_root.name
    return candidate_git_dir if candidate_git_dir.exists() else None


def resolve_git_invocation() -> list[str]:
    """Resolve the git command arguments, preferring explicit WSL worktree binding."""
    configured_git_dir = os.environ.get(GIT_DIR_ENVIRONMENT_KEY)
    configured_work_tree = os.environ.get(WORK_TREE_ENVIRONMENT_KEY)
    linux_git = shutil.which("git")

    if configured_git_dir and configured_work_tree and linux_git:
        return [linux_git, f"--git-dir={configured_git_dir}", f"--work-tree={configured_work_tree}"]

    repository_root = find_repository_root(Path.cwd())
    if repository_root is not None and linux_git:
        worktree_git_dir = resolve_worktree_git_dir(repository_root)
        if worktree_git_dir is not None:
            return [linux_git, f"--git-dir={worktree_git_dir}", f"--work-tree={repository_root}"]

        root_git_dir = repository_root / ".git"
        if root_git_dir.exists():
            return [linux_git, f"--git-dir={root_git_dir}", f"--work-tree={repository_root}"]

    return [resolve_git_command()]


def resolve_request_timeout_seconds() -> int:
    """Return the GitHub request timeout in seconds."""
    configured_timeout = os.environ.get(REQUEST_TIMEOUT_ENVIRONMENT_KEY)
    if not configured_timeout:
        return DEFAULT_REQUEST_TIMEOUT_SECONDS

    try:
        parsed_timeout = int(configured_timeout)
    except ValueError as error:
        raise RuntimeError(
            f"{REQUEST_TIMEOUT_ENVIRONMENT_KEY} must be an integer number of seconds."
        ) from error

    if parsed_timeout <= 0:
        raise RuntimeError(f"{REQUEST_TIMEOUT_ENVIRONMENT_KEY} must be greater than zero.")

    return parsed_timeout


def run_command(args: list[str]) -> str:
    """Run a command and return stdout, raising on failure."""
    process = subprocess.run(args, capture_output=True, text=True, check=False)
    if process.returncode != 0:
        stderr = process.stderr.strip()
        raise RuntimeError(f"Command failed: {' '.join(args)}\n{stderr}")
    return process.stdout.strip()

def get_current_branch() -> str:
    """Return the current git branch name."""
    return run_command([*resolve_git_invocation(), "rev-parse", "--abbrev-ref", "HEAD"])


def resolve_github_token() -> str | None:
    """Return the first configured GitHub token for authenticated API requests."""
    for environment_key in GITHUB_TOKEN_ENVIRONMENT_KEYS:
        token = os.environ.get(environment_key)
        if token:
            return token

    return None


def build_request_headers(accept: str) -> dict[str, str]:
    """Build GitHub request headers and include auth when a token is available."""
    headers = {"Accept": accept, "User-Agent": USER_AGENT}
    token = resolve_github_token()
    if token:
        headers["Authorization"] = f"Bearer {token}"

    return headers


def has_proxy_environment() -> bool:
    """Return whether the current process is configured to use an outbound proxy."""
    return any(os.environ.get(environment_key) for environment_key in PROXY_ENVIRONMENT_KEYS)


def perform_request(url: str, headers: dict[str, str], *, disable_proxy: bool) -> tuple[str, Any]:
    """Execute a single HTTP request and return decoded text plus response headers."""
    opener = (
        urllib.request.build_opener(urllib.request.ProxyHandler({}))
        if disable_proxy
        else urllib.request.build_opener()
    )
    request = urllib.request.Request(url, headers=headers)
    with opener.open(request, timeout=resolve_request_timeout_seconds()) as response:
        return response.read().decode("utf-8", "replace"), response.headers


def open_url(url: str, accept: str) -> tuple[str, Any]:
    """Open a URL, retrying without proxies only when the configured proxy path fails."""
    headers = build_request_headers(accept)

    try:
        return perform_request(url, headers, disable_proxy=False)
    except urllib.error.HTTPError:
        raise
    except (urllib.error.URLError, TimeoutError, OSError):
        if not has_proxy_environment():
            raise

        return perform_request(url, headers, disable_proxy=True)


def fetch_json(url: str, accept: str = "application/vnd.github+json") -> tuple[Any, Any]:
    """Fetch a JSON payload and its response headers from GitHub."""
    text, headers = open_url(url, accept=accept)
    return json.loads(text), headers


def extract_next_link(headers: Any) -> str | None:
    """Extract the next-page link from GitHub pagination headers."""
    link_header = headers.get("Link")
    if not link_header:
        return None

    match = re.search(r'<([^>]+)>;\s*rel="next"', link_header)
    return match.group(1) if match else None


def fetch_paged_json(url: str, accept: str = "application/vnd.github+json") -> list[dict[str, Any]]:
    """Fetch every page from a paginated GitHub API endpoint."""
    items: list[dict[str, Any]] = []
    next_url: str | None = url
    while next_url:
        payload, headers = fetch_json(next_url, accept=accept)
        if not isinstance(payload, list):
            raise RuntimeError(f"Expected list payload from GitHub API, got {type(payload).__name__}.")

        items.extend(payload)
        next_url = extract_next_link(headers)

    return items

def collapse_whitespace(text: str) -> str:
    """Collapse repeated whitespace into single spaces while preserving paragraph intent."""
    normalized = text.replace("\r\n", "\n").replace("\r", "\n")
    normalized = LINE_BREAK_NORMALIZER.sub("\n\n", normalized)
    normalized = re.sub(r"[ \t]+", " ", normalized)
    normalized = re.sub(r" *\n *", "\n", normalized)
    return normalized.strip()


def truncate_text(text: str, max_length: int) -> str:
    """Collapse whitespace and truncate long text for CLI display."""
    collapsed = collapse_whitespace(text)
    if max_length <= 0 or len(collapsed) <= max_length:
        return collapsed

    return collapsed[: max_length - 3].rstrip() + "..."


def filter_open_issue_candidates(items: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Filter GitHub issue list responses down to non-PR issue items."""
    return [item for item in items if not item.get("pull_request")]


def select_single_open_issue_number(items: list[dict[str, Any]]) -> int:
    """Resolve the target issue number when the repository has exactly one open issue."""
    issues = filter_open_issue_candidates(items)
    if not issues:
        raise RuntimeError("No open GitHub issues found for this repository. Pass --issue <number> to inspect one.")

    if len(issues) > 1:
        numbers = ", ".join(str(item.get("number")) for item in issues[:5])
        suffix = "" if len(issues) <= 5 else ", ..."
        raise RuntimeError(
            "Multiple open GitHub issues found for this repository "
            f"({len(issues)} total: {numbers}{suffix}). Pass --issue <number> to inspect one."
        )

    return int(issues[0]["number"])


def resolve_issue_number(issue_number: int | None) -> tuple[int, str]:
    """Resolve the issue number, auto-selecting only when exactly one open issue exists."""
    if issue_number is not None:
        return issue_number, "explicit"

    open_items = fetch_paged_json(f"https://api.github.com/repos/{OWNER}/{REPO}/issues?state=open&per_page=100")
    return select_single_open_issue_number(open_items), "auto-single-open-issue"


def fetch_issue_metadata(issue_number: int) -> dict[str, Any]:
    """Fetch normalized metadata for a GitHub issue."""
    payload, _ = fetch_json(f"https://api.github.com/repos/{OWNER}/{REPO}/issues/{issue_number}")
    if not isinstance(payload, dict):
        raise RuntimeError("Failed to fetch GitHub issue metadata.")

    if payload.get("pull_request"):
        raise RuntimeError(f"Item #{issue_number} is a pull request, not a plain issue.")

    labels = []
    for label in payload.get("labels", []):
        if isinstance(label, dict) and label.get("name"):
            labels.append(str(label["name"]))

    assignees = []
    for assignee in payload.get("assignees", []):
        login = assignee.get("login")
        if login:
            assignees.append(str(login))

    milestone_title = None
    milestone = payload.get("milestone")
    if isinstance(milestone, dict) and milestone.get("title"):
        milestone_title = str(milestone["title"])

    return {
        "number": int(payload["number"]),
        "title": str(payload["title"]),
        "state": str(payload["state"]).upper(),
        "url": str(payload["html_url"]),
        # "user" can be null for deleted accounts; guard before .get("login").
        "author": str((payload.get("user") or {}).get("login") or ""),
        "created_at": str(payload.get("created_at") or ""),
        "updated_at": str(payload.get("updated_at") or ""),
        "labels": labels,
        "assignees": assignees,
        "milestone": milestone_title,
        "body": str(payload.get("body") or ""),
    }


def fetch_issue_comments(issue_number: int) -> list[dict[str, Any]]:
    """Fetch issue comments for the selected issue."""
    return fetch_paged_json(f"https://api.github.com/repos/{OWNER}/{REPO}/issues/{issue_number}/comments?per_page=100")


def fetch_issue_timeline(issue_number: int) -> list[dict[str, Any]]:
    """Fetch issue timeline events when GitHub exposes them to the current client."""
    return fetch_paged_json(
        f"https://api.github.com/repos/{OWNER}/{REPO}/issues/{issue_number}/timeline?per_page=100",
        accept="application/vnd.github+json",
    )

def normalize_comment(comment: dict[str, Any]) -> dict[str, Any]:
    """Normalize an issue comment for structured output."""
    return {
        "id": int(comment.get("id") or 0),
        # "user" can be null for deleted accounts; guard before .get("login").
        "author": str((comment.get("user") or {}).get("login") or ""),
        "created_at": str(comment.get("created_at") or ""),
        "updated_at": str(comment.get("updated_at") or ""),
        "body": str(comment.get("body") or ""),
    }


def normalize_timeline_event(event: dict[str, Any]) -> dict[str, Any]:
    """Normalize the GitHub timeline event fields used by triage output."""
    # "actor" can be null on some timeline events; guard before .get("login").
    actor = str((event.get("actor") or {}).get("login") or "")
    created_at = str(event.get("created_at") or event.get("submitted_at") or "")
    event_type = str(event.get("event") or event.get("__typename") or "unknown")
    label_name = ""
    assignee = ""
    source_issue_number: int | None = None
    source_issue_url = ""
    commit_id = ""

    label = event.get("label")
    if isinstance(label, dict) and label.get("name"):
        label_name = str(label["name"])

    assignee_payload = event.get("assignee")
    if isinstance(assignee_payload, dict) and assignee_payload.get("login"):
        assignee = str(assignee_payload["login"])

    source = event.get("source")
    if isinstance(source, dict):
        issue_payload = source.get("issue")
        if isinstance(issue_payload, dict):
            if issue_payload.get("number"):
                source_issue_number = int(issue_payload["number"])
            if issue_payload.get("html_url"):
                source_issue_url = str(issue_payload["html_url"])

    commit_id_value = event.get("commit_id")
    if isinstance(commit_id_value, str):
        commit_id = commit_id_value

    return {
        "event": event_type,
        "actor": actor,
        "created_at": created_at,
        "label": label_name,
        "assignee": assignee,
        "commit_id": commit_id,
        "source_issue_number": source_issue_number,
        "source_issue_url": source_issue_url,
    }


def gather_text_blocks(issue: dict[str, Any], comments: list[dict[str, Any]]) -> list[str]:
    """Return the issue body plus discussion comment bodies for heuristic parsing."""
    blocks = [issue.get("body", "")]
    blocks.extend(comment.get("body", "") for comment in comments)
    return [block for block in blocks if block]


def has_any_pattern(text_blocks: list[str], patterns: tuple[str, ...]) -> bool:
    """Return whether any normalized text block contains any requested pattern."""
    lowered_blocks = [collapse_whitespace(block).lower() for block in text_blocks]
    return any(pattern in block for block in lowered_blocks for pattern in patterns)


def choose_issue_type_candidates(issue: dict[str, Any], text_blocks: list[str]) -> list[str]:
    """Infer lightweight issue-type candidates from labels and discussion text."""
    labels = [label.lower() for label in issue.get("labels", [])]
    text = "\n".join(text_blocks).lower()
    candidates: list[str] = []

    if any(label in {"bug", "regression"} for label in labels) or "bug" in text or "error" in text or "fails" in text:
        candidates.append("bug")
    if any(label in {"feature", "enhancement"} for label in labels) or "feature" in text or "support" in text:
        candidates.append("feature")
    if any(label in {"documentation", "docs"} for label in labels) or "documentation" in text or "readme" in text:
        candidates.append("docs")
    if any(label in {"question", "help wanted"} for label in labels) or "?" in issue.get("title", ""):
        candidates.append("question")
    if any(label in {"chore", "maintenance", "refactor"} for label in labels) or "cleanup" in text or "refactor" in text:
        candidates.append("maintenance")

    if not candidates:
        candidates.append("question" if issue.get("body", "").strip().endswith("?") else "bug")

    ordered_candidates: list[str] = []
    for candidate in ISSUE_TYPE_CANDIDATES:
        if candidate in candidates:
            ordered_candidates.append(candidate)

    return ordered_candidates

def extract_references_from_text(text: str) -> dict[str, list[str]]:
    """Extract issue, commit, and file-path references from one text block."""
    issue_numbers = sorted({match.group(1) for match in ISSUE_REFERENCE_PATTERN.finditer(text)}, key=int)
    commit_shas = sorted({match.group(0) for match in COMMIT_REFERENCE_PATTERN.finditer(text)})
    file_paths = sorted({match.group(0) for match in FILE_PATH_PATTERN.finditer(text)})

    return {
        "issues": [f"#{number}" for number in issue_numbers],
        "commit_shas": commit_shas,
        "file_paths": file_paths,
    }


def merge_reference_values(values: list[dict[str, list[str]]]) -> dict[str, list[str]]:
    """Merge extracted reference lists while preserving sorted unique output."""
    merged: dict[str, set[str]] = {"issues": set(), "commit_shas": set(), "file_paths": set()}
    for value in values:
        for key in merged:
            merged[key].update(value.get(key, []))

    return {
        "issues": sorted(merged["issues"], key=lambda item: int(item[1:])),
        "commit_shas": sorted(merged["commit_shas"]),
        "file_paths": sorted(merged["file_paths"]),
    }


def build_references(issue: dict[str, Any], comments: list[dict[str, Any]], events: list[dict[str, Any]]) -> dict[str, Any]:
    """Build structured references from issue text and timeline context."""
    extracted = [extract_references_from_text(issue.get("body", ""))]
    extracted.extend(extract_references_from_text(comment.get("body", "")) for comment in comments)
    merged = merge_reference_values(extracted)
    referenced_by_timeline = sorted(
        {
            f"#{event['source_issue_number']}"
            for event in events
            if event.get("source_issue_number") is not None
        },
        key=lambda item: int(item[1:]),
    )

    pull_request_references = sorted(
        {
            issue_reference
            for issue_reference in merged["issues"]
            if issue_reference != f"#{issue['number']}"
        },
        key=lambda item: int(item[1:]),
    )

    return {
        "issues": merged["issues"],
        "pull_requests_or_issues": pull_request_references,
        "commit_shas": merged["commit_shas"],
        "file_paths": merged["file_paths"],
        "timeline_cross_references": referenced_by_timeline,
    }


def build_information_flags(
    issue: dict[str, Any],
    comments: list[dict[str, Any]],
    issue_type_candidates: list[str],
) -> dict[str, bool]:
    """Derive missing-information and readiness flags with issue-type-aware heuristics."""
    text_blocks = gather_text_blocks(issue, comments)
    has_reproduction_steps = has_any_pattern(text_blocks, REPRODUCTION_PATTERNS)
    has_expected_behavior = has_any_pattern(text_blocks, EXPECTED_BEHAVIOR_PATTERNS)
    has_actual_behavior = has_any_pattern(text_blocks, ACTUAL_BEHAVIOR_PATTERNS)
    has_environment_details = has_any_pattern(text_blocks, ENVIRONMENT_PATTERNS)
    has_acceptance_signals = has_any_pattern(text_blocks, ACCEPTANCE_PATTERNS)
    primary_issue_type = issue_type_candidates[0] if issue_type_candidates else "bug"

    if primary_issue_type == "bug":
        needs_clarification = not (
            (has_actual_behavior and (has_reproduction_steps or has_environment_details))
            or has_acceptance_signals
        )
    elif primary_issue_type in {"feature", "docs"}:
        needs_clarification = not (has_expected_behavior or has_acceptance_signals)
    elif primary_issue_type == "maintenance":
        needs_clarification = not (has_expected_behavior or has_actual_behavior or has_acceptance_signals)
    else:
        needs_clarification = not (has_expected_behavior or has_actual_behavior or has_acceptance_signals)

    return {
        "has_reproduction_steps": has_reproduction_steps,
        "has_expected_behavior": has_expected_behavior,
        "has_actual_behavior": has_actual_behavior,
        "has_environment_details": has_environment_details,
        "has_acceptance_signals": has_acceptance_signals,
        "needs_clarification": needs_clarification,
    }


def choose_affected_topics(issue: dict[str, Any], comments: list[dict[str, Any]]) -> list[str]:
    """Map the issue discussion to likely active topics when obvious keyword matches exist."""
    text = "\n".join(gather_text_blocks(issue, comments)).lower()
    matches: list[str] = []
    for topic, keywords in ACTIVE_TOPIC_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            matches.append(topic)

    return matches


def choose_next_action(
    information_flags: dict[str, bool],
    issue_type_candidates: list[str],
    affected_topics: list[str],
) -> str:
    """Choose the next handling mode for boot handoff."""
    if information_flags["needs_clarification"]:
        return "clarify-issue-before-code"
    if affected_topics:
        return "resume-existing-topic-with-boot"
    if "docs" in issue_type_candidates and issue_type_candidates[0] == "docs":
        return "start-new-docs-topic-with-boot"
    return "start-new-topic-with-boot"


def build_triage_hints(issue: dict[str, Any], comments: list[dict[str, Any]]) -> dict[str, Any]:
    """Build lightweight, reviewable triage hints for boot follow-up."""
    text_blocks = gather_text_blocks(issue, comments)
    issue_type_candidates = choose_issue_type_candidates(issue, text_blocks)
    information_flags = build_information_flags(issue, comments, issue_type_candidates)
    affected_topics = choose_affected_topics(issue, comments)
    next_action = choose_next_action(information_flags, issue_type_candidates, affected_topics)

    return {
        "issue_type_candidates": issue_type_candidates,
        "information_flags": information_flags,
        "affected_active_topics": affected_topics,
        "next_action": next_action,
        "boot_handoff": {
            "recommended_skill": "gframework-boot",
            "mode": "resume" if affected_topics else "new",
            "notes": (
                "Use gframework-boot to verify the issue against local code and active ai-plan topics."
                if not information_flags["needs_clarification"]
                else "Use gframework-boot to record a clarification-first task before changing code."
            ),
        },
    }


def build_result(issue_number: int, branch: str, resolution_mode: str) -> dict[str, Any]:
    """Build the full issue review payload for the selected issue."""
    parse_warnings: list[str] = []
    issue = fetch_issue_metadata(issue_number)
    raw_comments = fetch_issue_comments(issue_number)
    comments = [normalize_comment(comment) for comment in raw_comments]

    events: list[dict[str, Any]] = []
|
||||
try:
|
||||
raw_events = fetch_issue_timeline(issue_number)
|
||||
events = [normalize_timeline_event(event) for event in raw_events]
|
||||
except Exception as error: # noqa: BLE001
|
||||
parse_warnings.append(f"Issue timeline could not be fetched or parsed: {error}")
|
||||
|
||||
references = build_references(issue, comments, events)
|
||||
triage_hints = build_triage_hints(issue, comments)
|
||||
|
||||
return {
|
||||
"issue": {
|
||||
**issue,
|
||||
"resolved_from_branch": branch,
|
||||
"resolution_mode": resolution_mode,
|
||||
},
|
||||
"discussion": {
|
||||
"comment_count": len(comments),
|
||||
"comments": comments,
|
||||
},
|
||||
"events": {
|
||||
"count": len(events),
|
||||
"items": events,
|
||||
},
|
||||
"references": references,
|
||||
"triage_hints": triage_hints,
|
||||
"parse_warnings": parse_warnings,
|
||||
}
|
||||
|
||||
|
||||
def write_json_output(result: dict[str, Any], output_path: str) -> str:
|
||||
"""Write the full JSON result to disk and return the destination path."""
|
||||
destination_path = Path(output_path).expanduser()
|
||||
destination_path.parent.mkdir(parents=True, exist_ok=True)
|
||||
destination_path.write_text(json.dumps(result, ensure_ascii=False, indent=2), encoding="utf-8")
|
||||
return str(destination_path)
|
||||
|
||||
|
||||
def summarize_events(events: list[dict[str, Any]]) -> list[str]:
|
||||
"""Convert normalized events into concise text lines."""
|
||||
lines: list[str] = []
|
||||
for event in events:
|
||||
summary = f"- {event['event']}"
|
||||
details: list[str] = []
|
||||
if event.get("actor"):
|
||||
details.append(f"actor={event['actor']}")
|
||||
if event.get("label"):
|
||||
details.append(f"label={event['label']}")
|
||||
if event.get("assignee"):
|
||||
details.append(f"assignee={event['assignee']}")
|
||||
if event.get("source_issue_number") is not None:
|
||||
details.append(f"source_issue=#{event['source_issue_number']}")
|
||||
if event.get("commit_id"):
|
||||
details.append(f"commit={event['commit_id'][:12]}")
|
||||
if event.get("created_at"):
|
||||
details.append(f"at={event['created_at']}")
|
||||
if details:
|
||||
summary += " (" + ", ".join(details) + ")"
|
||||
lines.append(summary)
|
||||
return lines
|
||||
|
||||
|
||||
def format_text(
|
||||
result: dict[str, Any],
|
||||
*,
|
||||
sections: list[str] | None = None,
|
||||
max_description_length: int = 400,
|
||||
json_output_path: str | None = None,
|
||||
) -> str:
|
||||
"""Format the result payload into concise text output."""
|
||||
lines: list[str] = []
|
||||
selected_sections = set(sections or DISPLAY_SECTION_CHOICES)
|
||||
issue = result["issue"]
|
||||
triage_hints = result["triage_hints"]
|
||||
discussion = result["discussion"]
|
||||
events = result["events"]
|
||||
references = result["references"]
|
||||
|
||||
if "issue" in selected_sections:
|
||||
lines.append(f"Issue #{issue['number']}: {issue['title']}")
|
||||
lines.append(f"State: {issue['state']}")
|
||||
lines.append(f"Author: {issue['author']}")
|
||||
lines.append(f"Labels: {', '.join(issue['labels']) if issue['labels'] else '(none)'}")
|
||||
lines.append(f"Assignees: {', '.join(issue['assignees']) if issue['assignees'] else '(none)'}")
|
||||
lines.append(f"Milestone: {issue['milestone'] or '(none)'}")
|
||||
lines.append(f"Created: {issue['created_at']}")
|
||||
lines.append(f"Updated: {issue['updated_at']}")
|
||||
lines.append(f"Resolved from branch: {issue['resolved_from_branch'] or '(not branch-based)'}")
|
||||
lines.append(f"Resolution mode: {issue['resolution_mode']}")
|
||||
lines.append(f"URL: {issue['url']}")
|
||||
if issue["body"]:
|
||||
lines.append("Body:")
|
||||
lines.append(truncate_text(issue["body"], max_description_length))
|
||||
|
||||
if "summary" in selected_sections:
|
||||
lines.append("")
|
||||
lines.append("Triage summary:")
|
||||
lines.append("- Issue type candidates: " + ", ".join(triage_hints["issue_type_candidates"]))
|
||||
information_flags = triage_hints["information_flags"]
|
||||
lines.append(
|
||||
"- Information flags: "
|
||||
+ ", ".join(
|
||||
[
|
||||
f"repro={'yes' if information_flags['has_reproduction_steps'] else 'no'}",
|
||||
f"expected={'yes' if information_flags['has_expected_behavior'] else 'no'}",
|
||||
f"actual={'yes' if information_flags['has_actual_behavior'] else 'no'}",
|
||||
f"environment={'yes' if information_flags['has_environment_details'] else 'no'}",
|
||||
f"acceptance={'yes' if information_flags['has_acceptance_signals'] else 'no'}",
|
||||
f"needs_clarification={'yes' if information_flags['needs_clarification'] else 'no'}",
|
||||
]
|
||||
)
|
||||
)
|
||||
lines.append(
|
||||
"- Affected active topics: "
|
||||
+ (", ".join(triage_hints["affected_active_topics"]) if triage_hints["affected_active_topics"] else "(none)")
|
||||
)
|
||||
lines.append(f"- Next action: {triage_hints['next_action']}")
|
||||
lines.append(f"- Boot handoff: {triage_hints['boot_handoff']['notes']}")
|
||||
|
||||
if "comments" in selected_sections:
|
||||
lines.append("")
|
||||
lines.append(f"Discussion comments: {discussion['comment_count']}")
|
||||
for comment in discussion["comments"]:
|
||||
lines.append(f"- {comment['author']} at {comment['created_at']}")
|
||||
lines.append(f" {truncate_text(comment['body'], max_description_length)}")
|
||||
|
||||
if "events" in selected_sections:
|
||||
lines.append("")
|
||||
lines.append(f"Timeline events: {events['count']}")
|
||||
lines.extend(summarize_events(events["items"]))
|
||||
|
||||
if "references" in selected_sections:
|
||||
lines.append("")
|
||||
lines.append("References:")
|
||||
lines.append("- Mentioned issues: " + (", ".join(references["issues"]) if references["issues"] else "(none)"))
|
||||
lines.append(
|
||||
"- Cross references: "
|
||||
+ (
|
||||
", ".join(references["timeline_cross_references"])
|
||||
if references["timeline_cross_references"]
|
||||
else "(none)"
|
||||
)
|
||||
)
|
||||
lines.append(
|
||||
"- Related issue/PR mentions: "
|
||||
+ (
|
||||
", ".join(references["pull_requests_or_issues"])
|
||||
if references["pull_requests_or_issues"]
|
||||
else "(none)"
|
||||
)
|
||||
)
|
||||
lines.append("- Commit SHAs: " + (", ".join(references["commit_shas"]) if references["commit_shas"] else "(none)"))
|
||||
lines.append("- File paths: " + (", ".join(references["file_paths"]) if references["file_paths"] else "(none)"))
|
||||
|
||||
if result["parse_warnings"] and "warnings" in selected_sections:
|
||||
lines.append("")
|
||||
lines.append("Warnings:")
|
||||
for warning in result["parse_warnings"]:
|
||||
lines.append(f"- {truncate_text(warning, max_description_length)}")
|
||||
|
||||
if json_output_path:
|
||||
lines.append("")
|
||||
lines.append(f"Full JSON written to: {json_output_path}")
|
||||
|
||||
return "\n".join(lines)
|
||||
|
||||
|
||||
def parse_args() -> argparse.Namespace:
|
||||
"""Parse CLI arguments."""
|
||||
parser = argparse.ArgumentParser()
|
||||
parser.add_argument("--branch", help="Override the current branch name.")
|
||||
parser.add_argument("--issue", type=int, help="Fetch a specific issue number instead of auto-selecting one.")
|
||||
parser.add_argument("--format", choices=("text", "json"), default="text")
|
||||
parser.add_argument(
|
||||
"--json-output",
|
||||
help="Write the full JSON result to a file. When used with --format text, stdout stays concise and points to the file.",
|
||||
)
|
||||
parser.add_argument(
|
||||
"--section",
|
||||
action="append",
|
||||
choices=DISPLAY_SECTION_CHOICES,
|
||||
help="Limit text output to specific sections. Can be passed multiple times.",
|
||||
)
|
||||
parser.add_argument(
|
||||
"--max-description-length",
|
||||
type=int,
|
||||
default=400,
|
||||
help="Truncate long text bodies in text output to this many characters.",
|
||||
)
|
||||
return parser.parse_args()
|
||||
|
||||
|
||||
def main() -> None:
|
||||
"""Run the CLI entry point."""
|
||||
args = parse_args()
|
||||
branch = args.branch or get_current_branch()
|
||||
issue_number, resolution_mode = resolve_issue_number(args.issue)
|
||||
result = build_result(issue_number, branch, resolution_mode)
|
||||
|
||||
json_output_path: str | None = None
|
||||
if args.json_output:
|
||||
json_output_path = write_json_output(result, args.json_output)
|
||||
|
||||
if args.format == "json":
|
||||
print(json.dumps(result, ensure_ascii=False, indent=2))
|
||||
return
|
||||
|
||||
print(
|
||||
format_text(
|
||||
result,
|
||||
sections=args.section,
|
||||
max_description_length=args.max_description_length,
|
||||
json_output_path=json_output_path,
|
||||
)
|
||||
)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
try:
|
||||
main()
|
||||
except Exception as error: # noqa: BLE001
|
||||
print(str(error), file=sys.stderr)
|
||||
sys.exit(1)
|
||||
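For orientation, the next-action routing above can be exercised in isolation. This is a simplified, standalone sketch that mirrors the script's branch order (clarification first, then topic resume, then docs), not the script itself.

```python
# Standalone sketch of the routing implemented by choose_next_action above;
# names and branch order mirror the script, but this is an illustrative
# reconstruction, not the real module.
def choose_next_action(information_flags, issue_type_candidates, affected_topics):
    # Clarification always wins: never route unclear issues into code work.
    if information_flags["needs_clarification"]:
        return "clarify-issue-before-code"
    # Prefer resuming an already-active topic over opening a new one.
    if affected_topics:
        return "resume-existing-topic-with-boot"
    # Docs-first issues get a dedicated docs topic.
    if issue_type_candidates and issue_type_candidates[0] == "docs":
        return "start-new-docs-topic-with-boot"
    return "start-new-topic-with-boot"


print(choose_next_action({"needs_clarification": False}, ["docs"], []))
# → start-new-docs-topic-with-boot
```

Note that a `needs_clarification` flag short-circuits everything else, which is why the text output surfaces it as the first triage signal.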
@ -0,0 +1,94 @@
#!/usr/bin/env python3
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

"""Regression tests for the GFramework issue review fetch helper."""

from __future__ import annotations

import importlib.util
from pathlib import Path
import unittest


SCRIPT_PATH = Path(__file__).with_name("fetch_current_issue_review.py")
MODULE_SPEC = importlib.util.spec_from_file_location("fetch_current_issue_review", SCRIPT_PATH)
if MODULE_SPEC is None or MODULE_SPEC.loader is None:
    raise RuntimeError(f"Unable to load module from {SCRIPT_PATH}.")

MODULE = importlib.util.module_from_spec(MODULE_SPEC)
MODULE_SPEC.loader.exec_module(MODULE)


class SelectSingleOpenIssueNumberTests(unittest.TestCase):
    """Cover auto-resolution rules for open GitHub issues."""

    def test_select_single_open_issue_number_filters_pull_requests(self) -> None:
        """Pull requests in the issues API must not block the single-open-issue path."""
        selected = MODULE.select_single_open_issue_number(
            [
                {"number": 10, "pull_request": {"url": "https://example.test/pr/10"}},
                {"number": 11},
            ]
        )

        self.assertEqual(selected, 11)

    def test_select_single_open_issue_number_rejects_multiple_plain_issues(self) -> None:
        """Auto-resolution must stop when more than one plain issue is open."""
        with self.assertRaisesRegex(RuntimeError, "Multiple open GitHub issues found"):
            MODULE.select_single_open_issue_number([{"number": 11}, {"number": 12}])


class ExtractReferencesFromTextTests(unittest.TestCase):
    """Cover lightweight reference extraction used by the text and JSON output."""

    def test_extract_references_from_text_finds_issue_commit_and_path_mentions(self) -> None:
        """The helper should retain the high-signal references needed for follow-up triage."""
        references = MODULE.extract_references_from_text(
            "See #123, commit abcdef1234567890, and GFramework.Core/Systems/Runner.cs for the failing path."
        )

        self.assertEqual(references["issues"], ["#123"])
        self.assertEqual(references["commit_shas"], ["abcdef1234567890"])
        self.assertEqual(references["file_paths"], ["GFramework.Core/Systems/Runner.cs"])


class BuildTriageHintsTests(unittest.TestCase):
    """Cover next-action classification for non-bug issue flows."""

    def test_build_triage_hints_routes_docs_issue_to_docs_topic_without_bug_style_clarification(self) -> None:
        """Docs issues with a clear requested change should not be forced through bug-style clarification."""
        triage_hints = MODULE.build_triage_hints(
            {
                "title": "Update documentation landing page",
                "labels": ["docs"],
                "body": "The guide should explain the landing-page layout for new contributors.",
            },
            [],
        )

        self.assertEqual(triage_hints["issue_type_candidates"][0], "docs")
        self.assertEqual(triage_hints["affected_active_topics"], [])
        self.assertFalse(triage_hints["information_flags"]["needs_clarification"])
        self.assertEqual(triage_hints["next_action"], "start-new-docs-topic-with-boot")

    def test_build_triage_hints_routes_feature_issue_to_new_topic_when_request_is_clear(self) -> None:
        """Feature requests with explicit desired behavior should stay actionable without fake bug repro gates."""
        triage_hints = MODULE.build_triage_hints(
            {
                "title": "Support release note previews",
                "labels": ["enhancement"],
                "body": "The workflow should support previewing generated notes before completion.",
            },
            [],
        )

        self.assertEqual(triage_hints["issue_type_candidates"][0], "feature")
        self.assertEqual(triage_hints["affected_active_topics"], [])
        self.assertFalse(triage_hints["information_flags"]["needs_clarification"])
        self.assertEqual(triage_hints["next_action"], "start-new-topic-with-boot")


if __name__ == "__main__":
    unittest.main()
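The test file above loads the script under test via `importlib.util.spec_from_file_location` because the helper lives next to the tests rather than on `sys.path`. That loading pattern can be exercised standalone; the module name and contents below are throwaway examples, not part of the repository.

```python
import importlib.util
import tempfile
from pathlib import Path

# Write a throwaway module to disk, then load it the same way the test file
# loads fetch_current_issue_review.py: build a spec from the file path,
# materialize a module object, and execute it.
with tempfile.TemporaryDirectory() as tmp:
    module_path = Path(tmp) / "throwaway_module.py"
    module_path.write_text("ANSWER = 42\n", encoding="utf-8")

    spec = importlib.util.spec_from_file_location("throwaway_module", module_path)
    if spec is None or spec.loader is None:
        raise RuntimeError(f"Unable to load module from {module_path}.")

    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)

print(module.ANSWER)  # → 42
```

The guard on `spec is None or spec.loader is None` mirrors the test file's own defensive check before `exec_module`.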
@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

# yaml-language-server: $schema=https://coderabbit.ai/integrations/schema.v2.json
language: "zh-CN"
early_access: false

@ -1,4 +1,7 @@
license_overrides:
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

license_overrides:
  NETStandard.Library: MIT
  Microsoft.NETCore.Platforms: MIT
  System.Buffers: MIT
3 .github/ISSUE_TEMPLATE/01-bug-report.yml vendored

@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: "Bug Report / 缺陷报告"
description: "Report a reproducible defect in GFramework. / 报告可稳定复现的 GFramework 缺陷。"
title: "[Bug]: "
@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: "Feature Request / 功能建议"
description: "Suggest a new capability or an API improvement. / 提出新能力或 API 改进建议。"
title: "[Feature]: "
3 .github/ISSUE_TEMPLATE/03-documentation.yml vendored

@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: "Documentation / 文档改进"
description: "Report missing, outdated, or unclear documentation. / 报告缺失、过期或不清晰的文档。"
title: "[Docs]: "
3 .github/ISSUE_TEMPLATE/04-question.yml vendored

@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: "Question / 使用咨询"
description: "Ask for guidance about usage, behavior, or adoption. / 询问用法、行为或接入方式。"
title: "[Question]: "
3 .github/ISSUE_TEMPLATE/config.yml vendored

@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

blank_issues_enabled: false
contact_links:
  - name: "Search Existing Issues / 搜索现有 Issues"
3 .github/actions/validate-pat/action.yml vendored

@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: Validate PAT
description: Validate that the release PAT can access the repository and push tags.
4 .github/cliff.toml vendored

@ -54,6 +54,9 @@ https://github.com/{{ remote.github.owner }}/{{ remote.github.repo }}

{% endif -%}

{% if commits | length > 0 -%}
## What's Changed

{% for group, commits in commits | group_by(attribute="group") -%}
### {{ group | striptags | trim }}
{% for commit in commits -%}
@ -61,6 +64,7 @@ https://github.com/{{ remote.github.owner }}/{{ remote.github.repo }}
{% endfor %}

{% endfor -%}
{% endif -%}

{% if previous and previous.version and version -%}
Full Changelog: [{{ previous.version }}...{{ version }}]({{ self::remote_url() }}/compare/{{ previous.version }}...{{ version }})
3 .github/dependabot.yml vendored

@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

version: 2
updates:
  # ===== NuGet dependencies (all projects) =====
9 .github/workflows/auto-tag.yml vendored

@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: Semantic Release Version and Tag

on:
@ -14,6 +17,7 @@ jobs:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: read
    outputs:
      published: ${{ steps.semantic_release.outputs.new_release_published }}
      last_tag: ${{ steps.semantic_release.outputs.last_release_git_tag }}
@ -68,7 +72,7 @@ jobs:
        env:
          OUTPUT: PREVIEW_RELEASE_NOTES.md
          GITHUB_REPO: ${{ github.repository }}
          GITHUB_TOKEN: ${{ secrets.PAT_TOKEN }}
          GITHUB_TOKEN: ${{ github.token }}

      - name: Write preview summary
        env:
@ -105,6 +109,7 @@ jobs:
    runs-on: ubuntu-latest
    permissions:
      contents: write
      pull-requests: read
    environment:
      name: release-approval
    steps:
@ -154,7 +159,7 @@ jobs:
        env:
          OUTPUT: PUBLISHED_RELEASE_NOTES.md
          GITHUB_REPO: ${{ github.repository }}
          GITHUB_TOKEN: ${{ secrets.PAT_TOKEN }}
          GITHUB_TOKEN: ${{ github.token }}

      - name: Write release summary
        env:
71 .github/workflows/benchmark.yml vendored Normal file

@ -0,0 +1,71 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: Benchmark

on:
  workflow_dispatch:
    inputs:
      benchmark_filter:
        description: 'Optional BenchmarkDotNet filter; when empty, only a Release build of the benchmark project is performed'
        required: false
        default: ''
        type: string

permissions:
  contents: read

jobs:
  benchmark:
    name: Benchmark Build Or Run
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v6
        with:
          fetch-depth: 0

      - name: Setup .NET 10
        uses: actions/setup-dotnet@v5
        with:
          dotnet-version: 10.0.x

      - name: Cache NuGet packages
        uses: actions/cache@v5
        with:
          path: |
            ~/.nuget/packages
            ~/.local/share/NuGet
          key: ${{ runner.os }}-nuget-benchmarks-${{ hashFiles('GFramework.Cqrs.Benchmarks/*.csproj', 'GFramework.Cqrs/*.csproj', 'GFramework.Cqrs.Abstractions/*.csproj', 'GFramework.Core/*.csproj', 'GFramework.Core.Abstractions/*.csproj', '**/nuget.config') }}

      - name: Restore benchmark project
        run: dotnet restore GFramework.Cqrs.Benchmarks/GFramework.Cqrs.Benchmarks.csproj

      - name: Build benchmark project
        run: dotnet build GFramework.Cqrs.Benchmarks/GFramework.Cqrs.Benchmarks.csproj -c Release --no-restore

      - name: Report build-only mode
        if: ${{ inputs.benchmark_filter == '' }}
        run: |
          echo "No benchmark filter provided."
          echo "Workflow completed after validating the benchmark project build."

      - name: Run filtered benchmarks
        if: ${{ inputs.benchmark_filter != '' }}
        env:
          BENCHMARK_FILTER: ${{ inputs.benchmark_filter }}
        run: |
          set -euo pipefail
          dotnet run --project GFramework.Cqrs.Benchmarks/GFramework.Cqrs.Benchmarks.csproj -c Release --no-build -- \
            --filter "$BENCHMARK_FILTER"

      - name: Upload BenchmarkDotNet artifacts
        if: ${{ always() && inputs.benchmark_filter != '' }}
        uses: actions/upload-artifact@v7
        with:
          name: benchmark-artifacts
          path: |
            BenchmarkDotNet.Artifacts/**
            GFramework.Cqrs.Benchmarks/bin/Release/net10.0/BenchmarkDotNet.Artifacts/**
          if-no-files-found: ignore
23 .github/workflows/ci.yml vendored

@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

# CI/CD workflow configuration: build and test the .NET projects
# This workflow runs only when a pull request targeting any branch is created or updated
name: CI - Build & Test
@ -28,6 +31,13 @@ jobs:
      - name: Validate C# naming
        run: bash scripts/validate-csharp-naming.sh

      # Verify that repository-maintained sources include the Apache-2.0 file header declaration
      - name: Validate license headers
        run: python3 scripts/license-header.py --check

      - name: Validate runtime-generator boundaries
        run: python3 scripts/validate-runtime-generator-boundaries.py

      # Cache MegaLinter
      - name: Cache MegaLinter
        uses: actions/cache@v5
@ -145,6 +155,19 @@ jobs:
      - name: Build
        run: dotnet build GFramework.sln -c Release --no-restore

      - name: Pack published modules
        run: |
          rm -rf ./packages
          dotnet pack GFramework.sln \
            -c Release \
            --no-build \
            --no-restore \
            -o ./packages \
            -p:IncludeSymbols=false

      - name: Validate packed modules
        run: bash scripts/validate-packed-modules.sh ./packages

      # Run the unit tests, writing TRX-format results to the TestResults directory
      # Run the test projects sequentially to avoid concurrent dotnet test processes producing a false-red state (all TRX results green but the step still fails)
      - name: Test All Projects
3 .github/workflows/codeql.yml vendored

@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

# GitHub Actions workflow configuration: CodeQL static code analysis
# This workflow analyzes the C# projects for security vulnerabilities and code quality
name: "CodeQL"
9 .github/workflows/license-compliance.yml vendored

@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: License Compliance (Feluda)

on:
@ -62,6 +65,7 @@ jobs:
      # with: configures what exactly gets uploaded
      # name: the artifact name identifying the uploaded file set
      # path: the list of file paths to upload (multi-line format supported)
      # third-party-licenses/**: manually maintained license texts for referenced source code
      - name: Upload compliance artifacts
        uses: actions/upload-artifact@v7
        with:
@ -69,6 +73,7 @@ jobs:
          path: |
            NOTICE
            THIRD_PARTY_LICENSES.md
            third-party-licenses/**
            sbom.spdx.json
            sbom.cyclonedx.json
            sbom-spdx-validation.txt
@ -79,15 +84,17 @@ jobs:
      # The archive contains the following files:
      # - NOTICE: the project notice file
      # - THIRD_PARTY_LICENSES.md: the third-party license list
      # - third-party-licenses/: manually maintained license texts for referenced source code
      # - sbom.spdx.json: the software bill of materials in SPDX format
      # - sbom.cyclonedx.json: the software bill of materials in CycloneDX format
      # - sbom-spdx-validation.txt: the SPDX format validation result
      # - sbom-cyclonedx-validation.txt: the CycloneDX format validation result
      - name: Package compliance bundle
        run: |
          zip license-compliance.zip \
          zip -r license-compliance.zip \
            NOTICE \
            THIRD_PARTY_LICENSES.md \
            third-party-licenses \
            sbom.spdx.json \
            sbom.cyclonedx.json \
            sbom-spdx-validation.txt \
54 .github/workflows/license-header-fix.yml vendored Normal file

@ -0,0 +1,54 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

# Maintainer-triggered workflow that fixes missing Apache-2.0 file headers.
name: License Header Fix

on:
  workflow_dispatch:
    inputs:
      base_branch:
        description: Branch to fix and target with the generated pull request.
        required: true
        default: main

permissions:
  contents: write
  pull-requests: write

jobs:
  fix-license-headers:
    name: Create license header fix PR
    runs-on: ubuntu-latest

    steps:
      - name: Checkout target branch
        uses: actions/checkout@v6
        with:
          ref: ${{ inputs.base_branch }}

      - name: Add missing license headers
        run: python3 scripts/license-header.py --fix

      - name: Create pull request
        uses: peter-evans/create-pull-request@v8
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          base: ${{ inputs.base_branch }}
          branch: chore/license-headers-${{ github.run_id }}
          delete-branch: true
          commit-message: |
            chore(license): add missing Apache-2.0 file headers

            - Add the missing license declarations to source files
            - Refresh the file-header governance check results
          title: "chore(license): add missing Apache-2.0 file headers"
          body: |
            ## Summary

            - Add the Apache-2.0 file headers missing from repository-maintained source and configuration files
            - Generated with `scripts/license-header.py --fix`

            ## Validation

            - `python3 scripts/license-header.py --check`
3 .github/workflows/publish-docs.yml vendored

@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

# Workflow name: Publish Docs
# Builds and deploys the documentation to GitHub Pages on tag push or manual trigger
@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

name: Publish VS Code Extension

on:
41 .github/workflows/publish.yml vendored

@ -1,3 +1,6 @@
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

# Publish workflow (NuGet + GitHub Packages + GitHub Release)
#
# Purpose: on tag push, automatically build and pack, then publish the same artifacts to both NuGet.org and GitHub Packages in parallel,
@ -79,41 +82,10 @@ jobs:
            -p:IncludeSymbols=false

      - name: Validate packed modules
        run: |
          set -euo pipefail
        run: bash scripts/validate-packed-modules.sh ./packages

          expected_packages=(
            "GeWuYou.GFramework"
            "GeWuYou.GFramework.Core"
            "GeWuYou.GFramework.Core.Abstractions"
            "GeWuYou.GFramework.Core.SourceGenerators"
            "GeWuYou.GFramework.Cqrs"
            "GeWuYou.GFramework.Cqrs.Abstractions"
            "GeWuYou.GFramework.Cqrs.SourceGenerators"
            "GeWuYou.GFramework.Ecs.Arch"
            "GeWuYou.GFramework.Ecs.Arch.Abstractions"
            "GeWuYou.GFramework.Game"
            "GeWuYou.GFramework.Game.Abstractions"
            "GeWuYou.GFramework.Game.SourceGenerators"
            "GeWuYou.GFramework.Godot"
            "GeWuYou.GFramework.Godot.SourceGenerators"
          )

          mapfile -t actual_packages < <(
            find ./packages -maxdepth 1 -type f -name '*.nupkg' -printf '%f\n' \
              | sed -E 's/\.[0-9][0-9A-Za-z.-]*\.nupkg$//' \
              | sort -u
          )

          printf '%s\n' "${expected_packages[@]}" | sort > expected-packages.txt
          printf '%s\n' "${actual_packages[@]}" | sort > actual-packages.txt

          echo "Expected packages:"
          cat expected-packages.txt
          echo "Actual packages:"
          cat actual-packages.txt

          diff -u expected-packages.txt actual-packages.txt
      - name: Validate runtime-generator package boundaries
        run: python3 scripts/validate-runtime-generator-boundaries.py --package-dir ./packages

      - name: Show packages
        run: ls -la ./packages || true
@ -240,6 +212,7 @@ jobs:
    permissions:
      contents: write
      packages: read
      pull-requests: read

    steps:
      - name: Checkout repository (at tag)
@ -1,4 +1,7 @@
# Configuration file for the code-quality tooling parameters and rules
# Copyright (c) 2025-2026 GeWuYou
# SPDX-License-Identifier: Apache-2.0

# Configuration file for the code-quality tooling parameters and rules
# Includes global excluded directories, enabled/disabled linters, language-specific settings, and more

APPLY_FIXES: none
@ -33,6 +33,14 @@
      "type": "refactor",
      "release": "patch"
    },
    {
      "type": "deps",
      "release": "patch"
    },
    {
      "type": "security",
      "release": "patch"
    },
    {
      "type": "docs",
      "release": false
@ -70,6 +78,45 @@
    "@semantic-release/release-notes-generator",
    {
      "preset": "conventionalcommits",
      "presetConfig": {
        "types": [
          {
            "type": "feat",
            "section": "Features",
            "hidden": false
          },
          {
            "type": "fix",
            "section": "Bug Fixes",
            "hidden": false
          },
          {
            "type": "perf",
            "section": "Performance Improvements",
            "hidden": false
          },
          {
            "type": "refactor",
            "section": "Refactoring",
            "hidden": false
          },
          {
            "type": "deps",
            "section": "Dependency Updates",
            "hidden": false
          },
          {
            "type": "security",
            "section": "Security Fixes",
            "hidden": false
          },
          {
            "type": "revert",
            "section": "Reverts",
            "hidden": false
          }
        ]
      },
      "parserOpts": {
        "noteKeywords": [
          "BREAKING CHANGE",
24
AGENTS.md
24
AGENTS.md
@@ -60,6 +60,10 @@ All AI agents and contributors must follow these rules when writing, reviewing,
 `minor` segment.
 - Use `fix` for behavior corrections, `perf` for observable performance improvements, and `refactor` only for
 non-feature code restructuring; these should raise the next released version's `patch` segment.
+- Use `deps` for dependency version updates, dependency lockfile refreshes, and package maintenance that should raise
+the next released version's `patch` segment.
+- Use `security` for vulnerability fixes, dependency security mitigations, and security configuration corrections
+that should raise the next released version's `patch` segment.
 - Use `docs`, `test`, `chore`, `build`, `ci`, `style` for their literal categories; do not encode these changes as
 `feat` just because they feel important. These categories MUST NOT trigger a release.
 - Use `BREAKING CHANGE` in the commit footer or `!` after the type / scope header (for example `feat!:` or

@@ -79,6 +83,23 @@ All AI agents and contributors must follow these rules when writing, reviewing,
 - The branch naming rule for a new task branch is `<type>/<topic-or-scope>`, where `<type>` should match the intended
 Conventional Commit category as closely as practical.
 
+## License Header Rules
+
+- Repository-maintained source and configuration files that are supported by `scripts/license-header.py` MUST include an
+Apache-2.0 file header before the task is considered complete.
+- When creating or modifying supported files, contributors MUST preserve an existing compliant header or add the SPDX
+header generated by `python3 scripts/license-header.py --fix`.
+- Before committing changes that add or modify supported source/configuration files, contributors MUST run
+`python3 scripts/license-header.py --check` and resolve any missing or misplaced headers.
+- For files with shebang lines, keep the shebang as the first line and place the license header immediately after it.
+- For XML/MSBuild files with an XML declaration, keep the XML declaration as the first node and place the license header
+immediately after it.
+- Do not add project license headers to excluded or third-party areas such as `.agents/**`, `ai-libs/**`,
+`third-party-licenses/**`, generated snapshots, binary assets, lock files, and generated build output. Treat
+`scripts/license-header.py` as the authoritative include/exclude policy for this check.
+- If CI reports a license-header failure, either fix it locally with `python3 scripts/license-header.py --fix` or, for
+maintainer-owned cleanup, use the manual `License Header Fix` GitHub Actions workflow to create a reviewed repair PR.
+
 ## Repository Boot Skill
 
 - The repository-maintained Codex boot skill lives at `.codex/skills/gframework-boot/`.

@@ -191,6 +212,9 @@ All generated or modified code MUST include clear and meaningful comments where
 - Private fields: `_camelCase`
 - Keep abstractions projects free of implementation details and engine-specific dependencies.
 - Preserve existing module boundaries. Do not introduce new cross-module dependencies without clear architectural need.
+- Framework runtime, abstractions, and meta-package projects MUST NOT reference `*.SourceGenerators*` projects or packages,
+and MUST NOT use source-generator attributes such as `GenerateEnumExtensions` or `ContextAware`. Those capabilities are
+reserved for consumer projects, generator projects, examples explicitly meant to demonstrate generator usage, and related tests.
 
 ### Formatting
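The license-header rules added to AGENTS.md above can be sketched with a minimal check. The real include/exclude policy lives in `scripts/license-header.py`; this `has_license_header` helper is an assumption, simplified to the shebang and SPDX-marker behavior the rules describe.

```python
# Simplified sketch of the header check described in the rules above.
SPDX_LINE = "SPDX-License-Identifier: Apache-2.0"


def has_license_header(lines, max_scan=5):
    """Check the first few lines for the SPDX marker, skipping a shebang.

    Per the rules, a shebang stays on line 1 and the header follows it.
    """
    if lines and lines[0].startswith("#!"):
        lines = lines[1:]
    return any(SPDX_LINE in line for line in lines[:max_scan])


print(has_license_header([
    "// Copyright (c) 2025-2026 GeWuYou",
    "// SPDX-License-Identifier: Apache-2.0",
]))  # True
```

The actual script also handles XML declarations and per-path exclusions, which this sketch omits.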
@@ -1,11 +1,16 @@
+<!--
+Copyright (c) 2025-2026 GeWuYou
+SPDX-License-Identifier: Apache-2.0
+-->
+
 <Project>
 <!-- Keep repository-wide analyzer behavior consistent while allowing only selected projects to opt into polyfills. -->
 <ItemGroup>
-<PackageReference Include="Meziantou.Analyzer" Version="3.0.58">
+<PackageReference Include="Meziantou.Analyzer" Version="3.0.60">
 <PrivateAssets>all</PrivateAssets>
 <IncludeAssets>runtime; build; native; contentfiles; analyzers</IncludeAssets>
 </PackageReference>
-<PackageReference Update="Meziantou.Polyfill" Version="1.0.120">
+<PackageReference Update="Meziantou.Polyfill" Version="1.0.121">
 <PrivateAssets>all</PrivateAssets>
 <IncludeAssets>runtime; build; native; contentfiles; analyzers</IncludeAssets>
 </PackageReference>
@@ -1,3 +1,8 @@
+<!--
+Copyright (c) 2025-2026 GeWuYou
+SPDX-License-Identifier: Apache-2.0
+-->
+
 <Project>
 
 <!--

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using System.Collections.Concurrent;
 
 namespace GFramework.Core.Abstractions.Architectures;

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using GFramework.Core.Abstractions.Enums;
 
 namespace GFramework.Core.Abstractions.Architectures;

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using System.Reflection;
 using GFramework.Core.Abstractions.Lifecycle;
 using GFramework.Core.Abstractions.Model;

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using GFramework.Core.Abstractions.Properties;
 
 namespace GFramework.Core.Abstractions.Architectures;
 

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using GFramework.Core.Abstractions.Command;
 using GFramework.Core.Abstractions.Environment;
 using GFramework.Core.Abstractions.Events;

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Architectures;
 
 /// <summary>

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using GFramework.Core.Abstractions.Enums;
 
 namespace GFramework.Core.Abstractions.Architectures;
 

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Architectures;
 
 /// <summary>
 /// Architecture module interface, inheriting from the architecture lifecycle interface.

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using GFramework.Core.Abstractions.Enums;
 
 namespace GFramework.Core.Abstractions.Architectures;
 

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using GFramework.Core.Abstractions.Command;
 using GFramework.Core.Abstractions.Events;
 using GFramework.Core.Abstractions.Ioc;

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using GFramework.Core.Abstractions.Ioc;
 using GFramework.Core.Abstractions.Lifecycle;
 

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using GFramework.Core.Abstractions.Ioc;
 
 namespace GFramework.Core.Abstractions.Architectures;

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Bases;
 
 /// <summary>
 /// Interface representing a key-value pair, defining the general key-value data structure contract

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Bases;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Bases;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using GFramework.Core.Abstractions.Rule;
 
 namespace GFramework.Core.Abstractions.Command;

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using GFramework.Core.Abstractions.Rule;
 
 namespace GFramework.Core.Abstractions.Command;

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Command;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using GFramework.Core.Abstractions.Events;
 using GFramework.Core.Abstractions.Utility;
 

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Controller;
 
 /// <summary>
 /// Controller marker interface used to identify controller components

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Coroutine;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Coroutine;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Coroutine;
 
 /// <summary>

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Coroutine;
 
 /// <summary>
 /// Enum representing the execution state of a coroutine

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Coroutine;
 
 /// <summary>

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Coroutine;
 
 /// <summary>
 /// Time source interface providing the current time, delta time, and update functionality

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Coroutine;
 
 /// <summary>
 /// Defines an awaitable instruction interface used to control asynchronous operations in the coroutine system

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using System.ComponentModel;
 
 namespace GFramework.Core.Abstractions.Cqrs;

@@ -1,3 +1,8 @@
+<!--
+Copyright (c) 2025-2026 GeWuYou
+SPDX-License-Identifier: Apache-2.0
+-->
+
 <Project>
 <!-- import parent: https://docs.microsoft.com/en-us/visualstudio/msbuild/customize-your-build -->
 <PropertyGroup>

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Enums;
 
 /// <summary>
 /// Architecture phase enum defining the key phases of system architecture initialization and runtime

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Environment;
 
 /// <summary>
 /// Defines the environment interface, providing information about the application's runtime environment

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Events;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Events;
 
 /// <summary>

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Events;
 
 /// <summary>
 /// Event interface defining the basic event-registration functionality

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Events;
 
 /// <summary>
 /// Event bus interface providing event sending, registration, and unregistration

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Events;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Events;
 
 /// <summary>

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Events;
 
 /// <summary>
 /// Interface providing unregistration functionality

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Events;
 
 /// <summary>
 /// Interface providing unified unregistration, used to manage the list of objects that need to be unregistered

@@ -1,4 +1,9 @@
+<!--
+Copyright (c) 2025-2026 GeWuYou
+SPDX-License-Identifier: Apache-2.0
+-->
+
 <Project Sdk="Microsoft.NET.Sdk">
 
 <!--
 Configure the project build properties

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 // IsExternalInit.cs
 // This type is required to support init-only setters and record types
 // when targeting netstandard2.0 or older frameworks.
 
@@ -1,13 +1,22 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using System.Reflection;
 using GFramework.Core.Abstractions.Rule;
 using GFramework.Core.Abstractions.Systems;
 
 namespace GFramework.Core.Abstractions.Ioc;
 
 /// <summary>
-/// Dependency injection container interface defining the basic operations for service registration, resolution, and management
+/// Dependency injection container interface: the unified entry point for service registration, resolution, and lifecycle management.
 /// </summary>
+/// <remarks>
+/// Implementers must release the root <see cref="IServiceProvider" /> owned by the container, together with its
+/// associated synchronization resources, in <see cref="IDisposable.Dispose" />, and must keep disposal idempotent.
+/// Once the container is disposed, any subsequent registration, resolution, query, or scope-creation call must throw
+/// <see cref="ObjectDisposedException" /> so consumers cannot keep accessing invalid runtime state.
+/// </remarks>
-public interface IIocContainer : IContextAware
+public interface IIocContainer : IContextAware, IDisposable
 {
 #region Register Methods
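The disposal contract in the `IIocContainer` hunk above (idempotent `Dispose`, then `ObjectDisposedException` on any later use) can be sketched language-agnostically. The real interface is C#; this Python class is an illustrative assumption, with `RuntimeError` standing in for `ObjectDisposedException`.

```python
# Sketch of the container disposal contract documented above.
class DisposableContainer:
    def __init__(self):
        self._disposed = False
        self._services = {"root_provider": object()}  # stand-in for the root IServiceProvider

    def resolve(self, name):
        # Contract: any use after disposal must fail loudly.
        if self._disposed:
            raise RuntimeError("ObjectDisposedException: container is disposed")
        return self._services[name]

    def dispose(self):
        # Contract: disposal is idempotent; a second call is a no-op.
        if self._disposed:
            return
        self._services.clear()  # release owned resources
        self._disposed = True


c = DisposableContainer()
c.dispose()
c.dispose()  # safe: second call does nothing
```

The same guard pattern (a `_disposed` flag checked at every entry point) is the conventional way to satisfy both halves of the contract.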
@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Lifecycle;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Lifecycle;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Lifecycle;
 
 /// <summary>

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Lifecycle;
 
 /// <summary>
 /// Destroyable interface providing a standard destruction capability for components that need resource cleanup

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Lifecycle;
 
 /// <summary>
 /// Initializable interface providing a standard initialization capability for components that need initialization

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Lifecycle;
 
 /// <summary>
 /// Full lifecycle interface combining the initialization and destruction capabilities

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Localization;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using System.Globalization;
 using GFramework.Core.Abstractions.Systems;
 

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Localization;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Localization;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Localization;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Localization;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Localization;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Localization;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Logging;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Logging;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Logging;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Logging;
 
 /// <summary>

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Logging;
 
 /// <summary>
 /// Defines the logging interface, providing log recording and level checking

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Logging;
 
 /// <summary>
 /// Defines the logger factory interface used to create logger instances

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Logging;
 
 /// <summary>
 /// Defines the logger factory provider interface, used to create loggers with a given name and minimum log level

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Logging;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Logging;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Logging;
 
 /// <summary>

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Logging;
 
 /// <summary>
 /// Defines the log level enum used to identify log messages of different severity

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Logging;
 
 /// <summary>

@@ -1,4 +1,7 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using GFramework.Core.Abstractions.Architectures;
 using GFramework.Core.Abstractions.Lifecycle;
 using GFramework.Core.Abstractions.Rule;
 

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Pause;
 
 /// <summary>

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 using GFramework.Core.Abstractions.Utility;
 
 namespace GFramework.Core.Abstractions.Pause;

@@ -1,3 +1,6 @@
+// Copyright (c) 2025-2026 GeWuYou
+// SPDX-License-Identifier: Apache-2.0
+
 namespace GFramework.Core.Abstractions.Pause;
 
 /// <summary>
Some files were not shown because too many files have changed in this diff.