Generate strongly-typed settings classes from appsettings.json at compile time.
| Key | Value |
|---|---|
| Language | C# / .NET 8+ |
| Distribution | NuGet (dotnet add package AppSettingsGen) |
| License | MIT |
| Category | Source Generator / C# |
You are building AppSettingsGen from scratch as an autonomous agent. Complete all 6 phases below in order. Do NOT ask for human input — make reasonable decisions and keep moving. Read SPEC.md in this directory for the full project specification before starting.
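For orientation, here is a sketch of the generator's intended input and output. The JSON content and the generated class shape are illustrative assumptions — the authoritative shape is whatever SPEC.md defines.

```csharp
// appsettings.json (hypothetical input):
// { "Smtp": { "Host": "mail.example.com", "Port": 587 } }

// Sketch of what AppSettingsGen might emit at compile time for that section:
public sealed class SmtpSettings
{
    public string Host { get; init; } = "";
    public int Port { get; init; }
}
```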
## Phase 1: Setup

Create the project from scratch with proper structure.
```sh
dotnet new sln -n AppSettingsGen
dotnet new classlib -n AppSettingsGen -o src/AppSettingsGen
dotnet new xunit -n AppSettingsGen.Tests -o tests/AppSettingsGen.Tests
dotnet sln add src/AppSettingsGen tests/AppSettingsGen.Tests
dotnet add src/AppSettingsGen package Microsoft.CodeAnalysis.CSharp --version 4.*
```
- Initialize a git repository
- Create the standard project structure for C# / .NET 8+
- Install all dependencies listed in the spec
- Create a CLAUDE.md with project conventions (coding style, commit format, test patterns)
- Make an initial commit: "chore: initialize AppSettingsGen project"
## Phase 2: Planning

Read SPEC.md thoroughly and create PLAN.md:
- Break down ALL core features into ordered implementation tasks
- Each task should be small enough to implement and test in one step
- Identify the critical path (what must be built first)
- Note any architectural decisions and their rationale
- List all CLI commands / API surfaces to implement
- Commit: "docs: add implementation plan"
## Phase 3: Implementation

Follow PLAN.md step by step:
- Implement the core architecture first (interfaces, base classes, the generator pipeline)
- Then implement each feature one at a time
- Write clean, idiomatic C# / .NET 8+ code
- Follow the conventions in CLAUDE.md
- Commit after each logical unit of work with descriptive messages
- If a design decision isn't specified in the spec, choose the simplest working approach
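As a starting point for the core architecture, a Roslyn incremental generator typically picks up appsettings.json as an AdditionalFile from the consuming project. The sketch below is a hedged skeleton under that assumption — the type and method names other than the Roslyn APIs are hypothetical, and the JSON-to-class emitter is left to the real implementation:

```csharp
using System.IO;
using Microsoft.CodeAnalysis;

[Generator]
public sealed class AppSettingsGenerator : IIncrementalGenerator
{
    public void Initialize(IncrementalGeneratorInitializationContext context)
    {
        // Filter AdditionalFiles down to appsettings.json.
        var settingsFiles = context.AdditionalTextsProvider
            .Where(static file => Path.GetFileName(file.Path)
                .Equals("appsettings.json", System.StringComparison.OrdinalIgnoreCase));

        // Emit one generated source file per matching settings file.
        context.RegisterSourceOutput(settingsFiles, static (spc, file) =>
        {
            var json = file.GetText(spc.CancellationToken)?.ToString();
            if (json is null) return;

            spc.AddSource("AppSettings.g.cs", GenerateClasses(json));
        });
    }

    // The real emitter parses the JSON and produces one class per section;
    // this stub only marks where that logic goes.
    private static string GenerateClasses(string json) =>
        "// generated settings classes go here";
}
```

Keeping the pipeline in `Initialize` pure and pushing all string emission into a separate method makes the emitter easy to unit-test without a Roslyn host.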
## Phase 4: Testing

Write comprehensive tests and ensure everything passes:
```sh
dotnet test
dotnet build --no-incremental
```
- Write unit tests for all core logic
- Write integration tests for CLI commands / public API
- Test edge cases and error paths
- Achieve >80% code coverage on core modules
- Fix any failing tests before proceeding
- Commit: "test: add comprehensive test suite"
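Generator integration tests can drive the generator in-memory with Roslyn's `CSharpGeneratorDriver`. A hedged sketch, assuming the `AppSettingsGenerator` type from the implementation phase; `InMemoryAdditionalText` is a small test helper defined here, not part of Roslyn:

```csharp
using System.Collections.Immutable;
using System.Linq;
using System.Threading;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.Text;
using Xunit;

// Test double that feeds JSON to the generator as an AdditionalFile.
public sealed class InMemoryAdditionalText : AdditionalText
{
    private readonly SourceText _text;
    public InMemoryAdditionalText(string path, string content)
    {
        Path = path;
        _text = SourceText.From(content);
    }
    public override string Path { get; }
    public override SourceText GetText(CancellationToken cancellationToken = default) => _text;
}

public class GeneratorTests
{
    [Fact]
    public void Generates_class_for_top_level_section()
    {
        var compilation = CSharpCompilation.Create("TestAssembly");
        var json = new InMemoryAdditionalText(
            "appsettings.json", "{ \"Smtp\": { \"Host\": \"localhost\" } }");

        var driver = CSharpGeneratorDriver
            .Create(new AppSettingsGenerator())
            .AddAdditionalTexts(ImmutableArray.Create<AdditionalText>(json));

        driver.RunGeneratorsAndUpdateCompilation(
            compilation, out var updated, out var diagnostics);

        Assert.Empty(diagnostics);
        Assert.Contains(updated.SyntaxTrees, t => t.ToString().Contains("Smtp"));
    }
}
```

Running the driver against an empty compilation keeps tests fast and avoids any file-system or MSBuild dependency.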
## Phase 5: Polish

Polish the code and fix any issues:
```sh
dotnet build --no-incremental -warnaserror
dotnet format
```
- Run linter and fix all issues
- Run formatter
- Review all public APIs for consistency and usability
- Ensure all CLI commands work exactly as documented in the spec
- Verify error messages are clear and actionable
- Remove any dead code or TODOs
- Commit: "refactor: polish and lint cleanup"
## Phase 6: Release

Finalize the project for release:
- Write a comprehensive README.md with:
  - Project description and motivation
  - Installation instructions
  - Quick-start usage examples
  - Full CLI/API reference
  - Contributing guide section
  - License
- Create a .pre-commit-hooks.yaml if the spec mentions pre-commit integration
- Create a GitHub Actions CI workflow (.github/workflows/ci.yml)
- Verify ALL success criteria from SPEC.md are met
- Run the full test suite one final time
- Final commit: "docs: add README and CI configuration"
- No placeholders: Every function must have a real implementation, not a stub or TODO
- No over-engineering: Build exactly what the spec says, nothing more
- Test everything: If it's a core feature, it has a test
- Commit often: Small, logical commits with descriptive messages
- Stay focused: If you hit a blocker, simplify the approach rather than adding complexity