New article: "Using OpenAI's Moderation API in an ABP Application with the AI Management Module"#24853
Conversation
Images automagically compressed by Calibre's image-actions ✨ Compression reduced images by 71%, saving 281.5 KB. 1 image did not require optimisation.
Pull request overview
This PR adds a new community article documenting how to integrate OpenAI's `omni-moderation-latest` Moderation API into an ABP application using the AI Management Module, with a CMS Kit Comments example.
Changes:
- Added a new community article markdown post with step-by-step setup and code samples.
- Added accompanying images used by the article.
Reviewed changes
Copilot reviewed 1 out of 6 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| docs/en/Community-Articles/2026-02-04-Omni-Moderation-in-AI-Management-Module/post.md | New article content and code snippets for moderation + AI Management + CMS Kit integration. |
| docs/en/Community-Articles/2026-02-04-Omni-Moderation-in-AI-Management-Module/images/example-comment.png | Screenshot used in the article. |
| docs/en/Community-Articles/2026-02-04-Omni-Moderation-in-AI-Management-Module/images/ai-management-widget.png | Image asset used in the article. |
> The AI Management Module addresses all these concerns by providing:
>
> - **Dynamic Workspace Management**: Create, configure, and update AI workspaces directly from a user-friendly administrative interface—no code changes required.
> - **Provider Flexibility**: Seamlessly switch between different AI providers (OpenAI, Gemini, Antrophic, Azure OpenAI, Ollama, and custom providers) without modifying your application code.
**Copilot comment:** Typo in provider list: "Antrophic" should be "Anthropic" (the AI company name).

Suggested change:

> - **Provider Flexibility**: Seamlessly switch between different AI providers (OpenAI, Gemini, Anthropic, Azure OpenAI, Ollama, and custom providers) without modifying your application code.
    // This allows runtime configuration changes without redeployment
    var config = await _workspaceConfigurationStore.GetOrNullAsync<OpenAIAssistantWorkspace>();

    if(config == null)
    {
        throw new UserFriendlyException("Could not find the 'OpenAIAssistant' workspace!");
**Copilot comment:** The `IWorkspaceConfigurationStore` usage here doesn't match the AI Management module docs in this repo. `docs/en/modules/ai-management/index.md` shows retrieving a workspace configuration by name via `_workspaceConfigurationStore.GetAsync("MyWorkspace")`, but this article uses `GetOrNullAsync<OpenAIAssistantWorkspace>()` (and never defines `OpenAIAssistantWorkspace`). This is likely to confuse readers and/or produce non-compiling sample code. Update the snippet to use the documented API and explicitly show which configuration type/properties are expected.

Suggested change:

    // This allows runtime configuration changes without redeployment.
    // The "OpenAIAssistant" workspace is expected to be configured with an ApiKey
    // (via the AI Management UI) that will be used when calling OpenAI.
    var config = await _workspaceConfigurationStore.GetAsync("OpenAIAssistant");
    if (config == null)
    {
        throw new UserFriendlyException("Could not find the 'OpenAIAssistant' workspace configuration!");
    }

    var client = new ModerationClient(
        model: "omni-moderation-latest",
**Copilot comment:** This code hardcodes the model name (`"omni-moderation-latest"`) even though earlier sections emphasize that the AI Management Module allows runtime model selection without code changes. To align the sample with the article's goal, read the model name from the workspace configuration instead of hardcoding it here.

Suggested change:

    model: config.Model,
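Applying both suggestions together, the corrected article snippet might look like the sketch below. This is not the article's final code: `config.Model` and `config.ApiKey` are assumed properties of the configuration object returned by `GetAsync`, and should be verified against the AI Management module docs before merging.

```csharp
// Resolve the workspace configuration by name at runtime; the "OpenAIAssistant"
// workspace is created and configured through the AI Management UI.
var config = await _workspaceConfigurationStore.GetAsync("OpenAIAssistant");
if (config == null)
{
    throw new UserFriendlyException("Could not find the 'OpenAIAssistant' workspace configuration!");
}

// Read both the model name and the API key from the workspace configuration,
// so neither requires a code change or redeployment to update.
var client = new ModerationClient(
    model: config.Model,   // assumed property; e.g. "omni-moderation-latest"
    apiKey: config.ApiKey  // assumed property, set via the AI Management UI
);
```

This keeps the sample aligned with the article's premise that provider and model selection happen in the AI Management UI rather than in code.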
Description
Resolves https://github.com/volosoft/vs-internal/issues/8110
Checklist