
Add monitoring section to user guides#272

Merged
nrichers merged 22 commits into main from nrichers/sc-5833/documentation-provide-guidance-on-how-to on Aug 16, 2024

Conversation

@nrichers (Collaborator) commented Aug 14, 2024

Internal Notes for Reviewers

This PR adds initial user guide information for ongoing monitoring, including:

  • New "Monitoring" section on the "Guides" landing page
  • New "Ongoing monitoring" core docs
  • New "Ongoing monitoring" tab under test descriptions
  • Pull in new notebook for ongoing monitoring
  • New glossary section for monitoring
  • Other docs where monitoring should be mentioned, e.g. under "Install and initialize the developer framework"

Output

New "Monitoring" section on the "Guides" landing page


New "Ongoing monitoring" core docs


New "Ongoing monitoring" tab under "Test descriptions"


Pull in new notebook for ongoing monitoring


New glossary section for monitoring


Other docs where monitoring should be mentioned


External Release Notes

Monitoring is a critical component of model risk management, as emphasized in regulations such as SR 11-7, SS1/23, and E-23. With this release of ValidMind, we officially support ongoing monitoring, which you can enable for both existing and new models.


Scenarios where ongoing monitoring is warranted:

  • Pre-approval monitoring of new models
  • Monitoring during significant updates
  • Post-production monitoring

As your code runs, the monitoring template for your model automatically populates with data, providing a comprehensive view of your model’s performance over time. You access and examine these results within the ValidMind Platform UI, allowing you to identify any deviations from expected performance and take corrective actions as needed.

Learn more ...

@nrichers nrichers added the internal Not to be externalized in the release notes label Aug 14, 2024
@validbeck (Collaborator) left a comment


Nice! I particularly love the descriptions of what monitoring is and why it's important.

Some thoughts:

Monitoring scenarios

  • Shall we just make it consistent with our love of m-dashes here instead of colons? (Esp. since the rest of the page has them already!)
  • I would maybe break the paragraphs up a little, as that much dense text (both visually and information-wise) is hard to parse in a large chunk.

(Pushed a commit for these suggestions, as well as some other minor wording/formatting tweaks.)

Enable monitoring for models & review monitoring results

Since this main page is already so text-heavy / sets up the context, I think it serves better as a landing page... by the time we get to the bottom it's very long.

I propose two new articles under Ongoing monitoring (you can then also put them in "What's next"):

  • Enable monitoring for models
  • Review (model) monitoring results

@nrichers (Collaborator, Author)

@validbeck thank you for reviewing and pushing some changes! Much appreciated.

I propose two new articles under Ongoing monitoring

Agreed, it makes sense to split this topic now. I was originally trying to create a placeholder to iterate over in the future, but there's too much content already. Splitting will address this issue.

... page is already so text-heavy ...

Related, I had some Mermaid diagrams in there at some point but removed them to save some time. I might see if I can include these to reduce the feeling of text heaviness a bit.

@juanmleng (Contributor)

Looks great :) A couple of minor suggestions:

  • Use Ongoing Monitoring in the glossary instead of Monitoring
  • To align ourselves a bit with SR 11-7, replace effectiveness with robustness

For example in here:

[screenshot]

I suggest:

Ongoing Monitoring
To regularly evaluate the ongoing accuracy, robustness, and stability of a model after it has been deployed.

Ongoing Monitoring
Monitoring of model performance is a critical component of model risk management, as emphasized in SR 11-7. This process involves regularly assessing a model’s predictive accuracy, robustness, and stability to ensure that it continues to perform as expected and remains fit for its intended purpose after deployment.

Same here:

[screenshot]

I suggest the same text or similar:
Monitoring of model performance is a critical component of model risk management, as emphasized in SR 11-7. This process involves regularly assessing a model’s predictive accuracy, robustness, and stability to ensure that it continues to perform as expected and remains fit for its intended purpose after deployment.

@cachafla (Contributor)

(Quoted @juanmleng's suggestions above.)

We might not want to talk only about SR 11-7, since other regulations such as SS 1/23 and E-23 are also relevant and applicable to our customers. Maybe there's a way to make these statements more generic, or to say something like "as emphasized in regulations such as X, Y, Z"?

@juanmleng (Contributor)

(Quoted the suggestions and @cachafla's question above.)

I agree with Andres' comments. I'm happy to go with either a generic approach or include references to specific MRM frameworks. Since all the regulations emphasize monitoring, mentioning SS1/23, SR 11-7, TRIM, and E-23 should cover the key guidelines.

@nrichers (Collaborator, Author)

Updates to address review feedback

@validbeck if these updates pass muster with you, could you approve this PR, please?

Updated wording

Now refers to "ongoing monitoring" in the glossary, uses "robustness" instead of "effectiveness", and includes references to sample regulations:


Separate task topics

Reworked overview topic with some sample output from testing and child topics included below:


Separate task topic for enabling monitoring:


Separate task topic for reviewing monitoring results:


@github-actions (Contributor)

PR Summary

This pull request introduces significant enhancements to the documentation by adding comprehensive sections on monitoring capabilities. The key changes include:

  1. New Monitoring Section: A new section titled 'Monitoring' has been added to the documentation, which includes detailed guides on enabling and reviewing ongoing monitoring for models.

    • Enable Monitoring: Instructions on how to add monitoring=True to the code snippet and select a monitoring template.
    • Review Monitoring Results: Steps to access and review the monitoring results within the ValidMind Platform UI.
  2. Glossary Updates: New terms related to monitoring have been added to the glossary, including definitions for backtesting, model drift, model performance, ongoing monitoring, recalibrating models, and reporting and governance.

  3. Sample Monitoring Results: Images and examples of monitoring results have been included to provide a visual understanding of the monitoring process.

  4. Test Descriptions: The test descriptions section has been updated to include ongoing monitoring tests, such as Feature Drift, Prediction Across Each Feature, Prediction Correlation, and Target Prediction Distribution Plot.

  5. Code Samples: The code samples have been updated to include examples of how to enable and use monitoring in the ValidMind platform.

  6. Miscellaneous Updates: Minor updates to existing documentation to ensure consistency and clarity, including changes to the glossary and various sections to reflect the new monitoring capabilities.
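Per item 1 above, the "Enable Monitoring" guide comes down to passing a `monitoring=True` flag when initializing the ValidMind Library. A minimal sketch of that step follows; only the `monitoring=True` flag is confirmed by this PR, so the `api_host`, `api_key`, `api_secret`, and `model` parameter names and their values are illustrative placeholders, not confirmed API details:

```python
# Sketch of the "Enable Monitoring" step from the new guide.
# Only monitoring=True is documented in this PR; everything else
# below is an illustrative placeholder.
init_kwargs = {
    "api_host": "<VALIDMIND_API_HOST>",  # placeholder endpoint
    "api_key": "<API_KEY>",              # placeholder credential
    "api_secret": "<API_SECRET>",        # placeholder credential
    "model": "<MODEL_IDENTIFIER>",       # the model to monitor
    "monitoring": True,                  # enables ongoing monitoring
}

# With the ValidMind Library installed, these would be passed along
# the lines of:
#   import validmind as vm
#   vm.init(**init_kwargs)
print(init_kwargs["monitoring"])
```

Once initialized this way, results from subsequent test runs populate the model's monitoring template, which is what the "Review Monitoring Results" guide then walks through in the ValidMind Platform UI.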

Test Suggestions

  • Verify that the new 'Monitoring' section is accessible from the main documentation page.
  • Test the code snippets provided in the 'Enable Monitoring' section to ensure they work as expected.
  • Check the links in the 'Review Monitoring Results' section to ensure they navigate to the correct pages.
  • Validate the glossary entries for new monitoring terms to ensure they are accurate and comprehensive.
  • Run the updated code samples to confirm that monitoring can be enabled and results can be reviewed as described.
  • Ensure that the images and examples of monitoring results are correctly displayed and provide clear insights.

@nrichers (Collaborator, Author)

@validbeck I updated this PR with the new style for the "Prerequisites" section, FYI.


@nrichers added the documentation (Improvements or additions to documentation) and highlight (Feature to be curated in the release notes) labels, and removed the internal (Not to be externalized in the release notes) label Aug 16, 2024
@nrichers nrichers merged commit ef213ef into main Aug 16, 2024
@nrichers nrichers deleted the nrichers/sc-5833/documentation-provide-guidance-on-how-to branch August 16, 2024 16:51