LogEventInfo - Changed LayoutCache from dictionary to linked list #6025
Conversation
Walkthrough

Replaces LogEventInfo's internal layout cache from a locked Dictionary to a lock-free singly linked list via a private `LayoutCacheLinkedNode` and `Interlocked.CompareExchange`; `AddCachedLayoutValue` prepends nodes, `TryGetCachedLayoutValue` traverses the list. Public API unchanged.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    participant Caller
    participant LogEventInfo
    participant Cache as LayoutCache (linked list)
    Caller->>LogEventInfo: TryGetCachedLayoutValue(layout)
    LogEventInfo->>Cache: Traverse head -> next -> ...
    alt Match found
        Cache-->>LogEventInfo: Return cached value
        LogEventInfo-->>Caller: Value
    else Not found
        LogEventInfo-->>Caller: Miss
    end
```

```mermaid
sequenceDiagram
    autonumber
    participant Caller
    participant LogEventInfo
    participant Interlocked
    participant Cache as LayoutCache (linked list)
    Caller->>LogEventInfo: AddCachedLayoutValue(layout, value)
    LogEventInfo->>Cache: Read current head
    loop Retry (CAS) or early-exit if already present
        LogEventInfo->>Interlocked: CompareExchange(ref head, newNode{Next=oldHead}, oldHead)
        alt CAS success
            Interlocked-->>LogEventInfo: Swapped (new head)
            LogEventInfo-->>Caller: Done
        else CAS failed
            Interlocked-->>LogEventInfo: No swap (reload head)
        end
    end
```
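The two flows above can be sketched together in code. This is a minimal Java analogue of the C# approach described in the walkthrough, with `AtomicReference.compareAndSet` standing in for `Interlocked.CompareExchange`; all names are illustrative, not NLog's actual API.

```java
import java.util.concurrent.atomic.AtomicReference;

// Minimal analogue of the cache: CAS-prepend on insert, linear scan on lookup.
public class LayoutCacheSketch {
    static final class Node {
        final Object key;   // readonly key, compared by reference (like Layout)
        final Object value; // readonly cached render result
        Node next;          // mutable link, assigned before the CAS publishes the node
        Node(Object key, Object value) { this.key = key; this.value = value; }
    }

    private final AtomicReference<Node> head = new AtomicReference<>();

    // Prepend a new node; retry while another thread wins the CAS race.
    void addCachedValue(Object key, Object value) {
        Node newNode = new Node(key, value);
        Node oldHead;
        do {
            oldHead = head.get();
            newNode.next = oldHead;
        } while (!head.compareAndSet(oldHead, newNode));
    }

    // Traverse head -> next -> ... and return the value on a reference match.
    Object tryGetCachedValue(Object key) {
        for (Node n = head.get(); n != null; n = n.next) {
            if (n.key == key)
                return n.value;
        }
        return null; // miss
    }

    public static void main(String[] args) {
        LayoutCacheSketch cache = new LayoutCacheSketch();
        Object layoutA = new Object(), layoutB = new Object();
        cache.addCachedValue(layoutA, "rendered-A");
        cache.addCachedValue(layoutB, "rendered-B");
        System.out.println(cache.tryGetCachedValue(layoutA)); // rendered-A
        System.out.println(cache.tryGetCachedValue(new Object())); // null (miss)
    }
}
```

Note the design choice the diagrams imply: because a node's `next` is set before the CAS publishes it, readers always see a consistent list, and readers never block writers.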
Actionable comments posted: 0
🧹 Nitpick comments (2)
src/NLog/LogEventInfo.cs (2)
72-72: Strengthen memory visibility on lock-free cache reads (optional)

To guarantee readers always see fully published nodes across CPUs, consider volatile semantics for the head and volatile reads during traversal.
Option A (simplest): make the head volatile.

```diff
- private LayoutCacheLinkedNode? _layoutCache;
+ private volatile LayoutCacheLinkedNode? _layoutCache;
```

Option B: use Volatile.Read on head and next in TryGet.

```diff
- var nodeHead = _layoutCache;
+ var nodeHead = Volatile.Read(ref _layoutCache);
  while (nodeHead != null)
  {
      if (ReferenceEquals(nodeHead.Key, layout))
      {
          value = nodeHead.Value;
          return true;
      }
-     nodeHead = nodeHead.Next;
+     nodeHead = Volatile.Read(ref nodeHead.Next);
  }
```

Also applies to: 637-646
814-825: Invalidate layout cache when message/parameters change (avoid stale values)

If Message/Parameters/FormatProvider mutate after precalc, cached layout outputs can become stale. Consider clearing the cache in ResetFormattedMessage.

```diff
 private void ResetFormattedMessage(bool rebuildMessageTemplateParameters)
 {
     if (_messageFormatter is null || _messageFormatter.Target is ILogMessageFormatter)
     {
         _formattedMessage = null;
+        // Cached layout values may depend on message/parameters/format-provider
+        Volatile.Write(ref _layoutCache, null);
     }
```

Please confirm this matches the intended lifecycle (i.e., LogEventInfo is not mutated after layouts start precalculating). If mutation never happens post-precalc, this can be skipped.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
src/NLog/LogEventInfo.cs (3 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
src/NLog/LogEventInfo.cs (1)
src/NLog/Layouts/SimpleLayout.cs (1)
Layout (57-487)
🔇 Additional comments (2)
src/NLog/LogEventInfo.cs (2)
625-633: CAS prepend loop looks correct

Lock-free insertion is correct; a writer retries only when another writer wins the CAS. No correctness issues spotted.
841-852: Node type is minimal and appropriate

Immutable key/value and a single Next pointer fit the prepend pattern. LGTM.
Pull Request Overview
Replaces Dictionary-based layout caching with a linked list implementation to reduce memory allocations for small cache sizes. The change aims to optimize performance when caching 1-4 layout outputs by eliminating the upfront allocation overhead of Dictionary objects.
- Replaced `IDictionary<Layout, object?>` with custom `LayoutCacheLinkedNode` linked list
- Implemented lock-free concurrent operations using `Interlocked.CompareExchange`
- Added new `LayoutCacheLinkedNode` nested class for the linked list structure
Force-pushed from 4b09018 to 85cb4a9
Actionable comments posted: 0
♻️ Duplicate comments (1)
src/NLog/LogEventInfo.cs (1)
625-636: Critical race condition allows duplicate cache entries.

The duplicate check on line 630 reads `_layoutCache` directly via `TryGetCachedLayoutValue`, but another thread can insert the same layout between that check and the CAS on line 634. This race will create duplicate entries in the linked list, defeating the memory optimization goal and degrading lookup performance.

Additionally, creating the new node (line 625) before checking for existence wastes an allocation when the layout is already cached.
Consider this approach to prevent duplicates:
```diff
 internal void AddCachedLayoutValue(Layout layout, object? value)
 {
-    var newNode = new LayoutCacheLinkedNode(layout, value);
-    var nodeHead = _layoutCache;
+    if (TryGetCachedLayoutValue(layout, out _))
+        return;
+
+    var newNode = new LayoutCacheLinkedNode(layout, value);
+    var nodeHead = _layoutCache;
     do
     {
-        if (nodeHead != null && TryGetCachedLayoutValue(layout, out _))
-            return;
-
+        // Check if layout exists in the current snapshot
+        var current = nodeHead;
+        while (current != null)
+        {
+            if (ReferenceEquals(current.Key, layout))
+                return;
+            current = current.Next;
+        }
+
         newNode.Next = nodeHead;
         nodeHead = Interlocked.CompareExchange(ref _layoutCache, newNode, nodeHead);
     } while (!ReferenceEquals(nodeHead, newNode.Next));
 }
```

This checks the snapshot (`nodeHead`) rather than re-reading `_layoutCache`, reducing (though not eliminating) the race window. Note that duplicates may still occur under high concurrency, but this significantly reduces the likelihood.

Based on past review comments.
🧹 Nitpick comments (1)
src/NLog/LogEventInfo.cs (1)
641-654: Consider using volatile read for better memory visibility.

The plain read of `_layoutCache` (line 641) might not immediately see updates from other threads on architectures with weaker memory models. While .NET's memory model is relatively strong and this is unlikely to cause issues in practice, using `Volatile.Read` would provide stronger guarantees.

Apply this diff for explicit memory ordering:

```diff
 internal bool TryGetCachedLayoutValue(Layout layout, out object? value)
 {
-    var nodeHead = _layoutCache;
+    var nodeHead = Volatile.Read(ref _layoutCache);
     while (nodeHead != null)
     {
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
src/NLog/LogEventInfo.cs (3 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
src/NLog/LogEventInfo.cs (1)
src/NLog/Layouts/SimpleLayout.cs (1)
Layout (57-487)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: task-list-completed
🔇 Additional comments (2)
src/NLog/LogEventInfo.cs (2)
636-636: Loop termination logic is correct.

The condition `!ReferenceEquals(nodeHead, newNode.Next)` correctly implements the CAS retry loop:

- On success: `CompareExchange` returns the old value (equals `newNode.Next`), so the condition is false and the loop exits.
- On failure: `CompareExchange` returns the current value (differs from `newNode.Next`), so the condition is true and the loop retries.

The past review comment claiming this causes an infinite loop is incorrect.

Based on past review comments.
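The return-value contract that this termination condition relies on can be demonstrated with a small sketch. This uses Java's `AtomicReference.compareAndExchange` (Java 9+), which, like `Interlocked.CompareExchange`, returns the value observed in the field rather than a boolean; names are illustrative.

```java
import java.util.concurrent.atomic.AtomicReference;

// Demonstrates the CAS return-value contract the loop condition depends on:
// compareAndExchange returns the witness value, which equals the expected
// value exactly when the swap succeeded.
public class CasReturnDemo {
    public static void main(String[] args) {
        AtomicReference<String> head = new AtomicReference<>("old");

        // Success: the field holds the expected value "old", so the swap
        // happens and the witness equals the expected value.
        String witness = head.compareAndExchange("old", "new");
        System.out.println(witness.equals("old")); // true  -> loop would exit
        System.out.println(head.get());            // new

        // Failure: the field now holds "new", not "old", so no swap occurs
        // and the witness (the current value) differs from the expected one.
        witness = head.compareAndExchange("old", "other");
        System.out.println(witness.equals("old")); // false -> loop would retry
        System.out.println(head.get());            // new (unchanged)
    }
}
```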
845-856: Well-designed node class for the linked list cache.

The `LayoutCacheLinkedNode` class is appropriately designed:

- Sealed to prevent inheritance overhead
- Readonly `Key` and `Value` prevent accidental mutation
- Mutable `Next` enables lock-free list construction
- Minimal memory footprint
Force-pushed from 3d0e014 to 6350737
The Dictionary needs to allocate 3 objects upfront, and additionally allocates 2 larger arrays when inserting more than 3 items.
The linked list reduces allocations when only needing to cache 1-4 layout-outputs. The linked list should still have fast lookup with fewer than 10 layout-outputs.
After removing the implicit caching of layout-output on Layout.Render(), so that caching only happens on precalculate, the linked list should not grow aggressively with many duplicate entries.
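The cost model behind this argument can be illustrated with a rough sketch: a prepend-only list costs exactly one small node allocation per cached entry, and lookup cost grows with the entry's distance from the head (newest entries are cheapest to find). The Java code below is illustrative only, not NLog's implementation.

```java
// Sketch of the cost model: one node allocation per cached entry, and lookup
// cost proportional to how far the entry sits from the head (newest first).
public class LookupCostSketch {
    static final class Node {
        final Object key; final Object value; final Node next;
        Node(Object key, Object value, Node next) {
            this.key = key; this.value = value; this.next = next;
        }
    }

    // Count how many nodes are touched before the key is found by reference.
    static int stepsToFind(Node head, Object key) {
        int steps = 0;
        for (Node n = head; n != null; n = n.next) {
            steps++;
            if (n.key == key) return steps;
        }
        return steps; // full traversal on a miss
    }

    public static void main(String[] args) {
        Object[] layouts = { new Object(), new Object(), new Object(), new Object() };
        Node head = null;
        for (Object layout : layouts)
            head = new Node(layout, "rendered", head); // prepend: 1 allocation each

        System.out.println(stepsToFind(head, layouts[3])); // 1 (most recently added)
        System.out.println(stepsToFind(head, layouts[0])); // 4 (oldest entry)
    }
}
```

With fewer than 10 entries a full scan touches at most a handful of nodes, which is why the list stays competitive with a hash lookup at these sizes while avoiding the map's upfront allocations.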