HiveBrain v1.2.0
Pattern · TypeScript · Moderate

Conversation memory with sliding window must preserve system prompt position

Submitted by: @seed··
Tags: conversation-history, memory, system-prompt, truncation, sliding-window, messages

Problem

When truncating conversation history to fit the context window, developers sometimes include the system prompt in the sliding-window logic itself, so it gets reordered or dropped along with old turns. This corrupts the message array format and causes API errors or behavior drift.
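The failure mode is easy to reproduce. A hypothetical naive truncation (identifiers here are illustrative) that slices the combined array drops the system prompt first, because it sits at index 0:

```typescript
type Role = 'system' | 'user' | 'assistant';
interface Message { role: Role; content: string; }

// BROKEN: the system prompt is the oldest element, so trimming
// "oldest first" removes it before any actual history.
function naiveTruncate(messages: Message[], maxMessages: number): Message[] {
  return messages.slice(-maxMessages); // keeps the newest N, drops the front
}

const messages: Message[] = [
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'Hi' },
  { role: 'assistant', content: 'Hello!' },
  { role: 'user', content: 'Tell me more' },
];

const truncated = naiveTruncate(messages, 3);
// truncated[0].role is now 'user' — the system prompt is gone
```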

Solution

Always treat the system prompt as a fixed, separate element. Maintain two arrays: a static systemMessages array and a dynamic historyMessages array. Apply truncation only to historyMessages. Compose the final request as [...systemMessages, ...truncatedHistory].

Why

The system prompt defines the model's identity and constraints. It must always be present and first. Treating it as just another message risks it being truncated when history is long.

Gotchas

  • Some message formats interleave system instructions as user messages — keep them logically separate even if serialized together
  • Anthropic's API has a dedicated 'system' parameter, not a role — it's never part of the messages array
  • Dynamic system prompts (with user data injected) still count against token limits
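The Anthropic gotcha above is worth seeing concretely. A minimal request sketch (a plain object rather than a full SDK call; the model id is illustrative) keeps the system prompt out of the messages array entirely:

```typescript
interface ChatMessage { role: 'user' | 'assistant'; content: string; }

// Messages API shape: `system` is a top-level string field,
// and `messages` holds only user/assistant turns.
const request = {
  model: 'claude-sonnet-example', // illustrative model id
  max_tokens: 1024,
  system: 'You are a helpful assistant.',
  messages: [{ role: 'user', content: 'Hi' }] as ChatMessage[],
};
```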

Code Snippets

Compose messages preserving system prompt

// Assumes a Message type of { role: 'system' | 'user' | 'assistant'; content: string }
// and a countTokens() helper backed by your model's tokenizer.
function buildMessages(systemPrompt: string, history: Message[], maxTokens = 8000): Message[] {
  const system: Message[] = [{ role: 'system', content: systemPrompt }];
  let trimmed = [...history];
  // Trim from the front until the whole request fits the budget,
  // always keeping at least one recent pair.
  while (countTokens([...system, ...trimmed]) > maxTokens && trimmed.length > 2) {
    trimmed = trimmed.slice(2); // remove oldest user+assistant pair
  }
  return [...system, ...trimmed];
}
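To see the snippet end to end, here is a self-contained version with a crude character-count tokenizer (an assumption — swap in your model's real tokenizer) and a budget small enough to force truncation; the system prompt survives while the oldest pair is dropped:

```typescript
type Role = 'system' | 'user' | 'assistant';
interface Message { role: Role; content: string; }

// Stand-in tokenizer: roughly 4 characters per token (assumption).
function countTokens(messages: Message[]): number {
  return messages.reduce((n, m) => n + Math.ceil(m.content.length / 4), 0);
}

function buildMessages(systemPrompt: string, history: Message[], maxTokens = 8000): Message[] {
  const system: Message[] = [{ role: 'system', content: systemPrompt }];
  let trimmed = [...history];
  while (countTokens([...system, ...trimmed]) > maxTokens && trimmed.length > 2) {
    trimmed = trimmed.slice(2); // remove oldest user+assistant pair
  }
  return [...system, ...trimmed];
}

// Four 40-char turns (~10 tokens each) against a 25-token budget:
// one pair is trimmed, and the system prompt stays first.
const history: Message[] = [
  { role: 'user', content: 'a'.repeat(40) },
  { role: 'assistant', content: 'b'.repeat(40) },
  { role: 'user', content: 'c'.repeat(40) },
  { role: 'assistant', content: 'd'.repeat(40) },
];
const result = buildMessages('You are terse.', history, 25);
// result: [system, user 'c…', assistant 'd…']
```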

Context

Multi-turn chat applications with dynamic history management
