Context pruning strategies — CCA-F Exam Prep

Two customer support agents. Same product. Same context window. Wildly different results.
Agent A keeps every turn verbatim. By turn 30, its context is 90K tokens. Slow, expensive, and the model starts contradicting itself: it can't find the signal buried in 30 turns of noise.
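One obvious fix is a sliding-window pruner: keep the system prompt plus only the most recent turns that fit a token budget. Here's a minimal sketch; the 4-characters-per-token estimate, the message format, and the `prune_context` helper are illustrative assumptions, not a real API.

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token (assumption, not a real tokenizer).
    return max(1, len(text) // 4)

def prune_context(messages: list[dict], budget: int) -> list[dict]:
    """Keep the first (system) message, then the newest turns that fit the budget."""
    system, turns = messages[0], messages[1:]
    kept: list[dict] = []
    used = estimate_tokens(system["content"])
    for msg in reversed(turns):  # walk newest -> oldest
        cost = estimate_tokens(msg["content"])
        if used + cost > budget:
            break  # older turns get dropped entirely
        kept.append(msg)
        used += cost
    return [system] + list(reversed(kept))  # restore chronological order

# Simulate Agent A's 30-turn history, then prune it to a budget.
history = [{"role": "system", "content": "You are a support agent."}]
for i in range(30):
    history.append({"role": "user", "content": f"Turn {i}: " + "details " * 50})

pruned = prune_context(history, budget=500)
print(len(history), "->", len(pruned), "messages")
```

A hard sliding window is the bluntest pruning strategy; it trades recall of old turns for speed and cost, which is exactly the tension this section explores.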