Activity
[Contribution calendar heatmap, Mar–Feb]

Memberships

OpenClaw Builders • 49 members • Free
Lead Generation Secrets • 22k members • Free
The AI MBA • 1k members • Free
Value Pricing Academy • 572 members • Free
Vibe Marketing • 5.7k members • Free
Canadian Tax Enthusiasts • 1k members • Free
Million Dollar Practice • 471 members • Free
The Virtual Bookkeeping Series • 74.9k members • Free

1 contribution to OpenClaw Builders
Facing high token burn challenges
Hey folks, quick question for anyone running Clawdbot: what do you consider a healthy "always-on" context size per session, from a token + cost perspective?

I'm currently tightening things (or at least trying to, lol) to:
- ~15K tokens for active context
- Last 3–5 messages only
- Aggressive summarization + compaction beyond that
- A cheaper model for non-thinking tasks (summaries, formatting, validation)

Curious:
- Where do you cap context in practice?
- Do you rely on auto-compaction (the gateway may help compact, but the bot keeps adding context, which keeps pushing the context size up) or manual summaries?
- Any gotchas you've hit with session memory blowing up costs?

Would love to hear real-world numbers vs. theory.
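For concreteness, the trimming policy I have in mind looks roughly like this (illustrative sketch only; count_tokens and cheap_model_summarize are placeholder stand-ins, not real Clawdbot or gateway APIs):

```python
# Sketch of the context-trimming policy described above.
# `count_tokens` and `cheap_model_summarize` are hypothetical stand-ins,
# not actual Clawdbot/gateway APIs.

MAX_CONTEXT_TOKENS = 15_000   # active-context budget
KEEP_LAST_MESSAGES = 5        # recent messages kept verbatim

def count_tokens(text: str) -> int:
    # Placeholder heuristic; swap in a real tokenizer in practice.
    return len(text) // 4

def cheap_model_summarize(messages: list[str]) -> str:
    # Placeholder: would call a cheaper model to compress older history.
    return "Summary of earlier conversation: " + " / ".join(m[:40] for m in messages)

def build_context(history: list[str]) -> list[str]:
    """Keep the last few messages verbatim, summarize the rest,
    then trim until the whole thing fits the token budget."""
    recent = history[-KEEP_LAST_MESSAGES:]
    older = history[:-KEEP_LAST_MESSAGES]

    context = ([cheap_model_summarize(older)] if older else []) + recent

    # Still over budget? Drop the oldest verbatim messages, stopping once
    # only the summary (or a couple of recent messages) remains.
    while sum(count_tokens(m) for m in context) > MAX_CONTEXT_TOKENS and len(context) > 2:
        context.pop(1 if older else 0)

    return context
```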
Rohan Ahmed
Level 1 • 5 points to level up
@rohan-ahmed-5791
Product Manager looking to build a network of product managers. Originally from investment banking, now building and acquiring tech products.

Active 19h ago
Joined Jan 28, 2026
ENTP
Canada