I'm setting a summary model override in the openclaw.json config:
"lossless-claw": {
  "enabled": true,
  "config": {
    "freshTailCount": 32,
    "contextThreshold": 0.75,
    "incrementalMaxDepth": -1,
    "summaryProvider": "cloudflare-ai-gateway",
    "summaryModel": "cloudflare-ai-gateway/claude-haiku-4-5"
  }
}
Anthropic applies a default 5-minute prompt cache to requests that don't specify a cache retention. It does not seem that LCM would benefit from a prompt cache, so I want to disable it to reduce costs. However, it looks like LCM does not pass the parameters defined in the agent:models config, so all of its API calls fall back to the default short cache.
Please add the capability to override the model parameters for the summary model, or pass the parameters specified under agent:models through to the API when making requests.
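For illustration, something like the following would cover my use case. This is only a hypothetical shape: the `summaryModelParams` key is a name I made up here, and the idea is that whatever object it holds would be forwarded verbatim to the provider API (so setting a caching-related field to `null`/off would disable the default cache):

```json
"lossless-claw": {
  "enabled": true,
  "config": {
    "summaryProvider": "cloudflare-ai-gateway",
    "summaryModel": "cloudflare-ai-gateway/claude-haiku-4-5",
    "summaryModelParams": {
      "cache_control": null
    }
  }
}
```

Alternatively, simply reusing the params already declared under agent:models for the matching model would avoid introducing a new key at all.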
Thanks!