
fix: use LLM for unique video extension prompts; remove tokens on 400 email-domain-rejected #375

Open
crowwdev wants to merge 5 commits into chenyme:main from crowwdev:fix/video-llm-prompts-banned-token-removal

Conversation


@crowwdev crowwdev commented Mar 21, 2026

PR Title

fix: use LLM for unique video extension prompts; remove tokens on 400 email-domain-rejected


Summary

Two independent fixes:

  1. Video extension scene repetition — Each extension round now gets a unique LLM-generated continuation prompt instead of reusing the original prompt, preventing repeated scenes in long videos.

  2. Auto-remove banned tokens — When a token refresh returns 400 with email-domain-rejected in the response body, the token is immediately removed from the pool.


Changes

  • Bug fix
  • Feature
  • Refactor
  • Documentation
  • Test

app/services/grok/services/video.py

  • Added _generate_continuation_prompt() using curl_cffi.requests.AsyncSession (no new dependencies)
  • Reads app.api_key from config and sets Authorization header if configured
  • Sets "stream": False explicitly in the LLM request
  • Uses grok-3-fast model with temperature=0.8, max_tokens=60
  • Falls back to original prompt silently if LLM call fails
  • _stream_chain() and _collect_chain() call LLM for each extension round (plan.is_extension == True)

app/services/grok/services/video_extend.py

  • Imports _generate_continuation_prompt from video.py

app/services/token/manager.py

  • In record_fail(): when status_code == 400 and reason contains email-domain-rejected, token is immediately removed from pool
  • In _refresh_one(): same check during cooling token refresh — 400 + email-domain-rejected removes token from pool instantly

Verification

  • A 30-second video (5 extension rounds) generates 5 distinct prompts, visible in the logs
  • Tokens hitting email-domain-rejected disappear from the pool immediately after the next refresh cycle
  • Container starts without errors (no new pip dependencies)
  • Fallback to original prompt works when LLM is unavailable

@crowwdev crowwdev changed the title fix: use LLM for unique video extension prompts; auto-remove email-do… fix: use LLM for unique video extension prompts; remove tokens on 400 email-domain-rejected Mar 21, 2026
jiangmuran added a commit to jiangmuran/grok2api_pro that referenced this pull request Mar 21, 2026
…s and auto-remove email-domain-rejected tokens

- Video extension: Each extension round now gets a unique LLM-generated continuation prompt instead of reusing the original prompt, preventing repeated scenes in long videos
- Token management: When a token refresh returns 400 with 'email-domain-rejected', the token is immediately removed from the pool
- Modified files: video.py, video_extend.py, manager.py

Upstream PR: chenyme#375
piexian added a commit to piexian/grok2api that referenced this pull request Mar 22, 2026
Upstream sources and merge approach:

- chenyme#366: manually ported the usage estimation and responses usage mapping, re-wiring the chat/responses endpoints against the current mainline rather than applying the original patch directly.

- chenyme#374: following the original approach, switched to a single app-chat integration path, removed the no-longer-existing ws fallback, and filled in request_overrides.

- chenyme#375: following the original approach, rewrote video extension and token cleanup; instead of self-calling localhost, it reuses the existing app-chat request chain directly and handles email-domain-rejected.

- chenyme#336: manually merged multi-image reference-video support, keeping compatibility with the old image_reference/image_url and enabling the new path only for multi-image or @image-N scenarios.
