
vLLM Offline Inference #260

@fancy771962586

Description


While attempting offline inference with vLLM v0.10.0, the following call fails:

texts = [
    processor.apply_chat_template(
        msg, tokenize=False, add_generation_prompt=True
    )
    for msg in batch_data
]

with the error:

apply_chat_template failed: can only concatenate str (not "list") to str
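This error typically means the chat template concatenates `message["content"]` assuming it is a string, while multimodal messages store `content` as a list of typed parts. A minimal sketch of the mismatch and one possible workaround, assuming OpenAI-style message dicts (the `flatten_content` helper is hypothetical, not part of vLLM):

```python
# Assumption: batch_data holds messages whose "content" is a list of
# {"type": ...} parts. A chat template doing `prompt + content` then
# raises: TypeError: can only concatenate str (not "list") to str.

def flatten_content(messages):
    """Return a copy of messages with list-valued content collapsed to a
    plain string: text parts are joined, image parts are dropped (image
    data would be passed to the model separately)."""
    flat = []
    for msg in messages:
        content = msg["content"]
        if isinstance(content, list):
            content = "".join(
                part.get("text", "")
                for part in content
                if part.get("type") == "text"
            )
        flat.append({**msg, "content": content})
    return flat

msgs = [{
    "role": "user",
    "content": [
        {"type": "image"},
        {"type": "text", "text": "Describe this image."},
    ],
}]
print(flatten_content(msgs))
# → [{'role': 'user', 'content': 'Describe this image.'}]
```

Whether flattening is acceptable depends on how the model's template expects image placeholders to appear; for this model the manual-prompt route below was used instead.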

Switching to the officially recommended vLLM v0.9.1 did not help, since it only documents online inference via the API server. In the end the issue was worked around on vLLM v0.10.0 by constructing the prompt manually:
prompt = f"<|user|><|img|><|imgpad|><|endofimg|>{text_part}<|endofuser|><|assistant|>"
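The manual construction can be wrapped in a small helper; a sketch, assuming these special tokens (`<|user|>`, `<|img|>`, `<|imgpad|>`, `<|endofimg|>`, `<|endofuser|>`, `<|assistant|>`) match the target model's tokenizer (verify against the model's own chat template):

```python
# Hypothetical helper mirroring the manual prompt from this issue.
# The special tokens are assumptions taken from the report above.

def build_prompt(text_part: str) -> str:
    """Build a single-image chat prompt with the user's text."""
    return (
        "<|user|><|img|><|imgpad|><|endofimg|>"
        f"{text_part}"
        "<|endofuser|><|assistant|>"
    )

print(build_prompt("Describe this image."))
# → <|user|><|img|><|imgpad|><|endofimg|>Describe this image.<|endofuser|><|assistant|>
```

The resulting string can then be passed to vLLM's offline `LLM.generate` as a plain prompt, with the image supplied through the multimodal input path, bypassing `apply_chat_template` entirely.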
