lmflow.utils.conversation_template.hymba#

Attributes#

HYMBA_TEMPLATE

Classes#

HymbaConversationTemplate

Module Contents#

class lmflow.utils.conversation_template.hymba.HymbaConversationTemplate[source]#

Bases: lmflow.utils.conversation_template.base.ConversationTemplateForTool

encode_conversation(tokenizer: transformers.PreTrainedTokenizer, messages: List[Dict[str, str]], system: str | None = None, tools: List[str] | None = None, **kwargs) → Sequence[Tuple[List[int], List[int]]][source]#

Messages here should be guaranteed to be in pairs, with the first message of each pair being the user message and the second being the assistant message. Data example:

```json
{
    "conversation_id": 2,
    "system": "sysinfo1",
    "tools": ["tool_1_desc"],
    "messages": [
        {
            "role": "user",
            "content": "hi"
        },
        {
            "role": "assistant",
            "content": "Hello!"
        }
    ]
}
```

lmflow.utils.conversation_template.hymba.HYMBA_TEMPLATE[source]#
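
A minimal usage sketch of the template above. It uses the pre-configured HYMBA_TEMPLATE instance rather than constructing HymbaConversationTemplate directly; the tokenizer checkpoint name is an assumption and not part of this module's documentation.

```python
# Hedged usage sketch; the checkpoint "nvidia/Hymba-1.5B-Instruct" is an
# assumption -- substitute whichever Hymba tokenizer you are actually using.
from transformers import AutoTokenizer

from lmflow.utils.conversation_template.hymba import HYMBA_TEMPLATE

tokenizer = AutoTokenizer.from_pretrained(
    "nvidia/Hymba-1.5B-Instruct", trust_remote_code=True
)

# Same structure as the data example in the docstring above.
messages = [
    {"role": "user", "content": "hi"},
    {"role": "assistant", "content": "Hello!"},
]

encoded_pairs = HYMBA_TEMPLATE.encode_conversation(
    tokenizer=tokenizer,
    messages=messages,
    system="sysinfo1",
    tools=["tool_1_desc"],
)

# Per the Sequence[Tuple[List[int], List[int]]] return type, each element is a
# pair of token-id lists, one pair per user/assistant round.
for user_ids, assistant_ids in encoded_pairs:
    print(len(user_ids), len(assistant_ids))
```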