motleycrew.tools.llm_tool
Functions:
- motleycrew.tools.llm_tool.create_llm_langchain_tool

Classes:
- motleycrew.tools.llm_tool.LLMTool – A tool that uses a language model to generate output based on a prompt.
- class motleycrew.tools.llm_tool.LLMTool(name: str, description: str, prompt: str | BasePromptTemplate, llm: BaseLanguageModel | None = None, input_schema: Type[BaseModel] | None = None, return_direct: bool = False, exceptions_to_reflect: List[Exception] | None = None)
Bases: MotleyTool
A tool that uses a language model to generate output based on a prompt.
- __init__(name: str, description: str, prompt: str | BasePromptTemplate, llm: BaseLanguageModel | None = None, input_schema: Type[BaseModel] | None = None, return_direct: bool = False, exceptions_to_reflect: List[Exception] | None = None)
- Parameters:
name – Name of the tool.
description – Description of the tool.
prompt – Prompt to use for the tool. Can be a string or a PromptTemplate object.
llm – Language model to use for the tool.
input_schema – Input schema for the tool. Its fields should match the input variables in the prompt. If not provided, a schema with string fields is generated from the prompt's input variables, if any.
return_direct – Whether to return the tool's output directly instead of passing it back to the agent.
exceptions_to_reflect – Exceptions to reflect back to the agent instead of raising.
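The schema-generation fallback described above (one string field per prompt variable when no input_schema is given) can be sketched with the standard library. The function below is illustrative only, not motleycrew's actual implementation:

```python
from string import Formatter


def prompt_input_variables(prompt: str) -> list[str]:
    """Extract the named placeholders from a prompt template string.

    Sketches the documented fallback: when no input_schema is provided,
    a schema with one string field per prompt variable is generated.
    """
    return [
        field_name
        for _, field_name, _, _ in Formatter().parse(prompt)
        if field_name  # skip literal-only segments with no placeholder
    ]


prompt = "Summarize the following {text} in {style} style."
print(prompt_input_variables(prompt))  # → ['text', 'style']
```

Each extracted name would then become a string-typed field in the generated input schema.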
- motleycrew.tools.llm_tool.create_llm_langchain_tool(name: str, description: str, prompt: str | BasePromptTemplate, llm: BaseLanguageModel | None = None, input_schema: Type[BaseModel] | None = None)
Create a LangChain tool that uses a language model to generate output based on a prompt.