fix: properly configure CrewAI LLM with MiniMax api_base
- Use CrewAI's LLM class directly with the api_base parameter instead of a custom subclass
- Remove broken MiniMaxLLM inheritance from LLM
- Update agent creation to use the LLM(model, api_key, api_base) pattern

The issue was that inheriting from CrewAI's LLM class caused api_base to be set to None. Now we use CrewAI's LLM directly with the correct parameters.

Fixes #43
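The failure mode described above can be sketched without CrewAI installed. The `BaseLLM` class below is a stand-in for `crewai.LLM` (an assumption for illustration, not CrewAI's real code): if the parent's `__init__` is what stores `api_base` and the subclass never forwards it, the attribute silently ends up `None`, which is why constructing the base class directly with all parameters fixes the bug.

```python
# Minimal sketch of the bug this commit fixes. BaseLLM is a hypothetical
# stand-in for crewai.LLM; only the parameter-forwarding behavior matters.
class BaseLLM:
    def __init__(self, model=None, api_key=None, api_base=None, **kwargs):
        self.model = model
        self.api_key = api_key
        self.api_base = api_base  # stays None unless explicitly passed


class BrokenMiniMaxLLM(BaseLLM):  # the pre-fix subclass pattern
    def __init__(self, api_key, model="MiniMax-M2.7", **kwargs):
        # api_base is never forwarded to the parent, so it becomes None
        super().__init__(**kwargs)
        self.api_key = api_key
        self.model = model


# The fix: construct the base class directly with every parameter.
broken = BrokenMiniMaxLLM(api_key="sk-example")
fixed = BaseLLM(
    model="MiniMax-M2.7",
    api_key="sk-example",
    api_base="https://api.minimax.io/v1",
)

print(broken.api_base)  # None
print(fixed.api_base)   # https://api.minimax.io/v1
```

The same reasoning applies to the real library: passing `api_base` straight into CrewAI's `LLM` constructor keeps the value, while a subclass that calls `super().__init__(**kwargs)` without it does not.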
@@ -1,11 +1,9 @@
 from typing import Optional, List, Dict, Any
 import httpx
-from crewai import LLM


-class MiniMaxLLM(LLM):
+class MiniMaxLLM:
     def __init__(self, api_key: str, model: str = "MiniMax-M2.7", **kwargs):
-        super().__init__(**kwargs)
         self.api_key = api_key
         self.model = model
         self.base_url = "https://api.minimax.io/v1"
||||