service.llm.zhipuai module
- class service.llm.zhipuai.ZHIPUAI(api_key, cache_collection=None, **kwargs)[source]
 Bases: BASE_LLM_CACHE
- call_model(query, use_cache)[source]
 Calls the ZhipuAI model with the given query.
- Parameters:
 query (dict) – The query to send to the ZhipuAI model.
use_cache (bool) – Whether to use cached responses if available.
- Returns:
 The response from the model, either from cache or a new request.
- Return type:
 str
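call_model follows a cache-first pattern: look the query up in the cache collection, and only hit the ZhipuAI API on a miss. A minimal sketch of that pattern, with a dict standing in for the MongoDB cache collection and a placeholder in place of the real API call (all names here are illustrative, not the module's actual internals):

```python
import hashlib
import json


class CachedLLM:
    """Sketch of a cache-first call_model."""

    def __init__(self):
        # Stand-in for cache_collection; the real class uses a MongoDB collection.
        self.cache = {}

    def _cache_key(self, query):
        # Deterministic key for a dict query.
        return hashlib.sha256(json.dumps(query, sort_keys=True).encode()).hexdigest()

    def _request_model(self, query):
        # Placeholder for the actual ZhipuAI API request.
        return f"response to {query['prompt']}"

    def call_model(self, query, use_cache):
        key = self._cache_key(query)
        if use_cache and key in self.cache:
            return self.cache[key]  # cache hit: no API call
        response = self._request_model(query)
        self.cache[key] = response  # store for later hits
        return response


llm = CachedLLM()
first = llm.call_model({"prompt": "hello"}, use_cache=True)
second = llm.call_model({"prompt": "hello"}, use_cache=True)  # served from cache
```

Passing use_cache=False forces a fresh request but still refreshes the stored entry.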
- class service.llm.zhipuai.ZHIPUAI_SERVICE[source]
 Bases: object
A service class for managing ZhipuAI LLM requests through a queue mechanism using MongoDB and RabbitMQ.
- classmethod check_llm_job_done(job_id)[source]
 Checks if a job has been completed.
- Parameters:
 job_id (str) – The ID of the job to check.
- Returns:
 Whether the job has been completed.
- Return type:
 bool
- collection = Collection(Database(MongoClient(host=['localhost:27017'], document_class=dict, tz_aware=False, connect=True), 'llm'), 'zhipu')
 
- classmethod get_llm_job_response(job_id)[source]
 Gets the response of a completed job.
- Parameters:
 job_id (str) – The ID of the job whose response to retrieve.
- Returns:
 The response of the job.
- Return type:
 str
- classmethod launch_worker()[source]
 Launches a worker to process jobs from the RabbitMQ queue. The worker interacts with the LLM and stores the response back in MongoDB.
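The worker loop pulls job IDs off the queue, runs the LLM call, and writes the result back to the job record. A rough sketch of one iteration of such a loop, with queue.Queue and a dict standing in for RabbitMQ and MongoDB (the helper names are illustrative):

```python
import queue

jobs = {}                  # stand-in for the MongoDB job collection
job_queue = queue.Queue()  # stand-in for the RabbitMQ 'llm-zhipu' queue


def submit(job_id, query):
    # Record the job, then enqueue its ID for a worker to pick up.
    jobs[job_id] = {"query": query, "done": False, "response": None}
    job_queue.put(job_id)


def worker_step():
    """One iteration of the worker: pull a job, call the LLM, store the result."""
    job_id = job_queue.get()
    job = jobs[job_id]
    job["response"] = f"echo: {job['query']['prompt']}"  # placeholder LLM call
    job["done"] = True
    job_queue.task_done()


submit("job-1", {"prompt": "ping"})
worker_step()
```

In the real service the response and done flag would be persisted to MongoDB so that check_llm_job_done and get_llm_job_response can observe them from other processes.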
- queue_name = 'llm-zhipu'
 
- classmethod trigger(query, caller_service, use_cache=False)[source]
 Creates and triggers a new job for an LLM request.
- Parameters:
 query (dict) – The query to send to the LLM.
caller_service (str) – The service initiating the request.
use_cache (bool) – Whether to use cached responses if available.
- Returns:
 The job ID of the triggered request.
- Return type:
 str
- service.llm.zhipuai.now(tz=None)
 Returns a new datetime object representing the current time local to tz.
- tz
 Timezone object.
If no tz is specified, uses local timezone.
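The signature and docstring match datetime.datetime.now, which suggests now is simply that function re-exported into this module. Its behavior, using the standard library directly:

```python
from datetime import datetime, timezone

local = datetime.now()            # naive datetime in local time
utc = datetime.now(timezone.utc)  # timezone-aware datetime in UTC
```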