API Documentation
This section provides detailed documentation for all APIs exposed by the system.
Service
MAIC’s implementation is organized into services. Each service contains the implementation for deploying a service module or its submodules asynchronously, as producers and consumers.
llm (Large Language Model)
Caching Wrapper
- class service.llm.base.BASE_LLM_CACHE[source]
 Bases: object
A base class for implementing caching of queries and responses for a Language Model (LLM). This class is intended to be inherited and implemented by a subclass.
- check_cache(query)[source]
 Checks if a cached response exists for the provided query.
- Parameters:
 query – The query to check in the cache.
- Returns:
 The cached response if available, otherwise a constant (NO_CACHE_YET). Also updates the internal usage cost for cached responses.
- clear_usage()[source]
 Clears the stored usage statistics.
- print_usage()[source]
 Prints the cumulative token usage statistics.
- write_cache(query, response, usage)[source]
 Writes the provided query, response, and usage information to the cache.
- Parameters:
 query – The query to be cached.
response – The response to be cached.
usage – The token usage information.
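The class is intended to be completed by a subclass that supplies the storage backend. Below is a minimal sketch assuming an in-memory dict; the concrete wrappers further down (OPENAI, ZHIPUAI) instead accept a cache_collection, which suggests a MongoDB-backed cache. NO_CACHE_YET's import location and the base constructor's requirements are assumptions.

    # Hypothetical dict-backed subclass of the caching wrapper (sketch only).
    from service.llm.base import BASE_LLM_CACHE, NO_CACHE_YET  # import path of NO_CACHE_YET is assumed

    class DictCache(BASE_LLM_CACHE):
        """Stores responses in a process-local dict, keyed by the stringified query."""

        _store = {}

        def check_cache(self, query):
            # Return the cached response on a hit; the documented base class also
            # updates the internal usage cost here, which this sketch omits.
            return self._store.get(str(query), NO_CACHE_YET)

        def write_cache(self, query, response, usage):
            # A persistent backend would also store `usage`; this sketch ignores it.
            self._store[str(query)] = response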
Mock API
OpenAI API
- class service.llm.openai.OPENAI(api_key, cache_collection=None, **kwargs)[source]
 Bases: BASE_LLM_CACHE
A class to interact with OpenAI’s language model while using a cache mechanism to optimize requests. Inherits from BASE_LLM_CACHE.
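Construction might look as follows; a sketch assuming the cache is a pymongo collection (the database and collection names below are illustrative, echoing the 'llm' database used by OPENAI_SERVICE.collection):

    # Illustrative construction of the cached OpenAI wrapper; names and key are placeholders.
    from pymongo import MongoClient
    from service.llm.openai import OPENAI

    cache_collection = MongoClient("localhost", 27017)["llm"]["openai_cache"]
    llm = OPENAI(api_key="YOUR_OPENAI_API_KEY", cache_collection=cache_collection)

    llm.print_usage()   # inherited from BASE_LLM_CACHE: cumulative token usage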
- class service.llm.openai.OPENAI_SERVICE[source]
 Bases: object
A service class for managing OPENAI LLM requests through a queue mechanism using MongoDB and RabbitMQ.
- collection = Collection(Database(MongoClient(host=['localhost:27017'], document_class=dict, tz_aware=False, connect=True), 'llm'), 'openai')
 
- static get_response(job_id)[source]
 Retrieves the response of a job with the given ID.
- Parameters:
 job_id (ObjectId) – The ID of the job to retrieve the response for.
- Returns:
 The response of the job, or None if the job is not found or has not completed.
- Return type:
 str
- static get_response_sync(job_id, timeout=300)[source]
 Retrieves the response of a job with the given ID synchronously.
- Parameters:
 job_id (ObjectId) – The ID of the job to retrieve the response for.
timeout (int, optional) – The maximum time to wait for the job to complete.
- Returns:
 The response of the job, or None if the job is not found or has not completed within the timeout.
- Return type:
 str
- static launch_worker()[source]
 Launches a worker to process jobs from the RabbitMQ queue. The worker interacts with the LLM and stores the response back in MongoDB.
- logger = <Logger service.llm.openai (INFO)>
 
- queue_name = 'llm-openai'
 
- static trigger(parent_service, parent_job_id=None, use_cache=False, **query)[source]
 Creates and triggers a new job for an LLM request.
- Parameters:
 parent_service (str) – The service initiating the request.
use_cache (bool, optional) – Whether to use cached responses if available.
**query – The query to send to the LLM.
- Returns:
 The job ID of the triggered request.
- Return type:
 str
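Together, trigger(), launch_worker(), and get_response_sync() form the producer/consumer pattern described at the top of this section. The sketch below shows the producer side, assuming an OpenAI-style chat payload for **query and a worker already consuming the 'llm-openai' queue in another process; the query field names are assumptions.

    # Illustrative producer-side usage of the queued OpenAI service.
    from bson import ObjectId
    from service.llm.openai import OPENAI_SERVICE

    job_id = OPENAI_SERVICE.trigger(
        "preclass",                 # parent_service: the service initiating the request
        use_cache=True,
        model="gpt-4o",             # query kwargs are assumed to follow an OpenAI-style payload
        messages=[{"role": "user", "content": "Summarize slide 3 in one sentence."}],
    )

    # get_response_sync is documented as taking an ObjectId, while trigger returns a
    # str job ID, so the conversion here is a guess at the intended glue.
    response = OPENAI_SERVICE.get_response_sync(ObjectId(job_id), timeout=60)
    print(response)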
- service.llm.openai.now(tz=None)
 Returns new datetime object representing current time local to tz.
- tz
 Timezone object.
If no tz is specified, uses local timezone.
ZhipuAI API
- class service.llm.zhipuai.ZHIPUAI(api_key, cache_collection=None, **kwargs)[source]
 Bases: BASE_LLM_CACHE
- call_model(query, use_cache)[source]
 Calls the ZhipuAI model with the given query.
- Parameters:
 query (dict) – The query to send to the ZhipuAI model.
use_cache (bool) – Whether to use cached responses if available.
- Returns:
 The response from the model, either from cache or a new request.
- Return type:
 str
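call_model() can also be used directly, without the queued service. A minimal sketch, assuming a chat-style query dict and an illustrative model name:

    # Illustrative direct call to the cached ZhipuAI wrapper; the query dict's
    # structure and the model name are assumptions.
    from service.llm.zhipuai import ZHIPUAI

    llm = ZHIPUAI(api_key="YOUR_ZHIPUAI_API_KEY")
    query = {
        "model": "glm-4",
        "messages": [{"role": "user", "content": "Summarize this slide in one sentence."}],
    }
    print(llm.call_model(query, use_cache=True))   # cached or freshly generated response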
- class service.llm.zhipuai.ZHIPUAI_SERVICE[source]
 Bases: object
A service class for managing ZhipuAI LLM requests through a queue mechanism using MongoDB and RabbitMQ.
- classmethod check_llm_job_done(job_id)[source]
 Checks if a job has been completed.
- Parameters:
 job_id (str) – The ID of the job to check.
- Returns:
 Whether the job has been completed.
- Return type:
 bool
- collection = Collection(Database(MongoClient(host=['localhost:27017'], document_class=dict, tz_aware=False, connect=True), 'llm'), 'zhipu')
 
- classmethod get_llm_job_response(job_id)[source]
 Gets the response of a completed job.
- Parameters:
 job_id (str) – The ID of the job whose response to retrieve.
- Returns:
 The response of the job.
- Return type:
 str
- classmethod launch_worker()[source]
 Launches a worker to process jobs from the RabbitMQ queue. The worker interacts with the LLM and stores the response back in MongoDB.
- queue_name = 'llm-zhipu'
 
- classmethod trigger(query, caller_service, use_cache=False)[source]
 Creates and triggers a new job for an LLM request.
- Parameters:
 query (dict) – The query to send to the LLM.
caller_service (str) – The service initiating the request.
use_cache (bool) – Whether to use cached responses if available.
- Returns:
 The job ID of the triggered request.
- Return type:
 str
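Unlike OPENAI_SERVICE, this service exposes an explicit completion check rather than a synchronous getter, so callers poll. A sketch under the same assumptions as above (chat-style query, worker running separately):

    # Illustrative polling loop against the queued ZhipuAI service. Assumes a worker
    # was started elsewhere via ZHIPUAI_SERVICE.launch_worker().
    import time
    from service.llm.zhipuai import ZHIPUAI_SERVICE

    query = {"messages": [{"role": "user", "content": "List three key terms from this lecture."}]}
    job_id = ZHIPUAI_SERVICE.trigger(query, caller_service="preclass", use_cache=True)

    while not ZHIPUAI_SERVICE.check_llm_job_done(job_id):
        time.sleep(1)               # the worker writes completion status to MongoDB

    print(ZHIPUAI_SERVICE.get_llm_job_response(job_id))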
- service.llm.zhipuai.now(tz=None)
 Returns new datetime object representing current time local to tz.
- tz
 Timezone object.
If no tz is specified, uses local timezone.
PreClass
PreClass: Main Service
- class service.preclass.main.PRECLASS_MAIN[source]
 Bases: object
Main service class for handling the pre-class lecture processing pipeline.
This class manages a multi-stage workflow for processing lecture materials, including converting presentations to different formats and generating various educational materials.
- class STAGE[source]
 Bases: object
Enumeration of processing stages for the pre-class pipeline.
- FINISHED = 100
 
- GEN_ASKQUESTION = 8
 
- GEN_DESCRIPTION = 4
 
- GEN_READSCRIPT = 7
 
- GEN_SHOWFILE = 6
 
- GEN_STRUCTURE = 5
 
- PDF2PNG = 2
 
- PPT2TEXT = 3
 
- PPTX2PDF = 1
 
- PUSH_AGENDA = 99
 
- START = 0
 
- static get_status(job_id)[source]
 
- static launch_worker()[source]
 Launches the worker process for handling pre-class processing jobs.
Establishes a connection to RabbitMQ and begins consuming messages from the queue. Each message triggers the appropriate stage of processing based on the job’s current stage. The worker can be terminated using CTRL+C.
- Raises:
 KeyboardInterrupt – When the worker is manually stopped
- static trigger(parent_service, source_file)[source]
 Initiates a new pre-class processing job.
- Parameters:
 parent_service (str) – Identifier of the parent service initiating this job
source_file (str) – Path to the source presentation file to be processed
- Returns:
 The ID of the created job
- Return type:
 str
Notes
Creates a new job in MongoDB and pushes it to RabbitMQ queue for processing. Moves the source file to a buffer location for processing.
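A sketch of driving the pipeline end to end: trigger a job, then poll get_status() while a worker advances it through the STAGE values. The file path is illustrative, and get_status()'s return shape is not documented above, so it is simply printed here.

    # Illustrative only: submit a presentation to the pre-class pipeline and watch
    # its stage advance. A worker must be running, e.g. by calling
    # PRECLASS_MAIN.launch_worker() in a separate process.
    import time
    from service.preclass.main import PRECLASS_MAIN

    job_id = PRECLASS_MAIN.trigger("api", "uploads/intro_to_ml.pptx")   # illustrative path

    for _ in range(60):                       # poll for up to ~10 minutes
        status = PRECLASS_MAIN.get_status(job_id)
        print(status)                         # undocumented shape; expected to reach STAGE.FINISHED (100)
        time.sleep(10)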
- service.preclass.main.now(tz=None)
 Returns new datetime object representing current time local to tz.
- tz
 Timezone object.
If no tz is specified, uses local timezone.
PreClass: Data Structures
- class service.preclass.model.AgendaStruct(title, children=None, function=None)[source]
 Bases: object
A tree structure representing an agenda or outline with hierarchical nodes.
- title
 The title/heading of this agenda item
- Type:
 str
- children
 List of child AgendaStruct or PPTPageStruct objects
- Type:
 list
- dfs_recursive_call(function)[source]
 Recursively call the given function on each node in the agenda structure.
- Parameters:
 function – The function to be called on each node
- flatten()[source]
 Return a flattened list of all nodes in the agenda structure.
- Returns:
 List of all nodes in the agenda structure
- Return type:
 list
- formalize()[source]
 Return a string representation of the title/heading of this agenda item
- classmethod from_dict(dict)[source]
 Create an agenda structure from a dictionary representation.
- Parameters:
 dict – Dictionary representation of the agenda structure
- Returns:
 The agenda structure object
- Return type:
 AgendaStruct
- get_structure_trace()[source]
 Generate a formatted string representation of the structure’s hierarchy.
- Returns:
 Indented string showing the structure trace
- Return type:
 str
- insert_page(page, trace)[source]
 Insert a page into the agenda structure following the given trace path.
- Parameters:
 page – The page object to insert
trace (list) – List of section titles forming path to insertion point
- Returns:
 True if page was inserted successfully, False otherwise
- Return type:
 bool
- serialize()[source]
 
- to_dict()[source]
 Return a dictionary representation of the agenda structure.
- Returns:
 Dictionary representation of the agenda structure
- Return type:
 dict
- type = 'node'
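A sketch of how the pieces above fit together: build a small agenda tree, insert a slide along a title trace, and round-trip it through to_dict()/from_dict(). The slide payload and titles are illustrative, and whether the trace should include the root title is an assumption.

    # Illustrative only: assembling and querying a small agenda tree.
    from service.preclass.model import AgendaStruct, PPTPageStruct, ReadScript

    root = AgendaStruct("Lecture 1: Introduction", children=[
        AgendaStruct("Background"),
        AgendaStruct("Core Concepts"),
    ])

    page = PPTPageStruct(
        page={"index": 2},                                   # raw slide payload (illustrative)
        stringified="Slide 2: A brief history of the field",
        function=ReadScript("Let's begin with a brief history."),
    )
    ok = root.insert_page(page, ["Lecture 1: Introduction", "Background"])  # trace of section titles

    print(ok)                                 # True if the trace matched an insertion point
    print(root.get_structure_trace())         # indented outline of the hierarchy
    restored = AgendaStruct.from_dict(root.to_dict())
    print(len(restored.flatten()))            # all nodes of the tree, flattened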
 
- class service.preclass.model.AskQuestion(question, question_type, selects, answer, reference)[source]
 Bases: FunctionBase
Function to present a question to users.
- Parameters:
 question (str) – The question text
question_type – Type/category of question
selects – Available answer options
answer – Correct answer
reference – Reference material (commented out)
- class service.preclass.model.FunctionBase(call, label, value)[source]
 Bases: object
Base class for defining interactive functions/actions within the agenda.
- call
 Function identifier/name
- Type:
 str
- label
 Display label for the function
- Type:
 str
- value
 Parameters/values for the function
- Type:
 dict
- classmethod from_dict(dict)[source]
 
- to_dict()[source]
 
- class service.preclass.model.PPTPageStruct(page, stringified, function=None)[source]
 Bases: object
Represents a PowerPoint page/slide in the agenda structure.
- content
 The actual page/slide content
- stringified
 String representation of the page
- Type:
 str
- dfs_recursive_call(function)[source]
 
- flatten(condition=None)[source]
 
- formalize()[source]
 
- classmethod from_dict(dict)[source]
 
- get_structure_trace()[source]
 
- serialize()[source]
 
- to_dict()[source]
 
- type = 'ppt'
 
- class service.preclass.model.ReadScript(script)[source]
 Bases: FunctionBase
Function to read a script/text.
- Parameters:
 script (str) – The script content to be read
- class service.preclass.model.ShowFile(file_id)[source]
 Bases: FunctionBase
Function to display a file.
- Parameters:
 file_id – Identifier for the file to be shown
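The three FunctionBase subclasses map directly onto the constructors above. A minimal sketch with illustrative argument values (the question_type label is an assumption, and reference is passed as None since it is noted as commented out):

    # Illustrative only: the interactive functions that can be attached to agenda nodes.
    from service.preclass.model import AskQuestion, ReadScript, ShowFile

    show = ShowFile(file_id="6650f2a1c0ffee0012345678")       # illustrative file id
    read = ReadScript("Today we will cover supervised learning.")
    quiz = AskQuestion(
        question="Which of these is a supervised learning task?",
        question_type="single_choice",                         # assumed label
        selects=["Clustering", "Classification", "Dimensionality reduction"],
        answer="Classification",
        reference=None,                                        # documented as commented out
    )

    # Each object carries the FunctionBase fields (call, label, value) and can be
    # serialized via to_dict() for storage alongside the agenda structure.
    print(quiz.to_dict())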
PreClass: Sub-Services
- service.preclass.processors.pptx2pdf module
 - service.preclass.processors.pdf2png module
 - service.preclass.processors.ppt2text module
 - service.preclass.processors.gen_description module
 - service.preclass.processors.gen_structure module
 SERVICE
 Structurelizor (input_scripts, root_title, prompt, call_generation(), extract(), find_trace(), format_assistant(), format_prompt(), format_user(), parse_tab(), script2string())
 now()
 - service.preclass.processors.gen_showfile module
 - service.preclass.processors.gen_readscript module
 - service.preclass.processors.gen_askquestion module
 - service.preclass.processors.qa_utils module
 
InClass
InClass: Main Service
InClass: Controller
- service.inclass.functions.base_class module
 - service.inclass.functions.showFile module
 - service.inclass.functions.readScript module
 - service.inclass.functions.askQuestion module
AskQuestion (async_call_director(), format_history(), format_history_for_agent(), format_question(), format_question_obj(), get_agent_id2name_dict(), get_agent_name_by_id(), get_agent_role(), init_status, is_agent_ta(), is_agent_teacher(), step())
AskQuestionAgentNameConst
AskQuestionStatus (CHECKING_DID_STUDENT_ANSWER, CONTINUE_FINISHED, DIRECTOR_NOT_FOUND_FINISHED, FINISHED, FORCE_CALL_TEACHER_BEFORE_RETURN, NEED_CALL_DIRECTOR, NEED_TRANSFORM_SENTENCE, NOT_ASKED, TEACHER_RESPONSE_FINISHED, WAITING_DIRECTOR_RETURN, WAITING_STUDENT_ANSWER, WAITING_USER_INPUT, WAIT_FINISH)
 - service.inclass.functions.enums module
 
InClass: SessionController
- service.inclass.classroom_session module
AgentType
ClassroomSession (add_user_message(), check_llm_job_done(), clear_session(), copy_current_session(), disable_user_input(), enable_user_input(), force_user_input(), get_action_history(), get_activation(), get_agent_by_id(), get_agent_list(), get_all_agenda_of_section_by_lecture(), get_current_function(), get_history(), get_llm_job_response(), get_teacher_agent_id(), get_user_input_status(), is_streaming(), load_next_agenda(), push_llm_job_to_list(), reset_displayed_file(), send_answer_to_message(), send_function_to_message(), send_markdown_message(), send_question_to_message(), send_script_to_message(), send_streamed_model_request(), session_info(), set_step_id(), to_next_function(), update_function_status())