API Documentation

This section provides detailed documentation for all exposed APIs used within our system.

Service

MAIC’s implementation is organized into services. Each service contains the implementation for deploying its module or submodules asynchronously, as producers and consumers.

llm (Large Language Model)

Caching Wrapper

class service.llm.base.BASE_LLM_CACHE[source]

Bases: object

A base class for implementing caching of queries and responses for a Language Model (LLM). This class is intended to be inherited and implemented by a subclass.

check_cache(query)[source]

Checks if a cached response exists for the provided query.

Parameters:

query – The query to check in the cache.

Returns:

The cached response if available, otherwise a constant (NO_CACHE_YET). Also updates the internal usage cost for cached responses.

clear_usage()[source]

Clears the stored usage statistics.

print_usage()[source]

Prints the cumulative token usage statistics.

write_cache(query, response, usage)[source]

Writes the provided query, response, and usage information to the cache.

Parameters:
  • query – The query to be cached.

  • response – The response to be cached.

  • usage – The token usage information.
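A subclass is expected to supply storage behind check_cache and write_cache. The following is a minimal stand-alone sketch of that interface using an in-memory dict; it does not import the real BASE_LLM_CACHE, and the NO_CACHE_YET sentinel and usage-dict fields are illustrative assumptions, not the library’s actual definitions.

```python
import json

NO_CACHE_YET = object()  # illustrative sentinel for "no cached response"

class InMemoryLLMCache:
    """Dict-backed stand-in mirroring the BASE_LLM_CACHE interface."""

    def __init__(self):
        self._store = {}
        self._usage = {"prompt_tokens": 0, "completion_tokens": 0}

    @staticmethod
    def _key(query):
        # Serialize the query deterministically so equal queries share a key.
        return json.dumps(query, sort_keys=True)

    def check_cache(self, query):
        entry = self._store.get(self._key(query))
        if entry is None:
            return NO_CACHE_YET
        response, usage = entry
        # Cached hits still count toward the internal usage statistics.
        for k, v in usage.items():
            self._usage[k] = self._usage.get(k, 0) + v
        return response

    def write_cache(self, query, response, usage):
        self._store[self._key(query)] = (response, usage)

    def clear_usage(self):
        self._usage = {"prompt_tokens": 0, "completion_tokens": 0}

    def print_usage(self):
        print(self._usage)
```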

Mock API

class service.llm.mock.MockLLMService[source]

Bases: object

classmethod check_llm_job_done(*args, **kwargs)[source]
classmethod get_llm_job_response(*args, **kwargs)[source]
classmethod trigger(*args, **kwargs)[source]

OpenAI API

class service.llm.openai.OPENAI(api_key, cache_collection=None, **kwargs)[source]

Bases: BASE_LLM_CACHE

A class to interact with OpenAI’s language model while using a cache mechanism to optimize requests. Inherits from BASE_LLM_CACHE.

class service.llm.openai.OPENAI_SERVICE[source]

Bases: object

A service class for managing OPENAI LLM requests through a queue mechanism using MongoDB and RabbitMQ.

collection = Collection(Database(MongoClient(host=['localhost:27017'], document_class=dict, tz_aware=False, connect=True), 'llm'), 'openai')
static get_response(job_id)[source]

Retrieves the response of a job with the given ID.

Parameters:

job_id (ObjectId) – The ID of the job to retrieve the response for.

Returns:

The response of the job, or None if the job is not found or has not completed.

Return type:

str

static get_response_sync(job_id, timeout=300)[source]

Retrieves the response of a job with the given ID synchronously.

Parameters:
  • job_id (ObjectId) – The ID of the job to retrieve the response for.

  • timeout (int, optional) – The maximum time to wait for the job to complete.

Returns:

The response of the job, or None if the job is not found or has not completed within the timeout.

Return type:

str

static launch_worker()[source]

Launches a worker to process jobs from the RabbitMQ queue. The worker interacts with the LLM and stores the response back in MongoDB.

logger = <Logger service.llm.openai (INFO)>
queue_name = 'llm-openai'
static trigger(parent_service, parent_job_id=None, use_cache=False, **query)[source]

Creates and triggers a new job for an LLM request.

Parameters:
  • parent_service (str) – The service initiating the request.

  • parent_job_id (ObjectId, optional) – The ID of the parent job, if any.

  • use_cache (bool, optional) – Whether to use cached responses if available.

  • **query – The query to send to the LLM.

Returns:

The job ID of the triggered request.

Return type:

str
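The trigger / get_response pair implies a poll-until-done pattern, which get_response_sync wraps. A minimal sketch of that synchronous wait, with an in-memory dict standing in for the MongoDB collection (document fields here are assumptions, not the service’s actual schema):

```python
import time

jobs = {}  # stands in for the MongoDB `llm.openai` collection

def get_response(job_id):
    """Return the stored response, or None if the job is missing or pending."""
    job = jobs.get(job_id)
    if job is None or job.get("response") is None:
        return None
    return job["response"]

def get_response_sync(job_id, timeout=300, poll_interval=0.01):
    """Poll get_response until a result appears or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        response = get_response(job_id)
        if response is not None:
            return response
        time.sleep(poll_interval)
    return None
```

In the real service the worker launched by launch_worker fills in the response; here a caller would set it directly before polling.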

service.llm.openai.now(tz=None)

Returns new datetime object representing current time local to tz.

tz

Timezone object.

If no tz is specified, uses local timezone.

ZhipuAI API

class service.llm.zhipuai.ZHIPUAI(api_key, cache_collection=None, **kwargs)[source]

Bases: BASE_LLM_CACHE

call_model(query, use_cache)[source]

Calls the ZhipuAI model with the given query.

Parameters:
  • query (dict) – The query to send to the ZhipuAI model.

  • use_cache (bool) – Whether to use cached responses if available.

Returns:

The response from the model, either from cache or a new request.

Return type:

str

class service.llm.zhipuai.ZHIPUAI_SERVICE[source]

Bases: object

A service class for managing ZhipuAI LLM requests through a queue mechanism using MongoDB and RabbitMQ.

classmethod check_llm_job_done(job_id)[source]

Checks if a job has been completed.

Parameters:

job_id (str) – The ID of the job to check.

Returns:

Whether the job has been completed.

Return type:

bool

collection = Collection(Database(MongoClient(host=['localhost:27017'], document_class=dict, tz_aware=False, connect=True), 'llm'), 'zhipu')
classmethod get_llm_job_response(job_id)[source]

Gets the response of a completed job.

Parameters:

job_id (str) – The ID of the job to check.

Returns:

The response of the job.

Return type:

str

classmethod launch_worker()[source]

Launches a worker to process jobs from the RabbitMQ queue. The worker interacts with the LLM and stores the response back in MongoDB.

queue_name = 'llm-zhipu'
classmethod trigger(query, caller_service, use_cache=False)[source]

Creates and triggers a new job for an LLM request.

Parameters:
  • query (dict) – The query to send to the LLM.

  • caller_service (str) – The service initiating the request.

  • use_cache (bool) – Whether to use cached responses if available.

Returns:

The job ID of the triggered request.

Return type:

str
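The trigger / check_llm_job_done / get_llm_job_response triple describes a simple job lifecycle. A stand-alone sketch of that lifecycle, with a dict standing in for the `llm.zhipu` MongoDB collection and a list standing in for the 'llm-zhipu' RabbitMQ queue (names and document fields are assumptions, not the service’s actual schema):

```python
import uuid

collection = {}   # stands in for the `llm.zhipu` MongoDB collection
queue = []        # stands in for the 'llm-zhipu' RabbitMQ queue

def trigger(query, caller_service, use_cache=False):
    """Create a job document and publish its ID for a worker to consume."""
    job_id = uuid.uuid4().hex
    collection[job_id] = {"query": query, "caller": caller_service,
                          "use_cache": use_cache, "done": False, "response": None}
    queue.append(job_id)
    return job_id

def check_llm_job_done(job_id):
    return collection[job_id]["done"]

def get_llm_job_response(job_id):
    return collection[job_id]["response"]

def worker_step():
    """One iteration of the worker loop: consume a job, call the model, store the result."""
    job_id = queue.pop(0)
    job = collection[job_id]
    job["response"] = f"echo: {job['query']}"  # stand-in for the ZhipuAI call
    job["done"] = True
```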

service.llm.zhipuai.now(tz=None)

Returns new datetime object representing current time local to tz.

tz

Timezone object.

If no tz is specified, uses local timezone.

PreClass

PreClass: Main Service

class service.preclass.main.PRECLASS_MAIN[source]

Bases: object

Main service class for handling the pre-class lecture processing pipeline.

This class manages a multi-stage workflow for processing lecture materials, including converting presentations to different formats and generating various educational materials.

class STAGE[source]

Bases: object

Enumeration of processing stages for the pre-class pipeline.

FINISHED = 100
GEN_ASKQUESTION = 8
GEN_DESCRIPTION = 4
GEN_READSCRIPT = 7
GEN_SHOWFILE = 6
GEN_STRUCTURE = 5
PDF2PNG = 2
PPT2TEXT = 3
PPTX2PDF = 1
PUSH_AGENDA = 99
START = 0
static get_status(job_id)[source]
static launch_worker()[source]

Launches the worker process for handling pre-class processing jobs.

Establishes a connection to RabbitMQ and begins consuming messages from the queue. Each message triggers the appropriate stage of processing based on the job’s current stage. The worker can be terminated using CTRL+C.

Raises:

KeyboardInterrupt – When the worker is manually stopped

static trigger(parent_service, source_file)[source]

Initiates a new pre-class processing job.

Parameters:
  • parent_service (str) – Identifier of the parent service initiating this job

  • source_file (str) – Path to the source presentation file to be processed

Returns:

The ID of the created job

Return type:

str

Notes

Creates a new job in MongoDB and pushes it to RabbitMQ queue for processing. Moves the source file to a buffer location for processing.
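The STAGE constants suggest a linear pipeline (PPTX2PDF through GEN_ASKQUESTION) followed by PUSH_AGENDA and FINISHED. A minimal sketch of stage advancement under that reading; the dispatch order is inferred from the constant values, not confirmed by the source:

```python
# Stage constants, as documented on PRECLASS_MAIN.STAGE.
START, PPTX2PDF, PDF2PNG, PPT2TEXT = 0, 1, 2, 3
GEN_DESCRIPTION, GEN_STRUCTURE, GEN_SHOWFILE = 4, 5, 6
GEN_READSCRIPT, GEN_ASKQUESTION = 7, 8
PUSH_AGENDA, FINISHED = 99, 100

def next_stage(stage):
    """Advance a job to its next stage, assuming a linear ordering."""
    if stage == GEN_ASKQUESTION:
        return PUSH_AGENDA       # last generation stage hands off to agenda push
    if stage == PUSH_AGENDA:
        return FINISHED
    if stage < GEN_ASKQUESTION:
        return stage + 1
    return FINISHED
```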

service.preclass.main.now(tz=None)

Returns new datetime object representing current time local to tz.

tz

Timezone object.

If no tz is specified, uses local timezone.

PreClass: Data Structures

class service.preclass.model.AgendaStruct(title, children=None, function=None)[source]

Bases: object

A tree structure representing an agenda or outline with hierarchical nodes.

title

The title/heading of this agenda item

Type:

str

children

List of child AgendaStruct or PPTPageStruct objects

Type:

list

dfs_recursive_call(function)[source]

Recursively call the given function on each node in the agenda structure.

Parameters:

function – The function to be called on each node

flatten()[source]

Return a flattened list of all nodes in the agenda structure.

Returns:

List of all nodes in the agenda structure

Return type:

list

formalize()[source]

Return a string representation of the title/heading of this agenda item.

classmethod from_dict(dict)[source]

Create an agenda structure from a dictionary representation.

Parameters:

dict – Dictionary representation of the agenda structure

Returns:

The agenda structure object

Return type:

AgendaStruct

get_structure_trace()[source]

Generate a formatted string representation of the structure’s hierarchy.

Returns:

Indented string showing the structure trace

Return type:

str

insert_page(page, trace)[source]

Insert a page into the agenda structure following the given trace path.

Parameters:
  • page – The page object to insert

  • trace (list) – List of section titles forming path to insertion point

Returns:

True if page was inserted successfully, False otherwise

Return type:

bool

serialize()[source]
to_dict()[source]

Return a dictionary representation of the agenda structure.

Returns:

Dictionary representation of the agenda structure

Return type:

dict

type = 'node'
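AgendaStruct nodes and PPTPageStruct leaves form one tree, with flatten and insert_page walking it. A simplified stand-alone sketch of that tree (only title/children/type, not the library’s full implementation):

```python
class Node:
    """Simplified AgendaStruct: a titled node with ordered children."""
    type = "node"

    def __init__(self, title, children=None):
        self.title = title
        self.children = children or []

    def flatten(self):
        # Depth-first list of this node and everything beneath it.
        out = [self]
        for child in self.children:
            out.extend(child.flatten())
        return out

    def insert_page(self, page, trace):
        if not trace:                 # trace exhausted: insert here
            self.children.append(page)
            return True
        for child in self.children:
            if getattr(child, "title", None) == trace[0]:
                return child.insert_page(page, trace[1:])
        return False                  # no child matches the trace path

class Page:
    """Simplified PPTPageStruct leaf."""
    type = "ppt"

    def __init__(self, stringified):
        self.stringified = stringified

    def flatten(self):
        return [self]
```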
class service.preclass.model.AskQuestion(question, question_type, selects, answer, reference)[source]

Bases: FunctionBase

Function to present a question to users.

Parameters:
  • question (str) – The question text

  • question_type – Type/category of question

  • selects – Available answer options

  • answer – Correct answer

  • reference – Reference material (commented out)

class service.preclass.model.FunctionBase(call, label, value)[source]

Bases: object

Base class for defining interactive functions/actions within the agenda.

call

Function identifier/name

Type:

str

label

Display label for the function

Type:

str

value

Parameters/values for the function

Type:

dict

classmethod from_dict(dict)[source]
to_dict()[source]
class service.preclass.model.PPTPageStruct(page, stringified, function=None)[source]

Bases: object

Represents a PowerPoint page/slide in the agenda structure.

content

The actual page/slide content

stringified

String representation of the page

Type:

str

dfs_recursive_call(function)[source]
flatten(condition=None)[source]
formalize()[source]
classmethod from_dict(dict)[source]
get_structure_trace()[source]
serialize()[source]
to_dict()[source]
type = 'ppt'
class service.preclass.model.ReadScript(script)[source]

Bases: FunctionBase

Function to read a script/text.

Parameters:

script (str) – The script content to be read

class service.preclass.model.ShowFile(file_id)[source]

Bases: FunctionBase

Function to display a file.

Parameters:

file_id – Identifier for the file to be shown
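Each FunctionBase subclass reduces to a (call, label, value) triple. A stand-alone sketch of the pattern; the call identifiers and labels below are illustrative assumptions, not the library’s actual strings:

```python
class FunctionBase:
    """Base for agenda actions: an identifier, a display label, and parameters."""

    def __init__(self, call, label, value):
        self.call = call
        self.label = label
        self.value = value

    def to_dict(self):
        return {"call": self.call, "label": self.label, "value": self.value}

class ReadScript(FunctionBase):
    """Read a script aloud; wraps the script text as the function's value."""
    def __init__(self, script):
        super().__init__("read_script", "Read Script", {"script": script})

class ShowFile(FunctionBase):
    """Display a file; wraps the file identifier as the function's value."""
    def __init__(self, file_id):
        super().__init__("show_file", "Show File", {"file_id": file_id})
```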

PreClass: Sub-Services

InClass

InClass: Main Service

InClass: Controller

InClass: SessionController