
chat_history.py

Source: src/sunholo/agents/chat_history.py

Functions

extract_chat_history(chat_history=None)

Extracts paired chat history between human and AI messages.

This function takes a chat history and returns a list of pairs of messages, where each pair consists of a human message followed by the corresponding AI response.

Args: chat_history (list): List of chat messages.

Returns: list: List of tuples with paired human and AI messages.

Example:

chat_history = [
    {"name": "Human", "text": "Hello, AI!"},
    {"name": "AI", "text": "Hello, Human! How can I help you today?"}
]
paired_messages = extract_chat_history(chat_history)
print(paired_messages)
# Output: [("Hello, AI!", "Hello, Human! How can I help you today?")]

create_message_element(message: dict)

Extracts the main content of a message.

Args: message (dict): The message to extract content from.

Returns: str: The text or content of the message.

Raises: KeyError: If neither 'content' nor 'text' fields are found.

Example:

message = {"text": "Hello, AI!"}
content = create_message_element(message)
print(content)
# Output: 'Hello, AI!'

embeds_to_json(message: dict)

Converts the 'embeds' field in a message to a JSON string.

Args: message (dict): The message containing the 'embeds' field.

Returns: str: JSON string representation of the 'embeds' field or an empty string if no embeds are found.

Example:

message = {"embeds": [{"type": "image", "url": "https://example.com/image.png"}]}
json_string = embeds_to_json(message)
print(json_string)
# Output: '[{"type": "image", "url": "https://example.com/image.png"}]'

extract_chat_history_async(chat_history=None)

Extracts paired chat history between human and AI messages.

Because this processing is lightweight, a simpler approach is used to minimize overhead.

Args: chat_history (list): List of chat messages.

Returns: list: List of tuples with paired human and AI messages.
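
Example (a minimal sketch; assumes the function is an awaitable coroutine importable from sunholo.agents.chat_history, matching the source path above):

import asyncio
from sunholo.agents.chat_history import extract_chat_history_async

chat_history = [
    {"name": "Human", "text": "Hello, AI!"},
    {"name": "AI", "text": "Hello, Human! How can I help you today?"}
]

async def main():
    # Await the async variant; the result has the same paired-tuple shape as extract_chat_history
    paired_messages = await extract_chat_history_async(chat_history)
    print(paired_messages)

asyncio.run(main())
# Output: [("Hello, AI!", "Hello, Human! How can I help you today?")]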

extract_chat_history_async_cached(chat_history: List[dict] = None) -> List[Tuple]

Async version that uses the cache and runs in a thread pool if needed.
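
Example (a sketch only; assumes the same coroutine-style call as extract_chat_history_async, with caching and any thread-pool offloading handled internally):

from sunholo.agents.chat_history import extract_chat_history_async_cached

async def handle_turn(chat_history):
    # Call shape is identical to extract_chat_history_async;
    # repeated calls with a growing history benefit from the cache.
    return await extract_chat_history_async_cached(chat_history)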

extract_chat_history_with_cache(chat_history: List[dict] = None) -> List[Tuple]

Drop-in replacement for the original extract_chat_history.

Uses incremental caching for better performance with growing chat histories.
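
Example (a sketch of the drop-in usage; the messages are illustrative, and the second call is expected to reuse the cached pairs from the first):

from sunholo.agents.chat_history import extract_chat_history_with_cache

history = [
    {"name": "Human", "text": "Hello, AI!"},
    {"name": "AI", "text": "Hello, Human! How can I help you today?"}
]

# First call processes the full history and populates the cache
pairs = extract_chat_history_with_cache(history)

# Extend the history; only the new messages should need processing
history += [
    {"name": "Human", "text": "Can you summarise that?"},
    {"name": "AI", "text": "Certainly, here is a short summary."}
]
pairs = extract_chat_history_with_cache(history)
print(len(pairs))
# Output: 2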

is_ai(message: dict)

Checks if a message was specifically sent by an AI.

Args: message (dict): The message to check.

Returns: bool: True if the message was sent by an AI, otherwise False.

Example:

message = {"name": "AI"}
print(is_ai(message))
# Output: True

is_bot(message: dict)

Checks if a message was sent by a bot.

Args: message (dict): The message to check.

Returns: bool: True if the message was sent by a bot, otherwise False.

Example:

message = {"name": "AI"}
print(is_bot(message))
# Output: True

is_human(message: dict)

Checks if a message was sent by a human.

Args: message (dict): The message to check.

Returns: bool: True if the message was sent by a human, otherwise False.

Example:

message = {"name": "Human"}
print(is_human(message))
# Output: True

warm_up_cache(chat_histories: List[List[dict]])

Pre-populate cache with common chat histories.

Args: chat_histories: List of chat history lists to cache
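
Example (a sketch of pre-warming at application startup; the histories shown are illustrative):

from sunholo.agents.chat_history import warm_up_cache, extract_chat_history_with_cache

common_histories = [
    [
        {"name": "Human", "text": "Hello, AI!"},
        {"name": "AI", "text": "Hello, Human! How can I help you today?"}
    ]
]

# Populate the cache once so later requests start from a warm cache
warm_up_cache(common_histories)

# Subsequent extraction of a matching history can reuse the cached pairs
pairs = extract_chat_history_with_cache(common_histories[0])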

Classes

ChatHistoryCache

Incremental cache for chat history processing.

Caches processed message pairs and only processes new messages when the chat history is extended; a usage sketch follows the method list below.

  • __init__(self, max_cache_size: int = 1000)

    • Initialize the cache; max_cache_size sets the maximum number of cached entries (default 1000).
  • _extract_chat_history_full(self, chat_history: List[dict]) -> List[Tuple]

    • Full extraction when no cache is available.
  • _find_cached_prefix(self, current_history: List[dict]) -> Tuple[Optional[List[Tuple]], int]

    • Find the longest cached prefix of the current chat history.

Returns: Tuple of (cached_pairs, cache_length), or (None, 0) if no cached prefix is found.

  • _get_cache_key(self, chat_history: List[dict]) -> str

    • Generate a cache key based on the chat history content.
  • _process_new_messages(self, new_messages: List[dict], cached_pairs: List[Tuple]) -> List[Tuple]

    • Process only the new messages, considering the state from cached pairs.

Args:
  new_messages: New messages to process.
  cached_pairs: Previously processed message pairs.

Returns: List of new message pairs

  • _update_cache(self, chat_history: List[dict], pairs: List[Tuple])

    • Update cache with new result.
  • _verify_cache_validity(self, current_prefix: List[dict], cached_prefix: List[dict]) -> bool

    • Quick verification that cached data is still valid.
  • clear_cache(self)

    • Clear the entire cache.
  • extract_chat_history_incremental(self, chat_history: List[dict]) -> List[Tuple]

    • Extract chat history with incremental caching.

Args: chat_history: List of chat message dictionaries

Returns: List of (human_message, ai_message) tuples
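
Example (a sketch of using the cache directly rather than via the module-level helpers; assumes the class is importable from sunholo.agents.chat_history):

from sunholo.agents.chat_history import ChatHistoryCache

cache = ChatHistoryCache(max_cache_size=100)

history = [
    {"name": "Human", "text": "Hello, AI!"},
    {"name": "AI", "text": "Hello, Human! How can I help you today?"}
]

# First call performs a full extraction and stores the result
pairs = cache.extract_chat_history_incremental(history)

# After the history grows, only the new tail should need reprocessing
history += [
    {"name": "Human", "text": "Thanks!"},
    {"name": "AI", "text": "You're welcome."}
]
pairs = cache.extract_chat_history_incremental(history)

# Drop all cached entries, e.g. if the message format changes
cache.clear_cache()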
