Deconvolute SDK

Security Clients

Client integrations.

These classes handle the direct connection to your LLM providers.

deconvolute.clients

transport

PinnedNetworkBackend Objects

class PinnedNetworkBackend(httpcore.AsyncNetworkBackend)

A custom network backend for secure DNS pinning with fallback support.

This backend intercepts TCP connection requests. If the requested host matches the target original_host, it routes the socket to a list of resolved IPs, trying them sequentially and falling back on failure. Once an IP connects successfully, the backend locks onto it for all future connections, so later requests incur no extra resolution or fallback overhead. The TLS handshake is still performed against the original host string, preserving SNI and certificate validation.
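The try-then-lock strategy can be sketched in plain Python, independently of httpcore (PinnedConnector and connect_fn are illustrative names, not part of the SDK):

```python
class PinnedConnector:
    """Illustrative sketch: try pinned IPs in order, then lock onto the
    first IP that connects so later calls skip the failed candidates."""

    def __init__(self, pinned_ips, connect_fn):
        self._pinned_ips = list(pinned_ips)
        self._connect_fn = connect_fn   # e.g. opens a TCP socket to an IP
        self._locked_ip = None          # set after the first success

    def connect(self):
        # After a success, only the locked IP is ever tried again.
        candidates = [self._locked_ip] if self._locked_ip else self._pinned_ips
        last_error = None
        for ip in candidates:
            try:
                stream = self._connect_fn(ip)
            except OSError as exc:
                last_error = exc        # sequential fallback
                continue
            self._locked_ip = ip        # lock onto the working IP
            return stream
        raise ConnectionError("all pinned IPs failed") from last_error
```

Locking means a transient failure of the first candidate is paid at most once per backend instance, rather than on every request.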

__init__

def __init__(original_host: str, pinned_ips: list[str],
             backend: httpcore.AsyncNetworkBackend) -> None

Initializes the PinnedNetworkBackend.

Arguments:

  • original_host str - The target hostname to intercept (e.g. 'api.example.com').
  • pinned_ips list[str] - The resolved IP addresses to try connecting to.
  • backend httpcore.AsyncNetworkBackend - The underlying httpcore backend to wrap.

connect_tcp

async def connect_tcp(
        host: str,
        port: int,
        timeout: float | None = None,
        local_address: str | None = None,
        socket_options: typing.Iterable[
            tuple[int, int, int]
            | tuple[int, int, bytes | bytearray]
            | tuple[int, int, None, int]
        ] | None = None,
        **kwargs: Any) -> httpcore.AsyncNetworkStream

Intercepts and routes TCP connections to the pinned IP if the host matches.

Arguments:

  • host str - The hostname requested by the client.
  • port int - The destination port.
  • timeout float | None, optional - The connection timeout in seconds. Defaults to None.
  • local_address str | None, optional - The local address to bind to. Defaults to None.
  • socket_options typing.Iterable | None, optional - Socket options to apply to the connection, forwarded to the backend. Defaults to None.
  • **kwargs Any - Additional keyword arguments passed to the backend.

Returns:

  • httpcore.AsyncNetworkStream - The established TCP stream.

connect_unix_socket

async def connect_unix_socket(*args: Any,
                              **kwargs: Any) -> httpcore.AsyncNetworkStream

Delegates UNIX socket connections to the underlying backend.

Arguments:

  • *args Any - Positional arguments.
  • **kwargs Any - Keyword arguments.

Returns:

  • httpcore.AsyncNetworkStream - The established UNIX socket stream.

sleep

async def sleep(seconds: float) -> None

Delegates sleep calls to the underlying backend.

Arguments:

  • seconds float - The duration to sleep in seconds.

openai

OpenAIProxy Objects

class OpenAIProxy(BaseLLMProxy)

Synchronous Proxy for the OpenAI client.

This wrapper serves as a transparent middleware for the OpenAI SDK. It intercepts calls to client.chat.completions.create to inject security defenses (Canaries) and validate outputs (Content Scanning).

All other calls (e.g. client.embeddings, client.images, client.models) are transparently delegated to the underlying client via the BaseLLMProxy mechanism, ensuring full compatibility with the original SDK.

chat

@property
def chat() -> "ChatProxy"

Access the intercepted 'chat' namespace.

Returns:

  • ChatProxy - A wrapper around the original client.chat object that enables interception of completion creation calls.

AsyncOpenAIProxy Objects

class AsyncOpenAIProxy(BaseLLMProxy)

Asynchronous Proxy for the OpenAI client.

Mirrors the behavior of OpenAIProxy but awaits its calls for use in asyncio event loops. It intercepts await client.chat.completions.create.

chat

@property
def chat() -> "AsyncChatProxy"

Access the intercepted async 'chat' namespace.

Returns:

  • AsyncChatProxy - A wrapper around the original client.chat object.

ChatProxy Objects

class ChatProxy()

Helper proxy to navigate the client.chat namespace.

This class exists to bridge the gap between client and client.chat.completions. It forwards all unknown attributes to the real OpenAI chat object.

completions

@property
def completions() -> "CompletionsProxy"

Access the intercepted 'completions' namespace.

Returns:

  • CompletionsProxy - The core interceptor that wraps the create method.

CompletionsProxy Objects

class CompletionsProxy()

The core interceptor for Synchronous Chat Completions.

This class handles the lifecycle of the security checks:

  1. Input: Modifies the 'messages' list (e.g. Canary injection).
  2. Execution: Calls the real OpenAI client.
  3. Output: Validates and sanitizes the response content.
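The three-step lifecycle above can be sketched in miniature (guarded_create and the scanner objects here are illustrative stand-ins, not the SDK's API):

```python
def guarded_create(messages, call_model, injectors, scanners):
    """Illustrative sketch of the intercept lifecycle."""
    # 1. Input: let each injector modify the outgoing messages.
    for injector in injectors:
        messages = injector.inject(messages)
    # 2. Execution: call the real client.
    response = call_model(messages)
    # 3. Output: let each scanner validate/sanitize the response.
    for scanner in scanners:
        response = scanner.check(response)
    return response
```

A scanner such as the Canary participates in both phases: it injects a marker on the way in and strips or verifies it on the way out.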

create

def create(*args: Any, **kwargs: Any) -> Any

Wraps the standard openai.chat.completions.create method.

This method intercepts the request to inject security payloads and intercepts the response to validate content. If stream=True is detected, output validation is currently skipped to preserve streaming behavior.

Arguments:

  • *args - Positional arguments forwarded to OpenAI.
  • **kwargs - Keyword arguments (messages, model, tools, etc.) forwarded to OpenAI.

Returns:

  • ChatCompletion - The original OpenAI response object, but with its content sanitized (e.g. tokens removed) and validated.

Raises:

  • SecurityResultError - If a scanner (e.g. Language, Canary) identifies a threat in any of the generated choices.
  • DeconvoluteError - If integrity checks are enabled but the request configuration is invalid (e.g. missing a 'system' message).

AsyncChatProxy Objects

class AsyncChatProxy()

Helper proxy to navigate the Async client.chat namespace.

completions

@property
def completions() -> "AsyncCompletionsProxy"

Access the intercepted async 'completions' namespace.

AsyncCompletionsProxy Objects

class AsyncCompletionsProxy()

The core interceptor for Asynchronous Chat Completions.

Identical in logic to CompletionsProxy but uses await for execution and validation calls.

create

async def create(*args: Any, **kwargs: Any) -> Any

Wraps the standard await openai.chat.completions.create method.

Arguments:

  • *args - Positional arguments forwarded to OpenAI.
  • **kwargs - Keyword arguments (messages, model, etc.).

Returns:

  • ChatCompletion - The sanitized OpenAI response object.

Raises:

  • SecurityResultError - If a threat is detected in the response.
  • DeconvoluteError - If the request configuration is invalid.

mcp

MCPProxy Objects

class MCPProxy()

Transparent proxy for mcp.ClientSession that enforces security policies.

This proxy sits between the Application and the MCP Client. It intercepts:

  1. list_tools(): To filter out tools that are blocked by policy.
  2. call_tool(): To block execution of unsafe tools or detect tampering.

All other method calls (e.g. list_resources) are delegated directly to the underlying session.

__init__

def __init__(session: ClientSession,
             firewall: MCPFirewall,
             integrity_mode: IntegrityLevel = "snapshot",
             transport_origin: TransportOrigin | None = None,
             init_result: types.InitializeResult | None = None) -> None

Arguments:

  • session - The original connected mcp.ClientSession.
  • firewall - The configured enforcement engine.
  • integrity_mode - 'snapshot' (default) or 'strict'.
  • transport_origin - The transport origin metadata, if available. Defaults to None.
  • init_result - A pre-existing types.InitializeResult, if the session has already been initialized. Defaults to None.

initialize

async def initialize(*args: Any, **kwargs: Any) -> Any

Intercepts session initialization to dynamically extract the server's identity and enforce version constraints.

__aenter__

async def __aenter__() -> "MCPProxy"

Allow using the guarded session directly in 'async with'. We enter the underlying session, but return 'self' (the Proxy).
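The pattern of entering the wrapped session while handing back the proxy can be sketched as follows (GuardedSession is an illustrative stand-in for MCPProxy):

```python
class GuardedSession:
    """Illustrative sketch: enter the wrapped session, return the proxy."""

    def __init__(self, session):
        self._session = session

    async def __aenter__(self):
        await self._session.__aenter__()
        return self                      # callers interact with the proxy

    async def __aexit__(self, exc_type, exc_value, traceback):
        await self._session.__aexit__(exc_type, exc_value, traceback)
```

Returning self rather than the inner session is what keeps the interception in force inside the 'async with' body.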

__aexit__

async def __aexit__(exc_type: Any, exc_value: Any, traceback: Any) -> None

Pass context exit to the underlying session.

__getattr__

def __getattr__(name: str) -> Any

Delegate any unknown methods (like list_resources) to the real session.

list_tools

async def list_tools(*args: Any, **kwargs: Any) -> types.ListToolsResult

Intercepts tool discovery to hide blocked tools.

  1. Fetches all tools from the server.
  2. Passes them through the Firewall filter.
  3. Registers allowed tools in the SessionRegistry (snapshotting).
  4. Returns a ListToolsResult containing ONLY the allowed tools.
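The discovery flow above reduces to a filter-and-snapshot step, sketched here with illustrative names (the real method works on types.ListToolsResult objects):

```python
def filter_tools(all_tools, firewall_allows, registry):
    """Illustrative sketch: hide blocked tools, snapshot the allowed ones."""
    # 2. Pass every discovered tool through the firewall filter.
    allowed = [t for t in all_tools if firewall_allows(t)]
    # 3. Snapshot allowed tools for later integrity checks.
    for tool in allowed:
        registry[tool["name"]] = tool
    # 4. The caller only ever sees the allowed subset.
    return allowed
```

Because blocked tools never enter the registry, a later call to one of them fails the integrity check as well as the policy check.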

call_tool

async def call_tool(name: str,
                    arguments: dict[str, Any] | None = None,
                    *args: Any,
                    **kwargs: Any) -> types.CallToolResult

Intercepts tool execution to enforce policy.

  1. Checks Firewall for Policy (Is this allowed?) and Integrity (Is this known?).
  2. If UNSAFE, returns a fake Error Result (prevents network call).
  3. If SAFE/WARNING, proceeds with the real network call.
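The enforcement flow above can be sketched as a single decision function (the function and result shapes are illustrative; the verdict names come from the description above):

```python
def guarded_call_tool(name, arguments, verdict_fn, real_call):
    """Illustrative sketch of the call_tool enforcement flow."""
    # 1. Policy (is this allowed?) + integrity (is this known?) check.
    verdict = verdict_fn(name, arguments)
    if verdict == "UNSAFE":
        # 2. Synthesize an error result; the network is never touched.
        return {"isError": True,
                "content": [{"type": "text",
                             "text": f"Tool '{name}' blocked by policy"}]}
    # 3. SAFE or WARNING: proceed with the real network call.
    return real_call(name, arguments)
```

Returning a fake error result instead of raising keeps the agent loop alive: the model sees a tool failure it can reason about, while the unsafe call never leaves the process.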

secure_stdio_session_impl

@asynccontextmanager
async def secure_stdio_session_impl(
        server_parameters: Any,
        policy_path: str,
        integrity: IntegrityLevel = "snapshot",
        agent_id: str | None = None) -> AsyncIterator[Any]

Implementation for the secure stdio transport wrapper.

secure_sse_session_impl

@asynccontextmanager
async def secure_sse_session_impl(
        url: str,
        policy_path: str,
        integrity: IntegrityLevel = "snapshot",
        pin_dns: bool = True,
        agent_id: str | None = None) -> AsyncIterator[Any]

Implementation for the secure SSE transport wrapper with transparent DNS pinning.

This context manager connects to a remote MCP server using Server-Sent Events. If DNS pinning is enabled, it resolves the hostname asynchronously and routes the underlying TCP socket to the pinned IP, preventing DNS Rebinding attacks while fully preserving TLS certificate validation.

Arguments:

  • url str - The target SSE endpoint URL.
  • policy_path str - Path to the Deconvolute security policy.
  • integrity IntegrityLevel, optional - The integrity check mode. Defaults to "snapshot".
  • pin_dns bool, optional - Whether to enforce DNS pinning. Defaults to True.
  • agent_id str | None, optional - An identifier for the calling agent. Defaults to None.

Yields:

  • AsyncIterator[Any] - The guarded MCP ClientSession proxy.

base

BaseLLMProxy Objects

class BaseLLMProxy()

Abstract Base Class for all Client Proxies.

This class provides the core infrastructure for storing state (client, keys, scanners) and transparently delegating attributes. It organizes scanners based on their capabilities (Injecting vs. Scanning).

Attributes:

  • _client Any - The original, wrapped LLM client instance.
  • api_key str | None - The Deconvolute API Key.
  • _all_scanners list[BaseScanner] - The full list of active security scanners.
  • _injectors list[BaseScanner] - Scanners that implement 'inject'. Used to modify the input prompt (e.g. Canary).
  • _scanners list[BaseScanner] - Scanners that implement 'check'. Used to scan the output response (e.g. Language, Canary).
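The capability-based split can be sketched as a partition over the scanner list (the hasattr-based dispatch is an assumption about the mechanism, not confirmed by the source):

```python
def organize_scanners(scanners):
    """Illustrative sketch: split scanners by the capabilities they expose."""
    injectors = [s for s in scanners if hasattr(s, "inject")]  # modify input
    checkers = [s for s in scanners if hasattr(s, "check")]    # scan output
    return injectors, checkers
```

A scanner may appear in both lists; for example, a Canary both injects a marker into the prompt and checks for it in the response.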

__init__

def __init__(client: Any,
             scanners: list[BaseScanner],
             api_key: str | None = None)

Initializes the proxy infrastructure.

Arguments:

  • client - The original LLM client instance.
  • scanners - A strict list of scanners. The factory (llm_guard) is responsible for resolving defaults before calling this.
  • api_key - Optional Deconvolute API key.

__getattr__

def __getattr__(name: str) -> Any

Delegates attribute access to the underlying client.

This enables 'transparency': if the user calls a method we don't explicitly intercept, it passes through to the real client.
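This pass-through behavior is the standard __getattr__ delegation pattern, sketched here with an illustrative class:

```python
class TransparentProxy:
    """Illustrative sketch of attribute delegation via __getattr__."""

    def __init__(self, client):
        self._client = client

    def __getattr__(self, name):
        # Called only when 'name' is not found on the proxy itself, so
        # any attribute the proxy defines explicitly takes precedence.
        return getattr(self._client, name)
```

Because Python consults __getattr__ only after a normal lookup fails, intercepted namespaces defined on the proxy (such as a chat property) automatically shadow the client's, while everything else falls through untouched.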
