How MCP Works: The Architecture Behind AI That Can Actually Do Things

MCP operates through three interconnected layers — host, client, and server — enabling AI to query live data and trigger real-time actions through a standardised protocol. The deeper opportunity, especially in private markets, is building proprietary MCP servers that give AI governed, controlled access to internal systems rather than generic public tools.
Devanshee Kothari
Growth and Research Manager
April 27, 2026

In our previous post, we introduced the Model Context Protocol — the open standard that allows AI systems to connect to external tools and data sources in a standardised way. If you have not read it yet, that is a good starting point. In this post, we go one level deeper: what does an MCP system actually look like, and how do the components fit together?

Three Layers, One System

An MCP ecosystem has three distinct components: the MCP host, the MCP client, and the MCP server.

The MCP host is the AI application the user interacts with — an AI assistant, an agent, a chat interface. This is the entry point. The user asks a question or gives an instruction here.

The MCP client lives inside the host. It is responsible for the communication layer — translating the AI model's requests into a format that the MCP server can understand, and translating the server's responses back into something the AI can work with.

The MCP server is the external layer — the piece that sits in front of a specific tool, system, or data source. When a firm builds an MCP server, it is essentially wrapping its internal system in a standardised protocol that any MCP-compatible AI can then connect to.

What an MCP Server Exposes

An MCP server communicates with the AI through three types of primitives: tools, resources, and prompts.

Tools are actions the AI can instruct the server to perform — submitting data, updating a record, querying a live feed. Tools are model-controlled: the AI decides when and how to use them based on the task at hand.

Resources are structured data the AI can read — documents, database records, historical logs. Resources provide context without triggering active computation.

Prompts are reusable templates that shape how the AI interacts with a specific system. They standardise how certain tasks are framed so the AI produces consistent, reliable outputs when working with that system.
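The three primitives can be sketched schematically. This is a plain-Python illustration of what a server might register, not real MCP SDK code; all of the names (`list_deals`, `deals://recent`, `summarise_deal`) are hypothetical placeholders.

```python
# Schematic sketch of the three MCP primitives. All names here are
# hypothetical placeholders, not a real server's catalogue.

TOOLS = {
    # Actions the model may invoke; each declares an input schema.
    "list_deals": {
        "description": "List deals currently available on the platform",
        "input_schema": {
            "type": "object",
            "properties": {"structure": {"type": "string"}},
        },
    },
}

RESOURCES = {
    # Read-only context the model can load without side effects.
    "deals://recent": {"description": "Structured log of recent deal activity"},
}

PROMPTS = {
    # Reusable templates that frame a task consistently.
    "summarise_deal": {
        "description": "Summarise a deal's key terms for an investor",
        "arguments": ["deal_id"],
    },
}

def capabilities() -> dict:
    """What this server would advertise to a connecting client."""
    return {
        "tools": sorted(TOOLS),
        "resources": sorted(RESOURCES),
        "prompts": sorted(PROMPTS),
    }
```

When a client connects, the server advertises this catalogue, and the model chooses among the tools on its own while resources and prompts are supplied as context.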

The Communication Flow

MCP messages are encoded using JSON-RPC 2.0 — a lightweight protocol for remote procedure calls — and carried over a transport such as stdio or HTTP. What this means practically is that requests and responses between the AI and external systems are structured, standardised, and bidirectional. The AI can both receive data from a system and trigger actions within it.
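A JSON-RPC 2.0 exchange is just structured JSON with a method, parameters, and a matching `id`. The sketch below shows the shape of a tool invocation; the tool name and arguments are hypothetical, though `tools/call` is the method the MCP specification defines for invoking a tool.

```python
import json

# A JSON-RPC 2.0 request as an MCP client might frame it.
# The tool name and arguments are hypothetical.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "list_deals", "arguments": {"structure": "layer-one"}},
}

# A response echoes the same id, so the client can pair requests
# with replies even when several are in flight at once.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "3 deals found"}]},
}

wire = json.dumps(request)  # what actually travels over the transport
```

The `id` field is what makes the protocol workable at scale: every reply is unambiguously tied to the request that produced it.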

A practical example: a user asks an AI assistant, "What deals do we have available in layer one structures this week?" The AI, recognising it cannot answer from its own knowledge, uses the MCP client to query an available MCP server. The server processes the request, queries the underlying data source, and returns a structured response. The AI translates that into a clear, readable answer for the user. The entire interaction happens in seconds, with no human intermediary, no manual lookup, and no portal login.
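The whole round trip can be faked in a few lines. The sketch below runs every component in-process — the data, tool name, and message framing are all hypothetical stand-ins, not a real MCP implementation.

```python
import json

# End-to-end sketch of the flow described above, with every component
# faked in-process. Names and data are hypothetical stand-ins.

def mcp_server(raw: str) -> str:
    """Pretend MCP server: parse a JSON-RPC request, query a 'database'."""
    req = json.loads(raw)
    deals = {"layer-one": ["Deal A", "Deal B"]}  # stand-in data source
    structure = req["params"]["arguments"]["structure"]
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req["id"],
        "result": {"deals": deals.get(structure, [])},
    })

def mcp_client(method: str, arguments: dict) -> dict:
    """Pretend MCP client: frame the model's intent as JSON-RPC."""
    raw = json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": method,
        "params": {"name": "list_deals", "arguments": arguments},
    })
    return json.loads(mcp_server(raw))["result"]

# Host side: the model decides it needs live data, so it calls the tool
# and turns the structured result into a readable answer.
result = mcp_client("tools/call", {"structure": "layer-one"})
answer = f"We have {len(result['deals'])} layer-one deals this week."
```

In a real deployment the client and server sit in separate processes connected by a transport, but the division of labour is exactly this: the host decides, the client frames, the server executes.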

Why Building Your Own MCP Server Matters

Most discussions of MCP focus on connecting to publicly available tools — Google Drive, Slack, GitHub. But the more significant opportunity, particularly in financial services, is building proprietary MCP servers that expose internal systems to AI agents in a controlled, governed way.

When a firm builds its own MCP server, it decides exactly what the AI can access, what it can do, and what it cannot touch. It controls the permission model. It controls the data flow. The AI becomes an intelligent interface to internal infrastructure — rather than a disconnected assistant operating on generic information.
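That permission model can be as simple as a server-side allowlist checked before any tool runs. The roles and tool names below are hypothetical; the point is that the gate lives in the server, outside the model's control.

```python
# Sketch of a permission gate a proprietary MCP server might apply
# before executing any tool call. Roles and tool names are hypothetical.

PERMISSIONS = {
    "analyst": {"list_deals"},
    "trader": {"list_deals", "submit_order"},
}

def authorise(role: str, tool: str) -> bool:
    """The server, not the model, decides what each caller may invoke."""
    return tool in PERMISSIONS.get(role, set())
```

Because the check runs on the server, the firm can tighten or audit it without touching the AI side at all.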

In private markets, where the information that matters most — deal terms, availability, counterparty demand — exists in fragmented, unstructured form across messaging platforms and individual contacts, this distinction is significant. We are building for exactly this. More in the next post.

--

DISCLAIMER

The information provided is for informational and analytical purposes only and should not be construed as financial advice or as an offer or invitation to buy or sell any securities or related financial instruments. It does not constitute a solicitation, recommendation, or endorsement of any particular security, investment product, or strategy. Investors should seek their own professional advice before making any investment decisions. The views expressed do not necessarily reflect those of EVIDENT or its affiliates. Readers should independently verify any claims and seek appropriate professional advice before making decisions. For Professional Investors only, as defined under Cap. 571 Securities and Futures Ordinance of the laws of the Hong Kong Special Administrative Region of China.
