

TODAY'S ISSUE
TODAY’S DAILY DOSE OF DATA SCIENCE
​Build a shared memory for Claude Desktop and Cursor​
Developers typically use Claude Desktop and Cursor independently, with no context sharing or connection between them.
In other words, Claude Desktop (an MCP host) has no visibility over what you do in Cursor (another MCP host), and vice versa.
Today, we'll show you how to add a common memory layer to Claude Desktop and Cursor so you can cross-operate without losing context.
Our tech stack:
- ​Graphiti MCP​ as a memory layer for AI Agents (​GitHub repo​; 10k stars).
- Cursor and Claude Desktop as the MCP hosts.
Here’s the workflow:

- User submits query to Cursor & Claude.
- Facts/Info are stored in a common memory layer using Graphiti MCP.
- Memory is queried if context is required in any interaction.
- Graphiti shares memory across multiple hosts.
We have added a video below that gives a walkthrough of the final outcome:
Implementation details
​A shared memory for Claude Desktop and Cursor
Now, let's dive into the code!
Docker Setup
Deploy the Graphiti MCP server locally using Docker Compose. This setup starts the MCP server with Server-Sent Events (SSE) transport.
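Here's a minimal sketch of that setup, assuming you clone the Graphiti repo and use the Docker Compose file that ships with its MCP server (the directory name and environment variables are assumptions; check the repo's README for the exact ones):

```bash
# Clone the Graphiti repo, which bundles the MCP server and its Docker Compose file
git clone https://github.com/getzep/graphiti.git
cd graphiti/mcp_server   # assumed location of the MCP server's compose file

# API key for the LLM used to extract entities/facts
# (variable name may differ; see the repo's example .env)
export OPENAI_API_KEY="sk-..."

# Start the MCP server (SSE transport) along with its Neo4j container
docker compose up -d
```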

The Docker setup above includes a Neo4j container, which launches the database as a local instance.
This configuration lets you query and visualize the knowledge graph using the Neo4j browser preview.
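For example, with Neo4j running locally you can open the Neo4j browser (typically at http://localhost:7474) and run a generic Cypher query like the one below to peek at what has been stored; it isn't specific to Graphiti's schema:

```cypher
// Show a sample of nodes and relationships in the knowledge graph
MATCH (n)-[r]->(m)
RETURN n, r, m
LIMIT 50
```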

Connect MCP server to Cursor
With the server and its tools ready, let's integrate it with our Cursor IDE!
Go to: File → Preferences → Cursor Settings → MCP → Add new global MCP server.
In the JSON file, add what's shown below:
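As a rough sketch, the entry looks something like this, assuming the Graphiti MCP server exposes SSE at http://localhost:8000/sse (the exact port and path depend on your Docker setup, and the server name "graphiti" is arbitrary):

```json
{
  "mcpServers": {
    "graphiti": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```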

Connect MCP server to Claude Desktop
Now, let's integrate the same server with Claude Desktop!
Go to: File → Settings → Developer → Edit Config.
In the JSON file, add what's shown below:
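Claude Desktop launches MCP servers as local commands, so one common way to point it at an SSE endpoint is to bridge it through a stdio proxy such as the mcp-remote package. A sketch, assuming the same local URL as before (check the Graphiti README for its recommended config):

```json
{
  "mcpServers": {
    "graphiti": {
      "command": "npx",
      "args": ["mcp-remote", "http://localhost:8000/sse"]
    }
  }
}
```

Restart Claude Desktop after editing the config so the new server gets picked up.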

Done!
Our Graphiti MCP server is live and connected to Cursor & Claude!
Now you can chat with Claude Desktop, share facts/info, store them in memory, and retrieve them from Cursor, and vice versa:
This way, you can pipe Claude’s insights straight into Cursor, all via a single MCP.
To summarize, here's the full workflow again for your reference.

- User submits query to Cursor & Claude.
- Facts/info are stored in and retrieved from a common memory layer.
The memory layer is powered by Graphiti MCP.
The steps above should get you going, but you can find a detailed setup guide in the Graphiti MCP README file →
Thanks for reading!
THAT'S A WRAP
No-Fluff Industry ML resources to succeed in DS/ML roles

At the end of the day, all businesses care about impact. That’s it!
- Can you reduce costs?
- Drive revenue?
- Can you scale ML models?
- Predict trends before they happen?
We have discussed several topics (with implementations) in the past that align with these goals.
Here are some of them:
- Learn sophisticated graph architectures and how to train them on graph data in this crash course.
- So many real-world NLP systems rely on pairwise context scoring. Learn scalable approaches here.
- Run large models on small devices using Quantization techniques.
- Learn how to generate prediction intervals or sets with strong statistical guarantees for increasing trust using Conformal Predictions.
- Learn how to identify causal relationships and answer business questions using causal inference in this crash course.
- Learn how to scale and implement ML model training in this practical guide.
- Learn 5 techniques with implementation to reliably test ML models in production.
- Learn how to build and implement privacy-first ML systems using Federated Learning.
- Learn 6 techniques with implementation to compress ML models.
All these resources will help you cultivate key skills that businesses and companies care about the most.
SPONSOR US
Advertise to 600k+ data professionals
Our newsletter puts your products and services directly in front of an audience that matters — thousands of leaders, senior data scientists, machine learning engineers, data analysts, etc., around the world.