Add MCP tool annotations for better AI understanding (#149)
Add ToolAnnotations to all 5 MCP tools:
- deep-research: readOnlyHint=true, openWorldHint=true
- write-research-plan: readOnlyHint=true, openWorldHint=true
- generate-SERP-query: readOnlyHint=true, openWorldHint=true
- search-task: readOnlyHint=true, openWorldHint=true
- write-final-report: readOnlyHint=true, openWorldHint=true
All tools are marked with:
- readOnlyHint=true: Only reads/generates content, doesn’t modify state
- openWorldHint=true: Uses external search and AI APIs
These annotations help AI assistants make better decisions about tool usage.
🤖 Generated with Claude Code
Co-authored-by: triepod-ai <noreply@github.com>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Deep Research
Lightning-Fast Deep Research Report
Deep Research uses a variety of powerful AI models to generate in-depth research reports in just a few minutes. It leverages advanced “Thinking” and “Task” models, combined with an internet connection, to provide fast and insightful analysis on a wide range of topics. Your privacy is paramount: all data is processed and stored locally.
✨ Features
🎯 Roadmap
🚀 Getting Started
Use Free Gemini (recommended)
Get Gemini API Key
Deploy the project with one click, choosing either Vercel or Cloudflare.
The project currently supports deployment to Cloudflare, but you need to follow How to deploy to Cloudflare Pages to do so.
Start using
Use Other LLM
⌨️ Development
Follow these steps to get Deep Research up and running in your local browser.
Prerequisites
Installation
Clone the repository:
Install dependencies:
Set up Environment Variables:
You need to copy the file `env.tpl` to `.env`, or create a `.env` file and write the variables to it.
Run the development server:
Open your browser and visit http://localhost:3000 to access Deep Research.
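Putting the steps above together, a minimal sketch of a local setup. The repository URL and the use of pnpm are assumptions; substitute your own fork and package manager:

```shell
# Clone the repository (URL is an assumption; use the actual repo address)
git clone https://github.com/u14app/deep-research.git
cd deep-research

# Install dependencies (pnpm assumed; npm or yarn work similarly)
pnpm install

# Set up environment variables from the template
cp env.tpl .env

# Run the development server, then visit http://localhost:3000
pnpm dev
```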
Custom Model List
The project allows a custom model list, but this only works in proxy mode. Please add an environment variable named `NEXT_PUBLIC_MODEL_LIST` in the `.env` file or on the environment variables page.
Custom model lists use `,` to separate multiple models. If you want to disable a model, prefix its name with the `-` symbol, i.e. `-existing-model-name`. To allow only the specified models, use `-all,+new-model-name`.
🚢 Deployment
Vercel
Cloudflare
The project currently supports deployment to Cloudflare, but you need to follow How to deploy to Cloudflare Pages to do so.
Docker
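A minimal sketch of running a published image. The image name `xiangfa/deep-research` is an assumption; check the project's registry for the actual name:

```shell
# Pull the image and run it, mapping the app (port 3000 inside the container)
# to port 3333 on the host
docker pull xiangfa/deep-research
docker run -d --name deep-research -p 3333:3000 xiangfa/deep-research
```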
You can also specify additional environment variables:
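For example, passing server-side configuration at run time. `ACCESS_PASSWORD` is documented in env.tpl; the image name is an assumption:

```shell
# Each -e flag sets one environment variable inside the container
docker run -d --name deep-research -p 3333:3000 \
  -e ACCESS_PASSWORD=your-password \
  xiangfa/deep-research
```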
or build your own docker image:
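A sketch of a local build; the tag name is arbitrary:

```shell
# Build from the repository root and run the resulting image
docker build -t deep-research .
docker run -d --name deep-research -p 3333:3000 deep-research
```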
If you need to specify other environment variables, please add `-e key=value` to the above command.
Deploy using `docker-compose.yml`:
or build your own docker compose:
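A minimal compose sketch, written out from the shell; the image name is an assumption and `ACCESS_PASSWORD` is optional:

```shell
# Write a minimal docker-compose.yml, then start the service in the background
cat > docker-compose.yml <<'EOF'
services:
  deep-research:
    image: xiangfa/deep-research
    ports:
      - "3333:3000"
    environment:
      - ACCESS_PASSWORD=your-password
EOF
docker compose up -d
```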
Static Deployment
You can also build a static page version directly, and then upload all files in the `out` directory to any website service that supports static pages, such as GitHub Pages, Cloudflare, Vercel, etc.
⚙️ Configuration
As mentioned in the “Getting Started” section, Deep Research utilizes the following environment variables for server-side API configurations:
Please refer to the file env.tpl for all available environment variables.
Important Notes on Environment Variables:
Privacy Reminder: These environment variables are primarily used for server-side API calls. When using the local API mode, no API keys or server-side configurations are needed, further enhancing your privacy.
Multi-key Support: Multiple keys are supported; separate each key with `,`, i.e. `key1,key2,key3`.
Security Setting: By setting `ACCESS_PASSWORD`, you can better protect the security of the server API.
Make variables effective: After adding or modifying an environment variable, please redeploy the project for the changes to take effect.
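For example, a `.env` fragment combining these notes. The API key variable name here is an assumption; env.tpl lists the real names:

```shell
# Multiple API keys, separated by commas
GOOGLE_GENERATIVE_AI_API_KEY=key1,key2,key3
# Protect the server API with a password
ACCESS_PASSWORD=your-secret-password
```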
📄 API documentation
Currently the project supports two forms of API: Server-Sent Events (SSE) and Model Context Protocol (MCP).
Server-Sent Events API
The Deep Research API provides a real-time interface for initiating and monitoring complex research tasks.
It is recommended to use the API via `@microsoft/fetch-event-source`. To get the final report, you need to listen to the `message` event; the data will be returned as a text stream.
POST method
Endpoint: `/api/sse`
Method: `POST`
Body:
Headers:
See the detailed API documentation.
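An illustrative request against a local deployment. The body fields and the Authorization scheme are assumptions; check the detailed API documentation for the real schema:

```shell
# Start a research task and stream the SSE response to the terminal.
# -N disables curl's buffering so events appear as they arrive.
curl -N -X POST http://localhost:3000/api/sse \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-access-password" \
  -d '{"query": "The current state of solid-state battery research"}'
```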
GET method
This is an interesting implementation. You can watch the whole process of deep research directly through the URL just like watching a video.
You can access the deep research report via the following link:
Query Params:
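For example, opening a report in the browser. The parameter name `query` is an assumption; see the query params documentation for the real names:

```shell
# Open a streaming deep research report directly in the browser
open "http://localhost:3000/api/sse?query=solid-state+batteries"
```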
Model Context Protocol (MCP) Server
Currently supports `StreamableHTTP` and `SSE` Server Transport.
StreamableHTTP server endpoint: `/api/mcp`, transport type: `streamable-http`
SSE server endpoint: `/api/mcp/sse`, transport type: `sse`
Note: Since deep research takes a long time to execute, you need to set a longer timeout to avoid interrupting the study.
If your server sets `ACCESS_PASSWORD`, the MCP service will be protected, and you need to add additional header parameters:
Enabling the MCP service requires setting global environment variables:
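A sketch of the kind of variables involved. All of these names are assumptions; env.tpl is the authoritative list:

```shell
# Point the MCP service at a model provider and a search provider
MCP_AI_PROVIDER=google
MCP_SEARCH_PROVIDER=tavily
MCP_THINKING_MODEL=gemini-2.0-flash-thinking-exp
MCP_TASK_MODEL=gemini-2.0-flash-exp
```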
Note: To ensure that the MCP service can be used normally, you need to set the environment variables of the corresponding model and search engine. For specific environment variable parameters, please refer to env.tpl.
🪄 How it works
Research topic
Propose your ideas
Information collection
Generate Final Report
🙋 FAQs
Why does my Ollama or SearXNG not work properly and display the error `TypeError: Failed to fetch`?
If your request triggers `CORS` errors due to browser security restrictions, you need to configure Ollama or SearXNG to allow cross-origin requests (for Ollama, e.g. by setting the `OLLAMA_ORIGINS=*` environment variable). You can also consider using the server proxy mode, in which a backend server makes the requests, which effectively avoids cross-origin issues.
🛡️ Privacy
Deep Research is designed with your privacy in mind. All research data and generated reports are stored locally on your machine. We do not collect or transmit any of your research data to external servers (unless you are explicitly using server-side API calls, in which case data is sent to the API through your configured proxy, if any). Your privacy is our priority.
🙏 Acknowledgements
dzhng/deep-research for inspiration.
🤝 Contributing
We welcome contributions to Deep Research! If you have ideas for improvements, bug fixes, or new features, please feel free to:
For major changes, please open an issue first to discuss your proposed changes.
✉️ Contact
If you have any questions, suggestions, or feedback, please create a new issue.
📝 License
Deep Research is released under the MIT License. This license allows for free use, modification, and distribution for both commercial and non-commercial purposes.
🌟 Star History