# First Conversation
This tutorial walks you through your first complete agent conversation in DeerFlow — from launching the app to getting meaningful work done with the agent.
## Prerequisites
- DeerFlow app is running (see Quick Start)
- At least one model is configured in `config.yaml`
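If you are unsure what a model entry looks like, the fragment below is a hypothetical sketch only; the actual keys and file layout depend on your DeerFlow version, so follow the Quick Start guide for the authoritative format.

```yaml
# Hypothetical example — exact schema depends on your DeerFlow version.
models:
  - name: gpt-4o                      # model identifier (assumption)
    api_key: sk-...                   # your provider API key
    base_url: https://api.openai.com/v1
```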
## Steps
### 1. Open the workspace
Open http://localhost:2026 in your browser. You will see the conversation workspace.
### 2. Send your first message
Type a question in the input box, for example:

> Research the top 3 most popular open-source LLM frameworks in 2024 and compare their strengths and weaknesses.

Press Enter to send.
### 3. Watch the agent work
You will see the agent start working:
- Expand the thinking steps to see which tools it is calling
- Watch search results stream in
- Wait for the final report to be generated
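The streaming phases above can be sketched as a simple event loop. The event names and functions here are illustrative assumptions for demonstration, not DeerFlow's actual streaming protocol:

```python
from dataclasses import dataclass
from typing import Iterator

@dataclass
class Event:
    kind: str      # e.g. "thinking", "tool_call", "search_result", "final_report"
    payload: str

def fake_agent_stream() -> Iterator[Event]:
    """Simulate the phases you see in the UI (illustrative only)."""
    yield Event("thinking", "Planning research steps")
    yield Event("tool_call", "web_search('open-source LLM frameworks 2024')")
    yield Event("search_result", "candidate frameworks and articles ...")
    yield Event("final_report", "# Comparison of top 3 frameworks ...")

def render(stream: Iterator[Event]) -> list[str]:
    # A real client updates the UI per event; here we just collect the phases.
    return [e.kind for e in stream]

phases = render(fake_agent_stream())
```

The key point is that the UI is driven incrementally: each event arrives as the agent produces it, which is why you can expand thinking steps and watch results stream in before the report is finished.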
### 4. Interact with the result
Once the report is generated, you can:
- Ask for more detail on a specific section
- Ask to export the report as a file (the agent will use the `present_files` tool)
- Ask to create a chart based on the research findings
## What just happened
The agent used the DeerFlow Harness to:
- Receive your message and add it to the thread state
- Run the middleware chain (memory injection, title generation)
- Call the LLM, which decided to search the web
- Execute web search tool calls
- Synthesize results into a structured response
- Update the thread state with any artifacts produced
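The turn lifecycle above can be sketched in a few lines of Python. This is a minimal, self-contained sketch of the pattern (state, middleware chain, tool-using turn); the names `ThreadState`, `run_turn`, and the middleware functions are assumptions for illustration, not DeerFlow's actual API:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ThreadState:
    messages: list[str] = field(default_factory=list)
    artifacts: list[str] = field(default_factory=list)
    title: str = ""

# Middleware are plain callables that enrich state before the LLM call.
Middleware = Callable[[ThreadState], None]

def inject_memory(state: ThreadState) -> None:
    # Hypothetical memory-injection middleware.
    state.messages.insert(0, "[memory] user prefers concise reports")

def generate_title(state: ThreadState) -> None:
    # Hypothetical title-generation middleware.
    if not state.title and state.messages:
        state.title = state.messages[-1][:40]

def run_turn(state: ThreadState, user_message: str,
             middlewares: list[Middleware]) -> ThreadState:
    # 1. Receive the message and add it to the thread state
    state.messages.append(user_message)
    # 2. Run the middleware chain
    for mw in middlewares:
        mw(state)
    # 3-5. Call the LLM, execute tool calls, synthesize (stubbed here)
    report = f"Report for: {user_message}"
    state.messages.append(report)
    # 6. Update the thread state with any artifacts produced
    state.artifacts.append(report)
    return state

state = run_turn(ThreadState(), "Compare LLM frameworks",
                 [inject_memory, generate_title])
```

The design choice worth noting is that every turn flows through the same pipeline: middleware sees and mutates a single shared state object, so features like memory and title generation compose without the LLM call needing to know about them.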
## Next steps