Build the Ultimate Vertical Agent: Create a Billion-Dollar AI Startup

Discover how to build the ultimate vertical AI agent and create a billion-dollar AI startup. Learn to leverage the Vercel AI SDK to build a custom agent that streamlines workflows and integrates deeply with specific knowledge work.

June 12, 2025


Build your own AI-powered assistant to streamline your workflow and boost productivity. Discover how to create a customized "cursor" that can automate tasks, generate content, and integrate seamlessly with your favorite tools. This blog post will guide you through the process of building a versatile AI agent that can revolutionize the way you work.

How to Build a Cursor for X Agentic Software

To build "cursor for X" agentic software, we'll use the Vercel AI SDK, which provides two main parts:

  1. AI SDK Core: This allows you to set up the AI agent and connect to different large language models. You can easily swap out different models without changing the underlying logic in your codebase.

  2. AI SDK UI: This enables you to stream the results, whether it's text or structured output, to the front-end in an efficient way, allowing you to build production-level agentic applications.

Here's how you can build the core functionality:

  1. Set up a TypeScript project and install the Vercel AI SDK.
  2. Import the necessary functions from the SDK, such as generateText, streamText, and streamObject.
  3. Define your agent's tools, such as getting the weather or generating a PRD.
  4. Use the streamText and streamObject functions to create an agent that can execute these tools and stream the results back.
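
As a rough sketch of step 4, assuming AI SDK 4.x, the @ai-sdk/anthropic provider package, and a hypothetical generatePRD tool, the core agent loop can look like this:

import { streamText, tool } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';
import { z } from 'zod';

// Hypothetical PRD tool; a real implementation would call the model again
// or a template engine to produce the document.
const generatePRD = tool({
  description: 'Generate a product requirements document for a feature idea',
  parameters: z.object({ idea: z.string() }),
  execute: async ({ idea }) => ({ prd: `# PRD: ${idea}\n...` }),
});

const result = streamText({
  model: anthropic('claude-3-5-sonnet-20241022'),
  tools: { generatePRD },
  maxSteps: 3, // allow tool calls followed by a final text answer
  prompt: 'Write a PRD for an AI-powered note-taking app.',
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}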

For the front-end, we'll use Next.js with shadcn/ui as the component library:

  1. Create a Next.js project and set up the necessary API endpoints.
  2. Implement the useChat hook from the Vercel AI SDK React package to handle the chat interaction and stream the results (see the sketch after this list).
  3. Create a split-view layout with a chat panel on the left and a content panel on the right.
  4. Use a shared state to pass the data from the chat panel to the content panel, allowing the agent to stream the results to the content area.
  5. Implement the content panel to display the generated content, such as the PRD.
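
Here's a minimal sketch of step 2, assuming AI SDK 4.x (the React bindings live in the @ai-sdk/react package) and a chat API route at /api/chat:

'use client';

import { useChat } from '@ai-sdk/react';

export default function ChatPanel() {
  // useChat posts each message to /api/chat and streams the reply back.
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',
  });

  return (
    <div>
      {messages.map((message) => (
        <div key={message.id}>
          <strong>{message.role}:</strong> {message.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} placeholder="Ask the agent..." />
      </form>
    </div>
  );
}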

To handle long-running tasks and stream the results, we'll use the createDataStreamResponse function from the Vercel AI SDK. This allows us to merge multiple data streams and push custom data to the front-end.
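
In an API route, the basic shape looks roughly like this (assuming AI SDK 4.x, where createDataStreamResponse is exported from the ai package, and the Anthropic provider):

import { createDataStreamResponse, streamText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';

export async function POST(req: Request) {
  const { messages } = await req.json();

  return createDataStreamResponse({
    execute: (dataStream) => {
      // Push custom data to the front-end alongside the model output.
      dataStream.writeData({ status: 'started' });

      const result = streamText({
        model: anthropic('claude-3-5-sonnet-20241022'),
        messages,
      });

      // Merge the model's own stream into the same response.
      result.mergeIntoDataStream(dataStream);
    },
  });
}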

By combining the AI SDK Core and AI SDK UI, you can build a robust cursor for X agentic software that provides a seamless chat experience and a customized content panel for reviewing and interacting with the agent's outputs.

For more detailed information and a step-by-step guide, check out the "Build Production-Ready Large Language Model Applications" section in the AI Builder Club, which includes a dedicated module on the Vercel AI SDK.

Vercel AI SDK Core: Generating Text, Structured Output, and Building Agents

First, let's cover the AI SDK Core. The Vercel AI SDK provides a few different functions for generating text, generating objects (structured output), and integrating with tools.

To demonstrate this, we'll open a new folder in Cursor and set up a TypeScript project:

npm init -y
npm install ai @ai-sdk/anthropic zod

Then, we can create a main.ts file and import the necessary functions from the Vercel AI SDK and the Anthropic provider package:

import { generateText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';

// The provider reads the API key from the ANTHROPIC_API_KEY environment variable.
const model = anthropic('claude-3-5-sonnet-20241022');

async function answerQuestion(prompt: string) {
  const { text } = await generateText({
    model,
    prompt,
    maxTokens: 100,
  });
  return text;
}

console.log(await answerQuestion('What is the capital of France?'));

This code sets up an Anthropic language model, defines a function to generate text, and then calls that function to get a response.

You can also stream the text output as it is generated, rather than waiting for the full response:

import { streamText } from 'ai';

// Named streamAnswer so it does not shadow the streamText import.
async function streamAnswer(prompt: string) {
  const result = streamText({
    model,
    prompt,
    maxTokens: 100,
  });

  // textStream yields plain string chunks as they arrive from the model.
  for await (const chunk of result.textStream) {
    process.stdout.write(chunk);
  }
}

await streamAnswer('The capital of France is');

In addition to generating text, the Vercel AI SDK also supports structured output. You can define a schema using the zod library and then stream the structured data as it is generated:

import { streamObject } from 'ai';
import { z } from 'zod';

const articleSchema = z.object({
  title: z.string(),
  author: z.string(),
  date: z.string(),
  content: z.string(),
});

async function streamArticle(prompt: string) {
  const result = streamObject({
    model,
    schema: articleSchema,
    prompt,
  });

  // partialObjectStream yields progressively more complete versions of the object.
  for await (const partialArticle of result.partialObjectStream) {
    console.log(partialArticle);
  }
}

await streamArticle('Write a short article about the history of the Eiffel Tower.');

Finally, the Vercel AI SDK makes it easy to build agents that can call external tools and integrate the results into their workflow:

import { generateText, tool } from 'ai';
import { z } from 'zod';

const weatherTool = tool({
  description: 'Get the current weather for a location',
  parameters: z.object({
    location: z.string(),
  }),
  execute: async ({ location }) => {
    // Call a weather API here; a hard-coded result stands in for the real call.
    return { location, temperature: 20, condition: 'Sunny' };
  },
});

async function runAgent(prompt: string) {
  const { text, steps } = await generateText({
    model,
    prompt,
    tools: { weather: weatherTool },
    maxSteps: 3, // let the model call tools, then answer based on the results
  });

  console.log(steps.flatMap((step) => step.toolResults)); // intermediate tool results
  console.log(text); // final answer
}

await runAgent('What is the weather like in Paris today?');

This demonstrates how you can define custom tools and hand them to the model, which can call them over multiple steps to answer the prompt.

The Vercel AI SDK provides a powerful and flexible foundation for building production-ready large language model applications. In the next section, we'll explore how to use the AI SDK UI to create a web-based interface for the agent.

Vercel AI SDK UI: Streaming Results and Building a Chat Interface

To build the chat interface and stream the results with the Vercel AI SDK UI, we can follow these steps:

  1. Set up the Next.js Project: We'll create a new Next.js project and install the necessary dependencies, including the shadcn/ui component library.

  2. Create the API Endpoint: We'll set up an API endpoint in a route.ts file, where we'll define the chat functionality. This includes handling the message input, calling the streamText function, and returning the resulting data stream (a minimal sketch appears after this list).

  3. Implement the Chat Interface: In the page.tsx file, we'll use the useChat hook from the Vercel AI SDK React package to create the chat interface. This will handle the input, send the message to the API endpoint, and display the message history.

  4. Render the Tool Responses: We'll enhance the chat interface to display the tool responses, including their state (partial, complete) and the input/output data.

  5. Streaming Custom Data: To handle cases where the agent needs to call a long-running tool, we'll use the createDataStreamResponse function from the Vercel AI SDK to merge multiple data streams and push custom data to the front-end.

  6. Implement the Playground/Canvas: We'll create a separate ContentPanel component to display the generated content, such as the PRD document. This component will use the shared state between the ChatPanel and ContentPanel to receive and render the data.

  7. Manage State and Persistence: We'll introduce a shared state between the ChatPanel and ContentPanel components to handle the short-term memory and data persistence, using a solution like Supabase for the backend.
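
Here's a minimal sketch of step 2, assuming an App Router route at app/api/chat/route.ts, AI SDK 4.x, and an Anthropic model; the ./tools import is a placeholder for the agent's own tool definitions:

// app/api/chat/route.ts
import { streamText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';
import { tools } from './tools'; // hypothetical module exporting the agent's tools

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: anthropic('claude-3-5-sonnet-20241022'),
    messages,
    tools,
    maxSteps: 5,
  });

  // Stream the result back to the useChat hook as a data stream response.
  return result.toDataStreamResponse();
}

For step 4, each assistant message exposes its tool calls through the toolInvocations array (in AI SDK 4.x), including a state field you can use to render partial and completed calls:

import type { Message } from 'ai';

export function ToolCalls({ message }: { message: Message }) {
  return (
    <div>
      {message.toolInvocations?.map((invocation) => (
        <div key={invocation.toolCallId}>
          {invocation.toolName}: {invocation.state}
          {invocation.state === 'result' && (
            <pre>{JSON.stringify(invocation.result, null, 2)}</pre>
          )}
        </div>
      ))}
    </div>
  );
}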

By following these steps, you'll be able to build a complete cursor-like system using the Vercel AI SDK, with a chat interface for the user to interact with the agent and a playground/canvas to display the generated content.

Streaming Tool Responses and Creating a Persistent Playground

To stream the tool responses back to the front-end and create a persistent playground, we can follow these steps:

  1. Implement the API Endpoint to Stream Tool Responses:

    • Create a function called generatePRD that calls the large language model to generate the PRD content.
    • Use the createDataStreamResponse function from the Vercel AI SDK to merge the original agent response stream with the custom data stream (see the sketch after this list).
    • In the generatePRD execution, write the text delta to the data stream as the content is generated.
    • Merge the original agent response into the data stream at the end.
  2. Render the Streamed Data in the Front-end:

    • Create a new function called PRDDisplay to filter and group the custom data received from the API endpoint (see the client-side sketch after this list).
    • Render the PRD content in the content panel, grouping the content based on the tool call ID.
  3. Implement State Management:

    • Restructure the application into separate components: ProjectPage, ChatPanel, and ContentPanel.
    • Create a shared state called projectData that can be updated and shared between the chat panel and content panel.
    • Use the useEffect hook to monitor changes in the message, data, and isLoading states, and update the projectData accordingly.
  4. Persist the Generated Content:

    • Integrate with a backend service, such as Supabase, to store the generated PRD content and the conversation history.
    • Update the projectData state with the persisted data when the page is loaded.
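
A rough sketch of step 1, assuming AI SDK 4.x; the prd-delta data shape and the generatePRD prompt are assumptions, not part of the SDK:

import { createDataStreamResponse, streamText, tool } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';
import { z } from 'zod';

const model = anthropic('claude-3-5-sonnet-20241022');

export async function POST(req: Request) {
  const { messages } = await req.json();

  return createDataStreamResponse({
    execute: (dataStream) => {
      const result = streamText({
        model,
        messages,
        maxSteps: 3,
        tools: {
          generatePRD: tool({
            description: 'Generate a PRD for the given product idea',
            parameters: z.object({ idea: z.string() }),
            execute: async ({ idea }, { toolCallId }) => {
              // Generate the PRD with a second model call and forward each
              // text delta to the client as custom data.
              const prd = streamText({ model, prompt: `Write a PRD for: ${idea}` });
              let content = '';
              for await (const delta of prd.textStream) {
                content += delta;
                dataStream.writeData({ type: 'prd-delta', toolCallId, delta });
              }
              return { prd: content };
            },
          }),
        },
      });

      // Merge the original agent response into the same data stream.
      result.mergeIntoDataStream(dataStream);
    },
  });
}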

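On the client, the custom data arrives in the data array returned by useChat. Here's a minimal sketch of the grouping and shared state described in steps 2 and 3, assuming the prd-delta shape from the route sketch above:

'use client';

import { useEffect, useState } from 'react';
import { useChat } from '@ai-sdk/react';

// Shape of the custom data parts written by the route sketch above (an assumption).
type PRDDelta = { type: 'prd-delta'; toolCallId: string; delta: string };

export function ProjectPage() {
  // The chat input and submit handling are omitted here for brevity.
  const { messages, data, isLoading } = useChat({ api: '/api/chat' });

  // Shared state between the chat panel and the content panel, keyed by tool call ID.
  const [projectData, setProjectData] = useState<Record<string, string>>({});

  useEffect(() => {
    if (!data) return;
    // Group streamed PRD deltas by tool call ID.
    const grouped: Record<string, string> = {};
    for (const part of data) {
      const d = part as PRDDelta;
      if (d?.type === 'prd-delta') {
        grouped[d.toolCallId] = (grouped[d.toolCallId] ?? '') + d.delta;
      }
    }
    setProjectData(grouped);
  }, [messages, data, isLoading]); // re-run on new messages, streamed data, or loading changes

  return (
    <div className="grid grid-cols-2 gap-4">
      {/* ChatPanel (messages) on the left, ContentPanel (projectData) on the right. */}
      <pre>{Object.values(projectData).join('\n\n')}</pre>
    </div>
  );
}
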
By following these steps, you can create a cursor-like application that can stream tool responses, display the generated content in a persistent playground, and integrate with a backend service to save the generated content and conversation history.

Conclusion


In this guide, we explored the process of building a "cursor for X": agentic software that can complete specific tasks in a given vertical. We covered the key components required to build such a system, including an agent with access to various tools and assistants, and a specialized playground or canvas where users can review the agent's output and collaborate with it.

Using the Vercel AI SDK, we demonstrated how to set up the core functionality of the agent, including text generation, structured output, and integration with external tools. We also showed how to build the front-end user interface with Next.js and shadcn/ui, allowing users to interact with the agent through a chat-like experience and view the results in a content panel.

Additionally, we discussed the importance of state management and the integration with a backend, such as Supabase, to provide persistent storage and user-based pricing tiers.

Throughout the guide, we emphasized the vast opportunities available in building cursor-like applications for various domains, such as writing, video editing, and product development. We also provided resources, such as the "How to Build an AI Startup in Three Hours with Less than $500" guide by Greg Isenberg, to help solo entrepreneurs get started with building AI-powered products.

If you're interested in learning more about building production-ready large language model applications and the detailed implementation of the cursor for X system, be sure to check out the AI Builder Club, where we have a dedicated section covering these topics in-depth, along with a GitHub repository for reference.

Thank you for following along, and we look forward to seeing the innovative cursor-like applications you build in the future!

FAQ