Build Link Tracker with AI: Next.js, Supabase, Vercel

Building a Link Tracking Tool with AI: A Deep Dive for Engineers

This document outlines the process of building a functional link tracking application from scratch using AI-powered coding assistants. The focus is on a practical, engineer-centric approach, leveraging tools like Cursor, Claude, and Perplexity to accelerate development while maintaining control and understanding of the underlying technologies. We will cover project setup, AI agent interaction, core feature development (Next.js, Supabase), deployment (Vercel), and iterative refinement.

1. Project Initialization and Tooling

The foundation of any software project lies in its setup and the tools employed. For this project, the chosen environment prioritizes an integrated development experience with strong AI capabilities.

1.1. Environment Setup

The demonstration utilizes Cursor, an IDE designed for AI-assisted development. This choice is motivated by its suitability for building “serious software,” distinguishing it from simpler online IDEs that might be adequate for Minimum Viable Products (MVPs) but lack the depth for complex development workflows. The project begins with an empty folder, signifying a true “from scratch” approach.

1.2. AI Coding Agents

Claude Code is selected as the primary AI coding agent. The workflow involves launching multiple instances of Claude Code to handle different aspects of development concurrently. This parallel processing capability is a significant advantage, allowing one agent to focus on code generation while another assists with terminal commands or research. The interaction with Claude Code is managed through a chat interface within Cursor. Permissions are explicitly granted for the agent to operate within the project directory.

1.3. Project Definition

A markdown file, build-idea.md, serves as the central repository for the project’s conceptualization. This file outlines the core functionality: a link tracking tool for URL attribution and analytics. The goal is to understand which traffic sources yield the most engagement, a common requirement for content creators, advertisers, and SEO professionals. The initial description within build-idea.md can be generated by simply prompting an AI agent, such as Claude, with a high-level request like: “I want to create a link tracking tool. Give me the technical specification for it.”

2. Core Application Development with Next.js

The application’s frontend and backend logic will be built using Next.js, a popular React framework that facilitates server-side rendering, static site generation, and API routes.

2.1. Initial Next.js App Generation

The first major task for the AI agent is to generate the Next.js application structure. This is initiated by instructing Claude Code:

Read build-idea.md to understand what I want to build. Do not write any code yet.

Following this, the instruction to generate the application is given:

Get to work and build the entire Next.js app exactly as the build idea describes. Let's focus on the setup. Build it as an MVP. We're going to make it good later.

The AI agent is expected to create the foundational files and directory structure for a Next.js project.
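
In practice, the scaffold usually comes from create-next-app (or the agent writes the equivalent files by hand); the exact prompts and flags vary by version, but the result looks roughly like this:

npx create-next-app@latest .

# Typical layout afterwards (roughly):
#   app/             route segments, layouts, and pages
#   public/          static assets
#   package.json     dev / build / start / lint scripts
#   next.config.js   framework configuration
#   .gitignore       generated alongside the scaffold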

2.2. Version Control with Git

Robust version control is crucial. The process involves initializing a Git repository and managing commits.

2.2.1. Git Initialization

The AI agent is prompted to provide the necessary terminal commands for initializing a Git repository:

Give me the necessary terminal commands for me to initialize a new Git repository here inside of my project. I want to do them myself.

The AI responds with the standard Git commands:

  • git init
  • git add .
  • git commit -m "first commit"

These commands are executed directly in the integrated terminal within Cursor.

2.2.2. GitHub Integration

Connecting the local repository to a remote GitHub repository is the next step for collaboration and deployment.

  1. Create a GitHub Repository: A new repository is created on GitHub (e.g., github.com/new). Key details include:
    • Repository Name: building-app-from-scratch
    • Visibility: Public (for demonstration purposes)
    • No README, .gitignore, or license file initially.
  2. Link Local to Remote: The GitHub repository URL is copied. The AI agent is then instructed to help link the local repository:
An initial attempt omitted the remote name; the remote needs to be registered as origin:
git remote add origin <your_github_repo_url>

  3. Branch Management and Pushing:

  • git branch -m main: Renames the default branch to main.
  • git push -u origin main: Pushes the local main branch to the remote origin and sets up upstream tracking.

After these steps, refreshing the GitHub repository page should display the initial project files, including build-idea.md.

2.3. Dependency Management and Configuration

As the Next.js app is built, specific dependencies and configurations need to be managed.

2.3.1. .gitignore File

A comprehensive .gitignore file is essential to prevent sensitive or unnecessary files from being committed to the repository. The AI is prompted:

Create a comprehensive .gitignore file for this project. I'm going to keep our build idea spec in the root of our project.

This ensures that build artifacts, environment variables, and other temporary files are excluded from Git tracking.
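
For reference, a typical Next.js-oriented .gitignore covers at least the following (a representative subset; the generated file may contain more entries):

# dependencies
node_modules/

# Next.js build output
.next/
out/

# environment variables
.env
.env*.local

# misc
.DS_Store
*.log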

2.3.2. Handling Experimental Features

During Next.js app creation or subsequent build processes, experimental features might be encountered. For instance, the “React compiler” might be flagged as experimental. In such cases, it’s advisable to stick to proven technologies for stability:

The React compiler is experimental. We don't want to do that. We want to use a proven tech stack to keep things simple.
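
In configuration terms this simply means leaving next.config.js free of experimental options; a minimal sketch:

/** @type {import('next').NextConfig} */
const nextConfig = {
  reactStrictMode: true,
  // No experimental flags (such as the React compiler); stable defaults only for the MVP.
};

module.exports = nextConfig;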

2.3.3. Supabase Integration

Supabase is chosen as the database solution, providing a PostgreSQL database, authentication, and other backend services. The integration involves installing the necessary client libraries.

Install Supabase.

The overall tech stack is summarized as:

  • Frontend/Backend Framework: Next.js
  • Database: Supabase
  • Deployment: Vercel
  • AI Features: OpenRouter (if needed; not explicitly used in this MVP phase)

The command npm install @supabase/supabase-js would be executed to add the Supabase client library.
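
To avoid constructing a client in every file, a small shared module can export one instance (a sketch; the file name lib/supabaseClient.js is an arbitrary choice, and the later examples inline this setup to stay self-contained):

// lib/supabaseClient.js -- single shared Supabase client using the public anon key
import { createClient } from '@supabase/supabase-js';

export const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL,
  process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY
);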

2.4. Managing AI Agent Permissions

AI agents often require explicit permission to execute terminal commands. This can lead to a repetitive workflow of approving each action. To streamline this, a configuration file can be created.

  1. Researching Configuration: Perplexity is used for web research on how to configure Claude Code’s settings for allowed commands:
Browse the web to figure out how to create the settings.json file for Claude Code, so that we can give it a list of allowed commands and it doesn't keep asking me for permission every single time. Then also give me a good JSON file with safe terminal commands that we can give to AI agents.

  2. Creating settings.json: Based on the research, a settings.json file (placed in the project's .claude directory) is created with a permissions allow-list of approved terminal commands. This dramatically reduces the number of permission prompts.

{
  "permissions": {
    "allow": [
      "Bash(git init)",
      "Bash(git add:*)",
      "Bash(git commit:*)",
      "Bash(git remote add:*)",
      "Bash(git branch:*)",
      "Bash(git push:*)",
      "Bash(npm install:*)",
      "Bash(npm run dev)",
      "Bash(npm run build)",
      "Bash(echo:*)",
      "Bash(mkdir:*)",
      "Bash(cd:*)",
      "Bash(rm:*)",
      "Bash(cp:*)",
      "Bash(mv:*)",
      "Bash(clear)",
      "Bash(pwd)",
      "Bash(ls:*)"
    ]
  }
}

  3. Restarting the Agent: After creating the configuration file, it might be necessary to restart the AI agent or the IDE for the changes to take effect.

3. Iterative Development and Refinement

The development process is iterative, involving code generation, testing, and refinement.

3.1. AI-Driven Code Generation and Refinement

Claude Code is used to generate significant portions of the application code. The interaction model emphasizes clear instructions and feedback loops.

  • Initial Build: The AI generates the basic Next.js application structure.
  • Error Handling and Debugging: When errors occur, the AI is prompted to analyze them and suggest fixes. For instance, if the build fails due to permission issues or missing configurations, the AI can be asked:
Analyze what this is and implement a clean and minimal fix.

This often involves providing the error logs or a screenshot for context.

  • UI/UX Improvement: The AI can be tasked with refining the user interface and experience. For example, using Gemini 3 Pro (known for its frontend capabilities) to improve the design without altering functionality:
Use Gemini 3 Pro to read build-idea.md and update the design and UI of our front end to make it look better. Do not change anything else. Do not add or remove any features. Just focus on improving the design and layout.

This instruction highlights the importance of precise prompting to guide the AI’s actions.

  • Feature Implementation: Specific features, like the analytics dashboard or link creation, are developed through targeted prompts. The AI is asked to implement steps based on the build-idea.md and a potential master-plan.md document that outlines the project’s roadmap.

3.2. Database Schema Design and Implementation

The link tracking tool requires a database to store information about links and clicks. Supabase is used for this purpose.

3.2.1. Schema Definition

Based on the build-idea.md, the following tables are identified:

  • links table: Stores information about each generated short link.
    • id: Unique identifier (primary key).
    • user_id: Foreign key referencing the user who created the link.
    • original_url: The destination URL.
    • slug: The unique short identifier for the link.
    • created_at: Timestamp of creation.
  • clicks table: Records each time a short link is accessed.
    • id: Unique identifier (primary key).
    • link_id: Foreign key referencing the links table.
    • source: The traffic source (e.g., “Twitter”, “SEO”).
    • timestamp: Timestamp of the click.
    • country: The country from which the click originated (optional for MVP).
    • referrer: The referrer URL (optional for MVP).

3.2.2. SQL Schema Generation

The AI agent is prompted to generate the SQL for creating these tables:

Read the master plan and help me execute the Supabase table creation. Tell me exactly what to do and why.

The AI provides the necessary SQL CREATE TABLE statements, which are then executed in the Supabase SQL editor.


CREATE TABLE links (
    id SERIAL PRIMARY KEY,
    user_id UUID REFERENCES auth.users ON DELETE CASCADE,
    original_url TEXT NOT NULL,
    slug TEXT NOT NULL UNIQUE,
    created_at TIMESTAMP WITH TIME ZONE DEFAULT timezone('utc'::text, now())
);

CREATE TABLE clicks (
    id SERIAL PRIMARY KEY,
    link_id INTEGER REFERENCES links(id) ON DELETE CASCADE,
    source TEXT,
    timestamp TIMESTAMP WITH TIME ZONE DEFAULT timezone('utc'::text, now()),
    country TEXT,
    referrer TEXT
);

3.2.3. Schema Verification

After creating the tables, a verification step is crucial. The AI is asked to provide a query to double-check the schema:

Give me a simple SQL query that will just double check that everything was created successfully.

A query like SELECT column_name, data_type FROM information_schema.columns WHERE table_name IN ('links', 'clicks'); can be used. The results are then presented to the AI for confirmation.

3.3. Authentication Implementation

Securely managing user access is fundamental. Supabase provides built-in authentication services.

3.3.1. Enabling Email Authentication

The first step is to enable email-based sign-in and confirmation within Supabase.

  1. Navigate to Auth Providers: In the Supabase dashboard, go to Authentication > Sign in providers.
  2. Enable Email: Ensure the “Email” provider is enabled.
  3. Configure Redirect URLs: Crucially, configure the allowed redirect URLs for authentication flows. This includes:
    • http://localhost:3000/* for local development.
    • The Vercel deployment URL (e.g., https://your-app-name.vercel.app/*).

3.3.2. Frontend Authentication Logic

The Next.js application needs to integrate with Supabase authentication. This involves:

  • User Session Management: Using supabase.auth.getSession() to check if a user is logged in.
  • Sign Up: Implementing a sign-up form that calls supabase.auth.signUp().
  • Login: Implementing a login form that calls supabase.auth.signInWithPassword().
  • Logout: Implementing a logout button that calls supabase.auth.signOut().
  • Redirects: Handling redirects after successful sign-up, login, and email confirmation. The AI can assist in refining these redirects, for example, ensuring users are directed to the login page after email verification rather than the dashboard if they are not yet logged in.
// Example of sign up logic (simplified)
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(process.env.NEXT_PUBLIC_SUPABASE_URL, process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY);

async function handleSignUp(email, password) {
  const { data, error } = await supabase.auth.signUp({
    email: email,
    password: password,
  });

  if (error) {
    console.error('Error signing up:', error.message);
    // Handle error
  } else {
    console.log('Sign up successful. Check your email for confirmation.');
    // Redirect to a confirmation page or show a message
  }
}

3.3.3. Handling Email Confirmation Issues

If email confirmation links fail or lead to errors, it’s essential to review the Supabase authentication settings, particularly the redirect URLs. The AI can help diagnose these issues by analyzing error messages and configuration details.

3.4. Link Generation and Redirection

The core functionality of the link tracker is to generate short URLs and redirect users to the original destination.

3.4.1. Creating Short Links

A form is needed within the application to allow users to input a destination URL and optionally a custom slug. Upon submission, the application should:

  1. Generate a unique slug if one is not provided.
  2. Save the original URL, slug, and user ID to the links table in Supabase.
  3. Return the generated short URL (e.g., yourdomain.com/slug).

The backend logic for this can be implemented using Next.js API routes or server components.

// Example API route for creating a link (simplified)
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(process.env.NEXT_PUBLIC_SUPABASE_URL, process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY);

export default async function handler(req, res) {
  if (req.method === 'POST') {
    const { originalUrl, slug } = req.body;

    // The bare anon client has no session on the server, so pass the caller's JWT explicitly
    // (a full app would typically build a request-scoped client, e.g. with @supabase/ssr).
    const token = req.headers.authorization?.replace('Bearer ', '');
    const { data: { user } } = await supabase.auth.getUser(token);

    if (!user) {
      return res.status(401).json({ error: 'Unauthorized' });
    }

    const { data, error } = await supabase
      .from('links')
      .insert([
        // Fall back to a random 6-character slug; collision handling is out of scope for the MVP
        { user_id: user.id, original_url: originalUrl, slug: slug || Math.random().toString(36).substring(2, 8) }
      ])
      .select();

    if (error) {
      return res.status(500).json({ error: error.message });
    }

    res.status(200).json(data);
  } else {
    res.setHeader('Allow', ['POST']);
    res.status(405).end(`Method ${req.method} Not Allowed`);
  }
}

3.4.2. Handling Redirects and Click Tracking

When a user accesses a short URL (e.g., yourdomain.com/some-slug):

  1. Retrieve Link: The application needs to fetch the corresponding link_id and original_url from the links table based on the slug.
  2. Track Click: Before redirecting, record the click event in the clicks table. This involves capturing the link_id, current timestamp, and potentially the traffic source (if provided via query parameters or referrer).
  3. Redirect: Issue an HTTP 301 or 302 redirect to the original_url.

This logic is typically handled by a Next.js API route or a dynamic route segment (e.g., pages/[slug].js or app/[slug]/page.js).

// Example dynamic route for redirection (simplified)
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(process.env.NEXT_PUBLIC_SUPABASE_URL, process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY);

export async function getServerSideProps({ params, req, query }) {
  const { slug } = params;
  const source = query.source || null; // Example: ?source=twitter

  try {
    // 1. Retrieve link
    const { data: link, error: linkError } = await supabase
      .from('links')
      .select('id, original_url')
      .eq('slug', slug)
      .single();

    if (linkError || !link) {
      return { notFound: true };
    }

    // 2. Track click
    await supabase.from('clicks').insert([
      {
        link_id: link.id,
        source: source,
        timestamp: new Date().toISOString(),
        // Add country, referrer if available
      },
    ]);

    // 3. Redirect
    return {
      redirect: {
        destination: link.original_url,
        permanent: false, // Use false for temporary redirects
      },
    };
  } catch (error) {
    console.error('Redirect error:', error);
    return { notFound: true };
  }
}

// This component itself might be minimal or just return null
export default function RedirectPage() {
  return null;
}

3.5. Analytics Dashboard

The application should provide a dashboard to visualize link performance.

3.5.1. Data Aggregation

The dashboard will query the links and clicks tables to aggregate data:

  • Total number of clicks per link.
  • Distribution of clicks by source.
  • Potentially, trends over time.

This data can be fetched using Supabase’s query capabilities.

// Example query to get clicks per link and source (simplified)
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(process.env.NEXT_PUBLIC_SUPABASE_URL, process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY);

async function getAnalyticsData(userId) {
  const { data, error } = await supabase.rpc('get_link_analytics', { user_id_param: userId });
  // Assumes a Postgres function 'get_link_analytics' has been created in Supabase.
  // Alternatively, fetch links with their nested clicks directly:
  /*
  const { data, error } = await supabase
    .from('links')
    .select('id, slug, original_url, clicks:clicks!link_id(source, timestamp)')
    .eq('user_id', userId);
  */
  if (error) {
    console.error('Error fetching analytics:', error);
    return [];
  }
  return data;
}

3.5.2. Visualization

Libraries like Chart.js or Recharts can be integrated into the Next.js frontend to display this data in charts and graphs. The AI can assist in generating the code for these visualizations.
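
As an illustration, a clicks-by-source bar chart with Recharts might look like the following sketch (the component and prop names here are hypothetical; it assumes the data has already been aggregated into rows of { source, clicks }):

'use client';

// components/SourceChart.js -- bar chart of clicks grouped by traffic source (Recharts)
import { BarChart, Bar, XAxis, YAxis, Tooltip, ResponsiveContainer } from 'recharts';

export default function SourceChart({ data }) {
  // data: [{ source: 'twitter', clicks: 42 }, { source: 'seo', clicks: 17 }, ...]
  return (
    <ResponsiveContainer width="100%" height={300}>
      <BarChart data={data}>
        <XAxis dataKey="source" />
        <YAxis allowDecimals={false} />
        <Tooltip />
        <Bar dataKey="clicks" fill="#6366f1" />
      </BarChart>
    </ResponsiveContainer>
  );
}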

4. Deployment with Vercel

Vercel is the chosen platform for deploying the Next.js application due to its seamless integration with Next.js and Git repositories.

4.1. Connecting to GitHub

  1. Import Project: In Vercel, import the GitHub repository containing the Next.js project.
  2. Configure Build Settings: Vercel typically auto-detects Next.js projects. Ensure the build command (npm run build) and output directory are correctly configured.
  3. Environment Variables: This is a critical step. Vercel needs access to the Supabase credentials.
    • Supabase URL and Anon Key: Retrieve these from the Supabase project settings (API section).
    • Add these as environment variables in Vercel (Project Settings > Environment Variables). Use the same names as in local development, e.g. NEXT_PUBLIC_SUPABASE_URL and NEXT_PUBLIC_SUPABASE_ANON_KEY (a sample .env.local follows this list).
    • Ensure these keys are correctly mapped in the Next.js application (e.g., using process.env.NEXT_PUBLIC_...).
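
For local development, the same variable names live in a .env.local file (placeholder values shown; the real ones come from the Supabase dashboard):

# .env.local -- placeholders only; never commit real keys
NEXT_PUBLIC_SUPABASE_URL=https://your-project-ref.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=your-anon-key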

4.2. Deployment Process

Once the repository is connected and environment variables are set, Vercel automatically deploys the application on every push to the main branch (or a configured production branch).

  1. Initial Deployment: Trigger the first deployment.
  2. Monitoring Deployments: Vercel provides a dashboard to monitor build logs, deployment status, and rollback to previous versions if needed.
  3. Troubleshooting Deployment Failures: If a deployment fails, examine the Vercel build logs. Common issues include:
    • Missing environment variables.
    • Build script errors.
    • Dependency installation problems.
    • Runtime errors that only manifest in the production environment.

The AI agents can be used to analyze these logs and suggest fixes.

4.3. Testing the Deployed Application

After a successful deployment, thorough testing is required:

  • Accessing the Site: Navigate to the Vercel-provided URL.
  • Authentication Flow: Test sign-up, login, and email confirmation processes.
  • Link Generation: Create new links and verify they are saved correctly.
  • Redirection: Test clicking on the generated short links to ensure they redirect to the correct destination and that clicks are tracked.
  • Analytics: Verify that the dashboard displays accurate data.

If issues arise, the AI can be prompted to review the code and deployment configuration. For example:


Analyze what this is and implement a clean and minimal fix.

(accompanied by screenshots or logs).

5. Advanced Considerations and Iterations

While the MVP covers core functionality, further enhancements can be made.

5.1. Enhancing Analytics

  • Visualizations: Implement more sophisticated charts and graphs for better data representation.
  • Click Details: Display detailed click information, including timestamps, sources, and potentially geolocation data.
  • Link Performance Metrics: Calculate metrics like click-through rates or conversion rates (an example aggregation query follows this list).
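
One way to produce such aggregates directly in SQL, against the schema defined earlier (a sketch; scope it to the current user in the application layer or via RLS):

-- Clicks per link and source, by day
SELECT
    l.slug,
    c.source,
    date_trunc('day', c.timestamp) AS day,
    count(*) AS clicks
FROM clicks c
JOIN links l ON l.id = c.link_id
GROUP BY l.slug, c.source, day
ORDER BY day DESC, clicks DESC;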

5.2. Security Best Practices

  • Rate Limiting: Implement rate limiting on API routes to prevent abuse.
  • Input Validation: Strengthen validation on all user inputs.
  • Database Row-Level Security (RLS): Configure Supabase RLS policies to ensure users can only access their own data (see the example policies below).
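
For example, policies restricting the links table to its owner might look like the following sketch (the clicks table needs similar treatment, and the public redirect path must still be able to read links and insert clicks, e.g. via the service role key on the server):

ALTER TABLE links ENABLE ROW LEVEL SECURITY;

-- Owners can see and create only their own links
CREATE POLICY "select own links" ON links
    FOR SELECT USING (auth.uid() = user_id);

CREATE POLICY "insert own links" ON links
    FOR INSERT WITH CHECK (auth.uid() = user_id);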

5.3. UI/UX Personality

As noted during the development process, the default UI generated by AI can sometimes feel generic. Prompting the AI with specific design style requests can yield more unique and professional results:


Give it some personality. Choose some specific design style that's not that popular nowadays but that still looks very professional and very clean.

This iterative refinement of the UI, guided by AI, ensures a polished final product.

Conclusion

Building a functional application from scratch using AI coding agents is a practical and efficient approach for engineers. By leveraging tools like Cursor, Claude, and Perplexity, developers can accelerate development cycles, automate repetitive tasks, and focus on architectural decisions and complex problem-solving. The process involves meticulous prompt engineering, iterative refinement, and a strong understanding of the underlying technologies like Next.js, Supabase, and Vercel. This deep dive demonstrates how AI can act as a powerful co-pilot, significantly enhancing productivity and enabling the rapid creation of sophisticated software solutions.