Build a Competitor Price Monitor in 30 Minutes (No Infrastructure Required)

A practical walkthrough: use CrawlHQ's /v1/watch and /v1/extract together to build a live competitor pricing alert system that fires to Slack the moment any competitor changes their pricing page.

CrawlHQ Team · 17 March 2026 · 7 min read

Competitor pricing changes are high-signal events. When a competitor drops prices, they’re either losing margin to acquire customers or responding to pressure. When they raise prices, there’s an opening. Either way, you want to know the moment it happens — not when your sales team stumbles across it three weeks later.

This tutorial walks through building a real-time competitor price monitor using CrawlHQ’s /v1/watch and /v1/extract APIs, with alerts delivered to Slack. No servers. No cron jobs. No infrastructure to maintain.

What We’re Building

  • Watch 5 competitor pricing pages on a weekday schedule
  • When any page changes, automatically extract structured pricing data
  • Deliver a formatted alert to a Slack channel with the before/after diff
  • Total time: about 30 minutes, including Slack setup

Prerequisites

  • CrawlHQ API key (get one free)
  • A Slack webhook URL (create one in your Slack workspace settings)
  • Node.js 18+ or Python 3.10+ for the webhook handler

Step 1: Identify Your Competitor URLs

Start with the exact URLs, not the homepage. Pricing pages usually live at:

  • /pricing
  • /plans
  • /pricing-plans
  • Sometimes buried under /enterprise or a subdomain

For this tutorial, we’ll use placeholder URLs. Swap in your actual competitors.

const COMPETITORS = [
  { name: "CompetitorA", url: "https://competitora.com/pricing" },
  { name: "CompetitorB", url: "https://competitorb.com/plans" },
  { name: "CompetitorC", url: "https://competitorc.com/pricing" },
  { name: "CompetitorD", url: "https://competitord.com/pricing" },
  { name: "CompetitorE", url: "https://competitore.io/plans" },
];

Step 2: Register Watches with extract_on_change

The power move here is using extract_on_change: true with a pricing schema. When the page changes, CrawlHQ doesn’t just tell you that something changed — it extracts the new structured pricing data and includes it in the webhook payload.

const PRICING_SCHEMA = {
  plans: [{
    name: "string",
    price_monthly: "number",
    price_annual: "number",
    features: ["string"],
    highlighted: "boolean"
  }],
  custom_pricing_available: "boolean",
  free_tier_available: "boolean"
};

async function registerWatches() {
  const CRAWLHQ_KEY = process.env.CRAWLHQ_API_KEY;
  const WEBHOOK_URL = process.env.WEBHOOK_URL; // your endpoint

  for (const competitor of COMPETITORS) {
    const response = await fetch("https://api.crawlhq.dev/v1/watch", {
      method: "POST",
      headers: {
        "X-API-Key": CRAWLHQ_KEY,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        url: competitor.url,
        schedule: "0 9 * * 1-5", // 9am weekdays
        webhook: WEBHOOK_URL,
        watch_selector: ".pricing, [class*='pricing'], [class*='plan']",
        extract_on_change: true,
        extract_schema: PRICING_SCHEMA,
        metadata: { competitor_name: competitor.name } // passed through to webhook
      }),
    });

    const data = await response.json();
    console.log(`Registered watch for ${competitor.name}: ${data.watch_id}`);
  }
}

registerWatches();

Run this once. CrawlHQ will check each URL every weekday morning and fire your webhook whenever something changes.
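When a watch fires, your endpoint receives a JSON body. Here's a sketch of the payload shape — field names are inferred from what the handler in Step 3 reads, so confirm the exact structure against the CrawlHQ webhook docs:

```javascript
// Hypothetical payload shape, inferred from the fields the Step 3 handler reads.
// Confirm exact field names against the CrawlHQ webhook documentation.
const sampleEvent = {
  event: "content_changed",           // other event types exist, e.g. check-completed-no-change
  watch_id: "w_abc123",
  url: "https://competitora.com/pricing",
  changed_at: "2026-03-17T03:30:00Z",
  metadata: { competitor_name: "CompetitorA" }, // echoed back from registration
  diff: "- Pro: $49/mo\n+ Pro: $59/mo",
  extracted: {                        // present because extract_on_change: true
    plans: [
      {
        name: "Pro",
        price_monthly: 59,
        price_annual: 590,
        features: ["API access", "Priority support"],
        highlighted: true
      }
    ],
    custom_pricing_available: true,
    free_tier_available: false
  }
};
```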

Step 3: Build the Webhook Handler

Your webhook handler receives a POST from CrawlHQ when a page changes. Here’s a minimal Express.js handler that formats the alert and posts to Slack:

import express from "express";

const app = express();
app.use(express.json());

app.post("/webhook/crawlhq", async (req, res) => {
  const event = req.body;

  // Only act on content_changed events
  if (event.event !== "content_changed") {
    return res.json({ ok: true });
  }

  const competitorName = event.metadata?.competitor_name || "Unknown";
  const diff = event.diff || "No diff available";
  const newPricing = event.extracted?.plans;

  // Format Slack message
  const slackMessage = {
    blocks: [
      {
        type: "header",
        text: {
          type: "plain_text",
          text: `🚨 ${competitorName} updated their pricing page`
        }
      },
      {
        type: "section",
        text: {
          type: "mrkdwn",
          text: `*URL:* ${event.url}\n*Detected at:* ${new Date(event.changed_at).toLocaleString("en-IN", { timeZone: "Asia/Kolkata" })}`
        }
      },
      {
        type: "section",
        text: {
          type: "mrkdwn",
          text: `*Diff:*\n\`\`\`${diff.slice(0, 500)}\`\`\``
        }
      },
    ]
  };

  // Add structured pricing if available
  if (newPricing && newPricing.length > 0) {
    const planSummary = newPricing
      .map(p => `• ${p.name}: $${p.price_monthly}/mo`)
      .join("\n");

    slackMessage.blocks.push({
      type: "section",
      text: {
        type: "mrkdwn",
        text: `*New pricing structure:*\n${planSummary}`
      }
    });
  }

  // Post to Slack
  await fetch(process.env.SLACK_WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(slackMessage),
  });

  res.json({ ok: true });
});

app.listen(3000, () => console.log("Webhook handler running on :3000"));
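One hardening step worth adding before you expose this endpoint: verify that requests actually came from CrawlHQ. Assuming CrawlHQ signs webhook bodies with an HMAC-SHA256 header (the header name and shared-secret scheme below are assumptions — check the docs for the actual mechanism), a verification helper looks like:

```javascript
import crypto from "node:crypto";

// Assumed scheme: hex-encoded HMAC-SHA256 of the request body, keyed by a
// shared secret. The header name "x-crawlhq-signature" is hypothetical.
function verifySignature(body, signatureHeader, secret) {
  const expected = crypto
    .createHmac("sha256", secret)
    .update(body)
    .digest("hex");
  const a = Buffer.from(expected, "hex");
  const b = Buffer.from(signatureHeader || "", "hex");
  // timingSafeEqual throws on length mismatch, so guard first
  return a.length === b.length && crypto.timingSafeEqual(a, b);
}

// Usage in the Express handler (note: re-serializing req.body may not match
// the raw bytes that were signed — for exact-byte signing, capture the raw
// body with express.raw() instead of express.json()):
//
// if (!verifySignature(JSON.stringify(req.body),
//                      req.get("x-crawlhq-signature"),
//                      process.env.CRAWLHQ_WEBHOOK_SECRET)) {
//   return res.status(401).json({ ok: false });
// }
```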

Step 4: Deploy (Three Options)

Option A: Vercel (easiest). Convert the handler to a Vercel serverless function and deploy in two minutes:

vercel deploy

Option B: Railway. Push to GitHub, connect the repo to Railway, done. The free tier handles this load easily.

Option C: Cloudflare Workers. The handler fits in a single Worker file and costs essentially nothing at this volume.
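For Option A, the Express app collapses into a single API route. A sketch of `api/crawlhq.js` — Vercel parses JSON bodies into `req.body` for you, and the Slack message formatting carries over unchanged from the Express version (abbreviated here to a plain-text message):

```javascript
// api/crawlhq.js — Vercel serverless function version of the webhook handler.
export default async function handler(req, res) {
  if (req.method !== "POST") {
    return res.status(405).json({ error: "POST only" });
  }

  const event = req.body || {};
  if (event.event !== "content_changed") {
    return res.status(200).json({ ok: true });
  }

  const competitorName = event.metadata?.competitor_name || "Unknown";

  // Reuse the full Block Kit formatting from Step 3 here if you want the
  // richer layout; a plain-text message keeps this sketch short.
  await fetch(process.env.SLACK_WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      text: `🚨 ${competitorName} updated their pricing page: ${event.url}`,
    }),
  });

  return res.status(200).json({ ok: true });
}
```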

Credit Cost Breakdown

With 5 competitors checked each weekday:

Item                                           Cost
Watch checks (5/day × 5 days × 4 weeks)        100 credits/month
Extract on change (assume 3 changes/month)     15 credits
Total                                          ~115 credits/month

At ₹0.40/credit on the Starter plan, that's about ₹46/month for automated competitive pricing intelligence.
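The arithmetic behind the table, spelled out — the per-operation rates here (1 credit per watch check, 5 credits per extraction) are assumptions consistent with the table above; your plan's actual rates may differ:

```javascript
// Assumed rates: 1 credit per watch check, 5 credits per extraction.
const checksPerMonth = 5 /* competitors */ * 5 /* weekdays */ * 4 /* weeks */; // 100
const extractionsPerMonth = 3; // assume ~3 pricing changes across all competitors
const credits = checksPerMonth * 1 + extractionsPerMonth * 5; // 115
const monthlyCostINR = credits * 0.4; // Starter plan: ₹0.40/credit ≈ ₹46
console.log(`${credits} credits ≈ ₹${monthlyCostINR.toFixed(2)}/month`);
```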

What’s Next

Once this is running, a few natural extensions:

  1. Historical trending — store extracted pricing data in Supabase or Postgres and chart how plans evolve over time
  2. Price comparison dashboard — use the structured data to build a live comparison table
  3. CRM trigger — when a competitor raises prices, automatically create a task in your CRM for the sales team to reach out to their customers
  4. Expand coverage — add 20 more competitors; the marginal cost is minimal
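Extension 1 doesn't need a database on day one. A minimal sketch that appends each change event's extracted pricing to a JSON Lines file — one line per snapshot, one point on the eventual trend chart. The file path is arbitrary; swap the file writes for a Supabase or Postgres insert when you want real querying:

```javascript
import fs from "node:fs";

// Append one pricing snapshot per content_changed event.
function recordSnapshot(event, path = "pricing-history.jsonl") {
  const row = {
    competitor: event.metadata?.competitor_name,
    url: event.url,
    changed_at: event.changed_at,
    plans: event.extracted?.plans ?? [],
  };
  fs.appendFileSync(path, JSON.stringify(row) + "\n");
  return row;
}

// Read the full history back as an array of snapshots.
function loadHistory(path = "pricing-history.jsonl") {
  if (!fs.existsSync(path)) return [];
  return fs.readFileSync(path, "utf8").trim().split("\n").map(JSON.parse);
}
```

Call `recordSnapshot(event)` from inside the webhook handler, right before posting to Slack, and you have a trend dataset for free.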

The monitor we’ve built here is genuinely production-grade. It handles JavaScript-rendered pages, uses semantic extraction that survives site redesigns, and delivers structured data rather than just “something changed.”


Full source code for this tutorial is available on GitHub.

CrawlHQ Team
Building India's web data API platform. Previously: data engineering, growth engineering, and too much time on HN.


Ready to build? 500 free credits. No credit card. API key in 30 seconds.