Most URL shortener dashboards open on a number — usually a large one. Total clicks since creation, displayed in a badge big enough to screenshot. That number is there because it is easy to produce and satisfying to look at. It is also nearly impossible to act on.
This post is about the metrics worth measuring, the ones to ignore, and a short weekly ritual that lets a single marketer extract real signal from a link library of any size.
## The two camps: vanity versus actionable
The vanity camp contains any metric that aggregates across all time without context. "Total clicks: 48,230" tells you that a link exists and that people have followed it. It does not tell you whether those clicks happened last week or spread over two years, whether they came from real visitors or from search-engine crawlers and uptime monitors, whether any of them converted into the thing you actually want, or whether click volume is growing or collapsing.
The actionable camp contains metrics bounded by a time window, segmented to a meaningful level, and attached — however loosely — to a downstream outcome. "Unique visitors from organic search in Germany, last 30 days, and what fraction clicked again within seven days" is a different question from "total clicks all time". The first has an answer that can change your next decision. The second mostly confirms that you have been running a link for a while.
The distinction matters because dashboards that lead with vanity numbers train you to ignore your analytics. You check the total, feel something in the vague direction of satisfaction or alarm, and close the tab. The dashboards worth opening every week lead with time-windowed metrics that answer specific operational questions.
## Metrics worth tracking
### Click versus unique click
Every click event recorded at the redirect layer has an IP address attached. Unique visitor counts — the uniq(ip) aggregation in Elido's ClickHouse queries — collapse all requests from the same network address into a single visitor. A single person who clicks your link four times in a day counts as one visitor with four clicks; four separate people who each click once count as four visitors with four clicks. Same click total, very different signals.
The ratio of total clicks to unique visitors is your repeat-engagement rate. A link with 1,000 clicks and 950 unique visitors has almost no repeat traffic — most people clicked once and left. A link with 1,000 clicks and 300 unique visitors has significant repeat engagement: something is bringing people back. Whether that is desirable depends entirely on the use case. For an onboarding flow, repeat clicks may mean users are confused. For a content series, they may mean the link has been bookmarked.
One note on the "cookieless counting" framing that appears in many analytics marketing materials: Elido's unique counting uses IP address as the identifier (uniq(ip) in the analytics repo at services/analytics-api/internal/clickhouse/repo.go). This is not a cross-session fingerprint — it is a coarser network-level heuristic. Two people behind the same NAT will count as one visitor; the same person on mobile and desktop will count as two. For campaign-level attribution it is a reasonable approximation. For individual-level identity resolution it is not.
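The relationship between total clicks, unique visitors, and the repeat-engagement ratio can be sketched in a few lines. This is an illustration only; ClickHouse's uniq(ip) is an approximate aggregate, emulated here with an exact set:

```python
def engagement_summary(click_ips):
    """Summarize a list of per-click IP strings into total clicks,
    unique visitors (distinct IPs), and the repeat-engagement ratio."""
    total = len(click_ips)
    unique = len(set(click_ips))
    ratio = total / unique if unique else 0.0
    return {"clicks": total, "unique_visitors": unique, "repeat_ratio": ratio}

# One person clicking four times vs. four people clicking once each:
same_person = ["10.0.0.1"] * 4
four_people = ["10.0.0.1", "10.0.0.2", "10.0.0.3", "10.0.0.4"]
print(engagement_summary(same_person))   # 4 clicks, 1 visitor, ratio 4.0
print(engagement_summary(four_people))   # 4 clicks, 4 visitors, ratio 1.0
```

The ratio is the repeat-engagement rate discussed above: near 1.0 means almost everyone clicked once and left; well above 1.0 means something is bringing people back.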
### Geo and device breakdown
The breakdown queries — ByCountry, ByDevice, ByBrowser in the analytics API — answer one operational question: does your traffic match the expected audience for this link?
If you ran a Germany-targeted email campaign and the geo breakdown shows 80% of traffic from outside Germany, then the email was forwarded, the link was shared on social media in a different market, or your targeting was off. If a link intended for a mobile checkout flow shows 40% desktop traffic, someone is sending it to desktop users, and the experience on the other end may not be designed for them.
Device data is also useful for diagnosing QR code campaigns specifically. A QR campaign from a print placement should show near-100% mobile traffic. Significant desktop traffic from a QR link means the code is being encountered digitally — in an email, on a website, in a shared screenshot — not scanned from physical media.
The geo data in Elido resolves to country code at redirect time from the full IP, which is then discarded before the event is persisted. The stored record contains only the ISO-3166 country code, not the full IP. City-level data is not available in the current schema.
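The QR sanity check described above is easy to automate. A sketch, assuming a per-device count dict as the input (the exact shape of the ByDevice response is an assumption here):

```python
def qr_desktop_alert(device_counts, threshold=0.10):
    """Given per-device click counts for a QR-sourced link, return the
    desktop share and whether it exceeds the alert threshold.
    A QR code scanned from print should be near-100% mobile, so a
    meaningful desktop share suggests the code circulates digitally."""
    total = sum(device_counts.values())
    if total == 0:
        return 0.0, False
    desktop_share = device_counts.get("desktop", 0) / total
    return desktop_share, desktop_share > threshold

share, alert = qr_desktop_alert({"mobile": 940, "desktop": 55, "tablet": 5})
# 5.5% desktop: within expectations for a print QR placement
```

The 10% threshold is an illustrative default, not a recommendation; tune it against your own campaign history.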
### Referrer and UTM attribution
The referrer breakdown (ByReferrer) groups clicks by the host portion of the Referer header. This tells you which sites and applications are sending traffic to your link — not which campaign you intended to run, but which traffic sources are actually active.
Direct traffic (no referrer) is usually the largest single bucket and the least informative. In practice it is a blend of people who typed or pasted the URL directly, traffic from apps that strip referrer headers (most native mobile apps, email clients, and messaging apps), and traffic where the Referer header was dropped in transit by referrer policies or privacy protections.
UTM parameters sit on the destination URL rather than on the short link itself. The redirect passes them through intact: s.elido.me/spring → https://shop.example.com/spring?utm_source=email&utm_medium=newsletter. Your analytics platform reads the UTMs from the landing page URL, not from the shortener. The shortener's referrer breakdown tells you where clicks come from before the destination page; the destination page's UTM data tells you which campaign sent them.
The combination is where attribution closes. Referrer from the shortener confirms a click happened; UTM parameters on the destination confirm which campaign claimed it. If you see clicks in the shortener's referrer log from mail.google.com but no corresponding UTM-attributed sessions in your analytics platform, the destination page is losing the UTM values — usually because of a redirect between the landing page URL and the actual destination, or because of a consent banner that resets the session.
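The pass-through mechanic can be sketched with standard URL tools. This is an illustration of the general behavior, not Elido's actual redirect code, and the collision rule (request parameters winning over parameters baked into the destination) is an assumption:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def build_destination(destination, short_link_query=""):
    """Merge query parameters from the short-link request onto the
    configured destination URL, preserving any UTMs already baked in.
    On key collisions, the incoming request's value wins (an
    assumption for this sketch; real redirectors vary)."""
    parts = urlsplit(destination)
    params = dict(parse_qsl(parts.query))
    params.update(dict(parse_qsl(short_link_query)))
    return urlunsplit(parts._replace(query=urlencode(params)))

url = build_destination(
    "https://shop.example.com/spring?utm_source=email&utm_medium=newsletter"
)
# UTMs on the destination survive the redirect untouched
```

Either way, the key property is the one described above: the UTMs live on the destination URL and arrive at the landing page intact, where your analytics platform reads them.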
### Time-of-day and day-of-week patterns
The heatmap endpoint returns click counts bucketed by hour-of-day (0–23) and day-of-week (Monday through Sunday). For most campaign types, this is the most useful single visualization in the analytics stack.
The pattern tells you when your audience is active in the context where they encounter your link. Newsletter links peak on the morning the email was sent and again two or three days later as people work through their inbox backlog. QR codes on retail displays peak at Saturday lunch. Social media links peak in the evenings on weekdays and plateau through the weekend. Time-gated offers — "48-hour flash sale" — produce a sharp spike and a cliff.
The practical use is scheduling. If you are writing copy for a newsletter that includes a short link, knowing that your list reads on Tuesday morning means sending on Monday night at 9pm is not an intuition; it is a pattern you have seen in the click heatmap three times.
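Bucketing raw click timestamps into the (day-of-week, hour-of-day) grid the heatmap uses is straightforward; a minimal sketch:

```python
from collections import Counter
from datetime import datetime

def heatmap_buckets(timestamps):
    """Bucket click timestamps by (day_of_week, hour_of_day).
    datetime.weekday(): Monday=0 ... Sunday=6, matching the
    Monday-through-Sunday ordering described above."""
    return Counter((ts.weekday(), ts.hour) for ts in timestamps)

clicks = [
    datetime(2024, 4, 2, 9, 15),   # a Tuesday, 09:xx
    datetime(2024, 4, 2, 9, 50),   # a Tuesday, 09:xx
    datetime(2024, 4, 6, 13, 5),   # a Saturday, 13:xx
]
buckets = heatmap_buckets(clicks)
# buckets[(1, 9)] == 2: the Tuesday-morning cell of the grid
```

One caveat worth deciding deliberately: whether timestamps are bucketed in UTC or the workspace's local timezone changes where the peak lands on the grid.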
For campaigns where the destination changes or where you want to test different landing pages at different times, Elido's smart routing supports time-of-day rules at the link level. A link can route to different destinations before and after a given hour on a given day. The heatmap tells you where the peak is; the routing rule lets you act on it without reprinting or resending.
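A time-of-day routing rule reduces to a small lookup at redirect time. The rule shape below (weekday, start hour, end hour, destination) is hypothetical, chosen for illustration; Elido's actual rule format may differ:

```python
from datetime import datetime

def resolve_destination(rules, default, now):
    """Pick a destination from a list of time-window rules.
    Each rule is (weekday, start_hour, end_hour, url); the first
    rule whose window contains `now` wins, else the default."""
    for weekday, start, end, url in rules:
        if now.weekday() == weekday and start <= now.hour < end:
            return url
    return default

# Route the Saturday-lunch peak seen in the heatmap to a dedicated page:
rules = [(5, 11, 14, "https://example.com/saturday-lunch-offer")]
resolve_destination(rules, "https://example.com/default",
                    datetime(2024, 4, 6, 12, 30))
# a Saturday at 12:30 falls in the 11:00-14:00 window, so the offer page wins
```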
### Conversion tracking and server-side pixels
Click volume is a leading indicator; a conversion is the event that actually matters. The two are related but often poorly correlated. A link with low clicks and a 40% conversion rate is more valuable than a link with high clicks and a 2% conversion rate.
Elido's conversion tracking connects a click record to a downstream event via a postback. A purchase, form submission, or app install on the destination page fires a server-to-server call that links the outcome back to the originating click. This is done server-side to avoid the attribution loss that comes from Safari's ITP stripping cookies on cross-site redirects.
The pixel configuration in services/click-ingester/internal/pixels/pixels.go supports Meta CAPI and TikTok Events API. Credentials are registered once at the workspace level; the click-ingester fires the platform's conversion API call for each qualifying click event without requiring any client-side script at the redirect layer. The distinction matters: a redirect that injects a pixel into the browser before delivering the destination URL creates a GDPR consent obligation; a server-to-server call keyed on data the user has already shared with the destination platform is architecturally different. The GDPR-friendly shortener guide covers this distinction in more detail.
For conversion rates in the analytics dashboard, the useful metric is not the absolute conversion count but the conversion rate per referrer source. If 8% of visitors who clicked from your email newsletter converted but only 1.2% of visitors who found the link through social sharing converted, the email list is driving qualified traffic and the social audience is not. That is a decision about where to invest in the next campaign, not a footnote.
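Computing per-referrer conversion rates from the two breakdowns is a one-liner; a sketch using the numbers from the example above:

```python
def conversion_rate_by_referrer(clicks, conversions):
    """clicks and conversions are dicts keyed by referrer host.
    Returns conversion rate per referrer, skipping zero-click rows."""
    return {
        ref: conversions.get(ref, 0) / n
        for ref, n in clicks.items() if n > 0
    }

rates = conversion_rate_by_referrer(
    clicks={"mail.google.com": 500, "t.co": 2500},
    conversions={"mail.google.com": 40, "t.co": 30},
)
# {"mail.google.com": 0.08, "t.co": 0.012}
# The email list converts at 8%, the social audience at 1.2%
```

Note the denominator: because the dashboard counts are bot-filtered (next section), this is conversions per human click, which is the ratio you actually want.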
### Bot-filtered versus raw counts
Every analytics number you see in the Elido dashboard is bot-filtered by default. The edge-redirect service runs User-Agent detection before deciding whether to emit a click event to the Kafka stream at all. The bot detection in services/edge-redirect/internal/bot/bot.go matches against a list of known crawler signatures — Googlebot, Bingbot, Slackbot, Discordbot, uptime monitors, curl, wget, scripting libraries — and suppresses the click event for anything that matches. Requests with no User-Agent are also suppressed.
A separate suspicion scoring layer (services/edge-redirect/internal/suspicion/suspicion.go) marks human-looking but uncertain traffic — requests missing both User-Agent and Accept-Language, per-IP click bursts above a rate threshold — with an is_suspicious flag that propagates into the ClickHouse schema as is_suspicious and suspicion_reasons. Analytics queries by default filter out suspicious rows.
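The shape of a User-Agent filter like this is simple to sketch. The pattern list below is a small hypothetical subset for illustration; the real signature list in services/edge-redirect/internal/bot/bot.go is longer:

```python
import re

# Hypothetical subset of crawler signatures, for illustration only
BOT_PATTERNS = re.compile(
    r"(googlebot|bingbot|slackbot|discordbot|curl|wget|python-requests)",
    re.IGNORECASE,
)

def should_emit_click(user_agent):
    """Suppress the click event for known crawlers and for requests
    with no User-Agent at all, mirroring the behavior described above."""
    if not user_agent:
        return False
    return not BOT_PATTERNS.search(user_agent)

should_emit_click("Mozilla/5.0 (iPhone; ...) Safari/604.1")  # True
should_emit_click("Slackbot-LinkExpanding 1.0")              # False
should_emit_click("")                                        # False
```

Filtering happens before the event reaches the stream, which is why bot clicks never show up in the raw ClickHouse rows at all; suspicion flags, by contrast, ride along with the row and are filtered at query time.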
The practical implication: if you run a link to a page that ranks in Google search, the click count in your shortener dashboard will be much lower than the impression count in Search Console. All Googlebot verification requests — and there are many — are filtered before they reach your analytics. Your shortener click count is closer to a human redirect count than a raw HTTP request count. This is the right denominator for conversion rate calculations.
Raw counts that include bot traffic can overstate click volume by 20–60% depending on link type. Links used in email with plain short URLs and links embedded in open web content attract very different crawler profiles. "Total clicks" that include bots are not a meaningful metric for any decision.
### A/B variant performance
When a link has multiple destination variants, Elido's edge selects among them at redirect time using weighted random selection (or round-robin rotation where configured). The selected destination URL is recorded in the destination column of the click_events table.
The ByDestination breakdown query groups click counts by the resolved destination URL. For a link with two variants — variant A at 50% weight and variant B at 50% weight — the destination breakdown shows how many actual clicks each variant received. Over a few hundred clicks, the distribution should approximate 50/50; the deviation from expected weight is itself a signal (extreme skew may indicate a bot-heavy click pattern against one variant).
Connecting variant performance to conversion outcomes requires postback data from the destination page, but even without it, time-windowed click counts per destination can indicate engagement: if variant A generates 10% more clicks than variant B on a 50/50 split, one possible explanation is that variant A's landing page is being shared or bookmarked at a higher rate (people returning to the destination URL directly), which is a signal about content quality independent of conversion.
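Both halves of this section, the weighted selection and the deviation check, fit in a short sketch. The weighted pick uses random.choices as a stand-in for whatever the edge actually does:

```python
import random

def pick_variant(variants, rng=random):
    """Weighted random selection among (url, weight) pairs."""
    urls = [u for u, _ in variants]
    weights = [w for _, w in variants]
    return rng.choices(urls, weights=weights, k=1)[0]

def split_deviation(observed, weights):
    """Max absolute deviation between observed click share and
    configured weight share. A large value on a supposedly even
    split is the bot-skew signal mentioned above."""
    total_obs = sum(observed.values())
    total_w = sum(weights.values())
    return max(
        abs(observed.get(v, 0) / total_obs - w / total_w)
        for v, w in weights.items()
    )

split_deviation({"A": 520, "B": 480}, {"A": 50, "B": 50})
# 0.02: a 52/48 outcome on a 50/50 split, unremarkable at this volume
```

What counts as "extreme" skew depends on sample size; over a few hundred clicks, deviations of a few percentage points are expected noise.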
### Click decay curve
The click decay curve — clicks per day plotted from the day a link was created — has a characteristic shape for each traffic source type.
Email links spike sharply on the send date and decay within 48–72 hours to near-zero. Organic social shares spike on the day of posting and sustain at a lower level for several days as the post recirculates through different time zones. SEO-supporting links that rank for a query build slowly and sustain at a roughly flat level until the ranking changes. QR codes on physical materials show low-but-persistent baseline traffic that reflects the ongoing circulation of the physical artifact.
Knowing the expected decay shape for a link type lets you detect anomalies. An email link that shows a second spike 10 days after the send date was either forwarded at scale, mentioned in another publication, or picked up by an aggregator. A QR link on packaging that shows a sharp drop at a specific date may indicate a product recall, a retail clearance, or a distribution channel change.
The timeseries endpoint in the analytics API (/workspaces/{id}/timeseries) supports both hourly and daily bucketing. For decay analysis, daily buckets over a 90-day window give the curve shape clearly. Hourly buckets are useful for the first 72 hours of a campaign where the intraday pattern matters (email send-time optimization, for example).
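A crude second-spike detector over daily buckets is enough to surface the anomalies described above. The window and factor are arbitrary illustration values, not tuned recommendations:

```python
def decay_anomalies(daily_clicks, baseline_window=3, factor=3.0):
    """Flag day indices whose click count exceeds `factor` times the
    mean of the preceding `baseline_window` days: a crude second-spike
    detector for email-style decay curves."""
    flagged = []
    for i in range(baseline_window, len(daily_clicks)):
        window = daily_clicks[i - baseline_window : i]
        baseline = sum(window) / baseline_window
        if baseline > 0 and daily_clicks[i] > factor * baseline:
            flagged.append(i)
    return flagged

# Email send on day 0, normal decay, then an unexplained spike on day 10:
series = [900, 300, 80, 20, 10, 8, 6, 5, 5, 4, 260, 40]
decay_anomalies(series)  # [10] — the forwarded-at-scale signature
```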
## What to ignore
All-time cumulative totals. A counter that has been incrementing for two years tells you about your history, not your current performance. It also compounds all the noise — bot traffic from before you had bot filtering configured, campaign traffic from links that are no longer active, experimental links you created and forgot about. Cut the time window to the last 30 days and the number becomes a campaign-cycle metric. Keep the "all time" badge on your dashboard and you will almost certainly ignore it except when you want to show someone a big number.
Unfiltered raw click counts. Any metric that includes bot and crawler traffic is measuring your exposure to the internet's automated traffic, not your audience's behavior. If your dashboard offers a "raw" versus "filtered" toggle, use filtered for all operational decisions. Raw data belongs in your data warehouse for debugging link infrastructure issues, not on your weekly review screen.
Country totals without time context. "Top country: United States (41%)" is data. "United States share grew from 28% to 41% in the last two weeks, while Germany share fell from 35% to 22%" is signal. The absolute top-country ranking at a point in time tells you approximately nothing about whether your targeting is working or shifting.
Social-style engagement scores. Some platforms produce a proprietary "engagement score" or "link health index" that combines multiple signals into a single number. These scores are not standardized across platforms, not auditable, and not connected to any outcome metric in your stack. They are dashboard cosmetics. The underlying signals — click rate, unique visitor fraction, referrer distribution — are the actual data.
## GDPR note: measuring without storing full IPs
Every metric in this post can be produced without storing a full IP address. Geo resolution, bot detection, and rate-limiting all run at redirect time on the full IP. The full IP is discarded before the event is written to ClickHouse. What persists is the /24 network prefix for IPv4 and /48 for IPv6 — sufficient for the uniq(ip) approximation in unique visitor counts, insufficient to identify an individual.
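The truncation rule is mechanical: zero out the host bits before the event is persisted. A sketch using Python's ipaddress module:

```python
import ipaddress

def truncate_ip(ip_str):
    """Zero out the host bits before persistence: /24 for IPv4,
    /48 for IPv6, matching the retention rule described above."""
    ip = ipaddress.ip_address(ip_str)
    prefix = 24 if ip.version == 4 else 48
    net = ipaddress.ip_network(f"{ip_str}/{prefix}", strict=False)
    return str(net.network_address)

truncate_ip("203.0.113.77")                   # "203.0.113.0"
truncate_ip("2001:db8:85a3::8a2e:370:7334")   # "2001:db8:85a3::"
```

Everything downstream, including the uniq(ip) visitor counts, operates on these truncated values, which is why unique visitors are a network-level approximation rather than an individual-level count.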
This is the default behavior, not an opt-in. You do not need to configure IP truncation in your workspace settings; the pipeline is built this way at the ingestion layer.
The consequence for analytics: unique visitor counts are network-level approximations, not individual-level identities. Two people on the same home network — a household's shared WiFi — count as one unique visitor. One person on mobile and later on a home broadband connection counts as two. For campaign-scale measurement, these approximations are tolerable. For individual-level attribution, they are not the right tool, and the right tool for that (first-party identifiers tied to logged-in users) requires a direct relationship with the user that is outside the redirect layer's scope.
For a detailed analysis of what GDPR requires of a URL shortener's data handling, the GDPR-friendly shortener post covers Articles 5, 6, and 28 with concrete procurement questions.
## A short weekly analytics ritual
The goal of a weekly analytics review is to answer three questions in under 15 minutes:
1. Is traffic trending in the right direction?
Open the timeseries view for your workspace, set the window to 90 days, and look at the shape. Flat, growing, or declining? Does the trend line match what you know about your campaigns? If there is a spike you cannot explain, trace it to specific links using the top-links breakdown before closing the tab.
2. Are the traffic sources what you expected?
Check the referrer breakdown and the geo breakdown for the same 90-day window. Are the top referrers the channels you were investing in? Is the geographic distribution consistent with your target market? A significant referrer source you did not expect — a newsletter aggregator, a forum, a content site — is worth knowing about. It may be worth cultivating.
3. Are the links with active conversion tracking performing?
For any link with a postback or pixel configured, check the conversion count in the current campaign window. Not the all-time total — the current window. If conversion rate has dropped materially from the previous window, there are three likely causes: the destination page changed, the traffic source mix changed, or something in the fulfillment flow broke. All three are actionable and none of them would surface from looking at click counts alone.
This ritual takes longer when something is wrong. That is the point. The metrics above are diagnostic tools: most weeks, the timeseries is fine, the sources are expected, and the conversion rate is stable. On the weeks when one of those things is off, you want to find out on Monday, not a month later when the campaign post-mortem is already written.
The pricing page has the breakdown of which analytics features are available on each plan tier. Timeseries, geo, device, referrer, and top-links breakdown are on all paid plans. Heatmap, funnel, cohort retention, conversion tracking, and CSV export are on Pro and above.