Foundations

How to actually use Authority Score, DR, DA, and Domain Score together

Semrush says your authority is 49. Ahrefs says 53. Moz says 44. DataForSEO says 51. Which is right? All of them, sort of — and none of them, exactly. Each metric measures roughly the same underlying thing (your domain's link-based authority) using different methodologies, different crawl coverage, and different scoring functions. Using just one gives you a single perspective with built-in bias. Using all four lets you triangulate. This is the practical guide to working with all four metrics: when they agree, when they disagree, and what to do with the answer.

By Gareth Hoyle
Read time: 11 min
TL;DR

The four metrics — Semrush Authority Score (AS), Ahrefs Domain Rating (DR), Moz Domain Authority (DA), and DataForSEO Domain Score — all attempt to measure the same underlying signal: link-based authority. Each uses a different crawl, different link-quality models, and a different scoring curve. A healthy rule of thumb: Ahrefs DR runs 3–5 points higher than Semrush AS for established domains; Moz DA runs 3–7 points lower; DataForSEO Domain Score tends to track Ahrefs closely. When all four agree within 5 points, the signal is robust. When variance exceeds 8 points, dig deeper — usually one of the tools is missing data the others have. The mature analytical move is to use all four to triangulate, not to pick one to trust.

Why this question matters

Most SEO conversations include exactly one authority number. Someone says "their DR is 67, ours is 49" and the room treats that as established fact. The number gets used to justify investments, set targets, brief agencies, build campaign cases.

The problem: that single number is one tool's view of the truth. It's not the truth. Ahrefs DR is what Ahrefs's crawler and scoring model produce. Semrush AS is what Semrush's crawler and scoring model produce. They differ — sometimes by 1–2 points (which is noise), sometimes by 10+ points (which is signal that something's specifically off in one tool's view of that domain).

Treating any one metric as canonical produces predictable failures.

The four-tool triangulation isn't paranoia. It's how you'd measure anything else important to the business.

What each metric actually measures

Metric 01

Semrush — Authority Score

Scale: 1–100. Scoring inputs include backlink profile (volume and quality), organic search performance, organic traffic estimates, and a spam-detection layer.

Distinguishing feature: AS factors organic traffic and ranking performance into the score, not just links. So a domain with strong rankings but a weaker link profile can still register a respectable AS, where pure-link metrics would understate it.

Strengths: Captures broader site health, not just links. Tends to be relatively stable quarter-over-quarter (less noisy than pure link metrics).

Biases: Inflates scores for domains with strong organic visibility but weak link profiles (some EEAT-heavy sites). Underweights brand-new domains that haven't yet accumulated organic visibility, even when their link profiles are good.

Best used for: Holistic competitive comparison. Site health diagnostics. Long-term trend tracking.

Metric 02

Ahrefs — Domain Rating

Scale: 1–100. Scoring is heavily focused on the backlink profile — quantity, quality, diversity, and the DR of referring domains. Logarithmic scale (the gap between DR 70 and DR 80 is much larger than the gap between DR 30 and DR 40).

Distinguishing feature: DR is the most pure-link metric of the four. It doesn't factor in organic traffic or rankings. It's a direct read on link-based authority.

Strengths: Cleanest signal of link-based authority specifically. The largest crawl among the major SEO tools, so coverage is broadest. Industry-default reference point — most SEO conversations default to DR for this reason.

Biases: Tends to run 3–5 points higher than Semrush AS for established domains. Pure link focus means a domain with great content but weak link profile registers low even when business performance is strong. Logarithmic scale makes high DRs look closer than they are (the work to go from DR 80 to DR 85 is enormous).

Best used for: Pure link authority comparison. Backlink target qualification (filtering link prospects by DR). Quarterly link-building progress tracking.

Metric 03

Moz — Domain Authority

Scale: 1–100. Original methodology dates from 2010 — Moz pioneered this whole metric category. Scoring uses a machine-learning model trained to predict ranking ability, with backlink-profile inputs.

Distinguishing feature: DA is calibrated against actual SERP performance. Moz periodically retrains the model so the metric continues to predict rankings as Google's algorithms evolve.

Strengths: Most directly correlated with actual ranking performance, by design. The longest historical dataset (the metric is older than the others). Most widely recognised in legacy enterprise SEO contracts and historical reporting.

Biases: Tends to run 3–7 points lower than Semrush AS for established domains. Smaller crawl than Ahrefs means coverage gaps for international and niche domains. Periodic recalibrations cause occasional step-changes that can confuse trend tracking.

Best used for: Ranking-prediction modelling. Long-historical comparison (where you have years of DA data already). Validating that other metrics' authority readings translate to actual SERP performance.

Metric 04

DataForSEO — Domain Score

Scale: 1–100 (normalised from a 1–1000 internal scale). Scoring uses a backlink-profile model with quality weighting based on the source domain's own score, with spam-link filtering.

Distinguishing feature: DataForSEO is a data-as-a-service provider rather than a primary SEO tool, so Domain Score is built for downstream API consumption rather than a polished interface. Less discussed in SEO marketing circles, but the underlying methodology is comparable to Ahrefs DR with different crawl coverage.

Strengths: Cheapest of the four to access at scale (per-call pricing rather than per-seat). Increasingly used by tools that triangulate or normalise — including newer SEO platforms that don't have their own crawls. Tends to track Ahrefs DR closely (within ±2 points typically).

Biases: A smaller historical dataset means trend tracking only goes back a few years. Crawl coverage is improving but still narrower than Ahrefs for some long-tail domains. The 1–1000 native scale (normalised to 1–100) means small differences look smaller than they are at the bottom of the scale.

Best used for: High-volume programmatic measurement (where per-seat pricing of Ahrefs/Semrush is prohibitive). Cross-validation against Ahrefs to detect crawl-coverage gaps.
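As a worked example of the scale note above, here is a minimal sketch of bringing the native 1–1000 score onto the 1–100 scale the other three tools use. A simple linear rescale is assumed; DataForSEO's own normalisation may differ, and `normalise_domain_score` is a hypothetical helper, not part of their API.

```python
def normalise_domain_score(native_score: int) -> float:
    """Map DataForSEO's native 1-1000 domain score onto a 1-100 scale
    for cross-tool comparison.

    Assumes a plain linear rescale (divide by 10); the provider's own
    normalisation may differ.
    """
    if not 1 <= native_score <= 1000:
        raise ValueError("native score must be in the 1-1000 range")
    return round(native_score / 10, 1)
```

Dividing by 10 keeps the mapping transparent, which matters when you later average this number against AS, DR, and DA: an opaque rescale would smuggle a fifth scoring function into the triangulation.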

Where the metrics typically agree (and what that means)

Across most domains, the four metrics agree on the broad strokes. A genuinely high-authority site (DR 75+) registers as high-authority across all four — maybe 70 on Moz DA, maybe 78 on DataForSEO Domain Score, maybe 73 on Semrush AS. The variance is real but the directional signal is consistent.

This consistency is what makes triangulation valuable. When the four agree within ~5 points, the consensus is robust. When all four say "this domain has authority around 50," you can act on that with confidence — none of them is wildly miscalibrated for that specific domain.

For most B2B brands operating in major markets with mature SEO programmes, agreement is the rule. The variance is small enough to be noise.

Where the metrics disagree (and what to do about it)

Disagreement is more interesting than agreement. When variance exceeds about 8 points across the four metrics, there's almost always something specific going on with the domain that one or more tools is misreading.

Disagreement pattern 01

One tool's coverage is incomplete

Most common pattern. A domain has substantial link equity from sources one crawl has indexed and another hasn't. Maybe a recent burst of links from international media that Ahrefs picked up but Moz hasn't crawled yet. Maybe historical links from sites Semrush deemed low-quality but Ahrefs and DataForSEO scored normally.

The signal: when one tool is significantly lower than the other three, it's usually missing data. When one tool is significantly higher, it's usually counting links the others filtered out (often correctly).

What to do: Look at the raw referring-domains count and overlap between tools. If Ahrefs sees 14,000 referring domains and Moz sees 11,000, the gap is coverage. The "true" authority is probably closer to the higher-coverage view.
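That check can be sketched as a small script. `flag_outlier_tool` is a hypothetical helper, not part of any tool's API: it flags the one tool whose score sits 8 or more points away from the median of the other three, using the variance threshold suggested in this guide.

```python
from statistics import median

def flag_outlier_tool(scores: dict, threshold: float = 8.0):
    """Flag a tool whose score sits far from the consensus of the other
    three -- a hint its crawl is missing data (if low) or counting links
    the others filtered out (if high).

    `scores` maps tool name -> authority score. The 8-point default
    follows the variance bands used in this guide.
    """
    for tool, score in scores.items():
        others = [s for t, s in scores.items() if t != tool]
        gap = score - median(others)
        if abs(gap) >= threshold:
            direction = "low" if gap < 0 else "high"
            return tool, direction, round(gap, 1)
    return None  # no single-tool outlier
```

For example, `flag_outlier_tool({"semrush_as": 51, "ahrefs_dr": 55, "moz_da": 38, "dataforseo_ds": 54})` flags Moz as the low outlier, which would prompt the referring-domains coverage check described above.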

Disagreement pattern 02

The domain just changed (acquisition, merger, redirect)

When a domain's link profile changes substantially — through acquisition, rebranding, redirect strategy, or migration — the four tools update their scoring at different speeds. Semrush might recognise the change in 30 days; Moz might take 90; Ahrefs might process it within weeks; DataForSEO depends on API call timing.

The signal: if the variance is large AND the variance has appeared recently (vs being stable), there's been a change.

What to do: Ignore single-tool readings during the transition period. Wait 90 days for all four to stabilise. Use the post-stabilisation consensus.

Disagreement pattern 03

Strong organic performance, weaker link profile

A domain that ranks well, drives traffic, but has a relatively thin link profile (often EEAT-driven sites that earn citations rather than links, or freshly-launched authoritative content). Semrush AS picks up the organic performance and inflates the authority score. The pure-link metrics (DR, DA, Domain Score) read lower.

The signal: Semrush AS is significantly higher than the other three.

What to do: Recognise this as a real positioning. The site has authority, just not through links. Pure SEO link-building targeting is less relevant; protecting and extending the EEAT/citation pattern is more relevant.

Disagreement pattern 04

Heavy spam-link history, partially filtered

A domain that earned authority through aggressive (or compromised) link-building tactics, where some tools have filtered the spam more thoroughly than others. Moz tends to be most aggressive about spam filtering; Ahrefs tends to count more of these links.

The signal: Moz DA is significantly lower than Ahrefs DR or DataForSEO Domain Score.

What to do: Take the lower number more seriously. In this scenario, Google's view is closer to Moz's than to Ahrefs's, and the Ahrefs DR overstates actual ranking ability.

Disagreement pattern 05

Niche or international domain

Domains in specific languages, geographies, or industries where one tool's crawl is structurally weaker than another's. DataForSEO and Ahrefs tend to have stronger international coverage than Moz; Semrush varies by region.

The signal: variance is consistent across multiple domains in the same niche or market — not just the one being audited.

What to do: Recognise the structural bias. For international SEO contexts, weight Ahrefs and DataForSEO higher. For US-mainstream contexts, the Moz baseline is fine.

The triangulation model — what to actually report

For each domain you measure, produce four numbers. Then produce two derived numbers:

Consensus authority (mean)

The arithmetic mean of the four metrics. This is the closest thing to "true" authority you can produce — no single tool's bias dominates, and crawl-coverage gaps partially cancel out across the four views.

Variance (standard deviation)

How much the four metrics disagree. Lower variance = more confidence in the consensus number. Higher variance = something specific is happening with this domain that's worth investigating before acting on the score.

Practical thresholds we use:

Variance | Confidence | What it means
0–4 points | High | All four tools agree. The consensus number is robust.
4–8 points | Medium | Normal variance for established domains. Trust the consensus.
8–15 points | Low | Real disagreement. One of the disagreement patterns above is in play. Investigate before acting.
15+ points | Very low | Tools are reading materially different versions of this domain. Don't rely on any single number.
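The consensus-and-variance model above can be sketched in a few lines. One assumption worth flagging: the point thresholds are applied here to the population standard deviation; a team preferring the simpler max-minus-min spread would widen the bands accordingly. The `triangulate` helper is illustrative, not drawn from any of the four tools' APIs.

```python
from statistics import mean, pstdev

# Variance bands from the table above: (upper bound in points, label).
CONFIDENCE_BANDS = [
    (4, "high"),                 # all four tools agree
    (8, "medium"),               # normal variance, trust the consensus
    (15, "low"),                 # real disagreement, investigate first
    (float("inf"), "very low"),  # don't rely on any single number
]

def triangulate(as_score, dr, da, domain_score):
    """Return (consensus, variance, confidence) for one domain:
    consensus is the arithmetic mean of the four metrics, variance is
    the population standard deviation, confidence is the band it falls in."""
    scores = [as_score, dr, da, domain_score]
    consensus = round(mean(scores), 1)
    spread = round(pstdev(scores), 1)
    confidence = next(label for cap, label in CONFIDENCE_BANDS if spread <= cap)
    return consensus, spread, confidence
```

For the high-authority example earlier (AS 73, DR 78, DA 70, Domain Score 78), this reports a consensus around 75 with a spread under 4 points, landing in the high-confidence band.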

The CFO-defensible answer

The reason this matters in practical strategic terms: when a marketing leader presents a competitive analysis or a budget case using authority numbers, the question that almost always comes is "where does this number come from?"

The single-source answer is fragile:

"Their authority is 67, ours is 49 according to Ahrefs."

The CFO has every right to ask why Ahrefs is the source of truth. They don't run Ahrefs. They don't necessarily know what DR measures. They might know that Moz exists and gives different numbers. They've been around long enough to be sceptical of single-source measurement in any other discipline.

The triangulation answer is sturdier:

"Their authority is 67, ours is 49 — that's the consensus across Semrush, Ahrefs, Moz, and DataForSEO. The four tools agree within ±3 points on both numbers, so the gap is robust. The 18-point delta is signal, not measurement noise."

That holds up. It's not aspirational sales positioning — it's how you'd measure anything else of importance. The CFO can verify the methodology, push back if they want, and ultimately make a decision based on numbers that survived cross-validation.

The cost-benefit reality

A word of honesty about what this costs. Running four tools instead of one is more expensive — each has its own per-seat or per-call pricing. For most in-house teams, running all four month by month doesn't make financial sense.

The economical pattern: run the four-tool triangulation periodically (quarterly or for high-stakes decisions). For ongoing monitoring, default to whichever single tool you've already licensed (typically Ahrefs or Semrush in mature programmes). Treat that as a reasonable approximation; verify quarterly with the full triangulation.

The sales pitch we make at visible.md is exactly this: outsourcing the four-tool triangulation to the agency that runs all four APIs is more cost-effective than licensing all four in-house. But the methodology stands regardless of who runs it. Even in-house teams can do this once per quarter using trial access or one-time pulls.

The strategic conclusion

The single-tool authority number worked when the choice of tool didn't strategically matter — when the differences between Ahrefs DR, Moz DA, and Semrush AS were small enough that the choice was ergonomic rather than analytical.

The differences are no longer small enough. As AI engines have started using authority signals to weight content for citation and training, and as buyers have increasingly used cross-source verification in their procurement processes, the costs of single-source bias have risen.

The mature programme of 2026 measures all four. Quarterly minimum. The cost is small. The defensibility benefit is large. And the analytical signal in disagreement — those patterns where tools diverge — is information that single-source measurement leaves on the floor.

Triangulate without licensing four tools

Get a Search Visibility Audit.

Every audit cross-checks authority across Semrush, Ahrefs, Moz, and DataForSEO — with consensus and variance reported per brand. The CFO-defensible answer, without licensing four enterprise tools. From $997, in 48 hours.