May 15, 2026

What Does eNPS Mean for US Law Firms, and Why Do Generic 2026 Benchmarks Mislead Managing Partners?

Shivani Shah

A managing partner at a 320-attorney US firm asked his firm administrator for “one number that tells me how the firm feels.”

She brought him an eNPS score: +18. He compared it to the Culture Amp 2026 benchmark her HR vendor cited — professional services median +21 — and concluded the firm was slightly below average but fine. Six months later, the firm lost five mid-level associates from two practice groups in the span of a quarter. The eNPS score had been “fine” the whole time.

The eNPS number wasn’t wrong. The benchmark was. “Professional services” in the Culture Amp dataset is dominated by consulting firms, accounting firms, and IT services companies — organizations with fundamentally different working conditions, hierarchies, and trust dynamics than US law firms. A +18 attorney eNPS at an Am Law firm in 2026 is not roughly average. It is, in our experience administering engagement surveys for law firms for nearly four decades, a meaningful early-warning signal.

Employee Net Promoter Score (eNPS) has exploded as a metric in 2026. AIHR, Lattice, Leapsome, Culture Amp, Perceptyx, ContactMonkey, and a dozen other vendors have published comprehensive eNPS guides in the past 90 days. Every single one is written for generic corporate HR. None of them addresses how the metric should be interpreted at a US law firm, where the structural features that produce the score — partnership track anxiety, billing pressure, the partner-associate trust gap, anonymity skepticism — are different from the structural features of the corporations the benchmarks are built on.

This guide covers what eNPS is, why generic 2026 benchmarks mislead law firm leadership, how to run an eNPS program properly at a US law firm, and what a meaningful eNPS score actually looks like in the legal industry. SRA has administered confidential engagement surveys, including eNPS-based pulse measurement, for US law firms since 1987. The benchmark data and design principles below are drawn from that work.

What eNPS is (the basics, fast)

Employee Net Promoter Score is a single-question employee engagement metric adapted from Fred Reichheld’s customer NPS methodology (developed at Bain & Company in 2003). The question is:

“On a scale of 0 to 10, how likely are you to recommend this firm as a place to work to a friend or colleague?”

Respondents are categorized into three groups based on their score, and the eNPS calculation subtracts the percentage of Detractors from the percentage of Promoters.

| Category | Score range | What they signal |
| --- | --- | --- |
| Promoters | 9–10 | Actively recommend the firm; high engagement, high retention probability |
| Passives | 7–8 | Satisfied enough to stay but not enthusiastic; vulnerable to outside offers |
| Detractors | 0–6 | Actively dissatisfied; signal flight risk or already disengaged |

eNPS formula: % Promoters − % Detractors. Passives are excluded from the calculation but count toward the total respondent base. The score ranges from −100 to +100.

Example: 200 respondents. 80 score 9 or 10 (40% Promoters). 70 score 7 or 8 (Passives, excluded). 50 score 0 to 6 (25% Detractors). eNPS = 40 − 25 = +15.
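The arithmetic above can be sketched in a few lines of Python. This is a minimal illustration of the standard eNPS formula, not any vendor's implementation:

```python
def enps(scores):
    """Compute eNPS from a list of 0-10 ratings.

    Promoters score 9-10, Detractors 0-6; Passives (7-8) count
    toward the respondent base but not toward the score.
    """
    if not scores:
        raise ValueError("no responses")
    n = len(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * promoters / n) - round(100 * detractors / n)

# The worked example: 200 respondents, 80 Promoters, 70 Passives, 50 Detractors
scores = [9] * 80 + [7] * 70 + [5] * 50
print(enps(scores))  # 15
```

Note that the Passives still appear in `n`, which is why adding Passive responses alone lowers neither component but shrinks both percentages.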

The appeal of eNPS is its simplicity. One question, one number, one chart leadership can read in 30 seconds. Response rates run materially higher than traditional 50-question engagement surveys — typically 70–85 percent versus 30–50 percent. For managing partners who want a single tracking metric for firm sentiment over time, eNPS is genuinely useful.

It is also genuinely misleading when used without context. Most of the confusion at US law firms in 2026 comes from importing generic benchmarks without thinking about why those benchmarks are wrong for legal.

Why generic 2026 eNPS benchmarks don’t work at US law firms

Most published 2026 eNPS benchmarks come from a small number of large-vendor datasets: Culture Amp’s January 2026 benchmark drawn from ~102 million responses across 5,000 organizations; Perceptyx’s sector reports; Qualtrics’ industry data; QuestionPro’s 2025 sector tables. The headline benchmarks most often cited for “professional services” run between +18 and +25 for 2026.

Four structural features make these benchmarks misleading when applied to US law firms.

1. “Professional services” in benchmark datasets is mostly consulting and accounting, not law. The Culture Amp 2026 professional services median of +21 is drawn primarily from management consulting firms, accounting firms, and IT services organizations. These organizations have different hierarchies (manager-employee, not partner-associate), different career models (up-or-out promotion versus partnership track), and different trust dynamics around feedback. A law firm comparing itself to a +21 “professional services” benchmark is comparing itself to organizations that look nothing like it.

2. Partnership track anxiety depresses scores systemically. Associates at a US law firm are, on average, less likely than corporate employees to give 9 or 10 ratings, regardless of how good the firm actually is. The reasons are structural: associates are competing in a tournament for a small number of partnership slots, they are aware that aggregate sentiment data could affect firm decisions, and they are reluctant to be unambiguously enthusiastic about a workplace where the realistic outcome for most of them is that they will leave before partnership consideration. This depresses Promoter percentages firm-wide and produces lower eNPS scores than equivalently functional non-legal organizations.

3. Billing pressure compresses the score distribution. Lawyers operating under 2,000+ billable hour expectations have a different relationship with their workplace than corporate employees with weekly hour caps. The compressed score range — fewer enthusiastic 9-10s, more 6-7s in the Detractor/Passive zone — is partly a feature of billing pressure, not a verdict on firm quality. The same firm running 1,950-hour expectations versus 2,150-hour expectations will see materially different eNPS scores even if every other variable is constant.

4. Anonymity skepticism is higher at law firms. Associates at US law firms are more skeptical of survey anonymity than corporate employees, for entirely rational reasons: small enough cohorts that responses can be triangulated, anxiety about anything appearing in a firm system, and history of being told things are confidential that turned out not to be. Anonymity skepticism shows up as more Passive responses (a safer rating than 9-10 or 0-6) and as silent non-responses, both of which distort the eNPS calculation downward.

Practically, this means that the same eNPS score has different meaning at a law firm than it does in published professional services benchmarks. Here is how the categories shift when calibrated against a US law firm dataset.

| Generic 2026 professional services benchmark | Equivalent meaning at a US law firm | Likely interpretation |
| --- | --- | --- |
| Below 0 | Below −8 | Crisis — immediate action required |
| 0 to +10 (“poor”) | −8 to +5 | Below industry average; meaningful concerns present |
| +10 to +30 (“average”) | +5 to +20 | Typical range for US law firms |
| +30 to +50 (“good”) | +20 to +35 | Strong performance for a US law firm |
| +50+ (“world-class”) | +35+ | Exceptional — rare in legal industry |

The numbers in the right column are directional, drawn from SRA’s administration of engagement programs at US law firms over the past several decades. The benchmarks shift by firm size, practice mix, and geography, and the most useful comparison is always year-over-year movement at the same firm rather than absolute comparison to any external dataset. But the directional point holds: a +18 eNPS at a US law firm is not “slightly below the +21 professional services average.” It is solidly in the middle of the law firm range and may not warrant the alarm the generic benchmark would suggest.

The corollary is also true. A +35 eNPS at a US law firm is genuinely strong performance. A +35 eNPS measured against the Culture Amp generic benchmark would read as merely “above average.” Using the wrong benchmark causes firms to undervalue real engagement strengths and overweight ordinary fluctuations.

How to run eNPS at a US law firm: timing, anonymity, and what to ask alongside it

eNPS is designed to be simple. At a law firm, that simplicity needs to be paired with operational discipline to produce data you can actually act on.

Cadence. Quarterly is the most common rhythm at law firms, typically supplemented by a fuller annual engagement survey. Monthly is too frequent and produces survey fatigue, particularly with associates who already feel over-asked. Annual-only loses the trend signal that makes eNPS useful in the first place.

Anonymity architecture. This is the single most consequential design choice. Associates need to believe their responses are genuinely anonymous, not just be told they are. Three design principles matter: (1) minimum segment size of 7–10 respondents before any sub-cohort score is reported, (2) external administration by a third party rather than internal HR, (3) transparent communication about what data is collected, how it is aggregated, and who sees it. The firms that get this right see response rates above 75 percent and meaningful Detractor representation; the firms that get it wrong see suspiciously high Passive percentages and low response rates.

The follow-up question. eNPS without an open-text follow-up is almost useless. The score tells you where you are; the follow-up tells you why. The most useful follow-up question is: “What is the single most important reason for your score?” Not “what do you like and dislike,” not “how could the firm improve” — those produce diffuse answers. The single-most-important-reason framing produces specific, themed responses that can drive action.

Segmentation. A firm-wide eNPS of +18 can hide an associate eNPS of −5 and a partner eNPS of +45. The aggregate number is operationally misleading. Segment by tenure (first-, second-, third-year associates separately if cohort size permits), practice group, office, and partnership track status. Do not segment in a way that compromises anonymity, but segment far enough to see where the real signal lives.
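The two design constraints above — segment far enough to find the signal, but suppress any cohort too small to protect anonymity — can be sketched together. The cohort names and the floor of 8 respondents here are illustrative; the minimum-segment-size principle is the constraint being enforced:

```python
def enps(scores):
    """Standard eNPS: % Promoters (9-10) minus % Detractors (0-6)."""
    n = len(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * promoters / n) - round(100 * detractors / n)

def segment_report(responses, min_cohort=8):
    """Report eNPS per segment, suppressing any segment below the
    anonymity floor so small cohorts cannot be triangulated."""
    by_segment = {}
    for segment, score in responses:
        by_segment.setdefault(segment, []).append(score)
    return {
        segment: (enps(scores) if len(scores) >= min_cohort else None)
        for segment, scores in by_segment.items()
    }

# Illustrative data: the 3-person partner cohort is suppressed (None),
# the 10-person associate cohort is reported.
responses = [("associates", s) for s in [9, 9, 6, 7, 8, 5, 9, 10, 6, 7]] \
          + [("partners", s) for s in [10, 9, 9]]
print(segment_report(responses))  # {'associates': 10, 'partners': None}
```

Suppression returns `None` rather than omitting the segment, so the report makes visible that a cohort exists but is too small to score — a useful prompt to merge adjacent cohorts rather than drop them silently.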

Action-and-loop-close. Running eNPS without closing the loop is worse than not running it at all. Associates who give honest feedback and see no action become more cynical than associates who were never asked. Within 60–90 days of every eNPS cycle, leadership should communicate what the data showed, what actions are being taken, and what the next measurement point is. This is the discipline most firms skip and the discipline that determines whether eNPS produces value or just produces a number.

Running eNPS as part of a broader engagement program

eNPS is most useful when it is one signal in a broader engagement and review infrastructure, not a standalone metric. The single-question score tells leadership how the firm feels. The open-text follow-ups, the engagement survey, the upward review data, and the formal evaluation findings tell leadership what to do about it.

SRA administers eNPS as part of confidential firm engagement programs for US law firms. The score is calibrated against a legal-industry-specific benchmark dataset, the open-text follow-ups are thematically analyzed by team members who understand legal industry dynamics, and the program runs alongside (but architecturally separate from) the firm’s formal performance review and partner evaluation systems.

Schedule a consultation on eNPS and engagement program design → Explore SRA’s Firm Engagement Survey program

What eNPS tells you that other diagnostics don’t — and what it misses

eNPS occupies a specific role in a law firm’s engagement infrastructure. Used in that role, it is valuable. Used as a substitute for other diagnostics, it misleads.

| What eNPS does well | What eNPS does not measure |
| --- | --- |
| Tracks aggregate sentiment movement over time at low survey cost | Specific operational issues driving the score |
| Provides one clear metric for board and executive committee reporting | Partner-specific attrition pressure |
| Functions as an early attrition warning signal; Detractor % is the leading indicator | Differences between practice groups without segmentation |
| Achieves high response rates that fuller surveys cannot | Cultural concerns associates don’t feel safe naming |
| Maps cleanly to executive-level conversations about firm health | Whether the firm is improving in specific dimensions |
| Functions as a comparison metric across time periods, if benchmarks are right | Why an associate scored the way they did, without a follow-up question |

The most productive way to think about eNPS at a US law firm: it is the headline number. Other instruments tell you the story underneath. Upward review programs surface partner-specific issues. Engagement surveys with thematic open-text analysis surface cultural concerns. Self-assessment data in formal reviews surfaces development gaps. eNPS gives leadership the single tracking metric they want; the rest of the engagement infrastructure gives leadership what to do about it.

Using eNPS as an early attrition warning signal

One of the most operationally useful applications of eNPS at US law firms is as a leading indicator of attrition. The Detractor percentage — not the headline eNPS score — is the signal worth tracking.

A Detractor in an eNPS survey is someone scoring 0–6. At a generic corporate employer, a Detractor is dissatisfied. At a US law firm, a Detractor is almost always actively considering leaving or already mentally checked out. The NALP Foundation’s 2025 Update on Associate Attrition found that 83 percent of departing associates left within five years of hire — a record high. The Detractor cohort in an eNPS survey is statistically heavily weighted toward those imminent departures.

The practical implication: track Detractor percentage by cohort and practice group as carefully as the headline eNPS score. A firm whose overall eNPS is stable at +18 but whose mid-level associate Detractor percentage is climbing from 12 percent to 22 percent over three quarters is heading into an attrition spike that the headline number will not show until it is too late.
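One way to operationalize that warning is a simple rise check on a cohort's quarterly Detractor percentage. The threshold of 8 points is illustrative, not SRA's methodology:

```python
def detractor_pct(scores):
    """Percentage of respondents scoring 0-6 on the eNPS question."""
    return 100 * sum(1 for s in scores if s <= 6) / len(scores)

def attrition_flag(quarterly_detractor_pcts, rise_threshold=8.0):
    """Flag a cohort whose Detractor percentage has risen by more than
    rise_threshold points from the first to the last tracked quarter."""
    if len(quarterly_detractor_pcts) < 2:
        return False
    return quarterly_detractor_pcts[-1] - quarterly_detractor_pcts[0] > rise_threshold

# The scenario from the text: mid-level associate Detractor % climbing
# from 12 to 22 over three quarters while headline eNPS stays flat.
print(attrition_flag([12.0, 17.0, 22.0]))  # True
```

A headline eNPS of +18 can stay numerically flat through exactly this climb if Promoter percentage drifts down in parallel, which is why the cohort-level Detractor series deserves its own chart.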

For a deeper treatment of how eNPS and engagement data connect to attrition signals, see Which Employee Engagement Software Should US Law Firms Actually Use in 2026?.

Three eNPS follow-up questions that generate actionable data at US law firms

Generic eNPS guides recommend the standard follow-up: “What is the most important reason for your score?” That works, but at a US law firm three specific follow-up framings produce more actionable data. Pick one and stay with it across measurement cycles for trend comparability.

| Follow-up question | What it surfaces | Best used when |
| --- | --- | --- |
| “What is the single most important reason for your score?” | Open thematic data; widest possible signal | Standard cadence; building baseline understanding |
| “If you were starting your career over today, would you join this firm again? Why or why not?” | Strong retention signal; surfaces career-stage frustrations | When attrition is a leadership concern |
| “What is one thing this firm should change in the next six months that would improve your score?” | Specific, time-bound, actionable | When leadership is committed to closing the loop publicly |

The third framing is the most demanding because it implicitly commits the firm to acting on what associates say. Firms that ask this question and then do not act on it produce worse engagement outcomes than firms that never asked. Choose accordingly.

Frequently asked questions

What is a good eNPS score for a US law firm? Calibrated against a US law firm dataset, +20 to +35 represents strong performance, +5 to +20 is typical, and below −8 is a crisis-level signal. These ranges are materially different from the generic 2026 professional services benchmarks (which run +18 to +25 as “average”) because of the structural features of law firms — partnership track anxiety, billing pressure, anonymity skepticism, and the partner-associate trust gap. Year-over-year movement at the same firm is more meaningful than absolute comparison to any benchmark.

How often should a US law firm run eNPS? Quarterly is the most common cadence at firms that take eNPS seriously, supplemented by an annual fuller engagement survey. Monthly produces survey fatigue particularly with associates. Annual-only loses the trend signal that makes eNPS useful.

Should partners be included in eNPS surveys? Yes, but interpret partner scores separately from associate scores. The drivers of partner satisfaction (origination credit, firm strategy, partnership terms, compensation system) are different from the drivers of associate satisfaction (workload, feedback quality, partnership track clarity, work allocation). A single firm-wide eNPS that averages partners and associates together hides more than it reveals.

Is eNPS the same as an engagement survey? No. eNPS is a one-question pulse metric. An engagement survey is a longer instrument (typically 20–40 questions) that diagnoses specific drivers of engagement. eNPS tells you what; engagement surveys tell you why. The most effective US law firms run both — eNPS quarterly for the tracking metric, fuller engagement survey annually or biannually for diagnostic depth.

Can eNPS replace exit interviews? No, but it can make exit interviews less surprising. eNPS data, particularly Detractor percentage by cohort, gives leadership early signal that something is wrong with a specific group of attorneys. Exit interviews are still where the specific narrative of an individual departure gets captured. The two work together.

What’s the difference between eNPS and a partner performance review? eNPS measures aggregate sentiment about the firm as a place to work. A partner performance review is an individual evaluation of a partner’s contributions, leadership, and development. They are different instruments measuring different things. We covered partner evaluation in Partner Performance Review: How US Law Firms Evaluate Equity Partners in 2026.

Does running eNPS by itself improve culture? No. Running eNPS without acting on the data produces the same outcomes (or worse) as not running it at all. The firms that see culture improve from eNPS programs are the ones that (1) get the benchmark right, (2) segment the data meaningfully, (3) communicate the findings to attorneys, (4) take visible action on the top one or two themes, and (5) report progress at the next cycle.

Sources

  • AIHR (November 2025). Employee Net Promoter Score (eNPS): 2026 Ultimate Guide. aihr.com
  • Culture Amp (January 2026). eNPS Benchmark Dataset — Industry Medians (102M+ responses across 5,000 organizations). Cited in checkbox.com
  • Leapsome (May 2026). Employee Net Promoter Score (eNPS): 2026 Guide. leapsome.com
  • Perceptyx (May 2026). Employee Net Promoter Score: A Complete Guide. perceptyx.com
  • ClearlyRated (January 2026). Employee Net Promoter Score (eNPS): The Ultimate 2026 Guide. clearlyrated.ai
  • Sopact (May 2026). NPS Benchmarks by Industry 2026: eNPS Guide. sopact.com
  • NALP Foundation (2025). Update on Associate Attrition and Hiring (CY 2025). nalpfoundation.org
  • Legal Evolution (December 2025). The 2026 ‘Burning Issues’ Confronting Firm Leaders. legalevolution.org


eNPS is a useful metric when calibrated correctly. It is misleading when measured against the wrong benchmark. For US law firms, the right benchmark is other US law firms — and the right surrounding infrastructure is an engagement program that turns the number into action.

SRA administers confidential eNPS and engagement programs for US law firms, with benchmarks drawn from nearly four decades of legal industry data and architectural separation between engagement surveys, performance reviews, and partner evaluations. Built for US law firms since 1987.

Firm Engagement Survey  |  Upward Reviews  |  360-Degree Feedback  |  Schedule a Consultation  |  All Services

Exclusively serving United States law firms since 1987.

→ Read more SRA articles
