If you’ve ever had a “good steps week” and then watched your numbers drift back down, you’ve already learned the core problem with step tracking: your step count is noisy.
That noise matters more than most people realize. A new analysis from the UK Biobank accelerometer program suggests that when researchers rely on a short window of wearable data (like a single 7‑day period), they can underestimate the relationship between step count and heart disease risk because they’re not capturing your usual behavior over time. In statistics, this is called regression dilution bias—and it has a surprisingly practical takeaway for busy people.
This post is about usual step count: what it means, why it’s a better mental model than “my steps,” and how to track it without turning your life into a spreadsheet.
The new finding: step counts are fairly repeatable… but not perfectly
In February 2026, researchers used UK Biobank data to ask two questions:
1) How stable are accelerometer-measured behaviors over years? (steps, sedentary time, sleep, etc.)
2) What happens to estimated health associations when we correct for within-person variability?
They looked at a subset of participants who wore an accelerometer multiple times over a few years, and then illustrated the idea using step count and incident coronary heart disease (CHD). Their key point isn’t that step counts are “unreliable.” It’s subtler: step counts are moderately reproducible, and that moderate instability is enough to bias risk estimates downward if you don’t correct for it.
In their example, the association between daily steps and CHD was stronger after correction: roughly 20% lower CHD risk per usual +4,000 steps/day after accounting for regression dilution, compared with 13% before correction (their illustrative model).
Primary source: Zisou et al., International Journal of Epidemiology (2026) PMID: 41718664.
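To see how the before/after numbers relate, here's a small sketch using the standard univariate regression dilution correction, where the observed log hazard ratio is divided by a regression dilution ratio λ (whether the paper used exactly this formulation is an assumption; the 13% and 20% figures are the post's rounded numbers, so the λ we back out is only illustrative):

```python
import math

# Rounded figures from the post: per usual +4,000 steps/day,
# ~13% lower CHD risk before correction, ~20% after.
hr_observed = 0.87   # hazard ratio from a short measurement window
hr_corrected = 0.80  # hazard ratio after regression dilution correction

# Standard univariate correction: log(HR_corrected) = log(HR_observed) / lambda,
# where lambda is the regression dilution ratio (how reproducible steps are).
# Solving backwards gives the lambda implied by these round numbers.
implied_lambda = math.log(hr_observed) / math.log(hr_corrected)
print(f"implied regression dilution ratio ≈ {implied_lambda:.2f}")
```

A λ well below 1 is exactly what "moderately reproducible" means: one week of wearing a device captures a real but imperfect picture of your usual level.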
Regression dilution bias, explained like a human
Here’s the plain-English version.
Imagine two people:
- Person A averages ~8,000 steps/day most months.
- Person B averages ~4,000 steps/day most months.
Now imagine you only measure each person for one week.
- Person A has a hectic week and hits 6,000.
- Person B goes on vacation and hits 7,000.
If you use that one week as “the truth,” you’ve blurred the difference between A and B. In statistical terms, you’ve added measurement error to the exposure (step count). The usual result: the observed association between steps and health outcomes gets pulled toward zero.
That’s regression dilution bias.
It doesn’t mean the benefit is guaranteed, and it doesn’t mean the corrected estimate is “the real number for you.” It means: if step behavior varies within a person, short measurements can underestimate relationships.
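You can watch this happen in a toy simulation. Everything below is made up for illustration (the step counts, the noise levels, and the "risk score" outcome are hypothetical, not from the study), but the mechanism is the real one: noise in the exposure pulls the estimated slope toward zero.

```python
import random

random.seed(0)
n = 20_000

# Hypothetical setup: each person has a "usual" daily step count,
# but a single week's measurement adds within-person noise on top.
between_sd = 2000   # how much people differ from each other
within_sd = 2500    # how much one week differs from a person's usual

usual = [random.gauss(7000, between_sd) for _ in range(n)]
one_week = [u + random.gauss(0, within_sd) for u in usual]

# A toy outcome that depends linearly on *usual* steps (slope = 1,
# purely illustrative), plus unrelated noise.
outcome = [u + random.gauss(0, 3000) for u in usual]

def slope(x, y):
    """Ordinary least squares slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

# Regressing on the noisy one-week measure attenuates the slope by roughly
# lambda = between_var / (between_var + within_var) ≈ 0.39 here.
print("slope vs usual steps:   ", round(slope(usual, outcome), 2))
print("slope vs one-week steps:", round(slope(one_week, outcome), 2))
```

The outcome was built from usual steps with a true slope of 1, yet the one-week regression recovers only about 40% of it. That shrinkage is regression dilution.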
Why this matters (even if you don’t care about statistics)
This is the part that’s useful on a Tuesday.
1) Your “true” step level is closer to your average than your best week
Many people set goals based on a high-motivation slice of life:
- a streak week
- a travel week
- a “new shoes” week
Then they judge themselves against that peak.
A better reference point is your usual step count—the level your life tends to settle into when things are normal.
2) Small changes might matter more than they look
If studies underestimate associations when they rely on short measurement windows, then small step increases that you maintain over time could be more meaningful than you’d think from a single snapshot.
This is consistent with a broader wearables literature: device-measured activity volume and intensity patterns are associated with later health risks, and the details of how activity is accumulated can matter. (For example, UK Biobank work linking wearable-measured activity energy expenditure with mortality.)
Related primary source: Strain et al., Nature Medicine (2020) PMID: 32807930.
3) One-week “before/after” experiments can trick you
A lot of well-intentioned self-experiments follow this script:
- Week 1: baseline
- Week 2: new plan
- Compare
The problem: a two-week comparison is extremely sensitive to work travel, weather, sleep, stress, and pure randomness.
If you want to evaluate a change, you don’t need perfection—you need a longer horizon (or a repeated pattern).
A better approach: track the trend, not the day
If you do only one thing differently after reading this, make it this:
- Stop reacting to single days.
- Start paying attention to your 7‑day and 28‑day averages.
Day-to-day variance is normal. A trend line is information.
What counts as “usual”?
There’s no universal definition, but here’s a practical one:
- Your usual step count = the average you see across a few typical weeks.
If your app makes this easy, great. If not, you can approximate it with:
- a 7‑day average to smooth daily noise
- a 28‑day average to represent your “month-level” reality
The goal is not to build a flawless metric. It’s to avoid letting a weird week become your identity.
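If your app doesn't show these averages, they're trivial to compute from an exported list of daily totals. A minimal sketch with invented numbers (the 28 days below are hypothetical, not anyone's real data):

```python
from collections import deque

def rolling_average(daily_steps, window):
    """Trailing average over the last `window` days (shorter at the start)."""
    buf = deque(maxlen=window)
    out = []
    for steps in daily_steps:
        buf.append(steps)
        out.append(sum(buf) / len(buf))
    return out

# Hypothetical month of daily step counts exported from a phone or watch.
days = [8200, 4100, 6300, 9800, 3500, 7200, 6800] * 4  # 28 days

seven_day = rolling_average(days, 7)
twenty_eight_day = rolling_average(days, 28)

print(f"latest 7-day average:  {seven_day[-1]:.0f}")
print(f"latest 28-day average: {twenty_eight_day[-1]:.0f}")
```

Notice how wildly the individual days swing (3,500 to 9,800) while the averages barely move. That stability is the whole point of using them.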
Do this today (15 minutes): the Usual Steps Audit
You don’t need a new routine. You need a calmer baseline.
Minute 0–3: Pull up your step history
Open whatever you already use (Apple Health, Google Fit, your phone’s built-in tracker). Find:
- the last 7 days
- the last 28 days (or “month”)
Write down both averages.
Minute 3–8: Label your last week
Ask one boring question:
- Was last week typical?
Typical doesn’t mean “good.” It means representative.
If it was not typical (travel, illness, a deadline, a snowstorm), treat the 28‑day number as your anchor.
Minute 8–12: Pick one lever that increases steps without requiring motivation
Choose exactly one:
- Commute lever: park 6–8 minutes farther (or get off one stop earlier)
- Calendar lever: two 6‑minute walks (one before lunch, one mid‑afternoon)
- Phone lever: take calls standing/walking indoors
- After-meal lever: 8–10 minutes of easy walking after your largest meal (only if it fits)
You’re not trying to “hit a goal.” You’re trying to nudge your usual up.
Minute 12–15: Define a success rule that matches the science
Make it trend-based:
- “If my 28‑day average is up by ~300–700 steps/day in a month, that counts.”
That might sound small. It’s not small when it’s real.
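If it helps to make the rule mechanical, here's one way to encode it (the 300–700 band comes from the rule above; it's the post's suggested target, not a clinical threshold):

```python
def trend_success(old_28d_avg, new_28d_avg, low=300, high=700):
    """Judge a month of change in the 28-day average against the rule."""
    gain = new_28d_avg - old_28d_avg
    if gain >= high:
        return f"+{gain:.0f}/day: exceeded the goal band"
    if gain >= low:
        return f"+{gain:.0f}/day: success, trend is up"
    return f"{gain:+.0f}/day: not there yet, keep the lever in place"

print(trend_success(6200, 6650))  # a realistic, unglamorous win
```

A gain of 450 steps/day counts as a success under this rule, which is the point: the bar is a sustained nudge to your usual, not a dramatic new number.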
Two skeptical guardrails (so we don’t oversell this)
1) This does not prove causality. UK Biobank analyses are observational, even when the measurement is excellent.
2) Correction isn’t magic. Regression dilution correction addresses one bias (within-person variability). It doesn’t remove all confounding.
What it does do is improve our mental model: if steps fluctuate, you should think in terms of “usual behavior” rather than a single week.
The calm takeaway
Most people don’t fail at walking because they lack willpower.
They fail because they compare a messy, variable life to a clean, single-number goal.
“Usual step count” is a better target: it respects variance, rewards consistency, and aligns with how researchers increasingly think about wearable data.
If you want a simple north star, make it this:
> Raise your usual steps a little, and keep it there.
Sources
- Zisou C, et al. Reproducibility and associated regression dilution bias of accelerometer-derived physical activity and sleep in UK Biobank. Int J Epidemiol. 2026. PMID: 41718664
- Strain T, et al. Wearable-device-measured physical activity and future health risk. Nat Med. 2020. PMID: 32807930
- Biswas RK, et al. Wearable device-based health equivalence of different physical activity intensities against mortality, cardiometabolic disease, and cancer. Nat Commun. 2025. PMID: 41057301
- CDC. Physical Activity Basics and Your Health (updated Dec 3, 2025). https://www.cdc.gov/physical-activity-basics/about/index.html
- MedlinePlus. Exercise and Physical Fitness. https://medlineplus.gov/exerciseandphysicalfitness.html
- American Heart Association. Walking. https://www.heart.org/en/healthy-living/fitness/walking
