The American Community Survey is the richest source of local housing and demographic data in the United States. Updated annually, covering every ZIP code tabulation area in the country, and available at no cost through a documented REST API - it is the single best data foundation for any local SEO site targeting homeowners, renters, or real estate audiences. Yet most publishers either do not know it exists or treat it as too complicated to integrate.
It is not complicated. The API follows a consistent pattern, the response format is straightforward JSON, and the same six table codes cover the vast majority of homeowner content use cases. This guide walks through every step: getting your API key, understanding which tables to pull, making your first request, handling the edge cases that will break your pipeline if you do not plan for them, and building a caching strategy that keeps your pages fresh without burning through your daily request quota.
This post pairs with our broader guide on building a local SEO site using government data and the complete directory of government APIs for local content.
Understanding ACS 5-Year Estimates
The Census Bureau publishes the ACS in two main vintages: 1-year estimates and 5-year estimates. For local content at the ZIP code level, you almost always want the 5-year estimates (acs5). Here is why:
The 1-year estimates (acs1) cover only areas with populations of 65,000 or more. That means most ZIPs, small towns, and rural counties simply do not have 1-year data. The 5-year estimates cover every ZIP code tabulation area (ZCTA) in the country - all 33,000+ of them. For a local SEO site that aims to cover every market, not just major cities, the 5-year series is the only option that actually works.
The trade-off is that 5-year estimates represent a rolling average of the five most recent survey years rather than a single year snapshot. The 2022 ACS 5-year estimates cover survey years 2018-2022. This makes them slightly less current than the 1-year figures for large cities, but significantly more statistically reliable - particularly for small populations where a single year's sample might have a margin of error larger than the value itself.
For homeowner content, this trade-off is almost always acceptable. The year a neighborhood's housing stock was built, the dominant heating fuel type, and the owner/renter split do not change dramatically year over year. What matters is that the data is real, local, and verifiable - which 5-year estimates fully deliver.
The base API endpoint for the 2022 ACS 5-year release is:
https://api.census.gov/data/2022/acs/acs5
When the Census releases the 2023 estimates in December 2024, you simply update the year in the URL. The variable names and table structure remain stable across vintages, which makes version upgrades straightforward.
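Because only the year segment changes, the endpoint can be derived from a single constant. A minimal sketch (the function name is illustrative):

```javascript
// Build the ACS 5-year endpoint for a given vintage year. Only the
// year segment of the URL changes between releases.
function acs5Endpoint(vintage) {
  return `https://api.census.gov/data/${vintage}/acs/acs5`;
}

// Upgrading to a new release is a one-line change at the call site:
const endpoint2022 = acs5Endpoint(2022); // .../data/2022/acs/acs5
const endpoint2023 = acs5Endpoint(2023); // .../data/2023/acs/acs5
```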
Getting Your API Key
The Census API is free and does not require payment or a credit card. You can make up to 500 requests per day without any key at all. A free API key lifts that daily cap, which is necessary for bulk pulls across thousands of ZIPs.
To get a key:
- Go to https://api.census.gov/data/key_signup.html
- Enter your email address and organization name (any name is fine)
- Check your email for the key - it arrives within a few minutes
- Append it to every request as `&key=YOUR_KEY`
The key is a 40-character hex string. Store it in your environment configuration or vault - do not hardcode it in client-side JavaScript that users can view. If you are building a server-side generator, pass it as an environment variable. If you are building a browser tool, route the Census requests through your own proxy endpoint.
Note: The Census API documentation uses "api_key" in some examples and "key" in others. The correct parameter name is `key`. Using `api_key` silently ignores the key, and you remain rate-limited at the unauthenticated tier.
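To keep the key out of client-visible code, read it from the environment on the server side, as suggested above. A minimal Node sketch - the variable name `CENSUS_API_KEY` and helper names are illustrative:

```javascript
// Read the Census key from an environment map instead of hardcoding it;
// fail fast at startup if it is missing. (CENSUS_API_KEY is an
// illustrative variable name.)
function censusKey(env = process.env) {
  const key = env.CENSUS_API_KEY;
  if (!key) throw new Error('CENSUS_API_KEY is not set');
  return key;
}

// Always append it with the parameter name "key":
function withKey(baseUrl, key) {
  return `${baseUrl}&key=${key}`; // not "api_key", which is silently ignored
}
```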
The Essential Housing Tables
The ACS publishes over 1,200 data tables. For homeowner content, six tables cover the core data dimensions. Each table has a base count variable (ending in _001E) and a series of category variables (ending in _002E, _003E, etc.). The E suffix means estimate; M means margin of error.
| Table | Key Variables | Description | Content Use Case |
|---|---|---|---|
| B25034 | `_001E` through `_011E` | Year Structure Built - 10 decade buckets from pre-1939 to 2020+ | Housing stock age - drives "older homes need X" content, renovation guides |
| B25040 | `_001E` through `_010E` | House Heating Fuel - gas, electricity, fuel oil, propane, wood, solar, no fuel | HVAC content, utility cost guides, winter prep checklists |
| B25003 | `_001E`, `_002E` (owner), `_003E` (renter) | Tenure - owner occupied vs renter occupied | Audience filter - high renter % means lower homeowner content ROI for that ZIP |
| B19013 | `_001E` | Median Household Income in the Past 12 Months | Cost guide framing - "affordable for households earning $X" |
| B25077 | `_001E` | Median Value (Dollars) - owner-occupied housing units | Market trend pages, renovation ROI calculations |
| B25088 | `_001E` (all), `_002E` (with mortgage), `_003E` (without) | Median Monthly Housing Costs | Affordability guides, "true cost of homeownership" content |
B25034: Year Structure Built in Detail
This table is the most powerful for homeowner content because it directly answers "how old are the homes in this area?" - a question with immediate practical implications. The 10 category variables map to these decade buckets:
- `B25034_002E` - Built 2020 or later
- `B25034_003E` - Built 2010 to 2019
- `B25034_004E` - Built 2000 to 2009
- `B25034_005E` - Built 1990 to 1999
- `B25034_006E` - Built 1980 to 1989
- `B25034_007E` - Built 1970 to 1979
- `B25034_008E` - Built 1960 to 1969
- `B25034_009E` - Built 1950 to 1959
- `B25034_010E` - Built 1940 to 1949
- `B25034_011E` - Built 1939 or earlier
Sum _002E through _004E and divide by _001E to get the share of homes built in the last 25 years. Sum _009E through _011E divided by _001E to get the share built before 1960 - the cohort most likely to have knob-and-tube wiring, original cast iron pipes, and single-pane windows. These derived metrics are the content hooks: "43% of homes in this ZIP were built before 1960 - here is what to inspect."
B25040: House Heating Fuel in Detail
The heating fuel table drives an entire category of HVAC, utility, and winterization content. The variables:
- `B25040_002E` - Utility gas (includes natural gas and manufactured gas)
- `B25040_003E` - Bottled, tank, or LP gas (propane)
- `B25040_004E` - Electricity
- `B25040_005E` - Fuel oil, kerosene
- `B25040_006E` - Coal or coke
- `B25040_007E` - Wood
- `B25040_008E` - Solar energy
- `B25040_009E` - Other fuel
- `B25040_010E` - No fuel used
The dominant fuel type sets the editorial direction for heating-related content. A ZIP where 78% of homes use utility gas gets a different HVAC guide than a ZIP where 61% use electricity. In rural ZIPs in the Northeast and Midwest, fuel oil (B25040_005E) may be 30-40% of homes - a specific and underserved audience for heating oil delivery cost comparisons, tank maintenance guides, and conversion-to-heat-pump content.
Making Your First API Call
The Census API URL structure is consistent: base endpoint + ?get= (fields to return) + &for= (geography) + &key= (your key). Here is a complete request to fetch housing stock age and heating fuel for all ZIPs in the US:
https://api.census.gov/data/2022/acs/acs5?get=B25034_001E,B25034_002E,B25034_003E,B25034_004E,B25034_007E,B25034_011E,B25040_001E,B25040_002E,B25040_004E,NAME&for=zip%20code%20tabulation%20area:*&key=YOUR_KEY
Breaking this down:
- `get=B25034_001E,...` - comma-separated list of variable codes to return
- `NAME` - always include this; it returns the human-readable name of the geography ("ZCTA5 78701")
- `for=zip%20code%20tabulation%20area:*` - the geography filter. The `*` means all ZIPs. URL-encode the space as `%20`.
- `key=YOUR_KEY` - your API key
The response is a JSON array where the first element is the header row and subsequent elements are data rows:
```json
[
  ["B25034_001E","B25034_002E","B25034_003E","NAME","zip code tabulation area"],
  ["2847","45","312","ZCTA5 78701","78701"],
  ["4103","0","89","ZCTA5 78702","78702"],
  ...
]
```
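In Node 18+ or any modern browser, the request above can be issued with the built-in `fetch`. A sketch - the helper names are illustrative, and the URL mirrors the one shown earlier:

```javascript
// Build the all-ZCTAs request URL for a list of variable codes.
function acsUrl(vintage, variables, apiKey) {
  return `https://api.census.gov/data/${vintage}/acs/acs5` +
    `?get=${variables.join(',')},NAME` +
    '&for=zip%20code%20tabulation%20area:*' +
    `&key=${apiKey}`;
}

// Fetch and return the raw array-of-arrays response (header row first).
async function fetchAcs(vintage, variables, apiKey) {
  const res = await fetch(acsUrl(vintage, variables, apiKey));
  if (!res.ok) throw new Error(`Census API returned HTTP ${res.status}`);
  return res.json();
}
```

Call it as `const data = await fetchAcs(2022, ['B25034_001E', 'B25034_002E'], 'YOUR_KEY');` and the result has the header-row-first shape shown above.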
Parse this into a keyed object using the first row as column names and the last element (zip code tabulation area) as the key. In JavaScript:
```javascript
const [headers, ...rows] = data;
const zipIndex = headers.indexOf('zip code tabulation area');
const byZip = {};
for (const row of rows) {
  const zip = row[zipIndex];
  byZip[zip] = {};
  headers.forEach((h, i) => { byZip[zip][h] = row[i]; });
}
```
Values come back as strings, not numbers. Always parse them: `parseInt(byZip['78701']['B25034_001E'], 10)`. Watch for the suppression sentinel value `"-666666666"` before parsing - `parseInt` of that string returns -666666666, which will corrupt any percentage calculation.
Fetching by State to Stay Under Rate Limits
Fetching all 33,000+ ZIPs at once works fine for bulk data collection, but it returns a very large response (typically 3-8 MB depending on how many variables you request). For targeted pulls focused on one state or region, use the &in=state:{FIPS} parameter to restrict the geography:
https://api.census.gov/data/2022/acs/acs5?get=B25034_001E,B25034_002E,NAME&for=zip%20code%20tabulation%20area:*&in=state:48&key=YOUR_KEY
FIPS 48 is Texas. Here are the FIPS codes for the 10 most-populated states:
| State | FIPS Code | Approx ZIP Count |
|---|---|---|
| California | 06 | ~2,600 |
| Texas | 48 | ~1,900 |
| Florida | 12 | ~1,100 |
| New York | 36 | ~2,100 |
| Pennsylvania | 42 | ~2,400 |
| Illinois | 17 | ~1,900 |
| Ohio | 39 | ~2,500 |
| Georgia | 13 | ~800 |
| North Carolina | 37 | ~1,000 |
| Michigan | 26 | ~1,700 |
A full list of all 50 state FIPS codes is available at https://api.census.gov/data/2022/acs/acs5?get=NAME&for=state:* - the state field in the response contains the FIPS code.
Note that the &in=state:{FIPS} filter for ZIP code tabulation areas is a "best effort" match - Census maps ZIPs to states by the state that contains the largest portion of the ZCTA. Border ZIPs that span two states may appear in one state's results but not the other's. For near-complete ZIP coverage, pull all ZIPs nationally and filter client-side by the ZIP prefix ranges for each state.
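For a multi-state pull, the state filter slots into the same URL pattern. A sketch - helper names are illustrative, and the 1.2-second pause is an arbitrary politeness delay, not a documented requirement:

```javascript
// Build a per-state ZCTA request URL using the &in=state:{FIPS} filter.
function acsStateUrl(vintage, variables, stateFips, apiKey) {
  return `https://api.census.gov/data/${vintage}/acs/acs5` +
    `?get=${variables.join(',')},NAME` +
    '&for=zip%20code%20tabulation%20area:*' +
    `&in=state:${stateFips}` +
    `&key=${apiKey}`;
}

// Fetch several states sequentially, pausing between requests.
async function fetchStates(vintage, variables, fipsList, apiKey) {
  const rows = [];
  for (const fips of fipsList) {
    const res = await fetch(acsStateUrl(vintage, variables, fips, apiKey));
    if (!res.ok) throw new Error(`state ${fips}: HTTP ${res.status}`);
    const [, ...dataRows] = await res.json(); // drop the header row
    rows.push(...dataRows);
    await new Promise(r => setTimeout(r, 1200)); // arbitrary politeness delay
  }
  return rows;
}
```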
Handling Suppressed Values
Data suppression is the most common source of pipeline bugs when working with Census data. The Census Bureau suppresses estimates for small populations to protect respondent confidentiality and to prevent publishing statistically unreliable figures.
There are two distinct null conditions you need to handle:
- Suppressed value: The Census has data but is withholding it. Returned as the string `"-666666666"`. This is a deliberate sentinel, not an error.
- Missing value: No data collected for this geography/variable combination. Returned as JSON `null`.
A type-safe check in JavaScript:
```javascript
function acsValue(raw) {
  if (raw === null || raw === '-666666666') return null;
  const n = parseInt(raw, 10);
  return isNaN(n) ? null : n;
}

const totalUnits = acsValue(row['B25034_001E']);
if (totalUnits === null) {
  // Fall back to county-level data or skip this section
}
```
When a ZIP is suppressed, the right fallback strategy depends on the data type:
- For median values (B25077, B19013, B25088): fall back to the county-level estimate. The Census API accepts `for=county:{FIPS}&in=state:{FIPS}` with the same variable codes.
- For count-based tables (B25034, B25040): if the total count is suppressed, skip the data section entirely and display a message like "Local data not available for this ZIP. View county-level data."
- Never display a zero or a blank where a suppressed value should be. Readers will interpret zero as accurate data, which erodes trust when they notice the discrepancy.
ZIPs with fewer than 100 housing units are most likely to have suppressed values. If your content strategy is focused on high-density urban and suburban markets, suppression will affect fewer than 5% of your ZIPs. If you are targeting rural areas, plan for 15-25% suppression rates on some variables.
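The county fallback for median values can be wired into one function. A sketch that assumes you already know the ZIP's county and state FIPS codes - the ZIP-to-county mapping is up to your pipeline, and the function name is illustrative:

```javascript
// Null-safe parser (same logic as the acsValue shown earlier,
// re-declared so this snippet stands alone).
function acsValue(raw) {
  if (raw === null || raw === '-666666666') return null;
  const n = parseInt(raw, 10);
  return isNaN(n) ? null : n;
}

// Median home value for a ZIP, falling back to its county when the
// ZIP-level figure is suppressed.
async function medianValueWithFallback(zipRow, countyFips, stateFips, apiKey) {
  const zipVal = acsValue(zipRow['B25077_001E']);
  if (zipVal !== null) return { value: zipVal, level: 'zip' };

  const url = 'https://api.census.gov/data/2022/acs/acs5' +
    '?get=B25077_001E' +
    `&for=county:${countyFips}&in=state:${stateFips}` +
    `&key=${apiKey}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`county fallback: HTTP ${res.status}`);
  const [, dataRow] = await res.json(); // [header, dataRow]
  return { value: acsValue(dataRow[0]), level: 'county' };
}
```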
Joining ACS Data to Your Content Templates
Raw variable values are not directly useful to readers - derived metrics are. Here are the most content-relevant calculations for the six core tables:
Housing Stock Age (B25034)
Calculate two summary metrics from the decade buckets:
```javascript
const total = acsValue(row['B25034_001E']);

const recent = (acsValue(row['B25034_002E']) || 0)   // 2020+
             + (acsValue(row['B25034_003E']) || 0)   // 2010-2019
             + (acsValue(row['B25034_004E']) || 0);  // 2000-2009
const pctRecent = total ? Math.round(recent / total * 100) : null;

const prewar = (acsValue(row['B25034_010E']) || 0)   // 1940-1949
             + (acsValue(row['B25034_011E']) || 0);  // pre-1939
const pctPrewar = total ? Math.round(prewar / total * 100) : null;
```
Then insert these into your template prose: "About {pctRecent}% of homes in {city} were built in the last 25 years. {pctPrewar}% were built before 1950, which means a meaningful share of the housing stock may have older electrical panels, galvanized plumbing, or limited insulation."
This sentence is impossible to write generically. It requires the data, it varies by ZIP, and it directly answers what a potential homebuyer or current owner wants to know. That is the standard to aim for with every data-to-prose translation.
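One way to render that sentence while degrading gracefully when a derived metric is null - a minimal sketch with illustrative names:

```javascript
// Render the housing-age sentence, skipping clauses whose metric is
// null rather than printing "null%".
function housingAgeSentence(city, pctRecent, pctPrewar) {
  const parts = [];
  if (pctRecent !== null) {
    parts.push(`About ${pctRecent}% of homes in ${city} were built in the last 25 years.`);
  }
  if (pctPrewar !== null) {
    parts.push(`${pctPrewar}% were built before 1950, which means a meaningful share ` +
      'of the housing stock may have older electrical panels, galvanized plumbing, ' +
      'or limited insulation.');
  }
  return parts.join(' ');
}
```

If both metrics are null (a fully suppressed ZIP), the function returns an empty string, so the template can drop the section instead of publishing a hollow sentence.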
Heating Fuel Dominant Type (B25040)
```javascript
const fuelMap = {
  'B25040_002E': 'natural gas',
  'B25040_003E': 'propane',
  'B25040_004E': 'electricity',
  'B25040_005E': 'fuel oil',
  'B25040_007E': 'wood'
};

let dominant = null, dominantPct = 0;
const fuelTotal = acsValue(row['B25040_001E']);
for (const [key, label] of Object.entries(fuelMap)) {
  const val = acsValue(row[key]) || 0;
  const pct = fuelTotal ? val / fuelTotal : 0;
  if (pct > dominantPct) { dominantPct = pct; dominant = label; }
}
```
Template: "{Math.round(dominantPct * 100)}% of homes in {city} use {dominant} as their primary heating fuel. [Fuel-specific HVAC tips follow...]"
Owner Occupancy Rate (B25003)
```javascript
const ownerOcc = acsValue(row['B25003_002E']);
const totalOcc = acsValue(row['B25003_001E']);
const ownerPct = totalOcc ? Math.round(ownerOcc / totalOcc * 100) : null;
```
Use this as a content qualifier: if ownerPct is below 40%, this ZIP has more renters than owners. That does not mean you should skip it, but it does mean framing your content around renters' rights, renter insurance, and "questions to ask your landlord" rather than roof replacement guides.
Caching and Refresh Strategy
ACS 5-year estimates are released once per year, in December. The 2022 5-year estimates (covering survey years 2018-2022) were released in December 2023. The 2023 estimates will be released in December 2024. This means your data does not go stale on a short cycle - but it does go stale on an annual one.
The caching strategy that works for pSEO at scale:
- Store the vintage year with every record. When you fetch and store ACS data, record `{ vintage: 2022, fetchedAt: Date.now(), data: {...} }`. This lets you query "all ZIPs with vintage 2021 or older" to find records that need refreshing after a new release.
- Set a 30-day TTL for re-fetches within the same vintage. The underlying data does not change between your fetch in January and another fetch in March - the Census does not update mid-year. A 30-day TTL prevents unnecessary API calls while keeping your data consistent with what the Census currently serves.
- Trigger a vintage upgrade sweep each December. When the new estimates drop, queue a re-fetch of all ZIPs with the new vintage year. This is a batch operation: at 33,000 ZIPs and a conservative 50 requests per minute, a full national refresh takes about 11 hours if you fetch one ZIP at a time. Instead, fetch all ZIPs at once in a single request (the full national dump) and update your store from that - it takes about 2 minutes.
- For browser-based tools, use IndexedDB. A national ZIP dataset with 6 table values per ZIP is roughly 15 MB of JSON - well within IndexedDB's practical limits, and well beyond localStorage's typical ~5 MB quota.
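The vintage and TTL rules above can be combined into a single staleness check. A sketch with illustrative names, using the 30-day TTL from the strategy above:

```javascript
const TTL_MS = 30 * 24 * 60 * 60 * 1000; // 30-day TTL within a vintage

// A record is stale if it predates the current vintage (trigger the
// December upgrade sweep) or its within-vintage TTL has expired.
function isStale(record, currentVintage, now = Date.now()) {
  if (record.vintage < currentVintage) return true;
  return now - record.fetchedAt > TTL_MS;
}

const rec = { vintage: 2022, fetchedAt: Date.now(), data: {} };
isStale(rec, 2022); // false - fresh, same vintage
isStale(rec, 2023); // true - a new release has dropped
```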
Beyond Housing - Other ACS Tables Worth Knowing
The six core housing tables are the highest-value starting point, but several additional tables are worth integrating once your base pipeline is running:
- B08301 - Means of Transportation to Work: Breaks down commute mode - car, public transit, walked, worked from home. The "worked from home" variable (`B08301_021E`) is a strong signal for identifying ZIP codes with high remote worker concentration - a valuable audience dimension for home office renovation content.
- B25024 - Units in Structure: Distinguishes single-family detached, single-family attached, 2-unit, 3-4 unit, 5-9 unit, 10+ unit, and mobile home structures. High shares of single-family detached homes mean a strong homeowner audience. High shares of 5+ unit structures mean a condo/apartment market where exterior maintenance content is less relevant.
- B25071 - Gross Rent as a Percentage of Household Income: The rental cost burden. Useful for content targeting renters considering homeownership: "In this ZIP, median rent consumes X% of household income - here is how mortgage costs compare."
- B25064 - Median Gross Rent: A single-variable table giving median rent for the ZIP. Pair with B25077 (median home value) to calculate a price-to-rent ratio for buy-vs-rent content.
- B01003 - Total Population: The simplest possible filter. A ZIP with population under 500 is unlikely to be worth targeting regardless of how the housing data looks.
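The B25064/B25077 pairing reduces to one derived metric. A sketch using the null-safe parser pattern from earlier (re-declared here so the snippet stands alone):

```javascript
// Null-safe parser, as defined in the suppression section.
function acsValue(raw) {
  if (raw === null || raw === '-666666666') return null;
  const n = parseInt(raw, 10);
  return isNaN(n) ? null : n;
}

// Price-to-rent ratio: median home value / annualized median gross rent.
// Higher ratios tend to favor renting; lower ratios tend to favor buying.
function priceToRent(row) {
  const value = acsValue(row['B25077_001E']); // median home value
  const rent = acsValue(row['B25064_001E']);  // median gross rent (monthly)
  if (value === null || rent === null || rent === 0) return null;
  return Math.round((value / (rent * 12)) * 10) / 10; // one decimal place
}

priceToRent({ B25077_001E: '350000', B25064_001E: '1500' }); // → 19.4
```

A null return (either variable suppressed) should fall back to the county-level strategy described in the suppression section.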
Homeowner.wiki fetches all 6 key ACS tables for every US ZIP in a single batch run - handles rate limits, suppression fallbacks, vintage tracking, and normalization automatically. No Census API key required, no pipeline to build.
Putting the ACS to Work
The Census ACS API is genuinely the best free data source for local homeowner content. Every piece of data is authoritative (it comes from the federal government), verifiable (anyone can reproduce your query), locally specific (33,000+ ZIPs), and practically unavailable to content farms that do not want to build a data pipeline. Those four properties together are what make data-driven local content durable against AI content floods and algorithm updates.
The technical barrier to using the ACS is low - a few dozen lines of fetch-and-parse code, a sensible caching layer, and a set of derived metrics mapped to content templates. The hard part is editorial: translating the numbers into content that is genuinely useful to a homeowner in that specific city, not just accurate data displayed in a table. The key is derived metrics combined with specific prose templates that force city-specific output.
Start with B25034 (housing stock age) and B25040 (heating fuel). These two tables alone generate enough unique local data to differentiate city pages in a way that thin editorial content simply cannot match. Add the other four tables once your pipeline is validated and your first batch of pages is indexed.
Stop Manually Wrestling with the Census API
Join the waitlist for Homeowner.wiki and get access to the full pSEO engine - all ACS tables pre-fetched, normalized, and ready to generate content for every US ZIP code.
Join the Waitlist