
Possible Duplicate:
Where can I find the finest granularity of 2010 US census population information?

I'm working on a little data-munging project and I need to find 2010 US Census data for the entire country down to the 15-digit FIPS code / census-block level. I can find some summaries, but nothing granular enough. Is this data out there, and how would one get it?
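A 15-digit block code is the concatenation of the 2-digit state FIPS, 3-digit county FIPS, 6-digit census tract, and 4-digit block number; here is a minimal Python sketch of splitting one apart (the sample GEOID is made up for illustration):

    def split_block_geoid(geoid):
        """Split a 15-digit census-block GEOID into its FIPS components.

        Layout: 2-digit state + 3-digit county + 6-digit tract + 4-digit block;
        the first digit of the block is the block group.
        """
        if len(geoid) != 15 or not geoid.isdigit():
            raise ValueError("expected a 15-digit block GEOID")
        return {
            "state": geoid[0:2],
            "county": geoid[2:5],
            "tract": geoid[5:11],
            "block": geoid[11:15],
        }

    # Made-up example value, purely for illustration.
    print(split_block_geoid("240054013052015"))
    # {'state': '24', 'county': '005', 'tract': '401305', 'block': '2015'}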

  • Thanks whuber, I did try the answer there and they don't seem to provide that data. Or at least working through the accepted answer does not seem to get me said data . . . – Wyatt Barnett Jun 08 '12 at 17:59
  • The Census is the originator and official repository of these data, Wyatt. A convenient entry to downloading block shapefiles (for 2000 and 2010) is http://www.census.gov/cgi-bin/geo/shapefiles2010/main. For the population data, you probably want the SF1 tables available at http://2010.census.gov/news/press-kits/summary-file-1.html: this link is provided in the accepted answer to the duplicate question. – whuber Jun 08 '12 at 18:09
  • Thanks. Part of the problem is I don't need shapefiles at all -- just the FIPS and the population counts. I found someplace that would give me population and housing units down to 12 digit FIPS codes but nothing more granular . . . – Wyatt Barnett Jun 08 '12 at 18:25
  • For the record, at the end of the day the best source was finding the TIGER shapefiles with population, extracting the DBF files, and stuffing them into a modern database. Worked perfectly. – Wyatt Barnett Dec 01 '12 at 19:07
  • @WyattBarnett, how did you go about pushing the .dbf into your database? I figure one could write a utility script against something like dbfpy, but I am wondering if you know of a pre-cooked, tried-and-true tool for dbf-to-db importing... Toad? – elrobis Dec 01 '12 at 20:49
  • @elrobis -- largely what you are thinking, though we are C# types here, so we figured out how to talk to DBF files over ODBC and pulled them in that way (a rough sketch of that kind of load follows these comments). We could have hooked them directly to SQL Server, but the file layouts were such that scripting/coding made a bit more sense. I'm happy to publish the dataset in a more useful format, but I'm unsure where I should... – Wyatt Barnett Dec 01 '12 at 21:44
  • Oh, mostly I was just curious since .dbf is encountered all the time as a shapefile element. Such a utility might be handy. But I think in many cases, inventing a custom script is often advantageous as it allows you to coerce the data into whatever schema you want, plus you can use it to automate your update sequence in the future. Figured I'd ask. – elrobis Dec 01 '12 at 22:45
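A rough sketch of the DBF-to-database load described in the comments above, written in Python against dbfpy (as elrobis suggested) rather than C#/ODBC, and loading into SQLite instead of SQL Server; the exact dbfpy calls (Dbf, fieldNames, dict-style record access) and the file/table names are assumptions for illustration:

    import sqlite3

    from dbfpy import dbf  # the dbfpy package mentioned above

    def load_dbf_to_sqlite(dbf_path, db_path, table_name):
        """Copy every record of a TIGER block DBF into a SQLite table."""
        source = dbf.Dbf(dbf_path, readOnly=True)
        fields = source.fieldNames  # column names come straight from the DBF header

        conn = sqlite3.connect(db_path)
        # Everything is stored as TEXT for simplicity; cast downstream as needed.
        cols = ", ".join('"%s" TEXT' % name for name in fields)
        conn.execute('CREATE TABLE IF NOT EXISTS "%s" (%s)' % (table_name, cols))

        placeholders = ", ".join("?" for _ in fields)
        insert_sql = 'INSERT INTO "%s" VALUES (%s)' % (table_name, placeholders)
        for rec in source:
            conn.execute(insert_sql, [str(rec[name]) for name in fields])

        conn.commit()
        conn.close()
        source.close()

    # Hypothetical file/table names, purely for illustration.
    load_dbf_to_sqlite("tabblock2010_24_pophu.dbf", "census.sqlite", "md_blocks")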

0 Answers