ACS5 Puerto Rico names (seems like more than usual have special characters?)

I know that there are names that contain special characters, but it seems as if Puerto Rico has more than I remember, and when I cross-checked against an old list I had, these don't match. I had fun with Connecticut names and changes in the past. I looked here ( https://www.census.gov/programs-surveys/acs/technical-documentation/table-and-geography-changes/2023/geography-changes.html ) but did not find the changes. Is there a better place that covers name changes and the like? I also looked here ( https://www.census.gov/programs-surveys/geography/technical-documentation/boundary-change-notes.2023.html#list-tab-1518919262 ).

Any help is appreciated.

https://api.census.gov/data/2021/acs/acs5/subject?get=S0101_C01_001E,S0101_C01_030E,S0101_C01_032E,S0101_C02_002E,S0101_C02_022E,S0101_C02_026E,S0101_C05_001E,NAME&for=county:*

Things like 

A\u00F1asco Municipio, Puerto Rico
["30812","6469","42.4","4.8","22.0","78.0","15446","Sheridan County, Wyoming","56","033"],
["8830","1729","40.5","5.4","22.2","77.8","3864","Sublette County, Wyoming","56","035"],
["42459","5390","36.4","6.3","25.8","74.2","20609","Sweetwater County, Wyoming","56","037"],
["23319","3609","39.6","4.6","17.8","82.2","11039","Teton County, Wyoming","56","039"],
["20514","3022","36.3","6.9","28.6","71.4","10077","Uinta County, Wyoming","56","041"],
["7768","1730","42.7","5.1","22.8","77.2","3764","Washakie County, Wyoming","56","043"],
["6891","1416","43.6","4.5","20.2","79.8","3207","Weston County, Wyoming","56","045"],
["18068","3716","43.3","3.6","19.0","81.0","9258","Adjuntas Municipio, Puerto Rico","72","001"],
["38307","7796","44.5","3.3","16.9","83.1","19581","Aguada Municipio, Puerto Rico","72","003"],
["55241","12323","43.6","3.6","18.1","81.9","28490","Aguadilla Municipio, Puerto Rico","72","005"],
["24567","5051","43.2","3.4","18.1","81.9","12561","Aguas Buenas Municipio, Puerto Rico","72","007"],
["24565","5425","45.1","3.7","17.8","82.2","12810","Aibonito Municipio, Puerto Rico","72","009"],
["25859","5489","43.8","3.2","17.2","82.8","13385","A\u00F1asco Municipio, Puerto Rico","72","011"],
["88017","19609","43.8","3.7","17.5","82.5","46097","Arecibo Municipio, Puerto Rico","72","013"],
["16183","3286","41.6","4.0","20.3","79.7","8651","Arroyo Municipio, Puerto Rico","72","015"],
["22836","4558","42.4","3.9","18.6","81.4","12044","Barceloneta Municipio, Puerto Rico","72","017"],
["28982","4938","38.0","4.5","21.2","78.8","14707","Barranquitas Municipio, Puerto Rico","72","019"],
["185939","41762","43.3","3.7","17.0","83.0","98657","Bayam\u00F3n Municipio, Puerto Rico","72","021"],
["47403","11093","45.8","3.1","17.0","83.0","24981","Cabo Rojo Municipio, Puerto Rico","72","023"],
["128182","26639","43.1","3.6","17.8","82.2","68742","Caguas Municipio, Puerto Rico","72","025"],
["32885","6864","43.8","3.6","17.6","82.4","17208","Camuy Municipio, Puerto Rico","72","027"],
["42811","7590","41.5","3.8","19.3","80.7","22288","Can\u00F3vanas Municipio, Puerto Rico","72","029"],
["155886","35589","43.7","3.6","17.2","82.8","84758","Carolina Municipio, Puerto Rico","72","031"],
["23536","4984","41.9","3.8","18.9","81.1","12441","Cata\u00F1o Municipio, Puerto Rico","72","033"],
["42134","9052","43.9","3.6","17.4","82.6","21946","Cayey Municipio, Puerto Rico","72","035"],
["11463","2742","44.8","3.1","17.0","83.0","6009","Ceiba Municipio, Puerto Rico","72","037"],
["17045","3637","43.5","4.0","18.8","81.2","8790","Ciales Municipio, Puerto Rico","72","039"],
["40125","7445","42.1","3.9","18.9","81.1","20749","Cidra Municipio, Puerto Rico","72","041"],
["35268","6587","43.7","3.8","18.8","81.2","18165","Coamo Municipio, Puerto Rico","72","043"],
["18990","3634","41.1","3.7","18.9","81.1","9431","Comer\u00EDo Municipio, Puerto Rico","72","045"],
Parents
  • Those names haven't changed; that's why you aren't finding them in the geography changes list. It looks like you pasted Unicode escape codes instead of the correct letters. 00F1 is the code for ñ, so it's "Añasco", not "A\u00F1asco". Check New Mexico: are you seeing Doña Ana County?
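    As a quick sanity check, the pasted text can be decoded in a couple of lines of Python (a sketch; `raw` here is a hypothetical string holding the escape sequence exactly as it appears in the copy-paste):

    ```python
    # "\u00F1" in the pasted output is a JSON escape for n-tilde (U+00F1).
    # Decoding the escape sequence recovers the accented letter.
    raw = "A\\u00F1asco Municipio, Puerto Rico"   # literal backslash-u-0-0-F-1, as pasted
    fixed = raw.encode("ascii").decode("unicode_escape")
    print(fixed)  # Añasco Municipio, Puerto Rico
    ```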

  • Hi, I am just running the API and copying directly into Excel and/or Word, and they show up this way. I do map to the codes, not the names, but there seemed to be a lot more than I remember. I do see Do\u00F1a Ana County, New Mexico. I am doing the API and copy-paste. Maybe next year I'll have AI run Python scripts and clean it up. I have standard Win11 and don't want to change the screen or fonts from the defaults, etc. (which might help). I pull 87 fields across many tables, and it takes 50 minutes or so, then a day plus to check and get a clean and happy Excel table, and then maybe 3-4 days to put something in Tableau. I might pay someone on Fiverr $50 to double-check Tableau vs. the web data a bit, because I have so many steps and no one to review except me.

Children
  • You can probably pass an encoding signal into the API and get back the actual accented letters. I don't use the API myself, but it'll probably be something like encoding=utf-8

  • OK, you can't do that. All JSON from the API is encoded this way. You'll need a bit of Python to decode the escape sequences; parsing the response with the standard library, something like

    json.loads(response_text)

    ...will turn "A\u00F1asco" into "Añasco" automatically, since JSON parsers decode \u escapes for you. Alternatively, you can decode using an online tool like this one.
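    For example, here is a minimal sketch using a shortened, made-up sample shaped like the ACS5 subject response (the real response has the same list-of-lists layout, with a header row first):

    ```python
    import json

    # Hypothetical sample shaped like the ACS API's JSON output.
    # json.loads decodes \uXXXX escapes into real characters automatically.
    sample = '[["NAME","state","county"],["A\\u00F1asco Municipio, Puerto Rico","72","011"]]'
    rows = json.loads(sample)
    header, data = rows[0], rows[1:]
    for row in data:
        print(row[0])  # Añasco Municipio, Puerto Rico
    ```

    Copying the decoded strings into Excel then keeps the accented letters intact.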

  • Thanks for the info. At some point next year I might go to Python and make things easier, but I still have this API mapped to an Excel file, and for this year, it's what I'll use. The thought of AI (I use it every day and just put out 3 great Chrome Extensions with it) trying to work through the exceptions and occasionally funky -666666666 etc. with Census data is a no go. However, giving it the raw data and asking it to completely rewrite all the tables in a clear, concise manner might work better than anyone might think.

  • Well, if you're stuck using only Excel, you could just write a macro or VBA routine to replace those \u codes with the correct letter. I think there's only about a dozen or so.

    The -666666666 and other similar numbers are the jam values, also easily replaced. There are only 6 of them for estimates and MOEs.
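    If the cleanup ever does move to Python, the jam-value replacement is a one-liner per row. A sketch, assuming the sentinel list below (these are commonly cited ACS jam values, but the exact set should be checked against the ACS documentation):

    ```python
    # Hypothetical cleanup step: replace ACS "jam" sentinel values with None
    # (blank cells) before loading rows into Excel/Tableau.
    JAM_VALUES = {"-999999999", "-888888888", "-666666666",
                  "-555555555", "-333333333", "-222222222"}

    def clean_row(row):
        """Return the row with any jam value replaced by None."""
        return [None if cell in JAM_VALUES else cell for cell in row]

    print(clean_row(["-666666666", "42.4", "Sheridan County, Wyoming"]))
    # [None, '42.4', 'Sheridan County, Wyoming']
    ```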