
RE: st: geocode command and google query limit

From   Lance Erickson <>
To   "" <>
Subject   RE: st: geocode command and google query limit
Date   Mon, 6 Feb 2012 18:01:50 +0000

I've noticed the same decline in successful requests using -geocode-. In fact, if I make a mistake on the first (and second) attempt with the command, subsequent attempts can be unsuccessful for EVERY address. I've used a looping strategy, perhaps similar to the one you allude to, and in cases like this I end up with a loop that never ends because I never get any successful geocodes.

I have only tried the command on a dataset of about 3,500 addresses and I don't know how big yours is, but assuming you don't exhaust Google on the first run because of an enormous dataset, just wait a day or so and try again. The help file indicates that the limit resets each day. In my experience, I've been able to successfully get codes for all addresses after waiting a day, and I never have to manually enter addresses into the Google web interface.

Here's an example of the code I use:

// get geocodes
gen geocode  = .
gen geoscore = .
gen latitude = .
gen longitude = .

local i = 1
while _N > 0 {
	drop geocode geoscore latitude longitude
	geocode, address(address) city(city) zip(zipcode)
	preserve
	keep if geocode == 200           // save this pass's successes
	save "R:\geocode`i'", replace
	restore
	keep if geocode != 200           // retry the remainder on the next pass
	local ++i
}

// combine the saved batches
local k = `i' - 1
use "R:\geocode1", clear
forvalues j = 2/`k' {
	append using "R:\geocode`j'"
}
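One way to guard against the never-ending loop I described above is to cap the number of passes. Here's a sketch of that variant; the cap of 10 is an arbitrary choice on my part, not a documented Google limit:

```stata
// same looping strategy, but give up after a fixed number of passes
local maxtries = 10                      // arbitrary cap, adjust to taste
local i = 1
while _N > 0 & `i' <= `maxtries' {
	capture drop geocode geoscore latitude longitude
	geocode, address(address) city(city) zip(zipcode)
	preserve
	keep if geocode == 200               // save this pass's successes
	save "R:\geocode`i'", replace
	restore
	keep if geocode != 200               // addresses still to retry
	local ++i
}
```

Any addresses left in memory when the loop exits are the ones that never got a 200 code, so you can inspect them rather than looping forever.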

-----Original Message-----
From: [] On Behalf Of Rüdiger Vollmeier
Sent: Sunday, February 05, 2012 5:14 AM
Subject: st: geocode command and google query limit

Dear Stata users,

I've used the -geocode- command (.ado) to locate institutions with the fulladdress option (using number, street, city, and country information).
However, for a high percentage of the requests I made, I got error code 620 (Google query limit reached). I am wondering what the best procedure is to deal with this issue.

So far I have looped over the requests until a sufficiently high share of observations were geolocated, and then manually geocoded the remaining observations using the Google web interface. However, this procedure is highly frustrating, and there may be better approaches. Hence, the following questions popped into my mind:

1. Does anybody have a better idea how to deal with this issue (or, in other words, how to game Google's query limit)?
2. Does anybody know how this query limit is set by Google? The number of successful requests decreases with the number of requests made so far - but which past requests does Google take into consideration? At what number of requests does the limit start to apply strictly?
3. Is it helpful to pause between requests (e.g. using the
-sleep- command)? How long should the pause be?
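On question 3, -sleep- takes its argument in milliseconds, so pausing between passes could be sketched like this inside the kind of retry loop discussed above. The one-minute delay is a pure guess, not a known Google threshold:

```stata
// one pass over the not-yet-geocoded addresses, then pause before retrying
geocode, address(address) city(city) zip(zipcode)
sleep 60000        // wait 60,000 ms (one minute) before the next pass (guess)
```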

I am thankful for any suggestions.
*   For searches and help try:
© Copyright 1996–2018 StataCorp LLC