Hi all, I am currently working on an online application for a large company that requires geocoding of customers' postal addresses. Due to the volume of data that will need to be geocoded on a daily basis, the enterprise version of Google Maps is required. The application will send an address to Google and get back a lat/long pair. In one scenario this happens in the user's browser, by moving a pin on a map; in another it happens via server-side code.

I was very surprised to read today that, under the terms of the enterprise licence, the data returned from Google can only be cached or persisted to a database for a maximum of 30 days before it must be refreshed. This does not suit my requirements, but it raises a question for me: how do large sites such as myhome and daft manage this scenario? I presume they need to store the data for periods longer than 30 days.

So I am putting this question out there: how do companies manage such a situation? Is refreshing the data via an automated process the way this is usually achieved? Any feedback would be great.
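To make the automated-refresh idea concrete, here is a minimal sketch of how a nightly job might pick out cache entries that have passed the 30-day limit and need re-geocoding. The cache schema (`address`, `geocoded_at`) and the `find_stale` helper are my own assumptions, not anything from the Google terms; the actual re-geocoding API call is omitted.

```python
from datetime import datetime, timedelta

# Assumed licence constraint: cached geocodes must be refreshed within 30 days.
CACHE_TTL = timedelta(days=30)

def find_stale(rows, now=None):
    """Return cached geocode rows older than the 30-day TTL.

    Each row is assumed to be a dict with a 'geocoded_at' datetime;
    a real implementation would query this from the database instead.
    """
    now = now or datetime.utcnow()
    return [r for r in rows if now - r["geocoded_at"] > CACHE_TTL]

# Hypothetical cache contents: one stale entry, one fresh entry.
rows = [
    {"address": "1 Main St", "geocoded_at": datetime.utcnow() - timedelta(days=45)},
    {"address": "2 High St", "geocoded_at": datetime.utcnow() - timedelta(days=5)},
]

stale = find_stale(rows)
# A nightly cron job would now re-submit each stale address to the
# geocoding service and update 'geocoded_at' on success.
```

In practice this would run as a scheduled job (cron or similar), batching the stale addresses and throttling requests to stay within the daily geocoding quota.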