Python script and multiple API calls practice with Google and Twitter

Note from the author: No artificial intelligence (AI) was used to write this blog.
However, being a non-native English speaker, I may have used a translator or grammar corrector in some cases, which may have altered my natural way of writing. Thank you for your understanding.

After doing a lot of Python tutorials and spending time “playing” with Postman and API calls during my journey to network programmability and automation, I wanted to make a concrete example, from zero to a visible result.

Since I love photography and my country is full of beautiful but not well-known places, I chose to do the following:

  • Take the complete list of municipalities, towns, and villages in my country.
  • Choose one location per day, randomly, from this list.
  • Search for and download a photo of this place from Google Maps, including the related information and the author of the picture.
  • Post all of this on a dedicated Twitter account.

I set myself very simple rules for this: use only Python and API calls. And it must be automated to make one post per day, without my having to touch anything.
Let’s see this in detail.

The location list

In Switzerland, we have the Federal Statistical Office, which publishes a lot of useful data, including the list of Swiss municipalities as a plain-text file, already formatted.

This list is free and available here. It contains several pieces of information I do not need for my small project, so my first task was to “clean up” this list a little to keep only the municipality and the state. In the end, I created a nice CSV file with two columns, municipality and state, and 3480 lines.

But in fact, Switzerland has only 2222 municipalities, so my file had about 1258 extra entries. I realized the list also contains the names of communes that have since merged into others, as well as the lakes; and if a lake spans multiple states, it appears once per state. In the end, I decided to keep my list like this: the more entries there are, as long as they are in Switzerland, the better it is for the diversity of the photos. The list is available on my GitHub page, here.

The Python Script

The script is available on my GitHub page here. Please be indulgent: there are certainly many points to optimize, and I am aware that it is not perfect.

1. Choose one line of the CSV file at random to get the location

This seems very simple: choose one random line from a CSV file. It’s a static file, so this should not be a big deal. Still, I had a lot of trouble finding the ideal option. Here’s what I ended up with:

import csv
import random

def commune_random():
    # Choose a random commune from the list
    communes_list = "./20170402_communes-list.csv"
    with open(communes_list) as f:
        rows = list(csv.reader(f))
    # random.choice avoids hard-coding the number of lines in the file
    return random.choice(rows)

2. Get a photo from Google Maps related to the location name

First, I needed to get an API key from Google to be able to make API calls. For this, the website contains all the information on how to get an API key.

Once this step was done, I needed to build my API call and extract the needed information from the answer. This is where it becomes really interesting. All the details on how to make each API call are very well documented here.

For my case, the API call format is this:
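The exact call is not reproduced here; as a sketch, assuming the Places Text Search endpoint is the one used, building the query URL looks like this (MY_API_KEY is a placeholder for a real key):

```python
import urllib.parse

def build_search_url(query, api_key):
    # Places Text Search endpoint; the "query" parameter takes the
    # "place, state" string straight from the CSV line
    base = "https://maps.googleapis.com/maps/api/place/textsearch/json"
    params = urllib.parse.urlencode({"query": query, "key": api_key})
    return base + "?" + params

url = build_search_url("Baar, ZG", "MY_API_KEY")
```

The resulting URL can then be fetched, for example with requests.get(url).json(), to obtain the JSON answer.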

From the step above, I get the entire line of the CSV file, formatted like this: “place, state”, which is already the right format to insert into my query. I just need to add my API key and extract the needed information to get the photo_reference.

Here is an example of the answer when I searched for “Baar, ZG”:

   "html_attributions" : [],
   "results" : [
         "formatted_address" : "Baar, Suisse",
         "geometry" : {
            "location" : {
               "lat" : 47.1953729,
               "lng" : 8.526087
            "viewport" : {
               "northeast" : {
                  "lat" : 47.2231001,
                  "lng" : 8.577710099999999
               "southwest" : {
                  "lat" : 47.1508099,
                  "lng" : 8.49607
         "icon" : "",
         "id" : "ed24443c409e85dffba8e031e2d9b05f93e683db",
         "name" : "Baar",
         "photos" : [
               "height" : 2592,
               "html_attributions" : [
                  "\u003ca href=\"\"\u003eDinkar Gupta\u003c/a\u003e"
               "photo_reference" : "CmRaAAAAEXQJAx6vQ6rUX-cMMhoX8dGrEif3wCezY5yzjMhsUsLW5LEpgGTzITRO-SrVUgxhj7hC0AWqXe1K0xMXiK6qgKZbcLwr48l6qDU_uUhAb5VCFigTLJCfylAaSHSAkmynEhBVYkWR4FuZTIlKf2BX3qgJGhR9GE5FZj6LHSGBSr5j0GKr88bgwA",
               "width" : 4608
         "place_id" : "ChIJD5CdDsurmkcRWvqmITv_G1E",
         "rating" : 0,
         "reference" : "ChIJD5CdDsurmkcRWvqmITv_G1E",
         "types" : [ "locality", "political" ]
   "status" : "OK"


From this, I needed to extract two things:

  1. The photo_reference: this is the ID used to download the picture itself.
  2. The html_attributions: this is the name of the person who posted the picture to Google Maps, and the URL of their profile.

To extract them, as the format is always the same, it’s just a dictionary lookup like this:

photoref = photoref_answer["results"][0]["photos"][0]["photo_reference"]
photoremark = photoref_answer["results"][0]["photos"][0]["html_attributions"][0]

After a few tests, I noticed a problem: a place may have no picture at all. So I added a “try” and “except” to my script to repeat the search with another place when this happens.
To download the picture, I use a simple “GET” and check that the server answers with code 200. Otherwise, here too, I restart the entire process.
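The retry logic can be sketched like this. This is a sketch, not the author’s exact code: fetch_photoref stands for any function that performs the search above and raises KeyError or IndexError when the place has no photo, mirroring the dictionary extraction shown earlier:

```python
def pick_place_with_photo(commune_random, fetch_photoref, max_tries=10):
    # Draw random places until one of them has a photo attached
    for _ in range(max_tries):
        place = commune_random()
        try:
            # Raises KeyError/IndexError when "photos" is missing or empty,
            # exactly like answer["results"][0]["photos"][0] would
            return place, fetch_photoref(place)
        except (KeyError, IndexError):
            continue  # no photo for this place: try another one
    raise RuntimeError("no place with a photo found")
```

Bounding the loop with max_tries avoids spinning forever if the API keeps answering without photos.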


3. Post the picture and text related to it on Twitter

To post on Twitter, I use tweepy, aka Twitter for Python. This is much easier than doing everything “manually” with the Twitter API.

As with Google, before making any API call to Twitter, I needed to get my API keys. For this, you can follow this guide; this blog post is also a good help.

With tweepy, the syntax is very simple. Along with the picture, I add the name of the place, the name of the state, the name and URL of the author, and a reference to this page.

I decided to create a Twitter account dedicated to this, called Switzerland_Pix. You can view the results here:

Put everything together

I chose to use Git to update my script, and to run it on a small free AWS Linux server, to practice Git and AWS as well. The script is then triggered every morning by a cron job.
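The scheduling itself fits in a single crontab line; a sketch with hypothetical paths (the interpreter location and script name are placeholders):

```shell
# Run the posting script every morning at 08:00, server time
0 8 * * * /usr/bin/python3 /home/ec2-user/switzerland_pix/post_picture.py
```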



The documentation of websites like Google or Twitter is very detailed on how to use their APIs. Everything is accessible for free if we limit ourselves to a few API requests per day. And on GitHub, there are a lot of extremely useful resources.

So in the end, it was not very difficult. But this is a relatively easy and short script, and I do not consider myself a developer.



2 Thoughts to “Python script and multiple API calls practice with Google and Twitter”

  1. lost1

    Just 120 lines of code to do that! I was thinking a lot more was needed. The result looks impressive and fun.
    I had a coding course in school; I remember writing a lot more lines to do boring, unimpressive stuff that ran in a console. I then chose a networking career to avoid coding, but now I cannot avoid it.
    If you can be both a networking expert and a developer, you are the king.

    1. Thank you very much for your comment.
      Yes, it’s a very small script, but it uses libraries like requests and tweepy that are not small. 🙂
      I think for a network engineer it is more useful to know how to use and debug important libraries (for example paramiko, netmiko, requests, etc.) than to know how to code thousands of lines. We are not developers.

      Best Regards,
