Retrieve data with APIs from Python

APIs (Application Programming Interfaces) are an increasingly common way of accessing data. Thanks to APIs, script automation is easier: it is no longer necessary to store a file and manage its different versions; you only need to query a database and let the data producer handle the updates.

Author

Lino Galiana

Published

2025-12-09


1 Introduction: What is an API?

In the previous chapters, we saw how to consume data from a file (the simplest access mode) or how to retrieve data through web scraping, a method that allows Python to mimic the behavior of a web browser and extract information by harvesting the HTML that a website serves.

Web scraping is a makeshift approach to accessing data. Fortunately, there are other ways to access data: data APIs. In computing, an API is a set of protocols that enables two software systems to communicate with each other. For example, the term “Pandas API” is sometimes used to indicate that Pandas serves as an interface between your Python code and a more efficient compiled language (C) that performs the calculations you request at the Python level. The goal of an API is to provide a simple access point to a functionality while hiding the implementation details.

In this chapter, we focus mainly on data APIs. They are simply a way to make data available: rather than allowing the user direct access to databases (often large and complex), the API invites them to formulate a query which is processed by the server hosting the database, and then returns data in response to that query.

The increased use of APIs in the context of open data strategies is one of the pillars of the 15 French ministerial roadmaps regarding the opening, circulation, and valorization of public data.

Note

In recent years, an official geocoding service has been established for French territory. It is free and efficiently allows addresses to be geocoded via an API. This API, known as the National Address Database (BAN), has benefited from the pooling of data from various stakeholders (local authorities, postal services, IGN) as well as the expertise of contributors like Etalab. Its documentation is available at https://api.gouv.fr/les-api/base-adresse-nationale.

A common example used to illustrate APIs is that of a restaurant. The documentation is like your menu: it lists the dishes (databases) that you can order and any optional ingredients you can choose (the parameters of your query): chicken, beef, or a vegetarian option? When you order, you don’t get to see the recipe used in the kitchen to prepare your dish – you simply receive the finished product. Naturally, the more refined the dish you request (i.e. involving complex calculations on the server side), the longer it will take to arrive.

TipIllustration with the BAN API

To illustrate this, let’s imagine what happens when, later in the chapter, we make requests to the BAN API.

Using Python, we send our order to the API: addresses that are more or less complete, along with additional instructions such as the municipality code. These extra details are akin to information provided to a restaurant’s server—like dietary restrictions—which personalize the recipe.

Based on these instructions, the dish is prepared. Specifically, a routine is executed on Etalab’s servers that searches an address repository for the one most similar to the address requested, possibly adapting based on the additional details provided. Once the kitchen has completed this preparation, the dish is sent back to the client. In this case, the “dish” consists of geographic coordinates corresponding to the best matching address.

Thus, the client only needs to focus on submitting a proper query and enjoying the dish delivered. The complexity of the process is handled by the specialists who designed the API. Perhaps other specialists, such as those at Google Maps, implement a different recipe for the same dish (geographic coordinates), but they will likely offer a very similar menu. This greatly simplifies your work: you only need to change a few lines of API call code rather than overhauling a long and complex set of address identification methods.

Pedagogical Approach

After an initial presentation of the general principle of APIs, this chapter illustrates their use through Python via a fairly standard use case: we have a dataset that we first want to geolocate. To do this, we will ask an API to return geographic coordinates based on addresses. Later, we will retrieve somewhat more complex information through other APIs.

2 APIs 101

An API is intended to serve as an intermediary between a client and a server. This client can be of two types: a web interface or programming software. The API makes no assumptions about the tool sending it a command; it simply requires adherence to a standard (usually an HTTP request) and a query structure (the arguments), and then sends back the result.

2.1 Understanding with an interactive example

The first mode (access via a browser) is primarily used when a web interface allows a user to make choices in order to return results corresponding to those selections. Let’s revisit the example of the geolocation API that we will use in this chapter. Imagine a web interface that offers the user two choices: a postal code and an address. These inputs will be injected into the query, and the server will respond with the appropriate geolocation.

Here are our two widgets that allow the client (the web page user) to choose their address.

A little formatting of the values provided by this widget allows one to obtain the desired query:

This gives us an output in JSON format, the most common output format for APIs.

If a nicer display is desired, like the map above, the web browser will need to reprocess this output, which is typically done using JavaScript, the programming language embedded in web browsers.

2.2 How to Do It with Python?

The principle is the same, although we lose the interactive aspect. With Python, the idea is to construct the desired URL and fetch the result through an HTTP request.

We have already seen in the web scraping chapter how Python communicates with the internet via the requests package. This package follows the HTTP protocol where two main types of requests can be found: GET and POST:

  • The GET request is used to retrieve data from a web server. It is the simplest and most common method for accessing the resources on a web page. We will start by describing this one.
  • The POST request is used to send data to the server, often with the goal of creating or updating a resource. On web pages, it is commonly used for submitting forms that need to update information in a database (passwords, customer data, etc.). We will see its usefulness later, when we begin to deal with authenticated requests where additional information must be submitted with our query.

Let’s conduct a first test with Python as if we were already familiar with this API.

import requests
adresse = "88 avenue verdier"
# The quotes inside the f-string expression must differ from the f-string's
# own delimiters (nesting identical quotes only works from Python 3.12 onward)
url_ban_example = f"https://api-adresse.data.gouv.fr/search/?q={adresse.replace(' ', '+')}&postcode=92120"
requests.get(url_ban_example)
<Response [200]>

What do we get? An HTTP status code. The code 200 corresponds to successful requests, meaning the server was able to respond. If this is not the case, for one reason or another, you will receive a different code.

TipHTTP Status Codes

HTTP status codes are standard responses sent by web servers to indicate the result of a request made by a client (such as a web browser or a Python script). They are categorized based on the first digit of the code:

  • 1xx: Informational
  • 2xx: Success
  • 3xx: Redirection
  • 4xx: Client-side Errors
  • 5xx: Server-side Errors

The key codes to remember are: 200 (success), 400 (bad request), 401 (authentication failed), 403 (forbidden), 404 (resource not found), and 503 (the server is unable to respond).
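Since the family of a status code is given by its first digit, a quick way to interpret any code is to look at that digit. The small helper below is purely illustrative (it is not part of requests):

```python
# Hypothetical helper: classify an HTTP status code by its first digit
def http_status_family(code: int) -> str:
    families = {
        1: "Informational",
        2: "Success",
        3: "Redirection",
        4: "Client-side error",
        5: "Server-side error",
    }
    return families.get(code // 100, "Unknown")

print(http_status_family(200))  # Success
print(http_status_family(404))  # Client-side error
print(http_status_family(503))  # Server-side error
```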

To retrieve the content returned by requests, there are several methods available. When the JSON is well-formatted, the simplest approach is to use the json method, which converts it into a dictionary:

req = requests.get(url_ban_example)
localisation_insee = req.json()
localisation_insee
{'type': 'FeatureCollection',
 'features': [{'type': 'Feature',
   'geometry': {'type': 'Point', 'coordinates': [2.309144, 48.81622]},
   'properties': {'label': '88 Avenue Verdier 92120 Montrouge',
    'score': 0.9734299999999999,
    'housenumber': '88',
    'id': '92049_9625_00088',
    'banId': '92dd3c4a-6703-423d-bf09-fc0412fb4f89',
    'name': '88 Avenue Verdier',
    'postcode': '92120',
    'citycode': '92049',
    'x': 649270.67,
    'y': 6857572.24,
    'city': 'Montrouge',
    'context': '92, Hauts-de-Seine, Île-de-France',
    'type': 'housenumber',
    'importance': 0.70773,
    'street': 'Avenue Verdier',
    '_type': 'address'}}],
 'query': '88 avenue verdier'}

In this case, we can see that the data is nested within a JSON. Therefore, a bit of code needs to be written to extract the desired information from it:

localisation_insee.get('features')[0].get('properties')
{'label': '88 Avenue Verdier 92120 Montrouge',
 'score': 0.9734299999999999,
 'housenumber': '88',
 'id': '92049_9625_00088',
 'banId': '92dd3c4a-6703-423d-bf09-fc0412fb4f89',
 'name': '88 Avenue Verdier',
 'postcode': '92120',
 'citycode': '92049',
 'x': 649270.67,
 'y': 6857572.24,
 'city': 'Montrouge',
 'context': '92, Hauts-de-Seine, Île-de-France',
 'type': 'housenumber',
 'importance': 0.70773,
 'street': 'Avenue Verdier',
 '_type': 'address'}

This is the main disadvantage of using APIs: the post-processing of the returned data. The necessary code is specific to each API, since the structure of the JSON depends on the API.
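To reduce this boilerplate, pandas offers json_normalize, which flattens nested JSON into tabular columns. A minimal sketch, reusing an abridged copy of the BAN response shown above:

```python
import pandas as pd

# Abridged copy of the BAN JSON response seen earlier
localisation_insee = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [2.309144, 48.81622]},
            "properties": {
                "label": "88 Avenue Verdier 92120 Montrouge",
                "city": "Montrouge",
                "postcode": "92120",
            },
        }
    ],
    "query": "88 avenue verdier",
}

# json_normalize flattens nested keys into dotted column names
df = pd.json_normalize(localisation_insee["features"])
print(df[["properties.label", "properties.city", "properties.postcode"]])
```

The nested `properties` dictionary becomes columns such as `properties.city`, which is often enough to skip the manual `.get(...)` chaining.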

2.3 How to Know the Inputs and Outputs of APIs?

Here, we took the BAN API as a magical tool whose main inputs (the endpoint, parameters, and their formatting…) were known. But how does one actually get there in practice? Simply by reading the documentation when it exists and testing it with examples.

Good APIs provide an interactive tool called a Swagger UI. It is an interactive website where the API’s main features are described and where the user can test examples interactively. These documentations are often generated automatically when the API is built and made available via a /docs entry point. They often allow you to edit certain parameters in the browser, view the resulting JSON (or the generated error), and retrieve the formatted query that produced it. These interactive browser consoles replicate the experimentation that can otherwise be done with specialized tools like Postman.

Regarding the BAN API, the documentation can be found at https://adresse.data.gouv.fr/api-doc/adresse. Unfortunately, it is not interactive. However, it provides many examples that can be tested directly from the browser. You simply need to use the URLs provided as examples. These are presented using curl (a command-line equivalent of requests):

curl "https://api-adresse.data.gouv.fr/search/?q=8+bd+du+port&limit=15"

Just copy the URL (https://api-adresse.data.gouv.fr/search/?q=8+bd+du+port&limit=15), open a new tab, and verify that it produces a result. Then change a parameter and check again until you find the structure that fits. After that, you can move on to Python as suggested in the following exercise.
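Rather than concatenating the URL by hand, the params argument of requests lets the library handle the encoding of spaces and special characters. The sketch below builds (but does not send) the same URL as in the curl example:

```python
import requests

# Prepare a GET request without sending it; requests encodes the query string
req = requests.Request(
    "GET",
    "https://api-adresse.data.gouv.fr/search/",
    params={"q": "8 bd du port", "limit": 15},
).prepare()
print(req.url)
# https://api-adresse.data.gouv.fr/search/?q=8+bd+du+port&limit=15
```

Calling requests.get(url, params=...) directly would both build this URL and send the request in one step.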

2.4 Application

To start this exercise, you will need the following variable:

adresse = "88 Avenue Verdier"
TipExercise 1: Structure an API Call from Python
  1. Test the API without any additional parameters, and convert the result into a DataFrame.
  2. Limit the search to Montrouge using the appropriate parameter and find the corresponding INSEE code or postal code via Google.
  3. (Optional): Display the found address on a map.

The first two rows of the DataFrame obtained in question 1 should be

label score housenumber id banId name postcode citycode x y city context type importance street _type locality
0 88 Avenue Verdier 92120 Montrouge 0.973430 88 92049_9625_00088 92dd3c4a-6703-423d-bf09-fc0412fb4f89 88 Avenue Verdier 92120 92049 649270.67 6857572.24 Montrouge 92, Hauts-de-Seine, Île-de-France housenumber 0.70773 Avenue Verdier address NaN
1 Avenue Verdier 44500 La Baule-Escoublac 0.719413 NaN 44055_3690 NaN Avenue Verdier 44500 44055 291884.83 6701220.48 La Baule-Escoublac 44, Loire-Atlantique, Pays de la Loire street 0.60104 Avenue Verdier address NaN

For question 2, this time we get back only one observation, which could be further processed with GeoPandas to verify that the point has been correctly placed on a map.

label score housenumber id banId name postcode citycode x y city context type importance street _type geometry
0 88 Avenue Verdier 92120 Montrouge 0.97343 88 92049_9625_00088 92dd3c4a-6703-423d-bf09-fc0412fb4f89 88 Avenue Verdier 92120 92049 649270.67 6857572.24 Montrouge 92, Hauts-de-Seine, Île-de-France housenumber 0.70773 Avenue Verdier address POINT (2.30914 48.81622)
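Converting such a result into a GeoPandas object can be sketched as follows, using the longitude/latitude returned for our example address (geopandas is assumed to be installed):

```python
import pandas as pd
import geopandas as gpd

# Coordinates returned by the BAN for our example address
df = pd.DataFrame(
    {
        "label": ["88 Avenue Verdier 92120 Montrouge"],
        "longitude": [2.309144],
        "latitude": [48.81622],
    }
)

# Build point geometries from the coordinate columns, in WGS84 (EPSG:4326)
gdf = gpd.GeoDataFrame(
    df,
    geometry=gpd.points_from_xy(df["longitude"], df["latitude"]),
    crs="EPSG:4326",
)
print(gdf.geometry.iloc[0])
```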

Finally, for question 3, we obtain this map (more or less the same as before):

(Interactive map showing the geolocated address in Montrouge.)

3 More Examples of GET Requests

3.1 Main Source

We will use as the main basis for this tutorial the permanent database of facilities (BPE), a directory of facilities and services open to the public.

We will begin by retrieving the data that interest us. Rather than fetching every variable in the file, we only retrieve the ones we need: some variables concerning the facility, its address, and its local municipality.

We will restrict our scope to primary, secondary, and higher education institutions in the department of Haute-Garonne (department 31). These facilities are identified by a specific code, ranging from C1 to C5.

import duckdb

query = """
FROM read_parquet('https://minio.lab.sspcloud.fr/lgaliana/diffusion/BPE23.parquet')
SELECT NOMRS, NUMVOIE, INDREP, TYPVOIE, LIBVOIE,
       CADR, CODPOS, DEPCOM, DEP, TYPEQU,
       concat_ws(' ', NUMVOIE, INDREP, TYPVOIE, LIBVOIE) AS adresse, SIRET
WHERE DEP = '31'
      AND NOT (starts_with(TYPEQU, 'C6') OR starts_with(TYPEQU, 'C7'))
"""

bpe = duckdb.sql(query)
bpe = bpe.to_df()

3.2 Retrieving Custom Data via APIs

We previously covered the general principle of an API request. To further illustrate how to retrieve data on a larger scale using an API, let’s try to fetch supplementary data to our main source. We will use the education directory, which provides extensive information on educational institutions. We will use the SIRET number to cross-reference the two data sources.

The following exercise will demonstrate the advantage of using an API to obtain custom data and the ease of fetching it via Python. However, this exercise will also highlight one of the limitations of certain APIs, namely the volume of data that needs to be retrieved.

TipExercise 2
  1. Visit the swagger of the National Education Directory API on api.gouv.fr/documentation and test an initial data retrieval using the records endpoint without any parameters.
  2. Since we have retained only data from Haute Garonne in our main database, we want to retrieve only the institutions from that department using our API. Make a query with the appropriate parameter, without adding any extras.
  3. Increase the limit on the number of results returned—do you see the problem?
  4. We will attempt to retrieve these data via the data.gouv Tabular API. Its documentation is here and the resource identifier is b22f04bf-64a8-495d-b8bb-d84dbc4c7983 (source). With the help of the documentation, try to retrieve data via this API using the parameter Code_departement__exact=031 to select only the department of interest.
  5. Do you see the problem and how we could automate data retrieval?

The first question allows us to retrieve an initial dataset.

identifiant_de_l_etablissement nom_etablissement type_etablissement statut_public_prive adresse_1 adresse_2 adresse_3 code_postal code_commune nom_commune ... libelle_nature code_type_contrat_prive pial etablissement_mere type_rattachement_etablissement_mere code_circonscription code_zone_animation_pedagogique libelle_zone_animation_pedagogique code_bassin_formation libelle_bassin_formation
0 0371489T Ecole maternelle Marie Pellin Ecole Public Allée Buissonnière None 37520 LA RICHE 37520 37195 La Riche ... ECOLE MATERNELLE 99 0370769K None None 0371522D None None 18A37 TOURAINE-NORD
1 0371502G Ecole élémentaire Lecotté Ecole Public 18 ter rue Anatole France None 37210 VERNOU SUR BRENNE 37210 37270 Vernou-sur-Brenne ... ECOLE DE NIVEAU ELEMENTAIRE 99 0370799T None None 0371729D None None 18A37 TOURAINE-NORD

2 rows × 72 columns

However, there are two issues: the number of rows and the department of interest. Let’s first address the latter with question 2.

identifiant_de_l_etablissement nom_etablissement type_etablissement statut_public_prive adresse_1 adresse_2 adresse_3 code_postal code_commune nom_commune ... libelle_nature code_type_contrat_prive pial etablissement_mere type_rattachement_etablissement_mere code_circonscription code_zone_animation_pedagogique libelle_zone_animation_pedagogique code_bassin_formation libelle_bassin_formation
0 0310171T Ecole maternelle publique Jean Macé Ecole Public 6 square François Lahille None 31770 COLOMIERS 31770 31149 Colomiers ... ECOLE MATERNELLE 99 0311325X None None 0312141J None None 16127 TOULOUSE NORD-OUEST
1 0310176Y Ecole maternelle publique Le Courraou Ecole Public Rue du Courraou None 31210 MONTREJEAU 31210 31390 Montréjeau ... ECOLE MATERNELLE 99 0310005M None None 0312866X None None 16106 COMMINGES
2 0310180C Ecole maternelle publique Jacques Prévert Ecole Public 11 rue Désiré None 31120 PORTET SUR GARONNE 31120 31433 Portet-sur-Garonne ... ECOLE MATERNELLE 99 0311093V None None 0311105H None None 16108 TOULOUSE SUD-OUEST
3 0310183F Ecole maternelle publique le pilat Ecole Public 1 rue du Dr Ferrand None 31800 ST GAUDENS 31800 31483 Saint-Gaudens ... ECOLE MATERNELLE 99 0310083X None None 0311108L None None 16106 COMMINGES
4 0310185H Ecole maternelle publique Jean Dieuzaide Ecole Public 14 chemin de la Glacière None 31200 TOULOUSE 31200 31555 Toulouse ... ECOLE MATERNELLE 99 0311265G None None 0312014W None None 16110 TOULOUSE NORD

5 rows × 72 columns

This is better, but we still only have 10 observations. If we try to adjust the number of rows (question 3), we get the following response from the API:

b'{\n  "error_code": "InvalidRESTParameterError",\n  "message": "Invalid value for limit API parameter: 200 was found but -1 <= limit <= 100 is expected."\n}'

Let’s try using more comprehensive data: the raw file on data.gouv. As seen in the metadata, we know there are over 1,000 schools for which data can be retrieved, but only 20 have been extracted here. The next field directly provides the URL for the next page of 20 results: this is how we can ensure we retrieve all our data of interest.

The key part for automating the retrieval of our data is the links key in the JSON:

{'profile': 'https://tabular-api.data.gouv.fr/api/resources/b22f04bf-64a8-495d-b8bb-d84dbc4c7983/profile/',
 'swagger': 'https://tabular-api.data.gouv.fr/api/resources/b22f04bf-64a8-495d-b8bb-d84dbc4c7983/swagger/',
 'next': 'https://tabular-api.data.gouv.fr/api/resources/b22f04bf-64a8-495d-b8bb-d84dbc4c7983/data/?Code_departement__exact=031&page=2&page_size=20',
 'prev': None}

By looping over it to traverse the list of accessible URLs, we can retrieve the data. Since the automation code is rather tedious to write, here it is:

import requests
import pandas as pd

# Initialize the initial API URL
url_api_datagouv = "https://tabular-api.data.gouv.fr/api/resources/b22f04bf-64a8-495d-b8bb-d84dbc4c7983/data/?Code_departement__exact=031&page_size=50"

# Initialize an empty list to store all data entries
all_data = []

# Initialize the URL for pagination
current_url = url_api_datagouv

# Loop until there is no next page
while current_url:
    try:
        # Make a GET request to the current URL
        response = requests.get(current_url)
        response.raise_for_status()  # Raise an exception for HTTP errors

        # Parse the JSON response
        json_response = response.json()

        # Extract data and append to the all_data list
        page_data = json_response.get('data', [])
        all_data.extend(page_data)
        print(f"Fetched {len(page_data)} records from {current_url}")

        # Get the next page URL
        links = json_response.get('links', {})
        current_url = links.get('next')  # This will be None if there's no next page

    except requests.exceptions.RequestException as e:
        print(f"An error occurred: {e}")
        break

The resulting DataFrame is as follows:

schools_dep31 = pd.DataFrame(all_data)
schools_dep31.head()
__id Identifiant_de_l_etablissement Nom_etablissement Type_etablissement Statut_public_prive Adresse_1 Adresse_2 Adresse_3 Code_postal Code_commune ... libelle_nature Code_type_contrat_prive PIAL etablissement_mere type_rattachement_etablissement_mere code_circonscription code_zone_animation_pedagogique libelle_zone_animation_pedagogique code_bassin_formation libelle_bassin_formation
0 1 0311890L Ecole maternelle publique Marie-Louise Ycart Ecole Public Rue du Stade None 31550 CINTEGABELLE 31550 31145 ... ECOLE MATERNELLE 99 0310084Y None None 0313170C None None 16128 TOULOUSE EST
1 2 0311891M Ecole maternelle publique Yvette Raynaud Ecole Public 1 rue des églantiers None 31120 ROQUES 31120 31458 ... ECOLE MATERNELLE 99 0311093V None None 0311105H None None 16108 TOULOUSE SUD-OUEST
2 3 0312111B Ecole élémentaire publique Château d'Ancely Ecole Public 4 allée du Vivarais None 31300 TOULOUSE 31300 31555 ... ECOLE DE NIVEAU ELEMENTAIRE 99 0310037X None None 0312826D None None 16126 TOULOUSE OUEST
3 4 0312119K Ecole élémentaire publique Françoise Dolto Ecole Public Avenue Pierre Mendès France None 31320 CASTANET TOLOSAN 31320 31113 ... ECOLE DE NIVEAU ELEMENTAIRE 99 0311633G None None 0311103F None None 16128 TOULOUSE EST
4 5 0312158C Ecole maternelle publique Eugène Montel Ecole Public Square Saint Exupéry None 31270 CUGNAUX 31270 31157 ... ECOLE MATERNELLE 99 0311093V None None 0311105H None None 16108 TOULOUSE SUD-OUEST

5 rows × 73 columns

We can merge this new data with our previous dataset to enrich it. For reliable production, care should be taken with schools that do not match, but this is not critical for this series of exercises.

bpe_enriched = bpe.merge(
  schools_dep31,
  left_on = "SIRET",
  right_on = "SIREN_SIRET"
)
bpe_enriched.head(2)
NOMRS NUMVOIE INDREP TYPVOIE LIBVOIE CADR CODPOS DEPCOM DEP TYPEQU ... libelle_nature Code_type_contrat_prive PIAL etablissement_mere type_rattachement_etablissement_mere code_circonscription code_zone_animation_pedagogique libelle_zone_animation_pedagogique code_bassin_formation libelle_bassin_formation
0 ECOLE PRIMAIRE PUBLIQUE DENIS LATAPIE LD LA BOURDETTE 31230 31001 31 C108 ... ECOLE DE NIVEAU ELEMENTAIRE 99 0310003K None None 0311108L None None 16106 COMMINGES
1 ECOLE MATERNELLE PUBLIQUE 21 CHE DE L AUTAN 31280 31003 31 C107 ... ECOLE MATERNELLE 99 0311335H None None 0311102E None None 16128 TOULOUSE EST

2 rows × 85 columns
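Regarding the caveat on non-matching schools: pandas' indicator argument to merge makes the check straightforward. A toy sketch (the SIRET values and frames below are made up for illustration):

```python
import pandas as pd

# Toy stand-ins for bpe and schools_dep31 (hypothetical values)
bpe_toy = pd.DataFrame({"SIRET": ["111", "222"], "NOMRS": ["ECOLE A", "ECOLE B"]})
schools_toy = pd.DataFrame({"SIREN_SIRET": ["111"], "nom": ["Ecole A"]})

# A left merge with indicator=True flags rows that found no match
check = bpe_toy.merge(
    schools_toy,
    left_on="SIRET",
    right_on="SIREN_SIRET",
    how="left",
    indicator=True,
)
print(check["_merge"].value_counts())
```

Rows tagged `left_only` in `_merge` are facilities with no counterpart in the education directory, which an inner merge (as above) would silently drop.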

This provides us with data enriched with new characteristics about the institutions. Although there are geographic coordinates in the dataset, we will pretend there aren’t to reuse our geolocation API.

4 Discovering POST Requests

4.1 Logic

So far, we have discussed GET requests. Now, we will introduce POST requests, which allow for more complex interactions with API servers.

To explore this, we will revisit the previous geolocation API but use a different endpoint that requires a POST request.

POST requests are typically used when specific data needs to be sent to trigger an action. For instance, in the web world, if authentication is required, a POST request can send a token to the server, which will respond by accepting your authentication.

In our case, we will send data to the server, which will process it for geolocation and then send us a response. To continue the culinary metaphor, it’s like handing over your own container (tupperware) to the kitchen to collect your takeaway meal.

4.2 Principle

Let’s look at this request provided on the geolocation API’s documentation site:

curl -X POST -F data=@path/to/file.csv -F columns=voie -F columns=ville -F citycode=ma_colonne_code_insee https://api-adresse.data.gouv.fr/search/csv/

As mentioned earlier, curl is a command-line tool for making API requests. The -X POST option clearly indicates that we want to make a POST request.

Other arguments are passed using the -F options. In this case, we are sending a file and adding parameters to help the server locate the data inside it. The @ symbol indicates that file.csv should be read from the disk and sent in the request body as form data.

4.3 Application with Python

We have requests.get, so naturally, we also have requests.post. This time, parameters must be passed to our request as a dictionary, where the keys are argument names and the values are Python objects.

The main challenge, illustrated in the next exercise, lies in passing the data argument: the file must be sent as a Python object using the open function.
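The general pattern can be sketched as follows. To keep the example self-contained, an in-memory CSV stands in for a file opened with open, and the request is prepared without being sent; the column names are hypothetical:

```python
import io
import requests

# In-memory CSV standing in for open("file.csv", "rb")
csv_buffer = io.BytesIO(b"adresse,DEPCOM\n88 avenue verdier,92049\n")

# files= sends the file as multipart form data; data= carries the other fields
req = requests.Request(
    "POST",
    "https://api-adresse.data.gouv.fr/search/csv/",
    files={"data": ("addresses.csv", csv_buffer)},
    data={"columns": "adresse", "citycode": "DEPCOM"},
).prepare()
print(req.headers["Content-Type"])
```

Replacing .prepare() with requests.post(url, files=..., data=...) would actually send the query, mirroring the curl -F command shown earlier.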

TipExercise 3: A POST request to geolocate our data in bulk
  1. Save the adresse, DEPCOM, and Nom_commune columns of the equipment database merged with our previous directory (object bpe_enriched) in CSV format. Before writing to CSV, it may be helpful to replace commas in the adresse column with spaces.
  2. Create the response object using requests.post with the correct arguments to geocode your CSV.
  3. Transform your output into a DataFrame using the following command:
bpe_loc = pd.read_csv(io.StringIO(response.text))

The obtained geolocations take this form

index adresse DEPCOM Nom_commune result_score latitude longitude
0 0 LD LA BOURDETTE 31001 Agassac 0.404534 43.374288 0.880679
1 1 21 CHE DE L AUTAN 31003 Aigrefeuille 0.730453 43.567370 1.585932

By enriching the previous data, this gives:

NOMRS NUMVOIE INDREP TYPVOIE LIBVOIE CADR CODPOS DEPCOM DEP TYPEQU ... etablissement_mere type_rattachement_etablissement_mere code_circonscription code_zone_animation_pedagogique libelle_zone_animation_pedagogique code_bassin_formation libelle_bassin_formation result_score latitude_ban longitude_ban
0 ECOLE PRIMAIRE PUBLIQUE DENIS LATAPIE LD LA BOURDETTE 31230 31001 31 C108 ... None None 0311108L None None 16106 COMMINGES 0.404534 43.374288 0.880679
1 ECOLE MATERNELLE PUBLIQUE 21 CHE DE L AUTAN 31280 31003 31 C107 ... None None 0311102E None None 16128 TOULOUSE EST 0.730453 43.567370 1.585932

2 rows × 88 columns

We can check that the geolocation is not too off by comparing it with the longitudes and latitudes of the education directory added earlier:

NOMRS Nom_commune longitude_annuaire longitude_ban latitude_annuaire latitude_ban
333 ECOLE ELEMENTAIRE PUBLIQUE JEAN-CLAUDE GOUZE Grenade 1.29882 1.301707 43.767418 43.762838
972 COLLEGE LES CHALETS Toulouse 1.43879 1.439313 43.615239 43.615552
152 COLLEGE JACQUES MAURE Castelginest 1.42372 1.422982 43.694716 43.694764
70 ECOLE PRIMAIRE PUBLIQUE LE PETIT PRINCE Belberaud 1.56572 1.566711 43.507009 43.507045
526 ECOLE PRIMAIRE PUBLIQUE GERARD LANG Paulhac 1.55492 1.555474 43.756215 43.757823

Without going into detail, the positions seem very similar, with only minor inaccuracies.

To make use of our enriched data, we can create a map. To add some context to it, we can place a background map of the municipalities. This can be retrieved using cartiflette:

from cartiflette import carti_download
shp_communes = carti_download(
  crs = 4326,
  values = ["31"],
  borders="COMMUNE",
  vectorfile_format="topojson",
  filter_by="DEPARTEMENT",
  source="EXPRESS-COG-CARTO-TERRITOIRE",
  year=2022
)
shp_communes.crs = 4326
This is an experimental version of cartiflette published on PyPi.
To use the latest stable version, you can install it directly from GitHub with the following command:
pip install git+https://github.com/inseeFrLab/cartiflette.git

Represented on a map, this gives:

(Interactive map of the geolocated facilities over the municipal boundaries of Haute-Garonne.)