Engage API for data export
This section guides you through exporting data from the Engage platform using the Engage API.
Longenesis.Engage provides an API for eligible clients, with endpoints for exporting participants' responses in JSON and Excel formats. The Python code sample below shows how to obtain an access token and download the export file.
Additionally, interactive API documentation is available here: https://engage-openapi.longenesis.com/docs.

import requests

AUTH_ENDPOINT = "https://auth.longenesis.com/realms/curator-engage/protocol/openid-connect/token"
USERNAME = "<your username>"
PASSWORD = "<your password>"
CLIENT_ID = "<Longenesis issued client id>"
CLIENT_SECRET = "<Longenesis issued client secret>"
ENGAGE_ORG_SLUG = "<organization slug>"
ENGAGE_ACTIVITY_SLUG = "<activity slug>"

# Obtain an access token via the OAuth2 password grant.
access_token = requests.post(
    AUTH_ENDPOINT,
    data={
        "username": USERNAME,
        "password": PASSWORD,
        "grant_type": "password",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    },
).json()["access_token"]

# Request the Excel export for one activity, filtered by date.
# The API host is assumed to match the interactive documentation host.
response = requests.get(
    f"https://engage-openapi.longenesis.com/v2/xlsx_answers/{ENGAGE_ORG_SLUG}",
    params={
        "activity_filter": ENGAGE_ACTIVITY_SLUG,
        "date_filter": ["2022-01-01", "2023-01-31"],
    },
    headers={"Authorization": f"Bearer {access_token}"},
)

# Save the exported report to disk.
with open("report_file.xlsx", "wb") as f:
    f.write(response.content)
Note that a single request can export at most 5000 activity entries.

To export the complete results of an activity with more than 5000 submissions, use the offset parameter to specify the entry at which the export starts; 5000 entries beginning at that position will be exported. The example below applies the offset parameter to export 20000 entries in four separate requests.

  • /v2/xlsx_answers/ENGAGE_ORG_SLUG?offset=0&limit=5000
  • /v2/xlsx_answers/ENGAGE_ORG_SLUG?offset=5000&limit=5000
  • /v2/xlsx_answers/ENGAGE_ORG_SLUG?offset=10000&limit=5000
  • /v2/xlsx_answers/ENGAGE_ORG_SLUG?offset=15000&limit=5000
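The four request paths above can also be generated programmatically. The sketch below builds them for a hypothetical total of 20000 entries; the 5000-entry page size is the per-request export limit noted above.

```python
# Sketch: build paginated export paths for a large activity.
# TOTAL_ENTRIES is hypothetical; LIMIT is the per-request export limit.
LIMIT = 5000
TOTAL_ENTRIES = 20000

paths = [
    f"/v2/xlsx_answers/ENGAGE_ORG_SLUG?offset={offset}&limit={LIMIT}"
    for offset in range(0, TOTAL_ENTRIES, LIMIT)
]

for path in paths:
    print(path)
```

Each path can then be requested exactly like the single export in the sample script, reusing the same access token.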
Filtering data by date
Note that the date_filter parameter here is a list. If you use a curl-like download tool, the download link will look as follows:

Notice how date_filter appears twice in the URL; this is expected. The multiple date_filter values are combined into a list on the API server. You can also pass multiple intervals using date_filter, for instance ...&date_filter=START1&date_filter=END1&date_filter=START2&date_filter=END2&..., where START1, END1, START2, END2 are successive timestamps with timezone information.

Please consult https://docs.python.org/3/library/datetime.html#datetime.datetime.fromisoformat on how to construct valid ISO 8601 timestamps.
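As a sketch of how the repeated date_filter parameters can be built, the snippet below encodes one hypothetical interval using only the standard library; requests performs the same encoding when you pass a list in params.

```python
from datetime import datetime
from urllib.parse import urlencode

# Hypothetical interval: January 2022, with explicit UTC offsets.
start = datetime.fromisoformat("2022-01-01T00:00:00+00:00")
end = datetime.fromisoformat("2022-01-31T23:59:59+00:00")

# doseq=True expands the list into repeated date_filter parameters,
# which the API server recombines into a list.
query = urlencode({"date_filter": [start.isoformat(), end.isoformat()]}, doseq=True)
print(query)
```

The resulting query string contains date_filter once per timestamp, matching the curl-style link shown above.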
Filtering data by participants
The person_filter parameter allows you to limit the exported information to specific persons.

The download link will look the following way:
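As an illustration, the sketch below builds such a link. The participant identifiers are hypothetical, and it assumes person_filter may be repeated in the query string the same way date_filter is.

```python
from urllib.parse import urlencode

# Hypothetical participant identifiers; real IDs come from the Engage platform.
person_ids = ["person-id-1", "person-id-2"]

# Assumes person_filter, like date_filter, accepts repeated values.
query = urlencode({"person_filter": person_ids}, doseq=True)
url = f"/v2/xlsx_answers/ENGAGE_ORG_SLUG?{query}"
print(url)
```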

How to run the sample script
We provide the sample script in the Python programming language; however, the workflow illustrated here is the same in any other language.
To run the sample script you will need credentials (username, password, client_id and client_secret are issued by Longenesis) and the ID of the activity (you can find it in the Engage platform).
Apart from having Python installed, you will also need the popular requests library for making API calls and, optionally, a virtual environment.
Assuming you have saved the sample code in a file named my_sample_script.py and have set the credentials and activity ID correctly, you can create a virtual environment, run the script, and obtain your export file by running the following commands in a terminal:

python3 -m venv venv
source venv/bin/activate
pip install requests
python3 my_sample_script.py
A new file report_file.xlsx should now appear in the same folder.
Full API documentation is available at https://engage-openapi.longenesis.com/docs.