Recurring Date Queries
The questions a developer asks most often when working with disaster data are temporal: "what happened in the last 24 hours?", "is anything still active?", "what events did I miss since my last sync?". This page collects ready-to-run recipes for those questions, organised by hazard module, using the real property names of the live API.
Two surfaces for time, and which one to use
The ADAM API gives you two complementary ways to filter by time:
- datetime= query parameter — the OGC standard temporal filter. Supports instants and intervals (with .. for open-ended). Operates on the collection's primary temporal extent.
- filter= with TIMESTAMP('...') — lets you compare against an arbitrary date-time field (e.g. updated_at for delta polling, published_at for "events that became visible after X"). This is what the Live Map itself uses internally.
Practical rule:
- For simple date windows on the collection's main timeline → use datetime=.
- For delta polling ("everything updated since my last sync") and for filtering on alternative date fields → use filter=field > TIMESTAMP('...').
- Combine datetime=, filter=, bbox=, sortby= freely.
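The two surfaces differ only in which query parameter carries the time constraint. A minimal sketch using Python's standard library (the collection id and field names are taken from the examples on this page):

```python
from urllib.parse import urlencode, quote

BASE = "https://api.adam.geospatial.wfp.org/api"
ITEMS = f"{BASE}/collections/adam.adam_eq_events/items"

# Surface 1: datetime= — an OGC interval on the primary temporal extent
# ("/.." leaves the upper bound open).
window = urlencode({"datetime": "2026-04-01T00:00:00Z/..", "limit": 10}, quote_via=quote)

# Surface 2: filter= — a CQL2 comparison against an arbitrary date-time field.
# Note: no trailing 'Z' inside TIMESTAMP() — the server rejects it.
delta = urlencode(
    {"filter": "updated_at > TIMESTAMP('2026-04-01T00:00:00')", "limit": 10},
    quote_via=quote,
)

print(f"{ITEMS}?{window}")
print(f"{ITEMS}?{delta}")
```

Note that the Z suffix is fine in the datetime= parameter but not inside a TIMESTAMP() literal.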
What CQL2 actually works on this API
The conformance document does not declare a CQL2 conformance class — but the server (TiPG) accepts a useful subset.
Works: =, >=, <=, >, <, <>, AND, OR, NOT, IN (...), IS NULL, IS NOT NULL, LIKE (case-sensitive), CASEI(prop) = '...' (case-insensitive equality), TIMESTAMP('YYYY-MM-DDTHH:MM:SS') literals.
Fails (HTTP 500): ILIKE, BETWEEN, T_AFTER / T_BEFORE / T_DURING predicates, and TIMESTAMP('...Z') with a trailing Z — the server stores timestamps without timezone suffix and the literal must match exactly. Always use TIMESTAMP('2026-04-06T00:00:00'), never with Z.
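Since the missing-Z rule is the single most common cause of a 500 here, it is worth wrapping once. A small helper (the function name is our own, not part of any SDK) that renders an aware Python datetime as a literal the server accepts:

```python
from datetime import datetime, timezone

def cql_timestamp(dt: datetime) -> str:
    """Render dt as a CQL2 TIMESTAMP() literal this server accepts:
    normalised to UTC, second precision, and crucially no trailing 'Z'."""
    if dt.tzinfo is not None:
        dt = dt.astimezone(timezone.utc)
    return f"TIMESTAMP('{dt.strftime('%Y-%m-%dT%H:%M:%S')}')"

print(cql_timestamp(datetime(2026, 4, 6, tzinfo=timezone.utc)))
# → TIMESTAMP('2026-04-06T00:00:00')
```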
Production patterns from the Live Map
The Live Map at adam.geospatial.wfp.org hits the same API you do. Below are the canonical query shapes it uses, captured from production. Drop them into a terminal and you should see live data.
About Accept: application/geo+json
The API gateway returns the body base64-encoded when the client does not declare a precise Accept header (*/* triggers binary handling). Always send -H "Accept: application/geo+json" with curl to receive plain JSON. Browsers and most HTTP libraries (httpx, requests, fetch) handle this automatically.
Recent significant earthquakes with a published map
Find earthquakes published since a given date, magnitude ≥ 5.0, with the dashboard already generated:
$ curl -sS -H "Accept: application/geo+json" \
"https://api.adam.geospatial.wfp.org/api/collections/adam.adam_eq_events/items?limit=2&filter=published_at%20%3E%20TIMESTAMP('2026-04-06T00:00:00')%20AND%20mag%20%3E=%205.0%20AND%20map_created%20=%20TRUE" \
| jq '{numberMatched, sample: (.features[0].properties | {published_at, mag, place, iso3, map_created})}'
{
"numberMatched": 24,
"sample": {
"published_at": "2026-04-06T07:22:42",
"mag": 5.2,
"place": "2km N of Tabuelan",
"iso3": "PHL",
"map_created": true
}
}
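The same query from Python; the one curl-specific chore that disappears is the percent-encoding — write the CQL2 expression plainly and let the HTTP layer encode it (sketched here with the standard library; httpx and requests do the same when you pass params=):

```python
from urllib.parse import urlencode, quote

# The same filter as the curl example, written without manual %20 escapes.
flt = (
    "published_at > TIMESTAMP('2026-04-06T00:00:00') "
    "AND mag >= 5.0 AND map_created = TRUE"
)
qs = urlencode({"limit": 2, "filter": flt}, quote_via=quote)
print(qs)  # spaces become %20, exactly as in the hand-encoded curl URL above
```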
Recently published tropical storms
$ curl -sS -H "Accept: application/geo+json" \
"https://api.adam.geospatial.wfp.org/api/collections/adam.adam_ts_events/items?limit=2&filter=published_at%20%3E%20TIMESTAMP('2026-04-15T00:00:00')" \
| jq '{numberMatched, sample: (.features[0].properties | {name, published_at, current_storm_status, alert_level, wind_speed, iso3, cleared})}'
{
"numberMatched": 20,
"sample": {
"name": "SINLAKU-26",
"published_at": "2026-04-15T02:21:45",
"current_storm_status": "Category 3",
"alert_level": "Red",
"wind_speed": 203.7024,
"iso3": "MNP",
"cleared": false
}
}
Delta polling — what changed since my last sync
The Live Map refreshes its track and buffer overlays by asking "what was updated after my last poll". The updated_at field is the right pivot — and the value to pass is the timestamp of your previous successful call.
$ curl -sS -H "Accept: application/geo+json" \
"https://api.adam.geospatial.wfp.org/api/collections/adam.adam_ts_buffers/items?limit=100&filter=updated_at%20%3E%20TIMESTAMP('2026-04-15T00:00:00')" \
| jq '{numberMatched, fields: (.features[0].properties | keys)}'
{
"numberMatched": 4,
"fields": [
"alert_level", "alert_level_label", "created_at",
"episode_id", "event_id", "id", "name",
"source", "uid", "updated_at"
]
}
The same pattern works on adam.adam_ts_tracks, adam.adam_ts_events, adam.adam_eq_events — any collection that has an updated_at (or comparable) date-time field.
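A sketch of the polling bookkeeping (the helper name is our own; only the filter shape and collection ids come from the API):

```python
from urllib.parse import urlencode, quote

BASE = "https://api.adam.geospatial.wfp.org/api"

def delta_url(collection: str, last_sync: str, limit: int = 100) -> str:
    """Build the items URL for 'everything updated after last_sync' on any
    collection with an updated_at field. last_sync must be
    'YYYY-MM-DDTHH:MM:SS' — no timezone suffix, per the TIMESTAMP() rule."""
    qs = urlencode(
        {"limit": limit, "filter": f"updated_at > TIMESTAMP('{last_sync}')"},
        quote_via=quote,
    )
    return f"{BASE}/collections/{collection}/items?{qs}"

print(delta_url("adam.adam_ts_buffers", "2026-04-15T00:00:00"))
```

After a successful fetch, persist the time of that call and pass it as last_sync on the next one.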
Flood alerts in a window
For flood alerts, the date pivot is date_proc (processing date) and the open/closed flag is cleared, a string with values 'yes' or 'no'. This example selects already-cleared alerts in the window; swap in cleared = 'no' for active ones:
$ curl -sS -H "Accept: application/geo+json" \
"https://api.adam.geospatial.wfp.org/api/collections/adam.adam_fl_alerts/items?limit=2&filter=date_proc%20%3E%20'2026-04-16T00:00:00'%20AND%20cleared%20=%20'yes'" \
| jq '{numberMatched, fields: (.features[0].properties | keys)}'
{
"numberMatched": 3,
"fields": [
"adm0_name", "adm1_name", "alert", "alertid",
"cleared", "date_proc", "dateofpeak", "index",
"iso3", "prod_url", "rb", "riverbasin", "trend", "warning"
]
}
Note: on flood collections the date fields are plain strings (no TIMESTAMP() wrapper needed) — direct ISO-8601 string comparison works.
Storm by name, excluding history points
To follow a specific storm's forecast nodes — but skipping the historical "previous position" markers used for visual context:
$ curl -sS -H "Accept: application/geo+json" \
"https://api.adam.geospatial.wfp.org/api/collections/adam.adam_ts_nodes/items?limit=2&filter=name%20IN%20('SINLAKU-26')%20AND%20node_time%20%3C%3E%20'previous%20position'" \
| jq '{numberMatched, sample: (.features[0].properties | {name, node_time, wind_speed, storm_status})}'
{
"numberMatched": 3,
"sample": {
"name": "SINLAKU-26",
"node_time": "19/04 18:00 UTC",
"wind_speed": 92.592,
"storm_status": "Tropical Storm"
}
}
node_time is a free-form string with one well-known sentinel value, 'previous position', which marks past observed positions distinct from forecast steps. The <> (not equal) operator is the cleanest way to exclude it.
Universal date patterns
These work on every collection that declares a proper temporal extent — i.e. the earthquake and tropical storm event/track/buffer/node collections. Substitute {collection} with the id you care about — see Collections.
Flood events are different
adam.adam_fl_events does not declare a typed temporal field, so the datetime= query parameter currently returns HTTP 500 on it. Filter the date fields directly with string comparison instead (e.g. filter=effective > '2026-04-10T00:00:00'). The flood alerts collection (adam.adam_fl_alerts) follows the same string-comparison pattern using date_proc.
Last N hours / days
datetime= accepts an interval. Compute "now − N" client-side:
from datetime import datetime, timedelta, timezone
import httpx
BASE = "https://api.adam.geospatial.wfp.org/api"
since = (datetime.now(timezone.utc) - timedelta(days=1)).strftime("%Y-%m-%dT%H:%M:%SZ")
url = f"{BASE}/collections/adam.adam_eq_events/items"
resp = httpx.get(url, params={"datetime": f"{since}/..", "limit": 100})
Year-to-date
?datetime=2026-01-01T00:00:00Z/..
A closed period (e.g. last calendar week)
from datetime import datetime, timedelta, timezone
today = datetime.now(timezone.utc).date()
last_monday = today - timedelta(days=today.weekday() + 7)
last_sunday = last_monday + timedelta(days=6)
datetime_param = f"{last_monday}T00:00:00Z/{last_sunday}T23:59:59Z"
"Events updated since my last sync" (delta polling)
For monitoring apps, store the timestamp of the last successful poll and use it as the lower bound on the next call. On collections that expose an updated_at field, prefer the filter=updated_at > TIMESTAMP('...') pattern shown earlier. Where only the primary temporal extent is available, datetime= is the closest standard proxy for "freshness"; the practical workaround is to over-fetch a small window and de-duplicate by feature id client-side.
import httpx

BASE = "https://api.adam.geospatial.wfp.org/api"
last_seen = "2026-04-19T10:00:00Z"  # persist this between runs
seen_ids = {...}                    # persist these too
new = httpx.get(
    f"{BASE}/collections/adam.adam_eq_events/items",
    params={"datetime": f"{last_seen}/..", "limit": 500},
).json()
fresh = [f for f in new["features"] if f["id"] not in seen_ids]
Anniversary — same week, last year
from datetime import datetime, timedelta, timezone
today = datetime.now(timezone.utc).date()
anchor = today.replace(year=today.year - 1)
window = f"{anchor}T00:00:00Z/{anchor + timedelta(days=7)}T00:00:00Z"
Earthquakes — adam.adam_eq_events
Real queryables we use here: mag, mmi, depth, iso3, place, on_land, alert_sent, published.
M5+ earthquakes globally in the last 24 hours
GET /collections/adam.adam_eq_events/items
?datetime={now-24h}/..
&filter=mag >= 5
&sortby=-mag
&limit=50
All M6+ events of 2026, strongest first
?datetime=2026-01-01T00:00:00Z/2026-12-31T23:59:59Z
&filter=mag >= 6
&sortby=-mag
On-land events only, in a given country
on_land = true excludes purely offshore events; iso3 filters by country (ISO3 code).
?filter=on_land = true AND iso3 = 'TUR'
&datetime=2026-01-01T00:00:00Z/..
&sortby=-mag
Events for which an alert was actually dispatched
Combine the operational flag with a time window. Useful for reporting on dispatched products.
?filter=alert_sent = true AND published = true
&datetime=2026-01-01T00:00:00Z/..
&sortby=-mag
High-intensity OR high-magnitude shake
CQL2 OR lets you express "either of two severity criteria":
?filter=mag >= 7 OR mmi >= 8
&datetime=2026-01-01T00:00:00Z/..
Floods — adam.adam_fl_events
Real queryables: iso3, country, fl_popn, fl_croplnd, flood_area, cleared, effective, date_proc, source.
Two important quirks:
- The date fields (effective, cleared, date_proc) are typed as plain strings (no format: date-time). The datetime= query parameter does not work on this collection — but direct ISO-8601 string comparison in filter= does (the comparison is lexicographic, which agrees with ISO-8601 ordering).
- cleared is a string with values 'yes' or 'no', not a boolean. Filter as cleared = 'yes' / cleared = 'no'.
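The lexicographic-equals-chronological claim is easy to convince yourself of: fixed-width ISO-8601 strings sort character by character in exactly the order of the instants they denote.

```python
# Fixed-width ISO-8601 timestamps compare character by character in the same
# order as the instants they represent — which is why plain string comparison
# works on the flood collections' date fields.
a, b = "2026-03-31T23:59:59", "2026-04-16T08:00:00"
assert a < b                     # string order = chronological order
assert sorted([b, a]) == [a, b]  # and sorting agrees
print("ISO-8601 string order matches chronological order")
```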
Currently open flood events
?filter=cleared = 'no'
&sortby=-fl_popn
Floods that became effective after a given date
?filter=effective > '2026-04-01T00:00:00'
&sortby=-fl_popn
Top floods by population affected
?filter=fl_popn > 0
&sortby=-fl_popn
&limit=10
Floods for a country, ranked by impact
?filter=iso3 = 'BGD'
&sortby=-fl_popn
Large-extent floods
?filter=flood_area > 1000
&sortby=-flood_area
Tropical Storms — adam.adam_ts_events
Real queryables we use here: name, iso3, wind_speed, max_storm_surge, alert_level, current_storm_status, cleared, published.
Currently active named storms
cleared = false means the storm is still being tracked.
?filter=cleared = false AND published = true
&datetime=2026-01-01T00:00:00Z/..
&sortby=-wind_speed
Storms with red or orange alert level
?filter=alert_level IN ('Red', 'Orange')
&datetime=2026-01-01T00:00:00Z/..
&sortby=-wind_speed
Find a storm by name (case-insensitive)
ILIKE is not supported on this API. Use the OGC CASEI() function to fold case before equality:
?filter=CASEI(name) = 'ian'
For prefix matching, fall back to case-sensitive LIKE:
?filter=name LIKE 'IAN%'
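The two name-lookup shapes fold naturally into one helper (ours, purely illustrative; it assumes the stored storm names are upper-case, as in the examples on this page):

```python
def storm_name_filter(name: str, prefix: bool = False) -> str:
    """CQL2 filter for storm-name lookup on this API: ILIKE is unsupported,
    so exact matches use CASEI() (case-insensitive) and prefix matches fall
    back to case-sensitive LIKE against the upper-cased stored names."""
    if prefix:
        return f"name LIKE '{name.upper()}%'"
    return f"CASEI(name) = '{name.lower()}'"

print(storm_name_filter("Ian"))               # → CASEI(name) = 'ian'
print(storm_name_filter("Ian", prefix=True))  # → name LIKE 'IAN%'
```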
Storms with significant wind affecting a country
?filter=iso3 = 'PHL' AND wind_speed >= 100
&datetime=2026-01-01T00:00:00Z/..
&sortby=-wind_speed
Storms cleared in a given period (post-event review)
datetime= filters on the collection's primary temporal extent, so cleared storms within a window are reachable by combining the time window with the boolean flag:
?datetime=2026-01-01T00:00:00Z/2026-04-30T23:59:59Z
&filter=cleared = true
&sortby=-wind_speed
Composite recipes
The interesting questions usually combine bbox + datetime + filter + sortby. A few capstone examples to copy and adapt.
M5+ earthquakes in the Eastern Mediterranean, year-to-date
?bbox=20,30,40,42
&datetime=2026-01-01T00:00:00Z/..
&filter=mag >= 5
&sortby=-mag
Active cyclones near the Philippines
?bbox=115,5,128,21
&datetime=2026-01-01T00:00:00Z/..
&filter=cleared = false AND wind_speed >= 80
&sortby=-wind_speed
Floods affecting more than 100k people in a country
?filter=iso3 = 'BGD' AND fl_popn >= 100000
&sortby=-fl_popn
Putting a recipe into a script
Here is the canonical pattern. Drop in any URL from above, swap the params:
import httpx
from datetime import datetime, timedelta, timezone

BASE = "https://api.adam.geospatial.wfp.org/api"
COLLECTION = "adam.adam_eq_events"

since = (datetime.now(timezone.utc) - timedelta(hours=24)).strftime("%Y-%m-%dT%H:%M:%SZ")
resp = httpx.get(
    f"{BASE}/collections/{COLLECTION}/items",
    params={
        "datetime": f"{since}/..",
        "filter": "mag >= 5",
        "sortby": "-mag",
        "limit": 100,
    },
)
resp.raise_for_status()
fc = resp.json()

print(f"{fc['numberMatched']} matches; first {fc['numberReturned']} shown")
for f in fc["features"]:
    p = f["properties"]
    print(f"  M{p.get('mag', '?')}  {p.get('place', '?')}  {p.get('published_at', '?')}")
Troubleshooting recurring date queries
- HTTP 500 with TIMESTAMP('...Z') — drop the trailing Z. Stored timestamps have no timezone suffix; the literal must match. Use TIMESTAMP('2026-04-06T00:00:00').
- HTTP 500 when using datetime= on adam.adam_fl_events — flood events have string-typed date fields and no declared temporal extent. Use filter=effective > '2026-04-01T00:00:00' (string comparison) instead.
- Response body looks like base64 gibberish — the API gateway returns base64 when the client sends Accept: */* (curl's default). Send -H "Accept: application/geo+json" to get plain JSON. Browsers and httpx/requests handle this for you.
- cleared IS NULL returns 0 rows on flood collections — cleared is a string with values 'yes'/'no', not a boolean. Use cleared = 'no' for active events.
- Empty results when you expect data — check the collection's temporal extent at GET /collections/{id} to confirm your window overlaps. Open the same query with ?f=html for visual inspection.
- ILIKE returns 500 — use CASEI(prop) = 'value' for case-insensitive equality, or LIKE 'PATTERN%' for case-sensitive prefix matching.
- BETWEEN returns 500 — use the equivalent prop >= a AND prop <= b.
- Wrong property name — every collection has its own /queryables. Always confirm names against GET /collections/{id}/queryables.
Next steps
- Query Features — the foundational reference for bbox, datetime, filter, sortby, pagination
- Display on a Map — render a query result visually
- Fetch Event Outputs — drill from a single event into its derived products
Live references
- Items endpoint on Swagger → items_collections__collectionId__items_get
- Queryables on Swagger → queryables_collections__collectionId__queryables_get
- Browse earthquake queryables live → /api/collections/adam.adam_eq_events/queryables