# OcotilloAPI: Geospatial Sample Data Management System

New Mexico Bureau of Geology and Mineral Resources
OcotilloAPI is a FastAPI-based backend service designed to manage geospatial sample location data across New Mexico. It supports research, field operations, and public data delivery for the Bureau of Geology and Mineral Resources.
## Features

- RESTful API for managing sample location data
- Native GeoJSON support via PostGIS
- Filtering by location, date, type, and more
- PostgreSQL + PostGIS database backend
- Optional authentication and role-based access
- Interactive API documentation via OpenAPI and ReDoc
## OGC API - Features

The API exposes OGC API - Features endpoints under `/ogc`.
```bash
# Landing page, conformance classes, and collection listing
curl http://localhost:8000/ogc
curl http://localhost:8000/ogc/conformance
curl http://localhost:8000/ogc/collections
curl http://localhost:8000/ogc/collections/locations
```

Query features from a collection:

```bash
curl "http://localhost:8000/ogc/collections/locations/items?limit=10&offset=0"
curl "http://localhost:8000/ogc/collections/wells/items?limit=5"
curl "http://localhost:8000/ogc/collections/springs/items?limit=5"

# Fetch a single feature by ID
curl "http://localhost:8000/ogc/collections/locations/items/123"

# Spatial filter (bounding box)
curl "http://localhost:8000/ogc/collections/locations/items?bbox=-107.9,33.8,-107.8,33.9"
```
curl "http://localhost:8000/ogc/collections/wells/items?datetime=2020-01-01/2024-01-01"Use filter + filter-lang=cql2-text with WITHIN(...):
curl "http://localhost:8000/ogc/collections/locations/items?filter=WITHIN(geometry,POLYGON((-107.9 33.8,-107.8 33.8,-107.8 33.9,-107.9 33.9,-107.9 33.8)))&filter-lang=cql2-text"Basic property filters are supported with properties:
curl "http://localhost:8000/ogc/collections/wells/items?properties=thing_type='water well' AND well_depth>=100 AND well_depth<=200"
curl "http://localhost:8000/ogc/collections/wells/items?properties=well_purposes IN ('domestic','irrigation')"
curl "http://localhost:8000/ogc/collections/wells/items?properties=well_casing_materials='PVC'"
curl "http://localhost:8000/ogc/collections/wells/items?properties=well_screen_type='Steel'"- Python 3.11+
- `uv` package manager
- Docker Desktop 4+ (if hosting the server/database locally with containers)
- PostgreSQL with the PostGIS extension (if hosting the server/database locally without containers)
## Installation

```bash
git clone https://github.com/DataIntegrationGroup/OcotilloAPI.git
cd OcotilloAPI
```

Create a virtual environment and install dependencies:

**Mac/Linux**

```bash
uv venv
source .venv/bin/activate
uv sync --locked
```

**Windows**

```bash
uv venv
source .venv/Scripts/activate
uv sync --locked
```
Install the pre-commit hooks:

```bash
pre-commit install
```

## Configuration

```bash
# Edit `.env` to configure database connection and app settings
cp .env.example .env
```

Notes:

- Create the file `gcs_credentials.json` in the root directory of the project, and obtain its contents from a teammate.
- PostgreSQL uses the default port 5432.
In development, set `MODE=development` so that the lexicon enums are populated. When `MODE=development`, the app attempts to seed the database with 10 example records via `transfers/seed.py`; if a contact record already exists, the seed step is skipped.
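For reference, the seed guard described above behaves roughly like the sketch below. All names here are hypothetical and the callbacks stand in for real database calls; the actual logic lives in `transfers/seed.py`.

```python
# Illustrative sketch of the development-only seed step (not the real code).
import os
from typing import Any, Callable, Iterable


def seed_if_development(
    contact_exists: Callable[[], bool],      # e.g., query for any contact record
    example_records: Iterable[Any],          # the 10 example records
    save: Callable[[Any], None],             # e.g., session.add + commit
) -> bool:
    """Seed example data only in development, and only once."""
    if os.getenv("MODE") != "development":
        return False
    if contact_exists():
        # An existing contact record means the DB is already seeded; skip.
        return False
    for record in example_records:
        save(record)
    return True
```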
## Running the application

Choose one of the following:
### Option A: Local PostgreSQL + PostGIS
```bash
# run database migrations
alembic upgrade head

# start development server
uvicorn app.main:app --reload
```

Notes:

- Requires PostgreSQL with PostGIS installed locally.
- Use the `POSTGRES_*` settings in `.env` for your local instance.
### Option B: Docker Compose (dev)
```bash
# include the -d flag for a silent/detached build
docker compose up --build
```

Notes:

- Requires Docker Desktop.
- Spins up two containers: `db` (PostGIS/PostgreSQL) and `app` (FastAPI API service). `alembic upgrade head` runs on app startup after `docker compose up`.
- The database listens on port 5432 both inside the container and on your host. Ensure `POSTGRES_PORT=5432` in your `.env` to run local commands against the Docker DB (e.g., `uv run pytest`, `uv run python -m transfers.transfer`).
To get staging data into the database, run `python -m transfers.transfer` from the root directory of the project.
## Project structure

```
app/
├── .env                      # Environment variables
├── .pre-commit-config.yaml   # pre-commit hook configuration file
├── constants.py              # Static variables used throughout the code
├── docker-compose.yml        # Docker compose file to build database and start server
├── entrypoint.sh             # Used by Docker to run database migrations and start server
├── main.py                   # FastAPI entry point
│
├── alembic/                  # Alembic configuration and migration scripts
├── api/                      # Route declarations
├── core/                     # Settings, application config, and dependencies
├── db/                       # Database models, sessions, and engine
├── docker/                   # Custom Docker files
├── schemas/                  # Pydantic schemas and validations
├── services/                 # Reusable business logic, helpers, and database interactions
├── tests/                    # Code tests
└── transfers/                # Scripts to transfer data from NM_Aquifer to current db schema
```
## Revising models and schemas

- Revise models in the `db/` directory
- Revise schemas in the `schemas/` directory
  - Add validators for both fields and models as necessary
  - Validations on incoming data only are handled by Pydantic, which raises 422 errors (Pydantic's default)
  - Validations against values in the database are handled at the endpoint with custom checks, which raise 409 errors
- Revise tests
  - Revise fixtures in `tests/conftest.py`
  - Revise fields in POST test payloads and asserts
  - Revise fields in PATCH test payloads and asserts
  - Revise fields in GET all and GET by ID test asserts
  - Add tests for validations as necessary

Bonus:

- Update transfer scripts by revising fields and delineating where they come from in `NM_Aquifer`

Notes:

- All `Create` schema fields are defined as `<type>` if non-nullable and `<type> | None = None` if nullable
- All `Update` schema fields are optional and default to `None`
- All `Response` schema fields are defined as `<type>` if non-nullable and `<type> | None` if nullable
- All raised exceptions should use the `PydanticStyleException` defined in `services/exceptions_helper.py`
- Errors handled by the database should be enumerated and handled in a `database_error_handler` in each router's file
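To make the schema conventions concrete, here is a hypothetical `Well` schema trio; the field names are illustrative, not the project's actual models:

```python
# Hypothetical Create/Update/Response schemas following the conventions above.
from pydantic import BaseModel


class WellCreate(BaseModel):
    name: str                         # non-nullable: <type>
    well_depth: float | None = None   # nullable: <type> | None = None


class WellUpdate(BaseModel):
    # All Update fields are optional and default to None
    name: str | None = None
    well_depth: float | None = None


class WellResponse(BaseModel):
    id: int                           # non-nullable: <type>
    name: str
    well_depth: float | None          # nullable: <type> | None
```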
## CLI

The `oco` command exposes project automation and bulk data utilities.
```bash
# Display available commands
oco --help

# Bulk import water level data from a CSV
oco water-levels bulk-upload --file water_levels.csv --output json
```

The bulk upload command parses and validates each row, creates the corresponding field events/samples/observations, and prints a JSON summary (matching the API response shape) so the workflow can be automated or scripted.
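Because the summary is printed as JSON, the command is easy to drive from a script. A minimal sketch, assuming `oco` is on the `PATH` and that `--output json` writes a single JSON document to stdout:

```python
# Sketch: run the bulk upload from Python and inspect the JSON summary.
import json
import subprocess

result = subprocess.run(
    ["oco", "water-levels", "bulk-upload",
     "--file", "water_levels.csv", "--output", "json"],
    capture_output=True,
    text=True,
    check=True,  # raise CalledProcessError if the command exits non-zero
)
summary = json.loads(result.stdout)
print(summary)
```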
## Testing

```bash
# Run unit tests
pytest

# Run Behave BDD specs
behave tests/features
```

Tests require a local Postgres/PostGIS instance. Set the `POSTGRES_*` values in `.env`, run migrations, and ensure the database is reachable before running the suites.
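A minimal unit test might look like the sketch below. It assumes `app.main` exposes the FastAPI instance as `app` (consistent with the `uvicorn app.main:app` command above) and that the configured database is reachable:

```python
# Sketch of a minimal API test using FastAPI's TestClient.
from fastapi.testclient import TestClient

from app.main import app  # the FastAPI instance started by uvicorn

client = TestClient(app)


def test_ogc_landing_page():
    # The OGC API - Features landing page should respond successfully.
    response = client.get("/ogc")
    assert response.status_code == 200
```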
## Data transfers

Legacy or staging datasets can be imported using the transfer utilities:

```bash
python -m transfers.transfer
```

Configure the `.env` file with the appropriate credentials before running transfers.

To drop the existing schema and rebuild from migrations before transferring data, set:

```bash
export DROP_AND_REBUILD_DB=true
```