NMSampleLocations aka OcotilloAPI


Geospatial Sample Data Management System
New Mexico Bureau of Geology and Mineral Resources

OcotilloAPI is a FastAPI-based backend service designed to manage geospatial sample location data across New Mexico. It supports research, field operations, and public data delivery for the Bureau of Geology and Mineral Resources.


🚀 Features

  • 🌐 RESTful API for managing sample location data
  • 🗺️ Native GeoJSON support via PostGIS
  • 🔎 Filtering by location, date, type, and more
  • 📦 PostgreSQL + PostGIS database backend
  • 🔐 Optional authentication and role-based access
  • 🧾 Interactive API documentation via OpenAPI and ReDoc

πŸ—ΊοΈ OGC API - Features

The API exposes OGC API - Features endpoints under /ogc.

Landing & metadata

curl http://localhost:8000/ogc
curl http://localhost:8000/ogc/conformance
curl http://localhost:8000/ogc/collections
curl http://localhost:8000/ogc/collections/locations

Items (GeoJSON)

curl "http://localhost:8000/ogc/collections/locations/items?limit=10&offset=0"
curl "http://localhost:8000/ogc/collections/wells/items?limit=5"
curl "http://localhost:8000/ogc/collections/springs/items?limit=5"
curl "http://localhost:8000/ogc/collections/locations/items/123"

BBOX + datetime filters

curl "http://localhost:8000/ogc/collections/locations/items?bbox=-107.9,33.8,-107.8,33.9"
curl "http://localhost:8000/ogc/collections/wells/items?datetime=2020-01-01/2024-01-01"
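These query URLs can also be built programmatically. A minimal Python sketch using only the standard library (the base URL and parameter values are illustrative, not part of the API):

```python
from urllib.parse import urlencode


def items_url(base: str, collection: str, **params: str) -> str:
    """Build an OGC API - Features items URL with an encoded query string."""
    return f"{base}/ogc/collections/{collection}/items?{urlencode(params)}"


# First 10 locations inside a small bounding box
url = items_url(
    "http://localhost:8000",
    "locations",
    bbox="-107.9,33.8,-107.8,33.9",
    limit="10",
)
print(url)
```

Note that urlencode percent-encodes the commas in the bbox value; the server decodes them, so the request is equivalent to the curl examples above.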

Polygon filter (CQL2 text)

Use filter + filter-lang=cql2-text with WITHIN(...):

curl "http://localhost:8000/ogc/collections/locations/items?filter=WITHIN(geometry,POLYGON((-107.9 33.8,-107.8 33.8,-107.8 33.9,-107.9 33.9,-107.9 33.8)))&filter-lang=cql2-text"
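The WKT polygon inside the filter can be generated from a coordinate list with a small helper. This is an illustrative sketch; the helper is not part of the project:

```python
def within_filter(ring: list[tuple[float, float]]) -> str:
    """Build a CQL2-text WITHIN filter from a polygon ring of (lon, lat) pairs.

    WKT requires the ring to be closed, so the first coordinate is
    repeated at the end if it is missing.
    """
    if ring[0] != ring[-1]:
        ring = ring + [ring[0]]  # close the ring
    coords = ",".join(f"{lon} {lat}" for lon, lat in ring)
    return f"WITHIN(geometry,POLYGON(({coords})))"


ring = [(-107.9, 33.8), (-107.8, 33.8), (-107.8, 33.9), (-107.9, 33.9)]
print(within_filter(ring))
```

Remember to URL-encode the result (and pass filter-lang=cql2-text) when placing it in a query string.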

Property filter (CQL)

Basic property filters are supported with properties:

curl "http://localhost:8000/ogc/collections/wells/items?properties=thing_type='water well' AND well_depth>=100 AND well_depth<=200"
curl "http://localhost:8000/ogc/collections/wells/items?properties=well_purposes IN ('domestic','irrigation')"
curl "http://localhost:8000/ogc/collections/wells/items?properties=well_casing_materials='PVC'"
curl "http://localhost:8000/ogc/collections/wells/items?properties=well_screen_type='Steel'"
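Compound properties expressions like the first example can be assembled from individual conditions; a hypothetical helper (not part of the project) might look like:

```python
def properties_expr(*conditions: str) -> str:
    """Join individual property conditions into one `properties` expression."""
    return " AND ".join(conditions)


expr = properties_expr(
    "thing_type='water well'",
    "well_depth>=100",
    "well_depth<=200",
)
print(expr)
```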

πŸ› οΈ Getting Started

Prerequisites

  • Python 3.11+
  • uv package manager
  • Docker Desktop 4+ if you want to run the server and database locally in containers
  • PostgreSQL with the PostGIS extension if you want to run the server and database locally without containers

Installation & Setup

1. Clone the repository

git clone https://github.com/DataIntegrationGroup/OcotilloAPI.git
cd OcotilloAPI

2. Set up virtual environment and install dependencies

Mac/Linux:

uv venv
source .venv/bin/activate
uv sync --locked

Windows:

uv venv
source .venv/Scripts/activate
uv sync --locked

3. Set up pre-commit hooks

pre-commit install

4. Set up environment variables

cp .env.example .env
# then edit `.env` to configure database connection and app settings

Notes:

  • Create a gcs_credentials.json file in the project root; obtain its contents from a teammate.
  • PostgreSQL uses the default port 5432.

In development set MODE=development to allow lexicon enums to be populated. When MODE=development, the app attempts to seed the database with 10 example records via transfers/seed.py; if a contact record already exists, the seed step is skipped.
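As a rough illustration of that seed guard, the logic could be sketched as below. The class and method names here are hypothetical stand-ins; the real implementation lives in transfers/seed.py and may differ:

```python
# Hypothetical sketch of the MODE=development seed guard described above.
class FakeSession:
    """Stand-in for a database session, used only for illustration."""

    def __init__(self, has_contact: bool):
        self.has_contact = has_contact
        self.seeded = 0

    def contact_exists(self) -> bool:
        return self.has_contact

    def insert_examples(self, count: int) -> None:
        self.seeded = count


def seed_if_empty(session, count: int = 10) -> bool:
    """Seed example records unless a contact record already exists."""
    if session.contact_exists():
        return False  # a contact already exists: skip seeding
    session.insert_examples(count)
    return True


assert seed_if_empty(FakeSession(has_contact=False)) is True
assert seed_if_empty(FakeSession(has_contact=True)) is False
```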

5. Database and server

Choose one of the following:

Option A: Local PostgreSQL + PostGIS

# run database migrations
alembic upgrade head

# start development server
uvicorn app.main:app --reload

Notes:

  • Requires PostgreSQL with PostGIS installed locally.
  • Use the POSTGRES_* settings in .env for your local instance.

Option B: Docker Compose (dev)

# include -d flag for silent/detached build
docker compose up --build

Notes:

  • Requires Docker Desktop.
  • Spins up two containers: db (PostGIS/PostgreSQL) and app (FastAPI API service).
  • alembic upgrade head runs on app startup after docker compose up.
  • The database listens on port 5432 both inside the container and on your host. Ensure POSTGRES_PORT=5432 in your .env to run local commands against the Docker DB (e.g., uv run pytest, uv run python -m transfers.transfer).

Staging Data

To load staging data into the database, run python -m transfers.transfer from the project root.

🧭 Project Structure

app/
├── .env                    # Environment variables
├── .pre-commit-config.yaml # pre-commit hook configuration file
├── constants.py            # Static variables used throughout the code
├── docker-compose.yml      # Docker compose file to build database and start server
├── entrypoint.sh           # Used by Docker to run database migrations and start server
├── main.py                 # FastAPI entry point
│
├── alembic/                # Alembic configuration and migration scripts
├── api/                    # Route declarations
├── core/                   # Settings, application config, and dependencies
├── db/                     # Database models, sessions, and engine
├── docker/                 # Custom Docker files
├── schemas/                # Pydantic schemas and validations
├── services/               # Reusable business logic, helpers, and database interactions
├── tests/                  # Code tests
└── transfers/              # Scripts to transfer data from NM_Aquifer to current db schema

Model Changes

  1. Revise models in the db/ directory
  2. Revise schemas in the schemas/ directory
    1. Add validators for both fields and models as necessary
    2. Validations on incoming data only should be handled by Pydantic and 422 errors will be raised (default Pydantic)
    3. Validations against values in the database will be handled at the endpoint with custom checks and 409 errors will be raised
  3. Revise tests
    1. Revise fixtures in tests/conftest.py
    2. Revise fields in POST test payloads and asserts
    3. Revise fields in PATCH test payloads and asserts
    4. Revise fields in GET all and GET by ID test asserts
    5. Add tests for validations as necessary

Bonus:

  • Update transfer scripts by revising fields and delineating where they come from in NM_Aquifer

Notes:

  • All Create schema fields are defined as <type> if non-nullable and <type> | None = None if nullable
  • All Update schema fields are optional and default to None
  • All Response schema fields are defined as <type> if non-nullable and <type> | None if nullable
  • All raised exceptions should use the PydanticStyleException as defined in services/exceptions_helper.py
  • Errors handled by the database should be enumerated and handled in a database_error_handler in each router's file

📦 Ocotillo CLI

The oco command exposes project automation and bulk data utilities.

# Display available commands
oco --help

# Bulk import water level data from a CSV
oco water-levels bulk-upload --file water_levels.csv --output json

The bulk upload command parses and validates each row, creates the corresponding field events/samples/observations, and prints a JSON summary (matching the API response shape) so the workflow can be automated or scripted.

🧪 Testing

# Run unit tests
pytest

# Run Behave BDD specs
behave tests/features

Tests require a local Postgres/PostGIS instance. Set POSTGRES_* values in .env, run migrations, and ensure the database is reachable before running the suites.

🔄 Data Transfers

Legacy or staging datasets can be imported using the transfer utilities:

python -m transfers.transfer

Configure the .env file with the appropriate credentials before running transfers.

To drop the existing schema and rebuild from migrations before transferring data, set:

export DROP_AND_REBUILD_DB=true
