Merged
26 changes: 26 additions & 0 deletions IOS_SHORTCUTS.md
@@ -43,3 +43,29 @@ Same as Photo workflow but use endpoint: `http://<SERVER_IP>:3550/api/analyze_co
- **Connection fails** — Ensure your phone is on the same network. Test `http://<SERVER_IP>:3550/docs` in Safari.
- **Invalid response** — Check field names are exactly `file` and/or `user_prefs` (case-sensitive).
- **Photo won't upload** — Ensure the form field key is `file` and value comes from the Take Photo action.

## Import Profile from Share Sheet

Share a profile URL (`.json` or `.met`) from any app to import it directly into MeticAI.

### Setup

| # | Action | Configuration |
|---|--------|---------------|
| 1 | **Receive** | URLs from Share Sheet |
| 2 | **URL Encode** | Encode Shortcut Input |
| 3 | **URL** | `http://<SERVER_IP>:3550/?import=` appended with URL Encoded Text |
| 4 | **Open URLs** | Open the URL from step 3 |

### How It Works

MeticAI supports a `?import=<url>` query parameter. When the app loads with this parameter, it automatically opens the import dialog and begins importing the profile from the given URL.
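For reference, the deep link the Shortcut assembles in steps 2-3 can be sketched in Python. The `build_import_link` helper and the example IP are illustrative only, not part of MeticAI:

```python
from urllib.parse import quote

def build_import_link(server_ip: str, profile_url: str) -> str:
    # Percent-encode the entire profile URL so it survives as a single
    # query-parameter value, mirroring the Shortcut's "URL Encode" action.
    return f"http://{server_ip}:3550/?import={quote(profile_url, safe='')}"

print(build_import_link("192.168.1.10", "https://example.com/espresso.met"))
# http://192.168.1.10:3550/?import=https%3A%2F%2Fexample.com%2Fespresso.met
```

Encoding with `safe=''` ensures the `://` in the shared URL does not break the outer query string.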

### Alternative: Direct API Import

| # | Action | Configuration |
|---|--------|---------------|
| 1 | **Receive** | URLs from Share Sheet |
| 2 | **Get Contents of URL** | URL: `http://<SERVER_IP>:3550/api/import-from-url`, Method: POST, Body: JSON, `url` = Shortcut Input |
| 3 | **Get Dictionary Value** | Key: `profile_name` |
| 4 | **Show Notification** | Text: "Imported: " + Dictionary Value |
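Steps 3-4 of the table can be sketched in Python. `notification_text` is a hypothetical helper for illustration; in practice the Shortcut actions do this work:

```python
import json

def notification_text(response_body: bytes) -> str:
    # Parse the endpoint's JSON response and pull out `profile_name`,
    # then build the notification string shown in step 4.
    data = json.loads(response_body)
    return "Imported: " + data["profile_name"]

print(notification_text(b'{"status": "success", "profile_name": "URL Espresso"}'))
# Imported: URL Espresso
```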
103 changes: 103 additions & 0 deletions apps/server/api/routes/profiles.py
@@ -2265,6 +2265,109 @@ async def import_profile(request: Request):
)



def _validate_url_for_ssrf(url: str) -> None:
    """Reject URLs that could hit internal/private network resources."""
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https"):
        raise ValueError(f"Unsupported scheme: {parsed.scheme}")
    hostname = parsed.hostname
    if not hostname:
        raise ValueError("No hostname in URL")
    blocked = {"localhost", "127.0.0.1", "0.0.0.0", "::1", "metadata.google.internal"}
    if hostname.lower() in blocked:
        raise ValueError(f"Blocked hostname: {hostname}")
    try:
        ip = ipaddress.ip_address(socket.gethostbyname(hostname))
        if ip.is_private or ip.is_loopback or ip.is_link_local or ip.is_reserved:
            raise ValueError(f"URL resolves to private/reserved IP: {ip}")
    except socket.gaierror:
        pass  # Let httpx handle DNS errors


@router.post("/api/import-from-url")
async def import_from_url(request: Request):
"""Import a profile from a URL (JSON or .met format)."""
Comment on lines +2288 to +2290

Copilot AI (Mar 25, 2026): There are extensive tests for /api/profile/import (see TestProfileImportEndpoint), but no tests for the new /api/import-from-url route. Add unit tests covering: invalid URL/scheme, remote non-200, non-JSON body, missing name, dedupe/exists behavior, and successful import (mocking httpx + history save).

Owner Author: Resolved in 0ef3ef4. Added TestImportFromUrl class with 8 tests: missing URL, invalid scheme, SSRF blocked, remote non-200, non-JSON body, missing name field, dedupe/exists, and full success path. All 829 backend tests pass.

    request_id = request.state.request_id
    try:
        body = await request.json()
        url = body.get("url", "").strip()
        generate_description = body.get("generate_description", True)
        if not url:
            raise HTTPException(status_code=400, detail="No URL provided")
        parsed = urlparse(url)
        if parsed.scheme not in ("http", "https"):
            raise HTTPException(status_code=400, detail="Only http and https URLs are supported")
        try:
            _validate_url_for_ssrf(url)
        except ValueError as exc:
            raise HTTPException(status_code=400, detail=f"Blocked URL: {exc}")
        logger.info("Importing profile from URL: %s", url, extra={"request_id": request_id})
Comment on lines +2294 to +2305

Copilot AI (Mar 25, 2026): This endpoint fetches an arbitrary user-provided URL with no SSRF protections. As written it can be used to probe internal services (localhost, 127.0.0.0/8, RFC1918, link-local, cloud metadata IPs, etc.) via the server. Add host/IP allow/deny validation (including DNS resolution and post-redirect validation) before performing the request, and reject private/loopback/link-local ranges.

Owner Author: Resolved in 0ef3ef4. Added _validate_url_for_ssrf() that blocks: non-http(s) schemes, localhost/127.0.0.1/::1/0.0.0.0/metadata.google.internal hostnames, and any URL that resolves to a private, loopback, link-local, or reserved IP address. Called before the fetch, returns 400 on violation.

        max_size = 5 * 1024 * 1024
        try:
            async with httpx.AsyncClient(timeout=10.0, follow_redirects=True) as client:
                async with client.stream("GET", url) as resp:
                    resp.raise_for_status()
                    chunks: list[bytes] = []
                    total = 0
                    async for chunk in resp.aiter_bytes(8192):
                        total += len(chunk)
                        if total > max_size:
                            raise HTTPException(status_code=413, detail="Response too large (max 5 MB)")
                        chunks.append(chunk)
                    content = b"".join(chunks)
        except httpx.TimeoutException:
            raise HTTPException(status_code=408, detail="Request timed out fetching URL")
        except httpx.HTTPStatusError as exc:
            raise HTTPException(status_code=502, detail=f"Remote server returned {exc.response.status_code}")
        except httpx.RequestError as exc:
            raise HTTPException(status_code=502, detail=f"Failed to fetch URL: {exc}")
        try:
            profile_json = json.loads(content)
        except Exception:
            raise HTTPException(status_code=400, detail="URL did not return valid JSON")
        if not isinstance(profile_json, dict):
            raise HTTPException(status_code=400, detail="URL did not return a valid profile object")
        if not profile_json.get("name"):
            raise HTTPException(status_code=400, detail="Profile is missing a 'name' field")
        profile_name = profile_json["name"]
        reply = None
        if generate_description:
            try:
                reply = await _generate_profile_description(profile_json, request_id)
            except Exception as e:
                logger.warning("Failed to generate description for URL import: %s", e, extra={"request_id": request_id})
                from services.analysis_service import _build_static_profile_description
                reply = _build_static_profile_description(profile_json)
        else:
            from services.analysis_service import _build_static_profile_description
            reply = _build_static_profile_description(profile_json)
        entry_id = str(uuid.uuid4())
        created_at = datetime.now(timezone.utc).isoformat()
        new_entry = {
            "id": entry_id,
            "created_at": created_at,
            "profile_name": profile_name,
            "user_preferences": f"Imported from URL: {url}",
            "reply": reply,
            "profile_json": deep_convert_to_dict(profile_json),
            "imported": True,
            "import_source": "url",
        }
        with _history_lock:
            history = load_history()
            entries = history if isinstance(history, list) else history.get("entries", [])
            for entry in entries:
                if entry.get("profile_name") == profile_name:
                    return {
                        "status": "exists",
                        "message": f"Profile '{profile_name}' already exists",
                        "entry_id": entry.get("id"),
                        "profile_name": profile_name,
                    }
            entries.insert(0, new_entry)
            save_history(entries)
        machine_profile_id = None
        try:
            result = await async_create_profile(profile_json)
            machine_profile_id = result.get("id") if isinstance(result, dict) else None
            logger.info("URL-imported profile uploaded to machine: %s", profile_name, extra={"request_id": request_id, "machine_profile_id": machine_profile_id})
        except Exception as exc:
            logger.warning("Profile saved to history but failed to upload to machine: %s", exc, extra={"request_id": request_id, "error_type": type(exc).__name__})
        logger.info("Profile imported from URL successfully: %s", profile_name, extra={"request_id": request_id, "entry_id": entry_id, "source_url": url})
        return {
            "status": "success",
            "entry_id": entry_id,
            "profile_name": profile_name,
            "has_description": reply is not None and "Description generation failed" not in reply,
            "uploaded_to_machine": machine_profile_id is not None,
        }
    except HTTPException:
        raise
    except Exception as e:
        logger.error("Failed to import profile from URL: %s", str(e), exc_info=True, extra={"request_id": request_id, "error_type": type(e).__name__})
        raise HTTPException(status_code=500, detail={"status": "error", "error": str(e)})

@router.post("/api/profile/import-all")
async def import_all_profiles(request: Request):
"""Import all profiles from the Meticulous machine that aren't already in history.
140 changes: 140 additions & 0 deletions apps/server/test_main.py
@@ -22,6 +22,7 @@
import time
import asyncio
import requests
import httpx


# Import the app and services
@@ -13874,3 +13875,142 @@ def test_prompt_empty_iterations(self):
        prompt = build_dialin_recommendation_prompt(roast_level="dark", iterations=[])
        assert "dark" in prompt
        assert "Iteration" not in prompt


class TestImportFromUrl:
    """Tests for the /api/import-from-url endpoint."""

    VALID_PROFILE = {"name": "URL Espresso", "temperature": 93.0, "stages": [{"name": "extraction"}]}

    @staticmethod
    def _mock_httpx_stream(response_bytes=b'{}', raise_for_status_error=None):
        """Build a mock httpx.AsyncClient that streams response_bytes."""
        class FakeStream:
            def __init__(self):
                self.status_code = 200

            def raise_for_status(self):
                if raise_for_status_error:
                    raise raise_for_status_error

            async def aiter_bytes(self, chunk_size=8192):
                yield response_bytes

            async def __aenter__(self):
                return self

            async def __aexit__(self, *args):
                return False

        class FakeClient:
            def __init__(self, **kwargs):
                pass

            def stream(self, method, url):
                return FakeStream()

            async def __aenter__(self):
                return self

            async def __aexit__(self, *args):
                return False

        return FakeClient

    def test_missing_url(self, client):
        """Returns 400 when URL is missing or empty."""
        resp = client.post("/api/import-from-url", json={"url": ""})
        assert resp.status_code == 400
        assert "No URL" in resp.json()["detail"]

    def test_invalid_scheme(self, client):
        """Returns 400 for non-http(s) schemes."""
        resp = client.post("/api/import-from-url", json={"url": "ftp://example.com/profile.json"})
        assert resp.status_code == 400
        assert "http" in resp.json()["detail"].lower() or "scheme" in resp.json()["detail"].lower()

    @patch('api.routes.profiles._validate_url_for_ssrf')
    def test_ssrf_blocked(self, mock_validate, client):
        """Returns 400 when SSRF validation blocks the URL."""
        mock_validate.side_effect = ValueError("URL resolves to private/reserved IP: 127.0.0.1")
        resp = client.post("/api/import-from-url", json={"url": "http://localhost:8080/profile.json"})
        assert resp.status_code == 400
        assert "Blocked URL" in resp.json()["detail"]

    @patch('api.routes.profiles._validate_url_for_ssrf')
    @patch('httpx.AsyncClient')
    def test_remote_non_200(self, mock_client_class, mock_ssrf, client):
        """Returns 502 when the remote server returns an error."""
        mock_ssrf.return_value = None
        fake_cls = TestImportFromUrl._mock_httpx_stream(
            raise_for_status_error=httpx.HTTPStatusError(
                "Not Found", request=MagicMock(), response=MagicMock(status_code=404)
            )
        )
        mock_client_class.return_value = fake_cls()
        resp = client.post("/api/import-from-url", json={"url": "https://example.com/missing.json"})
        assert resp.status_code == 502
        assert "404" in resp.json()["detail"]

    @patch('api.routes.profiles._validate_url_for_ssrf')
    @patch('httpx.AsyncClient')
    def test_non_json_body(self, mock_client_class, mock_ssrf, client):
        """Returns 400 when URL returns non-JSON content."""
        mock_ssrf.return_value = None
        fake_cls = TestImportFromUrl._mock_httpx_stream(response_bytes=b"<html>Not JSON</html>")
        mock_client_class.return_value = fake_cls()
        resp = client.post("/api/import-from-url", json={"url": "https://example.com/page.html"})
        assert resp.status_code == 400
        assert "valid JSON" in resp.json()["detail"]

    @patch('api.routes.profiles._validate_url_for_ssrf')
    @patch('httpx.AsyncClient')
    def test_missing_name_field(self, mock_client_class, mock_ssrf, client):
        """Returns 400 when profile JSON is missing 'name'."""
        mock_ssrf.return_value = None
        fake_cls = TestImportFromUrl._mock_httpx_stream(response_bytes=json.dumps({"temperature": 93.0}).encode())
        mock_client_class.return_value = fake_cls()
        resp = client.post("/api/import-from-url", json={"url": "https://example.com/profile.json"})
        assert resp.status_code == 400
        assert "name" in resp.json()["detail"].lower()

    @patch('api.routes.profiles.async_create_profile', new_callable=AsyncMock, return_value={"id": "m-1"})
    @patch('api.routes.profiles.save_history')
    @patch('api.routes.profiles.load_history', return_value=[
        {"id": "old-1", "profile_name": "URL Espresso", "reply": "desc"}
    ])
    @patch('api.routes.profiles._validate_url_for_ssrf')
    @patch('httpx.AsyncClient')
    def test_dedupe_exists(self, mock_client_class, mock_ssrf, mock_load, mock_save, mock_create, client):
        """Returns 'exists' when a profile with the same name is already in history."""
        mock_ssrf.return_value = None
        fake_cls = TestImportFromUrl._mock_httpx_stream(response_bytes=json.dumps(self.VALID_PROFILE).encode())
        mock_client_class.return_value = fake_cls()
        resp = client.post("/api/import-from-url", json={"url": "https://example.com/profile.json"})
        assert resp.status_code == 200
        data = resp.json()
        assert data["status"] == "exists"
        assert data["profile_name"] == "URL Espresso"
        mock_save.assert_not_called()

    @patch('api.routes.profiles.async_create_profile', new_callable=AsyncMock, return_value={"id": "m-2"})
    @patch('api.routes.profiles.save_history')
    @patch('api.routes.profiles.load_history', return_value=[])
    @patch('api.routes.profiles._generate_profile_description', new_callable=AsyncMock, return_value="Rich espresso")
    @patch('api.routes.profiles._validate_url_for_ssrf')
    @patch('httpx.AsyncClient')
    def test_success_path(self, mock_client_class, mock_ssrf, mock_gen_desc, mock_load, mock_save, mock_create, client):
        """Full success path: fetch, parse, save, upload."""
        mock_ssrf.return_value = None
        fake_cls = TestImportFromUrl._mock_httpx_stream(response_bytes=json.dumps(self.VALID_PROFILE).encode())
        mock_client_class.return_value = fake_cls()
        resp = client.post("/api/import-from-url", json={"url": "https://example.com/profile.json"})
        assert resp.status_code == 200
        data = resp.json()
        assert data["status"] == "success"
        assert data["profile_name"] == "URL Espresso"
        assert data["has_description"] is True
        assert data["uploaded_to_machine"] is True
        assert "entry_id" in data
        mock_save.assert_called_once()
        mock_create.assert_called_once()
12 changes: 11 additions & 1 deletion apps/web/public/locales/de/translation.json
@@ -652,7 +652,17 @@
"bulkImportFailed": "Massen-Import fehlgeschlagen",
"fetchDetailsFailed": "Fehler beim Abrufen der Profildetails",
"noResponseStream": "Kein Antwortstream verfügbar",
"bulkImportStartFailed": "Fehler beim Starten des Massenimports"
"bulkImportStartFailed": "Fehler beim Starten des Massenimports",
"fromUrl": "Von URL",
"jsonOrMet": ".json / .met",
"importFromUrl": "Von URL importieren",
"urlPlaceholder": "Profil-URL einfügen...",
"urlHint": "Unterstützt .json- und .met-Profil-URLs",
"back": "Zurück",
"importButton": "Importieren",
"fetchingUrl": "Profil von URL abrufen...",
"importUrlFailed": "Import von URL fehlgeschlagen",
"invalidUrl": "Bitte geben Sie eine gültige URL ein"
},
"imageCrop": {
"title": "Profilbild zuschneiden",
12 changes: 11 additions & 1 deletion apps/web/public/locales/en/translation.json
@@ -657,7 +657,17 @@
"bulkImportFailed": "Bulk import failed",
"bulkImportStartFailed": "Failed to start bulk import",
"noResponseStream": "No response stream available",
"fetchDetailsFailed": "Failed to fetch profile details"
"fetchDetailsFailed": "Failed to fetch profile details",
"fromUrl": "From URL",
"jsonOrMet": ".json / .met",
"importFromUrl": "Import from URL",
"urlPlaceholder": "Paste profile URL...",
"urlHint": "Supports .json and .met profile URLs",
"back": "Back",
"importButton": "Import",
"fetchingUrl": "Fetching profile from URL...",
"importUrlFailed": "Failed to import from URL",
"invalidUrl": "Please enter a valid URL"
},
"imageCrop": {
"title": "Crop Profile Image",
12 changes: 11 additions & 1 deletion apps/web/public/locales/es/translation.json
@@ -652,7 +652,17 @@
"bulkImportFailed": "Importación masiva fallida",
"fetchDetailsFailed": "Error al obtener detalles del perfil",
"noResponseStream": "No hay flujo de respuesta disponible",
"bulkImportStartFailed": "Error al iniciar la importación masiva"
"bulkImportStartFailed": "Error al iniciar la importación masiva",
"fromUrl": "Desde URL",
"jsonOrMet": ".json / .met",
"importFromUrl": "Importar desde URL",
"urlPlaceholder": "Pegar URL del perfil...",
"urlHint": "Admite URLs de perfiles .json y .met",
"back": "Atrás",
"importButton": "Importar",
"fetchingUrl": "Obteniendo perfil desde URL...",
"importUrlFailed": "Error al importar desde URL",
"invalidUrl": "Introduce una URL válida"
},
"imageCrop": {
"title": "Recortar imagen de perfil",
12 changes: 11 additions & 1 deletion apps/web/public/locales/fr/translation.json
@@ -652,7 +652,17 @@
"bulkImportFailed": "Échec de l'importation en masse",
"fetchDetailsFailed": "Échec de la récupération des détails du profil",
"noResponseStream": "Aucun flux de réponse disponible",
"bulkImportStartFailed": "Échec du démarrage de l'importation en masse"
"bulkImportStartFailed": "Échec du démarrage de l'importation en masse",
"fromUrl": "Depuis URL",
"jsonOrMet": ".json / .met",
"importFromUrl": "Importer depuis URL",
"urlPlaceholder": "Coller l'URL du profil...",
"urlHint": "Prend en charge les URL de profils .json et .met",
"back": "Retour",
"importButton": "Importer",
"fetchingUrl": "Récupération du profil depuis l'URL...",
"importUrlFailed": "Échec de l'importation depuis l'URL",
"invalidUrl": "Veuillez entrer une URL valide"
},
"imageCrop": {
"title": "Recadrer l'image du profil",
12 changes: 11 additions & 1 deletion apps/web/public/locales/it/translation.json
@@ -652,7 +652,17 @@
"bulkImportFailed": "Importazione di massa fallita",
"fetchDetailsFailed": "Impossibile recuperare i dettagli del profilo",
"noResponseStream": "Nessun flusso di risposta disponibile",
"bulkImportStartFailed": "Impossibile avviare l'importazione di massa"
"bulkImportStartFailed": "Impossibile avviare l'importazione di massa",
"fromUrl": "Da URL",
"jsonOrMet": ".json / .met",
"importFromUrl": "Importa da URL",
"urlPlaceholder": "Incolla URL del profilo...",
"urlHint": "Supporta URL di profili .json e .met",
"back": "Indietro",
"importButton": "Importa",
"fetchingUrl": "Recupero del profilo dall'URL...",
"importUrlFailed": "Importazione da URL non riuscita",
"invalidUrl": "Inserisci un URL valido"
},
"imageCrop": {
"title": "Ritaglia immagine profilo",