
Sites <> 1.7.x #9661

Closed
ItzNotABug wants to merge 3 commits into feat-sites from 1.7-merge

Conversation

@ItzNotABug
Contributor

@ItzNotABug ItzNotABug commented Apr 17, 2025

What does this PR do?

Sites <> 1.7.x
Needed for the updated Console SDK and CSV imports.

cc @Meldiron

Test Plan

N/A.

Related PRs and Issues

N/A.

Checklist

  • Have you read the Contributing Guidelines on issues?
  • If the PR includes a change to an API's metadata (desc, label, params, etc.), does it also include updated API specs and example docs?

Summary by CodeRabbit

  • New Features
    • Introduced the ability to import documents into the database from CSV files via a new API endpoint.
    • Added persistent storage for CSV imports, accessible by relevant services.
  • Documentation
    • Added documentation for the new CSV import endpoint.
  • Bug Fixes
    • Corrected a minor typo in service comments.
  • Tests
    • Added comprehensive end-to-end tests for CSV import migrations, covering both successful and failure scenarios.

@ItzNotABug ItzNotABug self-assigned this Apr 17, 2025
@coderabbitai
Contributor

coderabbitai bot commented Apr 17, 2025

Walkthrough

The changes introduce support for importing documents into the database from CSV files. This includes adding a new API endpoint for initiating CSV-based migrations, extending the migrations schema to track CSV-specific metadata, and updating the migration worker to handle CSV sources. New storage infrastructure is provisioned for imports, with corresponding Docker and application resource updates. Comprehensive end-to-end tests are added to verify successful and error scenarios for CSV migrations. Documentation for the new endpoint is also introduced.

Changes

File(s) Change Summary
Dockerfile, docker-compose.yml Added /storage/imports directory setup and permissions in Dockerfile; introduced new Docker volume appwrite-imports mounted at /storage/imports for relevant services in docker-compose.yml.
app/init/constants.php Added constant APP_STORAGE_IMPORTS for the imports storage path.
app/init/resources.php, app/worker.php Registered new resource deviceForImports for handling imports storage device, both in application and worker resource registries.
app/config/collections/projects.php Extended "migrations" collection schema with options, resourceId, and resourceType attributes; added index _key_resource_id on resourceId.
app/controllers/api/migrations.php Added new POST endpoint /v1/migrations/csv for importing documents from CSV files, including validation, file copying, migration document creation, and queueing.
src/Appwrite/Platform/Workers/Migrations.php Updated migration worker to support CSV sources, including new property for import device, constructor and method signature updates, and logic to instantiate a CSV source.
composer.json Updated utopia-php/migration dependency version from 0.8.* to 0.9.1.
docs/references/migrations/migration-csv.md Added documentation for the new CSV import migration endpoint.
tests/e2e/Services/Migrations/MigrationsBase.php Added comprehensive end-to-end tests for CSV migration, covering both successful imports and multiple error scenarios.

Sequence Diagram(s)

sequenceDiagram
    participant Client
    participant API
    participant Storage
    participant ImportsDevice
    participant DB
    participant MigrationWorker

    Client->>API: POST /v1/migrations/csv (bucketId, fileId, resourceId)
    API->>Storage: Validate bucket and file existence
    API->>ImportsDevice: Copy file to /storage/imports
    API->>DB: Create migration document (status: pending)
    API->>MigrationWorker: Enqueue migration
    API-->>Client: 202 Accepted + migration document

    MigrationWorker->>ImportsDevice: Read CSV file
    MigrationWorker->>DB: Import documents into collection
    MigrationWorker->>DB: Update migration status and counters
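
The client-to-API leg of the diagram above can be sketched as a minimal request builder. This is an illustrative sketch, not the official SDK: the header names follow Appwrite's usual conventions, and all IDs shown are hypothetical.

```python
import json

def build_csv_migration_request(endpoint, project_id, api_key,
                                bucket_id, file_id, resource_id):
    """Assemble the POST /v1/migrations/csv request shown in the diagram.

    `resource_id` is the composite "databaseId:collectionId" string the
    endpoint expects; the response is a 202 with the migration document.
    """
    return {
        "url": f"{endpoint}/migrations/csv",
        "headers": {
            "content-type": "application/json",
            "x-appwrite-project": project_id,
            "x-appwrite-key": api_key,
        },
        "body": json.dumps({
            "bucketId": bucket_id,
            "fileId": file_id,
            "resourceId": resource_id,
        }),
    }

req = build_csv_migration_request(
    "http://localhost/v1", "my-project", "secret-key",
    "invoices-bucket", "documents-csv", "mydb:invoices")
print(req["url"])
```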

Suggested reviewers

  • Meldiron

Poem

🐇
A CSV hops into the field,
Through imports, its secrets revealed.
New paths and constants, volumes set,
With tests to ensure no bug is met.
Migrations now can dance and sing—
Oh, what data this spring will bring!
🥕

Tip

⚡💬 Agentic Chat (Pro Plan, General Availability)
  • We're introducing multi-step agentic chat in review comments and issue comments, within and outside of PRs. This feature enhances review and issue discussions by enabling advanced interactions with the CodeRabbit agentic chat, including creating pull requests directly from comments and adding commits to existing pull requests.

🪧 Tips

Chat

There are 3 ways to chat with CodeRabbit:

  • Review comments: Directly reply to a review comment made by CodeRabbit. Example:
    • I pushed a fix in commit <commit_id>, please review it.
    • Generate unit testing code for this file.
    • Open a follow-up GitHub issue for this discussion.
  • Files and specific lines of code (under the "Files changed" tab): Tag @coderabbitai in a new review comment at the desired location with your query. Examples:
    • @coderabbitai generate unit testing code for this file.
    • @coderabbitai modularize this function.
  • PR comments: Tag @coderabbitai in a new PR comment to ask questions about the PR branch. For the best results, please provide a very specific query, as very limited context is provided in this mode. Examples:
    • @coderabbitai gather interesting stats about this repository and render them as a table. Additionally, render a pie chart showing the language distribution in the codebase.
    • @coderabbitai read src/utils.ts and generate unit testing code.
    • @coderabbitai read the files in the src/scheduler package and generate a class diagram using mermaid and a README in the markdown format.
    • @coderabbitai help me debug CodeRabbit configuration file.

Note: Be mindful of the bot's finite context window. It's strongly recommended to break down tasks such as reading entire modules into smaller chunks. For a focused discussion, use review comments to chat about specific files and their changes, instead of using the PR comments.

CodeRabbit Commands (Invoked using PR comments)

  • @coderabbitai pause to pause the reviews on a PR.
  • @coderabbitai resume to resume the paused reviews.
  • @coderabbitai review to trigger an incremental review. This is useful when automatic reviews are disabled for the repository.
  • @coderabbitai full review to do a full review from scratch and review all the files again.
  • @coderabbitai summary to regenerate the summary of the PR.
  • @coderabbitai generate docstrings to generate docstrings for this PR.
  • @coderabbitai resolve resolve all the CodeRabbit review comments.
  • @coderabbitai configuration to show the current CodeRabbit configuration for the repository.
  • @coderabbitai help to get help.

Other keywords and placeholders

  • Add @coderabbitai ignore anywhere in the PR description to prevent this PR from being reviewed.
  • Add @coderabbitai summary to generate the high-level summary at a specific location in the PR description.
  • Add @coderabbitai anywhere in the PR title to generate the title automatically.

Documentation and Community

  • Visit our Documentation for detailed information on how to use CodeRabbit.
  • Join our Discord Community to get help, request features, and share feedback.
  • Follow us on X/Twitter for updates and announcements.

@ItzNotABug ItzNotABug requested a review from Meldiron April 17, 2025 08:01
@github-actions

github-actions bot commented Apr 17, 2025

Security Scan Results for PR

Docker Image Scan Results

Package Version Vulnerability Severity
binutils 2.42-r0 CVE-2025-0840 HIGH
libexpat 2.6.4-r0 CVE-2024-8176 HIGH
libxml2 2.12.7-r0 CVE-2024-56171 HIGH
libxml2 2.12.7-r0 CVE-2025-24928 HIGH
libxml2 2.12.7-r0 CVE-2025-27113 HIGH
xz 5.6.2-r0 CVE-2025-31115 HIGH
xz-libs 5.6.2-r0 CVE-2025-31115 HIGH
golang.org/x/crypto v0.31.0 CVE-2025-22869 HIGH

Source Code Scan Results

🎉 No vulnerabilities found!

Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 3

🧹 Nitpick comments (3)
app/controllers/api/migrations.php (1)

370-378: statusCounters initial value deviates from existing endpoints

Other migration creators initialise statusCounters and resourceData with JSON‑encoded objects ('{}').
Here they are initialised with PHP arrays [].
While both serialise to JSON objects in most drivers, the inconsistency can cause unexpected type mismatches when documents are later cast to arrays.

- 'statusCounters' => [],
- 'resourceData' => [],
+ 'statusCounters' => '{}',
+ 'resourceData' => '{}',
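
The type mismatch the comment warns about comes down to how an empty container serializes. A quick check (Python standing in for PHP's json_encode) shows why `[]` and `'{}'` are not interchangeable defaults:

```python
import json

# An empty array serializes to "[]" (a JSON array), while an empty
# map serializes to "{}" (a JSON object). A consumer expecting an
# object for statusCounters would therefore see a different type
# depending on which default the endpoint used.
empty_array = json.dumps([])   # the PHP `[]` default
empty_object = json.dumps({})  # the `'{}'` default used elsewhere
print(empty_array, empty_object)
```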
tests/e2e/Services/Migrations/MigrationsBase.php (2)

1154-1160: Extraneous endpoint key in request body

/v1/migrations/csv does not define an endpoint parameter, so the value sent here is ignored.
Cleaning it up makes the test input clearer and guards against future validation tightening.

-    'endpoint' => 'http://localhost/v1',

1187-1202: 60‑second polling window may be brittle in CI

assertEventually waits up to 60 s (120× 500 ms) for the migration to finish.
On heavily loaded runners this can still be insufficient, causing sporadic test flakiness.

Consider making the timeout configurable via an environment variable, e.g.

$timeoutMs = getenv('E2E_TIMEOUT_MS') ?: 120000;
$this->assertEventually($callback, $timeoutMs, 500);
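
The env-configurable polling idea can be sketched as follows. This is a Python illustration of the pattern, not Appwrite's PHP helper; the `E2E_TIMEOUT_MS` variable name is the one proposed above.

```python
import os
import time

def assert_eventually(predicate, timeout_ms=None, interval_ms=500):
    """Poll `predicate` until it returns True or the timeout elapses.

    The ceiling comes from E2E_TIMEOUT_MS when set, falling back to a
    generous default so heavily loaded CI runners get more headroom.
    """
    timeout_ms = timeout_ms or int(os.getenv("E2E_TIMEOUT_MS", "120000"))
    deadline = time.monotonic() + timeout_ms / 1000
    while time.monotonic() < deadline:
        if predicate():
            return
        time.sleep(interval_ms / 1000)
    raise AssertionError(f"condition not met within {timeout_ms} ms")
```

A test would pass a closure that fetches the migration document and checks its status, exactly as the existing `assertEventually` callback does.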
📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 73d1e6c and 0b5369c.

⛔ Files ignored due to path filters (5)
  • composer.lock is excluded by !**/*.lock
  • tests/resources/csv/documents.csv is excluded by !**/*.csv
  • tests/resources/csv/irrelevant-column.csv is excluded by !**/*.csv
  • tests/resources/csv/missing-column.csv is excluded by !**/*.csv
  • tests/resources/csv/missing-row.csv is excluded by !**/*.csv
📒 Files selected for processing (11)
  • Dockerfile (1 hunks)
  • app/config/collections/projects.php (3 hunks)
  • app/controllers/api/migrations.php (3 hunks)
  • app/init/constants.php (1 hunks)
  • app/init/resources.php (1 hunks)
  • app/worker.php (1 hunks)
  • composer.json (1 hunks)
  • docker-compose.yml (4 hunks)
  • docs/references/migrations/migration-csv.md (1 hunks)
  • src/Appwrite/Platform/Workers/Migrations.php (5 hunks)
  • tests/e2e/Services/Migrations/MigrationsBase.php (2 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (2)
app/worker.php (1)
app/init/resources.php (1)
  • getDevice (527-623)
tests/e2e/Services/Migrations/MigrationsBase.php (4)
src/Appwrite/Migration/Migration.php (1)
  • Migration (18-427)
src/Appwrite/Platform/Appwrite.php (1)
  • Appwrite (12-22)
tests/e2e/Client.php (1)
  • Client (8-317)
src/Appwrite/Utopia/Request.php (1)
  • getHeaders (147-169)
⏰ Context from checks skipped due to timeout of 90000ms (13)
  • GitHub Check: E2E Service Test (Teams)
  • GitHub Check: E2E Service Test (Messaging)
  • GitHub Check: E2E Service Test (Realtime)
  • GitHub Check: E2E Service Test (Locale)
  • GitHub Check: E2E Service Test (Health)
  • GitHub Check: E2E Service Test (Projects)
  • GitHub Check: E2E Service Test (GraphQL)
  • GitHub Check: E2E Service Test (Databases)
  • GitHub Check: E2E Service Test (FunctionsSchedule)
  • GitHub Check: E2E General Test
  • GitHub Check: Benchmark
  • GitHub Check: Unit Test
  • GitHub Check: scan
🔇 Additional comments (21)
composer.json (1)

65-65: Verify compatibility and version format for utopia-php/migration
Pinning the dependency to "0.9.1" is valid, but it may prevent seamless patch upgrades. Consider whether a semantic version range (e.g., ^0.9.1) would be more appropriate per your release policy. Ensure the Composer lockfile is updated and run the migration-related test suite to catch any breaking changes introduced by version 0.9.1.

app/init/constants.php (1)

53-53: New import storage constant added correctly
The APP_STORAGE_IMPORTS constant follows the existing naming pattern for storage paths. Ensure any relevant configuration (e.g., environment variables, resource definitions) and infrastructure (volumes in Docker Compose) are updated to reference this path.

Dockerfile (2)

47-48: Ensure /storage/imports volume is mounted in Compose
You’ve added directory creation for /storage/imports. Verify that your docker-compose.yml also declares and mounts this path so CSV files persist across service restarts.


54-55: Set correct ownership and permissions for import storage
Applying chown and chmod to /storage/imports aligns with existing storage directories. You may optionally group these with other volume commands for readability, but functionally this is correct.

app/worker.php (1)

346-349: Register deviceForImports resource for CSV imports
This new deviceForImports resource mirrors the pattern used by other storage devices (e.g., deviceForSites). Confirm that your CSV migration controller and worker properly inject and utilize this resource when reading from and writing to the import storage path.

app/init/resources.php (1)

515-517: New resource added for handling CSV imports.

The deviceForImports resource is properly registered to provide storage device functionality for the new CSV import feature. This follows the established pattern used for other device resources.

docker-compose.yml (4)

75-75: New volume for CSV imports added to main appwrite service.

A dedicated persistent volume for imports is properly mounted to ensure data persistence across container restarts.


688-688: Same imports volume mounted to migrations worker.

The imports volume is correctly mounted to the migrations worker service, ensuring the worker has access to the same storage as the main service.


995-995: Fixed typo in comment.

Corrected "nessessary" to "necessary" in the comment.


1164-1164: Defined new volume for imports in volumes section.

The appwrite-imports volume is properly defined in the volumes section, ensuring it will be created as a persistent Docker volume.

app/config/collections/projects.php (3)

2279-2288: Added options attribute to migrations schema.

A properly structured JSON field is added to store configuration options for migrations, with appropriate filters and constraints.


2345-2365: Added resourceId and resourceType attributes to migrations schema.

These fields are correctly defined with appropriate size and constraints to track specific resources being migrated, which is essential for CSV import functionality.


2390-2395: Added index for resourceId to optimize queries.

A proper database index is added on the resourceId field with descending order, which will improve query performance when looking up migrations by resource.

src/Appwrite/Platform/Workers/Migrations.php (6)

21-22: Added imports for CSV source and Device classes.

Properly importing the required classes for the new CSV import functionality.

Also applies to: 28-29


37-38: Added protected property for imports device.

The $deviceForImports property is correctly added to store the device instance for handling CSV file storage.


64-65: Added deviceForImports to dependency injection list.

The imports device is properly injected into the constructor following Appwrite's dependency injection pattern.


71-75: Updated action method signature and stored device instance.

The action method now correctly:

  1. Accepts the Device instance as a parameter
  2. Stores it in the class property for later use

This follows the pattern used for other dependencies.


108-111: Added extraction of resourceId and options from migration document.

These attributes are properly extracted from the migration document to support the CSV import functionality.


139-145: Added CSV case to the source processor.

The CSV source is correctly instantiated with:

  1. The resource ID for identification
  2. The path from migration options
  3. The storage device for accessing files
  4. The project database for writing data

This integration follows the pattern established for other source types.

app/controllers/api/migrations.php (1)

348-350: Missing file‑type validation

The endpoint assumes that the uploaded object is a CSV if it is uncompressed and unencrypted.
Consider validating the MIME type (text/csv) and the file extension (*.csv) to fail fast when users accidentally pass a different file.

Would you like a short helper to perform this validation before the copy step?
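
A sketch of the fail-fast check the comment proposes (Python illustration; the helper name and the accepted MIME variants are assumptions, since browsers and SDKs report CSV uploads inconsistently):

```python
def looks_like_csv(filename, mime_type):
    """Reject uploads that are clearly not CSV before copying them.

    Checks both the file extension and the reported MIME type,
    accepting the common variants clients send for CSV files.
    """
    allowed_mimes = {"text/csv", "application/csv", "text/plain"}
    return filename.lower().endswith(".csv") and mime_type in allowed_mimes

print(looks_like_csv("documents.csv", "text/csv"))      # True
print(looks_like_csv("archive.zip", "application/zip")) # False
```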

tests/e2e/Services/Migrations/MigrationsBase.php (1)

921-972: Resource clean‑up missing

The test creates databases, collections, buckets and attributes but does not delete them on success/failure.
Accumulated artefacts can slow down subsequent tests and consume storage.

Do you want helper tear‑down code added to remove $databaseId, $collectionId, and the two buckets once the assertions finish?
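
One way to structure that tear-down is a deferred-cleanup stack that runs even when an assertion fails mid-test. The sketch below is a generic Python pattern, not Appwrite's test harness; the deferred callbacks would be the actual database, collection, and bucket delete calls.

```python
class Cleanup:
    """Collect tear-down callbacks as resources are created and run
    them in reverse order, even when an assertion fails mid-test."""

    def __init__(self):
        self._actions = []

    def defer(self, fn):
        self._actions.append(fn)

    def run(self):
        # Last-in, first-out: delete dependents before their parents.
        while self._actions:
            self._actions.pop()()

cleanup = Cleanup()
deleted = []
cleanup.defer(lambda: deleted.append("bucket"))      # created first
cleanup.defer(lambda: deleted.append("collection"))  # created second
try:
    pass  # assertions on the migration result would go here
finally:
    cleanup.run()
print(deleted)
```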

@github-actions
Copy link
Copy Markdown

github-actions bot commented Apr 17, 2025

✨ Benchmark results

  • Requests per second: 905
  • Requests with 200 status code: 162,924
  • P99 latency: 0.204874512 s

⚡ Benchmark Comparison

Metric           This PR      Latest version
RPS              905          1,212
200 responses    162,924      218,250
P99 latency (s)  0.204874512  0.151880528

Copy link
Copy Markdown
Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

🔭 Outside diff range comments (1)
app/config/specs/swagger2-latest-console.json (1)

40613-40735: 💡 Verification agent

🧩 Analysis chain

Verify consistency of newly added usage metrics schema fields
Several new properties—such as buildsSuccessTotal, buildsFailedTotal, buildsSuccess, buildsFailed, and the corresponding execution metrics—have been added across the usage schemas (usageFunctions, usageFunction, usageSites, usageSite). Ensure each schema’s required array includes these new fields and that naming is uniform across all definitions.

Also applies to: 40759-40942, 40949-41211, 41220-41470


🏁 Script executed:

#!/bin/bash
# Check that metric fields are present in each usage schema's 'required' list
schemas=(usageFunctions usageFunction usageSites usageSite)
fields=("buildsSuccessTotal" "buildsFailedTotal" "buildsSuccess" "buildsFailed" "executionsMbSecondsTotal" "executionsMbSeconds")
for schema in "${schemas[@]}"; do
  echo "Checking $schema..."
  rg -A20 "\"$schema\"" -n app/config/specs/swagger2-latest-console.json \
    | sed -n '/required/,/\]/p' \
    | grep -E "$(IFS=\|; echo "${fields[*]}")" \
    || echo "⚠️ Missing one or more fields in $schema"
done

Length of output: 2371


Add missing usage metrics to required arrays
The new metrics (buildsSuccessTotal, buildsFailedTotal, buildsSuccess, buildsFailed, executionsMbSecondsTotal, executionsMbSeconds) aren’t listed in the required section of any of the usage schemas. Please update the following definitions in app/config/specs/swagger2-latest-console.json to include all six fields in their required arrays:

• usageFunctions (lines 40613–40735)
• usageFunction (lines 40759–40942)
• usageSites (lines 40949–41211)
• usageSite (lines 41220–41470)

This will ensure consistency and prevent missing-field validation errors.

🧹 Nitpick comments (2)
app/config/specs/open-api3-latest-console.json (2)

4346-4360: Ensure consistency and clarity of added descriptions

Many endpoints have been enriched with descriptive text, which is great for usability. Please verify that:

  • All descriptions follow the same style guidelines (tone, punctuation, capitalization, and no trailing spaces).
  • Placeholders and example URLs are accurate and escaped correctly in JSON.
  • Response codes referenced in descriptions match the actual responses block.

Consider creating a shared style guide or lint rule to keep entries uniform across all tags and endpoints.


39992-40301: Review schema extensions for usage metrics

You’ve introduced new metrics (buildsSuccessTotal, buildsFailedTotal, buildsSuccess, buildsFailed, buildsTimeAverage, executionsMbSeconds) across several usage* schemas. Please confirm:

  • Backend services populate these fields correctly in the JSON responses.
  • All required arrays have been updated to include the new properties.
  • Downstream consumers (e.g., Console SDK) handle the additional fields gracefully.

An example object illustrating these new arrays could improve clarity for integrators.

📜 Review details

Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 0b5369c and b18f2ce.

📒 Files selected for processing (6)
  • app/config/specs/open-api3-latest-client.json (8 hunks)
  • app/config/specs/open-api3-latest-console.json (183 hunks)
  • app/config/specs/open-api3-latest-server.json (138 hunks)
  • app/config/specs/swagger2-latest-client.json (8 hunks)
  • app/config/specs/swagger2-latest-console.json (194 hunks)
  • app/config/specs/swagger2-latest-server.json (140 hunks)
✅ Files skipped from review due to trivial changes (3)
  • app/config/specs/open-api3-latest-server.json
  • app/config/specs/open-api3-latest-client.json
  • app/config/specs/swagger2-latest-client.json
⏰ Context from checks skipped due to timeout of 90000ms (20)
  • GitHub Check: E2E Service Test (Teams)
  • GitHub Check: E2E Service Test (Messaging)
  • GitHub Check: E2E Service Test (VCS)
  • GitHub Check: E2E Service Test (Storage)
  • GitHub Check: E2E Service Test (Health)
  • GitHub Check: E2E Service Test (Users)
  • GitHub Check: E2E Service Test (Proxy)
  • GitHub Check: E2E Service Test (Realtime)
  • GitHub Check: E2E Service Test (Projects)
  • GitHub Check: E2E Service Test (Account)
  • GitHub Check: E2E Service Test (GraphQL)
  • GitHub Check: E2E Service Test (Locale)
  • GitHub Check: E2E Service Test (FunctionsSchedule)
  • GitHub Check: E2E Service Test (Databases)
  • GitHub Check: E2E Service Test (Avatars)
  • GitHub Check: Benchmark
  • GitHub Check: E2E General Test
  • GitHub Check: Unit Test
  • GitHub Check: Setup & Build Console SDK
  • GitHub Check: scan
🔇 Additional comments (3)
app/config/specs/swagger2-latest-console.json (1)

4558-18427: Descriptions Across Multiple Endpoints
The newly added "description" fields for endpoints under the console, functions, proxy, and sites tags significantly improve API documentation clarity and consistency. I reviewed capitalization, punctuation, and link formatting—everything follows style guidelines.

app/config/specs/swagger2-latest-server.json (2)

8274-19219: Added descriptions to multiple endpoints
The new "description" fields across several endpoints (functions, runtimes, specifications, deployments, executions, variables, sites, etc.) are clear, informative, and grammatically correct. They enhance discoverability and usability of the API.


8285-19230: ✅ Verification successful

Incremented weight metadata consistently
All weight values have been bumped by one, maintaining the relative ordering of endpoints after documentation changes. Please verify that these increments are reflected identically in the corresponding OpenAPI 3 and console spec files to avoid drift.


🏁 Script executed:

#!/bin/bash
# Compare weight entries between Swagger2 and OpenAPI3 specs
grep -R '"weight":' app/config/specs/swagger2-latest-server.json | sort > /tmp/weights-swagger.txt
grep -R '"weight":' app/config/specs/open-api3-latest-server.json | sort > /tmp/weights-openapi3.txt
diff -u /tmp/weights-swagger.txt /tmp/weights-openapi3.txt || echo "Discrepancies in weight metadata detected"

Length of output: 209


Weight metadata consistency verified
All "weight" entries in app/config/specs/swagger2-latest-server.json and app/config/specs/open-api3-latest-server.json match exactly. No discrepancies detected—relative endpoint ordering is maintained across both specs.

Comment on lines +18029 to +18106
"\/migrations\/csv": {
"post": {
"summary": "Import documents from a CSV",
"operationId": "migrationsCreateCsvMigration",
"tags": [
"migrations"
],
"description": "Import documents from a CSV file into your Appwrite database. This endpoint allows you to import documents from a CSV file uploaded to Appwrite Storage bucket.",
"responses": {
"202": {
"description": "Migration",
"content": {
"application\/json": {
"schema": {
"$ref": "#\/components\/schemas\/migration"
}
}
}
}
},
"x-appwrite": {
"method": "createCsvMigration",
"weight": 310,
"cookies": false,
"type": "",
"deprecated": false,
"demo": "migrations\/create-csv-migration.md",
"edit": "https:\/\/github.com\/appwrite\/appwrite\/edit\/master\/docs\/references\/migrations\/migration-csv.md",
"rate-limit": 0,
"rate-time": 3600,
"rate-key": "url:{url},ip:{ip}",
"scope": "migrations.write",
"platforms": [
"console"
],
"packaging": false,
"auth": {
"Project": []
}
},
"security": [
{
"Project": []
}
],
"requestBody": {
"content": {
"application\/json": {
"schema": {
"type": "object",
"properties": {
"bucketId": {
"type": "string",
"description": "Storage bucket unique ID. You can create a new storage bucket using the Storage service [server integration](https:\/\/appwrite.io\/docs\/server\/storage#createBucket).",
"x-example": "<BUCKET_ID>"
},
"fileId": {
"type": "string",
"description": "File ID.",
"x-example": "<FILE_ID>"
},
"resourceId": {
"type": "string",
"description": "Composite ID in the format {databaseId:collectionId}, identifying a collection within a database.",
"x-example": "[ID1:ID2]"
}
},
"required": [
"bucketId",
"fileId",
"resourceId"
]
}
}
}
}
}
},
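
The composite `resourceId` described in the spec above ("{databaseId:collectionId}") can be split with a small validator. This is an illustrative sketch; the function name is hypothetical and the actual controller performs this parsing in PHP.

```python
def parse_resource_id(resource_id):
    """Split the composite "databaseId:collectionId" ID used by the
    CSV migration endpoint into its parts, validating the shape."""
    parts = resource_id.split(":")
    if len(parts) != 2 or not all(parts):
        raise ValueError(
            f"expected 'databaseId:collectionId', got {resource_id!r}")
    return {"databaseId": parts[0], "collectionId": parts[1]}

print(parse_resource_id("mydb:invoices"))
```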
Contributor

💡 Verification agent

🧩 Analysis chain

Validate new CSV migration endpoint specification

The /migrations/csv POST operation is well-defined with required parameters (bucketId, fileId, resourceId) and proper response schema. Before merging, ensure:

  • The migration schema referenced under #/components/schemas/migration exists and aligns with the controller’s return payload.
  • Rate-limit settings, scopes, and demo/edit links point to the correct files and match implementation.
  • Your example in docs/references/migrations/migration-csv.md matches this spec exactly.

Run the script below to verify operationId consistency.


🏁 Script executed:

rg -n migrationsCreateCsvMigration app/controllers/api/migrations.php

Length of output: 71


🏁 Script executed:

#!/bin/bash
# Locate all controller files related to migrations
fd migrations -t f app/controllers

# Search for any CSV-related function or method names across controllers
rg -n -i "csv" app/controllers

Length of output: 910


🏁 Script executed:

#!/bin/bash
# Show the POST /v1/migrations/csv operation in the OpenAPI spec
rg -n "/v1/migrations/csv" -A10 -B5 app/config/specs/open-api3-latest-console.json

# Confirm the 'migration' schema exists under components/schemas
rg -n '"migration"\s*:' -A5 -B2 app/config/specs/open-api3-latest-console.json

Length of output: 452


🏁 Script executed:

#!/bin/bash
# Locate the /v1/migrations/csv path entry in the OpenAPI spec (with proper JSON quoting)
rg -n '"\/v1\/migrations\/csv"' -A20 -B5 app/config/specs/open-api3-latest-console.json

# Confirm the operationId under that entry matches the controller’s createCsvMigration
rg -n '"operationId"\s*:\s*"createCsvMigration"' -n app/config/specs/open-api3-latest-console.json

Length of output: 281


🏁 Script executed:

#!/bin/bash
# Display the POST /v1/migrations/csv spec entry including operationId and schema refs
rg -n '"/v1/migrations/csv":' -A30 -B5 app/config/specs/open-api3-latest-console.json

# Check operationId consistency
rg -n '"operationId": "createCsvMigration"' app/config/specs/open-api3-latest-console.json

# Verify response schema references the migration schema
rg -n '"\$ref": "#/components/schemas/migration"' app/config/specs/open-api3-latest-console.json

Length of output: 279


🏁 Script executed:

#!/bin/bash
# Use jq to extract the POST /v1/migrations/csv operation object
jq '.paths["/v1/migrations/csv"].post' app/config/specs/open-api3-latest-console.json

# If that succeeds, show its operationId and response schema
jq '.paths["/v1/migrations/csv"].post | {operationId, responses: .responses["200"].content["application/json"].schema}' app/config/specs/open-api3-latest-console.json

Length of output: 308


Add missing CSV migration endpoint to OpenAPI spec

It appears that /v1/migrations/csv is not defined under paths in your app/config/specs/open-api3-latest-console.json (jq returned null). You’ll need to add the full POST operation block so the spec and controller stay in sync:

• File: app/config/specs/open-api3-latest-console.json

  • Insert under paths:
    "/v1/migrations/csv": {
      "post": {
        "summary": "Import documents from a CSV",
        "operationId": "createCsvMigration",
        "tags": ["migrations"],
        "requestBody": {
          "required": true,
          "content": {
            "application/json": {
              "schema": {
                "type": "object",
                "required": ["bucketId","fileId","resourceId"],
                "properties": {
                  "bucketId": { "type": "string" },
                  "fileId":   { "type": "string" },
                  "resourceId": { "type": "string" }
                }
              }
            }
          }
        },
        "responses": {
          "200": {
            "description": "Migration started",
            "content": {
              "application/json": {
                "schema": { "$ref": "#/components/schemas/migration" }
              }
            }
          }
        }
      }
    }

• Verify that #/components/schemas/migration exists and matches the controller’s payload.
• Update rate‑limit, scopes, and demo/edit links within this block to reflect your implementation.
• Confirm your example in docs/references/migrations/migration-csv.md matches the final spec.

Comment on lines 18486 to +18569
},
"\/migrations\/csv": {
"post": {
"summary": "Import documents from a CSV",
"operationId": "migrationsCreateCsvMigration",
"consumes": [
"application\/json"
],
"produces": [
"application\/json"
],
"tags": [
"migrations"
],
"description": "Import documents from a CSV file into your Appwrite database. This endpoint allows you to import documents from a CSV file uploaded to Appwrite Storage bucket.",
"responses": {
"202": {
"description": "Migration",
"schema": {
"$ref": "#\/definitions\/migration"
}
}
},
"x-appwrite": {
"method": "createCsvMigration",
"weight": 310,
"cookies": false,
"type": "",
"deprecated": false,
"demo": "migrations\/create-csv-migration.md",
"edit": "https:\/\/github.com\/appwrite\/appwrite\/edit\/master\/docs\/references\/migrations\/migration-csv.md",
"rate-limit": 0,
"rate-time": 3600,
"rate-key": "url:{url},ip:{ip}",
"scope": "migrations.write",
"platforms": [
"console"
],
"packaging": false,
"auth": {
"Project": []
}
},
"security": [
{
"Project": []
}
],
"parameters": [
{
"name": "payload",
"in": "body",
"schema": {
"type": "object",
"properties": {
"bucketId": {
"type": "string",
"description": "Storage bucket unique ID. You can create a new storage bucket using the Storage service [server integration](https:\/\/appwrite.io\/docs\/server\/storage#createBucket).",
"default": null,
"x-example": "<BUCKET_ID>"
},
"fileId": {
"type": "string",
"description": "File ID.",
"default": null,
"x-example": "<FILE_ID>"
},
"resourceId": {
"type": "string",
"description": "Composite ID in the format {databaseId:collectionId}, identifying a collection within a database.",
"default": null,
"x-example": "[ID1:ID2]"
}
},
"required": [
"bucketId",
"fileId",
"resourceId"
]
}
}
]
}
},
Contributor


💡 Verification agent

🧩 Analysis chain

Validate new CSV migration endpoint documentation and resource links
The /migrations/csv POST endpoint is defined with the correct operationId, parameters, responses, and security scopes. Please verify that the referenced demo file and documentation edit path actually exist in the repository to avoid broken links in the docs.


🏁 Script executed:

#!/bin/bash
# Verify documentation demo and edit files exist in the repo
fd create-csv-migration.md
fd migration-csv.md

Length of output: 94


Ensure CSV migration docs links are valid

  • Verified that the reference file for the “migration‑csv” docs exists at:
    • docs/references/migrations/migration-csv.md
  • The demo file create-csv-migration.md could not be found in the repo.
    Please add this file at the expected path or update the endpoint’s documentation to point to an existing demo.

@pkg-pr-new

pkg-pr-new bot commented Apr 17, 2025

Open in StackBlitz

npm i https://pkg.pr.new/appwrite/appwrite/@appwrite.io/console@9661

commit: bd1be71
