Conversation
**Walkthrough**

The changes introduce support for importing documents into the database from CSV files. This includes adding a new API endpoint for initiating CSV-based migrations, extending the migrations schema to track CSV-specific metadata, and updating the migration worker to handle CSV sources. New storage infrastructure is provisioned for imports, with corresponding Docker and application resource updates. Comprehensive end-to-end tests are added to verify successful and error scenarios for CSV migrations. Documentation for the new endpoint is also introduced.
**Sequence Diagram(s)**

```mermaid
sequenceDiagram
    participant Client
    participant API
    participant Storage
    participant ImportsDevice
    participant DB
    participant MigrationWorker
    Client->>API: POST /v1/migrations/csv (bucketId, fileId, resourceId)
    API->>Storage: Validate bucket and file existence
    API->>ImportsDevice: Copy file to /storage/imports
    API->>DB: Create migration document (status: pending)
    API->>MigrationWorker: Enqueue migration
    API-->>Client: 202 Accepted + migration document
    MigrationWorker->>ImportsDevice: Read CSV file
    MigrationWorker->>DB: Import documents into collection
    MigrationWorker->>DB: Update migration status and counters
```
**Security Scan Results for PR**

Docker Image Scan Results

Source Code Scan Results: 🎉 No vulnerabilities found!
Actionable comments posted: 3
🧹 Nitpick comments (3)
app/controllers/api/migrations.php (1)
370-378: `statusCounters` initial value deviates from existing endpoints

Other migration creators initialise `statusCounters` and `resourceData` with JSON-encoded objects (`'{}'`). Here they are initialised with PHP arrays (`[]`). While both serialise to JSON objects in most drivers, the inconsistency can cause unexpected type mismatches when documents are later cast to arrays.

```diff
- 'statusCounters' => [],
- 'resourceData' => [],
+ 'statusCounters' => '{}',
+ 'resourceData' => '{}',
```

tests/e2e/Services/Migrations/MigrationsBase.php (2)
1154-1160: Extraneous `endpoint` key in request body

`/v1/migrations/csv` does not define an `endpoint` parameter, so the value sent here is ignored. Cleaning it up makes the test input clearer and guards against future validation tightening.

```diff
- 'endpoint' => 'http://localhost/v1',
```
1187-1202: 60-second polling window may be brittle in CI

`assertEventually` waits up to 60 s (120 × 500 ms) for the migration to finish. On heavily loaded runners this can still be insufficient, causing sporadic test flakiness. Consider making the timeout configurable via an environment variable, e.g.

```php
$timeoutMs = getenv('E2E_TIMEOUT_MS') ?: 120000;
$this->assertEventually($callback, $timeoutMs, 500);
```
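The suggestion above can be sketched as a small language-agnostic polling helper. This Python version is illustrative only (the real `assertEventually` lives in the PHP test base class), and the `E2E_TIMEOUT_MS` variable is the reviewer's proposal rather than an existing setting:

```python
import os
import time

def assert_eventually(condition, timeout_ms=None, interval_ms=500):
    """Poll `condition` until it returns True or the timeout elapses."""
    # Fall back to a default that can be overridden per environment.
    if timeout_ms is None:
        timeout_ms = int(os.environ.get("E2E_TIMEOUT_MS", "120000"))
    deadline = time.monotonic() + timeout_ms / 1000
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(interval_ms / 1000)
    raise AssertionError(f"condition not met within {timeout_ms} ms")

# Example: a condition that becomes true on the third poll
calls = {"n": 0}
def done():
    calls["n"] += 1
    return calls["n"] >= 3

assert_eventually(done, timeout_ms=5000, interval_ms=10)
```

Reading the override from the environment keeps slow CI runners from flaking without inflating the timeout for everyone.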
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (5)
- `composer.lock` is excluded by `!**/*.lock`
- `tests/resources/csv/documents.csv` is excluded by `!**/*.csv`
- `tests/resources/csv/irrelevant-column.csv` is excluded by `!**/*.csv`
- `tests/resources/csv/missing-column.csv` is excluded by `!**/*.csv`
- `tests/resources/csv/missing-row.csv` is excluded by `!**/*.csv`
📒 Files selected for processing (11)
- `Dockerfile` (1 hunks)
- `app/config/collections/projects.php` (3 hunks)
- `app/controllers/api/migrations.php` (3 hunks)
- `app/init/constants.php` (1 hunks)
- `app/init/resources.php` (1 hunks)
- `app/worker.php` (1 hunks)
- `composer.json` (1 hunks)
- `docker-compose.yml` (4 hunks)
- `docs/references/migrations/migration-csv.md` (1 hunks)
- `src/Appwrite/Platform/Workers/Migrations.php` (5 hunks)
- `tests/e2e/Services/Migrations/MigrationsBase.php` (2 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (2)
app/worker.php (1)
- app/init/resources.php (1): `getDevice` (527-623)

tests/e2e/Services/Migrations/MigrationsBase.php (4)
- src/Appwrite/Migration/Migration.php (1): `Migration` (18-427)
- src/Appwrite/Platform/Appwrite.php (1): `Appwrite` (12-22)
- tests/e2e/Client.php (1): `Client` (8-317)
- src/Appwrite/Utopia/Request.php (1): `getHeaders` (147-169)
⏰ Context from checks skipped due to timeout of 90000ms (13)
- GitHub Check: E2E Service Test (Teams)
- GitHub Check: E2E Service Test (Messaging)
- GitHub Check: E2E Service Test (Realtime)
- GitHub Check: E2E Service Test (Locale)
- GitHub Check: E2E Service Test (Health)
- GitHub Check: E2E Service Test (Projects)
- GitHub Check: E2E Service Test (GraphQL)
- GitHub Check: E2E Service Test (Databases)
- GitHub Check: E2E Service Test (FunctionsSchedule)
- GitHub Check: E2E General Test
- GitHub Check: Benchmark
- GitHub Check: Unit Test
- GitHub Check: scan
🔇 Additional comments (21)
composer.json (1)
65-65: Verify compatibility and version format for `utopia-php/migration`

Pinning the dependency to `"0.9.1"` is valid, but it may prevent seamless patch upgrades. Consider whether a semantic version range (e.g., `^0.9.1`) would be more appropriate per your release policy. Ensure the Composer lockfile is updated and run the migration-related test suite to catch any breaking changes introduced by version 0.9.1.

app/init/constants.php (1)
53-53: New import storage constant added correctly

The `APP_STORAGE_IMPORTS` constant follows the existing naming pattern for storage paths. Ensure any relevant configuration (e.g., environment variables, resource definitions) and infrastructure (volumes in Docker Compose) are updated to reference this path.

Dockerfile (2)
47-48: Ensure `/storage/imports` volume is mounted in Compose

You've added directory creation for `/storage/imports`. Verify that your `docker-compose.yml` also declares and mounts this path so CSV files persist across service restarts.

54-55: Set correct ownership and permissions for import storage

Applying `chown` and `chmod` to `/storage/imports` aligns with existing storage directories. You may optionally group these with other volume commands for readability, but functionally this is correct.

app/worker.php (1)
346-349: Register `deviceForImports` resource for CSV imports

This new `deviceForImports` resource mirrors the pattern used by other storage devices (e.g., `deviceForSites`). Confirm that your CSV migration controller and worker properly inject and utilize this resource when reading from and writing to the import storage path.

app/init/resources.php (1)
515-517: New resource added for handling CSV imports.

The `deviceForImports` resource is properly registered to provide storage device functionality for the new CSV import feature. This follows the established pattern used for other device resources.

docker-compose.yml (4)
75-75: New volume for CSV imports added to main appwrite service.

A dedicated persistent volume for imports is properly mounted to ensure data persistence across container restarts.

688-688: Same imports volume mounted to migrations worker.

The imports volume is correctly mounted to the migrations worker service, ensuring the worker has access to the same storage as the main service.

995-995: Fixed typo in comment.

Corrected "nessessary" to "necessary" in the comment.

1164-1164: Defined new volume for imports in volumes section.

The `appwrite-imports` volume is properly defined in the volumes section, ensuring it will be created as a persistent Docker volume.

app/config/collections/projects.php (3)
2279-2288: Added options attribute to migrations schema.

A properly structured JSON field is added to store configuration options for migrations, with appropriate filters and constraints.

2345-2365: Added resourceId and resourceType attributes to migrations schema.

These fields are correctly defined with appropriate size and constraints to track specific resources being migrated, which is essential for CSV import functionality.

2390-2395: Added index for resourceId to optimize queries.

A proper database index is added on the resourceId field with descending order, which will improve query performance when looking up migrations by resource.
src/Appwrite/Platform/Workers/Migrations.php (6)
21-22: Added imports for CSV source and Device classes.

Properly importing the required classes for the new CSV import functionality.

Also applies to: 28-29

37-38: Added protected property for imports device.

The `$deviceForImports` property is correctly added to store the device instance for handling CSV file storage.

64-65: Added deviceForImports to dependency injection list.

The imports device is properly injected into the constructor following Appwrite's dependency injection pattern.
71-75: Updated action method signature and stored device instance.

The action method now correctly:
- Accepts the Device instance as a parameter
- Stores it in the class property for later use
This follows the pattern used for other dependencies.

108-111: Added extraction of resourceId and options from migration document.

These attributes are properly extracted from the migration document to support the CSV import functionality.
139-145: Added CSV case to the source processor.

The CSV source is correctly instantiated with:
- The resource ID for identification
- The path from migration options
- The storage device for accessing files
- The project database for writing data
This integration follows the pattern established for other source types.
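The instantiation pattern described above can be sketched as a source factory keyed on type. This Python version is purely illustrative (the actual worker is PHP, and the class and parameter names here are hypothetical stand-ins):

```python
# Illustrative sketch of a source-factory dispatch; not the actual worker code.
class CSVSource:
    def __init__(self, resource_id, path, device, database):
        self.resource_id = resource_id  # identifies the target collection
        self.path = path                # file path from migration options
        self.device = device            # storage device for accessing files
        self.database = database        # project database for writing data

def make_source(source_type, **kwargs):
    """Instantiate the right migration source for a given source type."""
    factories = {
        "csv": lambda: CSVSource(
            kwargs["resource_id"],
            kwargs["path"],
            kwargs["device"],
            kwargs["database"],
        ),
    }
    try:
        return factories[source_type]()
    except KeyError:
        raise ValueError(f"unsupported source type '{source_type}'")

src = make_source("csv", resource_id="db:col",
                  path="/storage/imports/a.csv",
                  device=object(), database=object())
```

A dispatch table like this keeps each source's constructor arguments in one place, mirroring how the worker adds a CSV case alongside the existing source types.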
app/controllers/api/migrations.php (1)
348-350: Missing file-type validation

The endpoint assumes that the uploaded object is a CSV if it is uncompressed and unencrypted. Consider validating the MIME type (`text/csv`) and the file extension (`*.csv`) to fail fast when users accidentally pass a different file.

Would you like a short helper to perform this validation before the copy step?
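A rough sketch of such a check follows (in Python for illustration; a real helper would be PHP in the controller and would read the file document's stored MIME type and name). The `text/plain` allowance is an assumption, since some clients upload CSVs with that label:

```python
import os

# Assumed accept-list; adjust to match the project's validation policy.
ALLOWED_MIME_TYPES = {"text/csv", "text/plain"}

def validate_csv_file(file_name: str, mime_type: str) -> None:
    """Fail fast when the selected storage file is clearly not a CSV."""
    _, ext = os.path.splitext(file_name)
    if ext.lower() != ".csv":
        raise ValueError(f"expected a .csv file, got '{file_name}'")
    if mime_type not in ALLOWED_MIME_TYPES:
        raise ValueError(f"unexpected MIME type '{mime_type}' for CSV import")

validate_csv_file("documents.csv", "text/csv")  # passes silently
```

Running this before the copy step avoids queuing a migration that is guaranteed to fail in the worker.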
tests/e2e/Services/Migrations/MigrationsBase.php (1)
921-972: Resource clean-up missing

The test creates databases, collections, buckets and attributes but does not delete them on success/failure. Accumulated artefacts can slow down subsequent tests and consume storage.

Do you want helper tear-down code added to remove `$databaseId`, `$collectionId`, and the two buckets once the assertions finish?
Actionable comments posted: 2
🔭 Outside diff range comments (1)
app/config/specs/swagger2-latest-console.json (1)
40613-40735: 💡 Verification agent

🧩 Analysis chain
Verify consistency of newly added usage metrics schema fields

Several new properties, such as `buildsSuccessTotal`, `buildsFailedTotal`, `buildsSuccess`, `buildsFailed`, and the corresponding execution metrics, have been added across the usage schemas (`usageFunctions`, `usageFunction`, `usageSites`, `usageSite`). Ensure each schema's `required` array includes these new fields and that naming is uniform across all definitions.

Also applies to: 40759-40942, 40949-41211, 41220-41470
🏁 Script executed:

```bash
#!/bin/bash
# Check that metric fields are present in each usage schema's 'required' list
schemas=(usageFunctions usageFunction usageSites usageSite)
fields=("buildsSuccessTotal" "buildsFailedTotal" "buildsSuccess" "buildsFailed" "executionsMbSecondsTotal" "executionsMbSeconds")
for schema in "${schemas[@]}"; do
  echo "Checking $schema..."
  rg -A20 "\"$schema\"" -n app/config/specs/swagger2-latest-console.json \
    | sed -n '/required/,/\]/p' \
    | grep -E "$(IFS=\|; echo "${fields[*]}")" \
    || echo "⚠️ Missing one or more fields in $schema"
done
```

Length of output: 2371
Add missing usage metrics to `required` arrays

The new metrics (`buildsSuccessTotal`, `buildsFailedTotal`, `buildsSuccess`, `buildsFailed`, `executionsMbSecondsTotal`, `executionsMbSeconds`) aren't listed in the `required` section of any of the usage schemas. Please update the following definitions in `app/config/specs/swagger2-latest-console.json` to include all six fields in their `required` arrays:

- usageFunctions (lines 40613–40735)
- usageFunction (lines 40759–40942)
- usageSites (lines 40949–41211)
- usageSite (lines 41220–41470)

This will ensure consistency and prevent missing-field validation errors.
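The same audit can be expressed as a small standalone script. This sketch (illustrative, not part of the repository) checks a schema fragment's `required` array against the six new metric names:

```python
import json

NEW_METRICS = [
    "buildsSuccessTotal", "buildsFailedTotal", "buildsSuccess",
    "buildsFailed", "executionsMbSecondsTotal", "executionsMbSeconds",
]

def missing_required(schema: dict) -> list:
    """Return the new metric fields absent from the schema's 'required' array."""
    required = set(schema.get("required", []))
    return [f for f in NEW_METRICS if f not in required]

# Minimal stand-in for one of the usage schemas in the spec file
usage_functions = json.loads("""
{
  "required": ["range", "buildsSuccessTotal", "buildsFailedTotal"],
  "properties": {}
}
""")

print(missing_required(usage_functions))
# ['buildsSuccess', 'buildsFailed', 'executionsMbSecondsTotal', 'executionsMbSeconds']
```

Pointing `json.loads` at the real spec file instead of the inline fragment would turn this into a one-off consistency check for all four schemas.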
🧹 Nitpick comments (2)
app/config/specs/open-api3-latest-console.json (2)
4346-4360: Ensure consistency and clarity of added descriptions

Many endpoints have been enriched with descriptive text, which is great for usability. Please verify that:

- All descriptions follow the same style guidelines (tone, punctuation, capitalization, and no trailing spaces).
- Placeholders and example URLs are accurate and escaped correctly in JSON.
- Response codes referenced in descriptions match the actual `responses` block.

Consider creating a shared style guide or lint rule to keep entries uniform across all tags and endpoints.
39992-40301: Review schema extensions for usage metrics

You've introduced new metrics (`buildsSuccessTotal`, `buildsFailedTotal`, `buildsSuccess`, `buildsFailed`, `buildsTimeAverage`, `executionsMbSeconds`) across several `usage*` schemas. Please confirm:

- Backend services populate these fields correctly in the JSON responses.
- All `required` arrays have been updated to include the new properties.
- Downstream consumers (e.g., Console SDK) handle the additional fields gracefully.

An example object illustrating these new arrays could improve clarity for integrators.
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (6)
- `app/config/specs/open-api3-latest-client.json` (8 hunks)
- `app/config/specs/open-api3-latest-console.json` (183 hunks)
- `app/config/specs/open-api3-latest-server.json` (138 hunks)
- `app/config/specs/swagger2-latest-client.json` (8 hunks)
- `app/config/specs/swagger2-latest-console.json` (194 hunks)
- `app/config/specs/swagger2-latest-server.json` (140 hunks)
✅ Files skipped from review due to trivial changes (3)
- app/config/specs/open-api3-latest-server.json
- app/config/specs/open-api3-latest-client.json
- app/config/specs/swagger2-latest-client.json
⏰ Context from checks skipped due to timeout of 90000ms (20)
- GitHub Check: E2E Service Test (Teams)
- GitHub Check: E2E Service Test (Messaging)
- GitHub Check: E2E Service Test (VCS)
- GitHub Check: E2E Service Test (Storage)
- GitHub Check: E2E Service Test (Health)
- GitHub Check: E2E Service Test (Users)
- GitHub Check: E2E Service Test (Proxy)
- GitHub Check: E2E Service Test (Realtime)
- GitHub Check: E2E Service Test (Projects)
- GitHub Check: E2E Service Test (Account)
- GitHub Check: E2E Service Test (GraphQL)
- GitHub Check: E2E Service Test (Locale)
- GitHub Check: E2E Service Test (FunctionsSchedule)
- GitHub Check: E2E Service Test (Databases)
- GitHub Check: E2E Service Test (Avatars)
- GitHub Check: Benchmark
- GitHub Check: E2E General Test
- GitHub Check: Unit Test
- GitHub Check: Setup & Build Console SDK
- GitHub Check: scan
🔇 Additional comments (3)
app/config/specs/swagger2-latest-console.json (1)
4558-18427: Descriptions Across Multiple Endpoints

The newly added `"description"` fields for endpoints under the `console`, `functions`, `proxy`, and `sites` tags significantly improve API documentation clarity and consistency. I reviewed capitalization, punctuation, and link formatting; everything follows style guidelines.

app/config/specs/swagger2-latest-server.json (2)

8274-19219: Added descriptions to multiple endpoints

The new `"description"` fields across several endpoints (functions, runtimes, specifications, deployments, executions, variables, sites, etc.) are clear, informative, and grammatically correct. They enhance discoverability and usability of the API.
8285-19230: Details

✅ Verification successful

Incremented `weight` metadata consistently

All `weight` values have been bumped by one, maintaining the relative ordering of endpoints after documentation changes. Please verify that these increments are reflected identically in the corresponding OpenAPI 3 and console spec files to avoid drift.
🏁 Script executed:

```bash
#!/bin/bash
# Compare weight entries between Swagger2 and OpenAPI3 specs
grep -R '"weight":' app/config/specs/swagger2-latest-server.json | sort > /tmp/weights-swagger.txt
grep -R '"weight":' app/config/specs/open-api3-latest-server.json | sort > /tmp/weights-openapi3.txt
diff -u /tmp/weights-swagger.txt /tmp/weights-openapi3.txt || echo "Discrepancies in weight metadata detected"
```

Length of output: 209
Weight metadata consistency verified

All `"weight"` entries in `app/config/specs/swagger2-latest-server.json` and `app/config/specs/open-api3-latest-server.json` match exactly. No discrepancies detected; relative endpoint ordering is maintained across both specs.
```json
"\/migrations\/csv": {
    "post": {
        "summary": "Import documents from a CSV",
        "operationId": "migrationsCreateCsvMigration",
        "tags": [
            "migrations"
        ],
        "description": "Import documents from a CSV file into your Appwrite database. This endpoint allows you to import documents from a CSV file uploaded to Appwrite Storage bucket.",
        "responses": {
            "202": {
                "description": "Migration",
                "content": {
                    "application\/json": {
                        "schema": {
                            "$ref": "#\/components\/schemas\/migration"
                        }
                    }
                }
            }
        },
        "x-appwrite": {
            "method": "createCsvMigration",
            "weight": 310,
            "cookies": false,
            "type": "",
            "deprecated": false,
            "demo": "migrations\/create-csv-migration.md",
            "edit": "https:\/\/github.com\/appwrite\/appwrite\/edit\/master\/docs\/references\/migrations\/migration-csv.md",
            "rate-limit": 0,
            "rate-time": 3600,
            "rate-key": "url:{url},ip:{ip}",
            "scope": "migrations.write",
            "platforms": [
                "console"
            ],
            "packaging": false,
            "auth": {
                "Project": []
            }
        },
        "security": [
            {
                "Project": []
            }
        ],
        "requestBody": {
            "content": {
                "application\/json": {
                    "schema": {
                        "type": "object",
                        "properties": {
                            "bucketId": {
                                "type": "string",
                                "description": "Storage bucket unique ID. You can create a new storage bucket using the Storage service [server integration](https:\/\/appwrite.io\/docs\/server\/storage#createBucket).",
                                "x-example": "<BUCKET_ID>"
                            },
                            "fileId": {
                                "type": "string",
                                "description": "File ID.",
                                "x-example": "<FILE_ID>"
                            },
                            "resourceId": {
                                "type": "string",
                                "description": "Composite ID in the format {databaseId:collectionId}, identifying a collection within a database.",
                                "x-example": "[ID1:ID2]"
                            }
                        },
                        "required": [
                            "bucketId",
                            "fileId",
                            "resourceId"
                        ]
                    }
                }
            }
        }
    }
},
```
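The `resourceId` parameter above packs two identifiers into one string. As an illustrative sketch (the real parsing happens server-side in the PHP controller), splitting and validating the `databaseId:collectionId` composite might look like:

```python
def parse_resource_id(resource_id: str) -> tuple:
    """Split a 'databaseId:collectionId' composite into its two parts."""
    parts = resource_id.split(":")
    # Reject anything that is not exactly two non-empty segments.
    if len(parts) != 2 or not all(parts):
        raise ValueError(f"invalid resourceId '{resource_id}', "
                         "expected 'databaseId:collectionId'")
    return parts[0], parts[1]

database_id, collection_id = parse_resource_id("mydb:orders")
print(database_id, collection_id)  # mydb orders
```

Validating the composite before enqueuing the migration gives callers an immediate 400-style error instead of a failed background job.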
💡 Verification agent
🧩 Analysis chain
Validate new CSV migration endpoint specification
The /migrations/csv POST operation is well-defined with required parameters (bucketId, fileId, resourceId) and proper response schema. Before merging, ensure:

- The `migration` schema referenced under `#/components/schemas/migration` exists and aligns with the controller's return payload.
- Rate-limit settings, scopes, and demo/edit links point to the correct files and match implementation.
- Your example in `docs/references/migrations/migration-csv.md` matches this spec exactly.

Run the script below to verify operationId consistency.
🏁 Script executed:

```bash
rg -n migrationsCreateCsvMigration app/controllers/api/migrations.php
```

Length of output: 71
🏁 Script executed:

```bash
#!/bin/bash
# Locate all controller files related to migrations
fd migrations -t f app/controllers
# Search for any CSV-related function or method names across controllers
rg -n -i "csv" app/controllers
```

Length of output: 910
🏁 Script executed:

```bash
#!/bin/bash
# Show the POST /v1/migrations/csv operation in the OpenAPI spec
rg -n "/v1/migrations/csv" -A10 -B5 app/config/specs/open-api3-latest-console.json
# Confirm the 'migration' schema exists under components/schemas
rg -n '"migration"\s*:' -A5 -B2 app/config/specs/open-api3-latest-console.json
```

Length of output: 452
🏁 Script executed:

```bash
#!/bin/bash
# Locate the /v1/migrations/csv path entry in the OpenAPI spec (with proper JSON quoting)
rg -n '"\/v1\/migrations\/csv"' -A20 -B5 app/config/specs/open-api3-latest-console.json
# Confirm the operationId under that entry matches the controller's createCsvMigration
rg -n '"operationId"\s*:\s*"createCsvMigration"' app/config/specs/open-api3-latest-console.json
```

Length of output: 281
🏁 Script executed:

```bash
#!/bin/bash
# Display the POST /v1/migrations/csv spec entry including operationId and schema refs
rg -n '"/v1/migrations/csv":' -A30 -B5 app/config/specs/open-api3-latest-console.json
# Check operationId consistency
rg -n '"operationId": "createCsvMigration"' app/config/specs/open-api3-latest-console.json
# Verify response schema references the migration schema
rg -n '"\$ref": "#/components/schemas/migration"' app/config/specs/open-api3-latest-console.json
```

Length of output: 279
🏁 Script executed:

```bash
#!/bin/bash
# Use jq to extract the POST /v1/migrations/csv operation object
jq '.paths["/v1/migrations/csv"].post' app/config/specs/open-api3-latest-console.json
# If that succeeds, show its operationId and response schema
jq '.paths["/v1/migrations/csv"].post | {operationId, responses: .responses["200"].content["application/json"].schema}' app/config/specs/open-api3-latest-console.json
```

Length of output: 308
Add missing CSV migration endpoint to OpenAPI spec
It appears that /v1/migrations/csv is not defined under paths in your app/config/specs/open-api3-latest-console.json (jq returned null). You’ll need to add the full POST operation block so the spec and controller stay in sync:
• File: app/config/specs/open-api3-latest-console.json
- Insert under `paths`:

```json
"/v1/migrations/csv": {
  "post": {
    "summary": "Import documents from a CSV",
    "operationId": "createCsvMigration",
    "tags": ["migrations"],
    "requestBody": {
      "required": true,
      "content": {
        "application/json": {
          "schema": {
            "type": "object",
            "required": ["bucketId", "fileId", "resourceId"],
            "properties": {
              "bucketId": { "type": "string" },
              "fileId": { "type": "string" },
              "resourceId": { "type": "string" }
            }
          }
        }
      }
    },
    "responses": {
      "200": {
        "description": "Migration started",
        "content": {
          "application/json": {
            "schema": { "$ref": "#/components/schemas/migration" }
          }
        }
      }
    }
  }
}
```
• Verify that #/components/schemas/migration exists and matches the controller’s payload.
• Update rate‑limit, scopes, and demo/edit links within this block to reflect your implementation.
• Confirm your example in docs/references/migrations/migration-csv.md matches the final spec.
```json
},
"\/migrations\/csv": {
    "post": {
        "summary": "Import documents from a CSV",
        "operationId": "migrationsCreateCsvMigration",
        "consumes": [
            "application\/json"
        ],
        "produces": [
            "application\/json"
        ],
        "tags": [
            "migrations"
        ],
        "description": "Import documents from a CSV file into your Appwrite database. This endpoint allows you to import documents from a CSV file uploaded to Appwrite Storage bucket.",
        "responses": {
            "202": {
                "description": "Migration",
                "schema": {
                    "$ref": "#\/definitions\/migration"
                }
            }
        },
        "x-appwrite": {
            "method": "createCsvMigration",
            "weight": 310,
            "cookies": false,
            "type": "",
            "deprecated": false,
            "demo": "migrations\/create-csv-migration.md",
            "edit": "https:\/\/github.com\/appwrite\/appwrite\/edit\/master\/docs\/references\/migrations\/migration-csv.md",
            "rate-limit": 0,
            "rate-time": 3600,
            "rate-key": "url:{url},ip:{ip}",
            "scope": "migrations.write",
            "platforms": [
                "console"
            ],
            "packaging": false,
            "auth": {
                "Project": []
            }
        },
        "security": [
            {
                "Project": []
            }
        ],
        "parameters": [
            {
                "name": "payload",
                "in": "body",
                "schema": {
                    "type": "object",
                    "properties": {
                        "bucketId": {
                            "type": "string",
                            "description": "Storage bucket unique ID. You can create a new storage bucket using the Storage service [server integration](https:\/\/appwrite.io\/docs\/server\/storage#createBucket).",
                            "default": null,
                            "x-example": "<BUCKET_ID>"
                        },
                        "fileId": {
                            "type": "string",
                            "description": "File ID.",
                            "default": null,
                            "x-example": "<FILE_ID>"
                        },
                        "resourceId": {
                            "type": "string",
                            "description": "Composite ID in the format {databaseId:collectionId}, identifying a collection within a database.",
                            "default": null,
                            "x-example": "[ID1:ID2]"
                        }
                    },
                    "required": [
                        "bucketId",
                        "fileId",
                        "resourceId"
                    ]
                }
            }
        ]
    }
},
```
💡 Verification agent
🧩 Analysis chain
Validate new CSV migration endpoint documentation and resource links
The /migrations/csv POST endpoint is defined with the correct operationId, parameters, responses, and security scopes. Please verify that the referenced demo file and documentation edit path actually exist in the repository to avoid broken links in the docs.
🏁 Script executed:

```bash
#!/bin/bash
# Verify documentation demo and edit files exist in the repo
fd create-csv-migration.md
fd migration-csv.md
```

Length of output: 94
Ensure CSV migration docs links are valid
- Verified that the reference file for the "migration-csv" docs exists at `docs/references/migrations/migration-csv.md`.
- The demo file `create-csv-migration.md` could not be found in the repo. Please add this file at the expected path or update the endpoint's documentation to point to an existing demo.
commit:
What does this PR do?
Sites <> 1.7.x
Needed for the updated Console SDK & CSV Imports.
cc @Meldiron
Test Plan
N/A.
Related PRs and Issues
N/A.
Checklist
Summary by CodeRabbit