Tags: openradx/adit
Add Mass Transfer app with partitioned DICOM export and conversion (#298)

* Add mass transfer app and processing
* Add mass transfer opt-out pseudonymization
* Ensure mass transfer exports are cleaned up on failure
* Improve mass transfer filter and date inputs
* Fix CI image build and refine mass transfer form layout
* Document dcm2niix dependency and adjust mass transfer form layout
* Fix mass transfer study time range and worker env
* Route mass transfer via dicom queue and scope filters
* Enforce filter names and tweak UI spacing
* Fix duplicate studies in _find_studies recursive time-window split

  The recursive split used `mid` as the boundary for both halves, causing studies at the midpoint to appear in both. Additionally, since the DICOM query operates at date-level granularity, same-day splits produced identical queries returning the same results in both halves. Shift the right half to start at mid + 1s and deduplicate by StudyInstanceUID when merging.

  Co-Authored-By: Claude Opus 4.6 <[email protected]>
* Fix pyright type errors
* Address code review feedback for mass transfer PR
* Use rslave mount propagation so containers see NAS mounts
* Split tasks into processing and querying to better manage task creation
* Collapse two-phase mass transfer into single-phase with dedicated worker

  One task per partition handles discovery + export + convert on a separate mass_transfer queue. Includes review fixes and a comprehensive test suite.
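The time-window split fix described above can be sketched as follows. `Study`, `find_studies`, and the `query` callable are illustrative stand-ins for ADIT's actual C-FIND code, not its real API; the essential points from the commit are the `mid + 1s` right boundary and the StudyInstanceUID deduplication on merge.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass(frozen=True)
class Study:
    study_instance_uid: str
    study_datetime: datetime


def find_studies(query, start, end, limit=3):
    """Recursively split the time window when too many results come back.

    `query(start, end)` stands in for the DICOM C-FIND call; `limit`
    mirrors the server's maximum result cap.
    """
    results = query(start, end)
    if len(results) < limit or end - start <= timedelta(seconds=1):
        return results
    mid = start + (end - start) / 2
    left = find_studies(query, start, mid, limit)
    # Right half starts at mid + 1s so midpoint studies are not queried twice.
    right = find_studies(query, mid + timedelta(seconds=1), end, limit)
    # Deduplicate by StudyInstanceUID when merging, in case coarse
    # (date-level) server matching still returns overlapping results.
    seen: dict[str, Study] = {}
    for study in left + right:
        seen.setdefault(study.study_instance_uid, study)
    return list(seen.values())
```

With the old `mid` boundary on both halves, a study timestamped exactly at `mid` satisfied both sub-queries; the asymmetric split plus UID-keyed merge makes the result set duplicate-free regardless of how the server rounds the range.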
* Add migration for export_cleaned field on MassTransferVolume
* Include stdout in dcm2niix conversion error messages
* Add three-mode anonymization with longitudinal linking
* Fix DIMSE connection leak on abandoned generators and enable job cancellation of in-progress tasks
* Remove MassTransferAssociation model, show users the actual reason for failed transfers, improve the status message, and fix previously queued jobs not being processed when the worker handling them had died
* Add persistent DIMSE connection mode to avoid per-operation association overhead
* Rewrite mass transfer processing with deferred insertion, patient-centric folder structure, and two-mode pseudonymization
* Use per-study random pseudonyms in non-linking mode to prevent patient correlation
* Prefer C-MOVE over C-GET for mass transfer
* Revert C-MOVE preference, keep pseudonymized UID fields, and detect empty retrievals

  Reverts the C-MOVE preference since C-MOVE requires PACS-side AE registration. Restores the study/series pseudonymized UID fields on MassTransferVolume and populates them after DICOM anonymization. Tracks the image count from C-GET to skip series with 0 retrieved images instead of passing empty directories to dcm2niix.
* Include study time in folder name
* Update spec with pseudonymized UID fields, folder name format, and cleanup notes
* Re-add pseudonymized UID columns via migration 0010

  Restores 0009 (which removed the fields) and adds 0010 to re-add them, since 0009 already ran on existing databases.
* Add compressed transfer syntaxes to storage presentation contexts for C-GET

  IMPAX PACS stores most images in compressed formats (JPEG Lossless, JPEG 2000, etc.), but our storage contexts only offered uncompressed transfer syntaxes. During C-GET, the PACS could not send images back because no matching transfer syntax was negotiated, resulting in a Success status with 0 images delivered.
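A sketch of the transfer-syntax selection these C-GET fixes converge on. The UID constants are the standard DICOM values; `NON_IMAGE_SOP_CLASSES` and `transfer_syntaxes_for` are illustrative names (the real code builds pynetdicom presentation contexts rather than returning plain lists).

```python
# Well-known DICOM transfer syntax UIDs (PS3.6 registry values).
IMPLICIT_VR_LE = "1.2.840.10008.1.2"
EXPLICIT_VR_LE = "1.2.840.10008.1.2.1"
JPEG_LOSSLESS_SV1 = "1.2.840.10008.1.2.4.70"
JPEG_2000_LOSSLESS = "1.2.840.10008.1.2.4.90"

UNCOMPRESSED = [IMPLICIT_VR_LE, EXPLICIT_VR_LE]
COMPRESSED = [JPEG_LOSSLESS_SV1, JPEG_2000_LOSSLESS]

# Non-pixel-data SOP classes should not be offered compressed syntaxes,
# or the PACS may pick e.g. JPEG Lossless for RawDataStorage and fail
# the sub-operation with 0xA702 (example subset, not exhaustive).
NON_IMAGE_SOP_CLASSES = {
    "1.2.840.10008.5.1.4.1.1.66",  # Raw Data Storage
    "1.2.840.10008.5.1.4.1.1.11.1",  # Grayscale Softcopy Presentation State
}


def transfer_syntaxes_for(sop_class_uid: str) -> list[str]:
    """Pick which transfer syntaxes to request for one storage context."""
    if sop_class_uid in NON_IMAGE_SOP_CLASSES:
        return list(UNCOMPRESSED)
    return UNCOMPRESSED + COMPRESSED
```

Offering the full list matters on the requestor side too: pynetdicom's `add_requested_context` defaults to uncompressed-only when given just an abstract syntax, which is exactly the "Success with 0 images" failure mode described above.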
* Pass compressed transfer syntaxes when adding C-GET storage contexts

  The previous commit added compressed transfer syntaxes to StoragePresentationContexts, but dimse_connector.py was only passing cx.abstract_syntax to add_requested_context (which defaults to uncompressed-only). Now passes cx.transfer_syntax so the PACS can actually send images in JPEG Lossless and other compressed formats.
* Fix C-GET reliability: presentation context split, dead association cleanup, and IMAGE-level pre-check

  - Split storage presentation contexts into image (compressed + uncompressed transfer syntaxes) and non-image (uncompressed only) to prevent 0xA702 errors when the PACS selects JPEG Lossless for non-pixel-data SOP classes like RawDataStorage
  - Fix persistent connection crash ("A former connection was not closed properly") when the PACS drops the association between calls — clean up dead associations instead of raising AssertionError
  - Add IMAGE-level C-FIND pre-check before C-GET to skip series with no retrievable instances, avoiding wasted C-GET attempts on stale PACS catalog entries
  - Move max_search_results from Django settings to a DicomServer model field
  - Add store handler error logging for C-GET sub-operation failures
  - Fix _find_studies to use actual time components for same-day queries and handle cross-midnight splits correctly
* Pace C-GET requests and retry 0-image responses from IMPAX

  IMPAX returns "Success with 0 sub-operations" when overwhelmed by rapid-fire C-GET requests. Unlike selective transfer, where each study is a separate procrastinate task (with natural inter-task delay), mass transfer processes hundreds of series in a tight loop with zero spacing.
  - Add 0.5s delay between C-GET requests to pace PACS load
  - Retry 0-image responses with exponential backoff (5-80s) plus jitter
  - Mark 0-image failures as ERROR (not SKIPPED) so they are retried on subsequent task runs instead of being permanently lost
  - Remove the series_has_instances preflight C-FIND (extra PACS load)
  - Log "silent empty" C-GET responses in dimse_connector
  - Persistent mode: check self.assoc before closing in func_wrapper
* Reduce C-GET retries from 5 exponential to 1 quick retry

  Raw pynetdicom testing confirmed that IMPAX permanently refuses to serve certain series via C-GET (archived/offline storage), returning "Success with 0 sub-operations" regardless of load, timing, or transfer syntax. Five retries with exponential backoff (up to 80s) wasted ~40 minutes per affected study. One retry after 3-5s is enough to distinguish transient (PACS busy) from permanent (series unretrievable) failures. Failed series are marked ERROR and retried on the next task run.
* Fix stale C-FIND responses on persistent DIMSE connections

  When a C-FIND generator was abandoned mid-iteration (e.g. an early return in _study_has_institution), the persistent association kept unconsumed responses. The next C-FIND on the same association mixed in stale results from the previous query, causing ADIT to discover phantom series with wrong UIDs and institution names — leading to a ~94% C-GET failure rate. Abort the association when a generator is not fully consumed in persistent mode so the next request gets a clean connection. Also update the task summary message to show a downloaded/failed/skipped breakdown.
* Add job-identifying parent folder to mass transfer output path

  The output path was missing the parent folder (adit_{app}_{pk}_{date}_{owner}) that other transfer types use via _create_destination_name, making it hard to associate output directories with specific jobs.
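The "one quick retry" policy for empty C-GET responses could be sketched like this; `fetch_with_empty_retry` and its parameters are hypothetical names, with `fetch()` standing in for a single C-GET that reports how many images it retrieved.

```python
import random
import time


def fetch_with_empty_retry(fetch, retries=1, base_delay=3.0, jitter=2.0):
    """Retry a fetch that 'succeeds' with zero images.

    A single retry after a few seconds distinguishes a transient
    "PACS busy" empty response from a permanently unretrievable
    (archived/offline) series; permanent failures are left at 0 so
    the caller can mark the series ERROR for the next task run.
    """
    images = fetch()
    for _ in range(retries):
        if images > 0:
            break
        # Short pause with jitter so concurrent workers don't re-hit
        # the PACS in lockstep.
        time.sleep(base_delay + random.uniform(0, jitter))
        images = fetch()
    return images
```

Compared to the earlier 5-attempt exponential backoff (up to 80s per attempt), this caps the worst case at one extra request per permanently empty series.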
  Co-Authored-By: Claude Opus 4.6 <[email protected]>
* Override rslave mount propagation in dev compose

  Co-Authored-By: Claude Opus 4.6 <[email protected]>
* Add inline JSON filters, volume table, DICOM sidecar, and UI improvements

  - Replace the M2M filter relation with an inline filters_json JSONField
  - Add a FilterSpec dataclass and pydantic validation (FilterSchema)
  - Add age filtering (min_age/max_age) with a birth date range C-FIND and an exact client-side age check
  - Replace dcm2niix sidecar augmentation with an independent DICOM tag sidecar system (_extract_dicom_sidecar + _write_dicom_sidecar)
  - Add MassTransferVolumeTable (django-tables2) with a status filter, replacing the hardcoded HTML table in the task detail view
  - Add a CodeMirror 5 JSON editor for filter input
  - Change the anonymization default to None; the salt field is only shown for linking mode
  - Add tests for sidecar extraction, pseudonymization leak prevention, age filtering, and FilterSpec
* Fix CI failure by adding missing mass_transfer_worker Docker image tag

  The mass_transfer_worker service was missing its image tag in both the dev compose file and the CI workflow, causing "No such image" errors.

  Co-Authored-By: Claude Opus 4.6 <[email protected]>
* Fix CI: add missing mass_transfer_worker image tag, add type hint and tests for _destination_base_dir

  CI failed because the mass_transfer_worker service had no pre-built image tag in the GitHub Actions workflow, causing "No such image" on startup. Also adds a MassTransferJob type annotation to _destination_base_dir and 5 test cases covering the job-identifying output folder.
* Fix ruff lint errors: line length, import sorting, unused import
* Fix pyright type errors in mass transfer module
* Delegate compute_pseudonym to dicognito's IDAnonymizer

  Instead of reimplementing dicognito's ID generation algorithm, use IDAnonymizer directly so future algorithm changes are picked up automatically.
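The age-filtering approach above — a coarse PatientBirthDate range on the C-FIND plus an exact client-side check — could look like the sketch below. All names (`birth_date_range`, `exact_age`) and the exact range framing are illustrative, not ADIT's real helpers; the range is deliberately a superset that the precise check then narrows.

```python
from datetime import date


def _years_before(d: date, years: int) -> date:
    """Shift a date back by whole years, clamping Feb 29 to Feb 28."""
    try:
        return d.replace(year=d.year - years)
    except ValueError:  # Feb 29 in a non-leap target year
        return d.replace(year=d.year - years, day=28)


def birth_date_range(reference: date, min_age=None, max_age=None) -> str:
    """DICOM PatientBirthDate range "YYYYMMDD-YYYYMMDD" for a C-FIND.

    min_age bounds the latest allowed birth date; max_age the earliest.
    This is only a coarse server-side pre-filter; an exact per-patient
    age check is still needed client-side.
    """
    earliest = _years_before(reference, max_age + 1) if max_age is not None else None
    latest = _years_before(reference, min_age) if min_age is not None else None
    lo = earliest.strftime("%Y%m%d") if earliest else ""
    hi = latest.strftime("%Y%m%d") if latest else ""
    return f"{lo}-{hi}"


def exact_age(birth: date, reference: date) -> int:
    """Completed years between birth and reference (the precise check)."""
    return reference.year - birth.year - (
        (reference.month, reference.day) < (birth.month, birth.day)
    )
```

Pushing the coarse range into the query keeps the result set small, while the `exact_age` check catches the off-by-one cases (birthdays near the reference date) that a pure date range cannot express.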
* Vendor CodeMirror and jsonlint instead of loading from CDN

  Serves JS/CSS from Django static files so the app works in air-gapped hospital environments without internet access.
* Revert "Vendor CodeMirror and jsonlint instead of loading from CDN"

  This reverts commit 5e63ec3.
* Subclass TransferJob to add trial protocol fields

  MassTransferJob now extends TransferJob instead of DicomJob, gaining trial_protocol_id, trial_protocol_name, and the can_transfer_unpseudonymized permission. The processor passes the trial protocol fields through to DICOM pseudonymization.
* Remove unused MASS_TRANSFER_EXPORT_BASE_DIR env variable

  Not read by any Python code — the processor uses the DicomFolder destination path directly.
* Remove redundant MOUNT_DIR volume from dev compose

  Already inherited from docker-compose.base.yml with rslave propagation. The override was losing both rslave and the backup volume mount.
* Remove persistent DIMSE mode, use manual connection management

  Switch mass transfer from persistent=True to explicit auto_connect=False with manually managed associations:

  - Phase 1 (discovery): one C-FIND association, aborted after use
  - Phase 2 (transfer): one C-GET association, closed in a finally block

  Remove the persistent parameter from DimseConnector and DicomOperator. Keep the bug fixes: generator leak (finally block), dead association detection (is_alive check), compressed transfer syntaxes.
* Address PR review feedback: simplify models, refactor processor, vendor assets

  - Remove the MassTransferFilter model, CRUD views, URLs, templates, and admin. Consolidate on filters_json with a typed get_filters() getter.
  - Remove the PSEUDONYMIZE_WITH_LINKING mode; use pseudonym_salt presence to distinguish linked vs random pseudonymization.
  - Refactor process() into _discover_series, _group_series, and _transfer_grouped_series with a patients -> studies -> series loop.
  - Add partition cleanup on retry (delete folder + DB volumes).
  - Remove the cleanup_on_failure no-op override from MassTransferTask.
  - Use DicomToNiftiConverter instead of a custom subprocess call.
  - Vendor CodeMirror 5.65.18 from static files instead of the CDN.
  - Squash migrations (all deleted; a fresh migration is needed).
  - Remove rslave mount propagation from docker-compose.base.yml.
  - Fix pyproject.toml alphabetical ordering.
* Generate fresh 0001_initial.py migration for mass transfer
* Fix random pseudonym duplication bug, update and add test cases

  Fix _group_series to not generate random pseudonyms (they are assigned per-study in _transfer_grouped_series only, avoiding mismatches).

  Fix broken tests:

  - Patch DicomToNiftiConverter instead of the removed subprocess.run
  - Use anonymization_mode=pseudonymize with a salt instead of the removed pseudonymize_with_linking mode
  - Rewrite the skip-already-done test as a partition cleanup test
  - Set pseudonym_salt="" for the random pseudonym tests

  Add missing tests:

  - _group_series: deterministic, no-anon, and random modes
  - Partition cleanup: folder deletion + volume cleanup
  - get_filters(): valid JSON, empty list, None
* Serve CodeMirror from vendored static files, not the CDN
* Implement pseudonymization UI per medihack's spec

  Replace the AnonymizationMode enum with two fields:

  - pseudonymize: BooleanField (default True, checkbox)
  - pseudonym_salt: CharField (pre-filled with a random salt)

  The salt field is only visible when pseudonymize is checked (Alpine.js). Three behaviors:

  - pseudonymize=True + salt present: deterministic linked pseudonyms
  - pseudonymize=True + salt empty: random pseudonyms (no linking)
  - pseudonymize=False: no pseudonymization

  The help text on the salt field is medihack's exact wording from the PR review.
* Remove duplicate pseudonym_salt field definition that overrode help text
* Clear pseudonym_salt when pseudonymize is unchecked

  Add form clean() logic to set the salt to an empty string when the pseudonymize checkbox is off, avoiding a misleading salt value on jobs with no anonymization. Add form tests for all three pseudonymization modes.
* Replace associations CSV export with full volume CSV export

  Export all 15 volume fields instead of just patient_id/pseudonym for linking-mode mass transfer jobs.

  Co-Authored-By: Claude Opus 4.6 <[email protected]>
* Add missing django-codemirror dependency

  Co-Authored-By: Claude Opus 4.6 <[email protected]>
* Fix mass_transfer form tests by granting the user access to DICOM nodes

  The tests were failing because DicomNodeChoiceField filters its queryset to nodes accessible by the user. Added group creation and access grants following the pattern used in the batch_transfer tests.

  Co-Authored-By: Claude Opus 4.6 <[email protected]>
* Add script to convert CSV filters to mass transfer JSON format

  Co-Authored-By: Claude Opus 4.6 <[email protected]>
* Fix pyright and ruff linting errors

  Co-Authored-By: Claude Opus 4.6 <[email protected]>
* Re-add auto_close flag for persistent DIMSE associations

  Add an auto_close parameter to DimseConnector and DicomOperator. When auto_close=False, the decorator keeps the association open across calls, auto-switches on service type change (C-FIND -> C-GET), and aborts on abandoned generators to prevent stale responses. The mass transfer processor uses auto_close=False instead of manual connection management, reducing ~750 associations per partition to 2-3 and avoiding IMPAX overload.
* Rename DICOM sidecar helpers to metadata and apply formatting

  Rename _SIDECAR_DICOM_TAGS, _extract_dicom_sidecar, and _write_dicom_sidecar to use "metadata" terminology instead of "sidecar" for clarity. Apply Ruff formatting fixes across processors.py and test_processor.py.
  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Explicit association management per medihack's request

  - Remove auto_close and service-switching from the decorator
  - Raise an error if the wrong service type is used on an existing association
  - Discovery: one C-FIND association, explicitly opened and aborted
  - Transfer: one C-GET association per study, explicitly opened and closed
* Move source/destination from job to task

  MassTransferTask now extends TransferTask instead of DicomTask, matching the pattern used by selective_transfer and batch_transfer. Source and destination are removed from MassTransferJob and set per-task in the form.
* Fix tests for the source/destination move and sidecar rename
* Use auto_connect + auto_close=False for association management

  Replace manual open_connection/abort with auto_connect=True (the default) and auto_close=False so the decorator handles connections automatically while keeping them alive across calls within the same phase.
* Make CSV export available for all mass transfer jobs

  Previously restricted to linking-mode (pseudonymize + salt) jobs only.
* Add pseudonym salt to CSV header and job detail page
* Abort association on generator abandonment in connect_to_server

  When a generator-based DIMSE operation (e.g. C-FIND) is abandoned mid-stream by the caller, pending responses remain on the wire. With auto_close=False the finally block no longer cleans this up, so we must catch GeneratorExit and abort the association to prevent corrupting subsequent operations on the same connection.
* Add warning log on DIMSE generator abandonment and fully consume series generator

  Log a warning when a DIMSE generator is abandoned mid-stream to aid debugging of association corruption issues. In _has_institution_match, fully consume the find_series generator via list() before filtering to avoid triggering the abandonment abort path.
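The abandonment-abort pattern from these commits can be sketched as a generator wrapper. `guard_abandonment` and the `association` shape are illustrative (only an `abort()` method is assumed here; in ADIT this is the pynetdicom association kept alive across calls).

```python
import logging

logger = logging.getLogger(__name__)


def guard_abandonment(responses, association):
    """Wrap a DIMSE response generator so abandoning it mid-stream
    aborts the association instead of leaving stale responses queued.

    Without the abort, the next C-FIND on the same connection would
    mix in leftover responses from the abandoned query.
    """
    completed = False
    try:
        yield from responses
        completed = True
    except GeneratorExit:
        # The caller broke out of iteration early (e.g. an early
        # return): pending responses are still on the wire.
        logger.warning("DIMSE generator abandoned mid-stream, aborting association")
        raise
    finally:
        if not completed:
            association.abort()
```

This also shows why `_has_institution_match` consuming its generator via `list()` avoids the abort path entirely: a fully drained generator sets `completed` and the connection stays usable.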
  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Suppress UserWarning for invalid DT VR in frame reference datetime test

  The test intentionally uses a FrameReferenceDateTime value that triggers a pydicom UserWarning about invalid DT VR values. Adding filterwarnings keeps the test output clean without changing test behavior.

  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Refactor _transfer_grouped_series into smaller focused methods

  Extract the deeply nested transfer loop (7-8 levels) into separate methods: a SeriesTransferResult dataclass, _transfer_single_series, _export_and_convert_series, _export_series_to_folder, _zero_image_result, _create_volume_record, and _build_task_summary. Also rename the sidecar helpers to metadata and rename pseudo_study_uid/pseudo_series_uid to study_uid_pseudonymized/series_uid_pseudonymized.

  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Fix flaky test by tracking the sentinel in wado_retrieve

  Avoid cancelling the fetch task after the sentinel has been received, ensuring exceptions from the completed fetch function propagate instead of being swallowed by cancellation.

  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Add missing ON DELETE SET NULL constraint for mass_transfer queued_job FK

  Procrastinate deletes finished jobs via a raw SQL stored procedure (procrastinate_finish_job_v1), which bypasses Django's ORM-level on_delete=SET_NULL handling. This caused a ForeignKeyViolation when the worker tried to finalize completed MassTransferTasks, because the database-level FK constraint on queued_job_id defaulted to NO ACTION. The other apps (batch_transfer, selective_transfer, batch_query) already had a RunSQL migration step calling procrastinate_on_delete_sql() to set ON DELETE SET NULL at the PostgreSQL level. The mass_transfer app was missing this step because the queued_job field was included in the initial CreateModel migration rather than added later via AddField.
  This resulted in an infinite retry loop: each task completed its DICOM work successfully, but Procrastinate could not delete the job row, so the task was re-picked and re-processed indefinitely.

  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Allow a DICOM server as mass transfer destination

  Previously mass transfer only supported folder destinations. This adds support for server destinations by downloading series to a temp directory and uploading via the destination operator. NIfTI conversion is disabled for server destinations both in the form (Alpine.js checkbox disabling) and the backend (the clean method clears the flag).

  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Add dark mode support for CodeMirror in mass transfer form

  CSS overrides scoped to [data-bs-theme="dark"] so the JSON filter editor automatically matches the site-wide theme toggle.

  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Refactor mass transfer processor to create PENDING volumes before transfer

  Create MassTransferVolume records in PENDING state immediately after discovery so they appear in the UI while the task is running. Each volume is updated in place during transfer and saved via a finally block that guarantees persistence even on RetriableDicomError.
  Key changes:

  - Remove the SeriesTransferResult dataclass; transfer methods now update MassTransferVolume fields directly instead of returning result objects
  - Replace _group_series() with _create_pending_volumes() + _group_volumes()
  - Remove _create_volume_record(); volumes are bulk-created upfront
  - Add a persistent parameter to DicomOperator to abstract over DimseConnector.auto_close
  - Move destination setup (dest_operator/output_base) before the try block, with an assert ensuring exactly one is set
  - Add a defensive PENDING check and safe volume.save() in the finally block
  - Replace protocol-specific comments (C-GET/C-FIND/DIMSE) with protocol-agnostic terms (fetch/query)
  - Restyle the task detail template with a Back to Job button and pagination
  - Add a test fixture and _fake_export_success helper to reduce boilerplate

  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Move mass transfer task queueing to a background job

  Defers individual task queueing to a background Procrastinate job on the default queue so that the HTTP view returns immediately instead of blocking on potentially thousands of defer() calls. Adds factories, a dedicated tasks module, and comprehensive tests.

  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Add 24-hour safety timeout for mass transfer task processing

  Replace the unlimited process timeout (None) with a 24-hour cap to prevent runaway tasks from blocking the mass_transfer queue indefinitely.
  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Address PR review findings for mass transfer module

  - Fix task reset routing: override queue_pending_task() on MassTransferTask so single-task resets use the mass_transfer queue instead of the generic dicom queue with the wrong timeout
  - Deduplicate queue logic: queue_mass_transfer_tasks now delegates to task.queue_pending_task() instead of duplicating the defer/save code
  - Add error handling to the task queueing loop with logging and re-raise
  - Narrow bare except clauses in _extract_dicom_metadata and _write_dicom_metadata to specific exception types with logging
  - Move the close() error guard from the processor into DicomOperator.close()
  - Add debug logging to silent "except OSError: pass" blocks
  - Remove the unused cleanup_on_failure hook from DicomTask and all call sites
  - Add a model-level clean() to MassTransferJob for date range validation
  - Add a CheckConstraint ensuring partition_start < partition_end
  - Add MinValueValidator(1) to DicomServer.max_search_results
  - Make FilterSpec and DiscoveredSeries frozen dataclasses
  - Fix a dimse_connector docstring referencing the non-existent auto_config
  - Add a docstring to build_partitions() and expand the filters_json help_text
  - Sync inherited TransferTask field validators in the migration
  - Fix test fixtures to satisfy the new partition constraint
  - Add 10 new tests: filter JSON validation, discover_series filtering, date range rejection, source authorization
  - Delete the outdated mass_transfer_spec.md
  - Document the assertion usage policy in CLAUDE.md
* Use DICOM-style UIDs in mass transfer processor test fixtures

  Replace plain string identifiers with realistic DICOM UID format in test helper calls for consistency with actual DICOM data.
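A minimal sketch of what `build_partitions()` and the `partition_start < partition_end` constraint imply; the signature and half-open-interval convention are assumptions, not ADIT's actual function.

```python
from datetime import datetime, timedelta


def build_partitions(start: datetime, end: datetime, size: timedelta):
    """Split [start, end) into consecutive half-open partitions.

    Every emitted pair satisfies partition_start < partition_end,
    mirroring the model-level CheckConstraint; the last partition is
    truncated to the overall end instead of overshooting it.
    """
    if start >= end:
        raise ValueError("start must be before end")
    partitions = []
    cursor = start
    while cursor < end:
        partitions.append((cursor, min(cursor + size, end)))
        cursor += size
    return partitions
```

Half-open partitions compose cleanly: each boundary instant belongs to exactly one partition, so per-partition tasks never query the same moment twice.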
  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Avoid pebble subprocess DB leaks in queue pending tasks tests

  Replace run_worker_once() with direct queue_mass_transfer_tasks() calls in the queue tests. The worker spawned pebble daemon subprocesses that left PostgreSQL connections open, causing a teardown warning when running the full test suite. The queueing logic is now tested directly, while task processing is already covered by core/tests/test_tasks.py.

  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Simplify mass transfer folder naming helpers

  Remove the unnecessary SHA hash suffix from study folder names, since studies are already nested under patient folders, and include the series description in the fallback when the series number is missing.

  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Replace dicognito-based pseudonym generation with own SHA-256 algorithm

  Decouple pseudonym computation from dicognito's IDAnonymizer to ensure stability across library upgrades. The new standalone compute_pseudonym function uses SHA-256 with divmod extraction into A-Z0-9, matching dicognito's approach but with a stronger hash. Deterministic pseudonyms are 14 chars; random pseudonyms are 15 chars, so the two modes can be distinguished by length.

  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Auto-reconnect DIMSE on service switch and add mass transfer acceptance tests

  Instead of raising an error when switching DICOM services (e.g. C-FIND to C-GET) on a persistent connection, the DimseConnector now transparently closes the old association and opens a new one. This simplifies callers that need to chain different operations. Also adds Playwright acceptance tests covering pseudonymized, unpseudonymized, folder-destination, and NIfTI-conversion mass transfers across all three transfer protocols (C-MOVE, C-GET, DICOMweb).
  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Close stale DB connection in queue_mass_transfer_tasks worker

  The procrastinate worker thread running queue_mass_transfer_tasks opens a Django DB connection (via MassTransferTask.save() in queue_pending_task) that was never closed, since worker threads don't go through Django's request/response lifecycle. This left an idle PostgreSQL session that blocked test database teardown, causing a PytestWarning. Add db.close_old_connections() in a finally block, matching the pattern already used in core/tasks.py _run_dicom_task.

  Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>
* Add scrollbar for long filters and fix cancel FK violation

  - Cap the filters JSON display at 20 lines with overflow scroll
  - Clear queued_job_id before deleting the procrastinate job on cancel
* Fix pyright error: queued_job_id type annotation allows None
* Add min_number_of_series_related_instances filter for mass transfer

  Add a new filter field that allows excluding series with fewer than a specified number of instances. Includes form validation (must be >= 1), processor filtering logic, and comprehensive tests for both.

  Co-Authored-By: Claude Opus 4.6 <[email protected]>
* Add --min-series-instances arg to csv_to_mass_transfer_filters script

  Co-Authored-By: Claude Opus 4.6 <[email protected]>
* Add --delimiter arg to csv_to_mass_transfer_filters script

  Allows parsing semicolon-delimited CSVs (common in European Excel exports) by passing e.g. --delimiter ";". Defaults to comma for backwards compatibility.
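The min-instances filter above is essentially a threshold on the NumberOfSeriesRelatedInstances attribute returned by series-level C-FIND. A sketch, with hypothetical names and plain dicts standing in for query results; keeping series with a missing count is an assumption (the attribute is optional in query responses).

```python
def filter_series_by_min_instances(series_list, min_instances=None):
    """Drop series reporting fewer instances than the threshold.

    Each series dict carries NumberOfSeriesRelatedInstances from the
    C-FIND response; series without a reported count are kept rather
    than silently excluded.
    """
    if min_instances is None:
        return list(series_list)
    if min_instances < 1:
        raise ValueError("min_instances must be >= 1")
    kept = []
    for series in series_list:
        count = series.get("NumberOfSeriesRelatedInstances")
        if count is None or int(count) >= min_instances:
            kept.append(series)
    return kept
```

The `>= 1` guard mirrors the form validation mentioned in the commit, so an impossible threshold fails loudly instead of filtering everything out.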
Co-Authored-By: Claude Opus 4.6 <[email protected]> * Add MASS_TRANSFER_WORKER_REPLICAS to example.env Co-Authored-By: Claude Opus 4.6 <[email protected]> * Remove containers.containers.label from VS Code settings Co-Authored-By: Claude Opus 4.6 <[email protected]> * Merge DICOM metadata into dcm2niix JSON sidecars instead of separate files Instead of writing a separate *_dicom.json file, the extracted DICOM metadata is now merged directly into the dcm2niix-produced JSON sidecar files. dcm2niix-derived values take precedence on key conflicts. Also moves DICOM_METADATA_TAGS to Django settings for configurability. Co-Authored-By: Claude Opus 4.6 <[email protected]> * Move imports to module level in mass transfer processors Co-Authored-By: Claude Opus 4.6 <[email protected]> * Add mass transfer worker to observability config example Co-Authored-By: Claude Opus 4.6 <[email protected]> * Move MASS_TRANSFER_PROCESS_TIMEOUT to settings base Move the mass transfer process timeout constant from mass_transfer/tasks.py to settings/base.py where similar timeout settings already live. * Clarify mass transfer fetch reconciliation retry Move the fetch reconciliation delay to settings/base.py and rewrite the inline comment to frame the retry as cross-phase reconciliation between discovery and transfer, not network retry. The decision to probe once more depends on volume.number_of_images from the discovery phase, which is workflow state the operator/connector layer cannot see. * Pace mass transfer fetches per study instead of per series Move the PACS pacing delay from between consecutive series to between consecutive studies, and rename _DELAY_BETWEEN_SERIES accordingly. Series inside the same study share one association and fetch back-to-back; the pause now aligns with association boundaries where the PACS is most likely to reject or drop requests. 
  Also add a TODO on the fetch reconciliation retry flagging the open question of whether it should live in the operator/connector layer or be handled via a stamina retry on a raised exception.
* Add TODO comment for delay investigation

  Add a TODO comment to investigate the necessity of a delay between studies.

--------

Co-authored-by: Claude Opus 4.6 <[email protected]>
Co-authored-by: Samuel Kwong <[email protected]>
Co-authored-by: Kai Schlamp <[email protected]>
Fix anonymization failing on FrameReferenceDateTime element (#286)

* Fix anonymization failing on FrameReferenceDateTime element

  Add FrameReferenceDateTime to SKIP_ELEMENTS_ANONYMIZATION to prevent dicognito from attempting to parse and shift this datetime element, which fails on certain DICOM files with non-standard datetime formats.

  Co-Authored-By: Claude Opus 4.5 <[email protected]>
* Add tests for Pseudonymizer and FrameReferenceDateTime handling

  Add unit tests for the Pseudonymizer class to verify:

  - Basic pseudonymization sets PatientID and PatientName correctly
  - Empty/None pseudonyms raise appropriate errors
  - FrameReferenceDateTime elements are preserved (not causing parse errors)
  - StudyDate, StudyTime, and AcquisitionDateTime are preserved

  Co-Authored-By: Claude Opus 4.5 <[email protected]>

--------

Co-authored-by: Claude Opus 4.5 <[email protected]>
Fix DimseConnector retry failure by resetting assoc in abort_connection (#283)

## Problem

When abort_connection() was called during error handling, it would abort the DICOM association but fail to reset self.assoc to None. This caused subsequent retry attempts to fail with AssertionError("A former connection was not closed properly.") because open_connection() detected a stale association reference.

## Root Cause

The abort_connection() method (lines 204-207) was missing the critical `self.assoc = None` cleanup that close_connection() correctly performed (line 202). This inconsistency prevented the stamina retry mechanism from working properly: retries would fail immediately at the connection level instead of establishing a new connection.

## Solution

Added `self.assoc = None` to abort_connection() after calling abort(), matching the behavior of close_connection(). This allows retry decorators to detect that no connection exists and successfully open a new connection on retry attempts.

## Testing

- Added a comprehensive test suite in test_dimse_connector.py
- Tests verify that retry behavior works correctly with the fix
- All existing tests continue to pass

Co-authored-by: Claude Sonnet 4.5 <[email protected]>
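The invariant this fix restores — every path that tears down the association must also clear the reference — can be shown in a stripped-down sketch (not ADIT's full DimseConnector; `assoc_factory` stands in for establishing a pynetdicom association):

```python
class DimseConnector:
    """Minimal sketch of the connection-reset invariant."""

    def __init__(self):
        self.assoc = None  # the active association, if any

    def open_connection(self, assoc_factory):
        # Retries rely on this check: a stale reference here means a
        # previous connection was never cleaned up.
        assert self.assoc is None, "A former connection was not closed properly."
        self.assoc = assoc_factory()

    def close_connection(self):
        if self.assoc is not None:
            self.assoc.release()
            self.assoc = None

    def abort_connection(self):
        if self.assoc is not None:
            self.assoc.abort()
            self.assoc = None  # the fix: allow a clean reconnect on retry
```

Without the final `self.assoc = None`, the very next `open_connection()` in a retry attempt trips the assertion, which is exactly the failure mode described in the Problem section.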
Allow direct download of a study from selective transfer query results (#248)

* init single study download demo
* remove dummy code, this branch only shows an alternative frontend design
* testing streaming zip download
* Added error handling
* lint fix
* remove hardcoded server_id
* Add dynamic file path construction
* refactor/fix study time/date/modalities passing
* add permission and acceptance test
* don't display download column if no permission
* rename permission
* added some comments about error handling
* fix lint errors
* fix acceptance test
* ensure ci removes volumes on compose-down
* test works locally, add db commit to fix git ci
* Make sure test db has new migrations
* debug ci pipeline
* lint
* check if table is being rendered
* more debugging
* lint
* More debugging
* debug: check url path
* fixed acceptance test
* narrow down caught exceptions
* refactoring and comments
* lint
* rework acceptance test
* refactor
* use task group for producer/consumer
* lint
* add type hinting
* only start consumer after producer has put first item
* separate barrier from start_consumer_event
* fix merge
* Remove DS_Store
* validate request parameters
* handle study_modalities is None
* single use downloader class
* refactor
* validate path parameters
* type hints for study_params
* validate modalities
* url encode parameter strings
* single download error log, clean up scheduled tasks
* Change executor settings, add clarifying comments
* make the check of first put to queue threadsafe
* Add error handling for filepath construction
* rename safe path default
* move async event setting outside lock
* handle value error from file construction
* small typo fixes
* small parameter changes
* move url encoding from template to consumer
* add acceptance test for pseudo study
* add unit test for early exit from invalid server id
* remove default catch for study modalities
* fix unit test conflicting with acceptance tests
* Add clarifying comments
* add missing comment
--------

Co-authored-by: Kai Schlamp <[email protected]>
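The producer/consumer streaming in this PR (a task group feeding a bounded queue, with a sentinel marking normal completion) can be sketched as follows. All names are hypothetical; the real code streams a zip of DICOM files and additionally delays consumer start until the first item is queued.

```python
import asyncio

_SENTINEL = object()


async def produce(queue, items):
    """Fetch items (e.g. DICOM files) and hand them to the streamer."""
    for item in items:
        await queue.put(item)
    await queue.put(_SENTINEL)  # signal normal completion


async def consume(queue):
    """Drain the queue until the sentinel arrives."""
    chunks = []
    while True:
        item = await queue.get()
        if item is _SENTINEL:
            return chunks
        chunks.append(item)


async def stream_download(items):
    # Bounded queue: the producer blocks when the consumer falls
    # behind, giving natural backpressure on the fetch side.
    queue = asyncio.Queue(maxsize=4)
    producer = asyncio.create_task(produce(queue, items))
    chunks = await consume(queue)
    # Await the producer so any exception it raised propagates here
    # instead of being swallowed — the sentinel-tracking fix in the
    # related wado_retrieve test addresses the same hazard.
    await producer
    return chunks
```

Distinguishing the sentinel from data is what lets the consumer terminate cleanly without cancelling the producer, so producer-side exceptions still surface.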
Add smoke tests of some models and views (#242)

* Add comprehensive test cases for batch query, batch transfer, DICOM explorer, and selective transfer functionalities
* Remove unused task creation in batch query and selective transfer tests
* Refactor test cases in batch query, batch transfer, DICOM explorer, and selective transfer to remove redundant docstrings for improved readability
* Add unit tests for DICOM job and node functionalities (#237)

  Co-authored-by: Kai Schlamp <[email protected]>
* Correct expected HTTP status in test_batch_query_job_delete_view
* Fix some tests
* Remove some flaky dicom explorer tests
* Update adit/batch_query/tests/test_batch_query.py

  Co-authored-by: Copilot <[email protected]>
* Rename some tests and fix small issues

--------

Co-authored-by: mhumzaarain <[email protected]>
Co-authored-by: Muhammad Humza Arain <[email protected]>
Co-authored-by: Copilot <[email protected]>