Inspiration

Presentation slide
⭐️ View the project pitch deck

PlumoSonar was born from the urgent need to bridge vast inequities in lung cancer diagnosis worldwide. In 2022, there were 2.48 million new lung cancer cases and 1.82 million deaths globally, making it the deadliest cancer—12.4 % of new cancers and 18.7 % of cancer deaths (PMC, ScienceDirect).

In West Africa—home to over 420 million people—cancer registries reported 263 000 new cases of all cancers in 2022, yet lung cancer remains under-prioritized despite its high mortality (GCO). In Nigeria alone there were 1 675 new lung cancer diagnoses (1.3 % of all cancers) (GCO).

Rural clinics in Sub-Saharan Africa image only 10–13 % of patients who need it, versus 50 % in urban centers—leaving the majority of those at risk without diagnostic support (PMC). Nearly 47 % of the global population has little to no access to basic diagnostics; in the WHO African Region, only 30 % of health facilities can perform essential lab or imaging tests (WHO Afr).

With 43 million NCD deaths annually (75 % of non-pandemic deaths) and 18 million premature (< 70 yrs) NCD deaths—82 % of which occur in low- and middle-income countries—lung cancer is a critical target for SDG 3.4’s goal of reducing premature NCD mortality by one‑third by 2030 (WHO NCD, WHO SDG).


What It Does

📈 PlumoSonar Architecture Diagram

PlumoSonar combines two training approaches to deliver flexible, high‑accuracy lung cancer screening:

  1. Local Deep Learning
    • A DenseNet121 model trained on 16 000 CT and histopathology images, with containerized, offline-ready inference on the roadmap.
  2. No-Code Azure Custom Vision
    • A parallel model trained on 800 curated lung images in Azure Custom Vision for easy plug‑and‑play deployment.

From a single web UI, health workers can:

  • Log in securely via role-based viewer lists.
  • Upload CT or histopathology images (benign, adenocarcinoma, squamous carcinoma).
  • Receive instant AI predictions and confidence scores.
  • Interact with an embedded chatbot for explanations and guidance, powered by Perplexity's Sonar API.

All model building, tagging, and evaluation occur through Azure’s portal—clinicians need zero programming skills.


SDG 3 Impact (Social Good)

  • Nigeria: Imaging covers only ~12 % of 1 675 annual lung cancers (≈ 201 patients). PlumoSonar could extend early detection to all 1 675, potentially identifying ≈ 1 474 additional cases each year.
  • Global: Early detection of even one‑third of lung cancers could save 600 000+ lives annually, advancing SDG 3.4’s mandate to cut premature NCD deaths by one‑third.
  • Rural Clinics: Empowers facilities imaging < 15 % today to scale up screening.
  • Health Equity: Lowers technical barriers, enabling non‑specialists to drive diagnosis.
  • Collaboration: Role‑based viewer lists ensure secure, auditable access.
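The Nigeria estimate above is straightforward coverage arithmetic:

```python
# Back-of-the-envelope figures from the section above (Nigeria, 2022).
annual_cases = 1675        # new lung cancer diagnoses per year
imaging_coverage = 0.12    # ~12 % of patients currently imaged

currently_reached = round(annual_cases * imaging_coverage)
additional_reached = annual_cases - currently_reached
```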

Uniqueness

  • Dual-Modality Support: Streamlined handling of CT scans and histopathology slides in one platform.
  • Two Training Paths: Expert-tuned local model (16K images) + no-code cloud model (800 images).
  • Embedded AI Agent: Chatbot interprets results and educates users in real time using the Sonar API from Perplexity AI.
  • Secure Access Control: A role-based viewer list replaces exposed API keys.
  • Offline-Ready Roadmap: Containerized inference for low-connectivity clinics.
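The embedded agent can be sketched as a call to Perplexity's chat-completions endpoint. This is a hedged example, assuming an OpenAI-compatible `sonar` model, a `PPLX_API_KEY` environment variable, and illustrative prompts:

```python
import json
import os
import urllib.request

API_URL = "https://api.perplexity.ai/chat/completions"

def build_sonar_request(question, prediction):
    """Assemble a chat-completions payload that grounds the chatbot
    in the model's prediction, e.g. {"label": ..., "confidence": ...}."""
    system = ("You are a clinical assistant. Explain lung cancer "
              "screening results in plain language for health workers.")
    user = (f"The model predicted {prediction['label']} with "
            f"{prediction['confidence']:.0%} confidence. {question}")
    return {
        "model": "sonar",
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
    }

def ask_sonar(question, prediction):
    payload = build_sonar_request(question, prediction)
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['PPLX_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Injecting the prediction into the user turn keeps the chatbot's explanations anchored to the result on screen.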

How We Built It

  • Data: 16K images (CT & histopathology) from Kaggle; 800 images in Azure Custom Vision.
  • Model: DenseNet121 backbone for local inference; Azure Custom Vision for no-code cloud inference.
  • Platform: Flask backend + Perplexity Sonar-powered chatbot + Azure Custom Vision.

Challenges

  • Dataset Imbalance: Fewer squamous carcinoma samples required augmentation.
  • Connectivity: Intermittent internet can slow portal-based workflows.
  • UX Onboarding: Non-technical users need clear in-app guidance.
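Alongside augmentation, a common fix for the imbalance is inverse-frequency loss weighting so the under-represented squamous class counts for more. The per-class counts below are hypothetical:

```python
from collections import Counter

# Hypothetical per-class image counts illustrating the imbalance
# (squamous carcinoma under-represented).
counts = Counter(benign=7000, adenocarcinoma=6000, squamous_carcinoma=3000)

total = sum(counts.values())
n_classes = len(counts)
# Inverse-frequency weights: rarer classes get a larger loss weight.
weights = {label: total / (n_classes * n) for label, n in counts.items()}
```

These weights can be passed to a weighted cross-entropy loss during training.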

Achievements

  • Prototype in < 48 hrs.
  • > 90 % tagging efficiency (avg. < 5 s/image).
  • Pilot in 10 Nigerian rural clinics with zero-code training.

Next Steps

  • Expand the Azure Custom Vision dataset from 800 to > 10 000 images for robustness.
  • Build a mobile front-end to abstract Azure portal UI.
  • Add small-cell carcinoma classification.
  • Develop containerized offline inference for low-connectivity sites.
  • Scale to other high‑burden regions (South Asia, Latin America).

PlumoSonar empowers underserved communities with instant, no-code lung cancer screening—bridging gaps, advancing SDG 3.4, and pioneering scalable AI in global health.


Built With

  • custom-vision-ai
  • flask
  • sonar-api