InitSpark

InitSpark is a lightweight, coroutine-based startup orchestration library for Kotlin Multiplatform (KMP). It provides a structured way to declare, sequence, and execute initialization tasks (called sparks) during your app's startup phase, natively across JVM, Android, and iOS.

Installation

InitSpark is published to Maven Central.

Add the dependency to your build.gradle.kts:

sourceSets {
    commonMain.dependencies {
        implementation("io.github.ktomek:initspark:1.0.0") // Replace with latest version
    }
}

For JVM or Android-only projects:

dependencies {
    implementation("io.github.ktomek:initspark:1.0.0")
}

Features

  • 🔥 Declarative DSL to define sparks
  • ⏱️ Time tracking for individual sparks and phases
  • ⚙️ Three execution modes: await, async, and spark
  • 🌲 Dependency management between sparks (with cycle detection)
  • ⚠️ Spark importance levels: CRITICAL (fail-fast) and OPTIONAL (failure-tolerant)
  • 🔁 Configurable retry policies with None, Fixed, and Exponential backoff
  • 📑 Reactive SparkEvent stream for lifecycle monitoring
  • 🔑 Flexible Key interface for spark identification
  • 🧪 Built-in testing support
  • 📊 Performance metrics via SparkTimingInfo


Getting Started

1. Implement Spark

class DatabaseSpark @Inject constructor(...) : Spark {
    override suspend fun execute() { /* initialize database */ }
}

2. Build a configuration

val sparks = setOf(
    DatabaseSpark(),
    NotificationSpark(),
    AnalyticsSpark(),
    /* ... */
)

val config = buildSparks(sparks) {
    // Sequential, must complete before next spark starts
    await { System.loadLibrary("crypto-lib") }
    await<LoggerSpark>()
    await<ActivityLifecycleSpark>()

    val ioContext = Dispatchers.IO
    val coreDeps = setOf(Key("Database"))

    // Parallel, completion is tracked
    async<DatabaseSpark>(
        key = "Database".asKey(),
        context = ioContext
    )
    async<NotificationSpark>(
        context = ioContext,
        needs = coreDeps,
        policy = SparkPolicy(importance = SparkImportance.OPTIONAL)
    )
    async<AnalyticsSpark>(
        context = ioContext,
        needs = coreDeps,
        policy = SparkPolicy(
            retry = RetryPolicy(
                retryCount = 3,
                backoff = Backoff.Exponential(initialDelayMillis = 200)
            )
        )
    )

    // Parallel, fire-and-forget (not tracked)
    spark<ConsentManagerSpark>(context = ioContext, needs = coreDeps)
}

3. Run

val initSpark = InitSpark(config, CoroutineScope(Dispatchers.Default))

// Suspending version (preferred)
initSpark.initialize()

// Blocking version (for Java interop or legacy code)
initSpark.initializeBlocking()

Spark Types

| Builder function | Execution | Tracked | Default key |
| --- | --- | --- | --- |
| await { } / await<T>() | Sequential | ✅ | Class simple name |
| async { } / async<T>() | Parallel | ✅ | Class simple name |
| spark { } / spark<T>() | Parallel | ❌ | Class simple name |

Each builder function accepts:

| Parameter | Type | Description |
| --- | --- | --- |
| key | Key? | Optional unique identifier (defaults to class name) |
| needs | Set<Key> | Keys of sparks that must complete first |
| context | CoroutineContext | Coroutine dispatcher |
| policy | SparkPolicy | Importance and retry configuration |

Spark Keys

Key is an interface, letting you use any type with proper equality, such as a data object, an enum entry, or a plain string:

// String-backed key (default)
"Database".asKey()           // or Key("Database")

// Custom key type (recommended for robustness)
data object DatabaseKey : Key
enum class AppKey : Key { DATABASE, ANALYTICS }
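Because dependency resolution compares keys by equality, any type with structural equality works. Below is a minimal self-contained model of that contract (StringKey and the Key interface here are illustrative stand-ins, not the library's internals):

```kotlin
// Minimal model of the Key contract. Only the idea comes from InitSpark;
// these declarations are illustrative.
interface Key
data class StringKey(val name: String) : Key
data object DatabaseKey : Key

fun main() {
    val completed: Set<Key> = setOf(StringKey("Database"), DatabaseKey)
    // Two independently created string-backed keys are equal, so a
    // `needs` lookup by key succeeds:
    check(StringKey("Database") in completed)
    // A data object is a singleton, so it is trivially equal to itself:
    check(DatabaseKey in completed)
    println("dependencies satisfied")
}
```

Custom key types are recommended over raw strings because the compiler catches a mistyped DatabaseKey, while a mistyped "Databse" string only fails at runtime.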

Importance Levels

Control how failures propagate using SparkPolicy:

// CRITICAL (default): failure throws and halts initialization
async<DatabaseSpark>(policy = SparkPolicy(importance = SparkImportance.CRITICAL))

// OPTIONAL: failure is logged and emitted as a SparkEvent.Failed, but other sparks continue
async<AnalyticsSpark>(policy = SparkPolicy(importance = SparkImportance.OPTIONAL))

Retry Policies

Attach a RetryPolicy to automatically retry failing sparks:

val policy = SparkPolicy(
    retry = RetryPolicy(
        retryCount = 3,
        backoff = Backoff.Exponential(initialDelayMillis = 100L, factor = 2.0)
    )
)

Backoff strategies

| Strategy | Description |
| --- | --- |
| Backoff.None | No delay between retries (default) |
| Backoff.Fixed(delayMillis) | Constant delay |
| Backoff.Exponential(initialDelayMillis, factor) | Delay × factor on each attempt |
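As a rough sketch of the delay schedule each strategy produces under the semantics described above (a standalone illustration, not InitSpark's internal code):

```kotlin
import kotlin.math.pow

// Standalone illustration of the backoff schedules described above
// (not the library's internal implementation).
sealed interface BackoffSketch {
    data object None : BackoffSketch
    data class Fixed(val delayMillis: Long) : BackoffSketch
    data class Exponential(val initialDelayMillis: Long, val factor: Double = 2.0) : BackoffSketch
}

// Delay before retry attempt `attempt` (1-based).
fun delayFor(backoff: BackoffSketch, attempt: Int): Long = when (backoff) {
    BackoffSketch.None -> 0L
    is BackoffSketch.Fixed -> backoff.delayMillis
    is BackoffSketch.Exponential ->
        (backoff.initialDelayMillis * backoff.factor.pow(attempt - 1)).toLong()
}

fun main() {
    val exp = BackoffSketch.Exponential(initialDelayMillis = 200, factor = 2.0)
    println((1..3).map { delayFor(exp, it) }) // [200, 400, 800]
}
```

Under this reading, the configuration example above (retryCount = 3, initialDelayMillis = 200) would wait roughly 200 ms, 400 ms, and 800 ms before its three retries.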

Observing Lifecycle Events

Use the events flow to receive real-time lifecycle updates from the orchestrator:

launch {
    initSpark.events.collect { event ->
        when (event) {
            is SparkEvent.Started   -> log("▶ ${event.key} started")
            is SparkEvent.Completed -> log("✅ ${event.key} done in ${event.duration}")
            is SparkEvent.Failed    -> log("❌ ${event.key} failed: ${event.error}")
            is SparkEvent.Retry     -> log("🔁 ${event.key} retry #${event.retryCount}")
        }
    }
}

Waiting for Initialization

// Suspend until all TRACKABLE sparks are done
initSpark.waitUntilTrackableInitialized()

// Suspend until ALL sparks (including fire-and-forget) are done
initSpark.waitUntilInitialized()

// Or observe via StateFlow
initSpark.isTrackableInitialized.collect { ready -> if (ready) onReady() }
initSpark.isInitialized.collect { ready -> if (ready) onFullyReady() }

Timing API

Access detailed performance metrics after initialization:

initSpark.waitUntilInitialized()

with(initSpark.timing) {
    // Per-spark durations
    allDurations().forEach { (declaration, duration) ->
        Timber.d("Spark '${declaration.key}' [${declaration.type}] took $duration")
    }

    // Cumulative total (sum of all individual durations)
    Timber.d("Sum of all durations: ${sumOfDurations()}")
    Timber.d("Sum by type: ${sumOfDurationsByType()}")

    // Wall-clock window (first start → last finish)
    Timber.d("Total wall-clock time: ${executionDelta()}")
    Timber.d("Wall-clock by type: ${executionDeltaByType()}")
}

Timing methods

| Method | Returns |
| --- | --- |
| durationOf(declaration) | Duration for one spark, or null |
| allDurations() | Map<SparkDeclaration, Duration> |
| sumOfDurations() | Cumulative sum of all measured durations |
| sumOfDurationsByType() | Cumulative sum grouped by SparkType |
| executionDelta() | Wall-clock window (first start → last stop) |
| executionDeltaByType() | Wall-clock window grouped by SparkType |

Contributing

Contributions are welcome!
Please review our CONTRIBUTING.md for details on code style, testing, and how to submit pull requests.

License

This project is licensed under the MIT License.


MIT License

Copyright (c) 2023 ktomek

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
