coola


Overview

coola is a lightweight Python library that makes it easy to compare complex and nested data structures. It provides simple, extensible functions to check equality between objects containing PyTorch tensors, NumPy arrays, pandas/polars DataFrames, and other scientific computing objects.

Why coola?

Python's native equality operator (==) doesn't work well with complex nested structures containing tensors, arrays, or DataFrames. You'll often encounter errors or unexpected behavior. coola solves this with intuitive comparison functions:
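For example, comparing two dictionaries that hold NumPy arrays raises an exception rather than returning a boolean, because dictionary equality truth-tests the elementwise array comparison:

```python
import numpy as np

data1 = {"numpy": np.ones((2, 3))}
data2 = {"numpy": np.ones((2, 3))}

try:
    data1 == data2  # dict equality calls bool() on the elementwise array comparison
except ValueError as exc:
    print(f"ValueError: {exc}")  # "The truth value of an array ... is ambiguous"
```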

Check exact equality:

>>> import numpy as np
>>> import torch
>>> from coola.equality import objects_are_equal
>>> data1 = {"torch": torch.ones(2, 3), "numpy": np.zeros((2, 3))}
>>> data2 = {"torch": torch.ones(2, 3), "numpy": np.zeros((2, 3))}
>>> objects_are_equal(data1, data2)
True

Compare with numerical tolerance:

>>> from coola.equality import objects_are_allclose
>>> data1 = {"value": 1.0}
>>> data2 = {"value": 1.0 + 1e-9}
>>> objects_are_allclose(data1, data2)
True

Debug differences easily:

>>> from coola.equality import objects_are_equal
>>> actual = {"users": [{"id": 1, "score": 95}, {"id": 2, "score": 87}]}
>>> expected = {"users": [{"id": 1, "score": 95}, {"id": 2, "score": 88}]}
>>> objects_are_equal(actual, expected, show_difference=True)
False

Log output

numbers are different:
  actual   : 87
  expected : 88
mappings have different values for key 'score'
sequences have different values at index 1
mappings have different values for key 'users'

See the user guide for detailed examples.

Features

coola provides a comprehensive set of utilities for working with complex data structures:

🔍 Equality Comparison

Compare complex nested objects with support for multiple data types:

  • Exact equality: objects_are_equal() for strict comparison
  • Approximate equality: objects_are_allclose() for numerical tolerance
  • User-friendly difference reporting: Clear, structured output showing exactly what differs
  • Extensible: Add custom comparators for your own types

Learn more →

Supported types: JAX • NumPy • pandas • polars • PyArrow • PyTorch • xarray • Python built-ins (dict, list, tuple, set, etc.)

See all type-specific comparison rules →

📊 Data Summarization

Generate human-readable summaries of nested data structures for debugging and logging:

  • Configurable depth control
  • Type-specific formatting
  • Truncation for large collections

Learn more →
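To illustrate the idea, here is a minimal depth-limited, truncating summarizer. This is a sketch of the concept only, not coola's actual implementation or API:

```python
from typing import Any


def summarize(value: Any, depth: int = 2, max_items: int = 3) -> str:
    """Return a one-line, depth-limited description of a nested structure."""
    if isinstance(value, dict):
        if depth <= 0:  # stop descending; show only the size
            return f"dict(len={len(value)})"
        items = [f"{k}: {summarize(v, depth - 1, max_items)}"
                 for k, v in list(value.items())[:max_items]]
        if len(value) > max_items:  # truncate large collections
            items.append("...")
        return "{" + ", ".join(items) + "}"
    if isinstance(value, (list, tuple)):
        if depth <= 0:
            return f"{type(value).__name__}(len={len(value)})"
        items = [summarize(v, depth - 1, max_items) for v in value[:max_items]]
        if len(value) > max_items:
            items.append("...")
        return "[" + ", ".join(items) + "]"
    return f"{type(value).__name__}({value!r})"  # leaf: type plus repr


data = {"scores": [95, 87, 92, 78], "meta": {"source": "test", "nested": {"deep": 1}}}
print(summarize(data, depth=2))
```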

🔄 Data Conversion

Transform data between different nested structures:

  • Convert between list-of-dicts and dict-of-lists formats
  • Useful for working with tabular data and different data representations

Learn more →
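The core transformation looks like this in plain Python (function names here are illustrative, not coola's API):

```python
def list_of_dicts_to_dict_of_lists(rows):
    """[{'a': 1}, {'a': 3}] -> {'a': [1, 3]} (assumes all rows share the same keys)."""
    if not rows:
        return {}
    return {key: [row[key] for row in rows] for key in rows[0]}


def dict_of_lists_to_list_of_dicts(columns):
    """{'a': [1, 3], 'b': [2, 4]} -> [{'a': 1, 'b': 2}, {'a': 3, 'b': 4}]."""
    return [dict(zip(columns, values)) for values in zip(*columns.values())]


rows = [{"id": 1, "score": 95}, {"id": 2, "score": 87}]
columns = list_of_dicts_to_dict_of_lists(rows)
print(columns)                                  # {'id': [1, 2], 'score': [95, 87]}
print(dict_of_lists_to_list_of_dicts(columns))  # round-trips back to rows
```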

🗂️ Mapping Utilities

Work with nested dictionaries efficiently:

  • Flatten nested dictionaries into flat key-value pairs
  • Extract specific values from complex nested structures
  • Filter dictionary keys based on patterns or criteria

Learn more →
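Flattening, for instance, turns nested keys into dotted paths. A minimal sketch of the idea (not coola's actual function):

```python
def flatten(mapping, parent_key="", sep="."):
    """Flatten a nested dict into flat dotted key/value pairs."""
    flat = {}
    for key, value in mapping.items():
        full_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):  # recurse into nested mappings
            flat.update(flatten(value, full_key, sep))
        else:
            flat[full_key] = value
    return flat


config = {"model": {"lr": 0.01, "layers": [64, 32]}, "seed": 42}
print(flatten(config))  # {'model.lr': 0.01, 'model.layers': [64, 32], 'seed': 42}
```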

🔁 Iteration

Traverse nested data structures systematically:

  • Depth-first search (DFS) traversal for nested containers
  • Breadth-first search (BFS) traversal for level-by-level processing
  • Filter and extract specific types from heterogeneous collections

Learn more →
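The two traversal orders can be sketched in plain Python as follows (illustrative only, not coola's implementation):

```python
from collections import deque


def dfs_leaves(data):
    """Yield leaf values of a nested structure in depth-first order."""
    if isinstance(data, dict):
        for value in data.values():
            yield from dfs_leaves(value)
    elif isinstance(data, (list, tuple)):
        for item in data:
            yield from dfs_leaves(item)
    else:
        yield data


def bfs_leaves(data):
    """Yield leaf values level by level (breadth-first)."""
    queue = deque([data])
    while queue:
        item = queue.popleft()
        if isinstance(item, dict):
            queue.extend(item.values())
        elif isinstance(item, (list, tuple)):
            queue.extend(item)
        else:
            yield item


nested = {"a": [1, {"b": 2}], "c": 3}
print(list(dfs_leaves(nested)))  # [1, 2, 3]
print(list(bfs_leaves(nested)))  # [3, 1, 2] -- shallow leaf 3 comes first
```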

📈 Reduction

Compute statistics on sequences with flexible backends:

  • Calculate min, max, mean, median, quantile, std on numeric sequences
  • Support for multiple backends: native Python, NumPy, PyTorch
  • Consistent API regardless of backend choice

Learn more →
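The consistent-API idea can be sketched with a native-Python backend built on the standard statistics module (class and method names here are hypothetical, not coola's API):

```python
import statistics


class NativeReducer:
    """Native-Python reduction backend exposing a uniform API.

    A NumPy- or PyTorch-backed class could expose the same method
    names, letting callers swap backends without changing their code.
    """

    def min(self, values):
        return min(values)

    def max(self, values):
        return max(values)

    def mean(self, values):
        return statistics.mean(values)

    def median(self, values):
        return statistics.median(values)

    def std(self, values):
        return statistics.stdev(values)


reducer = NativeReducer()
values = [3.0, 1.0, 4.0, 1.0, 5.0]
print(reducer.mean(values))    # 2.8
print(reducer.median(values))  # 3.0
```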

Installation

We highly recommend installing coola in a virtual environment to avoid dependency conflicts.

Using uv (recommended)

uv is a fast Python package installer and resolver:

uv pip install coola

Install with all optional dependencies:

uv pip install coola[all]

Install with specific optional dependencies:

uv pip install coola[numpy,torch]  # with NumPy and PyTorch

Using pip

Alternatively, you can use pip:

pip install coola

Install with all optional dependencies:

pip install coola[all]

Install with specific optional dependencies:

pip install coola[numpy,torch]  # with NumPy and PyTorch

Requirements

  • Python: 3.10 or higher
  • Core dependencies: none (all runtime dependencies are optional)

Optional dependencies (install with coola[all]): JAX • NumPy • pandas • polars • PyArrow • PyTorch • xarray

For detailed installation instructions, compatibility information, and alternative installation methods, see the installation guide.

Compatibility Matrix

coola jax* numpy* packaging* pandas* polars* pyarrow* torch* xarray* python
main >=0.5.0,<1.0 >=1.24,<3.0 >=22.0 >=2.0,<4.0 >=1.0,<2.0 >=11.0 >=2.0,<3.0 >=2024.1 >=3.10
1.1.3 >=0.5.0,<1.0 >=1.24,<3.0 >=22.0 >=2.0,<4.0 >=1.0,<2.0 >=11.0 >=2.0,<3.0 >=2024.1 >=3.10
1.1.2 >=0.5.0,<1.0 >=1.24,<3.0 >=22.0 >=2.0,<4.0 >=1.0,<2.0 >=11.0 >=2.0,<3.0 >=2024.1 >=3.10
1.1.1 >=0.5.0,<1.0 >=1.24,<3.0 >=22.0 >=2.0,<4.0 >=1.0,<2.0 >=11.0 >=2.0,<3.0 >=2024.1 >=3.10
1.1.0 >=0.5.0,<1.0 >=1.24,<3.0 >=22.0 >=2.0,<4.0 >=1.0,<2.0 >=11.0 >=2.0,<3.0 >=2024.1 >=3.10
1.0.1 >=0.5.0,<1.0 >=1.24,<3.0 >=22.0 >=2.0,<4.0 >=1.0,<2.0 >=11.0 >=2.0,<3.0 >=2024.1 >=3.10
1.0.0 >=0.5.0,<1.0 >=1.24,<3.0 >=22.0 >=2.0,<3.0 >=1.0,<2.0 >=11.0,<23.0 >=2.0,<3.0 >=2024.1 >=3.10
0.11.1 >=0.5.0,<1.0 >=1.24,<3.0 >=22.0 >=2.0,<3.0 >=1.0,<2.0 >=11.0,<22.0 >=2.0,<3.0 >=2024.1 >=3.10
0.11.0 >=0.5.0,<1.0 >=1.24,<3.0 >=22.0 >=2.0,<3.0 >=1.0,<2.0 >=11.0,<22.0 >=2.0,<3.0 >=2023.1 >=3.10
0.10.0 >=0.5.0,<1.0 >=1.24,<3.0 >=22.0 >=2.0,<3.0 >=1.0,<2.0 >=11.0,<22.0 >=2.0,<3.0 >=2023.1 >=3.10

* indicates an optional dependency

Older versions
coola jax* numpy* packaging* pandas* polars* pyarrow* torch* xarray* python
0.9.1 >=0.5.0,<1.0 >=1.24,<3.0 >=22.0,<26.0 >=2.0,<3.0 >=1.0,<2.0 >=11.0,<22.0 >=2.0,<3.0 >=2023.1 >=3.10,<3.15
0.9.0 >=0.4.6,<1.0 >=1.24,<3.0 >=22.0,<26.0 >=2.0,<3.0 >=1.0,<2.0 >=11.0,<20.0 >=2.0,<3.0 >=2023.1 >=3.9,<3.14
0.8.7 >=0.4.6,<1.0 >=1.22,<3.0 >=21.0,<26.0 >=1.5,<3.0 >=1.0,<2.0 >=10.0,<20.0 >=1.11,<3.0 >=2023.1 >=3.9,<3.14
0.8.6 >=0.4.6,<1.0 >=1.21,<3.0 >=1.3,<3.0 >=0.18.3,<2.0 >=10.0,<20.0 >=1.11,<3.0 >=2023.1 >=3.9,<3.14
0.8.5 >=0.4.6,<1.0 >=1.21,<3.0 >=1.3,<3.0 >=0.18.3,<2.0 >=10.0,<19.0 >=1.11,<3.0 >=2023.1 >=3.9,<3.14
0.8.4 >=0.4.6,<1.0 >=1.21,<3.0 >=1.3,<3.0 >=0.18.3,<2.0 >=10.0,<18.0 >=1.11,<3.0 >=2023.1 >=3.9,<3.14
0.8.3 >=0.4.1,<1.0 >=1.21,<3.0 >=1.3,<3.0 >=0.18.3,<2.0 >=10.0,<18.0 >=1.11,<3.0 >=2023.1 >=3.9,<3.13
0.8.2 >=0.4.1,<1.0 >=1.21,<3.0 >=1.3,<3.0 >=0.18.3,<2.0 >=10.0,<18.0 >=1.11,<3.0 >=2023.1 >=3.9,<3.13
0.8.1 >=0.4.1,<1.0 >=1.21,<3.0 >=1.3,<3.0 >=0.18.3,<2.0 >=10.0,<18.0 >=1.11,<3.0 >=2023.1 >=3.9,<3.13
0.8.0 >=0.4.1,<1.0 >=1.21,<3.0 >=1.3,<3.0 >=0.18.3,<2.0 >=10.0,<18.0 >=1.11,<3.0 >=2023.1 >=3.9,<3.13
0.7.4 >=0.4.1,<1.0 >=1.21,<3.0 >=1.3,<3.0 >=0.18.3,<2.0 >=10.0,<18.0 >=1.11,<3.0 >=2023.1 >=3.9,<3.13
0.7.3 >=0.4.1,<1.0 >=1.21,<3.0 >=1.3,<3.0 >=0.18.3,<2.0 >=1.11,<3.0 >=2023.1 >=3.9,<3.13
0.7.2 >=0.4.1,<1.0 >=1.21,<3.0 >=1.3,<3.0 >=0.18.3,<2.0 >=1.11,<3.0 >=2023.1 >=3.9,<3.13
0.7.1 >=0.4.1,<1.0 >=1.21,<3.0 >=1.3,<3.0 >=0.18.3,<1.0 >=1.10,<3.0 >=2023.1 >=3.9,<3.13
0.7.0 >=0.4.1,<1.0 >=1.21,<2.0 >=1.3,<3.0 >=0.18.3,<1.0 >=1.10,<3.0 >=2023.1 >=3.9,<3.13
0.6.2 >=0.4.1,<1.0 >=1.21,<2.0 >=1.3,<3.0 >=0.18.3,<1.0 >=1.10,<3.0 >=2023.1 >=3.9,<3.13
0.6.1 >=0.4.1,<1.0 >=1.21,<2.0 >=1.3,<3.0 >=0.18.3,<1.0 >=1.10,<3.0 >=2023.1 >=3.9,<3.13
0.6.0 >=0.4.1,<1.0 >=1.21,<2.0 >=1.3,<3.0 >=0.18.3,<1.0 >=1.10,<3.0 >=2023.1 >=3.9,<3.13
0.5.0 >=0.4.1,<0.5 >=1.21,<1.27 >=1.3,<2.2 >=0.18.3,<1.0 >=1.10,<2.2 >=2023.1,<2023.13 >=3.9,<3.13
0.4.0 >=0.4.1,<0.5 >=1.21,<1.27 >=1.3,<2.2 >=0.18.3,<1.0 >=1.10,<2.2 >=2023.1,<2023.13 >=3.9,<3.13
0.3.1 >=0.4.1,<0.5 >=1.21,<1.27 >=1.3,<2.2 >=0.18.3,<1.0 >=1.10,<2.2 >=2023.1,<2023.13 >=3.9,<3.13
0.3.0 >=0.4.1,<0.5 >=1.21,<1.27 >=1.3,<2.2 >=0.18.3,<1.0 >=1.10,<2.2 >=2023.1,<2023.13 >=3.9,<3.13
0.2.2 >=0.4.1,<0.5 >=1.21,<1.27 >=1.3,<2.2 >=0.18.3,<1.0 >=1.10,<2.2 >=2023.1,<2023.13 >=3.9,<3.13
0.2.1 >=0.4.1,<0.5 >=1.21,<1.27 >=1.3,<2.2 >=0.18.3,<1.0 >=1.10,<2.2 >=2023.1,<2023.13 >=3.9,<3.13
0.2.0 >=0.4.1,<0.5 >=1.21,<1.27 >=1.3,<2.2 >=0.18.3,<1.0 >=1.10,<2.2 >=2023.1,<2023.13 >=3.9,<3.13
0.1.2 >=0.4.1,<0.5 >=1.21,<1.27 >=1.3,<2.2 >=0.18.3,<0.21 >=1.10,<2.2 >=2023.1,<2023.13 >=3.9,<3.13
0.1.1 >=0.4.1,<0.5 >=1.21,<1.27 >=1.3,<2.2 >=0.18.3,<0.20 >=1.10,<2.2 >=2023.1,<2023.13 >=3.9,<3.13
0.1.0 >=0.4.1,<0.5 >=1.21,<1.27 >=1.3,<2.2 >=0.18.3,<0.20 >=1.10,<2.2 >=2023.1,<2023.13 >=3.9,<3.12
0.0.26 >=0.4.1,<0.5 >=1.21,<1.27 >=1.3,<2.2 >=0.18.3,<0.20 >=1.10,<2.2 >=2023.1,<2023.13 >=3.9,<3.12
0.0.25 >=0.4.1,<0.5 >=1.21,<1.27 >=1.3,<2.2 >=0.18.3,<0.20 >=1.10,<2.2 >=2023.4,<2023.11 >=3.9,<3.12
0.0.24 >=0.3,<0.5 >=1.21,<1.27 >=1.3,<2.2 >=0.18.3,<0.20 >=1.10,<2.2 >=2023.3,<2023.9 >=3.9,<3.12
0.0.23 >=0.3,<0.5 >=1.21,<1.27 >=1.3,<2.2 >=0.18.3,<0.20 >=1.10,<2.1 >=2023.3,<2023.9 >=3.9,<3.12
0.0.22 >=0.3,<0.5 >=1.20,<1.26 >=1.3,<2.1 >=0.18.3,<0.19 >=1.10,<2.1 >=2023.3,<2023.9 >=3.9,<3.12
0.0.21 >=0.3,<0.5 >=1.20,<1.26 >=1.3,<2.1 >=0.18.3,<0.19 >=1.10,<2.1 >=2023.3,<2023.8 >=3.9,<3.12
0.0.20 >=0.3,<0.5 >=1.20,<1.26 >=1.3,<2.1 >=0.18.3,<0.19 >=1.10,<2.1 >=2023.3,<2023.8 >=3.9

Contributing

Contributions are welcome! We appreciate bug fixes, feature additions, documentation improvements, and more. Please check the contributing guidelines for details on:

  • Setting up the development environment
  • Code style and testing requirements
  • Submitting pull requests

Whether you're fixing a bug or proposing a new feature, please open an issue first to discuss your changes.

API Stability

⚠️ Important: As coola is under active development, its API is not yet stable and may change between releases. We recommend pinning a specific version in your project’s dependencies to ensure consistent behavior.
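For example, pin an exact release in your project's requirements (the version shown is illustrative):

```shell
pip install coola==1.1.3
```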

License

coola is licensed under the BSD 3-Clause "New" or "Revised" License, which is available in the LICENSE file.
