Tags: simplejson/simplejson
Add Python 2.7 wheel builds for Windows platforms (#378)

* Build the pure-Python fallback wheel as universal (py2.py3-none-any)

Fixes #377. Pre-4.0 simplejson installed cleanly from PyPI on offline Python 2.7 builds via a wheelhouse populated with `pip download`. 4.0+ ships a `py3-none-any.whl` alongside the sdist, so Python 2.7 wheelhouses contain only the sdist for the pure-Python path. Modern pip then runs a PEP 517 isolated build on that sdist and requires setuptools>=42 from the wheelhouse, which offline users usually do not seed.

Adds `--universal` to the `bdist_wheel` step in the build_sdist job so the wheel is tagged `py2.py3-none-any` and is usable on both interpreters. The sdist install path and the separate C-extension wheels produced by cibuildwheel are unchanged, and pyproject.toml / the test_pep517_build job stay in place for modern ecosystem tooling.

https://claude.ai/code/session_01Sc7XDDu56uaU1xZEjJR1GY

* Build cp27 Windows AMD64 wheels in build_wheels_py27

Expands build_wheels_py27 into a matrix that covers both Linux x86_64 (unchanged) and Windows AMD64. Published cp27 wheels previously covered only manylinux1 / manylinux2010 x86_64, so Py2.7-on-Windows users had no matching binary wheel on PyPI, pip fell through to the sdist, and the PEP 517 isolated build failed under --no-index on wheelhouses without setuptools>=42. This was the scenario in #377.

Ships alongside the universal py2.py3-none-any fallback wheel from the previous commit, which still catches platforms this matrix does not cover (macOS, aarch64 Linux, ppc64le, etc.).

Matrix notes:
- cibuildwheel v1 looks at the arch env var for the runner platform it actually runs on and ignores the others, so setting both CIBW_ARCHS_LINUX and CIBW_ARCHS_WINDOWS is safe.
- Artifact names are now wheels-py27-<os>-<arch> so the Linux and Windows uploads don't collide under upload-artifact@v7's unique-name requirement.
  upload_pypi / upload_pypi_test use download-artifact@v8 with merge-multiple: true, so the rename is transparent to release uploads.
- Adds build_wheels_py27 to gate_windows.needs so Windows branch protection catches Py2.7 Windows build failures.

Per AGENTS.md, Py2 builds cannot be reproduced locally - if Windows cp27 fails under cibuildwheel v1.12.0 on windows-latest (VC++ 9.0 toolchain surface), fail-fast: false keeps the Linux wheel green while we iterate.

https://claude.ai/code/session_01Sc7XDDu56uaU1xZEjJR1GY

* Also build cp27 Windows x86 (32-bit) wheels

Adds a third entry to the build_wheels_py27 matrix covering windows-latest/x86, so 32-bit Py2.7 interpreters on Windows (still common in legacy corporate installs, where the default Py2.7 MSI was 32-bit for years) also get a pre-built wheel on PyPI.

Switches the arch env from the platform-specific CIBW_ARCHS_{LINUX,WINDOWS} pair to the platform-agnostic CIBW_ARCHS, matching the pattern in build_wheels. Drops fail-fast: false - branch protection won't merge with a red matrix entry anyway, so letting siblings cancel early on a failure is the more useful default.

https://claude.ai/code/session_01Sc7XDDu56uaU1xZEjJR1GY

* Revert universal py2.py3-none-any wheel from build_sdist

Reverts the --universal flag added in 8fc52d0. Retagging the fallback pure-Python wheel to py2.py3-none-any means pip would pick it up on any Py2.7 platform without a matching cp27 binary wheel (macOS, aarch64/ppc64le Linux). Those users currently build from the sdist and get the C extension; under the universal wheel they would silently get the pure-Python implementation instead - a measurable perf regression with no user-visible signal.

The cp27 Windows AMD64 + x86 wheels added in ae97829 and 61a04d5 cover the original #377 reporter (Py2.7 on Windows).
Py2.7 users on macOS or non-x86_64 Linux fall back to the sdist as they did before, which is loud-failure behavior (online: builds fine, gets speedups; offline: clear error) rather than a silent slowdown.

https://claude.ai/code/session_01Sc7XDDu56uaU1xZEjJR1GY

* Pin cibuildwheel to v1.11.1 for the Py2.7 wheel job

v1.12.0 removed cp27 from WINDOWS_PYTHONS, so the Windows matrix entries added in ae97829 / 61a04d5 fail at identifier selection ("cibuildwheel: No build identifiers selected: BuildSelector('cp27-*' - 'pp*')") before the build phase even starts. v1.11.1 is the last release that still carries cp27 identifiers for both Linux (manylinux1 Docker image) and Windows (NuGet python2 package), so one version pin serves all three matrix entries (linux/x86_64, windows/AMD64, windows/x86).

Also updates the AGENTS.md cibuildwheel gotchas note, which was pinning readers to the broken v1.12.0, so the next person touching this job learns the v1.11.1 reasoning from the doc rather than rediscovering it from CI.

https://claude.ai/code/session_01Sc7XDDu56uaU1xZEjJR1GY

* Enable cp27 Windows wheels via DISTUTILS_USE_SDK + msvc-dev-cmd

My earlier v1.11.1 pin (85aba3c) did not fix "No build identifiers selected" on Windows, which prompted a closer reading of the cibuildwheel changelog: v1.11.0 did not *remove* cp27 from Windows, it gated it behind a custom-compiler flag. cibuildwheel/windows.py filters cp27 out of WINDOWS_PYTHONS during identifier selection unless both DISTUTILS_USE_SDK and MSSdk are present in the cibuildwheel process environment; Microsoft pulled the VC 9.0 ("Visual C++ Compiler for Python 2.7") download years ago, so cibuildwheel won't pretend it has a usable compiler by default. v2.0.0 is the release that dropped cp27 entirely, not v1.12.

So the fix is not a version pin, it is environment:
- Restore pypa/[email protected] (the latest v1 release).
- Set DISTUTILS_USE_SDK=1 and MSSdk=1 at the step scope so cibuildwheel itself sees them during identifier selection, and mirror them via CIBW_ENVIRONMENT_WINDOWS so the per-wheel build subprocess forwards them to distutils.
- Add an ilammy/msvc-dev-cmd@v1 activation step on Windows runners with arch matching the wheel (x64 for AMD64, x86 for x86), so distutils finds an MSVC toolchain via INCLUDE/LIB/PATH in place of the missing VC 9.0 installer.

Also updates AGENTS.md to document the real gate (env vars, not version number) and adds a "do not chase a v1.11.x pin" warning so the next person looking at a "No build identifiers selected" failure reaches the right fix instead of the one I tried first.

https://claude.ai/code/session_01Sc7XDDu56uaU1xZEjJR1GY

* Scope cp27 Windows env vars to Windows runners only

The DISTUTILS_USE_SDK / MSSdk / CIBW_ENVIRONMENT_WINDOWS triple added in e494f4f was set unconditionally across all three build_wheels_py27 matrix entries. That's functionally a no-op on the Linux cp27 entry (distutils on POSIX doesn't consult those vars, and cibuildwheel targeting Linux ignores CIBW_ENVIRONMENT_WINDOWS), but it reads as if those env vars matter everywhere. Guard them with `runner.os == 'Windows'` so the Linux cp27 entry sees an empty env and the Windows-only intent is explicit at the call site.

build_wheels (the modern Py3 job) is a separate job that doesn't touch any of this - its [email protected] step has no DISTUTILS_USE_SDK / MSSdk / msvc-dev-cmd wiring and isn't affected.

https://claude.ai/code/session_01Sc7XDDu56uaU1xZEjJR1GY

* Release 4.1.1

Bumps VERSION in setup.py, conf.py, and simplejson/__init__.py to 4.1.1 and stamps the CHANGES.txt entry with today's date.

The 4.1.1 release closes #377: offline / --no-index installs on Py2.7-on-Windows were failing at the PEP 517 isolated-build step because no cp27 win_amd64 / win32 wheel existed on PyPI and the sdist fallback required setuptools>=42 in the wheelhouse.
build_wheels_py27 now builds those wheels directly via cibuildwheel v1.12.0 plus the DISTUTILS_USE_SDK / MSSdk gate and ilammy/msvc-dev-cmd.

https://claude.ai/code/session_01Sc7XDDu56uaU1xZEjJR1GY

* Restrict push trigger to main and release tags

Dedupes CI: feature-branch pushes with an open PR were firing both the push and pull_request events on the same SHA, doubling the workflow runs (and the cibuildwheel minute burn). Scoping push to branches: [main] + tags: [v*, test-v*] leaves pull_request as the single trigger for development branches, while preserving the existing release flow - upload_pypi / upload_pypi_test gate on startsWith(github.event.ref, 'refs/tags/v') / 'refs/tags/test-v', and those still fire under the new push config. merge_group keeps the merge-queue gate jobs triggerable.

Also drops the commented-out release-trigger example that was sitting above the old on: line; if we ever want that, the git history has it.

https://claude.ai/code/session_01Sc7XDDu56uaU1xZEjJR1GY

---------

Co-authored-by: Claude <[email protected]>
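In pure-Python terms, the identifier gate this PR works around amounts to something like the sketch below. The function name and shape are hypothetical; the real logic lives in cibuildwheel v1's windows.py and differs in detail, but this is the behavior the commits describe: without both env vars, cp27 identifiers vanish before the build phase, surfacing as "No build identifiers selected".

```python
# Hedged sketch of cibuildwheel v1's cp27-on-Windows gate (hypothetical
# helper name; see cibuildwheel/windows.py for the real implementation).
def select_windows_identifiers(identifiers, env):
    """Drop cp27 identifiers unless a custom MSVC setup is declared."""
    has_custom_compiler = bool(env.get("DISTUTILS_USE_SDK")) and bool(env.get("MSSdk"))
    if has_custom_compiler:
        return list(identifiers)
    return [ident for ident in identifiers if not ident.startswith("cp27-")]


wanted = ["cp27-win_amd64", "cp27-win32", "cp35-win_amd64"]

# Without the env vars, cp27 disappears at identifier selection.
print(select_windows_identifiers(wanted, {}))
# With both vars set (as the workflow now does), cp27 survives.
print(select_windows_identifiers(wanted, {"DISTUTILS_USE_SDK": "1", "MSSdk": "1"}))
```

This is why setting the vars at step scope (so the cibuildwheel process itself sees them) matters, separately from mirroring them into the per-wheel build via CIBW_ENVIRONMENT_WINDOWS.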
Accelerate indented encoding in the C extension; release 4.1.0 (#376)

* Accelerate indented encoding in the C extension; release 4.1.0

The encoder's C fast path previously fell back to the pure-Python _make_iterencode whenever indent= was not None. This change teaches encoder_listencode_list and encoder_listencode_dict to emit the newline-plus-indent prefix themselves: a shared encoder_build_indent_string helper builds '\n' + indent * level, and each container prepends the indent string before the first item, appends item_separator + indent between items, and emits '\n' + indent * (level-1) before the closing bracket. The slow-path branch in the list encoder defers the opening '[' until at least one item is produced, so an empty iterable_as_array iterator still serialises as '[]' instead of '[\n \n]'. encoder.py drops the indent is None gate so c_make_encoder is used for both indented and compact output.

Benchmarks on a 1000-element nested dict show roughly a 4-5x end-to-end speedup versus the pure-Python path, with byte-for-byte identical output across the existing indent tests (indent=0, indent=2, indent='\t', nested containers, empty containers).

PEP 678 exc.add_note() annotations remain a Python-encoder-only feature. The three test_errors.py add_note tests used `indent=2` as a workaround to force the Python encoder; now that indent runs through C, they explicitly flip simplejson._toggle_speedups(False) around the assertions instead.

While touching the encoder, this also fixes a pre-existing -Wdeclaration-after-statement violation in py_encode_basestring (PyObject *rval = ...) so the whole file rebuilds under the strict CFLAGS recipe in AGENTS.md.

Bumps VERSION in setup.py, conf.py, and simplejson/__init__.py to 4.1.0 and adds a CHANGES.txt entry.
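The indent scheme described above can be sketched in pure Python for a flat list; the helper name below is illustrative and the real C code also handles dicts, nesting, and the deferred-'[' iterator path. The point is the three pieces: '\n' + indent * level before the first item and between items, and '\n' + indent * (level-1) before the closing bracket.

```python
import json

# Hedged sketch of the newline-plus-indent emission scheme for a flat
# list only (hypothetical helper; not simplejson's actual code).
def encode_list_indented(items, indent, level=1):
    encode = json.dumps  # leaf encoding delegated to the stdlib here
    if not items:
        return "[]"  # empty containers never get the newline-indent prefix
    prefix = "\n" + indent * level          # before first item / between items
    closing = "\n" + indent * (level - 1)   # before the closing bracket
    body = ("," + prefix).join(encode(item) for item in items)
    return "[" + prefix + body + closing + "]"


# Matches the stdlib's indented output for this simple case.
print(encode_list_indented([1, 2, 3], "  "))
```

Note how the empty-container early return is what keeps '[]' from becoming '[\n \n]', mirroring the slow-path fix described above.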
https://claude.ai/code/session_01AUNR5ovNyLAfxZ1uw7ymBY

* Emit PEP 678 add_note annotations from the C encoder

The pure-Python encoder wraps each recursive encode step with a `try/except BaseException as exc: exc.add_note(...); raise`, producing a chain of "when serializing <container> item <key>" breadcrumbs on any failure. The C encoder previously propagated errors without annotating, which is why the three add_note tests in test_errors.py used `indent=2` as a workaround to route through Python — and why those tests broke once indent acceleration removed that workaround.

This change adds three callsites that mirror the Python encoder's except blocks:
- `encoder_listencode_list` (fast and slow paths): on `encoder_listencode_obj` failure, emit `"when serializing %s item %zd"` with the container's tp_name and the running index.
- `encoder_listencode_dict` (fast and slow paths): on `encoder_listencode_obj` failure for a value, emit `"when serializing %s item %R"` — `%R` uses PyUnicode_FromFormat's repr formatter so string keys render as `'test'` and non-str keys use their `__repr__`, matching Python's `%r`.
- `encoder_listencode_default`: annotate both the default() raising path (with Py_TYPE(obj)->tp_name, i.e. the original object) and the encode-the-default-result path (Py_TYPE(newobj)->tp_name). The split mirrors the Python encoder, where `o = _default(o)` rebinds `o` before `type(o).__name__` is read.

The helper `encoder_annotate_exception` does PyErr_Fetch / PyErr_NormalizeException / PyObject_CallMethodObjArgs(add_note) / PyErr_Restore. Allocation failures on the note itself (e.g. a dict key whose __repr__ raises) are swallowed to avoid clobbering the in-flight exception. The whole feature is gated behind `#if PY_VERSION_HEX >= 0x030B0000` since PEP 678 landed in 3.11. State gains a cached `JSON_attr_add_note` so the method name doesn't re-intern on every error, paralleling the existing JSON_attr_for_json / JSON_attr_asdict caches.
test_errors.py: the three add_note tests no longer need `indent=2` or `_toggle_speedups(False)` — they just call `json.dumps(x)` and assert on the resulting `exc.__notes__`. The _cibw_runner re-runs them with speedups disabled, so both paths are covered without test-level toggling.

Verified by running the full suite with `REQUIRE_SPEEDUPS=1` under the strict CFLAGS recipe from AGENTS.md (including -Wdeclaration-after-statement), and by spot-checking the nested `{'a': [1, object(), 3]}` chain — notes match byte-for-byte across the C and Python paths, with and without indent.

https://claude.ai/code/session_01AUNR5ovNyLAfxZ1uw7ymBY

* Fix PyObject_Repr assertion in add_note helper on debug Python

Tests on Python 3.14 standard debug and free-threaded debug core-dumped on the PR branch with:

    PyObject *PyObject_Repr(PyObject *): Assertion `!_PyErr_Occurred(tstate)' failed.
    Aborted (core dumped)
    .venv/bin/python -m simplejson.tests._cibw_runner

The callsites in encoder_listencode_dict built the note with PyUnicode_FromFormat("...%R", key) BEFORE calling encoder_annotate_exception. The %R formatter routes into PyObject_Repr, which on debug builds asserts that no exception is currently set — but we were inside the bail path of a failing encoder_listencode_obj call, so the in-flight exception tripped the assert and SIGABRT followed. Release builds and 3.11 non-debug locally did not surface it because the assert is compiled out.

Fix: pull the note construction inside the helper and have it run PyErr_Fetch + PyErr_NormalizeException first, so PyUnicode_FromFormatV executes against a clean exception state. The original exception is restored afterwards. The helper signature changes to (_speedups_state *state, const char *format, ...) and all five callsites (list fast/slow, dict fast/slow, default x2) pass the format string and args directly instead of pre-building a note object.
Verified locally under strict CFLAGS (-Wdeclaration-after-statement clean) that the full suite still passes and that the add_note chains for the three pre-existing tests and the indented {'a': [1, object(), 3]} case match byte-for-byte.

https://claude.ai/code/session_01AUNR5ovNyLAfxZ1uw7ymBY

* Write indent prefix piece-by-piece instead of building a combined string

encoder_build_indent_string previously allocated three transient PyObjects per container (PyUnicode_FromStringAndSize + PySequence_Repeat + PyNumber_Add) just to push one combined string to the accumulator, plus another PyNumber_Add for the separator_with_indent combination. The accumulator is already designed to take many pieces — on 3.14+ it is a PyUnicodeWriter that appends in-place at near-zero per-write cost, and on older Pythons it is a list append.

Replaces build_indent_string with encoder_accumulate_newline_indent, which writes '\n' + (indent * level) directly via N+1 JSON_Accu_Accumulate calls using a cached state->JSON_newline constant and the user's s->indent. No intermediate PyObject gets materialised. The encoder_listencode_list and encoder_listencode_dict bodies drop their newline_indent / separator_with_indent locals and their bail-path Py_XDECREFs, emitting item_separator and the newline-indent sequence separately at each inter-item boundary.

The benchmark on a 1000-element nested dict shows the speedup holds: ~4x over the Python encoder (min of 5 samples), indistinguishable from the pre-built version. On 3.14+ the win should be larger since the writer skips allocation entirely.

Full suite passes under the strict CFLAGS recipe from AGENTS.md (-Wdeclaration-after-statement), and the indent=0 / empty-container / iter-empty edge cases match the Python output byte-for-byte.

https://claude.ai/code/session_01AUNR5ovNyLAfxZ1uw7ymBY

---------

Co-authored-by: Claude <[email protected]>
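The piece-by-piece accumulation above has a simple invariant worth stating: pushing '\n' plus N copies of the indent unit as separate pieces joins to exactly the same string as pre-building '\n' + indent * level. A minimal Python sketch (the list here stands in for the C accumulator; the helper name mirrors the commit but is illustrative):

```python
# Sketch of encoder_accumulate_newline_indent's N+1 separate writes,
# using a plain list as the accumulator (the C code uses JSON_Accu /
# PyUnicodeWriter; this only demonstrates the output equivalence).
def accumulate_newline_indent(accumulator, indent, level):
    accumulator.append("\n")      # cached newline constant in the C code
    for _ in range(level):        # level copies of the user's indent unit
        accumulator.append(indent)


pieces = []
accumulate_newline_indent(pieces, "  ", 3)
assert "".join(pieces) == "\n" + "  " * 3  # same bytes, no combined object
print(pieces)
```

The win is that nothing corresponding to the joined string is ever materialised as its own object; each piece goes straight to the writer.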
Exclude Pyodide wheels from PyPI uploads (#375)

* Skip Pyodide wheels when uploading to PyPI

PyPI rejects wasm wheels with "unsupported platform tag 'pyodide_2024_0_wasm32'". Strip them from the dist/ directory before publishing so the upload doesn't fail. The wheels are still built and preserved as workflow artifacts so users can grab them from the run.

* Release 4.0.1

Bumps simplejson to 4.0.1 with a CHANGES.txt entry covering the PyPI upload fix that skips Pyodide/wasm wheels (PR #375).

* Update Sphinx conf.py version to 4.0.1

conf.py was stale at 3.20.2 and got missed during the 4.0.0 bump.

---------

Co-authored-by: Claude <[email protected]>
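The filtering step amounts to a platform-tag test on the wheel filename. A minimal sketch, assuming the workflow simply deletes matching files from dist/ before publishing (the helper name and exact match rule here are illustrative, not the workflow's literal code):

```python
# Hedged sketch of the "strip Pyodide/wasm wheels before upload" filter;
# a wheel filename ends in <platform-tag>.whl, so test that last segment.
def is_uploadable_wheel(filename):
    """PyPI rejects wasm wheels, so drop anything with a wasm32 platform tag."""
    platform_tag = filename.rsplit("-", 1)[-1].removesuffix(".whl")
    return "wasm32" not in platform_tag


dists = [
    "simplejson-4.0.1-cp312-cp312-manylinux2014_x86_64.whl",
    "simplejson-4.0.1-cp312-cp312-pyodide_2024_0_wasm32.whl",
]
print([d for d in dists if is_uploadable_wheel(d)])
```

Filtering by filename keeps the wasm wheels on disk for the artifact upload step while excluding them from the twine/publish step.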
Merge pull request #317 from simplejson/update-cibw

Update test & build matrix and use GitHub Actions as a Trusted Publisher