Conversation
- Switched to using a strict text builder instead of aggregating up `IO Text`. Exceptions are managed via a final `evaluate` instead.
- Made the function produce a single-chunk rope, which speeds up the eventual conversion to bytes.
- Improved the predicate that looks for characters to escape, which accounted for a considerable amount of the time in many cases.
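A minimal sketch of what the first and third bullets could look like. The names here (`emitJsonString`, `needsEscape`, `escapeChar`) are illustrative assumptions, not the actual runtime code:

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Illustrative sketch only; the real runtime code surely differs.
import Control.DeepSeq (force)
import Control.Exception (evaluate)
import Data.Text (Text)
import qualified Data.Text as T
import qualified Data.Text.Lazy as TL
import qualified Data.Text.Lazy.Builder as B

-- Cheap predicate deciding whether a character needs JSON escaping.
needsEscape :: Char -> Bool
needsEscape c = c < '\x20' || c == '"' || c == '\\'

-- Escape a single character for a JSON string body.
escapeChar :: Char -> B.Builder
escapeChar '"'  = "\\\""
escapeChar '\\' = "\\\\"
escapeChar '\n' = "\\n"
escapeChar '\r' = "\\r"
escapeChar '\t' = "\\t"
escapeChar c    = B.singleton c  -- other control chars elided in this sketch

-- Render a JSON string literal as one strict (hence single-chunk) Text.
emitJsonString :: Text -> IO Text
emitJsonString t = do
  let body
        | T.any needsEscape t =
            T.foldr (\c acc -> escapeChar c <> acc) mempty t
        | otherwise = B.fromText t  -- fast path: no escaping needed
      built = B.singleton '"' <> body <> B.singleton '"'
  -- Force the fully built result at one point so any exception surfaces
  -- here, instead of being threaded through an aggregated IO Text.
  evaluate (force (TL.toStrict (B.toLazyText built)))
```

Converting the builder to a strict `Text` at the end is what yields the single chunk; the `evaluate . force` at the boundary replaces per-step exception handling in `IO`.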
does this need dogfooding or is it pretty solid? i'm thinking about doing another release soon
Maybe hold off until after, just in case. It seems like the one-chunk thing might make parsing slower (investigating). That's kind of a silly use case (emit then parse), but I suppose there could be other gotchas.

Edit: think I know why. The way the parser is written, it calls some text functions whose library versions call `length` on chunks, which is very bad for huge chunks, and seems like it might even be costly on more modest chunks.
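For illustration (this is not the actual parser code): `Data.Text.length` has to walk the whole packed chunk to count code points, so a length guard re-checked at every parse step over one huge chunk does quadratic work. `compareLength` stops counting at the bound:

```haskell
import qualified Data.Text as T

-- O(chunk size) on every call: length counts code points by walking
-- the entire packed chunk.
hasAtLeastSlow :: Int -> T.Text -> Bool
hasAtLeastSlow k t = T.length t >= k

-- O(min k n): compareLength stops as soon as it has seen k characters,
-- so huge chunks cost no more than small ones.
hasAtLeastFast :: Int -> T.Text -> Bool
hasAtLeastFast k t = T.compareLength t k /= LT
```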
Wrote a custom version of that function.
Manually ran CI against the ormolu commit here: https://github.com/unisonweb/unison/actions/runs/20247859959. I think it fixed the problem from the previous commit.
This PR includes some improvements to the `emitJson` replacement in the runtime.