Igalia Compilers Team

Est. 2011

Igalia’s Compilers Team - A 2025 Retrospective

Hey, hey, it’s the beginning of a new year and before we sprint too far into 2026, let’s take a quick breather, zoom out, and celebrate what Igalia’s awesome compilers team got up to in 2025. Over the past year we’ve been deeply involved in shaping and shipping key Web and JavaScript standards, which includes not just participating in committees but also chairing them and actively moving proposals forward. We worked on major JavaScript runtimes, on foundational ahead-of-time compilers including LLVM and Mesa, and on JIT CPU emulation and smaller language VMs.

Some big highlights of this year included our work on FEX and Mesa that helped Valve with their upcoming gaming devices - the Steam Frame and the Steam Machine (we talk more about this in a dedicated blog post), our continued involvement in supporting RISC-V in contemporary compilers, and our key role in multiple WebAssembly implementations.

Standards #

In 2025, our standards work focused on the parts of JavaScript that developers touch every day: time, numbers, modules, and more. Across TC39, WHATWG, WinterTC, and internationalization ecosystems, we helped move proposals forward while turning specifications into running, interoperable code. So yep, let’s talk about our most significant standards contributions from the year!

Temporal #

It’s been an exciting year for the Temporal proposal, which adds a modern date-and-time API to JavaScript. For starters, MDN published their API documentation for it, which created a huge surge of interest.

On the shipping front: Firefox shipped their implementation of the proposal and it’s now available in Firefox 139. Chrome moved their implementation to beta in late 2025, and released it in early 2026. Meanwhile, we’ve been steadily working on getting Temporal into Safari, with support for correct duration math and the PlainMonthDay and PlainYearMonth types added during 2025/early 2026. You can read more about this in our recent post on implementing Temporal.

Alongside that, we’ve been working on the Intl Era and Month Code proposal, which has expanded in scope beyond era codes and month codes to cover other calendar-specific things that a JS engine with Intl must implement. This allows developers to make use of a number of commonly-used non-Gregorian calendars, including but not limited to the calendar used in Thailand, the Japanese Imperial calendar, and Islamic calendars.

Decimal #

A lot of our recent work around the Decimal proposal has now migrated to a newer, similarly number-focused effort called Amount (formerly known as "Measure" and officially renamed in 2025). The proposal reached Stage 1 at the November 2024 TC39 plenary, and we also launched a polyfill. Since then, we have iterated on the Amount API and data model a number of times in plenary. So while it started 2025 at Stage 1 and remains at Stage 1 heading into 2026, the design is noticeably sharper, thanks to a lot of TC39 discussions. We’re lined up to keep pushing it forward next year.

And because numerics work benefits a ton from regular iteration, in late 2024, we also kicked off a biweekly community call ("JS Numerics") for those in TC39 interested in proposals related to numbers, such as Decimal, Amount, intl-keep-trailing-zeros, etc. We still host it, and it’s turned out to be a genuinely productive place to hash things out without waiting for plenary.

Source Maps #

We implemented draft support for range mappings in a number of systems: WebKit, Justin Ridgewell’s source map decoder, a source map validator, and more.

We also facilitated source map TG4 meetings and assisted with advancing proposals such as the scopes proposal. Throughout the year, we continued serving as editors for the ECMA-426 specification, landing a steady stream of improvements and clarifications.

Modules #

We pushed JavaScript’s module system forward on multiple fronts, especially around reducing the impact of modules on application startup:

We are among the most active members of the "Modules Harmony" group, an unofficial group within TC39 that aims to improve the capabilities of ESM to drive native adoption, while making sure that all module proposals are well coordinated with each other.

AsyncContext #

And over in the AsyncContext proposal world, we spent 2025 focusing on how the proposal should integrate with various web APIs. The way AsyncContext interacts with the web platform is unusually pervasive, and more challenging to figure out than the core TC39 proposal itself.

In a first for a TC39 proposal, it is now also going through the WHATWG stages process, where it has reached Stage 1. This gives us a clearer path to iterate with direct feedback from browser engines.

Unicode standards #

We have been working on Unicode MessageFormat, a Unicode standard for localizable dynamic message strings, designed to make it simple to create natural-sounding localized messages.

In 2025, we helped the ICU4C implementation of Unicode MessageFormat align with ongoing specification changes. We also carried out experimental work on the custom function interface to support more extensible formatting capabilities, which is currently under review.

WinterTC #

In December 2024, WinterTC was formed to replace WinterCG as an official Ecma Technical Committee, with the goal of achieving some level of API interoperability across server-side JavaScript runtimes, especially for APIs that are common with the web.

We started co-chairing the committee (together with folks from Deno) and became involved in administrative tasks over the course of the year.

Additionally, if you’re curious, we gave two talks about WinterTC: one at the Web Engines Hackfest, given together with folks from Deno, WinterTC’s other chair; and one at JSConf.JP.

Node.js #

In Node.js, our work in 2025 spanned module interoperability, built-in HTTP/HTTPS proxy support, and shipping integration of system CA certificates across platforms.

On the module side, we delivered interoperability features and bug fixes for require(esm) and helped stabilize it (read more about it in our colleague Joyee’s blog), shipped synchronous and universal loader hooks (now promoted to release candidate), integrated TypeScript into the compile cache, and improved the portability of the cache. Check out Joyee’s talk at JSConf JP if you are interested in learning more about these new module loader features.

We also strengthened system CA certificate integration with JavaScript APIs for reading and configuring trusted CAs globally, added built-in HTTP/HTTPS proxy support, and expanded documentation for using Node.js in enterprise environments.

Additionally, we started migration to the new V8 CppHeap model in Node.js and improved its V8 Platform integration.

V8 #

On the V8 side of things, we worked on HeapProfiler::QueryHolders, a companion API to the QueryObjects API.

We worked on extending the HeapStatistics API to include a new field that tracks the total number of bytes allocated in an Isolate since its creation. This counter excludes allocations that happen due to GC operations, and it’s intended to be used to create memory regression tests. Here’s the CL highlighting these changes.

We also started working on the implementation of the import defer proposal in V8. This proposal extends the syntax of ESM imports to allow a mode where the evaluation of an imported module is deferred until its first access. From our work in Node.js, we upstreamed a few improvements and bug fixes in V8’s embedder API and startup snapshot implementation. We also contributed to Node.js’s V8 upgrade and upstreamed patches to address issues discovered in the upgrade.
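In the proposal’s syntax, deferral looks like this (no stable engine ships it yet, and ./stats.js and summarize are hypothetical names for illustration):

```js
// Evaluation of ./stats.js and its dependencies is deferred: the
// module graph is still fetched and parsed ahead of time, but not
// executed at import time.
import defer * as stats from "./stats.js";

export function report(data) {
  // The first property access on the namespace triggers evaluation.
  return stats.summarize(data);
}
```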

As part of our collaboration with Cloudflare we added v8::IsolateGroup: a new unit that owns an independent pointer-compression cage. We then also enabled multiple cages per process (“multi-cage”), so thousands of isolates aren’t forced into one < 4 GiB region. Finally, we extended this to multiple sandboxes: one sandbox per isolate group instead of a single process-wide sandbox. In the end this work helped Cloudflare enable the sandbox in Cloudflare Workers.

Babel #

Our team also helps co-maintain Babel. The build tools area is very active nowadays, and we strongly believe that alongside the innovation happening in the ecosystem, companies need to invest in ensuring that older, widely used tools keep being actively maintained and improved over time.

LLVM #

In LLVM, we helped extend auto-vectorization to take full advantage of the RISC-V vector extension’s many innovative features.

After four years of development by contributors from multiple organizations including Igalia, we finally enabled EVL tail folding for RISC-V as an LLVM default.

This work took advantage of the new VPlan infrastructure, extending it and developing it iteratively in-tree when needed to give us the ability to model a relatively complex vectorization scheme.

We also added full scalable segmented access support and taught the loop vectorizer to make smarter cost model decisions.

Building on top of this, we achieved improvements in RISC-V vectorization. In parallel, we also worked on LLVM scheduling models for the SpacemiT-x60 RISC-V processor, scoring a whopping 16% performance improvement.

Regarding WebAssembly in LLVM, we landed a number of commits that improve the size and performance of generated code, and added support for a few ISD nodes that enable vectorization for otherwise sequential codegen.

Mesa/IR3 #

We continued work on improving IR3, the Mesa compiler backend for Qualcomm Adreno GPUs. We implemented support for alias instructions novel to the a7xx generation of GPUs, significantly improving register pressure for texture instructions. We also refactored the post-RA scheduler to be able to reuse the legalization logic, significantly improving its accuracy when calculating instruction delays and, consequently, reducing latency.

We also added debug tooling to easily identify the shader that causes problems, among many other optimizations, implementations of new instructions, and bug fixes.

Guile and Whippet #

This year we also made some interesting progress on Whippet, a no-dependencies embeddable garbage collector. We were able to integrate Whippet into the Guile Scheme implementation, replacing Guile’s use of the venerable Boehm-Demers-Weiser library. We hope to merge the integration branch upstream over the coming months. We also wrote up a paper describing the innards of some of Whippet’s algorithms.

We think Whippet is interesting wherever a programming language needs a garbage collector: it’s customizable and easy to manage, as it is designed to be "vendored" directly into a user’s source code repository. We are now in the phase of building out examples to allow for proper performance evaluation; after a bespoke Scheme implementation and Guile itself, we also wrote a fresh ahead-of-time compiler for WebAssembly, which in the near future will gain support for the garbage collection WebAssembly extensions, thanks to Whippet. For more info on our progress, check out Andy Wingo’s blog series.

FEX #

For FEX, the x86 JIT emulator for ARM64, we worked on x87 floating-point emulation: we implemented x87 invalid operation bit handling in F80 mode, fixed IEEE 754 unordered comparison detection, and added an f80 stack xchg optimization for the fast path.

Besides further fixes for instruction implementations, we also worked on memory and stability improvements, protecting the last page of CodeBuffer, and implementing gradual memory growth. Finally, we also did some infrastructure work by upgrading the codebase to clang-format-19 and adding UBSAN support.

This year’s FEX work focused on x87 floating-point correctness and 32-bit compatibility—both critical for Valve’s Steam Frame, the ARM-powered VR headset they announced in November that uses FEX to run x86 games.

The x87 improvements matter because many games and middleware still use legacy floating-point code. Subtle deviations from Intel’s behavior—wrong exception flags, incorrect comparison semantics—cause crashes or weird behavior. Fixing invalid operation exceptions, IEEE 754 comparisons, and optimizing the x87 stack pass eliminated entire classes of compatibility bugs.

The 32-bit fixes are just as important. A huge chunk of Steam’s catalog is still 32-bit, and even 64-bit games often ship 32-bit launchers. Getting fcntl and addressing modes right means these games just work without users needing to do anything.

In total, this work gave Valve confidence that the Steam Frame could ship with solid library coverage, letting them announce the device on schedule.


Alright, that’s a wrap on our 2025 retrospective! We hope you had as much fun reading it as we had writing it, and building all the things we talked about along the way. We’ll see you next year with another roundup; until then, you can keep up with our latest work on the team blog.