Standardizing the New Architecture: React Native's Bridgeless Mode by Default
Duration: 7:32
Transcript
Host: Alex Chan
Guest: Marco Rossi, Senior Mobile Architect and React Native Core Contributor
Host: Hey everyone, welcome back to Allur, your go-to spot for all things PHP, Laravel, Go, and of course, the ever-evolving world of mobile development. I’m your host, Alex Chan, and today... honestly, if you’ve been in the React Native ecosystem for a while, today’s topic is kind of a “hold your breath” moment.
Host: Joining me today is Marco Rossi. Marco is a Senior Mobile Architect who has been deep in the React Native world since the early days—I’m talking like version 0.20 days. He’s a frequent contributor to the core and has been helping some of the biggest apps in the industry migrate to this new world. Marco, welcome to Allur!
Guest: Thanks, Alex! It’s great to be here. And yeah, you’re right—I’ve seen some things in those early versions that would give modern developers nightmares! But it’s a really exciting time to be talking about this. 0.76 feels like the framework finally growing up.
Host: (Laughs) Growing up is a good way to put it. I mean, we’ve been hearing about the "New Architecture" forever. It felt like this mythical thing that was always "coming soon." But now it’s the default. For someone who maybe hasn't been tracking every single commit, how big of a deal is it that the Bridge is... well, gone?
Guest: It’s massive. I mean, think about the Bridge like a narrow, one-lane tunnel between two cities—JavaScript Town and Native City. Every time you wanted to move a car, you had to take it apart, put the pieces in a box, send it through the tunnel, and reassemble it on the other side. That’s serialization. Even for a simple button tap! Now, with Bridgeless Mode and JSI—the JavaScript Interface—it’s like we’ve replaced the tunnel with a direct teleporter. JavaScript can now hold a direct reference to a native object and call its methods instantly. No more "boxes," no more "tunnels."
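Marco's tunnel-and-teleporter picture maps onto code fairly directly. Here's a minimal TypeScript sketch of the two call shapes; the message queue, `BridgeMessage`, and `HapticsModule` are illustrative stand-ins, not real React Native internals:

```typescript
// Old bridge: every call is boxed (serialized), queued, and unboxed on the
// other side. All names here are illustrative, not actual RN internals.
type BridgeMessage = { module: string; method: string; args: unknown[] };

const bridgeQueue: BridgeMessage[] = [];

function callOverBridge(call: BridgeMessage): void {
  const boxed = JSON.stringify(call);  // take the car apart
  bridgeQueue.push(JSON.parse(boxed)); // reassemble it in Native City
}

callOverBridge({ module: "Haptics", method: "trigger", args: ["tap"] });

// New world (JSI): JavaScript holds a direct reference to a host object and
// calls its methods immediately -- no serialization step in between.
interface HapticsModule {
  trigger(kind: string): boolean;
}
const haptics: HapticsModule = { trigger: () => true }; // stand-in host object
const delivered = haptics.trigger("tap"); // direct, synchronous call
```

The point of the sketch is the missing `JSON.stringify`/`JSON.parse` round-trip on the second path: that per-call cost is the "box" Marco describes.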
Host: That’s a great analogy. It always felt a bit clunky, right? Like, you’d have this powerful phone and you’re waiting for a JSON string to be parsed just to trigger a haptic feedback or something. So, Bridgeless is the final form, but it sits on these "four pillars" we keep hearing about. Can we break those down? Let’s start with Fabric.
Guest: Right, Fabric is the new rendering system. In the old days, the UI manager was asynchronous. So, if you had a really heavy bit of logic running in your JS thread, the UI could actually stutter or feel unresponsive because the updates were getting queued up. Fabric allows for concurrent rendering. It means the UI can stay responsive even when the app is doing heavy lifting in the background. It basically brings React’s "Concurrent Mode" ideas directly to the native UI layer.
Host: Oh! So that’s why some older apps felt "janky" during big data loads, even if the native side was technically fine?
Guest: Exactly. The JS thread was busy, so it couldn't tell the Bridge to tell the UI to move. Now, that communication is much more fluid. And then you have TurboModules, which handles the non-UI stuff—like Bluetooth, camera, or local storage. Traditionally, React Native would load every single native module as soon as the app started.
Host: Wait, even if I wasn't using them?
Guest: (Laughs) Yeah, exactly. If you had 50 modules, the app would load 50 modules. It killed startup time and memory. TurboModules changes that to "lazy loading." If the user never goes to the "Settings" page that needs the Camera module, that module is never even initialized. It’s a huge win for TTI—Time to Interactive.
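The eager-versus-lazy difference Marco describes can be modeled in a few lines. This is a conceptual sketch; `getModule` and the factory map are invented names, not the real `TurboModuleRegistry` API:

```typescript
// Conceptual lazy-module registry. Names are invented for illustration;
// the real mechanism lives inside React Native's TurboModuleRegistry.
const initialized: string[] = [];

interface NativeModuleStub { name: string }

// Each factory simulates an expensive native initialization.
const factories = new Map<string, () => NativeModuleStub>([
  ["Camera", () => { initialized.push("Camera"); return { name: "Camera" }; }],
  ["Bluetooth", () => { initialized.push("Bluetooth"); return { name: "Bluetooth" }; }],
]);

const cache = new Map<string, NativeModuleStub>();

function getModule(name: string): NativeModuleStub {
  let mod = cache.get(name);
  if (!mod) {                 // initialize only on first access
    mod = factories.get(name)!();
    cache.set(name, mod);
  }
  return mod;
}

// App startup: nothing has been initialized yet.
const atStartup = initialized.length; // 0

// The user opens a screen that needs the camera; only then does it load.
getModule("Camera");
// "Bluetooth" was never requested, so it never paid its init cost.
```

Registering closures instead of constructed modules is the whole trick: startup cost becomes proportional to what the first screen actually touches.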
Host: That’s such a relief. I’ve definitely worked on apps where the splash screen felt like it lasted a lifetime. But the real magic, from what I’ve read, is JSI. That’s the "C++ glue," right?
Guest: Yeah, JSI is the foundation. It’s what allows Hermes—the JS engine—to talk directly to the C++ layer. And to make sure we don't crash everything by calling the wrong thing, we have Codegen. It’s a tool that runs at build time and says, "Okay, JavaScript expects this function to take a string and a number; let's make sure the Native side is ready for exactly that." It catches those type mismatches before the app even reaches a device.
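To make the Codegen idea concrete: at build time it reads a typed spec and generates matching native scaffolding. In this sketch the TypeScript compiler stands in for that check; `StorageSpec` and its methods are hypothetical examples, not a real module:

```typescript
// A spec-shaped interface, similar in spirit to what Codegen consumes at
// build time. "StorageSpec" and its methods are hypothetical.
interface StorageSpec {
  setItem(key: string, value: string): void;
  getItem(key: string): string | null;
}

// Conceptually, Codegen guarantees the native side matches this shape
// exactly. Here the TypeScript compiler plays that role: change a signature
// below to disagree with the spec and the build fails, not the running app.
const backing = new Map<string, string>();
const storage: StorageSpec = {
  setItem: (key, value) => { backing.set(key, value); },
  getItem: (key) => backing.get(key) ?? null,
};

storage.setItem("theme", "dark");
const theme = storage.getItem("theme"); // "dark"
```

That's the "TypeScript safety in the bridge" intuition Alex raises next: the contract is checked once at build time instead of failing at runtime on a device.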
Host: Interesting! So it’s almost like bringing a bit of that TypeScript safety down into the actual bridge... or, well, the lack of a bridge. Speaking of which, "Bridgeless Mode." If I’m writing a modern app, say I need to check if biometrics are supported. In the old world, that was always a Promise, right? I’d have to `await` the response from the native side.
Guest: Exactly. And that `await` added a tiny bit of latency—the "bridge tax," as we call it. In Bridgeless Mode, if you want to know if biometrics are there, you can literally just call a function and get a boolean back *synchronously*. No `await`, no callback. It just... happens.
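In code, the difference reads like this. The function names are invented for illustration; real biometrics libraries expose their own APIs:

```typescript
// Illustrative call shapes only -- not a real library's API.

// Old architecture: even a yes/no question crossed the bridge as a Promise,
// paying the serialization "bridge tax" in both directions.
function isBiometricsSupportedLegacy(): Promise<boolean> {
  return Promise.resolve(true); // stand-in for the async round-trip
}

// Bridgeless: a JSI host function answers in the same call stack.
function isBiometricsSupportedSync(): boolean {
  return true; // stand-in for a direct call into a native host object
}

// No await, no callback -- the answer is just there.
const supported = isBiometricsSupportedSync();
```

One caveat worth adding: a synchronous call blocks the JS thread while it runs, so this shape suits cheap queries like capability checks, not long-running native work.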
Host: Wow. That has to be a game-changer for things like gestures and animations. I mean, if you’re tracking a finger moving across a screen, 5 to 10 milliseconds of bridge latency is the difference between it feeling like it’s attached to your finger or lagging behind it.
Guest: Absolutely. That’s where the "buttery smooth" feeling comes from. When you remove that overhead, you’re basically operating at the same speed as a Swift or Kotlin app.
Host: Okay, but I have to play devil’s advocate for a second. We have thousands of libraries in the ecosystem. Most of them were written for the old Bridge. If I turn on Bridgeless Mode today in a production app, is everything just going to... break?
Guest: (Laughs) That’s the million-dollar question, right? Meta actually thought about this. They built something called the Interop Layer. It basically acts as a shim. If you have an old library that’s still trying to send messages across the Bridge, the Interop Layer catches them and routes them correctly in the new system. It’s not *quite* as fast as a native TurboModule, but it means your app won't crash and you can migrate your dependencies one by one instead of doing a "Big Bang" rewrite.
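Conceptually, the Interop Layer is a translation shim: it catches traffic that a legacy library still aims at the Bridge and routes it to a new-architecture handler. This toy dispatcher is my own sketch of the idea, not Meta's implementation:

```typescript
// Toy model of an interop shim. Legacy code still "sends bridge messages";
// the shim translates them into new-architecture handler calls.
type LegacyBridgeMessage = { module: string; method: string; args: unknown[] };

// New-arch handlers, keyed by "Module.method". Names are illustrative.
const newArchHandlers = new Map<string, (args: unknown[]) => unknown>([
  ["Clipboard.getString", () => "hello"],
]);

const routed: string[] = [];

function interopDispatch(msg: LegacyBridgeMessage): unknown {
  const key = `${msg.module}.${msg.method}`;
  const handler = newArchHandlers.get(key);
  if (!handler) throw new Error(`no new-arch handler for ${key}`);
  routed.push(key); // the shim translates old traffic instead of dropping it
  return handler(msg.args);
}

// A legacy library that still thinks it is talking to the Bridge:
const value = interopDispatch({
  module: "Clipboard",
  method: "getString",
  args: [],
});
```

The extra lookup-and-translate hop is why Marco notes the shim is slower than a native TurboModule, but it keeps old libraries working during an incremental migration.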
Host: Oh, that’s a relief! I was imagining developers having to fork every single small library they use.
Guest: Yeah, that would be a nightmare. But actually, we’re seeing a huge push. Libraries like Reanimated, React Navigation, and VisionCamera—they’ve already shifted. The big players are already "New Arch" ready.
Host: What about the developer experience? I remember the old days of debugging where you’d open a Chrome tab and it would kind of... simulate the JS environment, but it wasn't quite the same as what was on the phone? It always felt a bit "off."
Guest: Ugh, tell me about it. Remote Debugging was notorious for "works in debug, fails in release" bugs because the JS engine in Chrome was different from the one on the device. With 0.76 and the New Architecture, that’s being replaced by the new Chrome DevTools integration. It connects directly to the Hermes engine running *on* the device. You get a stable, accurate inspection of your state without the "Bridge-busy" lag. It feels much more like debugging a web app, but it’s actually the real native code.
Host: Honestly, that alone might be worth the upgrade for some people.
Guest: Totally. It saves so many hours of chasing ghosts.
Host: So, looking at the big picture... we’ve reached this "maturity phase." If someone is starting a new project today, they run `npx react-native init`, and this is all just... on by default. They don't even have to think about it. Where do you see the framework going from here? Now that the infrastructure is finally in place?
Guest: I think the next couple of years will be about refinement. Now that we have this zero-latency foundation, I expect we’ll see much more complex native integrations. We might see more heavy-duty image processing or real-time AI stuff happening in React Native that used to be "native only." I also think the team will focus on shrinking binary sizes and making the Hermes engine even faster. We’ve built the engine; now we get to see how fast the car can actually go.
Host: It’s such an exciting time. I remember when people said React Native would never be truly native-performant, and it feels like we’re finally putting that argument to bed.
Guest: It really does. It’s no longer about "trading performance for developer speed." Now you kind of get both.
Host: Marco, this has been so enlightening. Thank you for breaking down the technical jargon and making it real for us.
Guest: My pleasure, Alex. Thanks for having me!
Host: Of course! And for everyone listening, if you’re sitting on an older version of React Native, 0.76 is definitely the sign you’ve been waiting for to start looking at that migration path. Check out the official docs—the migration guides are actually really solid now.