
Optimize XML Parsing in React Native: From 35s to 5s with Kotlin Coroutines & Binary Search

Performance chart showing XML parsing time reduced from 35s to 5s in React Native via Kotlin coroutines

TL;DR: We cut route loading time in a React Native app from 35 seconds to 5 seconds by replacing a JavaScript XML parser with a Kotlin-based native module built on coroutines, rewriting a triple-nested matching algorithm around binary search, and parallelizing the remaining work with Promise.all().


Why React Native XML Parsing Can Be a Performance Killer

In our React Native app, users select cycling routes — pre-recorded, video-based tracks that simulate real-world biking experiences. Each route includes:

  • A large XML file containing route metadata
  • Hundreds of GPS track points
  • One or more videos of the full cycling ride
  • Interactive spot markers for points of interest along the route
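
For context, the parsed route data looks roughly like the TypeScript shape below. The field names are illustrative, not the exact schema:

interface TrackPoint {
  lat: number
  lon: number
}

interface Spot {
  lat: number
  lon: number
  title: string // point of interest shown as an interactive marker
}

interface Route {
  name: string
  durationSecs: number // length of the ride video, in seconds
  points: TrackPoint[] // hundreds of GPS track points
  videos: string[]     // one or more videos of the full ride
  spots: Spot[]        // markers for points of interest
}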

This app runs on a low-power Android TV box with 2 GB of RAM, limited CPU, and no GPU acceleration. The constrained hardware made performance optimization essential, especially during cold starts or when switching between routes.

When one of these video routes took up to 35 seconds to load, we knew something had to change.

Turns out, it wasn’t just the XML parsing that was slow — the entire segment cue-point matching logic was O(n³) in practice.

Problems we faced:

  • JavaScript XML parsing (react-native-xml2js) was slow and blocking
  • Our segment matching algorithm looped through seconds × track points × spots
  • Everything ran sequentially, blocking route rendering and UI interactions

The Original Algorithm: O(n³)

Here’s what our initial matching logic looked like:

for (let s = 0; s < durationSecs; s++) {
  for (let p = 1; p < points.length; p++) {
    // locationInTime: the rider's position at second s, interpolated
    // between points[p - 1] and points[p]
    for (const spot of spots) {
      if (distance(locationInTime, spot) <= 50) {
        // Add cue point
      }
    }
  }
}

Time complexity:

O(S · P · N)

  • S = seconds in video
  • P = route track points
  • N = spot locations

This algorithm scanned the entire path every second for every spot. On large routes with 100+ segments, it could take 30–35 seconds.
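
To put that in perspective: for a hypothetical 30-minute ride (1,800 seconds) with 500 track points and 50 spots, the inner check runs roughly 1,800 × 500 × 50 = 45 million times, all of it distance math on the JS thread of a 2 GB Android TV box.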


The Fix: Kotlin Coroutines + Binary Search

1. Native Kotlin XML Parser with Coroutines

We replaced the JavaScript parser with a Native Module written in Kotlin using Jackson’s XmlMapper:

@ReactMethod
fun parseXml(xmlString: String, promise: Promise) {
    // Parse on the IO dispatcher so the JS and UI threads stay free
    CoroutineScope(Dispatchers.IO).launch {
        val parsedMap = xmlMapper.readValue<Map<String, Any?>>(xmlString)
        val result = convertToWritableMap(parsedMap)
        promise.resolve(result)
    }
}

  • Used Dispatchers.IO for non-blocking native parsing
  • Normalized the XML structure to match xml2js output
  • Returned a WritableMap to the React Native JS bridge

Unlike JavaScript-based parsers such as react-native-xml2js, which do all their work on the JS thread, the Kotlin parser runs inside a coroutine on a background thread (Dispatchers.IO) and never blocks the JS thread. And since the UI thread is separate from the JS thread, both rendering and interactions stay smooth: no dropped frames, no input lag, and no frozen screens, even on low-end devices.
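
On the JS side, the native parser is called through the bridge. A minimal call site might look like this; the module name XmlParserModule is hypothetical and should match whatever your Kotlin module registers as:

import { NativeModules } from 'react-native'

// Hypothetical native module name, must match the Kotlin module's registered name
const { XmlParserModule } = NativeModules

// Resolves with a plain JS object shaped like xml2js output
export async function parseRouteXml(xmlString: string): Promise<Record<string, unknown>> {
  return XmlParserModule.parseXml(xmlString)
}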

Result:

  • XML parsing dropped to half of the original time
  • Reduced memory consumption and kept both JS and UI threads responsive

2. Async Work with Promise.all()

We parallelized all time-consuming operations:

const [metadata, mapData, media] = await Promise.all([
  getRouteFromXml(routePath, false),
  getMapDataFromXml(routePath),
  getMediaFiles(routePath),
])

This alone shaved off 2–3 seconds, as JS wasn’t waiting on each step.

If you're not familiar with Promise.all() or want a refresher on how it works, check out this complete guide on asynchronous programming in JavaScript.


3. Rewriting the Algorithm: From O(n³) ➝ O(n·log n)

We switched to a binary-search-based approach using prefix sums:

// Lower-bound binary search over the cumulative-distance array pref[]
const findPointIdx = (d: number) => {
  let lo = 0,
    hi = pref.length - 1
  while (lo < hi) {
    const mid = (lo + hi) >>> 1
    pref[mid] < d ? (lo = mid + 1) : (hi = mid)
  }
  return Math.max(1, lo)
}

  • Created a pref[] array of distances along the track
  • Mapped each spot’s location to a distance
  • Used binary search to find the corresponding timestamp (see the sketch below)
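
Putting those steps together, here is a rough sketch of the mapping. It reuses points, spots, distance(), and findPointIdx() from the snippets above, assumes each track point carries a timeSec timestamp, and uses a hypothetical distanceAlongTrack() helper that projects a spot onto the route:

// pref[i] = cumulative distance (in metres) from the start to points[i]
const pref: number[] = [0]
for (let i = 1; i < points.length; i++) {
  pref.push(pref[i - 1] + distance(points[i - 1], points[i]))
}

// One binary search per spot: O(N · log P) instead of a full scan per second
const cuePoints = spots.map((spot) => {
  const d = distanceAlongTrack(spot) // hypothetical helper: spot to metres from start
  const idx = findPointIdx(d) // binary search over pref[] (snippet above)
  return { spot, timeSec: points[idx].timeSec }
})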

Complexity improvement:

From O(S · P · N) ➝ O(N · log P)


Final Results

Operation | Before (Worst Case) | After
XML parsing | 5–8s | ~2–4s
Segment matching | Up to 25s | ~1s
Total load time | Up to 35s | ~5s ✅

Lessons Learned

  • Coroutines in Kotlin are a game-changer for native performance
  • Promise.all() is underused but critical in React Native apps
  • Binary search and prefix sums aren’t just theory — they save real seconds
  • Native modules are worth the effort when data is large or devices are slow

Should You Use This Strategy?

Use Case | Should You Use It?
Small or occasional XML parsing | ❌ Too complex
Offline-first map or video apps | ✅ Absolutely
Cross-platform parity required | ⚠️ Needs iOS module too
Routes with 1000+ points or many spots | ✅ Big performance win
Low-resource Android devices | ✅ Definitely worth it

Final Thoughts

This wasn’t just optimization — it was an architectural shift. We took a brute-force bottleneck and rebuilt it using:

  • Kotlin + coroutines
  • A cleaner JS-native bridge
  • Algorithmic insight from CS fundamentals

If your React Native app is slow, especially when dealing with files or heavy logic on constrained devices:

  • Think in threads.
  • Look at complexity.
  • Build native.
  • Parallelize.

Sometimes, 35 seconds of pain can become 5 seconds of joy — with the right strategy.
