
Rebuilding the Address Input

VauDium

After two days and ten different approaches, we stopped solving the problem and removed it instead.


Simple feature. Type an address, suggestions appear, tap to select. The same autocomplete UI you see in every map app. Should be straightforward.

It took two days to build. Then we threw it all away.

It Worked in the Simulator

We wired up the Apple Maps server API for autocomplete. Absolute-positioned dropdown below the TextInput, 300ms debounce after 2+ characters, API call, results list. Tap a suggestion, the address and coordinates fill in.
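The query side of that wiring reduces to two small pieces of logic: a minimum-length gate and a trailing-edge debounce. A sketch in TypeScript — the names (shouldQuery, DEBOUNCE_MS) are illustrative, and the actual call to the Apple Maps server API is stubbed out:

```typescript
// Query gating for the autocomplete dropdown: only fire after 2+ characters,
// and debounce rapid keystrokes by 300ms. All names here are illustrative.

const MIN_CHARS = 2;
const DEBOUNCE_MS = 300;

function shouldQuery(text: string): boolean {
  return text.trim().length >= MIN_CHARS;
}

// Generic trailing-edge debounce with an explicit cancel, so a pending
// request can be dropped when the input empties or blurs.
function debounce<A extends unknown[]>(fn: (...args: A) => void, ms: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return {
    call(...args: A): void {
      if (timer !== undefined) clearTimeout(timer);
      timer = setTimeout(() => {
        timer = undefined;
        fn(...args);
      }, ms);
    },
    cancel(): void {
      if (timer !== undefined) clearTimeout(timer);
      timer = undefined;
    },
  };
}

// Usage sketch: wired to the TextInput's onChangeText.
const search = debounce((text: string) => {
  // fetch suggestions from the Apple Maps server API here (stubbed)
}, DEBOUNCE_MS);

function onChangeText(text: string): void {
  if (shouldQuery(text)) search.call(text);
  else search.cancel(); // too short: drop any pending request
}
```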

Tested on the simulator. Clean. Fast. Accurate coordinates. Satisfying.

Then we tested on a real device.

The Invisible Enemy: The Keyboard

To type an address, you tap the TextInput. The soft keyboard rises. With the keyboard up, you tap a suggested address — and nothing happens.

Why? When the keyboard is up and you touch outside the text field, the mobile OS dismisses the keyboard first. During this process, the touch event is consumed. The suggestion item never receives the tap.

Why did it work in the simulator? The Mac’s hardware keyboard means the soft keyboard never appears. We were testing in an environment where the problem didn’t exist. A reminder of why real-device testing matters.

Attempt 1: Timing

The most intuitive fix. When the keyboard dismisses, onEndEditing fires and the dropdown disappears. Delay it by 300 milliseconds. The dropdown stays visible while the keyboard goes down — the user can tap again.

Result: didn’t work. The keyboard dismissal changes the layout, touch coordinates shift. We tried onPressIn (fires on touch-down). It wasn’t called either. The touch event itself was consumed by the system. This wasn’t a timing problem.
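For reference, the timing idea boils down to a controller like this: onEndEditing schedules a delayed hide instead of hiding immediately, hoping a tap lands in the gap. A sketch with illustrative names — and, as described above, it doesn't help, because the OS consumes the tap regardless:

```typescript
// Attempt 1 as plain logic: delay the hide triggered by onEndEditing so the
// dropdown outlives the keyboard dismissal. HIDE_DELAY_MS is illustrative.

const HIDE_DELAY_MS = 300;

function makeDropdownController() {
  let visible = false;
  let pending: ReturnType<typeof setTimeout> | undefined;

  return {
    // called on focus / when results arrive; cancels any pending hide
    show(): void {
      if (pending !== undefined) clearTimeout(pending);
      pending = undefined;
      visible = true;
    },
    // called from the TextInput's onEndEditing: keep the dropdown on screen
    // for a moment so a tap on a suggestion could (in theory) land first
    scheduleHide(): void {
      if (pending !== undefined) clearTimeout(pending);
      pending = setTimeout(() => {
        pending = undefined;
        visible = false;
      }, HIDE_DELAY_MS);
    },
    isVisible(): boolean {
      return visible;
    },
  };
}
```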

Attempt 2: React Native Settings

React Native’s ScrollView has a keyboardShouldPersistTaps option. Set it to "handled" and taps that Pressable components handle won’t dismiss the keyboard. Theoretically perfect.

Tapping the dropdown worked. The keyboard stayed up and the selection registered. Success!

…until we tapped other buttons on the page. State change button, date picker, category selector — nothing dismissed the keyboard. It sat there covering half the screen, refusing to leave.

keyboardShouldPersistTaps applies to the entire ScrollView. You can’t selectively apply it to just the dropdown. We tried dynamically switching between "handled" and the default based on dropdown visibility, but the value change triggered a re-render that cancelled in-flight touches.
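The dynamic switch we tried reduces to one pure function: derive the prop value from dropdown visibility. Harmless in isolation — the failure mode was that flipping the value re-renders the ScrollView mid-gesture. Names are illustrative:

```typescript
// The value fed to ScrollView's keyboardShouldPersistTaps, derived from
// whether the dropdown is showing. Flipping this at runtime is what broke:
// the prop change triggers a re-render that cancels the in-flight touch.

type PersistTaps = "always" | "never" | "handled";

function persistTapsFor(dropdownVisible: boolean): PersistTaps {
  // "handled": taps handled by Pressables keep the keyboard up (what the dropdown needs)
  // "never": the default — any tap outside the TextInput dismisses the keyboard
  return dropdownVisible ? "handled" : "never";
}
```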

Attempt 3: Native Module (iOS)

JS-level solutions exhausted, we went native. Built a KeyboardPersistView Expo module. The goal: touches inside this view shouldn’t dismiss the keyboard.

We overrode hitTest in Swift to set our view as the touch target. Instead of React Native’s touch handler, our view processes the touch. In touchesEnded, we find which child view was tapped and send a JS event.

Selection worked with this approach. But the keyboard still went down.

We tried keeping the keyboard up. Not calling super in touchesBegan, adding require(toFail:) to parent gesture recognizers, rendering the dropdown in a separate UIWindow — over five native iOS approaches. None could fully prevent React Native’s internal keyboard dismiss mechanism.

The hitTest approach — keyboard goes down but selection works — was the best we could do on iOS.

Attempt 4: The Android Wall

We moved to Android. Built the same native module. Overrode onInterceptTouchEvent, onTouchEvent, dispatchTouchEvent. Added logs to all three.

No logs. Not a single one.

Android had two compounding problems. First, unlike on iOS, views positioned outside their parent’s bounds with position: "absolute" are visible but don’t receive touch events. Second, even with normal flow positioning, the Android OS consumes the touch during keyboard dismissal before it reaches any view.

FOCUS_BLOCK_DESCENDANTS, isFocusable = false — tried everything. Portal rendering outside the FlatList created an avalanche of new bugs around position tracking and keyboard height.

We concluded that the dropdown approach was impossible on Android. A sad conclusion.

The Shift

On the afternoon of the second day, staring at the code, we stepped back.

Why does it have to be a dropdown?

We opened Google Maps. Tap the address field — a dedicated search screen opens. Type, select, return. The keyboard is up, you tap a suggestion — it works.

The reason is simple. It’s a separate screen. The screen’s own TextInput holds focus. The suggestions are a regular list in the same screen. You’re not touching outside a TextInput — you’re touching another view within the same screen.

Two days of fighting vanished with a design change.

It Turned Out Better

Honestly, we thought of it as a compromise at first. The dropdown didn’t work, so we reluctantly went with a separate screen. A concession.

But once we built it, it was actually better UX.

You can edit after selecting. With the dropdown, selecting closed it and that was it. With a separate screen, you can modify the text after selection. Add “, 2nd floor cafe” after the long address string. Just keep typing.
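That edit-after-select behavior is trivial once the suggestions live in their own screen: selection just sets the text state, and later keystrokes edit the same string while the coordinates from the last selection survive. A sketch of the state flow, with illustrative types:

```typescript
// Selection fills the draft; typing afterwards only changes the display text.
// All types and names here are illustrative, not the app's real ones.

interface Coords { latitude: number; longitude: number; }
interface Suggestion { label: string; coords: Coords; }
interface AddressDraft { text: string; coords?: Coords; }

function applySelection(_draft: AddressDraft, s: Suggestion): AddressDraft {
  return { text: s.label, coords: s.coords };
}

function editText(draft: AddressDraft, text: string): AddressDraft {
  return { ...draft, text }; // coordinates survive the edit
}
```

With the dropdown, selection also had to close the overlay and relinquish focus; here the user simply keeps typing into the same TextInput.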

Map preview came naturally. Selecting an address gives us coordinates. Coordinates mean we can show a mini map — “Is this the right place?” Visible on both the search screen and the original screen.

iOS and Android got the same UX. With the dropdown approach, only iOS worked (via native module), Android was disabled. The separate screen works identically on both.

And the code shrank dramatically. The AddressInput component went from 200+ lines to 50. iOS native module branching, Android branching, select-all detection, dropdown positioning, absolute layout, zIndex management — all gone. Replaced by one Pressable and one Text. Tap to navigate.

What We Learned

When you’re deep in a technical problem, tunnel vision kicks in. “How can I intercept this touch event?” “How can I defeat this gesture recognizer?” The instinct is to break through head-on.

But sometimes you need to step back and ask: “Why do I need to intercept this touch event?” “Is there an approach that doesn’t require a dropdown?”

It feels like we wasted two days. But without those two days, we couldn’t have been confident in the decision to switch to a separate screen. Only after trying ten approaches and watching them all fail could we commit to changing the design. And once we did, both the UX and the code improved.

Sometimes removing the problem is a better answer than solving it. But that judgment can only be made after you’ve tried hard enough to be sure.