xAI Software Engineer Onsite Coding Questions
5+ questions from real xAI Software Engineer onsite coding rounds, reported by candidates who interviewed there.
What does the xAI Onsite Coding round test?
The xAI onsite coding round is the core technical evaluation. Software Engineer candidates typically see 2-3 algorithm and data structure problems. Problems range from medium to hard difficulty, and interviewers evaluate both correctness and code quality.
This post was last edited by Anonymous on 2025-10-7 17:31. Just finished the xAI onsite interview today. The coding wasn't particularly difficult, but the pace was demanding. I had to solve the following problem:
## Problem

Given the root of a binary tree, a node is "bad" if its value does not lie within the range `[min_val, max_val]` inherited from its ancestors (similar to BST validity). Remove all bad nodes and their subtrees. Return the modified root.

```python
class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        ...

def remove_bad_nodes(
    root: TreeNode,
    min_val: int = float('-inf'),
    max_val: int = float('inf'),
) -> TreeNode | None:
    pass
```

**Example:**

```
Tree:
      5
     / \
    1   8
   / \
  0   3
```

Range rule: moving to a left child narrows the max bound to the parent's value; moving to a right child narrows the min bound. Under that rule every node in this tree is within its inherited range, so the expected output is the unchanged tree; state your definition of "bad" and walk through it on the example before coding.

## Follow-ups

1. How does the valid range narrow as you traverse left vs. right?
2. What is the time complexity? Can you do this iteratively?
3. If a bad node has valid children, should the children be re-attached? Why or why not?
4. How would you extend this to an N-ary tree where each child has a specific position constraint?
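A minimal recursive sketch of one interpretation, assuming strict BST-style bounds (the exact inclusive/exclusive rule was left to the candidate). A node outside its inherited range is pruned together with its whole subtree:

```python
class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def remove_bad_nodes(root, min_val=float('-inf'), max_val=float('inf')):
    # A node whose value falls outside the open interval (min_val, max_val)
    # is "bad": drop it and its entire subtree by returning None.
    if root is None or not (min_val < root.val < max_val):
        return None
    # Going left caps the max at this node's value; going right raises
    # the min, mirroring a BST validity check.
    root.left = remove_bad_nodes(root.left, min_val, root.val)
    root.right = remove_bad_nodes(root.right, root.val, max_val)
    return root
```

This runs in O(n) time and O(h) stack space for tree height h; an iterative version would carry `(node, min, max)` tuples on an explicit stack.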
## Problem

Design a durable (persistent) cache that survives process restarts, combining in-memory speed with disk-backed durability.

## Likely LeetCode equivalent

No confident LC match.

## Tags

hash_table, design, xai, cache
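One common starting point is a dict for reads plus an append-only operation log that is replayed on startup. This is only a sketch of that pattern (class and file names are illustrative, not from the interview); a real design would also discuss log compaction, snapshots, and fsync frequency:

```python
import json
import os

class DurableCache:
    """In-memory dict for fast reads; append-only JSONL log for durability."""

    def __init__(self, path):
        self.path = path
        self.data = {}
        # Replay the log so state survives process restarts.
        if os.path.exists(path):
            with open(path) as f:
                for line in f:
                    op = json.loads(line)
                    if op["op"] == "set":
                        self.data[op["key"]] = op["value"]
                    else:
                        self.data.pop(op["key"], None)

    def get(self, key, default=None):
        return self.data.get(key, default)

    def set(self, key, value):
        self._append({"op": "set", "key": key, "value": value})
        self.data[key] = value

    def delete(self, key):
        self._append({"op": "del", "key": key})
        self.data.pop(key, None)

    def _append(self, op):
        # fsync on every write trades throughput for crash durability;
        # batching or group commit is the usual follow-up discussion.
        with open(self.path, "a") as f:
            f.write(json.dumps(op) + "\n")
            f.flush()
            os.fsync(f.fileno())
```

The log grows without bound, so periodic compaction (rewrite the log from the live dict) is the natural first follow-up.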
Dynamic Batching: Implement a Request Batcher That Groups Requests for Efficient Processing
## Problem

Build a `DynamicBatcher` that collects incoming requests and flushes them as a batch either when the batch reaches `max_size` items or after `max_wait_ms` milliseconds since the first request in the batch arrived, whichever comes first.

```python
from concurrent.futures import Future
from typing import Callable

class DynamicBatcher:
    def __init__(self, max_size: int, max_wait_ms: int,
                 process_batch: Callable[[list], list]):
        ...

    def submit(self, request: dict) -> Future:
        """
        Returns a Future that resolves when the batch containing
        this request has been processed.
        """
```

**Example behavior:**

```
batcher = DynamicBatcher(max_size=10, max_wait_ms=50, process_batch=db_bulk_insert)
f1 = batcher.submit({"id": 1, "data": "..."})
f2 = batcher.submit({"id": 2, "data": "..."})
# If 8 more arrive within 50ms -> all 10 flushed together
# If the timeout hits first -> flush whatever is pending
```

## Follow-ups

1. How do you associate each request's Future with its position in the batch result list?
2. What threading model do you use: a background flusher thread, asyncio, or something else?
3. How do you handle partial batch failures where some items succeed and others fail?
4. If `process_batch` is slow and requests pile up, how do you add backpressure?
xAI SWE Onsite - X Spaces (System Design)
## Problem

Design a real-time audio broadcasting system (similar to Twitter Spaces) handling large concurrent audiences.

## Likely LeetCode equivalent

No LC equivalent; system design question.

## Tags

system_design, xai, real_time, audio