I Failed 4 FAANG Interviews Before Two Pointers Finally Clicked
The Four Emails That Changed How I Study
I still have the rejection emails saved in a folder called "Motivation." Four FAANG interviews over seven months. Four polite variations of "we've decided to move forward with other candidates." Four problems that - I later realized - were all two-pointer problems wearing different costumes.
The worst part wasn't the rejection. It was the pattern I didn't see. After my fourth rejection, I sat down with a whiteboard and wrote out every problem I'd failed. A sorted array question at Google. A string manipulation problem at Meta. A greedy optimization at Amazon. A linked list question at Apple. On the surface, they looked completely different. Different data structures, different constraints, different stories in the problem statement.
But underneath, they were all the same technique. Two pointers. Moving through data from strategic positions, narrowing the search space with every step. And I had brute-forced every single one of them.
This is the guide I wish I'd had before those interviews. Not a dry textbook explanation - a reverse-engineering of my four failures into the pattern recognition framework that eventually got me an offer.
Rejection #1: The Sorted Array I Brute-Forced
The company: Google
The problem: Given a sorted array of integers and a target sum, find two numbers that add up to the target. Return their indices.
What I did: Nested loops. O(n^2). Got the right answer but watched my interviewer's enthusiasm drain in real time.
This was Two Sum II - the classic sorted array variant. And my mistake was treating it like the unsorted version.
What I Should Have Done
When an array is sorted, that's not just a nice property - it's a massive hint. Sorted order means you can make intelligent decisions about which direction to move. Here's the key insight I missed:
Place one pointer at the very beginning of the array (the smallest value) and one at the very end (the largest value). Add those two values together. Now you have exactly three possibilities:
- The sum equals the target. You're done. Return the indices.
- The sum is too small. You need a bigger sum. Since the array is sorted, the only way to increase the sum is to move the left pointer one step to the right - toward larger values.
- The sum is too large. You need a smaller sum. Move the right pointer one step to the left - toward smaller values.
Let me walk through a concrete example. Given the array [2, 7, 11, 15] and a target of 9:
Step 1: left = 0 (value 2), right = 3 (value 15)
sum = 2 + 15 = 17 > 9
Move right pointer left.
Step 2: left = 0 (value 2), right = 2 (value 11)
sum = 2 + 11 = 13 > 9
Move right pointer left.
Step 3: left = 0 (value 2), right = 1 (value 7)
sum = 2 + 7 = 9 == target
Return [0, 1]. Done.
Three steps instead of the six comparisons my nested loop would have made. And the gap grows dramatically with input size. For an array of 10,000 elements, my approach made up to 50 million comparisons. The two-pointer approach? At most 10,000.
Why This Works (The Proof That Clicked for Me)
How do you know you won't skip a valid pair? Imagine the correct answer involves elements at indices i and j, where i < j. At some point, your left pointer will be at or before i, and your right pointer at or after j. Can you accidentally move past either one?
Suppose the right pointer reaches j while left is still before i. Then array[left] <= array[i], so array[left] + array[j] <= array[i] + array[j] = target. The sum is at most the target, so the algorithm either returns a valid pair immediately or moves left rightward, toward i - it never moves the right pointer past j while left is still before i. The symmetric argument covers the case where left reaches i first. The two pointers are guaranteed to converge on the answer.
The Complexity Lesson
- My brute force: O(n^2) time, O(1) space
- Two-pointer approach: O(n) time, O(1) space
That's not an incremental improvement. That's the difference between a solution that handles a million elements in one second and one that takes eleven and a half days. In an interview, it's the difference between "strong hire" and "we've decided to move forward."
The Deeper Takeaway
Whenever you see a sorted array and a problem involving pairs or complements, your first instinct should be converging two pointers. Not a hash map (that's for unsorted arrays). Not nested loops (that's for when you've given up). Two pointers, converging from opposite ends, making one informed decision per step.
Rejection #2: The Palindrome I Overcomplicated
The company: Meta
The problem: Given a string, determine if it's a valid palindrome, considering only alphanumeric characters and ignoring case.
What I did: Created a new cleaned string, reversed it, compared them. Used O(n) extra space. Then fumbled the follow-up when they asked me to do it in-place.
A palindrome reads the same forwards and backwards. That's literally a definition begging for two pointers - one reading forward, one reading backward, checking if they agree.
The Clean Two-Pointer Approach
Start a left pointer at position 0 and a right pointer at the last position. Move them toward each other:
- If the character at left is not alphanumeric, skip it - increment left.
- If the character at right is not alphanumeric, skip it - decrement right.
- If both characters are alphanumeric, compare them (case-insensitive). If they don't match, return false. If they do, move both pointers inward.
- When left meets or crosses right, return true.
Here's a walkthrough with the input "A man, a plan, a canal: Panama":
left -> 'A', right -> 'a' => match (case-insensitive). Move both.
left -> ' ', skip.
left -> 'm', right -> 'm' => match. Move both.
left -> 'a', right -> 'a' => match. Move both.
left -> 'n', right -> 'n' => match. Move both.
left -> ',', skip.
left -> ' ', skip.
left -> 'a', right -> 'a' => match. Move both.
... (continuing the convergence)
left crosses right. Return true.
No extra string created. No reversal. O(n) time, O(1) space. And when the interviewer asks "can you handle Unicode?" or "what about empty strings?", you can reason about edge cases directly on the pointer logic rather than juggling auxiliary data structures.
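The skip-and-compare loop is short in practice; a minimal Python sketch (the function name is mine):

```python
def is_palindrome(s):
    """Converging pointers that skip non-alphanumeric characters in place."""
    left, right = 0, len(s) - 1
    while left < right:
        if not s[left].isalnum():
            left += 1              # skip punctuation/whitespace on the left
        elif not s[right].isalnum():
            right -= 1             # skip on the right
        elif s[left].lower() != s[right].lower():
            return False           # mismatch: not a palindrome
        else:
            left += 1              # match: move both pointers inward
            right -= 1
    return True

print(is_palindrome("A man, a plan, a canal: Panama"))  # → True
print(is_palindrome("race a car"))                      # → False
```

Note that only one pointer moves per iteration when skipping - exactly the "not in lockstep" behavior the walkthrough shows.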
The Edge Cases I Missed
Two-pointer thinking makes edge cases easier to reason about:
- Empty string: Left > right immediately. Return true.
- Single character: Left and right both at 0. Equal. Return true.
- All non-alphanumeric characters (like ",.;!"): Both pointers skip everything and cross each other. Return true.
- String with spaces (like "race a car"): Pointers match on 'r', 'a', 'c', then left hits 'e' while right skips the space to 'a'. Mismatch. Return false.
Each edge case maps to a clear pointer behavior. No special-case branching needed.
The Palindrome Variation They Love to Ask Next
The follow-up Meta loves: "Can you make it a palindrome by removing at most one character?" (LeetCode 680). The two-pointer approach extends naturally. Run the same converging algorithm. At the first mismatch at positions left and right, try two options: skip the character at left (check if left+1 to right is a palindrome) or skip at right (check left to right-1). If either works, return true. Still O(n) time - at most two passes through the string.
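That extension is small in code; a minimal Python sketch of the at-most-one-removal variant (function and helper names are mine):

```python
def valid_palindrome_one_removal(s):
    """Palindrome check allowing at most one character deletion (LeetCode 680)."""
    def is_pal(lo, hi):
        # Plain converging check on the slice s[lo..hi].
        while lo < hi:
            if s[lo] != s[hi]:
                return False
            lo += 1
            hi -= 1
        return True

    left, right = 0, len(s) - 1
    while left < right:
        if s[left] != s[right]:
            # First mismatch: try skipping either side, once.
            return is_pal(left + 1, right) or is_pal(left, right - 1)
        left += 1
        right -= 1
    return True

print(valid_palindrome_one_removal("abca"))  # → True (remove 'b' or 'c')
```

The key design point: you only branch at the first mismatch, and each branch is a plain palindrome check, so the total work stays linear.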
Rejection #3: The Container Problem I Timed Out On
The company: Amazon
The problem: Given an array of heights representing vertical lines on a graph, find two lines that together with the x-axis form a container holding the most water.
What I did: Tried every pair. O(n^2). Ran out of time before finishing my code. The interviewer gently told me "there's a linear approach" and I stared at the whiteboard for three minutes in silence.
This is Container With Most Water (LeetCode 11), and it's the problem that finally broke me enough to actually study two-pointer technique properly.
Why Brute Force Feels Necessary Here
The area between two lines at positions i and j is min(height[i], height[j]) * (j - i). The min() function makes this feel inherently pairwise - you need to know both heights to compute anything useful. That's why most people (including me) default to checking every pair.
But the two-pointer approach here is beautiful, and the reasoning behind it teaches you something deep about the technique.
The Greedy Two-Pointer Strategy
Start with left at position 0 and right at the last position. This gives you the widest possible container. Calculate its area. Now, how do you decide which pointer to move?
Here's the critical insight: always move the pointer pointing to the shorter line.
Why? Consider what happens when height[left] < height[right]:
- The current width is right - left.
- The current area is height[left] * (right - left).
- If you move the right pointer left, the width decreases by 1. The limiting height is still height[left] (or possibly even less, if the new right line is shorter). The area can only decrease or stay the same. You cannot possibly get a larger area by moving the taller line inward.
- If you move the left pointer right, the width decreases by 1, BUT the limiting height might increase (if the new left line is taller). There's at least a chance of finding a larger area.
So moving the shorter line's pointer is the only move that has any hope of improvement. Moving the taller line's pointer is provably non-improving.
Full Walkthrough
Array: [1, 8, 6, 2, 5, 4, 8, 3, 7]
Step 1: left=0 (h=1), right=8 (h=7)
area = min(1,7) * 8 = 8. max_area = 8
height[left] < height[right], move left.
Step 2: left=1 (h=8), right=8 (h=7)
area = min(8,7) * 7 = 49. max_area = 49
height[right] < height[left], move right.
Step 3: left=1 (h=8), right=7 (h=3)
area = min(8,3) * 6 = 18. max_area = 49
height[right] < height[left], move right.
Step 4: left=1 (h=8), right=6 (h=8)
area = min(8,8) * 5 = 40. max_area = 49
Equal heights - move either (let's move left).
Step 5: left=2 (h=6), right=6 (h=8)
area = min(6,8) * 4 = 24. max_area = 49
... (continuing until pointers meet)
Final answer: 49 (between positions 1 and 8)
Nine elements, eight steps. My brute force would have been 36 pair calculations. For an array of 100,000 elements, that's 5 billion pairs versus 100,000 steps.
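The greedy rule is one if-statement; a minimal Python sketch (the function name is mine):

```python
def max_area(height):
    """Greedy two pointers: always move the pointer at the shorter line."""
    left, right = 0, len(height) - 1
    best = 0
    while left < right:
        area = min(height[left], height[right]) * (right - left)
        best = max(best, area)
        if height[left] < height[right]:
            left += 1    # only moving the shorter line can improve the area
        else:
            right -= 1
    return best

print(max_area([1, 8, 6, 2, 5, 4, 8, 3, 7]))  # → 49
```

On equal heights this sketch moves the right pointer where the walkthrough moved left; either choice is safe, because a container strictly inside an equal-height pair is both narrower and no taller at its limit.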
Why You Can Trust It
Suppose the optimal answer uses positions i and j. The pointers start out bracketing that pair, and the algorithm only ever discards the shorter line. Consider the first moment a pointer lands on i or j - say left reaches i while right is still at or beyond j. If height[right] >= height[i], the area just computed is at least the optimal one: same limiting height, equal or greater width. If height[right] < height[i], the algorithm moves right inward, and that loses nothing, because any container using that shorter right line is no bigger than the one just measured. Either way, the optimal area is never skipped.
This problem taught me something crucial: two-pointer isn't just a mechanical technique. It's a way of pruning the search space by making a greedy argument at each step.
Rejection #4: The Linked List Cycle I Didn't See
The company: Apple
The problem: Given a linked list, determine if it has a cycle. If it does, find the node where the cycle begins.
What I did: Used a hash set to store visited nodes. Got the right answer but used O(n) extra space. The interviewer asked for O(1) space and I had nothing.
This is where two pointers take on a completely different form. Not left-and-right converging on an array, but slow-and-fast moving through a linked list. Floyd's Cycle Detection Algorithm, also called the Tortoise and Hare.
The Slow/Fast Pointer Concept
Initialize two pointers at the head of the linked list. The slow pointer moves one node at a time. The fast pointer moves two nodes at a time. If there's no cycle, the fast pointer will reach the end (null) and you're done - no cycle. If there is a cycle, the fast pointer will eventually lap the slow pointer, and they'll meet at the same node.
Think of it like two runners on a circular track. The faster runner will always lap the slower one.
Why They're Guaranteed to Meet
This bothered me until I worked through the math. Suppose the cycle has length C. Once both pointers are inside the cycle, each step closes the gap between them by exactly 1 node (fast gains 2, slow gains 1, net difference is 1). Since the maximum gap inside a cycle of length C is C-1, they'll meet within C-1 steps after both enter the cycle. They can't skip over each other because the gap decreases by exactly 1 each step.
Walkthrough: Detecting a Cycle
Consider a linked list: 1 -> 2 -> 3 -> 4 -> 5 -> 3 (node 5 points back to node 3, creating a cycle).
Step 0: slow = node 1, fast = node 1
Step 1: slow = node 2, fast = node 3
Step 2: slow = node 3, fast = node 5
Step 3: slow = node 4, fast = node 4 (fast went 5->3->4)
They meet! Cycle detected.
O(n) time, O(1) space. No hash set. No extra data structure. Just two pointers moving at different speeds.
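Floyd's detection loop is only a few lines; a self-contained Python sketch (the ListNode class and example list are illustrative):

```python
class ListNode:
    def __init__(self, val):
        self.val = val
        self.next = None

def has_cycle(head):
    """Floyd's tortoise and hare: detect a cycle in O(1) extra space."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next           # tortoise: one step
        fast = fast.next.next      # hare: two steps
        if slow is fast:
            return True            # the hare lapped the tortoise
    return False                   # the hare fell off the end: no cycle

# Build 1 -> 2 -> 3 -> 4 -> 5 -> 3 (node 5 points back to node 3).
nodes = [ListNode(i) for i in range(1, 6)]
for a, b in zip(nodes, nodes[1:]):
    a.next = b
nodes[4].next = nodes[2]
print(has_cycle(nodes[0]))  # → True
```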
The Harder Part: Finding Where the Cycle Starts
Detecting a cycle is only half the problem. The follow-up - find the entry point of the cycle - is where Floyd's algorithm gets mathematically elegant.
After the slow and fast pointers meet, reset one pointer to the head of the list. Now move both pointers one step at a time. The node where they meet again is the start of the cycle.
Why does this work? Let me walk through the math:
Let F be the distance from the head to the cycle entry. Let C be the cycle length. When slow and fast first meet, slow has traveled F + a steps (where a is how far into the cycle the meeting point is). Fast has traveled 2(F + a) steps. Since fast also made some complete laps around the cycle, we know:
2(F + a) = F + a + k*C (for some integer k)
F + a = k*C
F = k*C - a
This means F (the distance from head to cycle start) equals k*C - a (some number of full cycle laps minus the offset a). If you start one pointer at the head and one at the meeting point (which is a steps past the cycle start), and move both one step at a time, the first pointer travels F steps to reach the cycle start, and the second pointer travels F = k*C - a steps, which from position a inside the cycle means it wraps around exactly k times and ends up at position a + (k*C - a) = k*C, which is the cycle start.
They meet at the cycle entry. It's almost magical when you see it work for the first time.
Extended Walkthrough: Finding the Cycle Start
Using the same list: 1 -> 2 -> 3 -> 4 -> 5 -> 3
After detection, slow and fast met at node 4. F = 2 (distance from head to cycle start at node 3). a = 1 (distance from cycle start to meeting point).
Reset: pointer1 = node 1 (head), pointer2 = node 4 (meeting point)
Step 1: pointer1 = node 2, pointer2 = node 5
Step 2: pointer1 = node 3, pointer2 = node 3
They meet at node 3 - the cycle start!
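Both phases fit in one function; a self-contained Python sketch (the ListNode class is repeated so the snippet runs on its own):

```python
class ListNode:
    def __init__(self, val):
        self.val = val
        self.next = None

def detect_cycle_start(head):
    """Floyd's full algorithm: detect the cycle, then find its entry node."""
    slow = fast = head
    while fast and fast.next:      # phase 1: find the meeting point
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            break
    else:
        return None                # fast reached the end: no cycle
    slow = head                    # phase 2: reset one pointer to the head
    while slow is not fast:        # both now walk one step at a time
        slow = slow.next
        fast = fast.next
    return slow                    # they meet at the cycle entry

# Same list as the walkthrough: 1 -> 2 -> 3 -> 4 -> 5 -> 3.
nodes = [ListNode(i) for i in range(1, 6)]
for a, b in zip(nodes, nodes[1:]):
    a.next = b
nodes[4].next = nodes[2]
print(detect_cycle_start(nodes[0]).val)  # → 3
```

Phase 2 is the F = k*C - a equation in motion: the head pointer covers F steps while the meeting-point pointer wraps around to the same entry node.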
This is the technique I didn't know existed during my Apple interview. The hash set approach works, but it shows you don't know Floyd's algorithm - and at Apple, they specifically test for it.
The Framework: When to Reach for Two Pointers
After reconstructing my four failures, I built a decision framework. When you read a problem statement, look for these signals:
Signal 1: Sorted Input
If the input is sorted (or you're told you can sort it without affecting the solution), converging two pointers from opposite ends is almost always relevant. Classic examples: Two Sum II, 3Sum (sort first, then use two pointers for the inner loop), removing duplicates from a sorted array.
Pattern: Left pointer at start, right pointer at end, converge based on comparison to target.
Signal 2: Palindrome or Symmetry
Any problem involving checking symmetry, reversing, or comparing a sequence to itself suggests two pointers from opposite ends moving inward. Valid Palindrome, Valid Palindrome II, checking if a linked list is a palindrome.
Pattern: Left and right pointers moving inward, comparing corresponding elements.
Signal 3: Cycle or Repetition Detection
If the problem involves finding cycles in linked lists, repeated sequences, or the "meeting point" of two processes, think slow/fast pointers. Linked List Cycle, Find the Duplicate Number (treating the array as a linked list), Happy Number.
Pattern: Slow pointer moves 1 step, fast pointer moves 2 steps. If they meet, there's a cycle.
Signal 4: Partitioning or Rearranging In-Place
Problems where you need to rearrange elements of an array in-place - move zeros to the end, Dutch National Flag problem, remove element - use a read pointer and a write pointer. The read pointer scans every element, the write pointer marks where the next valid element should go.
Pattern: Read pointer scans left-to-right, write pointer trails behind, only advancing for valid elements.
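Move Zeroes (LeetCode 283) shows this motion pattern in its purest form; a minimal Python sketch (the function name is mine):

```python
def move_zeroes(nums):
    """Read/write pointers: write trails, advancing only for nonzero values."""
    write = 0
    for read in range(len(nums)):            # read pointer scans every element
        if nums[read] != 0:
            nums[write], nums[read] = nums[read], nums[write]
            write += 1                       # write marks the next nonzero slot
    return nums

print(move_zeroes([0, 1, 0, 3, 12]))  # → [1, 3, 12, 0, 0]
```

Everything before the write pointer is the cleaned prefix; everything the read pointer has passed but write hasn't reached is zeros waiting to be pushed to the end.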
Signal 5: Subarray or Window Problems
When you're looking for a contiguous subarray with certain properties (minimum size subarray with sum >= target, longest substring without repeating characters), the sliding window technique is a specialized form of two pointers. Both pointers move in the same direction, maintaining a window between them.
Pattern: Both pointers start at position 0, right expands the window, left shrinks it when a condition is met or violated.
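Minimum Size Subarray Sum (LeetCode 209) shows the window mechanic; a minimal Python sketch (the function name is mine):

```python
def min_subarray_len(target, nums):
    """Sliding window: right expands, left shrinks while the sum stays >= target."""
    left = 0
    window_sum = 0
    best = float("inf")
    for right, value in enumerate(nums):
        window_sum += value                    # expand the window
        while window_sum >= target:            # condition met: try shrinking
            best = min(best, right - left + 1)
            window_sum -= nums[left]
            left += 1
    return 0 if best == float("inf") else best

print(min_subarray_len(7, [2, 3, 1, 2, 4, 3]))  # → 2
```

Both pointers only ever move rightward, so despite the nested loop the total work is O(n): each element enters and leaves the window at most once.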
The Decision Tree
When you see a new problem, run through these questions in order:
- Is the input sorted, or would sorting help? Try converging left/right pointers.
- Does the problem involve symmetry or comparing from both ends? Try converging left/right pointers.
- Is there a cycle, repetition, or "meeting point" in a sequence? Try slow/fast pointers.
- Do you need to rearrange elements in-place? Try read/write pointers.
- Are you looking for a subarray or substring with certain properties? Try sliding window (same-direction pointers).
- None of the above? Two pointers probably isn't the primary technique. Consider hash maps, binary search, or other approaches.
This decision tree isn't foolproof, but it catches about 80% of two-pointer problems. The remaining 20% require recognizing that a problem can be transformed into one of these categories.
7 Problems to Drill This Week
I ranked these in learning order. Each one builds on the pattern recognition from the previous ones.
Day 1: Two Sum II - Input Array Is Sorted (LeetCode 167)
Why start here: It's the purest form of converging pointers. No edge cases, no tricks, just the core mechanic. If you can explain why this works in O(n) time, you understand the foundation.
Key insight to verify you understand it: Can you articulate why moving the left pointer right when the sum is too small is always safe? Can you prove no valid pair is skipped?
Day 2: Valid Palindrome (LeetCode 125)
Why this is next: Same converging pattern but with character-skipping logic. Forces you to handle edge cases (non-alphanumeric characters, empty strings) within the two-pointer framework.
Key insight: The pointers don't always move in lockstep. Sometimes one pointer skips while the other waits. This is a pattern you'll see in harder problems.
Day 3: Remove Duplicates from Sorted Array (LeetCode 26)
Why this matters: Introduces the read/write pointer variant. One pointer scans, the other marks where to write. This is a different motion pattern from converging pointers and you need both in your toolkit.
Key insight: The write pointer only advances when the read pointer finds a new unique value. Everything before the write pointer is the "cleaned" portion of the array.
Day 4: Container With Most Water (LeetCode 11)
Why it's harder: The greedy argument for why moving the shorter line is correct is non-obvious. This problem tests whether you can reason about why a two-pointer approach works, not just mechanically apply it.
Key insight: You're not just narrowing the search space - you're making a provable claim that one direction can't help. Practice articulating this proof aloud.
Day 5: 3Sum (LeetCode 15)
Why this is a milestone: Combines sorting with two pointers in a nested structure. Fix one element, then use two pointers on the remaining sorted subarray. Also requires handling duplicate triplets, which is a common source of bugs.
Key insight: After sorting, skip duplicate values for the outer loop element AND within the two-pointer inner loop. Missing either one gives wrong answers on inputs like [-1, -1, 0, 1, 1].
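The whole structure - sort, fix one element, converge, deduplicate at both levels - in one minimal Python sketch (the function name is mine):

```python
def three_sum(nums):
    """Sort, fix one element, then converge two pointers on the rest."""
    nums.sort()
    result = []
    for i in range(len(nums) - 2):
        if i > 0 and nums[i] == nums[i - 1]:
            continue                              # skip duplicate outer values
        left, right = i + 1, len(nums) - 1
        while left < right:
            s = nums[i] + nums[left] + nums[right]
            if s < 0:
                left += 1
            elif s > 0:
                right -= 1
            else:
                result.append([nums[i], nums[left], nums[right]])
                left += 1
                right -= 1
                while left < right and nums[left] == nums[left - 1]:
                    left += 1                     # skip duplicate inner values
    return result

print(three_sum([-1, -1, 0, 1, 1]))  # → [[-1, 0, 1]]
```

Drop either duplicate-skipping loop and the tricky input above yields [-1, 0, 1] twice.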
Day 6: Linked List Cycle II (LeetCode 142)
Why it's different: Switches from arrays to linked lists, from converging pointers to slow/fast pointers. Finding the cycle entry point requires understanding the mathematical proof behind Floyd's algorithm.
Key insight: The reset-and-walk-at-same-speed trick for finding the cycle entry. Practice deriving why it works from the equation F = kC - a.
Day 7: Trapping Rain Water (LeetCode 42)
Why it's the capstone: This problem is notoriously hard, but the two-pointer solution is elegant. Maintain left and right pointers with running maximums from each side. The water at any position depends on the minimum of the max-left and max-right heights.
Key insight: You can process the pointer on whichever side has the smaller max-height, because you know the water level there is determined by that smaller max (the other side is guaranteed to be at least as tall). This is the Container With Most Water greedy argument taken to the next level.
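A minimal Python sketch of that capstone solution, with running maxima maintained from both sides (the function name is mine):

```python
def trap(height):
    """Two pointers with running maxima; process the side with the smaller bar."""
    left, right = 0, len(height) - 1
    max_left = max_right = 0
    water = 0
    while left < right:
        if height[left] < height[right]:
            max_left = max(max_left, height[left])
            water += max_left - height[left]    # level fixed by the left max
            left += 1
        else:
            max_right = max(max_right, height[right])
            water += max_right - height[right]  # level fixed by the right max
            right -= 1
    return water

print(trap([0, 1, 0, 2, 1, 0, 1, 3, 2, 1, 2, 1]))  # → 6
```

The hedge that makes this safe: whenever the left bar is the shorter one, some bar on the right is at least as tall, so the water above the current left position is determined by max_left alone - the same "the taller side can't constrain you" argument as Container With Most Water.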
The Comeback Interview
Three weeks after my Apple rejection, I had an interview at a mid-size tech company. The first round was a phone screen with a coding problem: given a sorted array of integers, find all unique triplets that sum to zero.
3Sum. I recognized it in about four seconds.
I talked through the approach before writing code: sort the array (already sorted in this case), iterate through with an outer loop, use two pointers for the inner search, skip duplicates at both levels. I explained why the two-pointer inner search was correct - because the subarray is sorted, converging from both ends with the sum comparison guarantees we check all valid pairs. I handled the edge cases: arrays with fewer than three elements, all zeros, all negative numbers.
I coded it in twelve minutes. Clean. No bugs on the first run. The interviewer asked about time complexity: O(n^2), because the outer loop is O(n) and the inner two-pointer search is O(n) for each iteration. Space is O(1) if we don't count the output array, or O(n) for sorting depending on the algorithm.
Then she asked a follow-up: "Can you find all quadruplets that sum to a target?" I smiled. Same pattern - two nested outer loops, two pointers for the inner search. O(n^3) time. I coded it in another ten minutes.
I got the offer the next day.
The difference between my first four interviews and this one wasn't that I'd become a better programmer. I wrote the same quality code. The difference was that I could see the pattern. The problem said "sorted array" and "find combinations that sum to X," and my brain immediately reached for two pointers instead of hash maps or brute force.
That's what pattern recognition gives you. Not the ability to solve problems you've never seen before through pure genius - but the ability to recognize that the problem you've never seen before is structurally identical to one you've solved ten times.
Two pointers isn't a trick. It's a lens. Once you have it, you see it everywhere. And you'll wonder how you ever missed it.
FAQ
When should I use two pointers vs a hash map?
Use two pointers when the input is sorted or when sorting is acceptable. Two pointers gives O(n) time and O(1) space, while a hash map gives O(n) time but O(n) space. If the problem asks for constant space or the input is already sorted, two pointers wins. For unsorted Two Sum, use a hash map. For sorted Two Sum II, use two pointers. Know both and choose based on constraints.
How do I know if a problem is a two-pointer problem?
Look for these signals: sorted input, finding pairs that satisfy a condition, symmetry or palindromes, cycle detection, in-place rearrangement, or subarray/substring searches. Constraint hints help too - if expected time is O(n) on a sorted array, two pointers is almost certainly involved. Keywords like "in-place" or "constant extra space" are strong hints.
What's the difference between slow/fast pointers and left/right pointers?
Left/right pointers (also called converging pointers) start at opposite ends of a data structure and move toward each other. They're used for sorted array searches, palindrome checks, and container problems. Slow/fast pointers (also called the tortoise and hare) start at the same position and move in the same direction at different speeds - typically one step and two steps at a time. They're used for cycle detection in linked lists, finding the middle of a linked list, and problems like Happy Number where you need to detect repetition in a sequence. The two variants solve fundamentally different categories of problems, so recognizing which one to use is part of the pattern matching skill.
Can two pointers work on unsorted arrays?
Yes, but in specific scenarios. Converging pointers generally need sorted input, but read/write pointers work on unsorted arrays (removing values in-place, Dutch National Flag partitioning). Sliding window works on unsorted arrays for subarray-sum or substring problems. Slow/fast pointers work on linked lists regardless of sort order. So "two pointers" as a broad category applies to unsorted data - just not the converging-from-both-ends variant.
Building pattern recognition for coding interviews takes deliberate practice. Explore LeetCode pattern guides on Levelop or learn when to apply BFS vs DFS in graph problems.