I found this question a good long time ago, but I've only ever used it once, and I'm sorry to say it didn't go well; that's probably on me.
It makes a good whiteboard data-structures problem for candidates who may have stronger back-end skills. An LRU Cache is a data structure with a fixed capacity that culls the "Least Recently Used" item whenever it exceeds that capacity. It's tempting to reach for a stack or queue here, because keeping the items ordered by last access (ordinal or timestamp) pretty much solves the problem, but keeping array-like structures ordered is expensive: moving an item to the front costs O(n) on every set, get, or delete, and re-sorting costs O(n log n). Even if you co-implement a map to make accessing items fast, you still have to re-order the collection because of the LRU rules.
An optimal solution is to use a doubly-linked list combined with a map for internal lookups. Every operation the cache needs is O(1) against the DLL (I know, overloading a TLA); much better than the stack or queue solution!
As a bonus, assuming the whiteboarding goes well and we eventually arrive at the DLL solution, it's a fun, quick follow-up to ask for a function that can render the cache as an array or object, since we are, after all, storing key-value pairs.
The code below is a working implementation that eschews some type checking and extra safeties for comprehensibility, and includes a "toArray" function on the cache that emits an array of key-value pairs in order of most to least recently used.
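A minimal sketch of that shape, written here in TypeScript; the class and helper names (LRUCache, LRUNode, moveToFront, evict) are illustrative rather than definitive, and it keeps the same trade-off of skipping type checks and safeties in favor of readability:

```typescript
// A minimal LRU cache: a doubly-linked list keeps items in recency order
// (head = most recently used, tail = least), and a Map gives O(1) lookup by key.
class LRUNode<K, V> {
  prev: LRUNode<K, V> | null = null;
  next: LRUNode<K, V> | null = null;
  constructor(public key: K, public value: V) {}
}

class LRUCache<K, V> {
  private map = new Map<K, LRUNode<K, V>>();
  private head: LRUNode<K, V> | null = null; // most recently used
  private tail: LRUNode<K, V> | null = null; // least recently used

  constructor(private capacity: number) {}

  get(key: K): V | undefined {
    const node = this.map.get(key);
    if (!node) return undefined;
    this.moveToFront(node); // a read counts as a "use"
    return node.value;
  }

  set(key: K, value: V): void {
    const existing = this.map.get(key);
    if (existing) {
      existing.value = value;
      this.moveToFront(existing);
      return;
    }
    const node = new LRUNode(key, value);
    this.map.set(key, node);
    this.addToFront(node);
    if (this.map.size > this.capacity) this.evict(); // cull the LRU item
  }

  // Emits [key, value] pairs from most to least recently used.
  toArray(): Array<[K, V]> {
    const out: Array<[K, V]> = [];
    for (let node = this.head; node; node = node.next) {
      out.push([node.key, node.value]);
    }
    return out;
  }

  private addToFront(node: LRUNode<K, V>): void {
    node.prev = null;
    node.next = this.head;
    if (this.head) this.head.prev = node;
    this.head = node;
    if (!this.tail) this.tail = node;
  }

  private unlink(node: LRUNode<K, V>): void {
    if (node.prev) node.prev.next = node.next;
    else this.head = node.next;
    if (node.next) node.next.prev = node.prev;
    else this.tail = node.prev;
    node.prev = node.next = null;
  }

  private moveToFront(node: LRUNode<K, V>): void {
    if (node === this.head) return;
    this.unlink(node);
    this.addToFront(node);
  }

  private evict(): void {
    const lru = this.tail;
    if (!lru) return;
    this.map.delete(lru.key);
    this.unlink(lru);
  }
}
```

And a quick usage pass to show the eviction and the follow-up "toArray" in action:

```typescript
const cache = new LRUCache<string, number>(3);
cache.set("a", 1);
cache.set("b", 2);
cache.set("c", 3);
cache.get("a");    // touching "a" makes it the most recently used
cache.set("d", 4); // exceeds capacity, so "b" (the LRU item) is culled
console.log(cache.toArray()); // [["d", 4], ["a", 1], ["c", 3]]
```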
You can see why this might make a better whiteboard question. While it's certainly possible for a good candidate to come up with a solution like this in a typical 30-60 minute phone screen, it's going to leave a lot of dead air while they physically type it all up. I generally prefer questions with terser solutions so I get more time to talk about thought process and code-hardening, but I like that this one combines data structures with real business rules and isn't just "implement a linked list."