Problem: Add Caching Without Touching Core Logic
Problem statement
Your service calls are getting expensive—each function call hits a simulated API and returns identical data for the same parameters. To reduce redundant work, you’ll implement a caching decorator that remembers results for identical inputs.
This is a classic use of the Decorator Pattern: adding performance optimization without modifying the function itself. You’ll apply this decorator to both synchronous and asynchronous functions to demonstrate transparent caching.
Goal
Implement a `withCache(fn)` decorator that:
- Uses an internal `Map` to store results by argument key.
- Returns cached results immediately on repeated calls.
- Logs `[CACHE] Hit` or `[CACHE] Miss` for visibility.
- Works with both sync and async functions.
Apply it to a simulated slow service that delays responses to mimic latency.
The second call with the same input should return instantly (cached).
Constraints
Do not modify the core service function.
Cache keys must be derived from argument values (use `JSON.stringify(args)`).
Preserve async behavior; don't block or break promises.
Return cached results transparently (the caller shouldn’t know it’s cached).
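If you want a nudge before checking the Solution tab, one possible skeleton is sketched below. The log strings come from the sample output; everything else (ordering of the service call and the miss log, caching the promise itself) is one way to satisfy the constraints, not the only way:

```javascript
// A possible sketch of the decorator described above.
function withCache(fn) {
  const cache = new Map(); // key -> result (or pending Promise)
  return function (...args) {
    const key = JSON.stringify(args); // per the constraints
    if (cache.has(key)) {
      console.log("[CACHE] Hit - returning cached result");
      return cache.get(key);
    }
    // Call the wrapped function first so its "[SERVICE] ..." log
    // precedes the miss log, matching the sample output.
    const result = fn.apply(this, args);
    console.log("[CACHE] Miss - storing result");
    // For async functions, `result` is a Promise; storing the Promise
    // itself keeps async behavior intact and lets callers await the
    // cached value transparently.
    cache.set(key, result);
    return result;
  };
}
```

Note that because the cache stores whatever `fn` returns, the same wrapper works for sync functions (plain values) and async functions (promises) without branching on the return type.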
Sample output
The examples below illustrate what the output should look like:
```js
const cachedGetProductDetails = withCache(getProductDetails);

(async () => {
  await cachedGetProductDetails(42); // should fetch
  await cachedGetProductDetails(42); // should use cache
  await cachedGetProductDetails(99); // new fetch
})();

/* Expected output:
[SERVICE] Fetching details for 42...
[CACHE] Miss - storing result
[CACHE] Hit - returning cached result
[SERVICE] Fetching details for 99...
[CACHE] Miss - storing result */
```
Good luck trying the problem! If you’re unsure how to proceed, check the “Solution” tab above.