A database query typically takes tens of milliseconds; an in-memory cache lookup takes well under a millisecond. That's one to two orders of magnitude faster.
Caching stores frequently accessed data in memory so you don't hit the database repeatedly. If 20% of your data serves 80% of requests (the Pareto principle), caching that 20% dramatically reduces database load.
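As a sketch of the idea above, here is the cache-aside pattern in Python: check memory first, fall back to the data store on a miss, and populate the cache for next time. `query_database` and its 50 ms sleep are hypothetical stand-ins for a real database call:

```python
import time

def query_database(user_id):
    """Hypothetical slow lookup standing in for a real database query."""
    time.sleep(0.05)  # simulate ~50 ms of query latency
    return {"id": user_id, "name": f"user-{user_id}"}

cache = {}  # in-memory cache: user_id -> record

def get_user(user_id):
    # Cache hit: return the record from memory, skipping the database.
    if user_id in cache:
        return cache[user_id]
    # Cache miss: pay the database cost once, then store the result.
    record = query_database(user_id)
    cache[user_id] = record
    return record
```

The first call to `get_user(1)` pays the full database latency; every repeat call is served from the dictionary. Production systems usually swap the dictionary for a shared store like Redis or Memcached and add an expiry policy, but the lookup-then-populate flow is the same.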
Caching appears in almost every system design interview. When your interviewer asks "How would you improve latency?", caching is usually part of the answer.