There are many situations where a solution, instead of fixing the problem it was meant to solve, only magnifies it. Here is one case I came across.
There was an application that queried data from a database. It worked fine until somebody got the idea that caching the database results inside the application would improve its performance manifold. And guess what? They implemented the idea, and now the application is reportedly having unprecedented performance issues.
Caching improves the performance of a system only when a large fraction of lookups hit the cache. In our case, caching degraded the performance of the application as time progressed. The reason? The database contained a large number of tables and records, and the small cache could not hold such a large working set. As a result, the hit rate of the cache was very low, and almost every request paid the cache-lookup and eviction overhead on top of the full database query.
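The arithmetic behind this is simple. A minimal sketch, using hypothetical, illustrative timings (the cost figures below are assumptions, not measurements from the application in question):

```python
# Break-even sketch: average request time with and without a cache.
# All timings are hypothetical, illustrative numbers (milliseconds).

CACHE_LOOKUP_MS = 0.5   # overhead paid on every request, hit or miss
DB_QUERY_MS = 20.0      # cost of going to the database

def avg_time_with_cache(hit_rate: float) -> float:
    """Expected per-request time: every request pays the lookup;
    misses additionally pay the full database query."""
    return CACHE_LOOKUP_MS + (1.0 - hit_rate) * DB_QUERY_MS

if __name__ == "__main__":
    for hit_rate in (0.02, 0.10, 0.50, 0.90):
        print(f"hit rate {hit_rate:.0%}: "
              f"{avg_time_with_cache(hit_rate):.2f} ms "
              f"(no cache: {DB_QUERY_MS:.2f} ms)")
```

With these numbers, a 2% hit rate averages 20.10 ms per request, which is slower than skipping the cache entirely, while a 90% hit rate averages 2.50 ms. The cache only pays off once the hits save more than the per-request overhead costs.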
So when should you use caching in a system?
There are many things to consider before you implement caching in your application. The main points are:
- What are the trade-offs? How much memory should be allotted to the cache? How much processing power will the cache consume? Will these costs themselves degrade performance?
- What will the cache hit rate be? If the hit rate falls below the break-even point, where the time saved on hits no longer outweighs the overhead added to every request, using a cache may in fact reduce the performance of the system.
- Are there other bottlenecks in the system that can be addressed before a cache is implemented? Instead of focusing your efforts on designing and developing a caching system for your application, it may be more sensible to find and remove whatever other bottlenecks are degrading the performance.
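One cheap way to answer the hit-rate question is to measure it before committing to a custom caching layer. A minimal sketch using Python's built-in `functools.lru_cache` (the `fetch_record` function and its simulated table are hypothetical stand-ins for a real database query):

```python
# Sketch: instrument a small LRU cache and measure its hit rate.
# fetch_record and DATABASE are hypothetical stand-ins for real queries.
from functools import lru_cache

DATABASE = {key: f"row-{key}" for key in range(10_000)}  # simulated table

@lru_cache(maxsize=128)  # deliberately small cache, as in the story above
def fetch_record(key: int) -> str:
    return DATABASE[key]  # stands in for a real database round trip

def hit_rate() -> float:
    """Fraction of calls served from the cache so far."""
    info = fetch_record.cache_info()
    total = info.hits + info.misses
    return info.hits / total if total else 0.0

if __name__ == "__main__":
    import random
    random.seed(1)
    # Uniform access over 10,000 keys: a 128-entry cache rarely hits.
    for _ in range(5_000):
        fetch_record(random.randrange(10_000))
    print(f"hit rate: {hit_rate():.1%}")
```

Running a representative access pattern through such an instrumented cache tells you, before any real engineering effort, whether the hit rate will clear the break-even point or whether you are about to repeat the mistake described above.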
I hope these points come in handy for at least some of you. If you have other points to add, please do.