Written by: Iqbal Khan, president and technology evangelist, Alachisoft
The old adage 'time is money' is especially true for today's retailers, who rely on advanced information systems and server farms. As a system's response time improves, so do a retailer's productivity and revenue. Unfortunately, rapid growth in users and transactions works against these ends.
For example, consider payment processing and POS systems. With a payment processing system, retailers have a short, fixed window of time during the night to process customer payments and transfer funds. With tens of millions of customers, completing all of these payments within that window becomes a major challenge.
To alleviate the problem, retailers try to add more payment processing servers, but architectural constraints prevent them from adding database servers proportionately. The database therefore becomes a scalability bottleneck, and adding more payment processing servers only makes matters worse.
POS systems face similar pressure to process customer purchases quickly. As the number of POS terminals grows, retailers add more back-end servers to handle the extra requests, but again they cannot add database servers proportionately due to architectural constraints in their system. Very soon they cannot scale any further, and the entire system grinds to a halt during peak hours.
Ideally, retailers want to be able to scale a retail system simply by adding more servers. To do this, they need to incorporate a distributed cache into their application's architecture.
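Scaling out this way works because every application server agrees on which cache node holds a given key. A minimal sketch of that mapping is shown below; the node names are hypothetical, and this simple modulo scheme is only illustrative, since production distributed caches typically use consistent hashing or a partition map so that adding a node does not remap most keys:

```python
import hashlib

def pick_node(key: str, nodes: list[str]) -> str:
    """Deterministically map a cache key to one node in the cluster,
    so every application server looks for the key in the same place."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return nodes[int(digest, 16) % len(nodes)]

# Hypothetical three-node cache cluster.
nodes = ["cache-1", "cache-2", "cache-3"]
node = pick_node("customer:42", nodes)
# Every caller computes the same node for "customer:42",
# so cached data is found no matter which app server asks.
```

Adding a fourth node increases total cache capacity and throughput; the trade-off of plain modulo hashing is that it reshuffles most keys on membership change, which is exactly why real products favor consistent hashing.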
Payment processing, POS systems, and other retail applications can speed up data access by fetching information from a distributed cache rather than going to the database every time. Caching is the practice of storing frequently used data close to the application, in memory, as objects. Retrieving data from memory is far faster and more efficient than retrieving it from a database, so applications that augment the database with a cache become faster and handle considerably more transactions.
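The read path just described is commonly called the cache-aside pattern: check the in-memory cache first and go to the database only on a miss. The sketch below illustrates the idea under simplifying assumptions (the "database" is a plain dictionary and the class name is invented for this example, not any particular product's API):

```python
import time

class CacheAsideStore:
    """Illustrative cache-aside reader: serve from memory when possible,
    fall back to the (slow) database only on a miss or expired entry."""

    def __init__(self, database, ttl_seconds=300):
        self.database = database      # any dict-like backing store
        self.ttl = ttl_seconds        # how long a cached entry stays fresh
        self.misses = 0               # counts actual database trips
        self._cache = {}              # key -> (value, expiry timestamp)

    def get(self, key):
        entry = self._cache.get(key)
        if entry is not None:
            value, expires_at = entry
            if time.time() < expires_at:
                return value          # cache hit: no database trip
            del self._cache[key]      # entry expired; refresh below
        self.misses += 1
        value = self.database[key]    # cache miss: one database trip
        self._cache[key] = (value, time.time() + self.ttl)
        return value

# Usage: the database is stubbed as a dict for this sketch.
db = {"customer:42": {"name": "Acme Corp", "balance": 1200}}
store = CacheAsideStore(db)
first = store.get("customer:42")   # miss: loaded from the database
second = store.get("customer:42")  # hit: served from memory
```

Repeated reads of hot keys, such as customer records during a nightly payment run, hit memory instead of the database, which is where the reduction in database trips comes from.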
Distributed caching provides a major performance and scalability boost by reducing expensive database trips. Even with an efficient database, a typical database trip is 10-100 times slower than accessing an in-memory cache; a distributed cache usually responds in under a millisecond. By dramatically cutting down on database trips and their cost in time, a retailer achieves substantially quicker response times and can handle more customers.
Payment processing is a good example of 'time is money.' Caching lets retailers process more payments per hour, which matters because they have a fixed window of time (e.g., from 10 pm to 6 am) in which to complete all their payments. Payments that miss that window must wait until the same time the next day, delaying the transfer of millions of dollars into the retailer's bank. At a minimum, the retailer loses interest on that money, and it will likely incur other costs and monetary losses by failing to complete the transfer of those funds on time.