Zulily LLC, USA.
World Journal of Advanced Research and Reviews, 2025, 26(02), 1860-1871
Article DOI: 10.30574/wjarr.2025.26.2.1809
Received on 01 April 2025; revised on 11 May 2025; accepted on 13 May 2025
Distributed caching represents a critical architectural strategy for enhancing transaction speed in e-commerce environments. This article examines how strategically positioning frequently accessed data across multiple networked nodes significantly reduces latency while decreasing database load. The assessment framework developed for evaluating caching technologies incorporates both quantitative performance metrics and practical implementation considerations specific to e-commerce workloads. Results demonstrate that in-memory solutions consistently outperform disk-based alternatives, with hybrid caching architectures showing superior performance when aligned with specific data requirements. Economic analysis reveals compelling justifications for distributed caching across various business scales, though implementation challenges including consistency management, cold-start phenomena, and security implications must be addressed. Future directions point toward machine learning for predictive caching, edge computing integration, serverless compatibility, and advanced invalidation strategies that promise to further optimize distributed caching capabilities for next-generation e-commerce platforms.
Keywords: Distributed caching; E-commerce performance; Cache coherence; Edge computing; In-memory processing
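The abstract's core claim — that keeping frequently accessed data in memory reduces latency and database load, at the cost of managing consistency and invalidation — can be illustrated with a minimal cache-aside (lazy-loading) sketch. This is an illustrative assumption, not the article's implementation: the `CacheAsideStore` class, the dictionary standing in for a product database, and the TTL value are all hypothetical.

```python
import time

class CacheAsideStore:
    """Minimal cache-aside cache with TTL-based expiry.

    Illustrative sketch only: `backing_db` stands in for an authoritative
    e-commerce database; class and parameter names are assumptions.
    """

    def __init__(self, backing_db, ttl_seconds=60.0):
        self.db = backing_db        # authoritative data source
        self.ttl = ttl_seconds      # entry lifetime before forced refresh
        self._cache = {}            # key -> (value, expiry_timestamp)
        self.hits = 0
        self.misses = 0

    def get(self, key):
        now = time.monotonic()
        entry = self._cache.get(key)
        if entry is not None and entry[1] > now:
            self.hits += 1          # served from memory: no database round-trip
            return entry[0]
        self.misses += 1            # miss or expired entry: fall through to DB
        value = self.db[key]
        self._cache[key] = (value, now + self.ttl)
        return value

    def invalidate(self, key):
        # Explicit invalidation on writes is one of the consistency-management
        # strategies the abstract alludes to.
        self._cache.pop(key, None)

# Hypothetical product lookups backed by a plain dict "database".
db = {"sku-1": {"price": 19.99}, "sku-2": {"price": 5.49}}
store = CacheAsideStore(db)
store.get("sku-1")                  # first read: cache miss, loads from DB
store.get("sku-1")                  # second read: cache hit, DB untouched
print(store.hits, store.misses)     # → 1 1
```

In a real deployment the in-process dictionary would be replaced by a networked in-memory store shared across application nodes, which is what makes the caching "distributed"; the hit/miss accounting shown here is the kind of quantitative metric an evaluation framework would collect.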
Amey Pophali. Distributed caching strategies to enhance E-commerce transaction speed. World Journal of Advanced Research and Reviews, 2025, 26(2), 1860-1871. Article DOI: https://doi.org/10.30574/wjarr.2025.26.2.1809