Caching Techniques for Content Delivery
2. Basic Operations of a Web Cache
[Diagram: Client ↔ Cache ↔ Server. On an HTTP GET, the cache checks "Stored?" and "Fresh?": if both, it returns the cached HTTP response; otherwise it forwards the HTTP GET to the server, stores a copy of the response ("Store Copy?"), and relays the HTTP response to the client. If the file was modified, the server supplies the new version.]
Caching works on the principle that the client request can be served from the cache.
Sanjoy Sanyal: http://www.itforintelligentfolks.blogspot.com/
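The hit/miss flow on this slide can be sketched as a minimal in-memory cache. This is an illustrative model, not a real HTTP cache: names are invented, and freshness is approximated by a fixed max-age rather than real HTTP cache-control semantics.

```python
import time

class WebCache:
    """Toy sketch of the slide's flow: Stored? Fresh? -> serve from cache,
    otherwise fetch from the origin server and store a copy."""

    def __init__(self, max_age=60.0):
        self.store = {}          # url -> (response, time stored)
        self.max_age = max_age   # freshness lifetime in seconds (assumption)

    def get(self, url, fetch_from_origin):
        entry = self.store.get(url)
        if entry is not None:
            response, stored_at = entry
            if time.time() - stored_at < self.max_age:  # Stored? Fresh?
                return response, "HIT"                  # served from cache
        response = fetch_from_origin(url)               # HTTP GET to origin
        self.store[url] = (response, time.time())       # store a copy
        return response, "MISS"
```

A second request for the same URL within the freshness window is answered from the cache without contacting the origin at all, which is exactly the principle the slide states.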
5. Placing a Cache in a Network: Forward Proxy
[Diagram: Web Clients on a workgroup LAN → Forward Proxy (Web Cache) → Internet → Web Server.]
A Forward Proxy acts on behalf of content consumers. The request is first sent over the LAN to the Forward Proxy. Network administrators set up a Forward Proxy to help speed up web access for users.
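On the client side, using a forward proxy is a small configuration step; here is a sketch with Python's standard library. The proxy address is hypothetical — in practice a network administrator publishes the real one.

```python
import urllib.request

# Hypothetical LAN proxy address and port; not from the slides.
proxy = urllib.request.ProxyHandler({"http": "http://proxy.lan:3128"})
opener = urllib.request.build_opener(proxy)

# Requests made through this opener go to the forward proxy first,
# which answers from its cache or fetches from the Internet on the
# client's behalf:
# opener.open("http://example.com/")
```

In many deployments this step is invisible to users: the browser picks up the proxy from system settings or the `http_proxy` environment variable instead of per-program code.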
7. Placing a Cache in a Network: Reverse Proxy
[Diagram: Web Clients → Internet → Reverse Proxy (Web Cache) → Web Servers on the server LAN.]
Also called a server accelerator, a Reverse Proxy acts on behalf of the origin server. The request is sent first to the reverse proxy cache. Web farms set up reverse proxies to improve performance & scalability; reverse proxies can also be located remotely, near customers.
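Server-side, the same caching idea can be sketched as a front end that serves hits itself and spreads misses across the web farm. This is a toy model with invented names, not a production proxy; round-robin is assumed as one simple way to get the scalability the slide mentions.

```python
import itertools

class ReverseProxy:
    """Toy server accelerator: every request arrives here first; cache
    hits never reach the farm, and misses rotate across origin servers."""

    def __init__(self, origin_servers):
        self.origins = itertools.cycle(origin_servers)  # round-robin (assumption)
        self.cache = {}

    def handle(self, path):
        if path in self.cache:
            return self.cache[path]      # hit: origin servers stay idle
        origin = next(self.origins)      # miss: pick a backend server
        response = origin(path)
        self.cache[path] = response      # accelerate the next request
        return response
```

Because popular objects are answered by the proxy, the origin servers see only the miss traffic — that is the "server accelerator" effect.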
8. Placing a Cache in a Network: Interception Proxy
[Diagram: Web Clients → ISP Network → Interception Proxy (Web Cache) → Internet → Web Server.]
An Interception Proxy acts on behalf of the carrier of the traffic (the ISP). The request is first sent over the ISP network to the Interception Proxy. ISPs set up Interception Proxies to help speed up web access for customers and reduce wide-area bandwidth costs.
10. Delays in Multimedia Streaming
[Diagram: timeline from the Client Request to "Playback begins", split into the Connection Delay followed by the Buffer Delay.]
Total Delay = Connection Delay + Buffer Delay
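The slide's equation as a one-line computation; the numbers below are illustrative, not from the slides.

```python
def total_delay(connection_delay, buffer_delay):
    """Total Delay = Connection Delay + Buffer Delay (seconds)."""
    return connection_delay + buffer_delay

# e.g. 1.5 s to set up the connection to a distant server,
# plus 4.0 s of buffering before playback can begin:
startup = total_delay(1.5, 4.0)   # startup == 5.5 seconds
```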
11. Fast Prefix Caching
[Diagram: the Client Request goes to the Cache; playback begins from the Stored Prefix while the cache fills its buffer from the Server, so the Connection Delay shrinks.]
Prefix caching reduces the connection delay: the client request is served from previously stored multimedia data. Locating the cache close to the client and employing a fast connection reduces the connection delay. A further advantage: the cache can serve as a splitter for multiple clients, with only one connection to the server.
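A sketch of the prefix idea, assuming the stream is a sequence of chunks and the cache keeps the first few of them (all names and the `prefix_len` parameter are illustrative):

```python
class PrefixCache:
    """Toy prefix cache: playback can begin from the stored prefix while
    the remainder of the stream is fetched from the origin server."""

    def __init__(self, prefix_len=3):
        self.prefix_len = prefix_len
        self.prefixes = {}   # stream id -> first chunks of the stream

    def store_prefix(self, stream_id, chunks):
        self.prefixes[stream_id] = chunks[:self.prefix_len]

    def serve(self, stream_id, fetch_suffix):
        # The prefix is available immediately (no server connection delay);
        # only the suffix still needs the round trip to the server.
        prefix = self.prefixes.get(stream_id, [])
        suffix = fetch_suffix(stream_id, start=len(prefix))
        return prefix + suffix
```

The connection delay to the server is hidden behind playback of the prefix, which is exactly the benefit the slide claims.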
12. Dynamic Caching
[Diagram: the Server streams to Client 1; a Ring Buffer caches the data flowing to Client 1; Client 2, arriving Δ seconds later at time T+Δ, is served from the cache plus a patch.]
How it works: Client 2 requests the same streaming object as Client 1. When Client 2 receives the start of the stream, Client 1 has already received Δ seconds of the stream; the temporal distance between the clients is Δ. The data streamed to Client 1 is cached in the ring buffer to be served to Client 2. Client 2 is served the initial Δ seconds of the stream by a patch from the server, or from a prefix.
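The ring-buffer mechanism can be sketched with a bounded deque. This is an illustrative model with invented names: time is discretized into chunks, and the buffer holds the last Δ chunks streamed to Client 1.

```python
from collections import deque

class RingBufferCache:
    """Toy dynamic cache: a sliding window of the last `delta` chunks sent
    to Client 1, so Client 2 (arriving delta chunks later) is served from
    the buffer instead of opening a second full stream from the server."""

    def __init__(self, delta):
        self.buf = deque(maxlen=delta)   # ring buffer: oldest chunk drops out

    def on_chunk_to_client1(self, chunk):
        self.buf.append(chunk)           # cache data as it flows to Client 1

    def serve_client2(self, patch_from_server):
        # Client 2's first delta chunks arrive as a patch (from the server
        # or a stored prefix); the rest is replayed from the ring buffer.
        return list(patch_from_server) + list(self.buf)
```

Only the Δ-chunk patch costs extra server work; everything after it rides on the single server connection already feeding Client 1.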