HTTP caching mechanism and principle
A comprehensive understanding of the mechanism and principles of HTTP caching
- Before introducing HTTP caching, let's briefly cover HTTP messages as background knowledge.
- An HTTP message is the block of data exchanged when the browser communicates with the server.
The browser requests data from the server by sending a request message; the server returns data to the browser by sending a response message.
A message consists of two main parts:
- 1. Header: attributes and additional information (cookies, cache information, etc.). The cache rule information is carried in the header.
- 2. Body: the part of the data that the HTTP request actually wants to transmit.
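As a quick illustration of the two parts, the following TypeScript sketch (assuming Node.js 18+ with the built-in `fetch`, and a placeholder URL) fetches a resource and prints the cache-related header fields separately from the body:

```typescript
// Inspect the two parts of a response message: the header (where the cache
// rule information lives) and the body (the data actually transmitted).
async function inspectMessage(): Promise<void> {
  const response = await fetch("https://example.com/"); // placeholder URL

  // Header part: cache rule information is carried in fields like these.
  console.log("Cache-Control:", response.headers.get("cache-control"));
  console.log("Expires:      ", response.headers.get("expires"));
  console.log("ETag:         ", response.headers.get("etag"));
  console.log("Last-Modified:", response.headers.get("last-modified"));

  // Body part: the data the request actually wanted to transmit.
  const body = await response.text();
  console.log("Body length:", body.length);
}

inspectMessage().catch(console.error);
```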
- Analysis of cache rules
- For convenience, let's assume the browser has a cache database for storing cached information.
- When the client requests data for the first time, there is no matching entry in the cache database, so the request must go to the server; once the server responds, the returned data is stored in the cache database.
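To make the "cache database" idea concrete, here is a minimal TypeScript sketch (purely illustrative: the `cacheDatabase` map and `requestWithCache` helper are invented for this example, not a real browser API):

```typescript
// A toy model of the browser's cache database: on the first request there is
// no entry, so we go to the server and then store what it returned.
interface CacheEntry {
  body: string;
  storedAt: number; // when the entry was written (ms since epoch)
}

const cacheDatabase = new Map<string, CacheEntry>(); // keyed by URL

async function requestWithCache(url: string): Promise<string> {
  const hit = cacheDatabase.get(url);
  if (hit) {
    return hit.body; // later requests can be served from the cache database
  }

  // First request: nothing cached yet, so ask the server...
  const response = await fetch(url);
  const body = await response.text();

  // ...and store the returned data in the cache database.
  cacheDatabase.set(url, { body, storedAt: Date.now() });
  return body;
}
```

Real browsers layer the cache rules described below on top of this basic store-and-look-up idea.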
- There are many HTTP caching rules. Classified by whether a request must be re-sent to the server, they fall into two categories: mandatory caching and comparison caching.
- Before introducing the two rules in detail, the sequence diagrams give a rough picture of how each one works.
- When cached data already exists and only the mandatory cache applies, the process of requesting data is as follows.
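On the server side, the mandatory cache is enabled simply by attaching the rule headers to the response. A minimal sketch, assuming Node.js's built-in `http` module (the port and the 60-second lifetime are arbitrary choices for the example):

```typescript
import { createServer } from "node:http";

// The response carries mandatory-cache rules: until they expire, the browser
// can read this resource from its cache without contacting the server again.
const server = createServer((req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/plain",
    // HTTP/1.1 rule: the cached copy is valid for 60 seconds.
    "Cache-Control": "max-age=60",
    // HTTP/1.0 fallback: an absolute expiry time.
    Expires: new Date(Date.now() + 60_000).toUTCString(),
  });
  res.end("hello, cache");
});

server.listen(3000, () => console.log("listening on http://localhost:3000"));
```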
- When cached data already exists and only the comparison cache applies, the process of requesting data is as follows.
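And a matching sketch of the comparison cache (again assuming Node.js's built-in modules; the payload and port are illustrative): every request reaches the server, which compares the identifier sent by the browser against the current resource and answers 304 Not Modified when nothing has changed.

```typescript
import { createServer } from "node:http";
import { createHash } from "node:crypto";

const payload = "hello, cache";
// Identifier for the current version of the resource.
const etag = `"${createHash("sha1").update(payload).digest("hex")}"`;

const server = createServer((req, res) => {
  if (req.headers["if-none-match"] === etag) {
    // The browser's copy is still valid: answer 304 with no body,
    // so the browser keeps using the data from its cache database.
    res.writeHead(304, { ETag: etag });
    res.end();
    return;
  }

  // First visit, or the resource changed: send the full data plus its identifier.
  res.writeHead(200, { "Content-Type": "text/plain", ETag: etag });
  res.end(payload);
});

server.listen(3001, () => console.log("listening on http://localhost:3001"));
```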
- Readers unfamiliar with the caching mechanism may ask: in the comparison cache flow, a request is sent to the server whether or not the cache is used, so why bother with caching at all?
- Let's set this question aside for now; the answer will become clear when each caching rule is introduced in detail later.
- We can now see the difference between the two types of caching rules: when the mandatory cache takes effect, no interaction with the server is needed, whereas the comparison cache always interacts with the server, whether it hits or not.
- The two types of cache rules can exist at the same time, and the mandatory cache has higher priority than the comparison cache. That is, when the mandatory cache rule is evaluated and the cache is still valid, the cached copy is used directly and the comparison cache rule is never executed.
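To show that priority concretely, here is a small TypeScript sketch (purely illustrative; `CachedResponse` and `decide` are invented names, and this is a simplification of what browsers really do):

```typescript
// Simplified decision logic: the mandatory cache is evaluated first, and the
// comparison cache is only consulted once the mandatory rule has expired.
interface CachedResponse {
  body: string;
  storedAt: number;   // when the response was cached (ms since epoch)
  maxAgeSec?: number; // mandatory cache rule, from Cache-Control: max-age
  etag?: string;      // comparison cache rule, from ETag
}

type Decision =
  | { action: "use-cache" }                        // mandatory cache hit
  | { action: "revalidate"; ifNoneMatch: string }  // ask the server with If-None-Match
  | { action: "fetch" };                           // no usable cache rule

function decide(entry: CachedResponse | undefined, now: number): Decision {
  if (!entry) return { action: "fetch" };

  // Step 1: mandatory cache. While it is still valid, use the cache directly
  // and never execute the comparison cache rule.
  if (entry.maxAgeSec !== undefined && now - entry.storedAt < entry.maxAgeSec * 1000) {
    return { action: "use-cache" };
  }

  // Step 2: comparison cache. The mandatory rule has expired (or is absent),
  // so ask the server whether the cached copy is still usable.
  if (entry.etag !== undefined) {
    return { action: "revalidate", ifNoneMatch: entry.etag };
  }

  return { action: "fetch" };
}
```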