ASP.NET caching methods and best practices

  • 2020-05-10 17:57:47
  • OfStack

Of all the features of ASP.NET, caching has the greatest potential impact on application performance. With caching and related mechanisms, ASP.NET developers can accept the extra overhead of building a site with expensive controls (DataGrid, for example) without worrying too much about the impact on performance. To make the most of caching in your application, you should consider ways to implement it at every layer of the program.
Implementation
To implement page output caching, simply add an OutputCache directive to the page.
<%@ OutputCache Duration="60" VaryByParam="*" %>
Like other page directives, this directive should appear at the top of the ASPX page, before any output. It supports five attributes (parameters), two of which are required.
Duration
Required attribute. The time, in seconds, that the page should be cached. It must be a positive integer.
Location
Specifies where the output should be cached. If specified, the value must be one of the following options: Any, Client, Downstream, None, Server, or ServerAndClient.
VaryByParam
Required attribute. The names of the Request variables (query string or form values) that should generate separate cache entries. "none" means no variation; "*" creates a new cache entry for every different combination of variables. Separate variable names with ";".
VaryByHeader
Changes the cache entry based on changes in the specified header.
VaryByCustom
Allows you to specify a custom variation, handled in global.asax (for example, "Browser").
Most cases can be handled with a combination of the required Duration and VaryByParam attributes. For example, if your product catalog allows users to view catalog pages based on categoryID and page variables, you can cache the product catalog for a period of time (one hour is acceptable if the products do not change constantly, so Duration is 3600 seconds) using a VaryByParam value of "categoryID;page". This creates a separate cache entry for every catalog page of every category, and each entry lasts one hour from its first request.
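For the catalog scenario just described, the directive might look like the following (a sketch; the categoryID and page names are assumed to match the query string parameters the page actually uses):
<%@ OutputCache Duration="3600" VaryByParam="categoryID;page" %>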
VaryByHeader and VaryByCustom are primarily used to customize the appearance or content of a page based on the client that is accessing it. The same URL may need to render output for both browser and mobile phone clients, so different content versions need to be cached for different clients. Alternatively, the page might be optimized for IE but need to degrade gracefully for Netscape or Opera (rather than simply breaking). The latter example is very common, so we'll provide an example of how to do this:
Example: using VaryByCustom to support browser customization
To give each browser a separate cache entry, the value of VaryByCustom can be set to "browser". This functionality is already built into the caching module, which will store a separate cached version of the page for each browser name and major version.
<%@ OutputCache Duration="60" VaryByParam="None" VaryByCustom="browser" %>
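The built-in "browser" value works without any additional code. For any other VaryByCustom value, you must override GetVaryByCustomString in global.asax and return a string that identifies the variation. A minimal sketch, assuming a hypothetical "isAuthenticated" value that caches separate copies for logged-in and anonymous users:
public override string GetVaryByCustomString(HttpContext context, string custom)
{
    // One cache entry is kept per distinct string returned here.
    if (custom == "isAuthenticated")
    {
        return context.Request.IsAuthenticated ? "auth" : "anon";
    }
    return base.GetVaryByCustomString(context, custom);
}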
Fragment caching, user control output caching
Caching an entire page is often not feasible because parts of the page are customized for the user. However, the rest of the page is often common to the entire application, and those sections are best suited to caching with fragment caching and user controls. Menus and other layout elements, especially those generated dynamically from a data source, should also be cached this way. If needed, cached controls can be configured to vary based on changes to their controls (or other properties) or on any of the other variations supported by page-level output caching. Hundreds of pages that use the same set of controls can also share the cached entries for those controls, instead of keeping a separate cached version for every page.

Implementation
Fragment caching uses the same syntax as page-level output caching, but applies to user controls (.ascx files) rather than Web Forms (.aspx files). With the exception of the Location attribute, user controls support all the attributes that OutputCache supports on a Web Form. User controls also support an OutputCache attribute called VaryByControl, which varies the control's cache based on the value of a member of the user control (usually a control on the page, for example a DropDownList). If VaryByControl is specified, VaryByParam can be omitted. Finally, by default, each user control on each page is cached separately. However, if a user control does not vary between pages of the application and uses the same name on all pages, the Shared="true" parameter can be applied, which makes the cached version of the user control available to all pages that reference the control.
Examples
<%@ OutputCache Duration="60" VaryByParam="*" %>
The example caches the user control for 60 seconds and creates a separate cache entry for each change in the query string, for each page on which the control resides.
<%@ OutputCache Duration="60" VaryByParam="none" VaryByControl="CategoryDropDownList" %>
The example caches the user control for 60 seconds and creates a separate cache entry for each different value of the CategoryDropDownList control, for each page on which the control resides.
<%@ OutputCache Duration="60" VaryByParam="none" VaryByCustom="browser" Shared="true" %>
Finally, this example caches the user control for 60 seconds and creates one cache entry for each browser name and major version. Each browser's cache entry is then shared by all the pages that reference the user control (as long as all the pages reference the control with the same ID).
Page-level and user-control-level output caching is certainly a quick and easy way to improve site performance, but in ASP.NET the real flexibility and power of caching is provided by the Cache object. With the Cache object you can store any serializable data object and control how the cache entry expires based on a combination of one or more dependencies. These dependencies can include the time elapsed since the item was cached, the time elapsed since the item was last accessed, changes to files and/or folders, changes to other cached items and, with a little extra work, changes to specific tables in the database.
Storing data in the Cache
The easiest way to store data in the Cache is simply to assign it a value using a key, just as you would with a HashTable or Dictionary object:
Cache [" key "] = "value";
This will store the item in the cache without any dependencies, so it will not expire unless the caching engine removes it to make room for other cached data. To include specific cache dependencies, use the Add() or Insert() methods. Each of these methods has several overloads. The only difference between Add() and Insert() is that Add() returns a reference to the cached object, while Insert() returns no value (it is void in C#, a Sub in VB).
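As a rough illustration of the difference (myData is an assumed variable holding the value to cache):
// Add() requires the full argument list and returns an object reference
// (the item already stored under that key, if any).
object existing = Cache.Add("key", myData, null,
    System.Web.Caching.Cache.NoAbsoluteExpiration,
    System.Web.Caching.Cache.NoSlidingExpiration,
    System.Web.Caching.CacheItemPriority.Default, null);

// Insert() has several smaller overloads and returns nothing.
Cache.Insert("key", myData);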
Examples
Cache. Insert (" key myXMLFileData, new
System. Web. Caching. CacheDependency (Server MapPath (" users. xml ")));
This example inserts the xml data from a file into the cache so that it does not have to be read from the file on later requests. The purpose of the CacheDependency is to make the cache entry expire as soon as the file changes, so that the latest data can be read from the file and cached again. If the cached data comes from several files, you can also specify an array of file names.
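For example, if the cached value is assembled from more than one file, the dependency can be built over an array of paths (a sketch; the second file name is an assumption):
Cache.Insert("key", myCombinedData,
    new System.Web.Caching.CacheDependency(
    new string[] { Server.MapPath("users.xml"), Server.MapPath("roles.xml") }));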
Cache. Insert (" dependentkey myDependentData, new
System. Web. Caching. CacheDependency (new string [] {}, new string []
{" key "}));
This example inserts a second block of data that depends on the presence of the first block, the one cached under "key". The entry for "dependentkey" expires if there is no item named "key" in the cache, or if the item associated with that key has expired or been updated.
Cache. Insert (" key, "myTimeSensitiveData null,
DateTime. Now. AddMinutes (1), TimeSpan. Zero);
Absolute expiration: this example caches time-sensitive data for one minute, after which the entry expires. Note that absolute expiration and sliding expiration (see below) cannot be used together.
Cache. Insert (" key, "myFrequentlyAccessedData null,
System. Web. Caching. Cache. NoAbsoluteExpiration,
TimeSpan. FromMinutes (1));
Sliding expiration: this example caches frequently used data. The data remains in the cache indefinitely, unless it goes unreferenced for one minute. Note that sliding expiration and absolute expiration cannot be used together.

More options
In addition to the dependencies mentioned above, we can also specify a priority (Low, Normal, High, NotRemovable and others, defined in the System.Web.Caching.CacheItemPriority enumeration) and a CacheItemRemovedCallback function to be called when the cached item expires. Most of the time the default priority is sufficient - the caching engine does its job properly and handles cache memory management. The CacheItemRemovedCallback option opens up some interesting possibilities, but it is actually rarely used. However, to illustrate the approach, I'll provide an example of its use:
CacheItemRemovedCallback sample

System.Web.Caching.CacheItemRemovedCallback callback =
    new System.Web.Caching.CacheItemRemovedCallback(OnRemove);
Cache.Insert("key", myFile, null,
    System.Web.Caching.Cache.NoAbsoluteExpiration,
    TimeSpan.Zero,
    System.Web.Caching.CacheItemPriority.Default, callback);
...
public static void OnRemove(string key,
    object cacheItem,
    System.Web.Caching.CacheItemRemovedReason reason)
{
    AppendLog("The cached value with key '" + key +
        "' was removed from the cache.  Reason: " +
        reason.ToString());
}
This example uses whatever logic is defined in the AppendLog() method (not shown here; see Writing Entries to Event Logs) to record why the cached data expired. By logging items as they are removed from the cache, together with the reason for removal, you can determine whether the cache is being used effectively or whether you may need to increase the memory on the server. Note that the callback is a static (Shared in VB) method; this is recommended because otherwise the instance of the class holding the callback would be kept in memory just to support the callback (which is not necessary for static/Shared methods).
One potential use of this feature is to refresh cached data in the background so that users never have to wait for the data to be populated while the data stays relatively fresh. In practice, however, this does not work well with the current version of the cache API, because the callback is not triggered, and does not complete, until the cached item has already been removed from the cache. As a result, users will frequently make requests that find the cache empty and have to wait while it is repopulated. I would like to see an additional callback in a future version of ASP.NET, perhaps called CachedItemExpiredButNotRemovedCallback, which, if defined, would be executed before the cached item is removed.
Cache data reference pattern
Whenever we try to access data in the cache, we should consider the possibility that it may no longer be there. Therefore, the following pattern should apply universally to your access to cached data. In this case, we assume that the cached data is a DataTable.
 
  public DataTable GetCustomers(bool BypassCache) 
  { 
  string cacheKey = "CustomersDataTable"; 
  object cacheItem = Cache[cacheKey] as DataTable; 
  // Repopulate when the caller bypasses the cache or the entry has expired.
  if ((BypassCache) || (cacheItem == null)) 
  { 
  cacheItem = GetCustomersFromDataSource(); 
  Cache.Insert(cacheKey, cacheItem, null, 
  DateTime.Now.AddSeconds(GetCacheSecondsFromConfig(cacheKey)), 
  TimeSpan.Zero); 
  } 
  return (DataTable)cacheItem; 
  }   


There are several points to note about this pattern:
• Some values (for example, cacheKey, cacheItem, and the cache duration) are defined once and only once.
• The cache can be skipped as needed: for example, when a new customer has just been registered and the user is redirected to the customer list, it is probably best to skip the cache and repopulate it with the latest data, including the newly inserted customer.
• The cache is accessed only once. This improves performance and ensures that no NullReferenceException occurs because the item was present at the first check but expired before the second.
• The pattern uses strong type checking. The "as" operator in C# attempts to cast an object to a type and simply returns null if the cast fails or the object is null.
• The duration is stored in a configuration file. Ideally, all cache dependencies (whether file-based, time-based, or of another type) should be stored in a configuration file, so that changes can be made and performance measured easily. I also recommend specifying a default cache duration and having GetCacheSecondsFromConfig() use that default when no duration is specified for the cacheKey being used (a sketch of such a helper follows below).
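The GetCacheSecondsFromConfig() helper itself is not shown here. A minimal sketch, assuming durations are stored in appSettings under hypothetical keys such as "CacheSeconds.CustomersDataTable" with a "CacheSeconds.Default" fallback:
private static int GetCacheSecondsFromConfig(string cacheKey)
{
    // Look for a key-specific duration first, then fall back to the default.
    string value = System.Configuration.ConfigurationManager.AppSettings["CacheSeconds." + cacheKey];
    if (value == null)
    {
        value = System.Configuration.ConfigurationManager.AppSettings["CacheSeconds.Default"];
    }
    int seconds;
    return int.TryParse(value, out seconds) ? seconds : 60; // last-resort default of 60 seconds
}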
The related code download includes a helper class that handles all of the above while allowing cached data to be accessed in just one or two lines of code. Please download CacheDemos.msi.
Summary
Caching can improve application performance so much that it should be considered when designing and testing applications. Applications will always benefit more or less from caching, although some applications are better suited to caching than others. A deep understanding of the caching options offered by ASP.NET is an important skill for any ASP.NET developer.

Cache early; cache often
You should implement caching at each layer of your application. Add caching support to the data tier, business logic tier, UI, or output tier. Memory is now very cheap -- so you can get a big performance boost by implementing caching across your application in a smart way.
Caching can mask many errors
Caching is a way to get "good enough" performance without a lot of time and analysis. Again, memory is now very cheap, so if you can get the desired performance by caching output for 30 seconds instead of spending a day or even a week trying to optimize the code or the database, you will definitely choose the caching solution (assuming 30-second-old data is acceptable). Caching is one of those features that gives you 80% of the benefit for 20% of the effort, so to improve performance, think of caching first. However, a bad design can still have bad consequences, so you should certainly try to design the application correctly. But if you need good enough performance right away, caching is your best bet, and you can redesign the application later when you have time.
Page-level output caching
As the simplest form of caching, output caching simply keeps in memory a copy of the HTML sent in response to a request. The cached output is served for subsequent requests until the cache expires, which can yield a very large performance gain (depending on how much overhead is needed to create the original page output; sending cached output is always fast and fairly constant).
