How to use caching in ASP.NET to speed up your site


The advantage of output caching and fragment caching is that they are very easy to implement, and in most cases they are sufficient. The caching API, on the other hand, provides considerably more flexibility and can be used to take advantage of caching at every level of an application. This article introduces the use of these three caching techniques at each layer of a system.

Of the many features offered by ASP.NET, cache support is undoubtedly my favorite, and with good reason. Caching has a greater potential impact on application performance than any other ASP.NET feature. With caching and related mechanisms, ASP.NET developers can accept the extra overhead of building pages with expensive controls (for example, the DataGrid) without worrying too much about performance. To make the most of caching in your application, you should consider implementing it at every level of the program.
Steve's caching tips
Cache early; cache often
You should implement caching at every level of your application. Add caching support to the data layer, business logic layer, UI, or output layer. Memory is now very cheap - so you can get a big performance boost by implementing caching in an intelligent way across the entire application.
Caching can cover up many mistakes
Caching is a way to get "good enough" performance without a lot of time and analysis. Again, memory is now very cheap, so if you can get the performance you need by caching output for 30 seconds instead of spending a day or even a week trying to optimize the code or the database, choose the caching solution (assuming 30-second-old data is acceptable). Caching is one of those features that gives you 80% of the benefit for 20% of the effort, so to improve performance it should come to mind first. However, a bad design can still end up having bad consequences, so of course you should also try to design the application correctly. But if you simply need good enough performance right away, caching is your best option, and you can redesign the application later when you have time.
Page-level output caching
In its simplest form, the output cache simply keeps in memory a copy of the HTML sent in response to a request. Subsequent requests are served from the cached output until the cache expires, which can improve performance dramatically (depending on how much overhead was needed to create the original page output; sending cached output is always fast and fairly constant).
implementation
To implement page output caching, simply add an OutputCache directive to the page.
< %@ OutputCache Duration="60" VaryByParam="*" % >
Like other page directives, this directive should appear at the top of the ASPX page, before any output. It supports five attributes (or parameters), two of which are required.
Instead of repeating this directive on every page, the setting can be defined once as a cache profile in web.config (under the <system.web> section):

<caching>
  <outputCacheSettings>
    <outputCacheProfiles>
      <add name="Cache30Seconds" duration="30" varyByParam="none" />
    </outputCacheProfiles>
  </outputCacheSettings>
</caching>


The reference code in the page is as follows:

<%@ OutputCache CacheProfile="Cache30Seconds" %>

Duration (required). The time, in seconds, that the page should be cached. It must be a positive integer.
Location. Specifies where the output should be cached. If specified, it must be one of: Any, Client, Downstream, None, Server, or ServerAndClient.
VaryByParam (required). The names of the Request variables that should produce separate cache entries. "none" means no variation; "*" creates a new cache entry for every different combination of variables. Separate variable names with ";".
VaryByHeader. Varies the cache entry based on changes in the specified header.
VaryByCustom. Allows a custom variation to be specified in global.asax (for example, "browser"); a sketch of how other values are handled appears after the browser example below.
Most cases can be handled with the required combination of the Duration and VaryByParam options. For example, if your product catalog allows users to view catalog pages based on categoryID and page variables, you can cache the catalog for a period of time using a VaryByParam value of "categoryID;page" (an hour is acceptable if the products do not change constantly, so the Duration would be 3600 seconds). This creates a separate cache entry for each catalog page of each category, and each entry lasts one hour from its first request.
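For that scenario, the directive would look something like this (the categoryID and page variable names are taken from the example above):

<%@ OutputCache Duration="3600" VaryByParam="categoryID;page" %>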
VaryByHeader and VaryByCustom are primarily used to customize the appearance or content of a page based on the client accessing it. The same URL may need to render output for both browser and mobile-phone clients, so different versions of the content need to be cached for different clients. Alternatively, the page may be optimized for IE but need to degrade gracefully for Netscape or Opera. The latter case is very common, and here is an example of how to achieve it:
Example: using VaryByCustom to support browser customization
To give each browser a separate cache entry, the value of VaryByCustom can be set to "browser". This functionality is already built into the cache module, and it will insert a separate cached version of the page for each browser name and major version.
< %@ OutputCache Duration="60" VaryByParam="None" VaryByCustom="browser"% >
Fragment cache, user control output cache
More options



In addition to the dependencies mentioned above, we can also specify a priority (such as Low, High, or NotRemovable, defined in the System.Web.Caching.CacheItemPriority enumeration) and a CacheItemRemovedCallback function to be called when the object is removed from the cache. Most of the time the default priority is sufficient - the caching engine does its job properly and handles memory management of the cache. The CacheItemRemovedCallback option opens up some interesting possibilities, but it is rarely used. However, to illustrate the technique, I will provide an example of its use:

CacheItemRemovedCallback sample
 
// Register a callback that runs when the cached item is removed.
System.Web.Caching.CacheItemRemovedCallback callback =
    new System.Web.Caching.CacheItemRemovedCallback(OnRemove);
Cache.Insert("key", myFile, null,
    System.Web.Caching.Cache.NoAbsoluteExpiration,
    TimeSpan.Zero,
    System.Web.Caching.CacheItemPriority.Default, callback);
. . .
// Called by the cache engine whenever the item is removed; logs the reason.
public static void OnRemove(string key, object cacheItem,
    System.Web.Caching.CacheItemRemovedReason reason)
{
    AppendLog("The cached value with key '" + key +
        "' was removed from the cache. Reason: " + reason.ToString());
}

This example uses whatever logic is defined in the AppendLog() method to record why data expired from the cache. By logging items as they are removed, along with the reason for the removal, you can determine whether the cache is being used effectively or whether you may need to add memory to the server. Note that the callback is a static (Shared in VB) method; this is recommended because, otherwise, an instance of the class holding the callback would be kept alive in memory just to support the callback (which is not necessary for static/Shared methods).

One potential use of this feature is refreshing cached data in the background so that users never have to wait for the data to be populated, yet the data stays relatively fresh. In practice, however, this does not work with the current version of the cache API, because the callback is not triggered until after the cached item has been removed from the cache. As a result, users will frequently make a request, find that the cache value is empty, and have to wait for it to be repopulated. I would like to see an additional callback in a future version of ASP.NET, perhaps called CachedItemExpiredButNotRemovedCallback, which, if defined, must complete before the cache entry is removed.

Cache data reference pattern

Whenever we try to access data in the cache, we should allow for the possibility that the data is no longer there. Therefore, the following pattern should generally be applied when accessing cached data. In this case, we assume the cached data is a DataTable.
 
public DataTable GetCustomers(bool BypassCache)
{
    string cacheKey = "CustomersDataTable";
    // The "as" cast yields null if the entry is missing or of the wrong type.
    DataTable cacheItem = Cache[cacheKey] as DataTable;
    if (BypassCache || (cacheItem == null))
    {
        cacheItem = GetCustomersFromDataSource();
        Cache.Insert(cacheKey, cacheItem, null,
            DateTime.Now.AddSeconds(GetCacheSecondsFromConfig(cacheKey)),
            TimeSpan.Zero);
    }
    return cacheItem;
}

There are several points to note about this pattern:

1) some values (for example, cacheKey, cacheItem, and cache duration) are defined once and only once.

2) you can skip the cache as needed - for example, when a new customer is registered and redirected to the list of customers, it may be best to skip the cache and repopulate the cache with the latest data, including the newly inserted customer.

3) The cache is accessed only once. This improves performance and guards against the NullReferenceException that could occur if the item were present at a first check but expired before a second check.

4) The pattern uses type-safe checking. The "as" operator in C# attempts to cast an object to a type and returns null if the cast fails or if the object itself is null.

5) The duration is stored in a configuration file. Ideally, all cache dependencies (whether file-based, time-based, or of some other type) should be stored in configuration files so that they can be changed and performance can be measured easily. I also recommend specifying a default cache duration and having the GetCacheSecondsFromConfig() method fall back to it when no duration is specified for the cacheKey in use.
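A minimal sketch of what GetCacheSecondsFromConfig() might look like, assuming .NET 2.0's ConfigurationManager and durations kept in appSettings; the "CacheSeconds." key prefix and the 60-second default are illustrative, not part of the original sample:

private static int GetCacheSecondsFromConfig(string cacheKey)
{
    // Looks for an entry such as <add key="CacheSeconds.CustomersDataTable" value="300" />.
    string configValue =
        System.Configuration.ConfigurationManager.AppSettings["CacheSeconds." + cacheKey];
    int seconds;
    if (int.TryParse(configValue, out seconds))
    {
        return seconds;
    }
    return 60; // default duration when no entry exists for this key
}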

The code sample associated with this article (CachedDemo.msi, see the book's sample CD) includes a helper class that handles all of the above and lets you access cached data by writing only one or two lines of code.

summary

Caching can greatly improve the performance of an application, so it should be considered when designing and testing the application. Applications will always benefit more or less from caching, although some applications are better suited to using caching than others. A good understanding of the caching options provided by ASP.NET is an important skill for any ASP.NET developer.


Caching an entire page is often not feasible because parts of the page are customized for the user. However, other parts of the page are shared by the whole application, and those parts are best suited to caching with fragment caching and user controls. Menus and other layout elements, especially ones generated dynamically from a data source, should also be cached this way.

If necessary, separate cached versions of a control can be created based on the following criteria:

(1) a change in any of the control's properties

(2) any of the variations supported by page-level output caching

Once certain controls are cached, the hundreds of pages that use them can share those cached copies instead of each keeping its own separately cached version of the control.

implementation

Fragment caching uses the same syntax as page-level output caching, but it applies to user controls (.ascx files) rather than Web Forms (.aspx files). Apart from the Location attribute, user controls support all of the OutputCache attributes that Web Forms support. User controls also support an OutputCache attribute named VaryByControl, which varies the control's cache based on the value of a member of the user control (usually a control on the page, for example a DropDownList). If VaryByControl is specified, VaryByParam can be omitted. Finally, by default each user control on each page is cached separately. However, if a user control does not vary from page to page within the application and uses the same name on every page, you can set the Shared attribute to "true", which makes the cached version of the user control available to every page that references it.

The sample

< %@ OutputCache Duration="60" VaryByParam="*" % >

This example caches the user control for 60 seconds and creates a separate cache entry for each change in the query string, for each page on which the control resides.

< %@ OutputCache Duration="60" VaryByParam="none"
VaryByControl="CategoryDropDownList" % >

This example caches the user control for 60 seconds and creates a separate cache entry for each distinct value of the CategoryDropDownList control, for each page on which the control resides.

< %@ OutputCache Duration="60" VaryByParam="none" VaryByCustom="browser"
Shared="true" % >

Finally, this example caches the user control for 60 seconds and creates one cache entry for each browser name and major version. Each browser's cache entry is then shared by all of the pages that reference the user control (as long as all the pages reference the control with the same ID).

The Cache API: using the Cache object

Page-level and user-control-level output caching are certainly quick and easy ways to improve site performance, but in ASP.NET the real flexibility and power of caching comes from the Cache object. With the Cache object you can store any serializable data object and control how the cache entry expires based on a combination of one or more dependencies. These dependencies can include the time since the item was cached, the time since the item was last accessed, changes to files or folders, changes to other cached items, and (with a little extra work) changes to specific tables in the database.



Storing data in the Cache

The easiest way to store data in the Cache is to assign it to a key, just as with a HashTable or Dictionary object:

Cache["key"] = "value";

This stores the item in the cache without any dependencies, so it will not expire unless the cache engine removes it to make room for other data. To include specific cache dependencies, use the Add() or Insert() method. Each has several overloads. The only difference between Add() and Insert() is that Add() returns a reference to the cached object, while Insert() has no return value (it is void in C#, a Sub in VB).
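As a rough illustration of the two calls (myData stands in for whatever object you are caching; the five-minute sliding expiration is arbitrary):

// Insert() has a simple two-argument overload and replaces any existing entry.
Cache.Insert("key", myData);

// Add() requires the full argument list and, unlike Insert(), will not
// overwrite an item already stored under the same key.
object previous = Cache.Add("key", myData, null,
    System.Web.Caching.Cache.NoAbsoluteExpiration,
    TimeSpan.FromMinutes(5),
    System.Web.Caching.CacheItemPriority.Default,
    null);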

The sample

Cache.Insert("key", myXMLFileData, new
System.Web.Caching.CacheDependency(Server.MapPath("users.xml")));

This example inserts the XML data from a file into the cache so that it does not have to be read from the file on later requests. The CacheDependency ensures that the cache entry expires immediately when the file changes, so the latest data can be read from the file and re-cached. If the cached data comes from several files, you can specify an array of file names instead.
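For example, a dependency on several files might look like the following sketch (the second file name, roles.xml, is purely illustrative):

string[] files = new string[] {
    Server.MapPath("users.xml"),
    Server.MapPath("roles.xml")
};
// The entry expires as soon as any of the listed files changes.
Cache.Insert("key", myCombinedData, new System.Web.Caching.CacheDependency(files));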

Cache.Insert("dependentkey", myDependentData, new
System.Web.Caching.CacheDependency(new string[] {}, new string[]
{"key"}));

This example inserts a second block of data, with the key "dependentkey", that depends on the previously cached item whose key is "key". The entry for "dependentkey" expires if there is no item named "key" in the cache, or if that item has expired or been updated.

Cache.Insert("key", myTimeSensitiveData, null,
DateTime.Now.AddMinutes(1), TimeSpan.Zero);

Absolute expiration: this example caches time-sensitive data for one minute, after which the entry expires. Note that absolute expiration and sliding expiration (see below) cannot be used together.

Cache.Insert("key", myFrequentlyAccessedData, null,
System.Web.Caching.Cache.NoAbsoluteExpiration,
TimeSpan.FromMinutes(1));

Sliding expiration: this example caches frequently used data. The data stays in the cache indefinitely, unless it goes unreferenced for one minute. Note that sliding expiration and absolute expiration cannot be used together.
