Retrieving CPU Usage in .NET

Here’s some simple and effective code to fetch the CPU usage / load percentage in either an ASP.NET page or a .NET Windows Forms application. With the Forms application, we’ll set it up to auto-refresh on a short interval, which you can lengthen or shorten as you like.

Some load balancers can use a simple ASP.NET page like this that displays a server’s processor load to help decide which server to shuttle a request to. This is what I have used the simple ASP.NET implementation for.

For each of these implementations, you will need to add a reference to System.Management to your project.

ASP.NET CPU Load Percentage (C#)

using System;
using System.Management;

public partial class cputime : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        ObjectQuery qry = new ObjectQuery("select * from Win32_Processor");
        ManagementObjectSearcher searcher = new ManagementObjectSearcher(qry);

        int load = 0;
        int numCpus = 0;
        foreach (ManagementObject mgmt in searcher.Get())
        {
            load += Convert.ToInt32(mgmt["LoadPercentage"]);
            numCpus++;
        }

        Response.Write(String.Format("{0}", load/numCpus));
    }
}

.NET Forms Application CPU Load Percentage (C#)

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Text;
using System.Windows.Forms;
using System.Management;
using System.Threading;

namespace CPULoadPercentage
{
    public partial class Form1 : Form
    {
        delegate void SetTextCallback(string text);

        public Form1()
        {
            InitializeComponent();
            Thread loadThread = new Thread(new ThreadStart(outputLoad));
            loadThread.IsBackground = true; //otherwise the infinite loop keeps the process alive after the form closes
            loadThread.Start();
        }

        private void outputLoad()
        {
            ObjectQuery qry = new ObjectQuery("select * from Win32_Processor");
            ManagementObjectSearcher searcher = new ManagementObjectSearcher(qry);
            int load;
            int numCpus;
            while (true)
            {
                Thread.Sleep(10); //milliseconds
                load = 0;
                numCpus = 0;
                foreach (ManagementObject mgmt in searcher.Get())
                {
                    load += Convert.ToInt32(mgmt["LoadPercentage"]);
                    numCpus++;
                }
                setText(String.Format("{0}%",load / numCpus));
            }
        }

        private void setText(string text)
        {
            if (lblLoad.InvokeRequired)
            {
                SetTextCallback d = new SetTextCallback(setText);
                this.Invoke(d, new object[] { text });
            }
            else
            {
                lblLoad.Text = text;
            }
        }
    }
}

If you want to change how often the label refreshes with the CPU load, just change the Thread.Sleep(10) to however many milliseconds you want between refreshes. Remember – there are 1000 milliseconds per second.
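If you’d rather not take a dependency on System.Management at all, the System.Diagnostics.PerformanceCounter class can read the same processor counter directly. Here’s a minimal console sketch (Windows-only; "Processor", "% Processor Time" and "_Total" are the standard English counter names):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class CpuLoad
{
    static void Main()
    {
        //"% Processor Time" on the "_Total" instance averages across all logical processors
        using (PerformanceCounter cpu =
            new PerformanceCounter("Processor", "% Processor Time", "_Total"))
        {
            cpu.NextValue();   //the first sample always reads 0, so prime the counter
            Thread.Sleep(500); //give the counter an interval to measure over
            Console.WriteLine("{0:0}%", cpu.NextValue());
        }
    }
}
```

No reference to System.Management is needed for this approach, and it avoids a WMI query on every refresh.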

Caching in ASP.NET

Without a doubt, caching can greatly improve performance on your website, or any other application. But if it isn’t done correctly, it can work against you.

Luckily, ASP.NET has a great built-in cache manager in System.Web.Caching. It’s very easy to use, and it seamlessly handles (most of) the pitfalls that can get you into trouble. What’s nice about it is that the ASP.NET worker process will scavenge through your cache if it’s running low on memory.

So what is caching? Caching is taking a chunk of data and temporarily sticking it in an easy-to-reach place. For caching to work for you (instead of against you), that temporary place (called the “cache”) has to be easier to reach than the place you originally got the data. The cache is always dynamic and temporary.

So why does caching improve performance? Because you’re taking a hard-to-reach piece of data and making it temporarily easier to reach. It’s like this – if you’re driving through the streets of Saint Paul, MN, you’re going to need a map. If that map is in your backseat, you can temporarily cache it in your front passenger seat so it’s handy when you need to reference it. When you’ve left Saint Paul, you can throw it in the backseat for next time. If you have a list of movie genres stored in your database that you need to access often, you can store that list in a memory cache. Because reading data from memory is much faster than reading it from a database, each request will be that much quicker.

So when should you cache? Basically, if you have a database or an I/O (hard drive read access) operation that needs to happen often for a broad range of requests, there’s a good chance that you can greatly improve performance by caching that data in memory the first time you grab it.

So when should you NOT cache? You shouldn’t cache data that is not used often. You shouldn’t cache data that is huge (this is kind of a judgment call). You shouldn’t use cache if you can’t use it judiciously. Remember – when you use caching in ASP.NET, you’re putting the object into memory. If you put too much stuff in memory, it can cause a lot of IIS application pool recycles; it can slow things down not only for your site, but for the entire Web server. Be nice to the cache and it will be nice to you.

Caching in ASP.NET C#

DataTable dtGenres = null;
try
{
    string cachekey = "Genres";
    dtGenres = (DataTable)HttpContext.Current.Cache[cachekey];
    if (dtGenres == null)
    {
        //the Genres weren't cached yet, so fetch the data, and add it to the cache for next time
        dtGenres = Genres.GetFromDB(); //put your own database code in here.
        HttpContext.Current.Cache.Insert(cachekey, dtGenres);
    }
}
catch (Exception ex)
{
    //handle exception
}

That’s about as basic as we can get with a cache example. Remember – it doesn’t have to be just DataTables that go into the cache. You can cache any object.
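If you want cached data to refresh itself eventually, Cache.Insert has overloads that take an expiration policy. Here’s the same genre lookup with a one-hour absolute expiration – Genres.GetFromDB and the one-hour window are placeholders, so substitute your own data access and lifetime (you’ll need the System.Web.Caching namespace for Cache.NoSlidingExpiration):

```csharp
//same pattern as above, but the entry silently drops out of the cache after an hour
string cachekey = "Genres";
DataTable dtGenres = (DataTable)HttpContext.Current.Cache[cachekey];
if (dtGenres == null)
{
    dtGenres = Genres.GetFromDB(); //put your own database code in here
    HttpContext.Current.Cache.Insert(
        cachekey,
        dtGenres,
        null,                        //no cache dependency
        DateTime.UtcNow.AddHours(1), //absolute expiration
        Cache.NoSlidingExpiration);
}
```

If you’d rather have a window that resets every time the entry is read, pass Cache.NoAbsoluteExpiration for the DateTime and a TimeSpan for the last argument instead.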

ASP.NET Custom Configuration Settings in web.config

Your Web site may have a need for configuration settings like the email address your contact form should send to, your Google Maps API key, or whatever. You could store those in a database, and even cache them as a DataTable using System.Web.Caching, but ASP.NET actually provides an easier, more efficient method – store your configuration settings in a config section in your web.config. This method not only saves you from a trip to the database, and from adding another object to the cache (web.config is already cached), it has the added benefit of providing an easy way to silo your development, testing and production environments with different settings.

Add a configuration sectionGroup to web.config

<?xml version="1.0"?>
<configuration>
    <!-- configSections must be the first child element of <configuration> -->
    <configSections>
        <sectionGroup name="MyCustomConfigSection">
            <section name="MySettings" type="System.Configuration.NameValueSectionHandler"/>
        </sectionGroup>
    </configSections>
    <MyCustomConfigSection>
        <MySettings>
            <add key="SomeSetting" value="SettingValue" />
            <add key="ContactFormEmail" value="some@guy.com" />
        </MySettings>
    </MyCustomConfigSection>
    <system.web>
        <!-- ... -->
    </system.web>
</configuration>

Create a Settings class to read your settings

using System;
using System.Web;
using System.Web.Security;
using System.Configuration;
using System.Collections.Specialized;

public class Settings
{
    static Settings()
    {
        //ConfigurationManager.GetSection replaces the obsolete ConfigurationSettings.GetConfig
        //(requires a reference to System.Configuration.dll)
        config = (NameValueCollection)ConfigurationManager.GetSection("MyCustomConfigSection/MySettings");
    }

    private static NameValueCollection config;

    public static string Get(string key)
    {
        //the NameValueCollection indexer returns null for missing keys rather than throwing
        return config[key] ?? "";
    }
}

Implement It

string emailAddress = Settings.Get("ContactFormEmail");

Restart IIS application pool from ASP.NET page

I’ve been developing a .NET class library as a COM object consumable by Classic ASP. Every time I went to build my project, it would tell me:

Unable to copy file "..\Core\bin\Debug\Core.dll" to "bin\Debug\Core.dll". The process cannot access the file 'bin\Debug\Core.dll' because it is being used by another process.

It turns out that the process that had a lock on my dll file was the IIS application pool process. After about a day of using Remote Desktop to log in to the development server, opening up IIS and stopping the application pool, pressing Alt-Tab to get back to Visual Studio and swearing because Remote Desktop hijacks my Alt-Tab, painstakingly lifting my hand to my mouse, minimizing Remote Desktop, maximizing Visual Studio and building my library, trying the new build in my browser and swearing when I see “Service Unavailable” because the application pool is stopped… you get the picture.

This code enables you to stop and start your application pool from the comfort of your own browser. It also gives you your application pool’s status by monitoring AppPoolState.

IIS Application Pool restart .aspx page

<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="iis.aspx.cs" Inherits="iis" %>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" >
<head runat="server">
    <title>IIS App Restart</title>
</head>
<body>
    <form id="form1" runat="server">
    <div>
        Status: <asp:Label ID="lblStatus" runat="server" /><br/>
        <asp:Button ID="btnStop" Text="STOP App Pool" BackColor="IndianRed" ForeColor="White" runat="server" CommandArgument="dev.somesite.com" OnClick="stopAppPool" /><br />
        <asp:Button ID="btnStart" Text="START App Pool" BackColor="Lime" runat="server" CommandArgument="dev.somesite.com" OnClick="startAppPool" /><br />
    </div>
    </form>
</body>
</html>

Remember to replace “dev.somesite.com” in the CommandArgument attribute of the two buttons with the name of your application pool.

Codebehind .aspx.cs file

using System;
using System.Web;
using System.Web.UI;
using System.Management;
using System.DirectoryServices;
using System.Web.UI.WebControls;

public partial class iis : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        Response.Write(System.Environment.MachineName);
        status();
    }

    protected void status()
    {
        string appPoolName = "dev.somesite.com";
        string appPoolPath = @"IIS://" + System.Environment.MachineName + "/W3SVC/AppPools/" + appPoolName;
        int intStatus = 0;
        try
        {
            DirectoryEntry w3svc = new DirectoryEntry(appPoolPath);
            intStatus = (int)w3svc.InvokeGet("AppPoolState");
            switch (intStatus)
            {
                case 2:
                    lblStatus.Text = "Running";
                    break;
                case 4:
                    lblStatus.Text = "Stopped";
                    break;
                default:
                    lblStatus.Text = "Unknown";
                    break;
            }
        }
        catch (Exception ex)
        {
            Response.Write(ex.ToString());
        }
    }
    protected void stopAppPool(object sender, EventArgs e)
    {
        Button btn = (Button)sender;
        string appPoolName = btn.CommandArgument;
        string appPoolPath = @"IIS://" + System.Environment.MachineName + "/W3SVC/AppPools/" + appPoolName;
        try
        {
            DirectoryEntry w3svc = new DirectoryEntry(appPoolPath);
            w3svc.Invoke("Stop", null);
            status();
        }
        catch (Exception ex)
        {
            Response.Write(ex.ToString());
        }
    }

    protected void startAppPool(object sender, EventArgs e)
    {
        Button btn = (Button)sender;
        string appPoolName = btn.CommandArgument;
        string appPoolPath = @"IIS://" + System.Environment.MachineName + "/W3SVC/AppPools/" + appPoolName;
        try
        {
            DirectoryEntry w3svc = new DirectoryEntry(appPoolPath);
            w3svc.Invoke("Start", null);
            status();
        }
        catch (Exception ex)
        {
            Response.Write(ex.ToString());
        }
    }
}

You should probably stick this little page on a separate site using a different application pool 🙂

ASP.NET HTTP Compression and reducing response size

Compression is important on the Web. Pre-compression, my pages were sometimes 700 KB+! Granted, this was in a development environment, so more than half of the page was dedicated to debug and trace data, but still, under high traffic, a large page can put unnecessary strain on your Web server and bandwidth/throughput, thus slowing your site for all of your visitors.

I did a couple of things to drastically reduce the sizes of my pages. First and foremost, I disabled that ASP.NET event validation crap. ASP.NET purists might tell me that I’ve committed a mortal sin, but really… I don’t need it and chances are you don’t need it either. If you’re really that worried, just hit Google and find a pros/cons lists.

You may even be able to turn off the View State, but that broke my pages.

Turning off ASP.NET Event Validation in web.config

<system.web>
    <pages enableEventValidation="false" />
</system.web>

Okay, so now I’ve saved a few precious KB. K. Scott Allen has an article related to Event Validation that goes a bit more in-depth, specifically how to register individual user controls for event validation, and turning off event validation for just one page.

Now the meat and potatoes – here’s how I reduced my overall response size by 52%, from an initial 787 KB to 376 KB.

Implementing HTTP gzip/deflate compression in Global.asax (in C#)

void Application_BeginRequest(object sender, EventArgs e)
{
    HttpCompress((HttpApplication)sender);
}

private void HttpCompress(HttpApplication app)
{
    //requires System.IO and System.IO.Compression (add <%@ Import %> directives at the top of Global.asax)
    try
    {
        string accept = app.Request.Headers["Accept-Encoding"];
        if (accept != null && accept.Length > 0)
        {
            if (CompressScript(app.Request.ServerVariables["SCRIPT_NAME"]))
            {
                Stream stream = app.Response.Filter;
                accept = accept.ToLower();
                if (accept.Contains("gzip"))
                {
                    app.Response.Filter = new GZipStream(stream, CompressionMode.Compress);
                    app.Response.AppendHeader("Content-Encoding", "gzip");
                }
                else if (accept.Contains("deflate"))
                {
                    app.Response.Filter = new DeflateStream(stream, CompressionMode.Compress);
                    app.Response.AppendHeader("Content-Encoding", "deflate");
                }
            }
        }
    }
    catch (Exception ex)
    {
        //handle the exception
    }
}

private bool CompressScript(string scriptName)
{
    if (scriptName.ToLower().Contains(".aspx")) return true;
    if (scriptName.ToLower().Contains(".axd")) return false;
    if (scriptName.ToLower().Contains(".js")) return false;
    return true;
}

What we’ve done here is wrap the response in a Response.Filter object, which lets us manipulate the response before sending it back to the browser. Careful – there is a known issue when using Server.Transfer with HTTP filters.

The bit where we prevent compression of a couple of file extensions (see CompressScript(string scriptName)) is important, because ASP.NET doesn’t like it when you compress WebResource.axd – that’s the file that contains the JavaScript postback code.

That’s pretty simple code to reduce the size of your response by over 50%. I usually implement this with a config setting in web.config that lets me easily turn compression on and off in case it breaks something. You could even take that a step further and turn it off at a per-page level by looking at HttpContext.Current.Request.Url.
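Here’s a sketch of that on/off switch. The appSettings key name is made up for the example – use whatever fits your conventions. Add the flag to web.config (`<appSettings><add key="EnableHttpCompression" value="true" /></appSettings>`) and then bail out of HttpCompress before touching the filter:

```csharp
private void HttpCompress(HttpApplication app)
{
    //hypothetical "EnableHttpCompression" appSettings key gates the whole filter
    bool enabled;
    if (!bool.TryParse(
            System.Configuration.ConfigurationManager.AppSettings["EnableHttpCompression"],
            out enabled) || !enabled)
    {
        return; //compression switched off (or the key is missing), so skip the filter
    }

    //...the gzip/deflate logic from above goes here...
}
```

Because the check reads from the already-cached web.config, toggling compression is just a config edit away – no rebuild or redeploy needed.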