I was recently tasked with recommending ways to improve the speed of a web application for a client. The task was created because of the perceived slowness of the application when users were running it over a cellular network. While there are usually code-based optimizations that can be done, I first set out to look for a simple change that could be implemented quickly and would produce a huge impact.
Enterprise application development often entails writing applications that run over the web using a protocol called HTTP. The most prevalent client of this protocol is the web browser, and most devices, like laptops, phones and printers, have at least one web browser installed. Applications are built to send content to a web browser so that the content can be displayed to the person requesting it. The problem with this content is that it resides on a resource that can be very far away from the device with the browser. Depending on the bandwidth (i.e., how much data can travel per second), some content can take a long time to get from the resource it is stored on to the device that wants it.
The content sent by an application can be static or dynamic. The static variety can take advantage of caching where the content is retrieved only once and then it is stored closer to the web browser for a defined period of time. The dynamic variety is much harder to cache, because of the variability in the content and how often it should be refreshed. Caching can make the user experience much better when a user repeatedly visits a site. The problem with that approach is that many people won't bother coming back if the initial experience is poor. This experience becomes a major factor with devices that rely on cellular networks to transmit data.
My client had some pages that required a large amount of data to be downloaded. For a comparative example, let's look at a large page from Wikipedia's long pages section; the content from Wikipedia is dynamic, as people update it constantly. As an example, let's say a page is 2,088,219 bytes long. Below is an approximation of how long it would take to download the page depending on the bandwidth. Note the time it takes to load this page on a network with a poor connection; as you can imagine, the users of the client's application were annoyed when a page took almost three minutes to load.
| Home (30 Mbps) | Strong Wireless (1 Mbps) | Weak Wireless (100 Kbps) |
| --- | --- | --- |
| <1 second | 16.7 seconds | 167 seconds |
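The figures above can be reproduced with a quick back-of-the-envelope calculation. Here is a small sketch using the page size from the example; it ignores latency and protocol overhead, so real-world times will be somewhat longer:

```python
# Estimate how long a payload takes to transfer over links of various speeds.
# Ignores latency, TCP slow start, and protocol overhead.

PAGE_BYTES = 2_088_219  # example page size from the text

def download_seconds(size_bytes: int, bits_per_second: float) -> float:
    """Seconds to move size_bytes over a link of the given speed."""
    return size_bytes * 8 / bits_per_second

links = {
    "Home (30 Mbps)": 30_000_000,
    "Strong Wireless (1 Mbps)": 1_000_000,
    "Weak Wireless (100 Kbps)": 100_000,
}

for name, speed in links.items():
    print(f"{name}: {download_seconds(PAGE_BYTES, speed):.1f} seconds")
```

Running this yields roughly the values in the table: about half a second at home, 16.7 seconds on a strong wireless link, and 167 seconds on a weak one.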
When I looked at the request/response exchange using Fiddler, I noticed that the page was taking less than two seconds to process and that the perceived slowness was simply due to the time spent downloading the page. The other thing I noticed brought a smile to my face because I knew I could make a change that would instantly make the user's experience better.
The application I was looking at happened to be running on Windows and using IIS as the web server. Supported versions of IIS have a feature, disabled by default, called HTTP compression. Provided the client web browser supports it, this feature will shrink the content before sending it to the client. As long as the server CPU isn't frequently above 80%, turning it on provides spectacular results. Looking at the Wikipedia example from earlier, compression set at level "9" results in data that is almost 90% smaller (267,932 bytes), which also makes the download almost 90% faster.
| Home (30 Mbps) | Strong Wireless (1 Mbps) | Weak Wireless (100 Kbps) |
| --- | --- | --- |
| <1 second | 2.14 seconds | 21.43 seconds |
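Gzip is one of the schemes IIS uses for HTTP compression, and level 9 is its maximum setting. A small sketch of the idea; the sample markup below is made up, and the ratio you get on real pages depends entirely on their content:

```python
import gzip

# Repetitive markup compresses extremely well; real HTML typically
# shrinks 70-90% depending on its content.
html = ("<tr><td class='cell'>some repeated table content</td></tr>\n" * 1000).encode()

compressed = gzip.compress(html, compresslevel=9)  # level 9 = maximum compression
ratio = 1 - len(compressed) / len(html)
print(f"{len(html)} -> {len(compressed)} bytes ({ratio:.0%} smaller)")
```

The trade-off is CPU time on the server for every response, which is why the 80% CPU guideline above matters before enabling it.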
I highly recommend looking at your web sites and applications to see if you can apply compression to your site. Your mobile users will thank you.
When developing enterprise applications, you need to consider the networking environments they will run in. Applications that only use internal enterprise networks present challenges because you have to play in the enterprise's sandbox; after all, they own the network. Many of those concerns can be overcome by contacting the right people. It would seem that applications which access the broader Internet would be easier to deal with, given the perception of openness on the Internet and the standardization of specifications like HTTP/1.1.
There has been no shortage of stories about governments' intrusive role in the Internet around the globe, like China's Great Firewall or the Egyptian ISP shutdown. Even in the US there are stories of the government restricting access to content, implementing cyber-threat countermeasures and trying to maintain Net Neutrality, although Net Neutrality really has less to do with government and more to do with corporations trying to restrict and monetize access to the Internet. My recent experience shows that some Internet Service Providers (ISPs) have already started doing just that.
While developing an enterprise application, I needed to indicate to the user whether or not the computer was on the internal enterprise network. To do this, I decided to have the application perform a web request to another resource (i.e., a computer) that I knew was only available on the client's network. It did exactly what I wanted in the client's office. However, when I brought my computer home to test the application, I was shocked to find that it still reported a successful connection to the server.
I consider the Internet to be anything outside of my house, including my Internet Service Provider (ISP). As such, I expect the Internet to be open and to follow the HTTP/1.1 specification. The specification states that if a requested resource cannot be found, a return code of "404 - Not Found" should be sent back to the requestor. However, my ISP, Verizon FiOS, rolled out a "feature" a couple of years ago that breaks the specification. When I used Fiddler to trace the return codes, I noticed that the final response was "200 - OK", meaning a server was found. As you can see in Figure 1, the initial resource I was looking for wasn't found, but instead of a 404 return code my ISP returned a 302 (which should arguably be a 307, but that's another issue). The 302 return code tells the requestor that the requested resource has moved to a new location and that the requestor should look there instead. The problem is that the resource hasn't moved; it simply doesn't exist on the Internet.
| Result | Protocol | Host | URL | Body |
| --- | --- | --- | --- | --- |
| 302 | HTTP | my application FQDN | / | 0 |
| 302 | HTTP | wwwwz.websearch.verizon.net | /wwwwz.websearch.verizon.net/search?qo=my application FQDN... | 0 |
| 307 | HTTP | goto.searchassist.com | /goto.searchassist.com/find?p=paxfire&s=my application FQDN... | 0 |
| 200 | HTTP | find.searchassist.com | /find.searchassist.com/landing.jsf?p=cnksver&q=my application FQDN... | 48,875 |
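One way to make this kind of network check robust against ISP hijacking is to probe the internal host without following redirects and count only a direct 200 as success. A sketch of the idea, assuming a plain HTTP probe; `intranet.example.corp` is a placeholder for the internal-only host:

```python
import http.client

def status_means_internal(status: int) -> bool:
    """Only a direct 200 counts as 'on the internal network'.

    3xx means a redirect (possibly an ISP hijacking the lookup) and
    404 means a server answered but the resource does not exist.
    """
    return status == 200

def on_internal_network(host: str, timeout: float = 3.0) -> bool:
    """Probe host without following redirects and classify the result.

    http.client never follows redirects on its own, so an ISP that
    answers a failed lookup with a 302 to its search page is correctly
    treated as 'not on the internal network'.
    """
    try:
        conn = http.client.HTTPConnection(host, timeout=timeout)
        conn.request("HEAD", "/")
        status = conn.getresponse().status
        conn.close()
    except OSError:
        return False  # DNS failure, connection refused, timeout, ...
    return status_means_internal(status)

if __name__ == "__main__":
    # "intranet.example.corp" is a placeholder for the internal resource.
    print(on_internal_network("intranet.example.corp"))
```

For extra safety you could also verify a known response header or body marker from the internal server, since a hijacking proxy could in principle return 200 as well.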
Apparently Verizon started doing these redirects to help users when they mistype a URL and then extended the behavior to redirect 404 responses. I can see using this to reduce overall costs by cutting the number of support calls. However, I think it was first and foremost an opportunity to generate ad revenue. Whatever the reason, it breaks the way the open Internet is supposed to work, in my opinion, so I decided to turn off Verizon's redirect "feature".
Microsoft released a new version of IIS called IIS Express 7.5. It allows you to run a scaled-down version of IIS 7.5 on Windows XP and later. For example, IIS 5.1 ships with Windows XP Pro, but not Windows XP Home; yet you can still run IIS Express 7.5 on Windows XP Home. However, please note there may be some features that do not work in earlier versions of Windows due to changes in architecture.
IIS Express 7.5 does not run as a service and does not require administrative privileges to run. The short-term goal was to make it work with WebMatrix, but the long-term plan is to replace the ASP.NET development web server (i.e., Cassini) with this product for use within Visual Studio 2010. If you want to try it out but don't want to use WebMatrix, you can use IIS Express 7.5 in Visual Studio 2010 RTM with some effort, or you can download the beta of Service Pack 1 for Visual Studio. The product can also be used without Visual Studio or WebMatrix, but you must have the .NET Framework 4.0 installed prior to installing IIS Express 7.5.
If you plan to use Visual Studio 2010 RTM, then I recommend creating a shortcut to "C:\Program Files\IIS Express\iisexpress.exe" (or "C:\Program Files (x86)\IIS Express\iisexpress.exe" if you are running a 64-bit edition of Windows). Once you establish your site information (see these instructions), you can modify the shortcut to align with your project in Visual Studio 2010.
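The shortcut is just a wrapper around IIS Express's command-line switches, so you can also launch it directly. A sketch; C:\MySite, port 8080, and the site name are placeholders for your own project:

```shell
REM Serve a folder directly, without a predefined site (path/port are examples).
"C:\Program Files (x86)\IIS Express\iisexpress.exe" /path:C:\MySite /port:8080

REM Or launch a site you defined earlier in applicationhost.config.
"C:\Program Files (x86)\IIS Express\iisexpress.exe" /site:MySite
```

Putting either line in the shortcut's Target field gives you a one-click launch for your project.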
There has been good news for developers this year on the cloud computing front. Freely available compute clouds now offer fairly robust environments, although I would caution enterprise software developers not to use corporate assets (e.g., code and data) when trying them out. This is especially true for systems that have to comply with standards like PCI. Those private enterprise systems would be better off leveraging a service like HP's Cloud Start. For developers who just want to get their feet wet, here is a list of free environments:
Microsoft has allowed MSDN Subscribers to leverage the Windows Azure platform for free.1,2 The details are as follows:
- 750 hours/mo of Azure compute
- 10 GB/mo of Azure storage
- 1,000,000 transactions/mo
- 5 service bus connections to AppFabric
- 1,000,000 access control transactions to AppFabric
- 3 1GB instances of SQL Azure database
- 21 GB of transfer (Europe and NA - 7 GB in/14 GB out) (APAC - 2.5 GB in/5 GB out)
For those that don’t have an MSDN subscription, Microsoft announced a free trial offer earlier this year as well that is good until June 30, 2011.1 Those details are as follows:
- 25 hours/mo of Azure compute
- 500 MB/mo of Azure storage
- 10,000 transactions/mo
- 2 service bus connections to AppFabric
- 100,000 access control transactions to AppFabric
- 1 1GB instance of SQL Azure database (for 3 months only)
- 1 GB of transfer (500 MB in/500 MB out)
Starting November 1, 2010, new Amazon customers will be able to take advantage of a free EC2 instance for one year.1 In addition, they will be able to leverage free usage tiers of S3, Elastic Block Storage, Elastic Load Balancing and AWS data transfer. The free usage details are as follows:
- 750 hours/mo of micro Linux EC2 (i.e., 31 days * 24 hrs = 744 hours of free use)
- 750 hours/mo of micro ELB
- 10 GB/mo of EBS
- 5 GB/mo of S3 storage
- 30 GB/mo of transfer (15 GB in / 15 GB out)
- 25 hours of SimpleDB
- 100,000 requests/mo on Simple Queue Service
- 100,000 notifications over HTTP and 1,000 over email for Simple Notification Service
1 - credit card must be used to sign up and charges will be incurred if the free limits are exceeded
2 - must have an existing Premium, Ultimate or BizSpark MSDN subscription
Back in November of 2008, the FCC approved the unlicensed use of white space, provided users of this space abided by two rules. First, they were not permitted to transmit a signal if any legacy wireless microphone had been used in the space in the past 60 seconds. In addition, the user needed to check an FCC White Space database to determine which channels were available at a given location.
Earlier this week the FCC introduced some new constraints on the use of white space in the TV station broadcast spectrum. The order still requires the database lookup, but it no longer calls for spectrum sensing for wireless microphones and instead reserves two vacant channels on the UHF band for limited local use. For others that have a need for more channels the FCC recommends registering with the TV band database.
The use of these frequencies tends to be very limited in major metropolitan areas, where there are virtually no available white spaces. However, in rural America there are opportunities galore. The primary talk around use of the frequencies has been "White-Fi", which is simply a wireless network operating on the 6 MHz white space channels. Microsoft has been a pioneer in this space, as is evident in the prototype "White-Fi" network on their Redmond campus. The IEEE 802.22 working group has also been working on techniques to use the spaces for Wireless Regional Area Networks.
Time will tell if white space broadband makes its way through rural America or goes the way of Broadband over Powerlines, although the town of Claudville, VA has already had success adopting the technology.