Categories
apache

Initial Impressions of Apache 2.4.x, PHP 5.4.x, and FastCGI (mod_fcgid)

Over the past two weeks, my main server for FantasySP has undergone a massive software update.  I wanted to make sure things would be rock solid for burstable traffic.  Sunday mornings during football season can be pretty rough unless you are prepared.

My previous configuration was Apache 2.2.x, PHP 5.3.x, and DSO (mod_php).

The new configuration is Apache 2.4.x, PHP 5.4.x, and FastCGI (mod_fcgid).

Let’s start off by showing you guys some performance graphs…

[Image: NewRelic data, old configuration vs. new configuration]

To keep the comparison fair, I filtered the graphs down to just php and httpd.  MySQL and other services are irrelevant here.

You will notice a few trends:

  1. Because I moved from mod_php to mod_fcgid, memory is allocated completely differently.  The memory that used to show up under httpd now belongs to separate php processes.
  2. CPU usage has likewise shifted from httpd to the php processes due to mod_fcgid.
  3. Overall CPU% is lower under the new configuration.
  4. Overall Memory Usage is also lower under the new configuration.

What accounts for the lower CPU and memory usage?  Apache, PHP, mod_fcgid, or a combination?  The newest versions of Apache and PHP are supposed to have better memory utilization.  If you want to learn more about mod_fcgid in comparison to mod_php, then this is a must read.
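
For context, the switch itself boils down to a handful of Apache directives.  Here is a minimal sketch, assuming a Unix-style layout and a php-cgi binary at /usr/bin/php-cgi; the process limits are illustrative, not my production values:

# Load mod_fcgid and hand .php requests to external PHP processes
LoadModule fcgid_module modules/mod_fcgid.so
AddHandler fcgid-script .php
FcgidWrapper /usr/bin/php-cgi .php

# Illustrative limits: cap PHP processes so traffic bursts can't exhaust RAM
FcgidMaxProcesses 30
FcgidMaxRequestsPerProcess 500
FcgidIdleTimeout 60

# The directory serving .php files also needs Options +ExecCGI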

There are a few drawbacks to using mod_fcgid, the most notable being that APC is no longer a cache shared by all processes.  Instead there are many instances of APC running, which can use more memory and put a higher load on MySQL due to more cache misses than before.

I honestly did not anticipate this and had to react accordingly.  Load times to the site were noticeably affected.  I had no choice but to alter some of my caching to be saved to a separate MySQL table rather than stored in APC.

Huge pain in the ass, but something I considered doing previously for historical data purposes.
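
The change amounts to a small two-level lookup: check APC first, fall back to MySQL.  A rough sketch of the idea, assuming a hypothetical cache table (the table and function names here are mine, not FantasySP’s actual code):

<?php
// Hypothetical schema:
// CREATE TABLE cache (cache_key VARCHAR(255) PRIMARY KEY,
//                     cache_value MEDIUMBLOB, expires_at INT);

function cache_get(PDO $db, $key) {
    // Per-process APC first (cheap, but cold in freshly spawned fcgid processes)
    $value = apc_fetch($key, $hit);
    if ($hit) {
        return $value;
    }
    // Fall back to the shared MySQL table
    $stmt = $db->prepare('SELECT cache_value FROM cache WHERE cache_key = ? AND expires_at > ?');
    $stmt->execute(array($key, time()));
    $row = $stmt->fetchColumn();
    return $row === false ? null : unserialize($row);
}

function cache_set(PDO $db, $key, $value, $ttl = 300) {
    apc_store($key, $value, $ttl); // warm this process's APC copy too
    $stmt = $db->prepare('REPLACE INTO cache (cache_key, cache_value, expires_at) VALUES (?, ?, ?)');
    $stmt->execute(array($key, serialize($value), time() + $ttl));
}
?>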

I’ve seen evidence from the PHP site that there should be speed increases just from upgrading from PHP 5.3.x to 5.4.x.  However, the jury is still out for me.  It appears that it could be true based on early New Relic data, but I’m still making caching changes due to APC, which has skewed the data a bit.

It appears that I’ve successfully offloaded slow load times to the odd hours of the day (12AM – 5AM), and by the time 8AM hits things are speedy. I won’t know for sure until a month goes by and things normalize.

The biggest question is whether Apache 2.4.x is faster than 2.2.x.  Early benchmarks are mixed, but it is far too early to make a definitive call.  I know for a fact that memory and CPU usage have decreased under the new setup, based on the NewRelic graphs above.  However, I cannot confirm that performance has improved.

Unfortunately I could not find my previous Apache Bench numbers from my older configuration.  Those would likely have told me all I needed to know.

I will continue to tweak Apache in the coming weeks and see what happens.  By the end of September I should know for sure if my current Apache configuration is noticeably better or not.  Though I do plan to stick with mod_fcgid for now and see where things go.

Stay tuned for a follow-up post down the line.

UPDATE: Check out the follow-up post here.

Categories
facebook

Facebook’s First Premium Feature: Profile Wikis

There has been a lot of talk about what Facebook should do next to boost revenue.  Recently Biz Stone mentioned something called “Facebook Premium” and suggested losing the ads, plus “maybe some special features”, to collect $10 per month.

His idea just skims the surface.  Although it did get me thinking…  What type of feature would entice users to pay for it?

I propose Profile Wikis.

I know what you’re thinking: they already have Facebook Timeline.  Yes, but that is automatically generated in chronological order.  It’s a great free feature to have, but this builds upon it and could be called something like Facebook Stories.

Of course Facebook may have to think of a better name for this, but essentially it would work very similar to how Wikipedia functions with a Facebook twist.

Let’s say I enable my Wiki page. My friends get a notice that it has been enabled and they are free to apply for editor access if they wish.  These approved friends would be allowed to use any of my posted photos, videos, location check-ins, or status updates.

I would have to appoint a moderator to approve new edits or additions.  The wiki page could be set to public, private, or friends only.

The entire Wiki is open ended with no set structure.  The best part is that I have no say in what gets written.  It is entirely created and approved by my friends, so it is extremely important who I approve.  I would get notified of any changes or edits and they could be posted on my timeline if I so choose.

If my friends decide to highlight and approve my addiction to Farmville, then so be it.

I think the Wiki feature itself would be extremely addicting to Facebook users both young and old.  Younger kids will fight to gain editor or moderator status.  Those of us connecting with old friends can easily catch up to see what we missed.  It would also be a great way to remember a friend or loved one who has passed away.

The approval and writing process has to be super clean and slick.  

It could be part GitHub and part Wikipedia.  Merge the “Wedding” wiki with the “master” wiki once it’s ready for primetime.  (Of course Facebook wouldn’t use that type of language, but you get the idea.)  Create a mockup layout with sections that need to be assigned to specific friends to take on.  Who’s in charge of the section on senior year of college?

See a graph of contributions detailing who wrote various parts of the Wiki so users get the credit they are due.  For example, Sally wrote 70% of the wedding section.

The reading experience has to be cutting edge and interactive, perhaps a take on this Rolling Stone article.  Let the reader get immersed in the Wiki.  Responsive layout for all devices and easily readable. Order a hardcover book if you’d like.

Once this platform is mature, Facebook could even release this software on its own to make collaborative writing an easier process for newspapers and magazines.  The possibilities are endless.

This is just ONE feature that Facebook could create under their “Facebook Premium” program.

Categories
apache php wamp

Setting up 64bit WAMP Server under Windows 8 Using Latest Builds

Setting up WAMP Server can be tricky depending on your configuration.  This post is going to walk you through some of the tougher steps to get your local machine up and running so you can get back to development.

Keep in mind this is not a newbie’s guide to WAMP.  Look elsewhere if you can’t get the basics working.  What we are trying to tackle here are the more advanced problems you may run into.

A couple of snags I hit were WAMP running slowly, APC refusing to install, and getting cURL working.

At the time of writing this, the current bleeding edge WAMP Bundle has the following specs:

  • Apache 2.4.2
  • MySQL 5.5.24
  • PHP 5.4.3
  • XDebug 2.1.2
  • XDC 1.5
  • PhpMyAdmin 3.4.10.1
  • SQLBuddy 1.3.3
  • webGrind 1.0

In my particular case I opted for the 64bit build, but I have not seen any benefit other than making things more difficult.  You may want to stick with 32bit.

Initial Steps

Installing WAMP at the start is easy, and I am going to assume you can handle clicking Next on the install screens to get the basics working.  Since you are using Windows 7/8, it is important that you install the latest runtimes for your setup:

http://www.microsoft.com/download/en/details.aspx?id=8328 x86 (32-bit)

http://www.microsoft.com/download/en/details.aspx?id=13523 x64 (64-bit)

It is also extremely important to make sure IIS is not running on your computer.  To do that, go to “Turn Windows features on or off” and disable IIS; otherwise it will take up port 80 and waste resources.

After a few reboots you should be at the point where you can configure PHP/Apache and add additional extensions.

Apache Configuration

This is where the fun begins.  To get to the Apache configuration file Left Click on WAMP Server systray and browse to Apache -> httpd.conf.  I am not going to go through the entire process, but will add some useful code snippets.

If you have any subdomains, this is the time to set those up.  The configuration looks something like this (note that NameVirtualHost is deprecated in Apache 2.4 and can be dropped):

##########################
NameVirtualHost *:80

<VirtualHost *:80>
DocumentRoot "G:\wamp\www\FantasySP"
ServerName localhost
</VirtualHost>

<VirtualHost *:80>
DocumentRoot "G:\wamp\www\reddit-ama"
ServerName rlocalhost
</VirtualHost>

<VirtualHost *:80>
DocumentRoot "G:\wamp\www\FantasySP\m"
ServerName m.localhost
ErrorLog logs/subdomain_error.log
</VirtualHost>

As you can see, I have two separate projects, FantasySP and Top IAmA, one running on localhost and the other on rlocalhost.  Most of you will have your DocumentRoot as \www\ and can probably skip this bit.  (Also, don’t forget to update your Windows HOSTS file.)
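
For the VirtualHosts above, the HOSTS file (C:\Windows\System32\drivers\etc\hosts) needs a line per ServerName, like so:

127.0.0.1    localhost
127.0.0.1    rlocalhost
127.0.0.1    m.localhost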

One of the snags you may encounter at this point might be getting Apache to recognize .htaccess files.  I modified my rules to look like this:

<Files ".ht*">
Order allow,deny
Allow from all
Satisfy All
</Files>

Another snag might be that you can’t access the server at all.  By default the config may be set to “Order Deny,Allow” and then grant access only to a specific IP.  You can either add your localhost IP or just use “Allow from all”.
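
One thing worth knowing: Order, Allow, and Satisfy are the old 2.2-style access directives, and under Apache 2.4 they only work if mod_access_compat is loaded.  The native 2.4 equivalent of “Allow from all” is the Require directive:

<Files ".ht*">
Require all granted
</Files>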

By default Apache has a few modules disabled that should be enabled.  Left Click on the WAMP Server systray and browse to Apache -> Apache Modules.  You will want to enable deflate_module and rewrite_module.  Without the rewrite module your clean URLs won’t load.  Without deflate, enabling gzip compression will cause a configuration error.  If you have expires rules in your htaccess file, then also include the expires_module.

If Apache is still having errors then chances are you need another missing module or you screwed up your httpd.conf file.  Make sure you back this up before you start fiddling around.

Configuring PHP

Now that Apache is up to snuff, it’s time to add missing PHP extensions and configurations.

Immediately you may want to enable the PHP short open tag.  Left Click the WAMP systray -> PHP -> PHP Settings.  Otherwise things like <? and ?> will cause errors.

(However, as septor in the comments pointed out, the short open tags will affect XML documents.)
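
That systray toggle just flips a single php.ini directive, so you can also set it by hand:

; in php.ini: allow <? ... ?> in addition to <?php ... ?>
short_open_tag = On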

Next up is making sure cURL is working (assuming you want it).  In my case, cURL would not work with the included extension, so I had to find an updated version that would work on my PC.

Grab the latest cURL extension that fits your OS.  Overwrite php_curl.dll in your WAMP extension folder, which will look similar to this: C:\wamp\bin\php\php5.4.3\ext

Restart Apache and run <? phpinfo(); ?> to see if cURL shows up this time.  If it doesn’t, try a different DLL file until it does.
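
If phpinfo() shows cURL but you want to be sure it can actually make requests, a quick smoke test helps (the URL here is just an example):

<?php
// Fetch a page and report the HTTP status code
$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$body = curl_exec($ch);
if ($body === false) {
    echo 'cURL error: ' . curl_error($ch);
} else {
    echo 'HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . ', ' . strlen($body) . ' bytes fetched';
}
curl_close($ch);
?>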

The biggest pain in the ass will be getting APC working.  After a bit of frustration I finally figured out the easiest way to do it.

In your php.ini file, add the following line at the end to DISABLE APC first: apc.enabled=0.  It is important that APC be disabled first because it will make debugging a lot easier.

Find the correct php_apc.dll file that will work for your setup. You can find that on this handy page full of modules.

You can either try each one and restart Apache if you’re lazy, or you can look at your phpinfo() output to see what version you need.  TS stands for thread safe.  VC9 refers to the runtime library it was compiled with.  (Remember those runtimes you installed a bit earlier?)

Copy the .dll into the ext folder and restart Apache.  If you picked the wrong one, then Apache will crash.  Otherwise the systray icon will turn green.  Once APC appears in the phpinfo() output, you’ve found the right one.

Go back to your php.ini configuration and change it to apc.enabled=1.  There is a good chance Apache will crash once APC is enabled.  I think this has to do with Serialization Support showing as Disabled (or broken); that value should show up as “php”.

After dealing with this problem for a day or so I read that you had to copy the php_apc.dll file into one of your Windows System folders.

If you are running 64bit WAMP then copy php_apc.dll into C:\Windows\SysWOW64. For 32bit copy into C:\Windows\System32.

Only do that last step if Apache errors when APC is enabled.  I have no idea why these files need to be copied there, so if you know the reason then feel free to post in the comments.

At this point you can specify your own advanced APC configurations if you want.
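
For example, here are a few of the common knobs; the values are illustrative starting points, not recommendations (size apc.shm_size to your machine):

; in php.ini: hypothetical APC tuning for a dev box
apc.enabled = 1
apc.shm_size = 64M        ; shared memory for the opcode + user cache
apc.ttl = 7200            ; seconds an unused entry may live
apc.user_ttl = 7200
apc.stat = 1              ; re-check file mtimes; leave on for development
apc.max_file_size = 2M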

Fixing Slow Performance

Right after I started testing FantasySP on localhost, I quickly realized that it was much slower under this configuration than under my last one.

Last time I was running PHP 5.3 with Apache 2.2.x, so either the new Apache was slow or the new PHP was.

As of this writing WAMP comes with Apache 2.4.2, while the latest version is Apache 2.4.4.  Apparently Apache 2.4.4 uses VC10 runtimes instead of VC9 to enhance performance under Windows.

I figured it might be best to update Apache to see if that would improve the speeds.  I grabbed the latest Apache build and created a new directory in wamp/bin/apache/apache2.4.4.

I copied over wampserver.conf from the apache2.4.2 directory along with my httpd config file.  I also re-enabled my PHP extensions.  You can now specify which version of Apache you would like by Left Clicking the WAMP systray -> Apache -> Version.

Once WAMP restarted using the newer version of Apache, it was much faster, though there was still something slowing it down.

I watched Task Manager in Windows 8 to see which applications were using a lot of CPU.  (The improved Task Manager alone is a great reason to upgrade to Windows 8.)  Anyways, I noticed that Windows Defender was using a lot of CPU whenever httpd.exe or mysqld.exe would run.

As it turns out, real-time spyware protection was slowing things down!  Open up Windows Defender (or whatever you use for spyware/virus protection) and exclude real-time protection for httpd.exe and mysqld.exe.

It also makes sense to stop the Windows Indexer from indexing C:\wamp\*.

Final Thoughts

Phew.  After all of those configurations and changes, localhost runs as fast as my previous setup under Apache 2.2 and PHP 5.3.  As long as you take the time to set things up properly, things should run smoothly under Windows 7 or Windows 8.

Even though you will experience growing pains updating your WAMP environment, I highly suggest taking a weekend to do so and make sure your applications run well on an updated stack.

Hopefully this post has proved useful in easing the pain of the WAMP setup process.  I have a strong feeling that I will be consulting this very blog post three years down the road when I have to do it all over again…

Categories
developer

My Summer Side Project

Last year, around the same time, I decided to create a summer side project to give my brain a rest from the same old.  I love working on FantasySP, but even I have to take a break from the 24/7 grind.

So I decided to set aside some time for a fun side project called Top IAmA.  Basically, it collects IAmA’s from Reddit and repurposes them in a more readable format.  You can also browse these IAmA’s by category.

It’s been growing ever since, with a nice loyal fanbase.  I was thinking of redesigning Top IAmA this summer and basing it on Bootstrap so the mobile/tablet experience is more enjoyable.  I probably still will at some point.

But for now, I have a new Summer Project idea.

This summer project is more involved and a lot more challenging.  I bought the domain as of 20 minutes ago.  I can’t say specifically what the project will be, but I will say that it involves a very popular Google service.

My hope is that it will go live at some point this Summer or early Fall.  Stay Tuned.

Categories
rant

Your Hate for Google is Misguided

I read the announcement today that Google Checkout has been discontinued.  It was also featured on HackerNews.

The most popular comment right now is by ChrisNorstrom:

“Google is making the same mistakes Microsoft made. Trying to enter into every industry it can thinking it can use it’s monopoly power to take over the world. Reality: Doing 20 things mediocrely is not as profitable of doing 2 things very very well.”

The comments vary, but I tend to see this line of thinking quite often nowadays.

Why exactly can’t Google try new things and then discontinue them if they don’t gain enough traction?  Why is trying new services out a bad thing? Are they supposed to be perfect?

Here are a few Google experiments off the top of my head that turned out alright:

  • Gmail
  • Chrome
  • Google Reader
  • Google Maps
  • Google Fiber
  • Google Glass
  • Google Drive
  • Google Docs
  • Google News
  • Google Play
  • Google Music
  • Android
  • Chromebook
  • Google Trends
  • Google+
  • Google Voice
  • code.google.com
  • Nexus 7

I don’t think people realize how many products Google has at any given time.  Not every product that Google makes is going to be successful.

So because they won’t be successful every time, they should not try at all, since failing makes them look bad?

Do you really think that is a healthy thought process? How will anyone ever innovate with that state of mind?  Google is a massive company and they still act like a startup.  It’s absolutely incredible and takes a lot of guts to start products like Google+ so late in the game.

Did they ram Google+ down our throats?  Absolutely.  But you know what?  They are making waves by innovating the space.

Is the consumer going to lose faith in Google because they discontinued one of their free services like Google Reader?  People want Google to spend resources and money on developing tools for them to use for free or at low cost, then are the first ones to complain when one is discontinued.  It’s ridiculous.

Google is the ONE company that you should want to invade a new market and try to innovate.  They are the ONE company that has the resources to decide one day to get into X Market and you should be excited at the thought.  They are the one company that can invest millions of dollars in a new endeavor and make billion dollar companies take notice.  (Think Google Fiber).

Would you rather still be using MapQuest instead of Google Maps for navigation?

What is it, exactly, that makes you want to hate a company like Google, Facebook, or Yahoo so much?

Why is it that when Facebook bought Instagram, it was the worst thing to ever happen?  Why is it that when Google unified its terms of service, it was inherently evil?  Yet I don’t hear anyone complaining about Google Now.

I get it, everyone wants to hate the biggest companies because it’s the cool thing to do.  I could make an argument that Google has done more for the web in the past 5 years than any other company.

Again, I’m not saying they are perfect.  What I am saying is that they seem to be one of the few that have the balls to innovate and don’t care if they fail.

Do you?

Categories
sports

Bloomberg Front Office Discontinued?

According to various users of FantasySP, it appears that the Bloomberg Front Office product is no longer offered to consumers.

One user emails me:

I don’t think Bloomberg sent out an email about FO (at least I didn’t get one), but I contacted them when it was getting close to the season and nothing showed up on their site.  They wrote back that it had been discontinued but offered no explanation.

If you know more about the decision to discontinue the Front Office product, then please comment below or email me.

It seems as though Bloomberg now focuses on draft-related tools instead of in-season management.  Since when are draft tools more important than in-season team management?

In any case, I encourage all Bloomberg Front Office users to switch to the Fantasy Assistant.  A free seven-day trial is available for all new users.

The Fantasy Assistant provides a multitude of features including:

  • High Risk and Low Risk waiver wire suggestions.
  • Category based waiver wire suggestions (perfect for roto leagues)
  • Position based team analysis.
  • Daily team ratings and rankings with charts.
  • Player based analysis and stats.
  • Player news from hundreds of sources.
  • And lots more!

Categories
Google rant

Your idea of privacy is dead

I hear a lot about privacy these days.  More often than not, it’s about Facebook or Google and their disregard for privacy.  I think part of the problem is that privacy means different things to different people.

I know for a fact that how I view privacy online is different from how other people do.  I accept that the internet is changing rapidly and that the old ideas of privacy are long gone.  Online handles are a thing of the past (though there are still exceptions; hi, Reddit).

I hear things like…

Ditch Google and use DuckDuckGo because they don’t track users.  Don’t use Gmail because they read your email.  Ditch Facebook because Zuckerberg said privacy is dead.

In fact, the most recent knock against them is that Facebook Home can take over your phone and offers all kinds of potential privacy violations.  Reading that story led me to write this blog post.

Let’s face facts here.  The modern web and your personal data are intertwined.  There is no going back.

When Google announced their unified privacy policy, it was actually a great leap forward.  The same people who complained four years ago that Google was reading their email now love Google Now.

Gmail will see that you booked plane tickets or had a package shipped and will personalize your Google Now experience.  This was not possible five years ago.  Google Glass was not even possible a year ago.

Will these companies use this data to show advertisements and make money?  Yes, of course.  As it turns out, they are companies that need to make a profit.

A privacy breach will happen from time to time on the modern web.  I expect the modern web to be responsible and use SSL and OAuth to securely share my data.  When companies are careless and screw things up, they deserve the bad press.  But a story saying that Facebook Home “destroys any notion of privacy” is complete and utter bullshit.

I’m not saying that I want to sign up for a service with my email address and phone number and they can turn around and sell this information to a third party.  I don’t want them to publish my phone number without my consent.  Those are violations of my privacy and not something I would agree to.

However, if I sign up for Facebook and they decide to use the fact that I liked ESPN in a personal advertisement, then so be it.  I understand the tradeoffs of the modern web.

I find it funny that we get bombarded with credit card offers and flyers from ten different companies in the mail and no one seems to mind.  We accept those privacy violations offline, even though they offer us nothing in return.  Yet companies online who are innovating and using private data in new ways get so much grief.

No one said you had to join Facebook, use Google, or Twitter.  If you want to pretend it’s 2002, then that’s fine.  The rest of us are moving forward.  We don’t need you to come along.

This is progress people.  Sit back, relax, and stop your whining.

Categories
startup

The Cost of Running a Startup can get out of control

We all love talking about the services out there that make running a startup easier.  Something like monitoring your application’s performance or managing newsletters would fit the bill.

For the average well funded startup, paying for additional services is not a big deal.  To the bootstrapped startup, it is an extremely important decision.

Just to name a few:

  • NewRelic – realistic price: $150 per month.
  • SendGrid – realistic price: $9.95 or $79.95.
  • MixPanel – realistic price: $150.

Right off the bat we are talking an additional $310-$380 per month.

There are plenty of other nice services to have like UserVoice ($20) or Basecamp ($30).

You’re going to want a CDN too, don’t you think?  MaxCDN ($39.95) might be another line item.  You’ll also need a place to host your code, so in all likelihood GitHub ($7 or $25) will be an additional bill.

How are small bootstrapped startups supposed to maximize their potential without blowing their budget on these additional services?

Research and use the bare minimum of what you need to get by.  Email each website to see if they have some type of discount for a tiny startup.

NewRelic, or any similar service, is a must to squeeze the most out of your hardware stack. It’s a great company that will actually work with you on the price if you are a tiny start-up.  SendGrid is overpriced in my opinion, so use Mailjet or Amazon SES to get the most bang for your buck.

In fact, a good rule of thumb is to check to see if Amazon offers the service that you need.  They tend to be the least expensive in just about every category, whether it be mail or CDNs or DNS.

MixPanel is also overpriced (sorry, don’t hate me), use something like Clicky instead or just stick with Google Analytics for the time being.

Do NOT spend more money on additional startup services than your hosting bill.  If you are, then you’re doing it wrong.  If your hosting bill is more than $250 a month and you have less than 1,000,000 pageviews per month, then you are doing it wrong.  If you don’t know what your monthly expenses are, then you are doing it wrong.

Big startups don’t have to worry about things like this, but the little guys do.  Make smart, well researched decisions and you should be able to get most of what you want for under $500 per month.

I run a fantasy sports company called FantasySP and my monthly bill for core hosting + github + newrelic + mailjet + CDN + clicky + rackspace cloud is approximately $304.

Categories
Google

Why I’m Sticking with Google Chrome

I just read an article entitled “Why I’m Switching (Back) to Firefox” by Cameron Paul.  He explained that Firefox has made huge improvements in the past couple of years, while Chrome has experienced slowness and memory issues as of late.  Google is in it to make money, while Firefox stands for “freedom and privacy on the web”.

He goes on to say: “Chrome is starting to feel a lot like Firefox did in 2008.” Ouch.

It is true that Firefox has improved a lot over the past 4 years.  They have neat things for developers, like command lines in the dev tools.  At this point Firefox is more than capable of being your primary browser again.  I use it on occasion, but mostly for testing purposes.

So is it time to switch? No.

They had my complete loyalty back in 2004-2008.  Every one of us who used their browser said the same thing: it was slow and crashed all the time.  Within those 4 years it felt like Firefox hardly improved at all.  If anything, the experience got worse up until 2010.

Firefox, and the state of web browsers in general, lacked innovation and vision.  Chrome brought about the modern web as we know it.  Without it, the javascript-heavy applications that exist today would be impossible to develop.

Why did it take Google Chrome coming out for Mozilla to start working on Firefox’s core issues?  Memory, responsiveness, and page render speed were core problems that existed for years.

For the past four years Firefox has spent its time playing catch-up to Chrome.  How can a browser that had been out for a decade be so far behind Chrome technologically?

Since day one, Chrome handled extensions and tabs as separate processes to help alleviate memory issues.  As far as I know, Firefox still does not operate in this manner.

Now, because Firefox has nearly caught up with Chrome, it’s time to switch?

Am I supposed to be impressed that Chrome finally gave them a roadmap to use?

Am I supposed to be impressed that they finally fixed their memory issues that lasted for 6 painful years?

If Firefox is about “freedom and privacy on the web”, then Google Chrome is about being as fast as possible and quick to adopt and develop cutting edge features.  

Categories
cloudflare Google optimization

Google PageSpeed Service Review

I’ve been on the lookout for a service that would reliably speed up various web projects such as FantasySP and Top IAmA.  The faster the pages load, the happier my users will be.  I first started out using Cloudflare, but their service has been getting worse and worse.  Their paid features aren’t worth the money, and using Cloudflare will most likely slow down response times.  A winning combination!

Then I heard that Google PageSpeed Service was in invite-only beta.  (Not to be confused with mod_pagespeed, which is an Apache module that offers very similar features.)  Google PageSpeed Service is very similar to Cloudflare: all traffic passes through their servers via a DNS change.  I was finally accepted into the program and have spent the past few days tweaking my sites for best performance.

Though before we get to the results, let’s first go over why FantasySP tends to load a little slowly to begin with.

What’s really slowing down my site are advertisements and third party javascript.  No surprise there, right?  I offer my members a banner-free experience, but the average joe has to load a page that is slower than I’d like.  To measure front-end performance I use NewRelic.  Prior to using Google PageSpeed Service, the average web page loaded anywhere from 6 to 8 seconds.

Enter Google PageSpeed Service

I started out using the Recommended Settings, which cover the things you are probably familiar with: remove whitespace, combine CSS, minify javascript/css, use Google’s CDN, optimize images, etc.  I decided to go all in with Google even though I already used Amazon Cloudfront as my CDN.  Google adds some additional goodies, such as converting images to inline data URIs, further reducing HTTP requests.  It also automatically converts all references to images in stylesheets to run off their CDN, even if they aren’t local to my server.  (Goodbye Amazon Cloudfront?)

Google PageSpeed went live on Christmas day, the 25th.

[Image: NewRelic browser performance]

Immediately all pages started to load in under 5 seconds.  Best of all, no functionality on the site was broken, and I did not detect any additional latency by using Google’s service.

On December 26th I decided to enable additional experimental features that are labeled “high risk”.  I also enabled a new low risk feature called prefetch DNS resolve.

The best way to deal with slow javascript is to defer it, which should prevent scripts from hanging the browser and slowing down page rendering.  Google claims this is “high risk” and may break things, so I made sure to do thorough testing on a day that would have less traffic than normal.

Once defer javascript was enabled, you’ll notice that DOM Processing decreased even further, whereas page rendering actually increased.  Unfortunately I cannot explain why that is.  (Experts, feel free to chime in.)  So the next question might be: is deferring javascript even worth it?  According to that graph, the difference between the 25th and 26th hardly seems to matter.

Let’s have a look at what deferring javascript does in a browser not known for speed: Internet Explorer 9.  After Google PageSpeed was enabled on December 25th, pageload times decreased to under 6 seconds for the first time in seven days.  On the 26th, deferring javascript decreased pageload times to around 3 seconds.  Keep in mind that number includes banner advertisements.

[Image: Load times in Internet Explorer 9]

Clearly deferring javascript helps with IE9, but what about the other browsers?

[Images: Load times in Windows Firefox, Mac Chrome 23, and Windows Chrome 23]

In every browser, you’ll notice that DOM processing and network times decreased, whereas page rendering times increased.  My theory is that since the javascript is deferred, the DOM is processed a lot faster, but that doesn’t mean the page is fully rendered.  Again, feel free to chime in here with your own explanation.  I *think* the end user will feel as though the pages load a lot faster, despite the longer page rendering.

Unfortunately I am unable to test Android and iPhone browser performance because those users are directed to a different subdomain.  Plus, I don’t think Google supports those browsers for deferring javascript.  Older browser performance in IE8/IE7 remains unchanged because many of Google’s optimizations are for modern browsers only.

According to my testing, the previous bottlenecks like advertisements and rendering Google Charts no longer slow down pageloads.

Next up is to see performance from Google Webmaster Tools.  When I used Cloudflare, Googlebot noticed huge performance issues.  Cloudflare caused my 200 ms response times to double and even triple.  Will the same results appear for Google’s Service?

[Image: Google Webmaster Tools crawl stats]

As you can see, passing traffic through Google PageSpeed servers does cause a penalty of about 100 ms in response times.  Their servers are doing an awful lot of optimization behind the scenes, so this is not at all surprising.  The trade-off is that the end user gets entire seconds shaved off their load times.  I think I’ll take that any day.

More Advanced Optimization Techniques

Of course, I am not satisfied there and wanted to further push the boundaries of what Google PageSpeed Service can do.  Next up is to cache and prioritize visible content.  Cloudflare has something similar, which they call Railgun.  Railgun also requires you to run an extra process that sends HTTP data back and forth to Cloudflare to show them which HTML content is stale so it can be cached.  I have no idea how well Railgun performs, since no one has actually reviewed the service.

Google’s cache and prioritize visible content works a bit differently.  You do not need to run any additional service on your server.  Instead they parse the HTML based on a set of rules you specify and rely on Javascript to speed up rendering times.  Their website says: “Non-cacheable portions are stripped from the HTML, and the rest of the page is cached on PageSpeed servers. Visible content of the page is prioritized on the network and the browser so it doesn’t have to compete with the rest of the page.”  Visible content means content the end user can actually see in their browser.  I do not know if it is user-specific or if they just base this on the most common resolution.  So anything above the fold will load immediately, whereas content below the fold may load via a Javascript call.

If deferring javascript works on your site, then this feature should work once configured correctly.  Here is their full video explaining the feature:

I went ahead and applied FULL caching to Top IAmA with no rules.  This site has no advertisements and very little javascript, so response times here are a best-case scenario.  Every page that gets rendered can be cached for 30 minutes.  This means that Google only needs to fetch a page once every 30 minutes; otherwise it serves the page straight from their servers.  Googlebot shows the results:

[Image: Googlebot crawl speed for Top IAmA]

You’ll notice two things about this graph… 1) the response times to the far left are terrible, and 2) the response times under Google PageSpeed with cached content are extremely fast.  Response times went from 200 ms to around 50 ms.  The response times to the far left are a result of Cloudflare’s Full Caching feature with Rocket Loader enabled.  As I mentioned earlier, avoid Cloudflare at all costs.  The response times in the middle of the graph are from my server.

I did attempt to add Google’s cache and prioritize visible content to FantasySP.  Once I tallied up all of the IDs and classes needed for user-specific data, I previewed the feature site-wide.  After some tweaking of class names, everything seemed to be working without any issues.  Although I soon ran into occasional Javascript errors under IE10 that would halt rendering of the full page.

This shows you how fragile this feature is: a single javascript error will cause rendering to fail.

The error for IE10 shows as:

SCRIPT5007: Unable to get property 'childNodes' of undefined or null reference
f2ae86053f8f64f57a4ef28a17bd0669-blink.js, line 30 character 224

Every other browser seemed to have no issues whatsoever.  IE10 seemed to load the fastest, with Chrome, Safari, and Firefox not too far behind.  I applied this technique to a subset of pages on FantasySP that I feel confident will have no errors and can use the speed boost.  Have a look at this Russell Wilson or Matt Ryan page.  During pageload, watch the console to see how different sections of the page load via javascript calls from blink.js.  If you don’t see it on the first pageload, then refresh to see the cached version appear.

Google Analytics

The final set of data will be looking at “Avg. Document Content Loaded Time (sec)” metric in Google Analytics under Site Speed -> DOM Timings.  Google Analytics explains this metric as: “Average time (in seconds) that the browser takes to parse the document and execute deferred and parser-inserted scripts (DOMContentLoaded), including the network time from the user’s location to your server.”

I broke up the data by browser: Chrome, Internet Explorer, Safari, and Firefox.  Internet Explorer comes in just a hair faster than Chrome, although Safari seems to be worse.

[Image: DOMContentLoaded times by browser]

Google Analytics confirms what the NewRelic graphs show: users will perceive pageloads as much faster with javascript deferred.

Conclusion

According to NewRelic browser metrics and my testing, Google PageSpeed Service offers significant improvements to load times.  It is relatively easy to set up and is better than any other competing service.  Sites both big and small should use at least some of the features provided.

Google PageSpeed Service with advanced optimization techniques is a perfect solution when it comes to simple WordPress blogs that use Disqus commenting.  All you have to do is add an ignore rule for /wp-admin/* and enjoy a much faster blog that should be able to handle an almost unlimited amount of visits and traffic.  I’d love to see results from anyone else out there willing to try.

Overall I’d say using Google PageSpeed is a no brainer.  Test it out and let me know your thoughts.

Update #1: Over at the HackerNews thread, Jeff Kaufman, who works on the PageSpeed team, gave some very insightful comments that are a must-read.

Update #2: This blog you are currently reading now uses Google PageSpeed Service with advanced caching techniques, such as Google’s cache and prioritize visible content.