Mikael's blog

A developer's seventh time trying to maintain a blog

Maintenance In Progress

I’ve embarked upon my mission to upgrade my server installation. It’s probably going to take the better part of this week but I hope to be done by this weekend when my parents-in-law are coming to stay with us.

Here’s my todo-list as of now.

  1. Find a temporary server to host the blog on while installing new stuff on IBS.
    1. Grab nearest piece of unused computer tech capable of running Linux.
      (Yay, my Raspberry Pi)
    2. Install Raspbian.
      (Since it’s just a temporary server I didn’t bother playing with Arch Linux ARM. Raspbian is what most people are using on their RPis so I figured it would have a decent repository of pre-compiled software.)
    3. Install nginx.
      (This was easy, it was in the repos.)
    4. Install CouchDB.
      (Again, in the repos).
    5. Install Node.js.
      (This is where I’m currently at. It’s a little trickier, requiring compiling from source for version 0.8.x and editing a couple of lines in the V8 configuration to allow ARM v6 compilation. It also takes a LOOOONG time to compile on the RPi)
    6. Setup the blog on the RPi.
  2. Redirect lofjard.se to point to the Raspberry Pi.
  3. Install Arch Linux on IBS.
    1. Install nginx.
    2. Install CouchDB.
    3. Install Node.js. (All easy on Arch Linux thanks to fantastic repos.)
    4. Setup the blog on IBS.
  4. Re-redirect lofjard.se back to IBS.
  5. Benchmark
    (The use of nginx for static content instead of node-static should make for a nice performance boost in conjunction with the removal of some now redundant code.)
  6. Keep calm and carry on!
by Mikael Lofjärd

Old But Not Dying

Tuesday the 25th of September 2012 marks my 30th birthday.

Even though I might not be fresh meat anymore, I haven’t forgotten about my blog. I’ve just been busy (raise your hand if you’ve heard that one before).

The Best Laid Plans of Mice and Me…

  • I’ve been planning an article about my success in getting my laptop installed with Arch Linux on UEFI, but I’ve yet to complete it.
  • I’ve also redesigned the blog and it now looks like crap in Firefox.
  • I then fixed the Firefox bugs but haven’t gotten around to uploading the fixes because of my next bullet point.
  • My server operating system/software stack is ooooold. :-)

Planned maintenance

My pretty little server IBS is running Ubuntu Server 10.04 LTS. My original plan was to upgrade it to 12.04 LTS as soon as that was released but, needless to say, that didn’t happen. I’ve since fallen in love with Arch Linux.

As I was fixing my Firefox rendering issues I also happened to update all of my node.js plugins on my workstation. This required some rewriting of my code which in turn is why I haven’t uploaded it yet.

Even though I haven’t updated my blog all summer, I still hate downtime, and uploading the latest code would require some downtime since it now depends on a newer version of node.js and a lot of newer plugins.

AND… if I’m going to do maintenance on the server I might as well reinstall the whole machine and run Arch Linux on it.

I’m also planning a major rewrite of my underlying structure to move from a pure node.js server to a nginx/node.js mix.
It’s a move I’ve been contemplating for a while, since I’ve had some problems with my implementation of virtual domains in conjunction with my speedy in-memory caching.

tl;dr

I’m not dead, just old(er). The blog still lives. Planned maintenance has been keeping me from fixing stuff. Life (vacations/kids/work/barbecuing) has gotten in the way of me writing semi-interesting blog posts this past summer.

by Mikael Lofjärd

Jackrabbit

I’ve finally received my new laptop. It’s not the one I initially wanted (the Lenovo X230) since that one costs a small fortune in the configuration I wanted. Well, at least it costs a bit too much here in Sweden with all the taxes and whatnot. Anyway, I settled for an Asus Zenbook Prime (UX31A) and had money left to take the family on vacation instead.

Crapdev

My old laptop (crapdev) has served me well. I didn’t even pay for it; I just took it home with me when I received its replacement at work. Yes, I’ve been using my old work laptop as a private workstation for the last 2-3 years, and it’s starting to show.

Hello Jackrabbit

I named my new toy “jackrabbit” as a nod to the restaurant from Pulp Fiction (Jackrabbit Slim’s). With the UX31A measuring only 18 mm at its thickest (and to be fair, 4 mm of those are its rubber feet), I’d say “slim” is the word.

Other specs include an awesome 1920x1080 matte IPS display panel, a blazing fast 120 GB SATA-3 SSD from ADATA, 4 GB of RAM and the Ivy Bridge Core i7 3517U 1.9 GHz CPU with its accompanying Intel HD Graphics 4000 chip.

Needless to say, it runs circles around my old laptop. By the time KMS has switched my tty to native resolution, the laptop is already done booting. The BIOS POST actually takes longer than booting Arch Linux on this machine.

Now getting Arch Linux installed on this EFI-equipped machine however, that’s a story for another day (or week). Stay tuned.

by Mikael Lofjärd

Weekend Getaway

I’m leaving for Gothenburg in a couple of hours for some all-weekend-hacking-codapalooza-marathon-event-thingy (A.W.H.C.M.E.T?). It’s going to be awesome!

Also, I’ll get to enjoy the company of some great friends and some great minds (not at all mutually exclusive).

I have a pretty neat idea that has been brewing in the back of my mind for a few days. I’m going to put this weekend (and as many of those great minds as I can persuade) into it and see if it pans out to something usable. I might even tell you about it someday. =)

by Mikael Lofjärd

The Tiling Truth

Form, function and flexibility. I find that these three concepts are not always easy to rank. Sometimes I feel that form trumps functionality, sometimes I don’t. When it comes to window managers however, function is king!

The Form Junkie

I have to admit, I’ve been totally ignorant about tiling window managers up until now. I’ve always wanted my desktop to look nice and I’ve sometimes gone to great lengths to customize my desktop to be pixel perfect the way I want it. But lately I’ve come to a realization; mouse pointers suck!

Well, most mouse pointers do anyway. I’m a trackball user, so I think all regular mice suck anyway, but now I’ve started hating trackpads too. I’ve been planning on picking up a new laptop soon (well, as soon as the X230 gets released) but I want to be able to use it on the couch without having to rely on a mouse/trackpad/trackball. Lenovo’s “nipple-mouse” is about as far as I want to go, since it sits smack in the middle of the keyboard.

I need a keyboard centric desktop environment.

Enter xmonad

Tiling window managers to the rescue! More specifically; xmonad to the rescue!

xmonad is an awesome (no pun intended*) tiling window manager written in Haskell. It’s fast, easy to use, easy to configure, and it runs entirely off the keyboard. There are many great guides to configuring xmonad, and it’s incredibly stable (well duh, it’s written in Haskell).

* Not really a “pun”, but “awesome” is the name of another tiling window manager.

by Mikael Lofjärd

Progress at Last

Sometimes you need more than your operating system gives you.

That’s when a text editor comes in handy.

#!/bin/bash
# Copy a file with a progress bar, courtesy of pv.

EXPECTED_ARGS=2
E_BADARGS=65
E_BADPATH=66

if [ $# -ne $EXPECTED_ARGS ]; then
  echo "Usage: $(basename "$0") {source} {dest}"
  exit $E_BADARGS
fi

if [[ ! -f "$1" ]]; then
  echo "Source file does not exist or is not a regular file."
  exit $E_BADPATH
fi

# Size of the source in bytes, so pv can scale the progress bar.
DESTSIZE=$(du -b "$1" | awk '{ print $1 }')
DESTFILENAME=$(basename "$1")

# Accept either a destination directory or a full destination path.
if [[ -d "$2" ]]; then
  DESTPATH="$2/$DESTFILENAME"
else
  DESTDIR=$(dirname "$2")
  if [[ ! -d "$DESTDIR" ]]; then
    echo "Dest dir does not exist."
    exit $E_BADPATH
  fi
  DESTPATH="$2"
fi

# -p progress bar, -e ETA, -r current rate
pv -s "$DESTSIZE" -p -e -r "$1" > "$DESTPATH"

exit 0

Copying large files to my NAS becomes so much more fun when I actually KNOW that it’s doing what it should. Progress bars FTW!

UPDATE: Now it’s actually working for more than one case. :)

by Mikael Lofjärd

LESS Is More, More Or Less

A while back I read a blog post somewhere about how the LESS parser/compiler had been remade in Javascript.

“Well awesome”, I thought to myself, as I had been wanting some more flexibility in CSS but had been too stubborn/proud to install the SASS compiler since it’s written in Ruby. Needless to say, I wanted to incorporate it in my blog as soon as possible, but I haven’t had the time to actually do it until now.

LESS you say?

LESS (like SASS) is a CSS derived language that adds a whole lot of long needed features to CSS to ease maintenance of large style sheets. It compiles into regular CSS markup either in realtime (through their nifty Javascript implementation in the browser) or, as in my case, as a bootstrapping task when I start my blog.

For now, it’s tacked on in a kind of ugly way in my BundleController, but I might redo the actual code some day since I’m not pleased with it. It works though so for now it will have to suffice.

What does it bring to the table?

It brings variables (!!!). Finally you can stop sprinkling your CSS files with color descriptions and font sizes:

@white: #fff;

.myClass {
  background-color: @white;
}

It’s as easy as that.

It also brings something called mixins, which is kind of like multiple inheritance:

.basefont(@size: 12px) {
  font-family: Arimo, sans-serif;
  font-size: @size;
}

body {
  .basefont;
}

h1 {
  .basefont(24px);
}

Mixins can be quite useful in cutting down on repetitive CSS code, and it has support for parameters and default values.

LESS also lets you nest your rules to reduce repeating your selectors:

div.sitewrapper {
  >nav {
    ul {
      margin: 7px 0px 2px 5px;
      li {
        margin: 3px 0 2px 0;
        a {
          width: 70px;
        }
      }
    }
  }
}

When this is compiled it is turned into:

div.sitewrapper > nav ul {
  margin: 7px 0px 2px 5px;
}

div.sitewrapper > nav ul li {
  margin: 3px 0 2px 0;
}

div.sitewrapper > nav ul li a {
  width: 70px;
}

There is also support for a lot more complex nesting rules, and a calculation API that lets you compute colors and distances and make relative offsets and such. I recommend reading up on all the cool features of LESS at http://lesscss.org/.

So does it make your life easier?

It’s kind of hard to say since I just got it running and my style sheet probably needs more work, but so far I’ve been able to cut around 50 lines of CSS out of my ~660 line file, and it has gotten a lot less repetitive and a lot easier to read, I think.

It’s not deployed yet but when I deploy it this weekend I will let you be the judges as the source code is made available as usual.

by Mikael Lofjärd

Cache Me If You Can

Today at work was “do-anything-but-work-day”. It’s a bit like Google’s 20%, but instead of 20% it’s more like .8% or something like that. It was our first time and not that many people had a clear idea about what to do at first. I, on the other hand, had a mission all planned out.

The Performance Degradation

When I put the blog on the new server back in January, I noticed a small decrease in performance. After a few tests I’ve realized that the CPU is the culprit.

The Atom D525, while dual-core at 1.6 GHz, has roughly half the computational power of the Pentium M at 1.5 GHz that my old server had under the hood.

Node.js can make use of multi-core processors by starting more instances of itself, which made concurrent connections on the new server almost as fast as on the old server. However, concurrent connections aren’t really my problem since I only have around 30 readers on a good day.

What’s Taking You So Long?

Well even in the old version of the blog, I do a lot of caching. My node-static instance takes care of all static file handling and it does a really good job of caching them. I also cache all of my mustache templates when I start Node.js so I can read them from memory every time I render a page.

What was taking so long was actually more than one thing.

First there was database access. CouchDB is really fast and caches a lot, but its only way of communicating is REST over HTTP so there’s still some overhead getting to those cached results.

And then there was presentation logic. The actual rendering of the data on to the template takes a few milliseconds and all pages with source code on them take a few milliseconds more to render all the syntax highlighting server-side. Sometimes there’s a lot of RegExp running to make it all happen.
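To give an idea of the kind of regex work involved, here is a contrived toy highlighter (nothing like the blog's real implementation, which handles far more cases):

```javascript
// Toy server-side syntax highlighter: HTML-escapes the source and
// wraps a handful of JavaScript keywords in span tags.
var KEYWORDS = /\b(var|function|return|if|else)\b/g;

function highlight(source) {
  return source
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(KEYWORDS, '<span class="keyword">$1</span>');
}
```

Run a pile of replace passes like these over every code block on every uncached page view, and the milliseconds add up.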

The Mission

This brings us back to today and my mission; to build an in memory cache for caching web server responses.

My plan was to build a cache that stored the entire HTTP response (headers and content) and that I could clear selectively when needed. This led me to remove my multi-core code and run Node.js as a single process, since otherwise I would have been in another world of hurt trying to keep my processes’ cache stores in sync.

When a new post is added I want to clear most of the cache (list pages, archive, atomfeed etc) but not the post pages, and when a comment is added to a post I just want to clear the list pages and the post page for that post. So I added a few different cache stores that I could clear out as I wanted to.

Most of this is handled by the CacheManager.

/*****************************************
 *   Cache Manager
 *****************************************
 *   Author:  mikael.lofjard@gmail.com
 *   Website: http://lofjard.se
 *   License: MIT License
 ****************************************/
 
var CacheManager = (function () {

  var fs = require('fs');

  var env = require('./environmentManager').EnvironmentManager;
  var misc = require('./misc').Misc;

  var cacheStore = {};

  function getStoreType(url)
  {
    var urlParts = url.split('/');
    var result = 'dynamic';

    switch (urlParts[1]) {
      case 'source':
      case 'about':
        result = 'static';
        break;
      case 'archive':
      case 'atomfeed':
        result = 'semiStatic';
        break;
      case 'tags':
      case 'tag':
        result = 'semiDynamic';
        break;
      case 'post':
        result = 'floating';
        break;
    }

    return result;
  }

  return {

    init: function () {
      cacheStore = {};
      cacheStore.static = {};         // static between boots       - /source /about
      cacheStore.semiStatic = {};     // clear on new post          - /archive /atomfeed
      cacheStore.semiDynamic = {};    // clear on edit post         - /tags /tag
      cacheStore.dynamic = {};        // clear on comment (default) - / /page
      cacheStore.floating = {};       // null item on comment       - /post
    },

    clearOnNewPost: function () {
      env.info('CacheManager: Clearing cache on new post');
      cacheStore.semiStatic = {};
      cacheStore.semiDynamic = {};
      cacheStore.dynamic = {};
    },

    clearOnEditPost: function (url) {
      env.info('CacheManager: Clearing cache on edit for ' + url);
      cacheStore.semiDynamic = {};
      cacheStore.dynamic = {};

      delete(cacheStore[getStoreType(url)][url]);
    },

    clearOnNewComment: function (url) {
      env.info('CacheManager: Clearing cache on comment for ' + url);
      cacheStore.dynamic = {};
      delete(cacheStore[getStoreType(url)][url]);
    },

    cache: function (url, headerData, contentData) {
      env.info('CacheManager: Caching content for ' + url);
      cacheStore[getStoreType(url)][url] = { content: contentData, headers: headerData };
    },

    fetch: function (url) {
      var data = cacheStore[getStoreType(url)][url];

      if (typeof(data) != 'undefined') {
        env.info('CacheManager: Found cached entry for ' + url);
        return data;
      }

      return null;
    }
  };
 
}());
 
typeof(exports) != 'undefined' ? exports.CacheManager = CacheManager : null;

Hooking It Up

Previously my main workflow looked something like this: the Router looked at the request and called the assigned Controller, which fetched data, formed it into a model, and passed the model to the ViewManager, which rendered the result to the response stream.

Hooking up the CacheManager meant that I had to get some parts a little “dirtier” than I wanted, but instead of putting a lot of code into the ViewManager, I created the ResponseManager.

/*****************************************
 *   Response Manager
 *****************************************
 *   Author:  mikael.lofjard@gmail.com
 *   Website: http://lofjard.se
 *   License: MIT License
 ****************************************/
 
var ResponseManager = (function () {

  var env = require('./environmentManager').EnvironmentManager;
  var cm = require('./cacheManager').CacheManager;

  var misc = require('./misc').Misc;

  return {

    writeCachedResponse : function (response, cachedUrl) {
      env.info('ResponseManager: Writing cached view for ' + cachedUrl);

      var data = cm.fetch(cachedUrl);

      response.writeHead(200, data.headers);
      response.write(data.content, 'utf-8');
      response.end();
      return;
    },

    writeResponse : function (request, response, responseData, doNotCache) {
      var pathName = misc.getPathName(request.url);

      if (typeof(doNotCache) == 'undefined') {
        cm.cache(pathName, responseData.headers, responseData.content);
      }

      response.writeHead(200, responseData.headers);
      response.write(responseData.content, 'utf-8');
      response.end();
      return;
    }
  };
     
}());
 
typeof(exports) != 'undefined' ? exports.ResponseManager = ResponseManager : null;

The ResponseManager does most of the talking with the CacheManager, and I remade the ViewManager so that the renderView() method now returns the rendered response instead of writing it to the response stream. This lets the Controllers do the job of rendering through the ViewManager and then pass the result to the ResponseManager.

The other part of the equation is the Router. I didn’t really want to put CacheManager calls into the Router, but it is the first place that has a good path to use as a key, so for now the Router checks for a cached response and, if one exists, hands it to the ResponseManager; only when no cached response is found does it look up which Controller to call.
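Stripped of details, the short-circuit in the Router boils down to something like this (a sketch with made-up parameter names, not the actual Router code):

```javascript
// Simplified sketch of the Router's cache short-circuit.
// cacheManager and responseManager stand in for the real modules.
function route(request, response, cacheManager, responseManager, controllers) {
  // Strip the query string so the path can be used as a cache key.
  var pathName = request.url.split('?')[0];

  // Serve straight from memory if we have a stored response.
  if (cacheManager.fetch(pathName) !== null) {
    responseManager.writeCachedResponse(response, pathName);
    return;
  }

  // Otherwise resolve and run a controller as before.
  var controller = controllers[pathName] || controllers['/'];
  controller(request, response);
}
```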

Show Me The Money

So what kind of a performance boost are we talking about?

Well, using the handy ab (ApacheBench) tool, I sent a thousand requests to my different implementations:

Uncached

Requests per second: 5.41 (mean)
Time per request: 184.761 ms (mean)
Transfer rate: 177.92 Kbytes/sec received

Cached

Requests per second: 62.86 (mean)
Time per request: 15.910 ms (mean)
Transfer rate: 2097.06 Kbytes/sec received

That’s quite an increase in performance. So much, in fact, that the transfer rate exceeds my measly 10 Mbit/s outgoing fiber connection. At least now, if my blog were to slow down, I’d know it’s not the server’s fault.

Just for kicks I benchmarked it running on my laptop with a Core 2 Duo at 2.0 GHz and the results point to some possible areas of improvement for Intel on the Atom line (mainly memory access speed):

Cached (on my workstation)

Requests per second: 213.98 (mean)
Time per request: 4.673 ms (mean)
Transfer rate: 7030.95 Kbytes/sec received

Luckily I don’t have enough traffic to warrant an upgrade to my fiber connection. 100/100 Mbit/s costs almost twice as much as my 100/10 Mbit/s.

by Mikael Lofjärd

Screw You Ubuntu - I'm Going Home

Ignoring a short play date with Red Hat around ‘95, my first Linux love was Slackware.

Slackware was fast and awesome but it was somewhat lacking in the package discovery department. I installed most things from source and, after learning about all the bad things that can happen when you install new versions of software on top of the old, I set up a package manager, but Slackware still lacked a central package repository.

All Your Source Are Belong To Us

Gentoo’s central source repository led me to switch. Being able to just install things without having to find the source code online first was great, but again I grew tired. The long compile times eventually wore me out, and this time the switch was made to Ubuntu.

Out of The Box Experience

Ubuntu was nice in an everything-just-works sort of way, but now, some 6 years later, it doesn’t do what I want and I don’t know how to fix it with all the magic going on. I need to get back to my beloved manual tweaking.

The Arch of Truth

Arch Linux is my new pal. I’ve installed a bare core system that boots to the terminal (retro style), I’ve installed X, and I’m about to configure it to my heart’s content. If this doesn’t work out, then: LFS, here I come!

by Mikael Lofjärd

The Development Environment Enigma

As you might have noticed, there hasn’t been much work done on the blog these last few months. It kind of boils down to complexity.

The Old Setup

When I started building this blog my main workstation was running Windows 7. Everything ran as well on Node.js on Windows as it did on my Linux server. It was a nice and simple setup; develop locally, test locally, deploy on server.

Enter CouchDB

Then I added a database: specifically CouchDB, which only worked on Linux. This meant a new, more complex development routine; develop locally, deploy on server, test on server, rinse and repeat in case of error.

This worked for a while but lately as the code has gotten more complex and it does a lot of pre-caching on startup, I’ve been longing for a locally deployed test version again.

The Switch

So I decided to switch my main workstation over to Linux again. I’ve been away from the Linux desktop for a few years (having only server installations at home), but I felt that it was time for me to get back to my roots.

WTF!?

I have very fond memories of Linux from the late 1990s and early 2000s. It was fast, lightweight, fast, beautiful and fast. I think you can see where I’m going with this.

What is up with the Linux desktop of today? I’ve been looking at screenshots of the latest Ubuntu with Unity and it looks like crap, so I installed Linux Mint hoping that Gnome would at least have a more polished look. Boy, was I wrong. Gnome 3 also looks like crap. There are these ugly little inconsistencies showing up all over the place; scroll bars, menu placement, font sizes, you name it.

And it is soooo painfully slow. I mean, when I click on a f-ing menu, I don’t want to wait half a second before it pops up. If my menus could feel fast 10 years ago, why can’t they do the same now?

Now, I know there’s a lot more going on with all these modern composition managers and what-not, but I just want a clean, fast, two-dimensional development environment.

Let’s Do The Time Warp Again

So I figured I’d install some old school (fast) window manager and spend some hours configuring its quirky text config files like back in the old days. Fluxbox (a fork of the old Blackbox) was probably the fastest one I could remember, so that’s what I installed. Awesome!

Damn You Linux Mint

For some weird reason that I can’t figure out, I can’t compile the “latest” CouchDB, and Couchbase Single Server has been discontinued, so I can’t use Linux Mint anyway.

So now I’m going to bite the bullet, install Ubuntu and then kick Unity in the teeth and install Fluxbox again.

TL;DR

  • The new WMs in Linux suck, look ugly, and are too slow.
  • Couchbase annoys me.
  • Things will never be as fast as when Slackware still used libc (3.6 FTW!).
by Mikael Lofjärd
