UK Government To Demand Data On Every Call And Email

Plans could force ISPs and phone operators to hand over records on all phone calls, emails, Tweets and Facebook messages

Phone and email records to be stored in new spy plan

Details of every phone call and text message, email traffic and websites visited online are to be stored in a series of vast databases under new Government anti-terror plans.

This story also made the Slashdot front page.

Stop And Smell The Roses, But Be Wary of The Road Yet Travelled

Here is a wonderful poem that I remember from my childhood.  Although many of the things that we learn in school as children are akin to shrink-wrapped airplane food, there are some juicy bits of wisdom that are worth taking the time to stop and savour.

Stopping by Woods on a Snowy Evening

Whose woods these are I think I know.
His house is in the village though;
He will not see me stopping here
To watch his woods fill up with snow.

My little horse must think it queer
To stop without a farmhouse near
Between the woods and frozen lake
The darkest evening of the year.

He gives his harness bells a shake
To ask if there is some mistake
The only other sound’s the sweep
Of easy wind and downy flake.

The woods are lovely, dark and deep,
But I have promises to keep,
And miles to go before I sleep,
And miles to go before I sleep.

Robert Frost, March 26, 1874 – January 29, 1963 (via Wikipedia)

“The woods are lovely, dark and deep, But I have promises to keep, And miles to go before I sleep, And miles to go before I sleep.”  These words always seem to dance in the back of my mind after any significantly challenging or rewarding moment in my life.  It is a reminder that despite where we may be right now in our lives, we have miles to go (both in mind and body) before we reach our true destination, which is (when you really think about it) just another point of departure.  Go bravely into the great unknown.

Canadian government is ‘muzzling its scientists’

An article about how governments may be actively preventing climate change research findings from reaching the general public.

“I suspect the federal government would prefer that its scientists don’t discuss research that points out just how serious the climate change challenge is.”

This reminds me of the pseudo-documentary “The Age of Stupid”, which discusses the many ways that humanity has been warned about the quickly approaching dangers of climate change.



Just finished “The Book” by Alan Watts. I like it.

I have just finished reading The Book: On the Taboo Against Knowing Who You Are by Alan Watts.  I thoroughly enjoyed it!  It comes as close as I can imagine to a book that helps its reader truly understand the concept of existence, our world, and our “purpose” in it.

The Book (cover)

I must admit that I have had an affair with such ideas and philosophies for a very long time – and this perhaps makes the content and context of the book easier for me to grok than it would be for others – but it is worth the effort.  If there is anything worth doing in this world, I would imagine that understanding who you are, and why you have the experiences and knowledge that you do, in contrast to those of the people around you, is of the utmost significance and importance.

I have written a few articles under various pseudonyms over the years that explore the very concepts explained in this book, but have never really come across a published work that summarized these thoughts as clearly and succinctly as I would have liked, until now.

If you have any capacity or motivation to understand the world you live in, and you are able to free yourself (your mind) from the conditioning of your environment and your upbringing, even for a moment, then I suggest you take the time to read this book.

If you are not very familiar with Eastern or Western philosophy to begin with, then the ideas in this book may be difficult to grasp.  Nevertheless, once you’ve had a chance to explore the basics of such ideas in other writings, you would do well to circle ’round and come back to this marvelous treasure.

Using CouchDB with Perl?

I’ve been working away on a project where I’ll be using CouchDB and Perl, and was searching the ‘net for information on CouchDB CPAN modules.

There were, of course, a lot of CouchDB modules on MetaCPAN; however, I couldn’t figure out which one I should bother messing around with.  I was looking for something simple and straightforward, similar to the module posted in full-source form in Apache CouchDB’s “Getting Started with Perl” guide.

I looked at CouchDB::Client, but found the implementation a little scattered – there are no documented functions explaining how to deal with CouchDB documents, only ones for getting info on and creating databases (the tests in ‘t/’ weren’t very helpful here either).  And the functions don’t say anything about document IDs, which would have been nice.

I also looked at AnyEvent::CouchDB, but again there seemed to be too much going on – too many methods for doing things that I won’t need to do.

The “Getting Started with Perl” guide talks about a module called Net::CouchDB, a module that is curiously missing from CPAN, at least as far as MetaCPAN is concerned – but Jeremy Zawodny wrote up a nice guide called “Hacking with CouchDB” that uses this module, showing off its clean interface, with function names like:

  • $cdb->create_db
  • $cdb->put, and
  • $cdb->new

These are of course in contrast to confusing function names like:

  • $cdb->couch(), or
  • $cdb->replicate()

.. that you’d have to deal with in AnyEvent::CouchDB.

Eventually MetaCPAN led me to “CouchDB::Simple”, which sounded a lot more like what I was looking for.  I didn’t get very far with it, however, since the install failed.  I e-mailed the author to give him a heads-up, since I don’t think my environment was at fault (perlbrew perl 5.14.2 + cpanm).  I’ll give it another try this week and see if I can get past that hurdle.

 Update (2012-01-25):  Gave AnyEvent::CouchDB a try; things didn’t go as smoothly as I was hoping.  I can attribute some of this struggle to my lack of familiarity with CouchDB.  Also gave DB::CouchDB a try and got a lot further, but ran into “bad_content_type” error messages when attempting to post a document to the database.  A little reading revealed that this error can easily be triggered by JSON syntax errors… but isn’t that what the module is supposed to handle?  I’m thinking I may now give this a try with WWW::Curl or some such, since I’m not having much luck with the CouchDB-specific modules… but I’m not done yet :)
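Since CouchDB speaks plain HTTP and JSON, one fallback (just a sketch, under assumptions) is to skip the wrapper modules entirely and use HTTP::Tiny with JSON::PP, both of which ship with Perl 5.14.  The localhost URL and the “testdb” database name below are invented for illustration; note the explicit Content-Type header, which should avoid the “bad_content_type” response:

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use feature 'say';
use HTTP::Tiny;
use JSON::PP qw(encode_json decode_json);

my $couch = 'http://localhost:5984';    # assumed CouchDB location
my $db    = 'testdb';                   # hypothetical database name
my $http  = HTTP::Tiny->new( timeout => 5 );

# Encode the document ourselves, so the body is guaranteed to be
# valid JSON rather than a hand-built string.
my $doc = encode_json( { title => 'test', count => 1 } );

# POST /db with an explicit application/json Content-Type header.
my $res = $http->post(
    "$couch/$db",
    {
        headers => { 'Content-Type' => 'application/json' },
        content => $doc,
    }
);

if ( $res->{success} ) {
    # CouchDB replies with JSON like {"ok":true,"id":"...","rev":"..."}
    say 'created doc id: ' . decode_json( $res->{content} )->{id};
}
else {
    say "couldn't reach CouchDB: $res->{status} $res->{reason}";
}
```

Letting a JSON module build the body is the whole point here: the “bad_content_type” error shouldn’t be reproducible with a machine-encoded document and the right header.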


[Quote] Alan Watts on the difference between where we are, and where we are going..

“… To most of us living today, all these fantasies of the future seem most objectionable: the loss of privacy and freedom, the restriction of travel, and the progressive conversion of flesh and blood, wood and stone, fruit and fish, sight and sound, into plastic, synthetic, and electronic reproductions. Increasingly, the artist and musician puts himself out of business through making ever more faithful and inexpensive reproductions of his original works. Is reproduction in this sense to replace biological reproduction, through cellular fission or sexual union? In short, is the next step in evolution to be the transformation of man into nothing more than electronic patterns?”

“All these eventualities may seem so remote as to be unworthy of concern. Yet in so many ways they are already with us, and, as we have seen, the speed of technical and social change accelerates more than we like to admit. The popularity of science-fiction attests to a very widespread fascination with such questions, and so much science-fiction is in fact a commentary on the present, since one of the best ways of understanding what goes on today is to extend it into tomorrow. What is the difference between what is happening, on the one hand, and the direction of its motion, on the other? If I am flying from London to New York, I am moving westwards even before leaving the British Coast.”

– From: The Book: On the Taboo Against Knowing Who You Are, by Alan W. Watts, First Collier Books Edition, 1967

An arbitrary thought

I’m sure that not everyone has this ability. It takes a sense of rhythm, and the ability to identify and follow a pattern.  It is the ability that allows you to listen to music and have it somehow tell an epic story in the depths of your imagination.  If you’ve ever done this, then you know precisely what I’m talking about.  I would argue that it is one of the most creative things a human being can do – create a visual story based on sound.  Like our ability to draw three-dimensional objects on two-dimensional surfaces, these are very special abilities – very rare gifts indeed.

Just had an idea for a multi-player, multi-controller, single interface, multi-achievement gaming environment.

Imagine a multiplayer game where two or more people are playing the same game simultaneously, and controlling the same character. I don’t mean controlling parts of the character, I mean the whole character (for example, if implemented in a first-person shooter, or RPG like Oblivion).

The game starts, and you are both playing the same character at the same point in time. The way this works is that each player is playing an instance of that character in the same world. Each player is able to make decisions and do things however they see fit with their character instance. The character that has the highest achievement score after a major decision or event becomes the save-point for the next period of play. So whoever makes the better decision, or whoever fights the best and delivers the most damage and kills the bad guy, wins that round, and the game continues forward from that point.

This could be applied to games like Oblivion, where multiple people are playing the same character: the one who kills the vampire, or the one who is able to pick the lock, wins that “encounter”, receives a separately counted set of points (tied to the player, not the character), and the game is saved and continues from that point.

I think this would be a great game to play: you could jump in and out any time you want, and the game would continue to move forward because of the other players. Consider this idea GPL’d.

New Acer Iconia Tab Acquired

Picked up an Android Iconia Tab over the weekend. Now to load up the latest draft of Modern Perl. Installed an SSH client and TweetDeck, both of which seem to work nicely. Still getting used to the touch-based on-screen keyboard, but that will take some time (I’m typing this post on the tablet right now, so forgive the lack of attention to detail).

I came across a very cool keyboard alternative called 8pen, which I’ll be looking into shortly. The primary reason I bought the tablet was for reading books; I was originally planning to get the Kobo Vox, but after playing with it in person, it didn’t impress me that much.

Playing With Prime Numbers

I’ve been toying around with functional programming, and recently came across a perlmonks thread discussing multiple ways to calculate prime numbers.  One of the things I noticed was that almost all of the examples used loops of some sort (for, while, etc.).  So I decided to tackle the problem without using any loops – instead, I’ll use only recursive functions.

Firstly, here’s the perlmonks thread: Prime Number Finder

And here’s the solution I came up with:

#!/usr/bin/env perl

use strict;
use warnings;
use 5.010;

$DB::deep = 500;
$DB::deep = $DB::deep; # Avoids silly 'used only once' warning

no warnings "recursion";

# Identify primes between ARG0 and ARG1

my ($x, $y, $re_int, $result);
my ($prime, $is_int);

$x = $ARGV[0];
$y = $ARGV[1];

$is_int = sub {
    my ($x) = @_;
    my $re_int = qr(^-?\d+\z);
    return $x =~ $re_int ? 1 : 0;
};

$prime = sub {
    my ( $x, $y ) = @_;
    if ( $y > 1 ) {
        given ($x) {
            when ( $is_int->( $x / $y ) == 1 ) {
                return 0;
            }
            default {
                return $prime->( $x, $y - 1 );
            }
        }
    }
    else {
        return 1;
    }
};

$result = sub {
    my ( $x, $y ) = @_;
    if ( $x <= $y ) {
        if ( $prime->( $x, $x - 1 ) ) {
            say $x;
        }
        $result->( ( $x + 1 ), $y );
    }
};

$result->( $x, $y );

When running this code with larger numbers, I would eventually run into “deep recursion” warnings, which is why I’ve had to use no warnings "recursion"; and set $DB::deep to a value higher than 100 (the default). $DB::deep is a debugger variable that sets the recursion depth at which Perl’s debugger starts complaining, in order to catch long-running or infinite recursive operations.

The method I’m using here to calculate prime numbers isn’t the most efficient, since I’m not doing anything to reduce the amount of numbers I have to test at each cycle. However, adding some extra intelligence to this, such as the filtering used by the Sieve of Eratosthenes (an “ancient Greek algorithm for finding all prime numbers up to a specified integer.”) should be doable.
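For comparison, here is a minimal sketch of that Sieve of Eratosthenes idea in Perl (a plain loop-based version, so it deliberately ignores the no-loops constraint of the recursive exercise above):

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use feature 'say';

# Sieve of Eratosthenes: cross off multiples of each prime,
# instead of trial-dividing every candidate number.
sub sieve {
    my ($limit) = @_;
    my @is_prime = (1) x ( $limit + 1 );
    @is_prime[ 0, 1 ] = ( 0, 0 );    # 0 and 1 are not prime

    for my $n ( 2 .. int sqrt $limit ) {
        next unless $is_prime[$n];
        # Start at n*n: smaller multiples were crossed off already.
        for ( my $m = $n * $n ; $m <= $limit ; $m += $n ) {
            $is_prime[$m] = 0;
        }
    }
    return grep { $is_prime[$_] } 2 .. $limit;
}

say join ', ', sieve(30);    # 2, 3, 5, 7, 11, 13, 17, 19, 23, 29
```

Because each composite is crossed off once rather than repeatedly tested by division, this scales far better than the recursive trial-division approach.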

I’ll be keeping an eye out for other solutions, since I’m sure there are many (especially in Perl), but so far this one seems to be fairly fast and clean. I’m looking forward to seeing what Math::BigInt can offer here as well, if anything.

Playing with Factorials, Haskell, and Perl

I’m currently making my way through a book called “Seven Languages in Seven Weeks” by Bruce A. Tate.  So far it’s been an interesting read, but I’m far from finished.

One of the things in the book that caught my eye was a recursive factorial function in Haskell, which seemed so simple that I had to see what it would look like in Perl.

So I wrote up the following Perl snippets to calculate factorials.  There are, of course, multiple ways to do it, as I’ll describe below.  There are also (likely) many other ways I haven’t thought of, so if you have an interesting solution, please share.

One of the things that really caught my attention was how simple the syntax was for writing something so complex.  Recursion is a fairly simple idea once you’ve seen it in action – a function that executes itself.  However, the implementation of recursion in a given programming language can be somewhat difficult to comprehend, especially for new programmers or those without programming experience.

Although I haven’t dived into Haskell quite yet, it seems to make implementing a factorial function so simple, that I kind of stumbled when trying to understand it, thinking I was missing something.. but it was all there in front of me!

Firstly, let’s clarify what a factorial is (from wikipedia):

In mathematics, the factorial of a non-negative integer n, denoted by n!, is the product of all positive integers less than or equal to n. For example,

5! = 5 × 4 × 3 × 2 × 1 = 120


So the factorial of 5 is 120.  Or 5! = 120.   Let’s look at the Haskell example from the book.

let fact x = if x == 0 then 1 else fact (x - 1) * x

The above line is saying “if x is 0, then the factorial is 1 – otherwise, call myself with (x – 1), multiplied by x”

Let’s look at this in GHCi (the Haskell console):

[jbl@watchtower tmp]$ ghci
GHCi, version 7.0.3:  :? for help
Loading package ghc-prim ... linking ... done.
Loading package integer-gmp ... linking ... done.
Loading package base ... linking ... done.
Loading package ffi-1.0 ... linking ... done.
Prelude> let fact x = if x == 0 then 1 else fact (x - 1) * x
Prelude> fact 5
120

After seeing how easy it was to implement the recursive factorial function in Haskell, here are my attempts in perl.

Firstly, using a loop:

#!/usr/bin/env perl

use strict;
use warnings;
use feature "say";

my $nni = $ARGV[0] ? $ARGV[0] : 5;

for my $i ( 1 .. ( $nni - 1 ) ) {
    $nni = $nni * $i;
}
say $nni;

This first example doesn’t implement a function, and is really just bad (but still working) code. It requires that your base number be global and alterable, in this case $nni.

Now, let’s try it with an actual function:

#!/usr/bin/env perl

use strict;
use warnings;
use feature "say";

my $nni = $ARGV[0] ? $ARGV[0] : 5;

sub fact {
    my ($nni) = @_;
    return !$nni ? 1 : fact( $nni - 1 ) * $nni;
}
say fact($nni);

This second method works similarly to the Haskell implementation. It implements a function that calls itself, without any looping required.

However, it’s still not as concise as the Haskell version, so let’s try again:

#!/usr/bin/env perl

use strict;
use warnings;
use feature "say";

my $nni = $ARGV[0] ? $ARGV[0] : 5;
my $fact;
$fact = sub { my ($nni) = @_; !$nni ? 1 : $fact->( $nni - 1 ) * $nni };
say $fact->($nni);

Aha, now we’re getting somewhere. In this third example, the fact() function is anonymous, and we’re assigning it to $fact via reference. This allows us to use $fact like an object with a single method that does the factorial calculation.

Although this is pretty much as concise as I was able to get it while taking readability into account, here’s a final example that goes a step further:

#!/usr/bin/env perl

use strict;
use warnings;
use feature "say";

my ($nni, $fact);
$nni = $ARGV[0] ? $ARGV[0] : 5;
$fact = sub { !$_[0] ? 1 : $fact->( $_[0] - 1 ) * $_[0] };
say $fact->($nni);

This last example uses Perl’s predefined variable @_, which holds the list of arguments passed to a function. I usually avoid doing this, since it hurts readability, especially for those who don’t live and breathe Perl on a daily basis.

To my surprise, it would seem that Haskell has Perl beat (at least in this example) as far as readability + conciseness is concerned.

I haven’t spent much time playing golf here to reduce the number of lines or characters beyond the last example, but if anyone does come up with a tighter solution, please let me know!

Edit (20111005T22:43:50): Here’s a version I found that uses the Math::BigInt module

#!/usr/bin/env perl

use strict;
use warnings;
use feature "say";
use Math::BigInt lib=>'GMP';

my $b = Math::BigInt->new($ARGV[0]);
say $b->bfac();

This version is likely much faster, since the Math::BigInt package is intended to be used in situations where large integers are being handled.

Here’s the post I found with examples written in other languages as well: Factorial Challenge: Python, Perl, Ruby, and C

Now Listening: Deadmau5 – Faxing Berlin (Grifta Dubstep Remix)

Now that I’ve updated my Arch Linux 64 desktop with some newer packages, I’ve come across a few surprises. For one, XBMC works! I was so peeved when I couldn’t get it to work before. XBMC is an awesome media player for Linux (and the original Xbox).

The latest releases of XBMC include a visualization plugin that I’ve been trying to get my hands on for the longest time, called ProjectM. No other media player that I was comfortable installing had a ProjectM plugin, including VLC and Totem.

Anyway, I now finally have XBMC running (and I can even run it in a window, under Xmonad.. which looks freaking awesome.)

Happy Friday!

The Movie Review: Limitless

So I watched Limitless the other day on the advice of a good friend.  (Spoiler alert.) It was awesome.

Limitless, starring Bradley Cooper as Edward “Eddie” Morra, explores a world where a pharmaceutical company has manufactured pills that, when ingested, allow the imbiber to use the full capabilities of his or her brain.  If we currently use only 20% of our brains, then just one of these pills would let you use 100% of your brain power within 30 seconds of swallowing it, and the effects last for almost a day.

What makes this movie exciting for me is the believability of the powers that Eddie is endowed with, not to mention a great performance by the cast.  When Eddie takes his first pill, he essentially gains instant access to every single memory that he has.  Everything that he has learned, overheard, or glanced at briefly, becomes a strength and an intuition that he quickly begins to realize and exploit.  Not only does Eddie have access to all of his memories, including all of the Bruce Lee movies he has ever watched, but he also instantly gains the muscle-memory required to accomplish these feats which he has never trained for.

Eddie’s senses become sharper, and some type of time-dilation occurs which allows him to instantly absorb the smallest details about the environment around him.  He becomes intensely motivated to challenge himself by doing seemingly impossible things, like learning to fluently speak new languages in a matter of days.  He can mentally profile people he’s just met with great accuracy in a matter of seconds, allowing him to masterfully manipulate people and conversations to his advantage.  But the most important “ability” that he gains, in my opinion, is the drive to go out into the world and do something.

You see, before Eddie took the pill, he was an unmotivated, unsuccessful writer with one failed relationship behind him and another crumbling right in front of him.  He was always behind on his rent.  He had perpetual writer’s block, which prevented him from even starting the novel he’d been promising his publisher for several weeks; and even though he was a reasonably healthy guy with a place to live (for the time being), he always looked homeless whenever he went out in public.   Eddie was a mess before the pill.   After taking it, however, and given the ability to master his own mind, Eddie was no longer afraid of anything.  He could quickly visualize possible solutions to any unexpected situation and address it with ease.  Eddie completely turned his life around and began to accomplish things he would never have dreamed of.

I expect that fear is an element of life that can hold us all back from reaching our full potential, but only if we let it.

Eventually, Eddie figures out how to maintain his newfound mental mastery without the pills, and realizes that he has the entire world at his fingertips.  However, the only thing Eddie wants to do, more than anything else, is share his abilities with the rest of the world so that everyone can experience what it feels like to be limitless.

It should be obvious that I think Limitless is an awesome movie, and I believe many of you will enjoy it as well.



Understanding the Concepts of Classes, Objects, and Data-types in Computer Programming


Every once in a while I get into discussions with various people about computer programming concepts such as objects, classes, data-types, methods, and functions.

Sometimes these discussions result in questions about the definition of these terms from those who have some programming experience, and sometimes these questions come from those who have no background in computer science or familiarity with programming terminology whatsoever.

I usually attempt to answer these questions using various metaphors and concepts that I feel the individual or group may be able to relate to.

It is quite possible that even my own understanding of these concepts may be incorrect or incomplete in some way. For the sake of reference and consistency, I am writing this brief article to explore these concepts in the hope that it will provide clarity into their meaning and purpose.


So, what is a data-type?

Some languages, like C, have a strict, fixed set of data-types. Other languages, like C++ and Java, offer developers the ability to create their own data-types, which have the same privileges as the data-types built into the language itself.

A data-type is a strict set of rules that govern how a variable can be used.

A variable with a specific data-type can be thought of in the same way as material things in the real world. Things have attributes that make them what they are. For example, a book has pages made of paper that can be read. A car is generally understood to be an automobile that has four wheels and can be used for transport. You cannot drive a book to the grocery store, in the same sense that you cannot turn the pages of a car.

Data-types in computer programming may include examples such as:

Type     Attributes
====     ==========
Int      a whole number, with no decimal places
Float    a number with decimal places
Char     a single plain-text character


More familiar, real-world examples may include:

Object   Attributes
======   ==========
Bucket   strong, can hold water, has a handle
Balloon  fragile, can hold a variable amount of air, elastic, portable
Wheel    round, metal or rubber, rolls


In languages like C++, there are core data-types such as the ones found in C. However, C++ also offers developers the ability to create their own data-types, which makes the language much more flexible. A user-defined data-type is more commonly known by the popular term class.

In C++, a class is a user-defined data-type [1]. That’s all it is. It provides the developer the ability to create a variable (or object) with specific attributes and restrictions, in the same way that “int dollars = 5;” creates an object called “dollars” whose value is strictly an integer. In the real world, a five-dollar bill cannot be eaten (technically), and it cannot be driven like a car to a grocery store (even though that’s where it will likely end up).

An object is a variable that has been defined with a specific data-type. A variable is an object when it is an instance of a class, or when it contains more than just data. An object in computer programming is like an object in the real world, such as a car or a book. There are specific rules that govern how an object can be used, which are inferred by the very nature of the object itself.

The nature of computer programming means that developers have the ability to redefine objects, for example making the object “book” something that can be driven. In the real world, however, we know that you can call a car a book, but it’s still a car. The core understanding of what a car is has been ingrained within us. Although “car” is simply a three-letter word (a symbol, or label), there are too many people and things in the world that depend on the word “car” having a specific definition. Therefore objects in the real world cannot be as easily redefined as their counterparts in computer programming (however, it is still possible [2]).

So what is a method?

In computer programming, we have things called “functions”. A function is an enclosed set of instructions which are executed in order to generate (or “return”) a specific result or set of results. You can think of a function as a mini program. Computer programs are often created by piecing together multiple functions in interesting and creative ways.

Functions have many names, and can also be referred to as subroutines, blocks, and methods. A method is a function that is part of a class (a user-defined data-type), which makes a method an attribute of an object – something the object is capable of doing.  Just like in the real world, methods can be manipulated and redefined for an object, but not for that object’s base class.  A book can be used to prop up a coffee table, but that does not mean that books are, by definition, meant to be used that way.
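Since the code on this blog is usually Perl, here is a rough sketch of these ideas using a Perl package (Perl’s flavour of a class); the Book class and its turn_page method are invented purely for illustration:

```perl
#!/usr/bin/env perl
use strict;
use warnings;

package Book;

# The constructor builds a new instance (object) of the Book data-type.
sub new {
    my ( $class, %args ) = @_;
    my $self = {
        title => $args{title},
        page  => 1,            # every book starts on page one
    };
    return bless $self, $class;    # $self is now a Book object
}

# A method: a function that belongs to the class, describing
# something a Book object is capable of doing.
sub turn_page {
    my ($self) = @_;
    return ++$self->{page};
}

package main;

my $book = Book->new( title => 'The C++ Programming Language' );
$book->turn_page for 1 .. 3;
print $book->{page}, "\n";    # prints 4
```

The point of the sketch is the separation: the package defines the rules (the data-type), while $book is one object governed by those rules – you can call turn_page on it, but not, say, drive it.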

Enlightenment achieved!

I’m not really sure where I was going with all of this, but the above should be sufficiently lucid.   I was motivated to write this after recently referencing Bjarne Stroustrup’s “The C++ Programming Language”.  If you’ve ever asked yourself the question “what is an object?” or “what is a class?”, then the above descriptions should serve as a useful reference.

[1] “The C++ Programming Language – Special Edition”, page 224.

[2] For example, the definition of “phone” has been redefined several times in recent history, from the concept of a dial-based phone, to cell phones, to modern smart-phones.


New drug could cure nearly any viral infection

Most bacterial infections can be treated with antibiotics such as penicillin, discovered decades ago. However, such drugs are useless against viral infections, including influenza, the common cold, and deadly hemorrhagic fevers such as Ebola.

Now, in a development that could transform how viral infections are treated, a team of researchers at MIT’s Lincoln Laboratory has designed a drug that can identify cells that have been infected by any type of virus, then kill those cells to terminate the infection.

In a paper published July 27 in the journal PLoS One, the researchers tested their drug against 15 viruses, and found it was effective against all of them — including rhinoviruses that cause the common cold, H1N1 influenza, a stomach virus, a polio virus, dengue fever and several other types of hemorrhagic fever.

The drug works by targeting a type of RNA produced only in cells that have been infected by viruses. “In theory, it should work against all viruses,” says Todd Rider, a senior staff scientist in Lincoln Laboratory’s Chemical, Biological, and Nanoscale Technologies Group who invented the new technology.

Because the technology is so broad-spectrum, it could potentially also be used to combat outbreaks of new viruses, such as the 2003 SARS (severe acute respiratory syndrome) outbreak, Rider says.

Other members of the research team are Lincoln Lab staff members Scott Wick, Christina Zook, Tara Boettcher, Jennifer Pancoast and Benjamin Zusman…[ Full Article ]

Trac and Reverse Proxies Using Apache 2

So today I was working on setting up a Trac instance behind a reverse proxy, and found that it can be done quite easily under Apache 2, which lets you set up your reverse proxy in a variety of different ways.

The main thing to note here is that you should never try to ProxyPass/ProxyPassReverse from port 443 to port 443. Instead, pass from port 443 to port 80 on the back-end, and let the proxy handle the SSL negotiation with the browser.
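As a sketch, a front-end virtual host for this might look like the following (the hostnames, ports, and certificate paths are placeholders, not from a real configuration):

```apache
<VirtualHost *:443>
    ServerName trac.example.com

    # SSL terminates here, at the proxy
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/trac.example.com.crt
    SSLCertificateKeyFile /etc/ssl/private/trac.example.com.key

    # Pass plain HTTP (port 80) to the back-end Trac server
    ProxyPass        / http://backend.internal:80/
    ProxyPassReverse / http://backend.internal:80/
    ProxyPreserveHost On
</VirtualHost>
```

The browser only ever negotiates SSL with the proxy; the back-end web server just serves plain HTTP.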

If you require a secure connection from the back-end web server to the front-end proxy server, then use a VPN or an SSH tunnel to pass the data; but don’t waste your time trying to make the web server behind your proxy handle the SSL negotiation with the browser.  The time you’d spend trying to figure it out would be better spent doing something more fun, like setting up a Cacti instance to monitor bandwidth consumption on your home network :)

So Whats New?

Looks like the Canada Post strike is over, for now.

I don’t know if it would be wise for Canada Post to strike again.  It’s crossed my mind several times, as I’m sure it has crossed the minds of others, that we have enough technology in place, including wireless technologies and the Internet, that we could essentially do away with Canada Post and do everything digitally. It may seem like Canada Post offers a unique and relatively inexpensive service; but with them out of the picture, new solutions would start springing up in no time.

Document imaging would become a hot topic again (whatever happened to the popularity of personal document scanners?). Encryption would once again become an active topic of discussion.  Companies like Purolator and FedEx simply cost too much, and so more “do-it-yourself” solutions would begin to flourish quickly.

I’ve been listening to some of my favorite tunes lately that I haven’t heard in a while, such as INXS, Portishead, Red Hot Chili Peppers, and so on.  What I’ve found very curious is that many of these great bands and their awesome songs have been with me ever since I was a clueless child.   Songs by Counting Crows, Seal, REM, and Evanescence really spoke to me, and validated, for the most part, that this world we live in truly is a crazy place.  These songs were just the universe’s way of telling me that it totally agrees with me.

I think some of the best lyrics in any song have come from Evanescence.  They were a band so ahead of its time, it isn’t even funny.  There are only a few other bands that I would place in that category, such as the Red Hot Chili Peppers and Sneaker Pimps.  Hell, I’d even put No Doubt on that list; their “Tragic Kingdom” album was absolutely freaking awesome!  But there simply aren’t that many bands today that I’d put in that category… or maybe I just don’t listen to enough new bands to know?

What bands do you like to listen to that you’d rank up there with a title of “One of the Greatest Bands of All Time”?

Coke Zero taste in my mouth, and my legs are rather numb

So I’m sitting here in front of my 28-inch I-INC monitor again, after the longest while. A couple of months ago we decided to mount the I-INC on the wall in our bedroom, and move the 32-inch Samsung LCD TV into the basement to serve as my replacement monitor. This was a big mistake, though the reasons seemed justified at the time. We had recently bought a 55-inch Samsung plasma to replace the now-puny 32-inch Samsung in the living room, and I wanted to start using my original XBOX again, because of a personal quest to start playing DDR again with my Red-Octane dance pad.

The Red-Octane dance pad didn’t work with the 360, so I had to use the original XBOX. However, the original XBOX did not support HDMI connectors, so I had to find a way to connect it to my monitor in the basement. Since this didn’t seem feasible, I decided that replacing my 28-inch I-INC LCD monitor with my 32-inch TV would be a good idea. Again, bad idea.

Anyway, after several weeks of using an awkward 1366×768 resolution with 75 DPI fonts, I decided it was time to admit I was wrong, and revert to using the 28-inch monitor (I put the 32-inch in the bedroom, like I should have done all along).

Stop The Meter On Your Internet Use

Bell Canada and other big telecom companies can now freely impose usage-based billing on independent Internet Service Providers (indie ISPs) and YOU.

This means we’re looking at a future where ISPs will charge per byte, the way they do with smart phones. If we allow this to happen Canadians will have no choice but to pay more for less Internet. Big Telecom companies are obviously trying to gouge consumers, control the Internet market, and ensure that consumers continue to subscribe to their television services.

This will crush innovative services, Canada’s digital competitiveness, and your wallet.

We need to stand up for the Internet.

Sign the Stop The Meter petition!


Thoughts on Religion and The World Today

I’m starting to think that more and more people are falling back on religious beliefs and traditions because they believe that once you become secular or atheist, the world has no meaning, things simply are as they are, and there is no mystical purpose to life or the universe; and this makes them uncomfortable.

I think it’s comforting to know that the universe, our galaxy, our solar system, our world, and yes, all of us are travelling down a singular path which we cannot break away from; that all events, actions, and reactions which take place in the physical world, that have taken place in history, are simply things that we must experience in order to move forward.

It is inevitable that we will all die one day, and that the generations that come after us will all make the same mistakes (not necessarily in the same way, but in the same vein). The world population is continually growing, and it is already well past Earth’s capacity to support human life without the assistance of war and fear (or so it would seem, the way our world leaders have been acting and reacting lately). These are all things people are starting to realize in this age of free information and global connectivity. The reason the majority of the world cannot cope with this realization is that our actions are still heavily influenced by emotions, faith, and romanticism.

Although we have so much technology at our disposal, we are still not a species that allows logic and reason to dictate our choices. Instead, emotions, faith, greed, and misinformation are still the primary movers of the world.

Likely we will one day become an enlightened species, and finally refer to ourselves as “the Human race” rather than as white or black, or by where we were born or raised.  Maybe one day we will become a single nation which finally takes direction from logic, reason, and innocent curiosity; but that time seems very far away, and it’s quite likely we will destroy ourselves before we ever get there.  Still, it’s possible that we will overcome all our hatred and prejudices, and finally become a peaceful and free world.  Likely you and I won’t be around to see that day, but you never know.  Have faith.

Someone Hacked My Web Server

So I just found out that someone hacked into my web server recently. I’m not sure when they started poking around, but I saw some significant activity around December 17th.

I say “hacked” instead of “cracked” or defaced/damaged because I haven’t seen any actual malicious activity, just a lot of WordPress PHP scripts with some eval code appended to the top.

I’ve backed up the hacked PHP scripts and will try to decipher them later. The scripts are basically a bunch of PHP evals of statements encoded in base64. I could probably decode them quickly with a Perl script that changes all the evals to print statements, and then use the equivalent of perltidy to make them readable, in order to find out exactly what they were trying to do.
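As a rough sketch of that decoding idea (the regex below assumes a simple eval(base64_decode("…")) pattern; real obfuscation is often layered, so it may take several passes):

```shell
# decode_evals: print the decoded body of every base64_decode("...") call
# found in the file given as $1, instead of executing it.
decode_evals() {
    grep -o 'base64_decode("[A-Za-z0-9+/=]*")' "$1" \
      | sed 's/^base64_decode("//; s/")$//' \
      | while read -r blob; do
            printf '%s\n' "$blob" | base64 -d
            echo
        done
}

# Example: decode_evals style.css.php > decoded.txt
```

The decoded output can then be tidied up and read at leisure.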

In any event, it’s likely they still have some backdoor set up, because it seems they got root access, or at least the ability to write a file with root permissions into the DocumentRoot, so I’ll have to keep an eye out.
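On the “keep an eye out” front, a quick sketch like the following can help spot files touched recently under the web root (the default path and time window are assumptions; on a compromised box, comparing against known-good checksums would be far more reliable):

```shell
# recent_php: list PHP files under a web root modified within the last N days
recent_php() {
    root="${1:-/var/www}"    # assumed DocumentRoot
    days="${2:-14}"          # assumed window
    find "$root" -name '*.php' -mtime "-$days" -print
}

# Example: recent_php /var/www 30
```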

I’ve upgraded the system to Lenny (it was Debian Etch, so yeah, I’m at fault there) and upgraded WordPress from 2.3.x to the latest 3.0.4. I blew away the hacked WordPress instance and installed WordPress from scratch, along with some other things which will hopefully alert me when something like this happens again.

To the person responsible: I’m not running this web server as some sort of proof of my skill set. It’s simply a personal web server which I host myself, because I don’t much like being pushed into the idea of cloud computing and hosting my stuff on Blogspot, etc. I think it’s good to be able to host your own applications and services, and not be tied down to services provided by Big Corp.

My message to you is this: use your head. It was probably fun to try to break in, but actions like this are what’s causing people to embrace cloud computing with open arms. Eventually Big Corp will be hosting everyone’s data, and the freedom you now have to learn how to manipulate PHP will be non-existent, because we’ll all be stuck in AOL hell.

If you want to do something cool and interesting, why not try using your skills to help people?

If anyone’s interested in taking a look at the encoded PHP, here’s what looks to be one of the primary sources: style.css.php.  Note that the script is basically all on a single, really long line, so some text editors may have trouble reading it.

S5 Presentation Software, XMind, Freemind, and mm2s5

I’m tired and a bit wired, but I figured I’d put a few words together just to purge my messy mind. So today I’d like to talk about presentation software (a la powerpoint); mind-mapping software, and how to get from one to the other in an interesting way.

I’ve been a mind-mapping fanatic for many years, as far back as 2004 if I recall correctly. Back then (and even up to today) I’ve used and loved the free and open-source mind-mapping software called Freemind. It’s a great little piece of Java software which provides a great UI for brainstorming and outlining with mind-maps.

These days, I use a mix of Freemind and XMind for my day-to-day brainstorming and planning. XMind is like Freemind (in fact, I’m sure it borrowed many ideas from that project), but it has a nicer UI and many more options in terms of layout, tagging, markers, etc. I find that I jump between the two often, until my brainstorming takes on a life of its own; then I stick with one or the other for the remainder of the map creation.

I recently had to put together a presentation for the Toronto Perl Mongers group to discuss, well, Perl… and VMware. And of course I whipped out Freemind and XMind to start the brainstorming process. XMind has a nice feature that lets you export your mind-maps to an MS PowerPoint or OpenOffice Impress type format, which is exactly what I needed. The problem is that this feature is not free; it comes as part of XMind’s online subscription services for the “professional” version of the product. Even though the price is fairly reasonable, and at some point I may just bite the bullet and subscribe, I wasn’t ready to do that just yet. So I was on the hunt for some way to convert my mind-map into some kind of presentation.

To their credit, one thing that XMind does do properly is let you export your XMind maps to Freemind’s .mm format. This is great, because Freemind itself has multiple freely accessible export formats, including PDF. However, I wasn’t satisfied; I was looking for something that would do the job more completely.

Eventually I came across a neat little HTML/JavaScript-based presentation tool called S5, which stands for “Simple Standards-Based Slide Show System”. This tool was exactly what I was looking for! Its small, clean, no-fluff implementation meant that I could whip up a professional-looking presentation without the need to load up any bulky software aside from Firefox. The problem remained, though, that my data was still in XMind (and Freemind) formats. I considered writing a tool to convert Freemind XML files into S5 HTML documents, which would have been fairly easy since both formats are fairly open and clear; however, that would have taken a good deal of time, and time is one thing I never seem to have enough of these days.

So I went hunting on the plains of Google to see if anyone was experiencing the same problem I was, and whether they had done anything about it. And what do you know! I found a project on Google Code that does exactly that! The project is called (reasonably enough) mm2s5, and it does a wonderful job of converting my Freemind mind-maps into S5 presentation format!

Anyone who’s interested in finding a nice way to brainstorm and turn their ideas into presentations should seriously consider trying these tools out. They’re fantastic, and they’re free!

Work on CPAN-API and Perl Modules Indexing

Since the last TPM meeting in October, some of the TPM members have been working diligently to improve the CPAN search experience by re-architecting it from the bottom up. I’ve joined the design team in the hope of providing the Perl community with a much-improved CPAN experience.

As most Perl developers are aware, CPAN search is great for finding useful libraries and modules, but horrible at providing any significant information relating modules to each other, or any useful meta-information or statistics that could be used to make better decisions about which modules to use, let alone deploy in a production environment.

If you are interested in taking part in the CPAN-API community project, please contact me, or visit the CPAN-API project site on GitHub.

Toronto Perl Mongers:

Jolicloud is of the Awesome

So if you haven’t heard of Jolicloud, you need to download and install it now. It’s an Ubuntu-based OS (a self-proclaimed “Cloud OS”) specifically designed for netbooks, and it rocks. I have Jolicloud installed on my Samsung N110 netbook, and I use it for everything from e-mail to games (snes9x) to work (Perl/Vim/Screen). Now, what makes Jolicloud super-awesome is that it treats web applications no differently from desktop applications. Each application gets its own icon on the “Home screen”. It’s also socially aware: it can connect to Facebook and let you search for applications and/or people who’ve used those applications, so that you can ask them questions and get guidance on the tools you’re trying to use.

The interface is very slick – big icons and a clean method of navigation to the lesser used functions of a standard Gnome/Ubuntu desktop. The most-awesomest part is that once you load up a terminal, you have full access to the command-line and all Ubuntu apt repositories.

Jolicloud isn’t just for netbooks! I’ve also installed it on my Acer Veriton (similar to the Acer Revo), and am using it as a media-center OS. Jolicloud also comes in an “express” edition, which allows you to install it under Windows, where it comes up as a secondary OS option in the Windows boot-loader.

If you have a netbook, nettop, or any light-weight PC, then install Jolicloud. Highly recommended.

Diving in with Arch Linux

The Problem

The time had come for me to “invest” in getting some new equipment. The only workstation that I had up until recently was a company laptop which I had toted back and forth between VMware and my home office. I keep my personal documents on removable storage, but that doesn’t really help when you don’t have a workstation at home, so lugging the laptop around with me was a must.

Don’t get me wrong, I have systems, but they’re mostly systems running as file servers or VM servers, doing various little things automagically, and they’re not sitting in or around my actual desk at home. Also, my printers/scanner at home relied on my laptop to be of any use. It was time to fix all of these unnecessary grievances.

The Dilemma

For the past couple of weeks I had been thinking hard about what kind of system I should buy – should it be a powerful / modern desktop system with lots of RAM and screaming CPU/Video? Or would it be a powerful laptop/notebook which would serve as a desktop replacement? Should I go for the i3, i5, or i7 processor? ATI or Nvidia? What kind of budget was I looking at?

All of these questions plagued me for quite some time (okay, not that long… I admit I’m a bit of an impulse buyer). But I spent long enough thinking about this that I realized a lot about myself. For one, I’m not a gamer. I was once one of those people who would have been ecstatic about getting next-gen hardware to play the latest power-hungry games. Not any more… and not for quite some time. The last time I seriously played a PC game was about three years ago. When I say “seriously”, I mean played it regularly, at least once a week. The last game I was really into was X2, of the X-Series space combat simulators.

Since then, I’ve touched a game or two on and off, but the fascination is no longer there. I’m more interested in hacking around with open-source programs and becoming a better developer.

The Solution

Since I wasn’t going to focus on gaming and media for my new system purchase, this opened the door to a lot of possibilities that I hadn’t considered, and some unexpected disappointments. First off, since I wasn’t going to plop $1,000.00 on a single system, I could theoretically buy two lower-powered systems. And that’s exactly what I did. Instead of going with a full-fledged desktop or power-house laptop, I ended up buying an Acer Aspire Revo net-top unit as my primary workstation, and a Samsung N110 netbook as my portable. This Revo is awesome! It has 2GB of RAM (upgradable to 4), an Nvidia ION chipset, and a dual-core Intel Atom processor. I didn’t need much more than this for my purposes; it was perfect. The Samsung N110 was also a nice little beauty. It has an Atom processor with integrated graphics, but it’s light, pretty, and has a 6-cell battery, which means it lasts about 8 hours under heavy use. I quickly installed Jolicloud Express on the netbook, and have been very happy with it ever since.

The Disappointment (In myself)

The disappointment that I experienced was not in the purchase or the hardware, but in the fact that I hesitated for a long time to wipe away the Revo’s bundled OS and install Linux. The OS the Revo came with was Windows 7 Home edition (the Samsung netbook had Windows XP). I haven’t used Windows as my primary OS in years, and have always been proud to say so. For the last four years or so I’ve been using Ubuntu (severely customized), and before that I was using Debian. When I initially started up the Revo, I was impressed by the Windows 7 user interface, the nice colors, the clean lines, and the fact that it picked up all my hardware. It was pretty simple, and I have to admit, somewhat alluring. I’m definitely not the little hacker I was 10 years ago. I don’t have time to spend hours hacking away into the wee morning just on my OS configuration. At least that’s what I keep telling myself :)

But then it dawned on me: that’s how I got where I am today, by embracing curiosity and defying conformity. That’s where life becomes interesting and liberating, and that’s where I feel at home. All these thoughts of nostalgia hit me shortly after I hard-reset the Revo, and Windows 7 came up saying “system wasn’t shut down correctly – use safe mode”, or something to that effect. There was no way for me to tell it to disregard the unclean boot-up; it persisted in asking me to go into safe mode, with no specific explanation. That’s when I wished I had a grub prompt or command line handy.

Diving in with Arch Linux

After coming to my senses, I realized that I definitely didn’t want to go back to using Ubuntu for my primary workstation. For a while I’ve been feeling like Ubuntu has lost much of its luster, especially for someone like me who loves simplicity and minimalism over fancy GUIs and extra features. I wanted a distribution that tried to stay at the cutting edge with its packages, but didn’t screw with the basics of Linux so much that you’re forced to use GUIs to configure your OS. Debian didn’t fit the bill here; it’s great for servers (rock solid), but it’s not that great if you want a cutting-edge workstation without having to compile things from source.

After a little bit of reading and browsing, I came across Arch Linux (which I’ve known of only in passing before), and decided that this was the OS for me. The Arch Linux community is small enough that I could make some significant contributions without much effort. The distribution itself is awesome, very clean, and very minimal. And most importantly, all of the system configurations are done by editing text files!

The Arch Way

Installing Arch was relatively straightforward (IMO). It wasn’t as easy as installing, say, Linux Mint, but it also wasn’t as hard as installing Debian 3.0. The installation dialogs were ncurses-based, but they were descriptive, linear, and logical. When it came time to supply arguments for the initial configuration of the packages I selected, they were all (very well documented) text files which I could edit with vim! I think at that point I knew I was about to embrace a distribution that was very special indeed. This distro was going back to basics, not flooding its users with fancy splash screens and progress meters; it was doing the needful, and it was doing it well.

I still have a lot more to learn about Arch, as I’ve only scratched the surface so far. I’ve been able to set up sound (with alsa) and video using the latest Nvidia drivers. I’ve configured Xmonad as my window manager, and have gotten a handle on how to query and install packages with “pacman”, the Arch package manager. The only real problem I’ve run into is setting up CUPS for my printers. After some research, it seems that the version of CUPS (1.4.3-2) available in the Arch packages is the latest version available from the CUPS source repository, and that I may have to downgrade it (to 1.3.9) in order to get my printers working.

Overall, I like what I see so far with Arch. I expect to post more on my experiences with it as I learn.

Synchronizing Xymon’s ‘bb-hosts’ Configurations

I’ve been using Xymon (formerly known as “Hobbit”) for a long time.  In most situations, I have Xymon running in a redundant configuration, with two or more instances of Xymon working together to monitor a network.

Even though Xymon works very well, a single change to the primary server’s configuration file (the “bb-hosts” file) means that you have to make the same change to all other ‘bb-hosts’ files in all other Xymon instances.

There are some creative ways to eliminate the drudgery of updating all these files any time a change to the primary file is necessary.  One method, for example, would be to have the master file exported via NFS to all the other Xymon server instances, with each instance sym-linking to that primary ‘bb-hosts’ file from its local mount of the NFS export.

I don’t like the NFS export idea, because if the primary server has a problem, and the NFS export is no longer available, all instances of Xymon would break – badly.

Instead, I’ve opted for automatically synchronizing the ‘bb-hosts’ file across all Xymon instances via the use of apache, cron, a sym-link, and a simple bash script.

Here’s  how it works:

  • On the primary Xymon instance, sym-link ‘/home/xymon/server/etc/bb-hosts’ to ‘/var/www/bb-hosts’.
  • On the other instances of Xymon, run a bash script which grabs the primary server’s ‘bb-hosts’ via HTTP, does some simple comparisons, and over-writes the local Xymon ‘bb-hosts’ if changes are detected.
  • Automate this script with cron.
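For the cron step, an entry along these lines would do (the five-minute interval and script path are placeholders, not from my actual setup):

```
# m   h  dom mon dow  command
*/5  *   *   *   *    /usr/local/bin/sync-bb-hosts.sh >/dev/null 2>&1
```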

Perhaps the trickiest part of doing this is the actual script used to grab, compare, and over-write the ‘bb-hosts’ file on the other instances of Xymon.  The script below grabs the primary ‘bb-hosts’ file and does a simple MD5 comparison with md5sum; if it detects a change in the ‘bb-hosts’ file, it sends an e-mail notifying me that the change has occurred, along with details on what changed.

Here’s the script:




#!/bin/bash
# Sync this instance's 'bb-hosts' with the primary Xymon server's copy.
# The URL and mail recipient below are placeholders; adjust for your site.

LOCAL_BB_HOSTS="/home/xymon/server/etc/bb-hosts"
REMOTE_BB_HOSTS="/tmp/bb-hosts.remote"
BB_HOSTS_DIFFS="/tmp/bb-hosts.diffs"

# Grab the primary server's bb-hosts via HTTP
wget -q -O "$REMOTE_BB_HOSTS" "http://primary-xymon/bb-hosts" || exit 1

LOCAL_MD5=`md5sum $LOCAL_BB_HOSTS  | cut -d " " -f 1`
REMOTE_MD5=`md5sum $REMOTE_BB_HOSTS | cut -d " " -f 1`

#echo "$LOCAL_MD5"
#echo "$REMOTE_MD5"

if [ "$LOCAL_MD5" != "$REMOTE_MD5" ]; then
        echo "Generated by $0" > $BB_HOSTS_DIFFS;
        diff -u $LOCAL_BB_HOSTS $REMOTE_BB_HOSTS >> $BB_HOSTS_DIFFS;
        mail -s "Xymon: monitor-02 bb-hosts updated" xymon-admin@example.com < $BB_HOSTS_DIFFS;
        cp $REMOTE_BB_HOSTS $LOCAL_BB_HOSTS
fi

If you need a way to keep your Xymon 'bb-hosts' files in sync, something along the lines of the above script just may be what you're looking for. If you're currently accomplishing the same thing in an interesting way, please post a comment and let me know!

Using DZEN with Xmonad to view Currently Active Network Shares

Currently Xmonad is my window manager of choice, because it’s clean, functional, and removes all the unnecessary crap that most modern desktops usually come with by default.

Although Xmonad is very cool, there are still some things it lacks in terms of functionality. Much of this is made up for by the use of Xmobar, Trayer, and other Xmonad-compatible plugins and applications. I recently came across another one of these applications, and it was an exciting find. The tool is called Dzen.

Dzen is a desktop messaging tool which lets you easily write some useful scripts and have their output become part of your desktop interface. Many examples of how this works are available on the Dzen website; some examples are as follows:

  • CPU Monitoring graphs
  • dmesg log monitoring
  • Notification of system events which are commonly found in syslog
  • E-mail or twitter alerts shown on your desktop as they come in
  • Custom calendar alerts
  • and much more..

Now, this idea is not new. I remember there being a project called “OSD” (on-screen display) which essentially allowed you to do the same thing. However, I think OSD was meant as more of a single-message notification system, rather than working the way Dzen does, with master and slave windows, the ability to implement menus, etc.

In any case, I decided to give Dzen a try, and am happy with the tool that I’ve been able to whip up. For the longest while, I wanted the ability for my xmonad environment to tell me, at a quick glance, what network mounts and removable devices I currently have mounted. I’m sure that this kind of information is easily available on many bloated desktops, including GNOME and KDE, but I was looking for something simple, small and configurable. Didn’t find it, so I ended up writing my own – with the help of Dzen.

Here are a couple of screenshots of how it looks:

Dzen “Active Mounts” widget (mouse out):


Dzen “Active Mounts” widget (mouse over):


I wrote the scripts fairly quickly, so I’m sure they could be written better, but I think they will provide those of you who are interested, a good example of how to implement a regularly updated notification widget with Dzen.

The scripts are written to check for changes in the mount list, and only update Dzen when a change is detected. It is written in two components:

1) A perl script which captures the mount information in the exact format that I want, and
2) a bash script which handles loading Dzen

Here’s the source code (perl script):


#!/usr/bin/perl
# Written by J. Bobby Lopez  - 27 Jan 2010
# Script to -be loaded- by the 'dzen-mounts.bash' script
# This script can also be run by itself, if you want to dump a
# custom plain-text table of your network shares or removable
# devices.
# This script is meant to be utilized by the Dzen notification system
# Information on Dzen can be found at

use strict;
use warnings;

use Data::Dumper;
use Text::Table;

my @types = qw( cifs ntfs davfs sshfs smbfs vfat );

# Collect only the mount entries whose filesystem type we care about
sub getmounts {
    my @valid_mounts; # to hold mounts we want
    my @all_mounts = split (/\n/, `mount`);
    foreach my $mount (@all_mounts) {
        foreach my $type (@types) {
            if ( $mount =~ m/$type/ ) {
                push (@valid_mounts, $mount);
            }
        }
    }
    return @valid_mounts;
}

# Run 'df -h' against each mount point and split its output into columns
sub getsizes {
    my @mounts = getmounts();
    my @list;
    foreach my $mount (@mounts) {
        my @cols = split (/\ /, $mount);
        my @df_out = split (/\n/, `df -h $cols[2]`);
        $df_out[1] .= $df_out[2] if defined($df_out[2]);
        $df_out[1] =~ s/[[:space:]]+/\ /;
        my @df_cols = split (/[[:space:]]+/, $df_out[1]);
        push (@list, [@df_cols]);
    }
    return @list;
}

my $tb = Text::Table->new(
    "Filesystem", "Size", "Used", "Avail", "Use%", "Mounted on"
);
$tb->load(getsizes());
print "Active Mounts\n";
print $tb;

And the bash script:


#!/bin/bash
# Script to load Dzen with output from the perl script above
# (the original script name was lost; 'dzen-mounts.pl' below is assumed)
# Written by J. Bobby Lopez  - 27 Jan 2010
# This script utilizes the Dzen notification system
# Information on Dzen can be found at

function mountlines {
        LINES=`perl dzen-mounts.pl | wc -l`;
        echo "$LINES"
}

function freshmounts {
        OUTPUT=`perl dzen-mounts.pl`;
        echo "$OUTPUT"
}

function rundzen {
        OUTPUT=`freshmounts`;
        MOUNTLINES=`mountlines`;
        echo "$OUTPUT" | dzen2 -p -l "$MOUNTLINES" -u -x 500 -y 0 -w 600 -h 12 -tw 120 -ta l &
        PID=`pgrep -f "dzen2 -p -l $MOUNTLINES -u -x 500 -y 0 -w 600 -h 12 -tw 120 -ta l"`;
        echo "$PID"
}

function killdzen {
        PID="$1"
        if [ ! "$PID" ]; then
            PID=`pgrep -f "dzen2 -p -l $MOUNTLINES -u -x 500 -y 0 -w 600 -h 12 -tw 120 -ta l"`;
        fi

        if [ "$PID" ]; then
            #echo "Killing $PID..";  # DEBUG STATEMENT
            kill "$PID";
        fi
}

function checkchanges {
    while true; do
        NEW=`freshmounts`;
        #echo "$NEW - new";  # DEBUG STATEMENT
        if [ "$OLD" != "$NEW" ]; then
            killdzen "$PID";
            PID=`rundzen`;
            #echo "$PID started";  # DEBUG STATEMENT
            OLD="$NEW";
            #echo "$OLD - old updated"  # DEBUG STATEMENT
        fi
        sleep 1;
    done
}

checkchanges


You can also download the scripts in a tgz archive here. Enjoy!

Nationwide Blackberry Outage – 22/23 December 2009

Well, my Blackberry is officially offline because of a nationwide blackberry outage currently taking place. I use my Blackberry to receive messages from monitoring systems at VMware, so I’m severely pissed. I haven’t received any e-mails for several hours! The only way I knew that something was wrong, is that I have a sanity e-mail sent to myself every 2 hours. When I don’t see that e-mail, something evil is happening.

For a brief moment, I thought I was going to have to call and yell at Rogers, but they’re not at fault this time.

Real-time updates on the situation via Twitter are nice, but a working Blackberry would be nicer.

Xmonad: For Hardcore Desktop User Interface Efficiency

Long-time Linux/Unix hackers know of the plethora of window managers and user interfaces that have been, and currently are, available for Linux and BSD operating systems.  I’ve had great times in the past trying out different window managers such as Enlightenment, Sawfish, Blackbox, IceWM, xfwm, KDE, Gnome, and others.  These days, the two most popular, which ship with the more popular distributions (Fedora, Ubuntu), are KDE and Gnome.

However, I remember back in the day when I was using Enlightenment or Ratpoison, doing strange and cool things (at the time) like applying transparencies to windows and modifying the window borders to be anything but normal and square.

I used to share screenshots of my desktop with others who were also into “desktop eye-candy”, where I’d have floating or docked Window Maker panels, and monitoring applets anchored to the desktop as if they were part of the background wallpaper… and this was around 1999.  It was fun times.

One of the more interesting things I was into at the time was increasing the efficiency and usability of my desktop by trying to reduce the need to reach for my mouse.  I was already very accustomed to this, being a user of vi and the GNU Screen terminal multiplexer, but the window managers never seemed to attain the same level of “hacker cool”.  That is, of course, until I came across Ratpoison.  Ratpoison was exactly what the name implied: a window manager that killed your dependency on the mouse (or rat).  It was awesome, but it wasn’t scalable, and didn’t evolve much to keep up with modern technological advancements and requirements such as multi-monitor support.

I thought those days were long lost, until I recently had the urge to streamline my desktop environment.  I now have a 28″ monitor, and was certain there was a better way to interact with the desktop than the standard Ubuntu/Gnome environment.  So I went looking.  I started, of course, with things I was already familiar with: I looked up Ratpoison to see if there had been any major improvements over the years.

I took a look at Ratpoison again, but it was showing its age.  I looked at its successor, Stumpwm, but I didn’t feel the love.  Then I tried out Xmonad, created by Spencer Janssen, Don Stewart, and Jason Creighton, and written in Haskell.  I immediately fell in love.

If you haven’t used GNU Screen, Gnome Multi-Terminal, Ratpoison, or any minimalist window manager before, then it will be hard to explain why Xmonad is worth your time.  Instead, visit the Xmonad website here:

Here are some suggestions on how to get Xmonad working on Ubuntu 8.10:

Install Xmonad:

apt-get install xmonad

We’re going to create another X window session, so that we don’t mess with your existing one. That way, if you don’t like Xmonad, you can go back to using your existing window manager without worrying about breaking your configuration.

Set up your second X window session. Press “ctrl + alt + f2”; this will take you to the command-line terminal where you will start your second X session. Start the session using the following command:

xinit -- :1 vt12

This will start up another X session which will sit at virtual terminal 12 – meaning that you have to press ‘ctrl-alt-F12′ to get to it.

Once at your new X session, you should see nothing more than a plain old xterm window. Type “xmonad”, and the terminal window should now be maximized. Xmonad is now running.

Type ‘man xmonad’ to view the help documentation on how to use it.  It’s pretty straight forward, and a joy to use!

Recession, War, Politics, Poverty…. Software Development?

The way things are these days, you’d think that I, like many other people in the world, would be thinking about money, the recession, the potential for war between countries who have been flirting with the bomb, my mother and the sale of her house, poverty in Africa, and the general suckage (is that a word?) in the world.

But no, I’m not thinking about those things.  What’s on my mind most of the time is software development and programming.  I’m constantly thinking about what I’m good at, what I suck at, and what I need to do to get better.  Is that selfish?  Let me answer that – yes, it is very selfish, but I don’t necessarily believe that selfishness is always a bad thing (part of me can relate to Ayn Rand’s philosophy of rational self-interest).

The question though is not “is this selfish?” Rather, the question I’m putting out there is “is this normal?” There are enough things going on right now in my life, dealing with situations and people that I find simply unreasonable, that I’m finding it hard to identify what is “reasonable” any more, because what I see as unreasonable seems to be the norm for the majority.

So is it wrong to think about my career and personal development during times of stress? I feel it to be instinctive to focus on your strengths during times of uncertainty, but what do others out there think? Do you feel that in times of stress, you should cut away from what you’re used to and try something new, or go on vacation? Or do you believe that it’s the perfect time to share with others, give back to your community or family and try to increase your karma (if you believe in such things)? These courses of action are not mutually exclusive, but it helps to identify what needs focus if they’re not jumbled together.

If this post seems a little incoherent, it’s 1 a.m., and my eyelids have been drooping constantly since I started typing.
Have a good night all :)

Converting Freemind Mind-maps Directly to Perl Hash Trees


I use Freemind quite a bit for brainstorming and as an outliner.  One of its better uses for me is hammering out an idea for a Perl hash tree very quickly.  The problem is that once I have the hash tree exactly the way I want it in Freemind, I have to manually re-create it in Perl source, with all the required formatting.

This is no longer the case, as I’ve written a quick and dirty “freemind2perl” script (below) which takes a Freemind mind-map file and converts it into a Perl hash tree automagically.  I’m not sure if it will work with all versions of Freemind, but mind-map files (.mm files) are XML-based, and the format really hasn’t changed across versions.

Just save the script below as ‘’ and run it with ‘perl’.  It requires the XML::Simple Perl module to be installed.

Here’s the script (click here to download):

#!/usr/bin/perl -w

use strict;

use XML::Simple;
use Data::Dumper;

my $xml = new XML::Simple;
my $mm_file = shift;
my $data = $xml->XMLin("$mm_file");
my $clean;

sub prep_clean {
    my $data = shift;
    my $clean;

    foreach my $key ( keys %{ $data } ) {
        if ( $key eq "TEXT" ) {
            $clean->{$data->{$key}} = 1;
        }

        if ( $key eq "node" ) {
            if ( ref( $data->{$key} ) eq "HASH" ) {
                $clean->{$data->{'TEXT'}} = prep_clean(\%{$data->{$key}});
            }

            if ( ref( $data->{$key} ) eq "ARRAY" ) {
                my $sub_hashes = {};
                for ( my $i = 0; $i <= $#{$data->{$key}}; $i++) {
                    foreach my $sub_hash ( \%{ $data->{$key}[$i] } ) {
                        my $subout = prep_clean( $sub_hash );
                        $sub_hashes = { %$sub_hashes, %$subout };
                    }
                }
                $clean->{$data->{'TEXT'}} = $sub_hashes;
            }
        }
    }
    return $clean;
}

$clean = prep_clean( \%{ $data->{'node'} } );

print Dumper(\$clean);
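If you just want to see the shape of the transformation without installing XML::Simple, here’s a rough sketch of the same idea in Python.  The sample mind-map below is a tiny made-up example of mine, not real Freemind output – but .mm files are just nested <node> elements with TEXT attributes, which is all this relies on:

```python
# Sketch of what freemind2perl does: walk a Freemind .mm file's nested
# <node> elements and build a nested dict keyed by each node's TEXT
# attribute, with leaf nodes mapping to 1 (mirroring the "=> 1" leaves
# in the Perl output). The sample XML below is made up for illustration.
import xml.etree.ElementTree as ET

SAMPLE_MM = """
<map version="0.9.0">
  <node TEXT="project">
    <node TEXT="backend">
      <node TEXT="api"/>
      <node TEXT="db"/>
    </node>
    <node TEXT="frontend"/>
  </node>
</map>
"""

def node_to_tree(node):
    children = node.findall("node")
    if not children:
        return 1  # leaf node
    return {child.get("TEXT"): node_to_tree(child) for child in children}

root = ET.fromstring(SAMPLE_MM).find("node")
tree = {root.get("TEXT"): node_to_tree(root)}
print(tree)
# → {'project': {'backend': {'api': 1, 'db': 1}, 'frontend': 1}}
```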


VMware vSphere 4 Announced!

Working at VMware, I (virtually) had a front-row seat to the VMware vSphere simulcast on April 21.  It was an exciting event – everyone was anxious to hear what our industry partners (Cisco, Intel, Dell, etc) had to say about the new product.  The overall excitement and energy shown by these companies was impressive.

I think what I liked most was Steve Herrod’s “Blackberry Demo”, which showed how resilient the platform is even to extreme hardware failure.  I don’t think many people truly understand what this technology means for disaster recovery and disaster avoidance – it essentially eliminates the risk.  I know it’s a big claim, but if your company does its due diligence, has an appropriate and active backup strategy for all critical systems, and has proper 2×2 redundancy in place to make sure there are no single points of failure, you can essentially have 99.999% uptime at a fraction of the cost of doing all of this on physical systems.  Small businesses can now experience the stability of software and services which were previously enjoyed only by large corporations which could afford them.  And these same small businesses now have a new arsenal of tools which can help them compete against their larger, more established counterparts.  It is an exciting time in the industry.
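To put some rough numbers behind the redundancy claim, here’s a back-of-the-envelope sketch (the availability figures are illustrative assumptions of mine, not VMware specifications): with independent failures, redundancy multiplies failure probabilities rather than adding availabilities.

```python
# Illustrative availability math (assumed numbers, not vendor specs):
# a single host that is 99.9% available fails 0.1% of the time; a
# redundant pair with automatic failover is down only when BOTH fail,
# so the combined failure probability is 0.001 ** 2.
single = 0.999                     # assumed availability of one host
redundant = 1 - (1 - single) ** 2  # pair with automatic failover
print(f"{redundant:.6%}")
# → 99.999900%
```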

If anyone has an interest in virtualization, but doesn’t know where to start, the best thing to do would be to download a copy of VMware Server, or VMware ESXi. Both are free to download and use, and include the latest features and capabilities built into the enterprise (ESX/vSphere) hypervisors.

Double Shot of Tequila

I woke up early this morning with a mission on my mind: to finally organize my server rack the way I’ve always been meaning to, but for some reason (*cough*laziness*cough*), never got around to.  I had recently bought some new hardware to re-build a system which I thought was dead, but which turned out not to be.  I didn’t really feel like returning the hardware, because this was the chance to build an up-to-date server to migrate all my VMs over to, which is something else I’ve been meaning to do for quite some time.

In any case, I finally got around to re-organizing my server rack today, and I’m proud of how it turned out.  With that accomplishment in hand, I decided to install our living room air conditioner (it’s starting to get a tad warm, especially for computer systems).  I headed out to Home Depot and purchased some wire mesh, or “screen” as one of their reps called it.  Last year we found that we had a lot of mosquitoes and small flies coming in through the air conditioner.  Considering it was a fairly inexpensive one, I figured that I got what I paid for.  I decided to turn my $100 air conditioner into a $300 air conditioner by adding on some custom filters in order to block any debris which it may collect through its many open vents.  The roll of mesh cost around $15, and was easy enough to cut and shape.  The end result turned out better than I had expected, and so this year I expect we will have a lot fewer bugs getting in.

And so the air conditioner was installed – this too had been completed.  I was on a roll and feeling good.  I decided then to try my hand at building my new server from scratch.

I had an old rack-mount server case ((solid steel, heavy beast)) which I gutted, and started building the new server in there.  The new components included a new motherboard – the Asus M3N78-VM, an AMD Athlon 1640 CPU, and 4GB of OCZ Dual Channel SLI Ready RAM. The Micro-ATX form factor of the motherboard made it super easy to fit into the monster rack-mount case. With a few simple connections, I was ready to test boot-up, and things should have been smooth from there. It wasn’t.

The system wouldn’t power on – at all.  My first mistake was that I plugged the front panel connectors into the wrong pins on the motherboard.  No sweat, figured that out, and moved forward.  Switched it on again, saw the motherboard’s “SB Power” LED come on (which was a good sign), fans started spinning, thought I was getting close, but nothing.  I couldn’t get it to POST anything – no errors, warnings, or beeps at all.  I decided to rip out all the peripherals and go bare-bones in order to isolate the problem.  Still nothing!  Removed the RAM, nothing.  Removed the CPU, nothing.  So at this point, aside from being frustrated, I’ve been able to narrow it down to one of two things: it’s either the motherboard or the power supply.  The power supply should be fine, because it worked with the old hardware that I had in the case originally.  However, there is a chance that the power supply isn’t compatible with this motherboard in some way.

If it’s not the power supply, then I’ve received a motherboard that was DOA. I’m hoping this is the case! I’d hate to take this thing back to Tiger Direct tomorrow, have them test it out, and find out that it’s just fine. That would be both embarrassing and frustrating.

Anyway, after all these triumphs and frustrations, I decided to finish off the night with a double shot of Tequila, and damn did it go down smooth :)

If this blog post seems at all incoherent, it probably has to do with the fact that it’s late, and I’m tired.  Oh, and maybe just a little to do with that double shot of Tequila.

I’m not crazy

At work these days, because I’m the only developer on my “team”, I’ve been in the situation where I’m extending (which includes extensive, and oftentimes ridiculous, rounds of debugging) other people’s code.  Many of the projects I’ve inherited weren’t written to be maintained by anyone other than the original developers.  I long ago came to accept that most programmers are not passionate about simplicity and elegance, and therefore write endless reams of code that over-complicate simple problems.

Now at VMware, I do work around some severely intelligent people, but unfortunately they are not developers, so I don’t work with them.  Because of this I oftentimes rant to them about the ridiculousness of a given situation; and they’re smart, so they understand the problem technically, but because they aren’t working with me, it’s hard for them to empathize with my frustrations.

I love reading Paul Graham’s essays every once in a while, because he seems to be able to understand and articulate my frustrations so well.  One in particular that I’ve been re-reading is Great Hackers, which always makes me breathe a sigh of relief because he reminds me that I’m not crazy.

If you are a manager and have to manage a group of experienced programmers, I urge you to read that essay.  You just may prevent one of your developers from committing heinous acts of insanity.

WebPIM: A Custom, Web-based, Personal Information Manager

I’ve always wanted a web-based application to help me manage all my stuff.  “WebPIM” (as I’ve nicknamed it for now) is currently one of my main personal projects.  I started it back in 2003 as a simple web-based file manager, and have been slowly hacking away at it in my spare time ever since.  WebPIM can act as a central reference point for all personal or project information.  The way I’ve implemented my custom PIM is purely based on the way I work, so it may not be to everyone’s liking.  However, I think it could really help individuals who need a way to organize tasks, projects, documents, and general files in a free-form, yet hierarchical and accessible way.  Much of the thinking behind the way WebPIM is being developed relates to GTD ((Getting Things Done – David Allen)), and how to get “stuff” off your mind and into a system.

Here’s the general idea – you have a lot of “stuff” – stuff that’s just sitting around on scraps of paper, on your hard drive, in your e-mail, and every other place you can’t seem to remember. This may be un-important stuff, or it may be severely important stuff – but none of it is organized into any kind of easily reference-able and “trusted system” ((GTD terminology)).

You have several options, the first of which is to do nothing.  Unfortunately, ignoring the problem and hoping it will go away won’t solve anything.  Let’s assume you want to change your situation, and we’ll use my experiences as a baseline for discussion.

I have tried many personal information managers over the years, and all of them have been incomplete in one way or another. Also, with the new wave of hosted applications like Google’s GMail, Calendar, and Google Docs, I am becoming more and more uncomfortable storing all my stuff on a remote, corporate server over which I have no control ((This has become more and more of a concern for me, having accounts on Google, Facebook, and others. Maybe I’m just paranoid.)).

My solution to this dilemma has been to write my own PIM, and so far, I’ve been happy with the results.

The way WebPIM currently works is by operating as a front-end to a Linux-based filesystem.  From WebPIM, I can create directories, create text files, upload files from my local hard drive, and move files around from one directory structure to another.  This is the simple stuff that I think any web-based file manager should be capable of.  Beyond this, however, WebPIM provides the following features:

  • Move multiple files from one directory to another (batch move)
  • Text-dialog editing of all files (you can edit HTML and XML files in the interface)
  • Full path display when traversing directories, which allows you to go directly to any directory within your current absolute path via a hyper-link
  • Web-download functionality allowing you to download a copy of your favourite web page or web-accessible file into your current directory.
  • Recursive web-download, so that you can download an entire website for later reference (implemented using HTTrack in the back-end).
  • Project short-cuts, so that you can create short-cut groups to access multiple directory structures on the same interface. This allows you to access general reference information, along with specific project information all within the same interface, and without disrupting your overall PIM hierarchy.
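As a sketch of the underlying idea (this is not WebPIM’s actual code – the root path and function names here are hypothetical, and written in Python rather than WebPIM’s own source): a web front-end to the filesystem boils down to resolving every browser-supplied path safely under a fixed root, then calling ordinary filesystem operations.

```python
# Minimal sketch of a filesystem-backed web file manager's core:
# every user-supplied path is resolved under a fixed root so a
# request can never escape the PIM hierarchy. Names are hypothetical.
import os

PIM_ROOT = "/srv/webpim/data"  # hypothetical storage root

def resolve(user_path):
    """Map a path from the browser to a real path under PIM_ROOT."""
    full = os.path.normpath(os.path.join(PIM_ROOT, user_path.lstrip("/")))
    # Reject anything that ".."-escapes the root (prefix check uses the
    # trailing separator so a sibling like /srv/webpim/data2 won't pass).
    if full != PIM_ROOT and not full.startswith(PIM_ROOT + os.sep):
        raise ValueError("path escapes the PIM root")
    return full

def list_dir(user_path):
    """Directory listing for the web interface."""
    return sorted(os.listdir(resolve(user_path)))

def save_text(user_path, text):
    """Text-dialog editing: write the edited file back to disk."""
    with open(resolve(user_path), "w", encoding="utf-8") as f:
        f.write(text)
```

Batch moves, web-downloads, and project short-cuts all layer on top of the same resolve-then-operate pattern.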

I think the idea can be better explained with a screenshot of the main interface:

WebPIM Interface
– WebPIM Interface (Click on the image for a larger view) –


Obviously there is still a lot of polish required before this becomes useful to the general public, but I really do believe there is a market for it.  If anyone is interested in trying this out, leave a comment and let me know.  I can probably set up a demo, or provide the source code as-is so that you can give it a shot on your own system.

Insights on Technology, Science, Philosophy, and Society. Exploring Patterns, Logic, Reason, Empathy, and The Golden Rule. “Seek first to understand, then to be understood.” – Stephen R. Covey