A tale of two gamers

Gamer #1 — The youngster. The youngster grew up playing games… the NES isn’t so much nostalgic as archaic. They don’t have much money — they’re working their way through high school or college, or are fresh out of it. But what they lack in cash flow they make up for in time. These are the purists. If this person spends $60 on a game they want it to be CHALLENGING. They want as much time and value out of a game as they can possibly get. These are the WoW players who think spending 30 hours a week in-game is a good investment of their time.

Gamer #2 — The spouse. The spouse used to be a youngster. But they’ve since acquired this marvelous, and strange, new thing — a life. Now they have a career — not a job — a spouse, and possibly a kid or two. When this person forks out $60 for a game they want a good respite from real life without it consuming their real life. It’s challenging for this person to put 5 hours a week into a game, and 30 is completely unrealistic (even if they’d love to be able to). This gamer wants to be completely engrossed and entertained for a while, but also needs to be able to put the game down and pick it up again in 2 weeks without losing much.

I get reminded of this from time to time in forum discussions and the like — especially when these two factions argue over things like gold farmers, cheats, and glitches. I happen to fall into the Gamer #2 category, and I can tell you that when I get 3 hours to sit down and play a game I do NOT want to spend that hard-earned time grinding. __I__JUST__DON’T__. The real PITA is that I would LOVE to play games like WoW.

I hope that game developers start taking my demographic seriously — unlike kids (who, in their defense, can’t), I’m willing to pay for my games. Give me a WoW server with the exp and gold rates tweaked so that I can get past the crappy ‘kill 50 fluffy bunnies’ grinding quests and get to some fun gaming before I turn 60.

The youngsters will, of course, argue things like “taking away from the game,” “why even bother playing,” and things like “it’s not that hard and doesnt take that long, you just suck, n00b.”

To which I say: bite me. Until game makers start understanding that some gamers want a time sink and some gamers can’t afford one, I’ll be the low-level guy who just paid for 60 hours of some Chinese gold farmer’s time… because one of these days I’d like to get to do something interesting. I’ll be the guy who uses the game glitch to avoid spending 15 hours forging swords or chopping wood.

I would like to not have to resort to these measures… and if the game manufacturers would just throw us a bone, I bet we wouldn’t.

A counting bloom filter

This is just me screwing around, really… This implements a counting bloom filter in native PHP. I read about bloom filters this morning, and wrote this tonight in between playing Command and Conquer 3 campaign missions. The hashing function is configurable (and easily extensible), and rehashing the key multiple times to reduce the chance of collisions is also configurable (though I’m not entirely sure how much that’s needed). Frankly, this all might not even work properly… it seems to… we’ll see…

<?php

class bloom {

	/**
	 *
	 * A counting bloom filter
	 *
	 *	$f = new bloom(10, 'md5');
	 *
	 *	$f->add('foo');
	 *	$f->add('bar');
	 *	$f->add('foo');
	 *
	 *	$f->exists('foo');        // true
	 *	$f->exists('bar');        // true
	 *	$f->exists('baz');        // false
	 *	$f->exists('foo', false); // 2
	 *	$f->exists('bar', false); // 1
	 *	$f->exists('baz', false); // false
	 *
	 **/

	var $number_of_hashing_functions;
	var $map = array();
	var $hashlen = 32;
	var $hashfunc = 'md5';

	function __construct( $functions=1, $hashtype='md5' ) {
		$this->bloom($functions, $hashtype);
	}

	function bloom( $functions=1, $hashtype='md5' ) {
		$this->number_of_hashing_functions = (int)$functions;
		if ( !$this->number_of_hashing_functions || $this->number_of_hashing_functions < 1 )
			$this->number_of_hashing_functions = 1;
		$this->set_hashing_function($hashtype);
		$this->initialize_bitmat($this->number_of_hashing_functions);
	}

	function set_hashing_function($method) {
		switch ( $method ) {
			case 'sha1':
				$this->hashlen = 40;
				$this->hashfunc = 'sha1';
				break;
			default: // md5 is already configured (32 hex chars)
				break;
		}
	}

	function hash($key, $n=0) {
		return call_user_func( $this->hashfunc, $n.$key );
	}

	function add($key) {
		for ( $i=0; $i< $this->number_of_hashing_functions; $i++ ) {
			$k = $this->hash($key, $i);
			for ( $n=0; $n< $this->hashlen; $n++ ) {
				$this->map[$i][$n][$k[$n]]++;
			}
		}
		return true;
	}

	function remove($key) {
		for ( $i=0; $i< $this->number_of_hashing_functions; $i++ ) {
			$k = $this->hash($key, $i);
			for ( $n=0; $n< $this->hashlen; $n++ ) {
				$this->map[$i][$n][$k[$n]]--;
			}
		}
		return true;
	}

	function exists($key, $bool=true) {
		$max = 0;
		for ( $i=0; $i< $this->number_of_hashing_functions; $i++ ) {
			$k = $this->hash($key, $i);
			for ( $n=0; $n< $this->hashlen; $n++ ) {
				if ( !$v = $this->map[$i][$n][$k[$n]] )
					return false;
				else
					$max = max($v, $max);
			}
		}
		if ( $bool )
			return true;
		else
			return $max;
	}

	function initialize_bitmat($n) {
		$empty_bitmap_line = array(
			'0' => 0, '1' => 0, '2' => 0, '3' => 0, '4' => 0, '5' => 0, '6' => 0, '7' => 0,
			'8' => 0, '9' => 0, 'a' => 0, 'b' => 0, 'c' => 0, 'd' => 0, 'e' => 0, 'f' => 0  );
		$this->map=array();
		for ( $i=0; $i< $n; $i++ ) {
			$this->map[$i]=array();
			for ( $l=0; $l< $this->hashlen; $l++ ) {
				$this->map[$i][$l] = $empty_bitmap_line;
			}
		}
	}

}

?>

Using wait, $!, and () for threading in bash

This is a simplistic use of the pattern that I wrote about in my last post to wait on multiple commands in bash. In essence I have a script which runs a command (like uptime or restarting a daemon) on a whole bunch of servers (think pssh). Anyways… this is how I modified the script to run the command on multiple hosts in parallel. This is a bit simplistic as it runs, say, 10 parallel ssh commands and then waits for all 10 to complete. I’m confident that someone could easily adapt this to run at a constant concurrency level of $threads… but I didn’t need that just then, so I didn’t go that far. As a side note, this is possibly the first time I’ve ever *needed* an array in a bash script… hah…

# $1 is the command to run on the remote hosts
# $2 is used for something not important for this script
# $3 is the (optional) number of concurrent connections to use

if [ -n "$3" ]
then
    threads=$3
else
    threads=1
fi

cthreads=0;
stack=()
for s in $servers
  do
    if [ $cthreads -eq $threads ]; then
        for job in "${stack[@]}"; do
              wait $job
        done
        stack=()
        cthreads=0
    fi
    (
        for i in $(ssh "root@$s" "$1")
            do
                echo -e "$s:\t$i"
        done
    )& stack[$cthreads]=$!
    let cthreads=$cthreads+1
done
for job in "${stack[@]}"; do
    wait $job
done
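As mentioned above, the batch approach could be adapted to hold a constant concurrency level of $threads. Here’s a rough sketch of one way to do that using bash’s `wait -n` builtin (bash 4.3+); `do_work` is a hypothetical stand-in for the ssh call, and the temp-file bookkeeping is just there so the sketch is self-contained:

```shell
#!/usr/bin/env bash
# Sketch: keep at most $threads jobs in flight using wait -n (bash 4.3+).
# "do_work" is a hypothetical stand-in for the ssh command in the script above.
threads=3
count=0
tmp=$(mktemp)

do_work() { sleep 0.1; echo "done: $1" >> "$tmp"; }

for s in a b c d e f g h; do
    if [ "$count" -ge "$threads" ]; then
        wait -n                  # block until ANY one background job exits
        count=$((count - 1))
    fi
    do_work "$s" &
    count=$((count + 1))
done
wait                             # drain whatever is still running
completed=$(wc -l < "$tmp")
echo "completed: $completed"
rm -f "$tmp"
```

Unlike the fixed-batch version, a new job starts as soon as any slot frees up, so one slow host doesn’t stall the other nine.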

bash – collecting the return value of backgrounded processes

You know that you can run something in the background in a bash script with ( command )&, but a coworker recently wanted to run multiple commands, wait for all of them to complete, then collect their return values and decide what to do based on them… this proved much trickier. Luckily there is an answer:

#!/bin/bash

(sleep 3; exit 1)& p1=$!
(sleep 2; exit 2)& p2=$!
(sleep 1; exit 3)& p3=$!

wait "$p1"; r1=$?
wait "$p2"; r2=$?
wait "$p3"; r3=$?

echo "$p1:$r1 $p2:$r2 $p3:$r3"
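The same pattern generalizes to any number of jobs by collecting the PIDs in an array. A sketch (the variable names here are mine, not from the original) that tallies how many backgrounded jobs failed:

```shell
#!/usr/bin/env bash
# Sketch: wait on N backgrounded jobs and count non-zero exit statuses.
pids=()
(sleep 0.2; exit 0)& pids+=($!)
(sleep 0.1; exit 1)& pids+=($!)
(sleep 0.1; exit 0)& pids+=($!)

failed=0
for p in "${pids[@]}"; do
    wait "$p" || failed=$((failed + 1))   # wait returns that job's exit status
done
echo "failed jobs: $failed"
```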

a dumbed-down version of wpdb for sqlite

I’ve been working, gradually, on a project using an sqlite3 database (for its convenience) and found myself missing the clean elegance of wpdb… so I implemented it. It was actually really easy to do, and I figured I would throw it up here for anyone else wishing to use it. The functionality that I built this around is obtainable here: http://php-sqlite3.sourceforge.net/pmwiki/pmwiki.php (don’t freak… it’s in apt…)

With this I can focus on the SQL, which is different enough, and not fumble over function names and such… $db = new sqlite_wpdb($dbfile, 3); var_dump($db->get_results("SELECT * FROM `mytable` LIMIT 5"));

the code is below… and hopefully not too mangled…

<?php

class sqlite_wpdb {

        var $version = null;
        var $db = null;
        var $result = null;
        var $error = null;
        var $fquery = null;
        var $ferror = null;
        var $farray = null;

        function sqwpdb($file, $version=3) { 
                return $this->__construct($file, $version); 
        }

        function __construct($file, $version=3) {
                $function = "sqlite{$version}_open";
                if ( !function_exists($function) )
                        return false;
                if ( !file_exists($file) )
                        return false;
                if ( !$this->db = @$function($file) )
                        return false;
                $this->version = $version;
                $this->fquery = "sqlite{$this->version}_query";
                $this->ferror = "sqlite{$this->version}_error";
                $this->farray = "sqlite{$this->version}_fetch_array";
                return $this;
        }

        function escape($string) {
                return str_replace("'", "''", $string);
        }

        function query($query) {
                if ( $this->result = call_user_func($this->fquery, $this->db, $query) )
                        return $this->result;
                $this->error = call_user_func($this->ferror, $this->db);
                return false;
        }

        function array_to_object($array) {
                if ( ! is_array($array) )
                        return $array;

                $object = new stdClass();
                foreach ( $array as $idx => $val ) {
                        $object->$idx = $val;
                }
                return $object;
        }

        function get_results($query) {
                if ( !$this->query($query) )
                        return false;
                $rval = array();
                while ( $row = $this->array_to_object(call_user_func($this->farray, $this->result)) ) {
                        $rval[] = $row;
                }
                return $rval;
        }

        function get_row($query) {
                if ( ! $results = $this->get_results($query) )
                        return false;
                return array_shift($results);
        }

        function get_var($query) {
                return $this->get_val($query);
        }

        function get_val($query) {
                if ( !$row = $this->get_row($query) )
                        return false;
                $row = get_object_vars($row);
                if ( !count($row) )
                        return false;
                return array_shift($row);
        }

        function get_col($query) {
                if ( !$results = $this->get_results($query) )
                        return false;
                $column = array();
                foreach ( $results as $row ) {
                        $row = get_object_vars($row);
                        if ( !count($row) )
                                continue;
                        $column[] = array_shift($row);
                }
                return $column;
        }

}

?>

Postfix, DKIMproxy, Spamc

If you’re running any moderately busy mail server you’re probably using SpamAssassin’s spamc/spamd to check for spam, because it’s tons more efficient than piping the mail through the spamassassin CLI. Assuming that you do, and that you plan on adding DKIMproxy to the mix to verify and sign emails, you need to put things in the right order. To save you some headache, here’s what I did:

  1. smtp|smtps => -o smtpd_proxy_filter=127.0.0.1:10035 # outgoing dkim verify port
  2. 127.0.0.1:10036 => -o content_filter=spamassassin
  3. spamassassin =>  pipe user=nobody argv=/usr/bin/spamc -f -e /usr/sbin/sendmail -oi -f ${sender} ${recipient} # this delivers to the “pickup” service
  4. pickup => -o content_filter=dksign:127.0.0.1:10037 # outgoing dkim signing port
  5. 127.0.0.1:10038 => -o content_filter= # the buck stops here

If you aren’t careful with these (which I wasn’t) you’ll end up causing an infinite loop between your filters (which I did). Thus concludes our public service announcement.
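For reference, the flow above translates into master.cf entries roughly along these lines. This is a sketch, not a drop-in config: the service fields (chroot, wakeup, maxproc, etc.) depend on your installation, and the ports are just the ones from the list above.

```
# master.cf sketch -- adapt service fields to your own Postfix setup
smtp      inet  n  -  n  -  -  smtpd
  -o smtpd_proxy_filter=127.0.0.1:10035     # incoming mail -> dkimproxy (verify)
127.0.0.1:10036 inet n  -  n  -  -  smtpd
  -o content_filter=spamassassin            # dkimproxy re-injects here -> spamc
spamassassin unix -  n  n  -  -  pipe
  user=nobody argv=/usr/bin/spamc -f -e /usr/sbin/sendmail -oi -f ${sender} ${recipient}
pickup    fifo  n  -  n  60  1  pickup
  -o content_filter=dksign:127.0.0.1:10037  # picked-up mail -> dkimproxy (sign)
127.0.0.1:10038 inet n  -  n  -  -  smtpd
  -o content_filter=                        # signed mail re-enters; the buck stops here
```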

Piracy and Copy Protection

There’s been a lot of hooting and hollering in the media about copy protection and piracy. DRM seems to be a rallying cry for the younger generation, which makes sense. I figured I’d throw my own $.02 out there to the wind (for all the good it’s worth). But I’d first like to say that (as is true with all generalizations) this doesn’t apply to all people all the time.

You really have 3 classes of people who might buy a game. You have the people with enough disposable income not to care; they just buy their game and use it as directed. (Please note I’m not discussing the “as directed” part, simply the who’s-yelling part. I fully agree that “as directed” should come outside the box — or be returnable — and should also be reduced to lay speak, since most of us don’t speak asshole lawyer.) Those people really aren’t the ones yelling about DRM.

You have the gaming middle class who, more or less, want to go straight. They buy what they can when they can, but at $60 a pop they can’t afford *everything* they want. To varying degrees they have a couple of options. They can pick what they *think* they’ll like best and abstain from everything they can’t afford, or they can “test-run” the game and see whether they like it first. As someone who has bought a car before, I know which one of those options makes the most sense to me… This stage is definitely not a plateau, though; it slopes down from the previous upper class to the next… lower class.

Finally there’s the poor people. They have a computer… maybe handed down… maybe cannibalized… Maybe they’re preteen to mid/late twenties. Maybe they’ve got a job, maybe they don’t. Maybe they’re in school, maybe not. But the common denominator here is that you aren’t going to see a red cent from these people no matter what you do. One does not squeeze blood from a stone. These people have WAY more time than they have money, so spending 2 days downloading, then 2 days searching for an install key, then a crack, is NOT a problem for them… it’s cheaper than buying the thing. What I think game companies are missing is that a good number of these people WANT to be in the middle and upper gamer class, and when they get there they *will* spend money, because they’ll have more of that than time.

What I think *IS* interesting is the time-to-money ratio. You develop a game that might take 10–30 hours to beat (in the case of non-MMORPGs) or last forever (in the case of MMORPGs). Essentially you’re making something that appeals to the people who have more time than money… Much like payday loans, your target audience is pretty dangerous in general, fiscally. And as people progress to the upper gamer class, they have way more money than time. And you wonder why people are willing to fork over $30 for a bunch of WoW gold so they can kill something besides “soft fluffy bunnies, which pose no threat to humanity” without having to neglect their wife and kids for 400 straight hours.

Crucify the young for being young, crucify the old for being old. It makes you wonder… in the mind’s eye of the video gaming industry, does their ideal customer base actually exist? I bet it lies dangerously, and precariously, in the same kind of balance the tobacco industry relies on: hook ’em while they’re young and bleed ’em dry as they age. Except the tobacco company does it better; they don’t actively vilify their young “patrons” — they simply wait.

Writing your own shell in PHP

I’ve always wanted to write my own simple shell in PHP. Call me a glutton for punishment, but it seems like something that a lot of people could use… If your web app had a command line interface for various things — like looking up stats, or users, or suspending naughty accounts, or whatever — wouldn’t that be cool and useful? Talk about geek porn. Anyways, this morning I got around to tinkering with the idea, and here is what I came up with… It’s rough, and empty, but it’s REALLY easy to extend and plug into any PHP application.

apokalyptik:~/phpshell$ ./shell.php

/home/apokalyptik/phpshell > hello

hi there

/home/apokalyptik/phpshell > hello world

hi there world

/home/apokalyptik/phpshell > cd ..

/home/apokalyptik/ > cd phpshell

/home/apokalyptik/phpshell > ls

shell.php

/home/apokalyptik/phpshell > exit

apokalyptik:~/phpshell$

See the source here: shell.phps

Internally Caching Longer Than Externally Caching

We use varnish for a lot of our file caching needs, and recently we figured out how to do something rather important through a combination of technologies. Imagine you have backend servers generating dynamic content based on user input, and that content fits the following categories:

  • is expensive to generate dynamically, and should be served from cache
  • many requests come in for the same objects, bandwidth should be conserved
  • doesn’t change very often
  • once changed needs to take effect quickly

Now with varnish we’ve been using the Expires header for a long time with great success, but for this we were having no luck. If we set the Expires header to 3 weeks, then clients also cache the content for 3 weeks (violating requirement #4). We can kill the Expires header in varnish at vcl_deliver, but then clients don’t cache at all (violating #2). We can add Cache-Control, overwrite the Age (otherwise the reported Age: will be greater than max-age), and kill the Expires header in the same place, but this isn’t pretty, and seems like a cheap hack. Ideally we could rewrite the Expires header in varnish, but that doesn’t seem doable.

So what we ended up doing was header rewriting at the load balancer (nginx). Inside our location block we added the following:

proxy_hide_header Age;
proxy_hide_header Expires;
proxy_hide_header Cache-Control;
add_header Source-Age $upstream_http_age;
expires  300s;

Now nginx sets proper Cache-Control: and Expires: headers for us, disregarding what varnish serves out. Web clients don’t check back for 5 minutes (reusing the old object), and varnish can cache until judgment day, because we get wildcard invalidation.
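The wildcard-invalidation half happens on the varnish side with bans. A sketch of what that might look like (varnish 3.x-era VCL syntax; older versions use purge() instead of ban(), and the purgers ACL here is my own example):

```
# Sketch: wildcard invalidation via bans (varnish 3.x VCL; adjust per version)
acl purgers { "127.0.0.1"; }

sub vcl_recv {
    if (req.request == "PURGE") {
        if (!client.ip ~ purgers) {
            error 405 "Not allowed";
        }
        # req.url may be a regex here, e.g. /images/.* bans a whole subtree
        ban("req.url ~ " + req.url);
        error 200 "Banned";
    }
}
```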

Isn’t technology fun?!