All things being equal, the simplest solution tends to be the best one.

Occam’s razor strikes again!

Tonight we ran into an interesting problem. A web service – with a very simple time-elapsed check – started reporting negative elapsed times… Racking our brains and poring over the code produced nothing. It was as if the clock were jumping around randomly! No! On a whim Barry checked it, and the clock was, indeed, jumping around…

# while [ 1 ]; do date; sleep 1; done
Wed May 30 04:37:52 UTC 2007
Wed May 30 04:37:53 UTC 2007
Wed May 30 04:37:54 UTC 2007
Wed May 30 04:37:55 UTC 2007
Wed May 30 04:37:56 UTC 2007
Wed May 30 04:37:57 UTC 2007
Wed May 30 04:37:58 UTC 2007
Wed May 30 04:37:59 UTC 2007
Wed May 30 04:38:00 UTC 2007
Wed May 30 04:38:01 UTC 2007
Wed May 30 04:38:02 UTC 2007
Wed May 30 04:38:19 UTC 2007
Wed May 30 04:38:21 UTC 2007
Wed May 30 04:38:22 UTC 2007
Wed May 30 04:38:23 UTC 2007
Wed May 30 04:38:24 UTC 2007
Wed May 30 04:38:08 UTC 2007
Wed May 30 04:38:09 UTC 2007
Wed May 30 04:38:10 UTC 2007
Wed May 30 04:38:28 UTC 2007
Wed May 30 04:38:12 UTC 2007
Wed May 30 04:38:30 UTC 2007
Wed May 30 04:38:31 UTC 2007
Wed May 30 04:38:32 UTC 2007
Wed May 30 04:38:33 UTC 2007
Wed May 30 04:38:34 UTC 2007
Wed May 30 04:38:35 UTC 2007
Wed May 30 04:38:19 UTC 2007
Wed May 30 04:38:20 UTC 2007
Wed May 30 04:38:21 UTC 2007
Wed May 30 04:38:22 UTC 2007
Wed May 30 04:38:40 UTC 2007
Wed May 30 04:38:41 UTC 2007
Wed May 30 04:38:42 UTC 2007
Wed May 30 04:38:43 UTC 2007
Wed May 30 04:38:44 UTC 2007

wordpress.com squeezed out another baby last night

Congrats to http://claudiacanals.wordpress.com/, which happens to be our one-millionth hosted blog. This happened around 11:38pm PDT, and weighed 7lbs 6oz

MySQL Tip: HAVING is your friend

Oftentimes you end up needing to pull something from the database conditionally, depending upon some other criteria. For example, given the following table, how would you list the first names shared by more than one person?

table: users
columns: firstname, lastname

Commonly people will either put a counter in somewhere, or it will be done with some loops in the application itself (which is a tremendously bad idea).

But MySQL gives you a handy-dandy way of doing exactly this task using GROUP BY and HAVING:

SELECT firstname FROM users GROUP BY firstname HAVING count(firstname) > 1;

That's pretty easy :) Now on a large table you will want to have the column(s) that you do this on indexed.
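If you want a feel for the "keep only groups with more than one member" idea without firing up a database, there's a rough command-line analog (my own illustration, not part of the MySQL tip — the names are made up):

```shell
# sort groups identical names together; uniq -d then prints only the
# duplicated lines -- the shell cousin of GROUP BY ... HAVING count(x) > 1
printf '%s\n' alice bob alice carol bob | sort | uniq -d
```

That pipeline prints just "alice" and "bob", the two names that occur more than once.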

PHP CLI Status Indicator

Most times when people write command-line scripts they just let the output flow down the screen as a status indicator, or just figure "it's done when it's done." But sometimes it would be nice to have a simple, clean status indicator, allowing you to monitor progress and gauge time-to-completion. This is actually very easy to accomplish. Simply use a carriage return (\r) instead of a newline (\n) in your output. Obviously the example below is very simplified, and this can be applied in a much more sophisticated fashion. But it works.

$row_count = get_total_rows_for_processing();
$limit = 10000;
echo "\r\n[  0%]";
for ( $i = 0; $i <= $row_count; $i = $i + $limit ) {
  $query = "SELECT * FROM table LIMIT {$limit} OFFSET {$i}";
  // do whatever
  $pct = round( ( min( $i + $limit, $row_count ) / $row_count ) * 100 );
  if ( $pct < 10 ) {
    echo "\r[  $pct%]";   // pad so the brackets stay aligned
  } elseif ( $pct < 100 ) {
    echo "\r[ $pct%]";
  } else {
    echo "\r[$pct%]";
  }
}
echo "\r[100%]\r\n";
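The same trick works straight from the shell, if you want to see the \r behavior in isolation (a minimal sketch; the totals here are made up):

```shell
# printf with \r returns the cursor to column 0 without advancing a line,
# so each iteration overwrites the previous percentage in place
total=100000
limit=10000
i=0
while [ "$i" -le "$total" ]; do
    pct=$(( i * 100 / total ))
    printf '\r[%3d%%]' "$pct"   # %3d right-aligns, like the if/else padding above
    i=$(( i + limit ))
done
printf '\n'
```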

Backgrounding Chained Commands in Bash

Sometimes it’s desirable to background a chain of commands so that a multi-step process can run in parallel. And oftentimes it’s not desirable to write yet another script for a simple task that doesn’t warrant the added complexity. An example of this would be running backups in parallel. The script snippet below allows up to 4 simultaneous tar backups to run at once, recording the start and stop times of each individually, and then waits for all the tar processes to finish before exiting.

max_tar_count=4
for i in 1 3 5 7 2 4 6 8
do
  # throttle: wait for a free slot before launching another tar
  while [ "$(ps auxww | grep -v grep | grep -c 'tar -cf')" -ge $max_tar_count ]
  do
    sleep 60
  done
  ( date > /backups/$i.start &&
      tar -cf /backups/$i.tar /data/$i &&
      date > /backups/$i.stop ) &
done
# wait for the last stragglers to finish
while [ "$(ps auxww | grep -v grep | grep -c 'tar -cf')" -gt 0 ]
do
  sleep 60
done

The real magic above is that last loop. You DO want it in there to make the script wait until all the backups are really done before exiting.
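As an aside, the shell's own job control can stand in for the ps-polling in the final step: "wait" with no arguments blocks until every backgrounded child of the script has exited. A minimal sketch of the pattern (echo and sleep stand in for the real tar commands):

```shell
# launch several chained commands in parallel, then block on all of them
for i in 1 2 3 4; do
    ( echo "backup $i started" && sleep 1 && echo "backup $i finished" ) &
done
wait    # returns only after all four backgrounded subshells have exited
echo "all backups done"
```

Polling ps does have one advantage over wait: it also counts tar processes started by other scripts, which matters if several backup jobs share the machine.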

You’re only ever done debugging for now.

I’m the kinda guy who owns up to my mistakes. I also strive to be the kinda guy who learns from them. So I figured I would pass this on as some good advice from a guy who has screwed that particular pooch.

There was a project on which I was working, and that project sent me e-mail messages with possible problem alerts. All was going well, and at some point I turned off those alerts. I don’t remember when. And I don’t remember why. Which means I was probably “cleaning up” the code. It was, after all, running well (I guess). But along came a bug introduced with new functionality (ironically, from somewhere WAAAAAAY up the process chain from my project). And WHAM, errors up the wazoo. But no e-mails. Oops. Needless to say the cleanup process was long and tedious… especially for something that was avoidable.

I’ve since put the alerting code back into the application, and have my happy little helpers in place fixing the last of the resulting issues.

The lesson to be taken from this is that you’re only ever done debugging for now. Because tomorrow that code that’s working perfectly now won’t be working perfectly anymore. And the sources of entropy are, indeed, endless.

Any… good… php devs out there looking for some side work?

I know a group of guys looking to do some cool stuff who could use a few good contractors. Drop me an e-mail with maybe a sample or something cool you did in php and I’ll pass it on.

apokalyptik apokalyptik com — Subject: “PHP Consulting” (I’ll likely completely overlook your mail if you use some subject not starting with that string)

Cheers

DK

It’s not sharing files that sucks, it’s file owning!

Let’s face it… sharing files (especially photos and videos) is easy and free. Flickr, Youtube, Photobucket, and Divshare all make it spectacularly freaking easy to share a file. And I’m so fed up with “sharing a file” that I’m ready to be sick. Stop putting speed holes in a technology that already works pretty damn well and give us something new that we need. Let me tell you what I’m talking about.

I have somewhere between 6,000 and 10,000 digital photos. Which is a lot. And right now they’re tucked into iPhoto. Which is great… until… I want to share my entire collection. My wife wants to find a few photos of the puppies. I have to go over to my computer and launch iPhoto; then she has to open iPhoto, connect to my computer, and browse those 10,000 images over wifi, pulling an ungodly amount of traffic across the air and s l o w l y rendering thumbnails on her side. Then she has to import the photos. Or she kicks me off my computer so she can do it.

What’s wrong with both of these pictures (pardon the pun) is that it’s way harder for a family to own a large group of digital photos than it should be. Nobody makes it easy to own 10,000 photos, and what’s worse is nobody even makes it easy to add 10,000 photos.

So there’s your problem. The next big thing. We’ve made syndication for the masses; now let’s make mass media for the masses.

That’s my $.02

SWFUpload-PostVarMod

This is a small hack I did to the excellent SWFUpload script (v 1.0.2) which allows you to specify what variable name you would like the file to be uploaded as. You are therefore no longer chained to using “Filedata”; you could use “Photo,” “Document,” “Movie,” or whatever. Granted, in the grand scheme of things this isn’t *that* important, but it *does* make it easier to drop the SWFUpload script on top of an existing application without having to try and zip-tie it into place.

You can download it, the modified JavaScript, and its source here