Hpricot <text>sometext</text> workaround

As noted by the open trouble ticket here, the most awesome Hpricot seems to have come down with a bug: it's not able to access "sometext" inside this: "<text>sometext</text>". It parses it OK (puts doc.inspect definitely shows the proper {elem}), you just can't get to it. So here's my ugly little hack/workaround for this issue until it's resolved. (I'm posting it here since I can't seem to sign up to leave a comment on the bug report on the Hpricot home page… and someone might find this useful.) This hack is specifically for web documents, but it would also work for strings or files with only minor tweaks.

## Begin hack
require 'open-uri'
require 'hpricot'

doc = ""
open(url) do |f|
  doc = doc + f.read
end
doc = doc.gsub(/<text>/, "<mtext>")
doc = doc.gsub(/<\/text>/, "</mtext>")
doc = Hpricot(doc)
## Should be one line:
## doc = Hpricot(open(url))
## End hack
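
Once the elements are renamed, Hpricot can get at the text normally. For example (assuming the document actually contains at least one such element):

## After the rename, the text is reachable as usual
puts doc.at("mtext").inner_html  ## => "sometext"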

ruby-Mapquest

Changelog (Click Version To View/Download)

  • Version 0.005
    • Addition: thumb_height, thumb_width, thumb_style, thumb_type, thumb_url to route rval when available (per turn)
  • Version 0.004
    • Addition: apistatus to geocode rval
    • Addition: geocode status to geocode rval
    • Addition: geocode quality to geocode rval
    • Documentation: Added commas to geocoding example… oops
  • Version 0.003
    • Bugfix: added apikey to overviewmap
    • Bugfix: distance is now a float value
    • Change: overview metrics renamed from maneuver_foo to maneuvers_foo
    • Addition: [:debugurl] for manual inspection of values returned from MapQuest's API
  • Version 0.002
    • Addition: Routing Support
  • Version 0.001
    • Initial Release

Warranty: None, at all, whatsoever. Use at your own risk. May burn down your house, knock over your garbage cans, return the car with the gas level on "E", and refuse to return your lawnmower even though it's now 7 months later and you're growing a small rainforest out back… might fall asleep while watching your children, forget to pay the electric bill, and run up massive credit card debt. In other words: you're on your own. Don't come crying to me!

To use this client you first have to apply for a MapQuest OpenAPI key here. Then you must add "*" as a referrer under "my account" for your OpenAPI key.

Example usage:

mq = Mapquest.new("foobazbazbooblah")

#Geocoding
myLocation = {
  :address => "555 17th Street, Suite 1600",
  :city => "Denver",
  :state => "Colorado",
  :zip => 80202,
}
puts mq.geocode(myLocation).inspect

#Routing
route_request = {
  :addressOrigin => {
    :name => "Yahoo!",
    :address => "701 First Avenue",
    :city => "Sunnyvale",
    :stateProvince => "ca",
  },
  :addressDestination => {
    :name => "Google",
    :address => "1600 Amphitheatre Parkway",
    :city => "Mountain View",
    :stateProvince => "ca",
  },
}
puts mq.route(route_request).inspect
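
If something looks off, the changelog above mentions a [:debugurl] key for manual inspection; assuming it's present on the hash you get back, you can print it and paste the URL into a browser to see MapQuest's raw response:

#Debugging (assumes the returned hash carries :debugurl)
puts mq.route(route_request)[:debugurl]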

As simple as it gets, no? Cheers!

ruby-Delicious v0.001

Since I've worked out the kinks mentioned in my last blog entry (it was a problem with re-escaping already escaped data, by the way; never debug while sick and sleep deprived!), I've scraped things together into a class which is a client for the API itself. It's relatively sparse right now, but good enough for use in an application, which is what the client is geared towards: specifically (and privately) tagging arbitrary data. It *can* publicly tag URLs, but that's more or less a side effect of what delicious… is… and not a direct intention while writing the API. You can visit the quickly thrown together ruby-Delicious page here (link also added up top).

Having a strange problem with the del.icio.us api

I'm using code referenced here: http://www.bigbold.com/snippets/posts/show/2431 to access the del.* tagging API, with only limited success. I don't think it's the code, though, because the problem is reproducible in the browser, and everything *seems* to line up with the docs. The URL I use to create the item is:

https://api.del.icio.us/v1/posts/add?&url=la+la+la&description=foo&tags=foo

This works. I get a nice "foo" with the proper URL "la la la", and I get a pretty <result code="done"/>. Then I try to delete the item with either of these URLs:

https://api.del.icio.us/v1/posts/delete?&url=la+la+la

https://api.del.icio.us/v1/posts/delete?&url=foo

Neither of these works. I still get a pretty <result code="done"/>, but the item is never deleted…
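
For reference, this is roughly what's happening on the wire (a minimal sketch, not the snippet code from the link above; it assumes HTTPS basic auth against api.del.icio.us/v1 and escapes the url parameter, with "user"/"pass" as placeholders):

require 'net/https'
require 'cgi'

## Build the query string, escape each value, and issue a GET with basic auth
def delicious_get(user, pass, path, params)
  query = params.map { |k, v| "#{k}=#{CGI.escape(v.to_s)}" }.join("&")
  http = Net::HTTP.new("api.del.icio.us", 443)
  http.use_ssl = true
  request = Net::HTTP::Get.new("/v1/#{path}?#{query}")
  request.basic_auth(user, pass)
  http.request(request).body
end

puts delicious_get("user", "pass", "posts/add",
                   "url" => "la la la", "description" => "foo", "tags" => "foo")
puts delicious_get("user", "pass", "posts/delete", "url" => "la la la")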

I saw this problem referenced on the Tucows Farm blog, but the only suggestion in the comments was: "Google for "delicious-import.pl", it deletes bookmarks upto 100 at a time. A quick little override in the code will make it delete all bookmarks. Handy when you screw up an import. Not so handy in other situations, which is why you cant do it by default. This script will read a netscape/firefox/mozilla type bookmark file. I am re-working it to do Opera for me." Which I did. The URL built inside of that script is http://user:password@del.icio.us/api/posts/delete?&url=la+la+la, but that's out of date. I tried it anyhow, and it redirected me to the /v1/ query above (https://api.del.icio.us/v1/posts/delete?&url=la+la+la), which still didn't work. I can't imagine that I'm the only person who's run into this problem.

Tag anything, anywhere?

I've not been able to find anything really high-profile (good Google PageRank), but is there an API which allows you to tag *anything*, anywhere? (Not just URLs, but any piece of data?) Being able to take one arbitrary identifier, optionally a type, and add arbitrary tags to it sounds like the stuff of Web 2.0, yea? But it seems people are just home-brewing their own. Now if I were able to go somewhere and hit /tags/people/demitrious or /tags/blogs/demitrious or /tags/*/demitrious or /tags/urls/apokalyptik.com or /tags/foo/bar, then we'd be getting somewhere.
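
Just to make the wish concrete, here's the sort of client I have in mind. To be clear: none of this exists; the host and paths are completely made up:

require 'open-uri'
require 'cgi'

## Hypothetical: fetch the tags attached to any identifier of any type
def tags_for(type, identifier)
  open("http://tags.example.com/tags/#{CGI.escape(type)}/#{CGI.escape(identifier)}") { |f| f.read }
end

puts tags_for("people", "demitrious")
puts tags_for("urls", "apokalyptik.com")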

The quest for clean data

When you're on the leading edge of things you always have a problem: dirty data. And the quest for clean data is always at the forefront of your mind. Last night, while searching for a geocoding service which didn't suck outside the US and major EU countries, I came upon this article, which put into words the stormy mood that had been brewing whilst I struggled in my quest: Geocoding, Data Quality and ETL. I know geocoding is outside my normal sphere of writings, but the way technology is going, some of you are eventually going to have to work with geocoding.

And the bottom line is this: while we now have the tools and techniques necessary for getting the job done right, it's going to be a long time until we actually get it right. It's just one of those things that takes a lot of time, money, and manpower to accomplish.

That being said… I wonder how difficult it would be to mash up, say, Google Earth and current cartographic services to specifically draw attention to problem areas, and to set it up as an automatic alert for new expansion (or demolition, for that matter?!). This not being my area of expertise, I'd be hard-pressed to get that right, at least not without some insider help. But I'd be willing to bet that it would prove a valuable tool for the geospatial community. And make no mistake about it: the better they are at doing what they do, the easier it is for you to find the newest Starbucks.