
Archive for the ‘nerd’ Category

Transit of Venus

June 5, 2012

It turned out to be a mostly clear evening to watch the Transit of Venus. Jen and I went to Holland State Park so we could watch from the beach, and I brought my trusty binoculars and a piece of cardboard so I could project a magnified image of the sun without burning my retinas. It worked pretty well, though for the next transit, 105 years from now, I think I’ll invest in a tripod to mount the binoculars, because holding them by hand is a bit too shaky.

Who needs high tech when you have a steady hand?

There were a few pesky clouds for a while, but things mostly cleared up after 7:30 and stayed that way until sunset. The binocular-and-cardboard trick worked like a charm. The pictures aren’t the best because I only had a crappy camera phone, but there are plenty of better pictures online. These are mine, so deal with it.

That’s a whole freaking planet nearly the size of the Earth in that pinprick of shadow

I just love the fact that there’s a whole planet there in that little piece of shadow, nearly the size of our own Earth. It puts things into a bit of perspective. Carl Sagan put it much better in Pale Blue Dot, and the same thing can apply here. All our existence and hopes and dreams, wars and loves, they all fit into something about that big; something that, while only a relative stone’s throw away, casts a shadow the size of a grain of sand, and only if you look really hard. We’re pretty insignificant here. Let’s make the most of it. But enough soliloquizing.

A small crowd gathered around to see my high tech solution, and I got to play science teacher for a little bit. A few were amazed that such a thing could be done with binoculars, and several were expecting something as large as a lunar eclipse and seemed a little disappointed at how small the shadow was. A little girl asked whether the shadow of Venus would appear on top if I flipped the binoculars, so, in the interest of science, we did a little experiment by flipping the binoculars over and found that no, the image stays just as it is.

One of the guys I talked to had a welder’s mask that he let me borrow, and I was surprised at how well it worked; you could see the transit just fine through it. Another fellow had a huge telescope with a solar filter hooked up to a laptop. He drew a bigger crowd than I did. Nerd jealousy.

When the clouds were mostly gone, I found I could get a much larger, but much dimmer, image by projecting from a little farther away, once I had a bench to steady my hand. I thought this one was cool.

All in all, it was a whole lot of nerdy fun, and there was ice cream to be had, and a pretty sunset. Just before the sun went down, I hazarded some staring at the sun with only my sunglasses to protect me, and was pleased to see Venus’ backside for the last time in my life. The next transit of Venus is in 2117, and if our species hasn’t managed to eradicate itself by then, hopefully there will be another generation who will appreciate its beauty. The baton has been passed.

Categories: nerd

Cthulhu on my Kindle

December 27, 2011

My lovely wife bought me a Kindle as a gift and I’ve spent the first few days playing with it. So far, I love this thing.

I’ve been hoarding free books from all over the internet. Amazon has a bunch of free books on its site but also recommends other repositories. Project Gutenberg is pretty damn slick. Plus, today I found a totally free collection of H.P. Lovecraft’s works over at CthulhuChick.com. Well done, Ruth. You rock!

Amazon also has a nifty way of getting books onto your Kindle. When you register your device, it gets assigned a specific @kindle.com email address (managed on amazon.com); send books to it as email attachments and they’ll show up the next time your e-reader connects to the web. It accepts zip files as well as .mobi files and a few other formats. Plus, Amazon keeps a copy of the books you send over email, so if you accidentally delete something, like I already have, you can just pull it up under the Personal Documents section of the Kindle management page and resend it to your device.
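
If you’d rather script that than fire up a mail client, the email step is plain old SMTP. Here’s a minimal C# sketch; the addresses, server, and credentials are placeholders you’d swap for your own, and keep in mind Amazon only accepts mail from senders you’ve approved on the management page:

using System.Net;
using System.Net.Mail;

class SendToKindle
{
    static void Main()
    {
        // placeholder addresses: your approved sender and your device's @kindle.com address
        var message = new MailMessage("you@example.com", "your-device@kindle.com");
        message.Subject = "book";
        message.Attachments.Add(new Attachment("cthulhu.mobi")); // .mobi, .zip, etc.

        // placeholder SMTP settings for whatever mail provider you use
        var client = new SmtpClient("smtp.example.com", 587)
        {
            EnableSsl = true,
            Credentials = new NetworkCredential("you@example.com", "password")
        };
        client.Send(message);
    }
}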

I was a bit of a naysayer when these things first came out, but I can definitely see the benefit of having one. Now I just wish I could squeeze my huge-ass hardcover copy of the Autobiography of Mark Twain into digital format without buying it again, because that thing is freaking heavy.

Categories: books, nerd

I’m Sold on Dapper

December 8, 2011

Don’t get me wrong, I love my LINQ. I just have mixed feelings about LINQ2SQL, or anything that promises to make my life easier by allowing me to write fewer SQL statements.

I’ve been burned a few times too many by seemingly innocuous LINQ2SQL queries that ended up ballooning into resource hogs once deployed to the real world. Often, it’s a sleeper; some query that’s been running just fine for years and then, BAM! You get jolted by a spiked CPU like a shovel to the face because an email campaign hit some remote part of the site that hadn’t been pored over. A little digging finds that LINQ2SQL has drunkenly taken over the kegger at your parents’ house, smashing lamps and vases and shoving your friends into the pool, wreaking all sorts of havoc and running your CPU off the charts.

It’s that friend you learn to limit. He may be great in certain situations, like running the basic CRUD (Create, Read, Update, Delete) routines on all those cumbersome admin screens, but once you take him into the real world, once you expose him to all your other friends on your high traffic ecommerce site, once you give him a broader audience, you run the risk that he’ll show his true colors, and you may not like what you see.

Such has become my relationship with any ORM that promises to lift the burden of having to write straight SQL. It’s fine in the right circumstances and saves loads of time on basic operations. But once you cook up a slightly more complex query and roll it into a public website with tens of thousands of hits an hour, it just doesn’t cut it. Trusting the black box of ORM SQL generation often turns out to be a risky endeavor.

I’d rather be in direct control of what SQL gets executed when writing finely tuned database access. Thus, I’ve come to love what Dapper has to offer. Dapper, by the folks over at stackoverflow.com, is an extremely lightweight data access layer optimized for pulling raw data into your own POCOs (Plain Old C# Objects). It’s that perfect fit between the nauseatingly redundant world of SqlCommands and DataReaders, and the overzealous and overbearing friend you find in LINQ2SQL. No longer do I have to guess at what kind of query an ORM is going to generate. No longer do I have to worry that LINQ2SQL is going to fly off the handle and take up all my CPU trying to compile the same dastardly query over and over again. I can instead write the SQL myself and get it into my POCO of choice with less effort than it takes to bash my head on the keyboard.

For example, let’s say I’ve got this domain object:

public class OmgWtf
{
    public string Acronym { get; set; }
    public string Sentence { get; set; }
}

All I have to do to yank the data from the database is this:

// requires: using Dapper; using System.Data.SqlClient;
using (var conn = new SqlConnection(ConnString))
{
    conn.Open();

    string sql = @"
        SELECT TOP 1 omg.Acronym, wtf.Sentence
        FROM OnoMatopoeicGiddiness omg
        JOIN WordsToFollow wtf ON wtf.OmgID = omg.ID
        WHERE wtf.ID = @WtfID";

    // Dapper maps the result columns onto OmgWtf properties by name
    var omgwtf = conn.Query<OmgWtf>(sql, new { WtfID = 3 }).First();

    Console.Write("{0}: {1}", omgwtf.Acronym, omgwtf.Sentence);
}

The result is, of course:

SQL: I Squeal for SQL!

No longer do I have to suffer the fate of black box SQL generation when all I really want is a clean, easy, and fast way to get my SQL or stored procedure results directly into my domain objects. I’m sold on Dapper for my performance-critical pages. As we maintain our sites and find the occasional bloated LINQ2SQL resource hog, we’re swapping those queries out for straight SQL, stored procedures, and Dapper, and it has really sped things up.
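
Stored procedures are just as painless. A quick sketch, assuming a hypothetical GetWordsToFollow procedure that returns the same two columns (CommandType lives in System.Data):

// GetWordsToFollow is a made-up proc name for illustration
var results = conn.Query<OmgWtf>(
    "GetWordsToFollow",
    new { WtfID = 3 },
    commandType: CommandType.StoredProcedure).ToList();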

Go ahead, give it a shot yourself. It’s available on the NuGet Gallery, and only imports a single C# file; no extra assemblies required. They’ve got plenty of examples at the project site. I’m wondering how I ever lived without it.

Categories: nerd

Coming to Terms With Baseless Merging in TFS

October 20, 2011

Team Foundation Server has never been friendly when it comes to the complicated love triangles that inevitably arise from wanting to merge between three different branches. The only option we have is, at some point, to create a common merge point using a baseless merge. These tend to be pretty finicky, and if you don’t plan for them up front, they’re nearly impossible to deal with.

Our typical use for three-way merging arises when we branch from the trunk into an internal developer branch and, at the same time, create a partial branch (just views and content) for a third party design group we want isolated from the rest of the team. We need to be able to merge our internal code changes to and from their branch. These scenarios are usually project-based, and we get by on the fact that we can create both the design branch and the internal branch at the same time. Because the two new branches start from the same point, a baseless merge between them establishes a merge history, which sets you up for a project lifetime filled with happy merges.

However, we ran into another scenario the other day and I didn’t know whether we’d be able to handle it. Our client has a large codebase and our general strategy is to keep the Trunk in sync with what’s live or approved to go live (though we’re contemplating another “Production” branch, which may make things easier). During the October to December timeframe, the busy season hits the websites as customers buy their product, and development on our side slows down, at least on the day-to-day small projects. We can’t leave large projects in the trunk during this time and risk accidental deployment.

We will have several large projects going on which won’t be released for several months. One of these projects is to finally upgrade all our disparate systems to .NET 4 and MVC 3. At the same time, we’ll have at least one more large project separate from the upgrade, but looking to use a lot of the fun new MVC 3 functionality. We need a three-way merge. We may have other projects coming down the pipe during these months as well, so I wanted to find a way to use baseless merging to ensure that all new projects could be merged with the Upgrade branch without polluting the trunk.

Future projects won’t have the same exact starting point as the designer branch scenario, but I found a way to mimic starting from the same point. It goes something like this.

Assume you have Trunk and Upgrade branches which were branched months ago and each has a lot of changes since then. You need to branch from Trunk into the new project branch, Foo, but then to also get the updates from the Upgrade branch.

  1. View history on the Upgrade branch and find the changeset at which it was branched. We’ll call that X.
  2. Branch from the Trunk to create Foo, but branch from changeset X.
  3. From the command line, do a baseless merge from Upgrade to Foo, specifying changeset X and including the /discard option (see the command sketch after this list). This only creates the merge history, which is fine because the code is identical at changeset X.
  4. Now you’ve got your merge history. You can merge up from the trunk to get the latest of its code, and you can merge from the Upgrade code to get the latest of its code, and everyone’s happy.
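
For reference, the command-line step in #3 looks something like this; a sketch assuming hypothetical server paths $/Project/Upgrade and $/Project/Foo, with 12345 standing in for changeset X from step 1:

rem baseless merge from Upgrade to Foo at changeset X, recording merge history only
tf merge /baseless /discard /recursive /version:C12345~C12345 $/Project/Upgrade $/Project/Foo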

And there you have it. Of course, now that I think about it, I’m wondering whether I’m over-complicating things. I bet I could get the same end result by merging the latest from Trunk to Upgrade, branching from Trunk to Foo at latest, then baseless merging from Upgrade to Foo and accepting all edits. For some reason, the steps above seem cleaner to me, but they probably boil down to the same thing.

I guess the moral of the story is this: Baseless merging is going to be a nightmare if you don’t plan for it up front. The whole reaching into history for a common merge point probably has other uses as well, and I wonder whether something like this could also be useful to bring together two separate branches which weren’t baseless merged up front. I’ll investigate that another time. In the meantime, I’ll dream wishful dreams of distributed repositories like git and Mercurial, which live in a land where this type of thing is supposedly mundane.

Categories: nerd

Lazy Loading By Using C# Yield Statements

October 19, 2011

I keep forgetting about that fun little yield statement we’ve got in C#. It’s the one that lets you magically turn a normal method into an enumerator.

The other day I was feeling lazy. I had a bunch of views that all shared the same model, which allowed for a bunch of different editors for different types of data. Some of them needed a list of features pulled from the database. The easy thing to do would have been to slap that feature query result onto the model, but only a single view needed it, and it was an expensive query with some other nonsense happening to the list in the controller, so it was slowing down all the other views unnecessarily.

I then remembered the yield statement and wondered whether it would lazy-load, hoping that my slow loading method would never be called unless the view actually used it. It worked!

// requires: using System; using System.Collections.Generic; using System.Linq;
static void Main(string[] args)
{
    Console.WriteLine("Test 1 (we're not enumerating this time)");
    var foo1 = GetFeatures();

    Console.WriteLine("Test 2 (just peeking in the enumeration)");
    var foo2 = GetFeatures();
    Console.WriteLine("First entry in list: " + foo2.First());
}
        
// this is the expensive method
static IEnumerable<string> GetFeatures()
{
    Console.WriteLine("GetFeatures was called");

    // just pretend this is an expensive query
    var expensiveList = new string[] { "a", "b", "c", "d" };

    //and pretend we're doing something more complex
    foreach (var item in expensiveList)
        yield return item;
}

And the output:

Test 1 (we’re not enumerating this time)
Test 2 (just peeking in the enumeration)
GetFeatures was called
First entry in list: a
Press any key to continue . . .

Of course, all this lazy loading nonsense could be skipped if we just built the query logic inside the model’s collection get method, but that wouldn’t be very MVCy, now, would it? I’d rather set the list in the controller.
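
If you’re curious how that shakes out in practice, here’s a minimal sketch; EditorModel and EditorController are made-up names, since the real code isn’t shown here:

// hypothetical names throughout; requires System.Web.Mvc for Controller/ActionResult
public class EditorModel
{
    // nothing executes when this is assigned; the query fires only if a view enumerates it
    public IEnumerable<string> Features { get; set; }
}

public class EditorController : Controller
{
    public ActionResult Edit()
    {
        var model = new EditorModel { Features = GetFeatures() };
        return View(model); // only the one view that loops over Features pays for the query
    }

    private IEnumerable<string> GetFeatures()
    {
        // stand-in for the expensive database call
        foreach (var feature in new[] { "a", "b", "c" })
            yield return feature;
    }
}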

Categories: nerd

They Might Be Giants and Jonathan Coulton

September 20, 2011

They Might Be Giants and Jonathan Coulton came to town a few days ago. I fell in love with Jonathan Coulton a few years ago when I first heard his rendition of Baby Got Back. The rest of his repertoire is one giant nerdgasm after another. It was through his website that I found out he was coming to town, and the Giants were icing on the cake.

It was a great concert, but I sure wish that Coulton’s set was longer. He couldn’t have played more than a half dozen songs before rushing off the stage to allow for another hour of setup. It would have been much better to just give him an acoustic guitar during that interim so we could get our money’s worth.

I’ve always teetered between being a big fan of TMBG and getting too annoyed with their music. I love their variety and quirkiness and the constant struggle of trying to figure out what the hell they’re singing about without going to their wiki. On the other hand, I hate to sound petty, but sometimes I just can’t get over the grating voices of the lead singers. Usually I’m fine with it, but there are some songs where the earsplitting nasality of their vocals is just too much and I have to take a break. I’ve never been a fan of Rush or the Smashing Pumpkins for the same reason, but I can dig the Giants. They’re worth the extra effort.

The crowd was definitely a new one for me. It felt like I was in a nerdy internet forum, with all the current memes being represented. At one point I realized that I’m probably just as nerdy as the rest of the bunch, because I understood and enjoyed a lot of the obscure humor. I guess I just try to hide my inner nerd. These people were flaunting it. Good for them.

Categories: music, nerd

Unit Testing is for the Weak

July 2, 2011

I know it’s bad for me and the project, but every time I start programming something new, I jump into it without creating any unit tests. Even when I know the code I’m updating would really benefit from being tested, and that in the long run tests would save me a lot of time and brainpower, I still start plowing through with my bulldozer of coding prowess.

It’s always the same order of events. I spend a half day hammering away at code before I start recognizing the anti-pattern. I keep building, hitting F5 in my browser, doing a congratulatory fist-pump after my awesome new code works, then lowering my fist as I realize I may have broken something else. I figure, oh, it’s just this once, and I can adjust some query string variables to test the other few variations, then go on to the next minor detail. The whole scene repeats, each new thing tested again and again by changing the URL, until I realize that dammit, I knew I should be unit testing this crap.

It happened again today. I was extending some custom paging code so we could inject some ads into a list of products. The original custom paging code wasn’t wrapped in tests, so I figured, why waste my time with unit tests when I have this handy F5 button? It took all day before my dammit moment, at which point I shelved everything and wrote a few basic tests around the existing functionality as a starting point.

And of course, it worked like a charm; way easier than I’d assumed up front. I can stop monkeying with the URL for every freaking variation of page sizes and indexes, and instead focus on one individual problem at a time. It boils down to that single unit of focus I’m capable of. With these unit tests, I don’t have to keep all the previous variable possibilities in my head because they’re already written down. If I break something, I’ll know immediately. I don’t have to be surprised by the twelfth F5 when I realize that at some point I screwed something up, and that fixing it is going to screw up some other minuscule detail.
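
To make that concrete, the tests were nothing fancy; something along these lines, with a made-up Pager standing in for the custom paging code (the real thing isn’t shown here), written against NUnit:

using System.Linq;
using NUnit.Framework;

public static class Pager
{
    // stand-in for the custom paging logic under test
    public static int[] GetPage(int[] items, int pageIndex, int pageSize)
    {
        return items.Skip(pageIndex * pageSize).Take(pageSize).ToArray();
    }
}

[TestFixture]
public class PagerTests
{
    [Test]
    public void FullPageReturnsPageSizeItems()
    {
        var items = Enumerable.Range(1, 10).ToArray();
        Assert.AreEqual(3, Pager.GetPage(items, 0, 3).Length);
    }

    [Test]
    public void LastPartialPageReturnsRemainder()
    {
        var items = Enumerable.Range(1, 10).ToArray();
        Assert.AreEqual(1, Pager.GetPage(items, 3, 3).Length); // 10 items: three full pages, one left over
    }
}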

I don’t know why, after all these years, I can’t realize this simple fact up front and act on it. I guess I’m just weak, or cocky, or just plain dumb; too reliant on my cowboy coding skills, thinking that of course I can handle every little project that comes my way without wasting my time writing unit tests. But every time, the hours spent hitting F5 for every URL variation drop dramatically once I actually write a few unit tests and no longer have to re-test the same things by hand for every change I make. If I’m really trying to save time, I’ll start with the unit tests up front.

Categories: nerd