Jayson's Blog -
A conduit to the voices inside my head.


  • Three Things About Software I Didn't Learn In College

    Scott Hanselman recently put out an interesting request for folks to post about three things they've learned about software either in college, out of college, or both. Seeing as I only took one programming class in college (C++ 101), this post caters to his request of "I'm especially interested in those who didn't go to college at all, to add yours" (I do have some college, but only 3 semesters' worth).

    Reading Karthik's response is what inspired me to hammer out my own version.

    1. Learn how to say "no" early on in your software development career. I don't mean an all-out "no, we can't do that," but rather knowing how to avoid scope/feature creep. Give clients an inch and they'll take a mile if you start succumbing to a "we can do anything" attitude. This can snowball into ever-slipping ship dates, and the client will blame you.
    2. Release early and release often when it comes to milestones/alphas/betas. The sooner you can get bits into real world usability scenarios, the more polished the end product will be. Doing this also instills a greater sense of ownership from the client's perspective which will appease even the most 'hands on' type of clients.
    3. Your job does not end at 5 o'clock each day. Many days it will not end until well into the wee hours of the morning. I don't mean that you'll always be doing billable work during these hours...this is a career that requires more self training than most others, and as such if you're not willing to devote a substantial amount of your own time and money making yourself more marketable, then this isn't the right career path for you.
    4. *bonus* Try not to get caught up in what I like to refer to as the "everything looks like a nail" syndrome. It's very easy to want to start applying all of our newly discovered knowledge to projects we are currently working on. My rule is: if the project has an invoice attached to it, use practices that have worked for me/my team in the past and that I am familiar with; save the bleeding-edge stuff for personal projects, or get guidance from someone in the know (teammates, leads, etc.). Redoing architecture after the fact is expensive, and will usually cost you a client.

    Software development is still very much an art more than a science, so what works for one team very well may not work for another. In the end, it comes down to learning from past mistakes (of which there will be plenty early on in a career), and doing more of what has worked in the past. Well, for me at least.

  • Adding CoComment Support to Community Server

    [Update] I've updated my code to use Thomas Freudenberg's solution, which can be found in the trackbacks of this post.

    I posted earlier about CoComment, which is an online tool that allows you to track comments you make on other blogs (if you go plug in your email address on their site, you’ll get a beta key within a day or so if you want to try it out, which I highly recommend doing).  At the time of that posting, CoComment only supported the big name blogging engines, and from what I gathered somewhat poorly at that.  So I emailed the development team asking if there was anything I could do from my end to expedite the process of getting more blogging engines in the mix, namely Community Server of course. 

    Lo and behold I get an email back from them a couple of days later stating they had gutted the approach they were taking (screen scraping) and were moving to a tag based system, and about a day after that I received another message pointing me towards this post on their site which outlines the implementation they came up with.  Within half an hour I had this hammered out on my own site and am pleased to announce that it works quite nicely, so kudos to the folks over at CoComment.  CS purists probably would have taken a CSModule based approach, but seeing as A) this is still very beta software and thus might change quite often, and B) that it’s literally 10 lines of code, I simply plugged it into the Skin-CommentForm.ascx control.  Here’s the code:

    <%@ Import Namespace="CommunityServer.Components" %>
    <%@ Import Namespace="CommunityServer.Blogs.Components" %>

    <% WeblogPost currentPost = WeblogPosts.GetWeblogEntry(CSContext.Current.BlogGroupID, CSContext.Current.PostID); %>
    <% bool isAuthor = CSContext.Current.IsAuthenticated && CSContext.Current.User.UserID == currentPost.AuthorID; %>
    <script type="text/javascript">
            var blogTool                = "<%=SiteStatistics.CommunityServerVersionInfo %>";
            var blogURL                 = "<%=Globals.FullPath(currentPost.Weblog.HomePage) %>";
            var blogTitle               = "<%=currentPost.Weblog.Name %>";
            var postURL                 = "<%=Globals.FullPath(BlogUrls.Instance().Post(currentPost)) %>";
            var postTitle               = "<%=currentPost.Subject %>";
            var commentAuthorLoggedIn   = "<%=isAuthor %>";
    <% if (isAuthor) { %>
            var commentAuthor           = "<%=CSContext.Current.User.DisplayName %>";
    <% } else{ %>
            var commentAuthorFieldName  = "<%=tbName.UniqueID %>";
    <% } %>
            var commentFormName         = "__aspnetForm";
            var commentTextFieldName    = "<%=tbComment.UniqueID %>";
            var commentButtonID         = "<%=btnSubmit.UniqueID %>";
    </script>

    Voila, your blog is now CoComment enabled.  To actually use CoComment, you’ll need to get an account, and there is a bookmarklet available from their site that sits in your links toolbar of your browser of choice (it looks like the big 4 are supported).  My next feature request will be some sort of visual cue that lets the commenter know that the site is CoComment-enabled…other than that, it works beautifully.

  • Community Server -- Fix for Posting Comments from Feed Readers

    I have finally managed to track down and fix the “unable to post a comment from feed readers” issue with Community Server.  Unlike most of the other fixes, this one isn’t so trivial (and I knew as such going in…that’s why it’s taken so long).  Moving onwards….

    Out of the box, CS has virtually no implementation for CommentRSS (which is strange, as .Text supports it 100%…curious why Telligent didn’t include it with CS).  I followed the same pattern that .Text uses by implementing a custom httpHandler to intercept requests for a specific Url that comments are sent to from feed readers.  The first thing you’ll need to do is add a new class in the CommunityServerBlogs project (in the Components/Syndication folder); name it RssCommentHandler.cs, and make sure to change the namespace to CommunityServer.Blogs.Components (as C# projects will include the full directory structure in the namespace by default…interestingly enough VB.Net doesn’t do this; it defaults to the root namespace).  Here’s the code you’ll need to add to the new class:

    using System;
    using System.Web;
    using System.Web.Caching;
    using System.IO;
    using System.Xml;
    // CS
    using CommunityServer.Components;
    using CommunityServer.Blogs.Components;

    namespace CommunityServer.Blogs.Components
    {
        /// <summary>
        /// Handles comments POSTed from feed readers via the CommentAPI.
        /// </summary>
        // jayson.knight -- fix for posting comments from feed readers
        public class RssCommentHandler : IHttpHandler
        {
            private RssCommentHandler() { }

            #region IHttpHandler Members
            public void ProcessRequest(HttpContext context)
            {
                HttpRequest request = context.Request;
                if (request.RequestType == "POST" && request.ContentType == "text/xml")
                {
                    // load the RSS <item> the feed reader sent us
                    XmlDocument doc = new XmlDocument();
                    doc.Load(request.InputStream);

                    User user = Users.GetUser();
                    int postID = getPostIDFromUrl(request.RawUrl);
                    WeblogPost commentedEntry = WeblogPosts.GetPost(postID, false, true, false);
                    Weblog blog = commentedEntry.Section as Weblog;

                    // if comments aren't enabled, throw an http forbidden exception
                    if (!blog.EnableComments)
                        throw new HttpException(403, "Comments are not enabled");

                    Permissions.AccessCheck(blog, Permission.View, user);

                    // the author element may look like "Name <email>"; keep only the name
                    string name = doc.SelectSingleNode("//item/author").InnerText;
                    if (name.IndexOf("<") != -1)
                        name = name.Substring(0, name.IndexOf("<"));

                    WeblogPost post = new WeblogPost();
                    post.SubmittedUserName = name.Trim();
                    post.BlogPostType = BlogPostType.Comment;
                    post.SectionID = blog.SectionID;
                    post.ParentID = postID;
                    post.Body = doc.SelectSingleNode("//item/description").InnerText;
                    post.Subject = doc.SelectSingleNode("//item/title").InnerText;
                    post.TitleUrl = checkForUrl(doc.SelectSingleNode("//item/link").InnerText);
                    post.IsApproved = true;
                    post.PostDate = DateTime.Now;
                    post.BloggerTime = DateTime.Now;
                    WeblogPosts.Add(post, user);
                }
            }

            private string checkForUrl(string text)
            {
                if (text == null || text.Trim().Length == 0 || text.Trim().ToLower().StartsWith("http://"))
                    return text;
                return "http://" + text;
            }

            private int getPostIDFromUrl(string uri)
            {
                try
                {
                    return Int32.Parse(getRequestedFileName(uri));
                }
                catch (FormatException)
                {
                    throw new ArgumentException("Invalid Post ID.");
                }
            }

            private string getRequestedFileName(string uri)
            {
                return Path.GetFileNameWithoutExtension(uri);
            }

            public bool IsReusable
            {
                get { return true; }
            }
            #endregion
        }
    }

    The next change is in CommunityServer.Blogs.Components.WeblogRssWriter.PostComments method.  Add the following code:

    // jayson.knight -- fix for posting comments from feed readers
    // only write the wfw:comment tag if comments are enabled for this weblog
    if (CurrentWeblog.EnableComments)
    {
        this.WriteElementString("wfw:comment", FormatUrl(BlogUrls.Instance().RssComments(CurrentWeblog.ApplicationKey, p.PostID)));
    }

    As we’re looking for a new RssComments Url, we need to modify the CommunityServer.Blogs.Components.BlogUrls class with a new method to find the rewritten Url; add this method to this class:

    // jayson.knight -- fix for posting comments from feed readers
    public virtual string RssComments(string applicationKey, int PostID)
    {
        return FormatUrl("weblogRssComments", applicationKey, PostID);
    }

    We then need to tell CS how to rewrite this Url so that it formats correctly; thankfully the infrastructure for this is already in place via the SiteUrls.config file; add the following element in the HomePages section:

    <!-- jayson.knight fix for posting comments from feed readers -->
    <url name = "weblogRssComments" location = "weblogs" path="rsscomments/{1}.aspx" pattern="rsscomments/(\d+)\.aspx" /> 

    And finally, we need to map an httpHandler to our newly created RssCommentHandler class to intercept requests for rsscomments/*.aspx; add the following element to the httpHandlers section of the root web.config file:

    <!-- jayson.knight fix for posting comments from feed readers -->
    <add verb="POST" path = "rsscomments/*.aspx" type="CommunityServer.Blogs.Components.RssCommentHandler, CommunityServer.Blogs" /> 


    I will say this about CS: its Url rewriting infrastructure is powerful stuff; adding a new Url is a snap with it…this would have taken much longer without it in place.  Oh, and my new website is now officially done!

    Sidenote:  Huge thanks to Phil (aka Haacked) for helping me track down where to put the wfw:comment tag, he’s the Rss man!  I really need to get better at reading RFC specs, in this case the spec located here.  Thanks again Phil for your help.

  • Dynamically Inject Validation Controls From A Base ASP.NET Page

    I've been out of work this week sick (which really sucks...I don't do very well with idle time, plus I haven't been sick enough to miss work in many years), so what better time to catch up on some blogging.

    We had an interesting scenario at work recently whereby we needed to dynamically attach a number of validation controls to various TextBox controls on all of our existing pages. Specifically we had written a custom validation control which would test a control to make sure it didn't contain any HTML tags (a common enough scenario). This post assumes knowledge of authoring custom validation controls, i.e. inherit from BaseValidator and override EvaluateIsValid. For anyone needing a refresher, check out this MSDN post on custom validation controls.

    There are multiple ways to go about attaching validation controls to the controls they need to validate, but in the end I settled on a base page class that our existing base page could inherit from: just plug in the base page and everything is automagically wired up during the page request cycle. Given that validation controls will almost always need to validate TextBox controls (more specifically, any type of control that accepts user input), all we need to do is find controls that implement the IEditableTextControl interface. The presence of this interface in a control's inheritance hierarchy means that the control supports user editing of text.

    So in short, here is the plan of attack:

    • Author our custom validation control, or utilize one of the built in validation controls.
    • Create a base page that loops through all of the contained controls, and when a control is found that implements the IEditableTextControl interface, instantiate and attach our validation control. Recursion seems to be the best way to go about this.
    • [optional] Inject a ValidationSummary on the parent page to notify the user of any failed validation attempts.

    Here is the code for a base page that dynamically attaches a custom validation control called HtmlInputValidator:

    public class HtmlValidationBasePage : Page
    {
        protected HtmlValidationBasePage() { }

        protected override void OnLoad(EventArgs e)
        {
            List<Control> controls = new List<Control>();
            FindControls<Control>(this.Page.Controls, controls);
            if (controls.Count > 0)
            {
                controls.ForEach(delegate(Control control)
                {
                    IEditableTextControl textControl = control as IEditableTextControl;
                    if (textControl != null)
                    {
                        HtmlInputValidator handler = new HtmlInputValidator();
                        handler.ControlToValidate = control.UniqueID;
                        handler.Display = ValidatorDisplay.Dynamic;
                        handler.Text = "Failed Validation for control " + control.ID;
                        handler.ErrorMessage = "Failed Validation for control " + control.ID;
                        handler.SetFocusOnError = true;
                        handler.EnableClientScript = false;
                        handler.ID = control.ID + "Validator";
                        // attach the validator to the control tree next to the control it validates
                        control.Parent.Controls.Add(handler);
                    }
                });
            }

            // [optional] inject a ValidationSummary to report any failed validation attempts
            ValidationSummary summary = new ValidationSummary();
            summary.ShowSummary = true;
            summary.DisplayMode = ValidationSummaryDisplayMode.List;
            this.Form.Controls.Add(summary);

            base.OnLoad(e);
        }

        // Recurse through all of the controls on the page, collecting matches into foundControls
        protected void FindControls<T>(ControlCollection controls, List<T> foundControls) where T : Control
        {
            if (controls != null && controls.Count > 0)
            {
                for (int i = 0; i < controls.Count; i++)
                {
                    T found = controls[i] as T;
                    if (found != null)
                        foundControls.Add(found);
                    FindControls<T>(controls[i].Controls, foundControls);
                }
            }
        }
    }

    Of course this base page could (and should) be further extended to support turning validation on/off, or only validating certain groups of controls...the above sample is simply for sake of brevity and should serve as a starting point only.

    I've always been a huge proponent of authoring a custom base page class from which the rest of your pages inherit; it is without a doubt the easiest (and cheapest) way to get reusable functionality distributed to all of your pages with minimal coding effort. Their power really starts to shine in scenarios such as the one outlined in this post.

    See attached file below

  • Syntax For Generic Type Declarations In Type Elements

    I was recently working on implementing a provider based design for a project I'm working on which also happens to make heavy use of generics throughout the provider architecture. The signature of the type to be used in the <providers> section of the config file is MyType<T, V>, however I kept getting the dreaded "Unable to load type 'MyType' from assembly 'MyAssembly'" error when attempting to run the application. After about 30 minutes of wringing my hands wondering what the heck was going on, I remembered that generic types have a different signature when declared in text form. The fix was simple, instead of declaring it as

    type="MyAssembly.MyType, MyAssembly"

    This needs to be changed to

    type="MyAssembly.MyType`2, MyAssembly"

    Where the number after the '`' is the number of generic type parameters in the type's signature. It would have been nice if the runtime could have offered a hint like it does in other situations. This may be generics 101 for some, but I'm posting this in hopes it'll save someone else some time if Google picks up the keywords in this post.
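    Since it's easy to get bitten by this, a quick way to double-check the exact string the runtime expects is to print the FullName of the open generic type; a minimal sketch (using Dictionary<,> as a stand-in for MyType<T, V>):

    ```csharp
    using System;
    using System.Collections.Generic;

    class ArityDemo
    {
        static void Main()
        {
            // The CLR name of a generic type carries a backtick plus the number
            // of generic type parameters (its "arity").
            Console.WriteLine(typeof(Dictionary<,>).FullName);
            // -> System.Collections.Generic.Dictionary`2

            // Type.GetType -- the same resolution used for type="" attributes
            // in config files -- only understands the backtick form.
            Console.WriteLine(Type.GetType("System.Collections.Generic.Dictionary`2") != null);
            // -> True
        }
    }
    ```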

  • List of Bugs I've Experienced (So Far) with Community Server 1.0

    Here is a list of the bugs I experienced while migrating from .Text .95 to Community Server 1.0 (as it pertains to the blog piece of CS only)…it’s obvious that I still have some work to do to get everything fixed: 

    • CommentRSS/API was generating Http 404 errors for RSS subscribers (fix is here).
    • Individual Post Category RSS feeds were generating Http 404 errors for RSS subscribers (fix is here).
    • Comments made to blog posts on the website itself were not generating Email responses; generated the following exception (fix is here, writing a simple trigger hacks it):

      System.Runtime.InteropServices.COMException (0x8004020E): The server rejected the sender address. The server response was: 553 5.5.4 ... Domain name required for sender address Anonymous


    • RSS Comments show comment author as Anonymous; the feeds section of the website displays the name the author used in the comment (no fix yet, working on it); feed readers should show the same information.

    • Trackbacks show up as Anonymous; should display the name of the site that generated the trackback (no fix yet, working on it).

    • Comments/Feedback on blog posts in the admin section shows the author as being Anonymous; should display the value the author left in the comment (no fix yet, working on it).

    • By default, Community Server only allows a specific subset of HTML elements to be used in posts; this is configurable in the communityserver.config file (edit your communityserver.config file to allow the HTML elements/attributes you need; not extremely intuitive IMO).

    • Referrals migrated over from .Text don’t page correctly in the Admin/Referrals section; only the last 20 are displayed with no paging mechanism to see the rest of them (no fix yet, not a show stopper so I probably won’t fix it).

    • The timezone for RSS feeds was incorrectly set to GMT with no option in the Admin section to change this (fix is here, though you’ll need the CS source and a recompile).

    • Feedback in the Admin section for blogs should link back to the actual comment; you have to manually search for the post to view the feedback (no fix yet, working on it).

    • Trackbacks (sometimes) format incorrectly by using URL encoding for special characters (no fix yet, and this issue is probably last on my list).

    • Clicking the "Comments" link from the main page takes you to an anchor at the top of the comments, requiring scrolling up to see the "Add a comment" link; I prefer the old school "Show all the comments, and have the 'add comment' bit at the end" -- the present situation means a lot of scrolling up and down to read comments and post a response, and there's no way of seeing the comment you may be responding to without opening a second browser window (no fix yet, I have a feeling this will be addressed in 1.1 though as a lot of people are unhappy with this).

    • Unable to comment posts via CommentRSS/API (no fix yet, working on it).

    • There has been mention of a ton of issues with the version of FreeTextBox that ships with Community Server; I use BlogJet for posting so thankfully I haven’t seen these issues firsthand.  I do know that even though the version that ships with CS (v3.0) is supposedly compatible with Firefox, there’s some functionality that still doesn’t work.  I personally hate free text box and would recommend that folks use a 3rd party posting tool.

    • Link Categories/Links don’t sort properly when migrated over from .Text (manually updated the SortOrder column of the cs_Links/cs_LinkCategories tables).

    • Through some trial and error, figured out that the “Email me replies to this post” option under “Advanced Settings” in the post editor doesn’t do anything.  I confirmed this with Scott over at Telligent; it’ll be removed in 1.1.  The reason I tracked this down is that it’s set to false when posting from BlogJet; attempting to set it to true didn’t work in trying to resolve my email issues…given that it does nothing, makes sense.  The email fix is mentioned above.

    • Tons of build issues when compiling from source.  It compiles in Debug mode, but switching to Release broke everything on my machine (assembly references), and then Debug wouldn’t work either.  If you’re going to compile from source, just leave it in Debug mode and manually copy over updated assemblies to your target machine.

    • This one is strictly my opinion:  The CS database (as it relates to the blog piece only) is a trainwreck; violates 2nd normal form quite a bit.  For example, there are no less than 3 tables that deal directly with blog posts themselves, and there is a lot of redundant/duplicate data in each table.

    • Again, strictly my opinion:  CS blogs really abuse the hell out of the Anonymous user account.  Blogs are supposed to be anonymous in nature; I don’t expect my audience to all have an account set up in CS (though if you do set one up you get some added features)…so everything gets lumped into a generic anonymous account, which makes it hard to keep track of who’s doing what on my site.  This may work well for forum-type scenarios, but it doesn’t lend itself well to the blogging model.

    So, obviously not a trivial list of bugs/issues/complaints.  That all being said, there are a few things that make Community Server a compelling product:

    • Search.  This was loooong overdue; makes it very easy to find pertinent content.
    • For those that can get FreeTextBox to work (and don’t mind using IE), it’s a huge improvement over the version that shipped with .Text.
    • In the admin section, there’s a reports section that will list all of the exceptions generated by CS, making it incredibly easy to track down bugs and either fix them or report them.
    • User accounts.  I know that most folks are reticent to sign up for accounts on sites, but it’s nice to have this option.
    • Initial deployment is much easier than it was with .Text (bugs aside).  Out of the box, it works pretty well if you’re willing to put some time into fixing the minor annoyances.
    • For those hosting (n) number of blogs, the notion of blog groups is quite nice, and the aggregate page looks great.
    • And finally, though a little overwhelming, the admin piece is a huge improvement over .Text.

    If I had it to do over with, I would have waited for 1.1…it’s been an interesting experience so far though.  It’s all about learning right?

    I have a feeling this list isn’t complete; I’ll post new bugs (and hopefully fixes) as they present themselves.  If anyone notices anything, please ping me and let me know.  And of course if anyone has fixes to the issues listed above, let us all know.  Cheers.


  • Community Server Migration Script -- Round 3

    Exciting news on the .Text to Community Server migration script; Robert McLaws came across my previous post and tapped me on the shoulder to beef up the engine for multi-blog migration scenarios so he could wrap a slick WinForms GUI wizard around it, so I’ve done just that.  From the prototypes he’s sent me, it looks very promising…who doesn’t like a sexy GUI over a command line interface, right?  The architecture of the migration engine itself is quite simple; pass it an array of .Text BlogID’s (from the blog_Config table), and it dynamically builds a corresponding DTS package for each of the BlogID’s.  The packages can either be saved to the server, or executed directly from the engine itself. 

    There is only one thing that can be said about DTS…it’s f’ing fast.  I’m also equally impressed with how easy it is to dynamically generate DTS packages from code (though figuring out all the table/column mappings was the bulk of the work, and another story altogether).  I’ve posted my bits for the migration engine here (core engine is in DotTextCS.Engine.dll), though the frontend is a console app, connection properties need to be set via a standard .config file, and each user + blog needs to be manually configured via the CS admin pages before executing the utility.  See my comments here for further instructions.  Robert’s wizard will alleviate all of the manual-ness, so unless you’re feeling industrious (or have a trivial number of blogs to migrate), wait for the 1.0 release, hopefully to be posted within a couple of days, so stay tuned.

  • Ashlee Simpson Lip-Sync Blunder -- Video

    I must have been the last person in the free world to hear about the Ashlee Simpson mix-up on SNL this weekend (I haven't watched SNL since I was a teenager, back when it was somewhat funny).  At any rate, I'm a huge fan of pseudo-wannabe pop stars making asses out of themselves, so I set out to find the video; snag it here.  That's quality for ya...and wtf was she doing at the end, a hoe-down?  Then she blames the band (not just a session band, her band) at the end of the show?  Classy.

    More commentary here, and a link to her messageboards where more views of the clip can be downloaded.  Absolutely hilarious.

  • Fedora Core 3 -- Virtual PC

    Hello from Fedora Core 3 (running on VPC)! I should mention that I am a complete linux noob before I continue (I can spell l-i-n-u-x, and that's about it). This isn't my first foray into linux land (I had a round with SuSE 7.0 previously...perhaps 3 years ago?), but it is indeed my first successful one; I never could find drivers for my Voodoo 4 video card for SuSE, so it was basically unusable. Onwards to now...

    I won't comment on how I feel about Open Source Software (OSS, and linux in particular) other than to say while I like the idea in concept, in reality it still has a long way to go. The install via VPC wouldn't have been possible without this guide, and even after following those steps, I was limited to 800x600 resolution in VPC. I was quickly having flashbacks of my previous attempt at a linux install, but a quick google found this post which seemed promising. It took me another half hour or so of poking around to figure out that the XF86Config file doesn't exist on Fedora...on Fedora it's named xorg.conf and lives in /etc/X11. The configuration mentioned in the link works fine though, and I'm now happily running at 1600x1200. So why did I choose to start playing around with linux? No, I'm not jumping ship from MS products...I make my living as a .Net developer, so that would be similar to suicide. First reason, I like new (free) tech toys, but the second (and main) reason is to start toying around with Mono, which could actually further my .Net career...a no brainer.

    A little background first... I grew up on Macintosh. My mom is a graphic designer, and in an effort to keep up with the graphic design world she purchased a Mac when I was 12 (circa 1990). Needless to say, I was hooked immediately. Mmmm, I still remember the little critter (Macintosh LC for any aficionados out there) like it was yesterday...12 inch monitor (sporting all of 256 colors...I know this to be true because I counted them all), 8 megabyte hard drive, and a whopping 2 megabytes of RAM...all running on Mac OS 7. I would literally spend hours each day learning everything I could about it, and also played quite a bit of King's Quest (my first ever PC video game...Mom wouldn't buy me Leisure Suit Larry, so I had to settle for that). Little did I know that this would be the basis for my future career. Thanks Mom!

    Why the sentimental paragraph above? Hmmm...this is where I'm supposed to segue into my point I guess. When I switched to Windows from Mac a few years later, I felt like a fish out of water. What?!? No menu bar across the top of the desktop? And what's with this mouse with more than one button? A context menu...what is that? What does the registry do? Ad nauseam. I'm reliving that now trying to find my way around linux. While it's not as difficult as I thought it would be, it's definitely quite quirky. For example, Fedora doesn't seem to support MP3's out of the box (!!!). The built in (preferred) media player is Helix (Helix is the OSS fork of RealPlayer), and it complains if you feed it an MP3 file...but thankfully it tells you where to go to get a player that will happily accept and play one (RealPlayer for linux). Installing it is not a trivial task, but there is a bit of documentation that walked me through it (involving a terminal window and a few commands). The average user will find this simply unacceptable IMO, but perhaps I overlooked something. That being said, Firefox works beautifully, and almost all of the Firefox extensions available on Windows are also available on linux, so my browsing experience is virtually identical. AMSN gives me an MSN compliant messenger (again, a non-trivial install). I don't want this to be a "linux isn't ready for the desktop" post, but I can say with some certainty that the average user would have quickly grown frustrated and would have probably given up on the OS having experienced what I have so far.

    I'm a technical guy, so I intend on using linux for technical reasons. Mono seems to be making some inroads, so it seems only natural that I should put it to the test. The first step? Installing it. If simply getting MP3's to play on linux was tough, I can't wait to see what challenges Mono presents. Cheers, and more to come on my Mono trials and tribulations.

  • Interviewing With Microsoft, And Landing The Job: Part 1 -- Preparation Is Key

    As promised, I've decided to type up a short series on my interviewing process with Microsoft. I've talked about it before; however, that was from the perspective of just screening and not actually landing the position. I am by no means an expert on interviews or on getting the dream job, and as such this will just be my own experience of how it goes, and what ultimately worked to my advantage to help me succeed in the end. Most of this could probably be applied to interviewing with any company, but I'll put a Microsoft slant on it seeing as (maybe?) quite a few readers of this blog would jump at a chance to spend some time inside the walls of Microsoft. It's worth mentioning that I am *not* a traditional MS 'developer'...I will be working within Enterprise Support, which is a completely different entity and thus has different processes.

    This post is part 1 of this series.

    First and foremost, spend some time over on the MS Careers site. On it you'll find a wealth of information about the different groups within Microsoft and what career tracks are offered in them. There is also a search page with a list of almost every job offered at Microsoft, so spend some time going through them and reading over the various requirements; you may find positions you didn't realize were even there. Of course Microsoft is at its core a software company, so the bulk of the jobs are related to the business of writing and selling commercial shrink-wrapped software (Windows, Office, server tools, etc). The vast majority of those traditional 'developer' jobs are located in Seattle, so you must be willing to relocate (though MS does have product groups scattered around the world, so do some research...there may be one closer than you realize).

    Interestingly enough, this was a track that wasn't of much interest to me...probably because I come from an IT (Information Technology) background, which anyone in the business will tell you is a completely different beast than the ISV (Independent Software Vendor) realm. There are other ways to be a developer for Microsoft without working within a product group: Microsoft's own internal IT/Tools department (Microsoft, like any other business, has an IT department); Microsoft Consulting, which as a general rule is not location specific other than "must live near a major airport" since these folks travel a lot (most of my friends who work for Microsoft are employed in this capacity, and love their jobs as they get to interface with customers on a very regular basis); or an Application Developer Consultant role, which is similar to an architect. I'm sure there are other code-centric roles within Microsoft, but these are the ones I'm familiar with offhand. The point is not to limit yourself to being a developer on a specific product, though I'm sure that's a track that will interest many.

    Once you've done your research, it's time to polish up your resume and start applying. Over the years I've kept a custom resume for each of the major groups I've applied to. Find out the technical specifics of the job and home in on those areas in your resume. I cannot stress that enough. I can't even begin to imagine how many resumes MS Recruiting sees on a daily basis, so anything you can do to make yours stand out gives you that much of an edge over other candidates. The competition starts the second you hit the submit button over on the MS Careers site. It's also worth mentioning that the flood approach (while easy to do) will not get you any further than hand-picking a few choice positions and pursuing those more actively. I've made it a habit to check the Careers page at least monthly; new jobs are popping up all the time. Perhaps most importantly: read the requirements. If they say you need 10 years in the financial industry, that Economics 101 class you took back in college won't suffice. In a nutshell, if the position seems like a long shot, it probably is.

    So you've sent out 10 resumes, now what? It's been my experience that MS Recruiting is pretty quick to contact candidates they are interested in. I can't give a hard number, but 2-3 weeks has been the average for me over the past 5 years. All in all I was contacted for about a dozen positions, with about half of those resulting in some actual face to face time with the group that was interested. The recruiter who contacts you can be located almost anywhere; Microsoft has relationships with many firms throughout the world when it comes to recruiting talent. The initial call will usually be normal HR stuff, so there's no need to put on your technical hat quite yet (that'll come later)...the usual "tell me about yourself, here's a description of the position, etc." By this point the group has shown interest, so unless you royally mess up, you'll get some airtime with folks in the group. It's worth mentioning that from this point forward, every conversation, every email, every type of contact, period, is to be considered part of the interview process. Get rid of that email stationery, colorful fonts, etc. Treat everyone as if they were clients you are already billing work to, or trying to court as potential clients.

    The recruiter will want to block off a couple of hours for you to speak with members of the team (about half will be technical, and the other half managerial, the so-called "MS" part of the interview). They will usually let you pick a range of dates and times, so make sure you pick blocks where you'll be at your best. I'm not a morning person, so I always tried to do mine in mid-afternoon when I felt sharpest. I wouldn't push dates further than 10 business days out, and heck, if you're feeling up to it then by all means schedule it for the next day! I would recommend scheduling about a week out. Get the names of all the folks you will be screening with, write them down, and then it's time to hit the books.

    In this day and age of blogs (and especially MS employees keeping blogs), I always search for the names of the people who will be screening me. A lot of the time you'll get some hits, and they might even have their own website. You should of course read their entire site/blog to get insight into the group, plus you can mention that you read their site in preparation for the screen. If you know the specific group name, you can search for it and see what you come up with. Stuff you can find on the web is as close to actually sitting beside that person as you'll get before they extend an offer. In my case, I read about a dozen blogs start to finish from folks who worked in either my group or groups similar to mine. The information I gathered from those sites was by far the most useful, as they outlined situations this group dealt with on a daily basis (as well as the resolutions for many of them).

    Picking the actual books you'll need to study is a different beast altogether. This is not a hard-and-fast rule, but it's been my experience that the majority of the stuff I think I need to read doesn't come up in the actual screen...the stuff I've found during my web research has been more relevant. That being said, I pick 2 books (one of them from MS Press) and read them cover to cover. Treat them like college textbooks, and your final exam is the day of the screen. Use a highlighter, work through all of the exercises, etc. If you haven't finished them by the day of the screen don't worry, but do make sure to go back over whatever you felt was important the night before. Don't kill yourself memorizing acronyms or the layout of different configuration screens (in other words, the extreme details); concepts are most important (for example, in my screens I missed the actual names of several tools, but I knew they existed and was able to describe how to use them, which was good enough). Perhaps most important is to not overdo it. You aren't going to get any more out of an 8 hour cram session than a couple of 2 hour sessions spread over a couple of days. It may feel like you're packing a bunch in, but that's just what it is: cramming (not retaining). My rule was about 4 hours a day, and I made myself stick to it. I also spent another hour or so just working through real world examples, either from my past work, from books, or contrived. Actually going through the paces is the best preparation, in my honest opinion.

    My final word of advice for preparation is not to do any last minute cramming the day of the screen. If you do, you'll end up focusing too much on the areas you're "brushing up" on, when more than likely those areas aren't the answer to the questions at hand. If you're working at an office, I highly recommend taking the day off. If you can't get an entire day off, take at least half the day. If you can't do that, then schedule the screens for a day on which you can. Trust me, you're not going to want office hubbub on your mind if you want a decent shot at delivering "wow." If you've done your research on the group, applied to positions you're adequately qualified for, and gone through your reading material, you should be more than prepared for the actual technical content of the interview. What you will not be prepared for is the process itself, which I'll cover in part 2 of this series, along with a more in-depth look at my actual experience this go-round.
