Technical aptitude testing & recruiting

Having spent a reasonable amount of time on both sides of the interview table, plus having co-founded NerdAbility, it may come as little surprise that I am pretty opinionated on the topic.

It's a pretty divisive topic, and a lot of people feel quite strongly about it, but here's my opinion anyway..


Tech tests

First up is the question of whether or not potential candidates should have to take some kind of tech test. Personally I like them. But with a few caveats:

  1. They should be easy. 
    This might sound counter-intuitive, but I prefer a relatively simple tech test.  In reality, I'm not convinced you gain that much from the more complex tests; at worst, they probably just produce false negatives and end up mistakenly ruling out great candidates.  I have seen tests that take days (unit testing/mocking/designing/coding/reviewing/refactoring etc.) and if anything they put candidates off, and as mentioned probably don't provide much extra information.

    Something nice and simple - a modest estimate of an hour all in is probably about right.  As ranted about by many folk, famously Jeff Atwood, even a basic FizzBuzz programming test will rule out a lot of people (a quick sketch of FizzBuzz follows after this list, for reference).

    Furthermore, when I review test submissions I don't really care whether the code works - what I am really looking at is the style and overall approach: class structuring, use of nice libraries/core functionality/data structures, unit tests.  If you have a simple test that should take no more than an hour to code, candidates really don't have any excuse not to make their best effort with how the code is structured, unit tested etc - so you can set the bar pretty high.
  2. They shouldn't be timed
    Timing the test just blurs the lines - if you time it then you have to lower the bar. With a simple, non-timed test you can say you will rule out people who haven't submitted unit tests, for example, but if you set a time limit then you have to excuse people - and will inevitably find yourself saying things like

    "Well, sure its 200 lines of code all in the main() method, and variable names like 'tmpString', and it probably doesn't work, and it's not unit-tested.. but maybe they were rushed for time.. maybe we should bring them in..? "

    It happens, the bar slips lower and lower, and eventually the test is serving no purpose other than ruling out those people who just can't be bothered.
  3. They should test core technology concepts
    There is no point setting tests that probe specific domain knowledge or expect experience beyond core language competencies. Even if your business is in a very specific niche, you are going to do much better hiring great tech folk if you test core competencies rather than specific libraries/tools/technologies.
  4. They should be done beforehand
    On-site testing adds a different dimension to the test - whether on a whiteboard or on a machine, there are other variables that can end up being a distraction. On a whiteboard candidates can end up worrying about exact method signatures and missing semicolons (though a whiteboard is much preferable to actual coding - in my opinion, you should never expect a candidate to bring a laptop, and asking a candidate to use another machine/OS/IDE is also fraught with potential distractions).
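
For reference, here's roughly what I mean by FizzBuzz-level - a minimal Java sketch of the classic problem (print 1 to 100, but "Fizz" for multiples of 3, "Buzz" for multiples of 5, and "FizzBuzz" for both):

    public class FizzBuzz {

        public static void main(String[] args) {
            // Print 1..100, substituting Fizz/Buzz/FizzBuzz as appropriate
            for (int i = 1; i <= 100; i++) {
                if (i % 15 == 0) {
                    System.out.println("FizzBuzz");
                } else if (i % 3 == 0) {
                    System.out.println("Fizz");
                } else if (i % 5 == 0) {
                    System.out.println("Buzz");
                } else {
                    System.out.println(i);
                }
            }
        }
    }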



What's the point?

As mentioned, I am not a big believer in using the tests to really measure whether someone is a great programmer. For me, they serve two simple purposes (well, three actually, but I will come to the third point later):
  1. They can be bothered. They are actually interested enough in the role and the company to invest their own time and energy. This will rule out a few people who are just machine-gunning resumes out to lots of companies blindly, or those who are just after some interview practice.
  2. They have demonstrated an understanding of core technology approaches/patterns - simple things like single responsibility, unit testing (use of good assertions, sensible test cases, messaging etc) and class organisation show that they have actually spent a reasonable amount of time programming and keep up with technology.  A nice little example of this, that I like to see, is use of Java's Collections/Arrays convenience methods (assuming the test is in Java!) - Arrays.asList( "1", "2", "3" ) makes declaring explicit lists easier (nice for testing etc) and shows knowledge of core Java; a quick illustration follows below.
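
As a tiny illustration of the kind of thing I mean - a hypothetical JUnit 4 test (the WordSplitter class is made up purely for this example):

    import static org.junit.Assert.assertEquals;

    import java.util.Arrays;
    import java.util.List;

    import org.junit.Test;

    public class WordSplitterTest {

        @Test
        public void splitsSentenceIntoWords() {
            // Arrays.asList keeps the expected value short and readable
            List<String> expected = Arrays.asList("the", "quick", "brown", "fox");

            // WordSplitter is a hypothetical class under test
            assertEquals(expected, new WordSplitter().split("the quick brown fox"));
        }
    }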


So all we have so far is knowing that they can be bothered and that they have a good understanding of, and interest in, program design/architecture - not much that indicates whether or not they are an awesome developer.



The interview

This is where the test really comes into its own.  I think using the test to drive the tech interview is really a great way to go.

You can walk through the code and ask about design decisions, and with a little extra thought you can easily push into variations of the test: how would they handle other constraints? You can continue to push through varying degrees of complexity, and if you are consistent with your tests you get a consistent sliding scale to compare candidates - you know exactly what point each candidate got to on the scale of questions. See this article for a great example of the technique more generally (it's also fun/good practice to try working through the problems the author asks before reading the answers, to see how far you get).

This approach of starting very easy and working up a sliding scale of difficulty is a common practice used by big co's like Google et al.



An example

Here's an approach I quite like:

Tech test: the problem

Given a webpage address, find the most common word on the page.

This tests core technology concepts - good use of a HashMap (or similar) data structure for counting words - and it is substantial enough to need some proper unit testing, but small enough to complete relatively quickly.
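
As a rough illustration (certainly not the only way to do it), the counting core might look something like this - assuming the page has already been fetched and split into words, which I've left out of the sketch:

    import java.util.HashMap;
    import java.util.Map;

    public class WordCounter {

        // Returns the most common word, or null if there are no words.
        // Assumes the page has already been fetched and tokenised into words
        // (the HTTP/parsing side is omitted from this sketch).
        public String mostCommonWord(Iterable<String> words) {
            Map<String, Integer> counts = new HashMap<>();
            for (String word : words) {
                counts.merge(word.toLowerCase(), 1, Integer::sum);
            }

            String best = null;
            int bestCount = 0;
            for (Map.Entry<String, Integer> entry : counts.entrySet()) {
                if (entry.getValue() > bestCount) {
                    best = entry.getKey();
                    bestCount = entry.getValue();
                }
            }
            return best;
        }
    }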


Interview: follow-up

There are a few follow-up questions that can be used to further probe the candidate's understanding:
  1. What would you need to change if I wanted the most common word across a whole website (e.g. Wikipedia)? This can go into how to crawl webpages and the potential pitfalls, if that is a relevant area, or otherwise into the challenges around the amount of information that needs to be stored - for example, if you have limited memory, how can you store the counts?
  2. If I wanted the top 5 most common words, how would you change it?  This is interesting as there are a variety of solutions, and unless they go straight for the optimal solution you can keep asking whether they can think of anything better.

    For example, they might just keep track of the top 5 words during the counting, which is pretty efficient, but less flexible when top 5 becomes Top X words;

    Alternatively they may count all the words, then implement a comparator for the Map entries, sort them all and take the top X - which is flexible but is always going to be O(n log n), since comparison-based sorting is at best O(n log n).

    Another approach is to use a heap (PriorityQueue in Java): heapify the counted entries (heapify can be done in O(n) time) and then take the top X elements from the queue. Since X is a constant that doesn't depend on the size of the dataset, the X removals cost O(X log n), which is dominated by the linear heapify, so overall performance is O(n) (a rough sketch of this approach follows after this list).

    You can also follow this question up with further questioning about performance and Big-O - if that's something you think is interesting/relevant for the position - which it might not be..
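
To make the heap option concrete, here's a rough sketch assuming a word -> count map has already been built (the WordCount wrapper is only there because Java's PriorityQueue bulk constructor - the one that heapifies in O(n) - needs Comparable elements):

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;
    import java.util.PriorityQueue;

    public class TopWords {

        // Wrapper so the PriorityQueue(Collection) constructor, which heapifies
        // in O(n), can order entries by count (highest count first).
        private static class WordCount implements Comparable<WordCount> {
            final String word;
            final int count;

            WordCount(String word, int count) {
                this.word = word;
                this.count = count;
            }

            @Override
            public int compareTo(WordCount other) {
                return Integer.compare(other.count, this.count); // descending by count
            }
        }

        // Returns the top x most common words from a word -> count map.
        // Building the heap is O(n); polling x elements costs O(x log n).
        public List<String> topX(Map<String, Integer> counts, int x) {
            List<WordCount> entries = new ArrayList<>();
            for (Map.Entry<String, Integer> entry : counts.entrySet()) {
                entries.add(new WordCount(entry.getKey(), entry.getValue()));
            }

            PriorityQueue<WordCount> heap = new PriorityQueue<>(entries);

            List<String> top = new ArrayList<>();
            for (int i = 0; i < x && !heap.isEmpty(); i++) {
                top.add(heap.poll().word);
            }
            return top;
        }
    }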



Whatever test you choose, as long as you have some sensible and interesting questions to follow up with, I think it makes for a pretty productive process, and in many ways it is optimized for letting the candidate make the best impression they can.
