What is the real value of 3-hour basic coding tests for experienced hires with an MS/PhD - please explain?

There are lots of books on the syntax of any language -- C, C++, Python, Ruby, etc. But few books on interesting and engaging coding problems.

That's because it takes 10 times as long to come up with good examples + exercises + projects as it does to copy and paste a piece of code from a blog.

There are also no _real_ design books around. The industry is still in the ad-hoc folklore era.
 
Before reading this thread I felt the same way as the OP. I am fiercely anti-IQ-exam (I could go on a mildly incoherent rant about it), and I sometimes feel these coding challenges are just that: IQ exams (I get triggered :mad: ). After reading everything said in here I now somewhat understand the motivation to send them out. Still, it seems to me that anybody who graduates from a quantitative program can learn good programming habits: I don't believe it's an innate skill, you can learn it, and businesses should invest in their employees (if that's not too much to ask).

Those tests are just a way of saying: we only want people who are already great programmers. The issue, as was explained in the thread, is that universities do not produce "great" programmers. They produce raw talent, and talent has to be polished, especially if you come straight out of academia (like me, and unlike the OP, who already has work experience), where hardly anyone ever read your code (I know how painful it is to read code with zero comments o_O ). You might be lucky enough to get a research supervisor who has industry experience and instills great programming habits, but that's a bit much to hope for.

One last thing: Make those challenges interesting, okay?
I just accepted an internship offer for which I had to complete a coding challenge: either build a Pong clone or solve an NLP problem (language detection), then write a report and produce a performance analysis, heavily commented code, etc.
I mean, that was fun to do because you are actually solving a machine learning problem.
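For what it's worth, here is a toy sketch of the kind of language-detection task I mean: a simple character-trigram profile comparison. The training strings and language labels below are made-up placeholders, nothing like the actual challenge.

```python
# Toy character-trigram language detector (illustrative only; the real
# challenge was larger and used proper corpora).
from collections import Counter


def trigram_profile(text, top=300):
    """Return the most common character trigrams of a text."""
    text = text.lower()
    grams = Counter(text[i:i + 3] for i in range(len(text) - 2))
    return [g for g, _ in grams.most_common(top)]


def detect_language(sample, profiles):
    """Pick the language whose trigram profile overlaps the sample's the most."""
    sample_grams = set(trigram_profile(sample))
    return max(profiles, key=lambda lang: len(sample_grams & set(profiles[lang])))


# Placeholder "training data": repeated pangrams standing in for real corpora.
profiles = {
    "en": trigram_profile("the quick brown fox jumps over the lazy dog " * 50),
    "fr": trigram_profile("portez ce vieux whisky au juge blond qui fume " * 50),
}
print(detect_language("the dog jumps over the fox", profiles))  # expected: en
```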

Those Matata Capital challenges are more like "find out how many palindromes are in the given century": dumb, boring as hell, and on top of it, timed = pseudo IQ exam.
They don't want to take the time to create semi-original problems which require you to think, to try things, and to be creative. It's not for everyone, certainly not for me. I would probably not want to work for a company which thinks IQ exams are a legitimate barrier to entry to their workplace.
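For comparison, my best guess at what such a timed puzzle boils down to (the wording above is my paraphrase, so this sketch is purely illustrative):

```python
# Count the years in a given century whose digits read the same backwards.
def palindromic_years(century):
    """Palindromic years in the given century, e.g. century=20 -> 1901..2000."""
    start, end = (century - 1) * 100 + 1, century * 100
    return [y for y in range(start, end + 1) if str(y) == str(y)[::-1]]


print(palindromic_years(20))  # [1991] -- the only palindromic year in 1901-2000
```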
 
Those Matata Capital challenges are more like "find out how many palindromes are in the given century": dumb, boring as hell, and on top of it, timed = pseudo IQ exam.
They don't want to take the time to create semi-original problems which require you to think, to try things, and to be creative. It's not for everyone, certainly not for me. I would probably not want to work for a company which thinks IQ exams are a legitimate barrier to entry to their workplace.

Exactly. I just had an interview at an undisclosed quant trading firm in Europe, and again they asked me a series of questions so dumb that I didn't even know how to answer them. Basic IQ-test-style questions. Two examples: what is a primary key in SQL? How do you access elements in a matrix in MATLAB?
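For the record, the answers are one-liners. A quick sketch using sqlite3 and NumPy as stand-ins for SQL and MATLAB (the table and data are made up; note MATLAB indexes from 1 while NumPy indexes from 0):

```python
import sqlite3

import numpy as np

# A primary key uniquely identifies each row of a table (and rejects duplicates).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id INTEGER PRIMARY KEY, ticker TEXT, qty INTEGER)")

# Matrix element access: MATLAB would write A(2, 3); NumPy is zero-based.
A = np.arange(12).reshape(3, 4)
print(A[1, 2])  # row 2, column 3 in MATLAB's 1-based terms -> 6
```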

I have had 4-6 firms ask me similarly dumb questions (quant dev roles), plus basic probability questions that make no sense. You can't make this stuff up. I've gone through most of the books, and they are just picking questions in a random, ad-hoc fashion. Mental math ability under pressure does not determine whether someone will code 10-14 hours a day consistently all year round, nor does it show their teamwork skills or ability to perform on the job.

There are lots of books on the syntax of any language -- C, C++, Python, Ruby, etc. But few books on interesting and engaging coding problems.

That's because it takes 10 times as long to come up with good examples + exercises + projects as it does to copy and paste a piece of code from a blog.

There are also no _real_ design books around. The industry is still in the ad-hoc folklore era.

Yes, and therein lies the problem in quant trading/finance technology recruitment as well. It is actually hard to come up with a problem for a large number of candidates to solve that would be relevant to the job. It would cost too much time + dollars.

Smart recruitment = solve an actual take-home coding problem, relevant to what you'd do on the job, over 3-6 days. Harder for the firm to do this, but it becomes much easier to evaluate candidates.

But again, this is why I fundamentally believe that, with the exception of a few quant hedge funds, financial firms are not attracting the best talent - not even close. The brain drain will continue.

Why would you apply to firms where brain-teasers are seen as the requirement for acceptance, when you can work at a tech company and get paid 60-120K+ USD off the bat (plus equity in some cases), with no dumb brain-teasers or a million barriers to entry?
 
Smart recruitment = solve an actual take-home coding problem, relevant to what you'd do on the job, over 3-6 days. Harder for the firm to do this, but it becomes much easier to evaluate candidates.

One anecdote. The bank tells a candidate to study Hull and come back in a week's time to answer questions. Good approach IMO.
 
It is actually hard to come up with a problem for a large number of candidates to solve that would be relevant to the job. It would cost too much time + dollars.

It could be done. But busy quants have probably not been trained to extract the essence of daily work into a set of interview questions.
 
Smart recruitment = solve an actual take-home coding problem, relevant to what you'd do on the job, over 3-6 days. Harder for the firm to do this, but it becomes much easier to evaluate candidates.

There is no silver bullet for good hiring, and what you suggest is in fact silly. This is asking the candidate to spend a week working unpaid. You may be able to abuse fresh grads without a job this way, but not experienced hires. That said, we do send this sort of homework out to junior applicants, but as an interviewer I have not considered it very important when writing my evaluation of the candidate.

Our hiring practice for junior talent seems to me similar to what you're suggesting: the successful candidate will typically be given a 6-month internship, after which they will more often than not be given a full-time offer. Again, this can only be done for junior people, of course, and in my opinion it is not a very efficient way of doing things. It can also be very frustrating for the intern not to know for sure about their future, often forcing them to interview at other places as a backup while interning, even if they enjoy the job.

Again, I stress that hiring juniors and hiring experienced candidates are different. With the latter you can have a conversation about something they've worked on in the past and try to figure out whether they just did what they were told or tried to understand the goings-on. Junior applicants are so numerous that you want a relatively clean benchmark, so you ask the basics of everything. The best candidates answer these very quickly (you may say that's only because they spent so much time studying the books many of the questions come from; I say that speaks volumes about their dedication and interest in the job), and even though things may seem ad hoc, people are usually in agreement when it comes to deciding who to hire.
 
This is asking the candidate to spend a week working unpaid. You may be able to abuse fresh grads without a job this way ...

I don't see it as abuse. Serious candidates will pro-actively invest in their own skills and future and not wait on someone to tell them what to do.
 
This is asking the candidate to spend a week working unpaid. You may be able to abuse fresh grads without a job this way ...

I don't see it as abuse. Serious candidates will pro-actively invest in their own skills and future and not wait on someone to tell them what to do.

When you know that if you suggested the same test to an experienced hire they'd flat out refuse to do it, then yes, you are taking advantage of the circumstances many fresh graduates find themselves in. Candidates are applying to many places at once and the cream of the crop get multiple offers, and if your application process is known to be heavy, I wouldn't be surprised if the top talent had an aversion to even applying. They shouldn't be wasting a week of their time on a single assignment in a single application during the hiring season anyway.

That said, my main objection to multi-day take-home tests remains my subjective experience as an interviewer: I have not found the results particularly useful in guiding my hiring decisions. At best they've been a pre-filter.
 
When you know that if you suggested the same test to an experienced hire they'd flat out refuse to do it, then yes, you are taking advantage of the circumstances many fresh graduates find themselves in. Candidates are applying to many places at once and the cream of the crop get multiple offers, and if your application process is known to be heavy, I wouldn't be surprised if the top talent had an aversion to even applying. They shouldn't be wasting a week of their time on a single assignment in a single application during the hiring season anyway.

That said, my main objection to multi-day take-home tests remains my subjective experience as an interviewer: I have not found the results particularly useful in guiding my hiring decisions. At best they've been a pre-filter.

Experienced people have a track record, so reading Hull for a week is not useful, I suppose.

But I suppose it takes < 1 hour to determine if they know their stuff, e.g. maths, C++, MC, PDE?
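Something like the sketch below is the level of quick check I have in mind: price a European call by Monte Carlo and sanity-check it against Black-Scholes (all parameters here are made up).

```python
import math
import random


def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call."""
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * N(d1) - K * math.exp(-r * T) * N(d2)


def mc_call(S, K, r, sigma, T, n_paths=200_000):
    """Monte Carlo price: simulate terminal GBM prices and discount the mean payoff."""
    total = 0.0
    for _ in range(n_paths):
        z = random.gauss(0.0, 1.0)
        ST = S * math.exp((r - 0.5 * sigma ** 2) * T + sigma * math.sqrt(T) * z)
        total += max(ST - K, 0.0)
    return math.exp(-r * T) * total / n_paths


print(bs_call(100, 100, 0.02, 0.2, 1.0))  # ~8.92
print(mc_call(100, 100, 0.02, 0.2, 1.0))  # should land within a few cents of that
```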
 