While all Topcoder tracks (design, development, and algorithms) have fierce competitor pools and cut-throat competition, the number of development 'Code' challenges launched on a weekly basis outweighs the others by sheer volume.
Over the last year, I've seen several new faces who have not only participated in code challenges but have done so with remarkable consistency, winning several of them. What does it take to win a Topcoder development challenge? How do you beat other submissions and come out on top? How much time should you spend understanding requirements versus doing actual development? How much should you test your submission?
While there's no magic wand, there's little doubt that the Review phase is often the critical factor that differentiates the winning submissions from the rest. How do reviewers typically check and evaluate submissions? Let's try to decode a reviewer's mindset and uncover some must-do's and don'ts that will be helpful for you as a submitter.
Read the spec again (and again)!
While we're gradually standardizing templates for challenge specifications, no requirement or spec is perfect, and there may be areas that are unclear or subject to individual interpretation. Make sure you understand the scope and requirements correctly. That's the first step towards a successful submission.
When in doubt, ask!
The forums are your biggest lifeline during any challenge, and the copilot and PM are your best friends. Whenever you have a doubt, just ask in the forums. Remember: there are no stupid questions, only stupid answers. Never shy away from clearing your doubts in the forums.
Eat Your Own Dog Food!
Always make sure to test your submission before the review phase begins. Over the years, I've seen all kinds of mistakes made by submitters that often lead to their hard work on a challenge being wasted. These include:
- Submitting to the wrong challenge (yes, that happens!)
- Submitting code that won't compile or run ("works on my machine" syndrome!)
- Submitting a patch that doesn't apply or work (wrong commit version, faulty patch)
In short, put yourself in a reviewer's shoes and check the submission as they would. It's always a good idea to download your submission from Online Review (OR) and double-check that everything is as you intended.
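The reviewer's-shoes check above can be sketched as a small shell script. This is only an illustrative sketch, not part of any Topcoder tooling, and the file names (`submission/`, `run.sh`, `README.md`) are hypothetical stand-ins for your actual deliverable: it re-packs the submission, unpacks it into a clean directory, and runs it exactly as a reviewer would.

```shell
#!/bin/sh
# Hypothetical self-review sketch: pack, unpack into a clean directory,
# follow your own README, and run. Replace the stand-in files below
# with your real submission contents.
set -e

# A stand-in for your real submission directory.
mkdir -p submission
printf '#!/bin/sh\necho "it runs"\n' > submission/run.sh
chmod +x submission/run.sh
printf 'Run: ./run.sh\n' > submission/README.md

# Pack it exactly as you would upload it.
tar -czf submission.tar.gz -C submission .

# Reviewer's view: a clean directory, nothing from your dev machine.
rm -rf review && mkdir review
tar -xzf submission.tar.gz -C review

# Check the docs are actually in the archive, then run the deliverable.
test -f review/README.md || { echo "FAIL: no README in archive"; exit 1; }
(cd review && ./run.sh)
echo "self-review passed"
```

Running this from scratch catches the "works on my machine" class of failures: if a file is missing from the archive or a path only exists on your machine, the clean-directory run fails before a reviewer ever sees it.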
As a submitter, always focus on providing clear deployment, configuration, and verification steps. This way, you are helping both yourself and the reviewers. Of course, some challenges have straightforward requirements, so this isn't mandatory in those cases.
In general, though, the more detailed and unambiguous your documentation is, the better for both the reviewers and yourself.
A Picture is worth a thousand words, a Video is worth a million!
It's always good practice to include screenshots in your documentation, as they help eliminate doubt. This is especially true if the environment setup for a challenge is complex. Similarly, some challenges require submitters to provide a video. In some cases, copilots add this requirement so they can share the video with clients; in others, it's required because it helps reviewers configure and verify submissions.
I hope these guidelines help you understand the reviewer's approach and produce better-quality submissions in future challenges. Good luck!