Successful people usually say that we can learn a lot more from failure than from success – seeing failures as the little puzzle pieces that need to fall into place to fit the perfect desired solution. Ok maestro, what is all this profound and dramatic introduction for? Unlike the Deadpool 2 writers, I ain’t no lazy writer, so in this article I plan to explain how to succeed at uploading a risk-free design submission for screening in Topcoder. I will break down the submission process and study why we fail screening. So bear with me; I will try to guide you to set up a plan to never receive that scary “SCREENING FAILED” email.
To start, I assume you already know how to submit a design into the Topcoder system. If you are not there yet, this how-to guide might be helpful to quickly introduce you to the submission process.
WHAT IS SCREENING REVIEW?
Screening is a thorough review process where submissions are checked to verify they are legitimate before being shown to the client. Your submission goes through an evaluation based on the famous screening scorecard, which contains the parameters your submission will be judged against. In other words, someone makes sure your submission doesn’t break any rules, doesn’t violate any copyrights, and that your asset declarations are in place.
Failing the screening review can come with severe consequences if we’re not careful. First is frustration: knowing that something you invested a considerable amount of time in is no longer eligible for competition just because of a simple declaration accident – that hurts. Added to that is the wasted amount of priceless time, one of the most valuable assets in life. We can’t afford to waste time, can we? Among the other consequences there is also a temporary suspension if you repeat the same issues several times, especially if the case is cheating. Another consequence, ironically a good one, is embracing the opportunity to learn so it never happens to us again. After all, successful people learn from failure, right?
WHY DO WE FAIL SCREENING?
Glad you asked. There are several reasons why we could fail the screening process. I explore them in a separate article, so allow me to shamelessly advertise myself by redirecting you to that good piece of information, where you’ll learn what to do if you fail screening. The goal of this article is to keep the failure from happening in the first place; it’s good to have the fallback, though, so go read it.
Does the graphic above look familiar to you? That is what your scorecard would probably look like if your submission fails screening. Among the most common cases we have:
- Used photos/icons from allowed sites but didn’t declare them.
- Used photos/icons from unapproved sites.
- Used my own photos/fonts but didn’t declare them.
- Missing source files.
- Missing screens files.
- Missing photos/icons/fonts declaration.
- Copyright violations (tracing icons, cheating, among others).
The cases stated above are what the system reports. They are the rules, and breaking them is why you would fail screening. However, I would like to dig deeper into this problem and perform a decent analysis to understand why it has become so repetitive: in the design challenges I run as a copilot, at least 20% of the submissions fail the screening process. It seems an outrageous number, but it is true; 2 out of 10 submissions fail screening nowadays. We could easily shout out the first and easiest explanation: there are a lot of newcomers. Too easy. It could be part of the reason, why not? But if failing is this common, is it really just a problem of people interacting with something for the first time? There should be something else.
This is the part where we become humble and proficient at solving problems by acknowledging our weaknesses, which leads us to becoming better. The Topcoder platform is always being updated and having its bugs fixed; with that in mind, it’s not always perfect. There are some specific issues that, once fixed, would affect the assets declaration positively. I also believe we carry our own share of the blame: as human beings we tend to err, and an accident can happen to anyone. Forgetting to add one font out of three? Normal. So I’m going to split this out to explore the reasons for screening failures. I wasn’t joking about the lazy writing thing at the beginning; you’re going to read a lot of design analysis today.
Upload Page Improvements
The information on the submission page can be improved to help the knowledge in our heads fulfil the assets declaration successfully. Fortunately, there is a fix coming soon to straighten this out, but for education purposes, and in line with the goal of this article, I’m going to mention the issues targeted for fixing:
- List of approved external sites (stock artwork/photos). It will help to have an upfront view of the sites we can take stock artwork and photos from. Right now we have it linked there; we just need to add recent entries.
- Fonts declaration constraints. To declare fonts properly, users might benefit from an autocomplete text field that suggests fonts while typing, limited to those the system considers common (Helvetica, Arial, Cambria, among others).
We have to understand that the type of human error where we accidentally forget to declare stock or to upload a requirement is a slip, not a mistake. We set the goal of uploading the content, which is right, but we don’t plan or execute the task as the environment expects; that is the slip’s favorite playground, your subconscious. The reason even the most skilled designers end up repeating this behaviour so many times is that we are so used to uploading submissions that we tend to do it in an automated way, and when this happens we are more likely to produce slips. It’s different when you do an activity for the first time: since you haven’t mastered it, you are more careful, paying more attention to details, keeping a conscious mind.
WHAT CAN WE DO? THE ULTIMATE GUIDE IN A NUTSHELL
All this long chit-chat comes down to this piece of information? Well, I’m not sorry for making you read a lot and wait patiently, but analysis is important to come up with useful solutions, especially if they are going to be used by humans.
I wouldn’t be so evil as to just lay out the reasons why we fail, blaming the upload page’s weaknesses and our slips without offering some suggestions or methods to keep this from happening. It would be too easy to criticize and point out failures from all angles. Let me share with you what I consider the set of best practices to minimize the risk of failing screening, wisely using the information we gathered from the analysis to build defensive mechanisms for ourselves.
1. Keep separate folders for source/screen files
It might sound obvious, but I have seen this happen. We could easily mix up our sources with screens if we don’t properly organize the files into separate folders.
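If you like scripting your workflow, a tiny sketch like this (the folder names are just an example, not a Topcoder requirement) can set up the separation once so the two zips never get mixed up:

```python
from pathlib import Path

# Hypothetical folder layout: keep design sources and exported screens
# apart from day one, so each zip is built from its own folder.
for folder in ("submission/source", "submission/screens"):
    Path(folder).mkdir(parents=True, exist_ok=True)
```

From there, your PSD/Sketch files only ever live in `source`, and your exported JPG/PNG screens only ever live in `screens`.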
2. Declare stock/typography that you provided (you created them)
Another common case: maybe you have good stock photography of your own that you would like to use in the challenge to give it more personality, or for whatever other reason. It is allowed, but it must be declared, using the notes.
3. Keep track of fonts and photos
You should have a way to log this information. While you’re designing, write down the fonts you’re incorporating into your design and store the URLs of the photos you’re using. A good practice, and what I do, is to keep this information in text files. At the moment of uploading the submission I just copy/paste the information immediately.
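As a sketch of that habit (the font names and URLs below are placeholders, not real declarations), the log can be as simple as writing a plain text file as you go, then pasting its contents into the declaration fields at upload time:

```python
# Hypothetical assets log: record every font and stock URL the moment
# you use it, so nothing is left to memory at upload time.
fonts = ["Roboto", "Playfair Display"]
photo_urls = [
    "https://unsplash.com/photos/example-1",  # placeholder URLs
    "https://unsplash.com/photos/example-2",
]

with open("declarations.txt", "w") as log:
    log.write("FONTS:\n")
    for font in fonts:
        log.write(f"- {font}\n")
    log.write("\nSTOCK PHOTOS:\n")
    for url in photo_urls:
        log.write(f"- {url}\n")
```

The exact format doesn’t matter; what matters is that the file grows alongside the design, so the declaration is ready before you ever reach the upload page.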
4. Ask in the forums for external sites confirmation
Do you want to use icons from fontawesome or somewhere else? That reference is not in the list of approved sites, but it has lately been allowed (fontawesome). Whatever the case, ask your copilot in the forums whether using a specific external source is allowed.
5. Declare assets provided by the challenge itself
Sometimes copilots provide photos from the client, and in another common scenario photos or icons can be extracted from a provided website owned by the client. This also needs to be declared in the form of a note.
6. Verify your submission – The Post Mortem Master Checklist
I know a checklist might seem redundant since the upload page already requests the information through the UI fields, but as we saw in our short analysis, the hints and information on the page are not optimal for ensuring you won’t err. Checklists are ridiculously underrated tools; they are so powerful that they can seamlessly improve the accuracy of a task’s execution, therefore reducing error, especially slips. Sound familiar?
This is what I consider an effective verification list against a visual design submission:
☑️ Does my source zip file contain the intended PSD/Sketch/etc. files?
☑️ Does my screens/submission file contain the intended JPG/PNG screens?
☑️ Is my presentation a JPG/PNG screen?
☑️ Did I declare all the fonts I used in the design?
☑️ Did I declare all the stock photos, icons and illustrations I used in the design?
☑️ Did I add the marvelapp shareable link?
Here’s an online version if you’re planning to use this list. I’d recommend using it as a post-mortem asset, going through the checklist over the already uploaded submission. I mean: upload the submission, go to your submissions page, and download it back. Then use the checklist to make sure the information is in place.
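If you like automating the boring part, a small script can run the first two checks against the re-downloaded zips. This is just a sketch; the file names and extensions below are assumptions, so adjust them to your challenge:

```python
import zipfile

def files_matching(zip_path, extensions):
    """List the files inside a zip whose names end with the given extensions."""
    with zipfile.ZipFile(zip_path) as zf:
        return [n for n in zf.namelist() if n.lower().endswith(extensions)]

# Example post-mortem checks (hypothetical file names):
# assert files_matching("source.zip", (".psd", ".sketch")), "No source files!"
# assert files_matching("submission.zip", (".jpg", ".png")), "No screen files!"
```

Running this over what you actually downloaded, rather than what you think you uploaded, is the whole point of the post-mortem approach.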
7. Bonus: declare the marvelapp presentation
Most design challenges require you to provide a marvelapp presentation, set up in a project provided by the challenge copilot. You must provide the shareable link of this presentation. You won’t fail screening if you don’t provide it; however, it will tremendously help with the challenge’s health, so it’s a good practice to follow.
That’s all, folks. I hope the power of root-cause analysis helps you solve any weaknesses through the good practices proposed above, combined with the checklist as a tool to help you minimize human error, and therefore earn a perfectly beautiful one hundred score in the screening process.