
TCO - Hercules Fog Recorder - Unit Test Expansion and Cleanup


Challenge Overview

The Fog recorder is a C++ application that lets a user schedule recordings for TV shows and record live TV. We also use this project to test functionality in client applications external to the recorder itself.

The recorder is primarily a Mac application that can be opened and run in Xcode. It also includes Makefiles for building on Linux, and it can be built for Windows as well.

The Fog application exposes a set of REST API calls that can be found here:

http://docs.fogrecorder.apiary.io

Code

The existing code is here:

https://gitlab.com/hercules-fog/Fog-CLI/

A link will be provided in the forum where you will be able to get access.

Submission

Your submission to Online Review (OR) should be a Git patch file that can be applied to reproduce your fixes. Documentation on generating a Git patch file is here:

https://ariejan.net/2009/10/26/how-to-create-and-apply-a-patch-with-git/
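The linked article covers this in detail; as a quick sketch, the standard Git commands look like the following (the repository, file name, and commit messages here are placeholders, not part of the Fog repo):

```shell
#!/bin/sh
# Sketch: create a throwaway repo, make a fix commit, and export it as a patch.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo && cd repo
git config user.email dev@example.com
git config user.name Dev
echo "original" > test.cpp
git add test.cpp && git commit -qm "Base"
echo "fixed" > test.cpp
git commit -qam "Fix unit tests"
# Produce a patch file for the latest commit:
git format-patch -1 HEAD --stdout > ../fix.patch
# A reviewer can verify it applies cleanly against the previous state:
git checkout -q HEAD~1
git apply --check ../fix.patch && echo "patch applies cleanly"
```

`git format-patch` preserves authorship and the commit message, which makes review easier than a raw `git diff`.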

Requirements

We currently have some cursory tests in place (see the unit_tests folder in the repo). This challenge will:

* Clean up the tests to work with the very latest Fog code
* Ensure that the latest NPM plugins are used
* Add a new test, detailed below:

New test

We are going to start adding expanded, much more involved tests. The code is relatively stable and is moving toward a wide production release, so we want tests that ensure we aren't regressing on anything.

For this challenge, please add a new test to:

1. Start a recording of a configurable URL (default to http://odol-atsec-min-02.linear-chi-pil.xcr.comcast.net/ESPND_HD_CHI_11299_1_8150563131777250163.m3u8)
2. Once the recording is running, let it run for a configurable interval (e.g. 5 minutes).
3. While the recording is running, check the following at a configurable interval to ensure correct behavior:
* The manifest file returned by the recording URL in Fog (served by the local Fog server) is correct, and is updated regularly with new fragments and details
* The fragment files are downloaded properly, and the downloaded fragments match the bitrates configured in config.json
* The SAP track is downloaded, if config.json is configured for SAP
* Memory usage does not grow rapidly, and CPU usage for the Fog application stays minimal (under a configurable threshold, such as 10%)
* HTTP requests for the main manifest, sub-manifests, and individual fragments all respond quickly and return the expected data
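The manifest and bitrate checks above can be sketched as small helper functions. This is a sketch under assumptions: the tests run under Node.js (consistent with the NPM plugins mentioned earlier), the manifests use standard HLS tags (`#EXT-X-STREAM-INF`, `#EXTINF`), and every function name here is illustrative rather than an existing Fog API:

```javascript
// Extract the advertised bitrates from a master HLS manifest so they can be
// compared against the bitrates configured in config.json.
function parseMasterManifest(text) {
  const bitrates = [];
  for (const line of text.split('\n')) {
    const m = line.match(/#EXT-X-STREAM-INF:.*BANDWIDTH=(\d+)/);
    if (m) bitrates.push(Number(m[1]));
  }
  return bitrates;
}

// Compare the recorded bitrates against the expected set, ignoring order.
function bitratesMatch(expected, actual) {
  const sorted = a => [...a].sort((x, y) => x - y);
  return JSON.stringify(sorted(expected)) === JSON.stringify(sorted(actual));
}

// Verify that a media (sub-)manifest is growing: new fragments should appear
// between successive polls while the recording is running.
function manifestGrew(prevText, nextText) {
  const count = t => t.split('\n').filter(l => l.startsWith('#EXTINF')).length;
  return count(nextText) > count(prevText);
}
```

Keeping these as pure functions over manifest text makes them easy to unit-test without a live recording in progress.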

This test must be robust. We expect significant effort to go into implementing this single test to ensure that the recording works properly, that the local manifest is updated with new fragments, that the fragments download quickly, and that the correct bitrates are being downloaded.
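The "run for a configurable interval, checking at a configurable interval" structure can be sketched as a small polling skeleton. Everything here is hypothetical scaffolding, not Fog code: the config keys mirror the kind of values config.json might hold, and the individual checks (manifest, bitrates, CPU, memory, HTTP timing) would be plugged in as async callbacks:

```javascript
// Run a set of named async checks repeatedly until the configured recording
// duration elapses, collecting every failure rather than stopping at the first.
async function pollDuringRecording(config, checks) {
  const deadline = Date.now() + config.recordingDurationMs; // e.g. 5 minutes
  const failures = [];
  while (Date.now() < deadline) {
    for (const [name, check] of Object.entries(checks)) {
      try {
        if (!(await check())) failures.push(`${name} failed at ${Date.now()}`);
      } catch (err) {
        failures.push(`${name} threw: ${err.message}`);
      }
    }
    // Wait for the configurable check interval before polling again.
    await new Promise(r => setTimeout(r, config.checkIntervalMs));
  }
  return failures; // empty array means every check passed on every poll
}
```

Collecting all failures over the full interval (instead of asserting on the first one) gives a much clearer picture of intermittent problems such as a CPU spike or a stalled manifest update.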


Final Submission Guidelines

Please see above

Reliability Rating and Bonus

For challenges that have a reliability bonus, the bonus depends on the reliability rating at the moment of registration for that project. A participant with no previous projects is considered to have no reliability rating, and therefore gets no bonus. Reliability bonus does not apply to Digital Run winnings. Since reliability rating is based on the past 15 projects, it can only have 15 discrete values.

ELIGIBLE EVENTS:

2016 TopCoder(R) Open

REVIEW STYLE:

Final Review:

Community Review Board

Approval:

User Sign-Off

CHALLENGE LINKS:

Review Scorecard
