Soundscapes Marathon Match



Prize Distribution

Prize                              USD

Main Prizes (Non-Speech)
1st                            $27,000
2nd                            $19,000
3rd                            $13,500
4th                             $8,000
5th                             $6,000
Academic bonus (2)       $10,000 each

Secondary Prizes (With Speech)
1st                             $8,000
2nd                             $5,750
3rd                             $3,250

Total Prizes                  $100,000

 

Challenge Overview

The National Geospatial-Intelligence Agency (NGA) is the nation's primary source of geospatial intelligence (GEOINT). NGA provides GEOINT in support of U.S. national security and defense, as well as disaster relief. GEOINT is the exploitation and analysis of imagery and geospatial information that describes, assesses and visually depicts physical features and geographically referenced activities on the Earth.

Currently, geo-locating the source of video and audio recording on Earth is difficult. NGA seeks to determine if locations can be classified using machine learning methods based on sound.

NGA seeks a novel approach for using non-speech ambient sound as a means of geo-locating video and audio recordings. NGA is in search of innovative methods of identifying, analyzing, and modelling these sound and acoustic scene indicators to uniquely classify audio recordings as originating in one of nine cities. Solvers’ responses will include a white paper describing their technical approach; for each test file, solvers must indicate the city in which the recording originated and provide the confidence level their method generates for each of the nine cities. Top-scoring solvers will be invited to present a paper containing a description of their methodology and next steps at a workshop to be held in 2020.

 

In this challenge, solvers must predict where short non-speech sound files were recorded. Each file was recorded in one of 9 cities around the world. The names of the cities are not disclosed. Your task is to create an algorithm that, for each sound file in the test set, outputs the probability of the sound originating in each of these 9 cities. The quality of your algorithm will be scored using an ROC AUC based metric; see the Scoring section for details.

Input Files

This challenge features three types of files: sound files, speech marker metadata files, and ground truth files.
 
  • Sound files are available in FLAC format. There are two kinds of sound files: those that contain speech and those that do not. The latter are created by running a proprietary speech detection tool and setting the volume to 0 at the sections classified as speech.

  • Speech marker files are regular text files with the following format:
    <start><TAB><end><TAB>SPEECH
    where <start> and <end> specify the beginning and ending time (in seconds) of a section within the sound file classified as speech. The file may contain more than one line if there are multiple such sections.

  • The ground truth file gives the known location for each sound file in the training set, one file per line. It is a regular text file with the following format:
    <id><TAB><city>
    where <id> is the unique identifier (a 10-digit number) of the sound file (without the .flac extension), <city> is the identifier of the location of recording (a one-letter code in [A,...,I]).

 
  • For training you'll have access to the original version of the sound files that DO contain speech. In addition, the speech marker files are available, so it is up to you whether you use or ignore those sections of the files that have speech. To avoid doubling the size of the training data, we don't provide the speech-removed version of these files; however, we publish the tool that we used internally to create the speech-removed files.

  • For testing (during the online phase of the contest) your algorithm must process sound files that DO NOT contain speech. 

  • Final ranking will be established during the final testing phase (also known as validation phase), again, your algorithm must process sound files that DO NOT contain speech. However, for a special bonus, the algorithms will also compete using the original version of the files containing speech. The scores achieved in this additional step won't influence the final ranking and the distribution of the main prizes.
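To illustrate the text formats described above, here is a minimal Python sketch that parses a speech marker file and the ground truth file. The function names are illustrative, not part of the challenge tooling.

```python
def parse_speech_markers(path):
    """Parse a speech marker file: one '<start><TAB><end><TAB>SPEECH' line per
    speech section. An empty file means the recording contains no speech."""
    sections = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            start, end, _label = line.split("\t")
            sections.append((float(start), float(end)))
    return sections

def parse_ground_truth(path):
    """Parse the ground truth file: one '<id><TAB><city>' line per training file,
    mapping the 10-digit file id to its one-letter city code (A..I)."""
    labels = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                file_id, city = line.split("\t")
                labels[file_id] = city
    return labels
```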

 

Downloads

Input files are available for download from Google Drive. 

 
  • train.zip is the training set. It contains sound files, speech marker files and the ground truth file. Most of the sound files don't contain speech at all; the corresponding speech marker files are empty. The size of the training set is 14.5 GB.

  • train_sample.zip contains a small (75 MB) subset of the training data. Use this if you want to get familiar with the data without having to download any of the large files.

  • test.zip is the testing set that you should use to submit results during the provisional testing phase of the challenge. It contains only sound files. The size of the test set is 6.5 GB.

Note that the training set is unbalanced in the sense that for certain cities there are significantly more training files available than for other cities. The provisional and final test sets are more balanced, though not completely; the exact ratios of test files per city are not disclosed.
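One common way to account for this imbalance during training is to weight examples by inverse class frequency. A sketch, assuming labels have been read from the ground truth file (the function name is illustrative):

```python
from collections import Counter

def inverse_frequency_weights(labels):
    """Map each city code to a weight inversely proportional to its frequency,
    normalized so that the average weight per training sample is 1."""
    counts = Counter(labels)
    n_classes = len(counts)
    total = sum(counts.values())
    return {city: total / (n_classes * c) for city, c in counts.items()}
```

Whether such weighting actually helps here depends on the (undisclosed) test-set ratios, so it is worth validating on a held-out split.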

 

Output file

Your output must be a CSV file having the following format:

  • id,A,B,C,D,E,F,G,H,I

Your output file may or may not include the above header line. The remaining lines should specify, one line per sound file in the test set, the probabilities your algorithm assigns to each city.

Each probability value must be a real number between 0 and 1 (inclusive). It is not required that the 9 probability values add up to 1; during score calculation the probability vector will be normalized. A sample line:

1991899913,0.1,0.5,0.2,0.3,0.2,0.1,0,0,0.3

Your output must be a single file with the name solution.csv
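As a format sanity check, the sketch below writes a valid solution.csv that assigns uniform probabilities to every test file; since each row is normalized during scoring, any positive constant is equivalent. The function name is illustrative.

```python
import os

CITIES = list("ABCDEFGHI")

def write_uniform_solution(test_folder, out_path):
    """Write a baseline solution.csv: one row per .flac file in test_folder,
    with a uniform probability for each of the 9 cities."""
    ids = sorted(os.path.splitext(f)[0]
                 for f in os.listdir(test_folder) if f.endswith(".flac"))
    with open(out_path, "w") as out:
        out.write("id," + ",".join(CITIES) + "\n")  # header line is optional
        for file_id in ids:
            probs = ["{:.6f}".format(1.0 / len(CITIES))] * len(CITIES)
            out.write(file_id + "," + ",".join(probs) + "\n")
```

A baseline like this is also a quick way to verify that the tester tool accepts your file before submitting a real model's output.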

 

Submission format and code requirements

This match uses a combination of the "submit data" and "submit code" submission styles. The required format of the submission package is specified in a submission template document. This current document gives only requirements that are either additional or override the requirements listed in the template.

  • You must not submit more often than 3 times a day. The submission platform does not enforce this limit; it is your responsibility to comply with it. Not observing this rule may lead to disqualification.

  • An exception to the above rule: if your submission scores 0, then you may make a new submission after a delay of 1 hour. 

  • The /solution folder of the submission package must contain the solution.csv file, which should be formatted as specified above in the Output file section and must list probability values for all sound files in the test set. 

Scoring

During scoring, your solution.csv file (as contained in your submission file during provisional testing, or generated by your docker container during final testing) will be matched against the expected ground truth data using the following method.

If your solution is invalid (e.g. if the tester tool can't successfully parse its content), you will receive a score of -1.

If your submission is valid, your score will be calculated using an ROC AUC based metric. The area under the ROC curve is calculated for each class using a one-versus-all scheme, and the score is the unweighted average of these areas, scaled to the [0..100] range. For the exact scoring algorithm, see the source code of the scorer tool.
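For intuition only, here is a pure-Python sketch of the metric described above: a one-versus-all AUC per city (computed with the rank-sum formulation), averaged without weights and scaled to [0..100]. The official scorer tool remains authoritative for tie handling and edge cases.

```python
def auc_one_vs_all(scores, positives):
    """ROC AUC via the Mann-Whitney rank-sum statistic, with average ranks for
    tied scores. Assumes at least one positive and one negative example."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0.0] * len(scores)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and scores[order[j + 1]] == scores[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0  # 1-based average rank of the tied group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    n_pos = sum(positives)
    n_neg = len(scores) - n_pos
    rank_sum = sum(r for r, p in zip(ranks, positives) if p)
    return (rank_sum - n_pos * (n_pos + 1) / 2.0) / (n_pos * n_neg)

def challenge_score(prob_rows, true_cities, cities="ABCDEFGHI"):
    """Unweighted mean of per-city one-vs-all AUCs, scaled to [0..100]."""
    aucs = []
    for c, city in enumerate(cities):
        scores = [row[c] for row in prob_rows]
        positives = [1 if t == city else 0 for t in true_cities]
        aucs.append(auc_one_vs_all(scores, positives))
    return 100.0 * sum(aucs) / len(aucs)
```

Note that AUC is rank-based, so unlike log loss it is insensitive to any monotone rescaling of your probabilities.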

Final testing

This section details the final testing workflow; the requirements for the /code folder of your submission are specified in the submission template document. This current document gives only requirements or pieces of information that are either additional or override those given in the template. You may ignore this section until you start preparing your system for final testing. 

  • The signature of the train script is as given in the template:
    train.sh <data_folder>
    The supplied <data_folder> parameter points to a folder having the training data in the same structure as is available for you during the coding phase, zip files already extracted. The supplied <data_folder> will contain all sound and speech marker files. The ground truth CSV file will also be available in the same folder.

  • The allowed time limit for the train.sh script is 8 GPU-days (2 days on a p3.8xlarge with 4 GPUs). Scripts exceeding this time limit will be truncated.

  • A sample call to your training script (single line) follows. Note that folder names are for example only, you should not assume that the exact same folders will be used in testing.
    ./train.sh /data/soundscapes/train/
    In this sample case the training data looks like this:
      data/
        soundscapes/
          train/
            0000322837.flac
            0000322837.txt
            ... etc., other .flac and .txt files
            train_ground_truth.csv
 

  • The signature of the test script:
    test.sh <data_folder> <output_file>
    The testing data folder contains sound files similar to those available to you during the coding phase.

  • The allowed time limit for the test.sh script is 12 GPU-hours (3 hours on a p3.8xlarge with 4 GPUs) when executed on the full provisional test set (the same one you used for submissions during the contest). Scripts exceeding this time limit will be truncated.

  • A sample call to your testing script (single line) follows. Again, folder and file names are for example only, you should not assume that the exact same names will be used in testing.
    ./test.sh /data/soundscapes/test/ /wdata/my_output.csv
    In this sample case the testing data looks like this:
      data/
        soundscapes/
          test/
            0000054571.flac
            0001416626.flac
            ... etc., other sound files

  • To speed up the final testing process the contest admins may decide not to build and run the dockerized version of each contestant's submission. It is guaranteed however that at least the top 10 ranked submissions (based on the provisional leader board at the end of the submission phase) will be final tested.

  • Hardware specification. Your docker image will be built, test.sh and train.sh scripts will be run on a p3.8xlarge Linux AWS instance. Please see here for the details of this instance type.

 

Supporting Literature and Additional Resources

 
  1. Friedland, G., Vinyals, O. and Darrell, T., 2010, October. Multimodal location estimation. In Proceedings of the 18th ACM international conference on Multimedia (pp. 1245-1252). ACM.

  2. Lei, H., Choi, J. and Friedland, G., 2012, March. Multimodal city-verification on flickr videos using acoustic and textual features. In 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (pp. 2273-2276). IEEE.

  3. Lei, H., Choi, J. and Friedland, G., 2011. City-identification on Flickr videos using acoustic features. International Computer Science Institute, Technical Report TR-11-001.

  4. Barchiesi, D., Giannoulis, D., Stowell, D. and Plumbley, M.D., 2015. Acoustic scene classification: Classifying environments from the sounds they produce. IEEE Signal Processing Magazine, 32(3), pp.16-34.

  5. A speech-removal tool that you can optionally use to create the speech-removed version of sound files from a .flac file and the corresponding speech marker metadata file.

 

General Notes

 
  • This match is NOT rated.

  • Teaming is allowed. Topcoder members are permitted to form teams for this competition, and members of the same team may collaborate with other members of their team. To form a team, a Topcoder member may recruit other Topcoder members and register the team by completing this Topcoder Teaming Form. Each team must declare a Captain. All participants in a team must be registered Topcoder members in good standing, and must individually register for this Competition and accept its Terms and Conditions prior to joining the team.
    Team Captains must apportion prize distribution percentages for each teammate on the Teaming Form; the sum of all prize portions must equal 100%. The minimum permitted team size is 1 member, the maximum is 5 members. Only team Captains may submit a solution to the Competition. Topcoder members participating in a team will not receive a rating for this Competition. Notwithstanding Topcoder rules and conditions to the contrary, solutions submitted by any Topcoder member who is a member of a team on this challenge but is not the Captain of the team are not permitted, are ineligible for award, may be deleted, and may be grounds for dismissal of the entire team from the challenge.
    The deadline for forming teams is 11:59pm ET on the 21st day following the date that Registration & Submission opens as shown on the Challenge Details page. Topcoder will prepare a Teaming Agreement for each team that has completed the Topcoder Teaming Form and distribute it to each member of the team. Teaming Agreements must be electronically signed by each team member to be considered valid. All Teaming Agreements are void unless electronically signed by all team members by 11:59pm ET of the 28th day following the date that Registration & Submission opens as shown on the Challenge Details page. Any Teaming Agreement received after this period is void. Teaming Agreements may not be changed in any way after signature.
    The registered teams will be listed in the contest forum thread titled “Registered Teams”.

  • Organizations such as companies may compete as one competitor if they are registered as a team and follow all Topcoder rules.

  • Relinquish - Topcoder is allowing registered competitors or teams to "relinquish". Relinquishing means the member will compete, and we will score their solution, but they will not be eligible for a prize. Once a person or team relinquishes, we post their name to a forum thread labeled "Relinquished Competitors". Relinquishers must submit their implementation code and methods to maintain leaderboard status.

  • In this match you may use open source languages and libraries, and publicly available data sets, with the restrictions listed in the next sections below. If your solution requires licenses, you must have these licenses and be able to legally install them in a testing VM (see “Requirements to Win a Prize” section). Submissions will be deleted/destroyed after they are confirmed. Topcoder will not purchase licenses to run your code. Prior to submission, please make absolutely sure your submission can be run by Topcoder free of cost, and with all necessary licenses pre-installed in your solution. Topcoder is not required to contact submitters for additional instructions if the code does not run. If we are unable to run your solution due to license problems, including any requirement to download a license, your submission might be rejected. Be sure to contact us right away if you have concerns about this requirement.

  • The contest's stakeholders allow the usage of programming languages, libraries and data sets having the following open source license types:

  • You may be allowed to use other license types but that requires explicit approval from the client. If you intend to use code or data having a license type not listed above, post a question in the forum thread titled “Requested License Types”, not later than 14 days before the end of the online submission phase. The organizers will make a reasonable effort to verify your request quickly, however, using an alternative solution having a license type already approved is a preferred and faster way.

  • If your solution includes licensed software (e.g. commercial software, open source software, etc.), you must include the full license agreements with your submission. Include your licenses in a folder labeled “Licenses”. Within the same folder, include a text file labeled “README” that explains the purpose of each licensed software package as it is used in your solution.

  • External data sets and pre-trained networks are allowed for use in the competition provided the following are satisfied:

    • The external data and the dataset used to train the pre-trained network are unencumbered with legal restrictions that conflict with their use in the competition.

    • The data source or data used to train the pre-trained network is defined in the submission description.

    • The external data source must be declared in the competition forum not later than 14 days before the end of the online submission phase to be eligible in a final solution. References and instructions on how to obtain are valid declarations (for instance in the case of license restrictions). If you want to use a certain external data source, post a question in the forum thread titled “Requested Data Sources”. Contest stakeholders will verify the request and if the use of the data source is approved then it will be listed in the forum thread titled “Approved Data Sources”.

  • Use the match forum to ask general questions or report problems, but please do not post comments and questions that reveal information about possible solution techniques.

 

Award details and requirements to Win a Prize 

 

Final prizes

In order to receive a final prize, you must do all the following:

  • Achieve a score in the top five according to final system test results. See the "Final testing" section above.

  • Comply with all applicable Topcoder terms and conditions.

Once the final scores are posted and winners are announced, the prize winner candidates have 7 days to submit a report outlining their final algorithm, explaining the logic behind their approach and the steps involved. You will receive a template to help you create your final report.

If you place in a prize winning rank but fail to do any of the above, then you will not receive a prize, and it will be awarded to the contestant with the next best performance who did all of the above.

 

Eligibility

This Challenge is authorized under Title 10 of United States Code § 2374a, which authorizes the Secretary of Defense to award prizes in recognition of outstanding achievements in basic, advanced, and applied research, technology development, and prototype development that have the potential for application to the performance of military missions of the Department of Defense.

To be eligible to participate in this Challenge, an individual or entity must comply with the following requirements:

  1. Register as a participant.

  2. Participants must be legal entities and individuals (non-felons) over the age of 18. Eligibility is subject to verification before monetary prizes are awarded.

  3. Participants are not eligible to receive any monetary or non-monetary prize in the challenge if they are a person or entity designated or sanctioned by the United States Treasury’s Office of Foreign Assets Control (see http://www.treasury.gov/resource-center/sanctions/SDN-List/Pages/default.aspx for additional information).

  4. An individual or entity that is determined to be on the GSA Excluded Parties List (www.sam.gov) is ineligible to receive a monetary or non-monetary prize award.

  5. NGA employees directly supporting the development or execution of this challenge and support contractors directly supporting the development or execution of this challenge, Capital Consulting Corp. and its contractors, and judges are ineligible to compete in this challenge. Likewise, members of their immediate family (spouses, children, step-children, siblings, step-siblings, parents, step-parents), and persons living in the same household, whether or not related, are not eligible to participate in any portion of this challenge. Note: The members of an individual’s household include any other person who shares the same residence as such individual for at least three months out of the year.

  6. Individuals and entities, otherwise eligible to win a monetary or non-monetary prize, may form and submit a team entry; however, each team member must be clearly identified on the team’s submission form for the team to be eligible. Failure to follow this procedure as outlined on the challenge website will disqualify the contest submission. Team winnings, as determined by the challenge sponsor, will be distributed to the designated team lead for further distribution to team members. In the event a dispute regarding the identity of the participant who actually submitted the entry cannot be resolved to NGA’s satisfaction, the affected entry will be deemed ineligible.

  7. Individuals and individual team leads selected as monetary prize winners must submit all required taxpayer identification and bank account information required to complete an electronic payment of the monetary prize. Failure to provide Capital Consulting Corp. or NGA required documents for electronic payment within 30 days of notification by Capital Consulting Corp. or NGA will result in a disqualification of the winning entry or entries.

  8. Participants may not be a Federal entity.

  9. Federal employees may not pursue an application while acting within the scope of their employment, while in the Federal workplace, or while on duty. Note: Federal ethical conduct rules may restrict or prohibit federal employees from engaging in certain outside activities; any federal employee not excluded under this paragraph seeking to participate in this challenge outside the scope of employment should consult his/her agency's ethics official prior to developing a submission.

  10. Federal grantees may not use Federal funds to develop challenge applications unless consistent with the purpose of their grant award.

  11. Federal contractors may not use Federal funds from a contract to develop challenge applications or to fund efforts in support of a challenge submission.

  12. Participants are prohibited from using NGA Federal facilities or relying upon significant consultation with NGA Federal employees to develop a submission, unless the facilities and employees were made available to all participants in this challenge on an equal basis.

 

By participating in this Challenge:

  1. Participants agree to be bound by the rules of the challenge and agree that the winner selection decisions for the challenge are final and binding.

  2. Participants acknowledge that their submission may be the subject of a Freedom of Information Act (FOIA) request and that they are responsible for identifying and marking all business confidential and proprietary information in their submission.

  3. Participants agree to indemnify the Federal Government against third party claims for damages arising from or related to Challenge activities, including the use, publication, or distribution of the participant’s submission.

  4. Participants agree to assume any and all risks and waive claims against the Federal Government and its related entities, except in the case of willful misconduct, for any injury, death, damage, or loss of property, revenue, or profits, whether direct, indirect, or consequential, arising from participation in this prize contest, whether the injury, death, damage, or loss arises through negligence or otherwise.

  5. Winners agree to use a portion of their winnings to travel to the Acoustics Summit to present their winning papers.

  6. Winners agree and consent, as a condition for receiving a monetary or non-monetary prize, to the use of their name, affiliation, city and state, likeness or image, comments, and a short synopsis of their winning solution as a part of NGA’s promotion of this challenge.

  7. Winners agree to grant the United States government the license described in the Intellectual Property (IP) section below.

Additional participation rules:

Participation in this Challenge is open to individuals and entities.  Entries may only be submitted by a registered participant.

  • The rules apply to all participants in this NGA Challenge and may be changed without prior notice. Participants should monitor the Challenge website for the latest information.

  • Registration information collected by CCC/TopCoder will be used solely for the purpose of administering the event. It will not be distributed to any parties outside of TopCoder, CCC, and NGA, nor released for any other purpose except as noted in this document.

  • Individual participants’ display names may be listed on the Challenge website to enable the event to be tracked by interested members of the public. The name and photographs of the winner may be posted on the NGA website and released to the media.

  • NGA may contact registered participants to discuss the means and methods used in solving the Challenge.

  • NGA may compute and release to the public aggregate data and statistics from the submitted solutions. Names and select information about competition winners may be publicly displayed by NGA for announcement, promotional, and informational purposes.

  • Nothing in these rules, including information on the Challenge website and communications by NGA officials, may be interpreted as authorizing the incurrence of any costs, modifying the statement of work, or authorizing work outside the terms and conditions of any existing agreements or contracts with NGA.

  • A submission may be disqualified if, in NGA’s sole judgment, it:

    • fails to function as described,

    • has a detailed description that is significantly inaccurate or incomplete, or

    • contains malware or other security threats.

  • NGA reserves the right to disqualify a participant whose actions are deemed to violate the spirit of the competition for any reason, including but not limited to: abusive, threatening, or violent behavior; attempts to reverse engineer or otherwise misappropriate the submission of another participant; or violation of laws or regulations in the course of participating in the challenge. NGA does not authorize or consent to a participant infringing on any US patent or copyright while participating in the Challenge.

  • NGA reserves the right, in its sole discretion, to (a) cancel, suspend or modify the Challenge without notice, and/or (b) modify the number and dollar amount of prizes, based on the number and quality of submissions, including not awarding any prize if no entries are deemed worthy.

  • The agency’s award decision is final.

  • Each individual (whether competing singly or in a group) or entity agrees to follow applicable local, State, and Federal laws and regulations.

Intellectual Property (IP) Rights

Each participant retains title and full ownership in and to their submission. Participants expressly reserve all IP rights not expressly granted.

Each winner grants to the U.S. Government a fully paid-up, non-exclusive, royalty-free, irrevocable, worldwide license and right to:

(i)             use, modify, reproduce, release, perform, display, or disclose the submission within the Government without restriction; and

(ii)            release or disclose the submission outside the Government and authorize persons to whom release or disclosure has been made to use, modify, reproduce, release, perform, display, or disclose the software or documentation for United States government purposes.

Government purpose means any activity in which the United States Government is a party, including cooperative agreements with international or multi-national defense organizations or sales or transfers by the United States Government to foreign governments or international organizations. Government purposes include competitive procurement, but do not include the rights to use, modify, reproduce, release, perform, display, or disclose computer software or computer software documentation for commercial purposes or authorize others to do so.

Use of Marks: Except as expressly set forth in the challenge Rules, Terms & Conditions, entrants shall not use the names, trademarks, service marks, logos, insignias, trade dress, or any other designation of source or origin subject to legal protection, copyrighted material or similar IP (“Marks”) of the organizers or other challenge partners, sponsors, or collaborators in any way without such party’s prior written permission in each instance, which such party may grant or withhold at its sole and absolute discretion.

Representation, Warranties and Indemnification

By entering the Challenge, each participant represents, warrants and covenants as follows:

  1. Participant is the sole author, creator, and owner of the Submission;

  2. The Submission is not the subject of any actual or threatened litigation or claim;

  3. The Submission does not and will not violate or infringe upon the IP rights, privacy rights, publicity rights, or other legal rights of any third party; and

  4. The Submission, and participants’ use of the Submission, does not and will not violate any applicable laws or regulations, including, without limitation, applicable export control laws and regulations of the U.S. and other jurisdictions.

If the Submission includes any third-party works (such as third-party content), participant must be able to provide, upon request, documentation of all appropriate licenses and releases for such third-party works. If participant cannot provide documentation of all required licenses and releases, NGA reserves the right, at its sole discretion, to disqualify the applicable Submission.

Participants must indemnify, defend, and hold harmless the Federal Government from and against all third-party claims, actions, or proceedings of any kind and from any and all damages, liabilities, costs, and expenses relating to or arising from participant’s Submission or any breach or alleged breach of any of the representations, warranties, and covenants of participant hereunder.

NGA reserves the right to disqualify any Submission that, in their discretion, violates these Official Rules, Terms & Conditions.

NGA reserves the right to cancel, suspend, and/or modify the Challenge, or any part of it, for any reason, at NGA’s sole discretion.

Approved for public release, 20-734