Key Information

The challenge is finished.

Challenge Overview

An AWS account is required, as you will need to use several AWS services.

Previously, we built a tool that executes the queries in a directory and generates export data into a separate output directory.

For this challenge, we'd like to improve this tool by writing a shell script that does the following:

1. Back up the export data and clean up the export directory.
2. Execute the export command to generate the CSV files in the export directory.
3. If the export succeeds, upload the generated CSV files to S3.
4. Then use the COPY command to load the data into Redshift.
5. If the export fails, print a proper log for later analysis.
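The steps above can be sketched as a single wrapper function. All directories, the export command, the load script, and the bucket name below are placeholders you will need to adjust for your environment:

```shell
# Sketch of the wrapper script; EXPORT_DIR, BACKUP_DIR, EXPORT_CMD,
# LOAD_CMD, S3_BUCKET, and LOG_FILE are assumptions -- override them
# via environment variables to match your setup.
run_export_pipeline() {
    local export_dir="${EXPORT_DIR:-/opt/tool/export}"
    local backup_dir="${BACKUP_DIR:-/opt/tool/backup}"
    local export_cmd="${EXPORT_CMD:-./export.sh}"           # hypothetical export command
    local load_cmd="${LOAD_CMD:-./load.sh}"                 # load script from the forum
    local s3_bucket="${S3_BUCKET:-s3://my-bucket/exports}"  # hypothetical bucket
    local log_file="${LOG_FILE:-./export.log}"

    # 1. Back up the existing export data, then clean the export directory
    mkdir -p "$export_dir" "$backup_dir"
    local stamp
    stamp=$(date +%Y%m%d%H%M%S)
    tar -czf "$backup_dir/export_$stamp.tar.gz" -C "$export_dir" .
    rm -f "$export_dir"/*.csv

    # 2. Run the export command
    if $export_cmd >>"$log_file" 2>&1; then
        # 3. Upload the generated CSV files to S3
        aws s3 cp "$export_dir" "$s3_bucket" --recursive \
            --exclude "*" --include "*.csv" >>"$log_file" 2>&1 || return 1
        # 4. Load the data into Redshift via the COPY load script
        $load_cmd >>"$log_file" 2>&1
    else
        # 5. Export failed: leave a log entry for later analysis
        echo "$(date -u +%FT%TZ) export failed" >>"$log_file"
        return 1
    fi
}
# Usage: run_export_pipeline  (exit status 0 on success, 1 on failure)
```

The success and failure branches both append to one log file, so a later analysis only needs to inspect a single place.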

Documentation
You are expected to write a clean setup guide for a Linux environment such as Ubuntu, so that we can execute the shell script above properly.

About Informix Database
You can use Docker to run the Informix database locally:

docker run -it -p <informix_port_1>:2021 appiriodevops/informix:1.2

About COPY command 
We have a load script (attached in the forum) which you can reference.
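For reference, a Redshift COPY statement for one of the exported CSV files might look like the sketch below. The table name, bucket path, and credentials are all assumptions; the attached load script remains the authoritative reference:

```shell
# Build a Redshift COPY statement for one exported table.
# $1 is the table name; the bucket path and COPY options are assumptions.
copy_sql() {
    cat <<SQL
COPY $1
FROM 's3://my-bucket/exports/$1.csv'
CREDENTIALS 'aws_access_key_id=${AWS_ACCESS_KEY_ID};aws_secret_access_key=${AWS_SECRET_ACCESS_KEY}'
DELIMITER ',' IGNOREHEADER 1;
SQL
}

# Pipe the statement to the cluster with psql (connection details are placeholders):
# copy_sql my_table | psql "host=<cluster-endpoint> port=5439 dbname=dev user=admin"
```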

About Upload to S3 command
We can use the AWS CLI to upload files to S3; please check http://docs.aws.amazon.com/cli/latest/reference/s3/
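A minimal upload step could look like the helper below; the `--exclude "*" --include "*.csv"` pair restricts the recursive copy to CSV files only. The directory and bucket names are placeholders:

```shell
# Upload only the CSV files from the export directory to S3.
# $1: local export directory, $2: s3 destination -- both placeholders here.
# AWS can be overridden (e.g. AWS=echo) for a dry run.
s3_upload() {
    ${AWS:-aws} s3 cp "$1" "$2" --recursive --exclude "*" --include "*.csv"
}

# Usage: s3_upload /opt/tool/export s3://my-bucket/exports/
```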

About Testing
Since an export against the Informix database may return empty files, you need to do some manual testing: for example, put some dummy data in the export directory, and make sure the S3 upload and the Redshift load work properly.
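One way to seed such dummy data is shown below; the directory, file name, and columns are arbitrary examples, and the wrapper script name is hypothetical:

```shell
# Manual test sketch: seed the export directory with a dummy CSV,
# then run the wrapper script and verify S3/Redshift end to end.
TEST_EXPORT_DIR=/tmp/export_test   # arbitrary test directory
mkdir -p "$TEST_EXPORT_DIR"
printf 'id,name\n1,alice\n2,bob\n' > "$TEST_EXPORT_DIR/dummy.csv"
# EXPORT_DIR="$TEST_EXPORT_DIR" ./export_pipeline.sh   # hypothetical script name
```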

Final Submission Guidelines

1. A setup guide for a Linux environment that can successfully execute the tool.
2. The updated shell script implementing the steps above.
3. Verification steps.

ELIGIBLE EVENTS:

2016 TopCoder(R) Open

REVIEW STYLE:

Final Review:

Community Review Board

Approval:

User Sign-Off

ID: 30054411