We have a server on AWS that encodes multiple videos at a time (currently 6 at once). On startup, an EBS volume containing all the source videos to be encoded is mounted to the encoding instance. We currently use a simple bash script to encode a single video ("encode [login to view URL]"), and the encode script ([login to view URL]) produces a final mp4 file.
The goal here is to further automate the process. The freelancer will create a script that lets us upload a csv file of source filenames, parses the csv, and runs the encoding script on every video listed in it (but only 6 at a time).
What we already have
1. a script and all applications/software that encode a single video: "encode [login to view URL]"
2. A csv list that contains the source filenames for encoding (this csv can be in any format the developer needs)
We need to do the following:
1. parse the uploaded csv file to get the list of source filenames
2. copy all source videos listed in the csv (6 files) from the EBS volume to instance storage. The source filename contains the directory where the source file is stored, so the directory structure is:
/cat
/dog
/elephant
and the filenames are [login to view URL], [login to view URL], [login to view URL], so in the first example cat-123 is in the cat directory.
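A minimal sketch of the copy step, under stated assumptions: the EBS volume is mounted at /mnt/ebs, instance storage at /mnt/instance, and the csv holds one source filename per line (e.g. "cat-123.mp4"). All of those names are hypothetical; the directory is taken as the part of the filename before the first hyphen, as in the example above.

```shell
#!/usr/bin/env bash
# Sketch only: mount points and csv layout are assumptions, not the real setup.
set -euo pipefail

copy_sources() {
  local csv="$1" ebs_root="$2" dest="$3"
  while IFS=, read -r filename _; do
    [ -n "$filename" ] || continue
    # Directory is the filename prefix before the first hyphen:
    # "cat-123.mp4" -> "cat", "elephant-9.mp4" -> "elephant".
    local dir="${filename%%-*}"
    cp "$ebs_root/$dir/$filename" "$dest/"
  done < "$csv"
}

# Self-contained demo with temporary directories standing in for the mounts:
ebs=$(mktemp -d); dest=$(mktemp -d); csv=$(mktemp)
mkdir -p "$ebs/cat" "$ebs/dog"
touch "$ebs/cat/cat-123.mp4" "$ebs/dog/dog-7.mp4"
printf 'cat-123.mp4\ndog-7.mp4\n' > "$csv"
copy_sources "$csv" "$ebs" "$dest"
```

Against the real volume, the demo section would be replaced by a call like `copy_sources /path/to/list.csv /mnt/ebs /mnt/instance`.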
3. create a script or process that sends all 6 files at one time to be encoded in parallel:
'encode [login to view URL]'
'encode [login to view URL]'
'encode [login to view URL]'
if there are more than 6 videos in the csv file, send the remaining ones as the first 6 finish.
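The "at most 6 at once, refill as slots free up" behaviour can be sketched with `xargs -P`, assuming the existing "encode" script takes one filename argument and the csv lists one filename per line (both assumptions):

```shell
#!/usr/bin/env bash
# Sketch: xargs -P 6 keeps at most 6 encode jobs running and starts the next
# file as soon as one of the 6 finishes. "encode" is the existing script;
# a second argument lets us dry-run with a stand-in command.
set -euo pipefail

run_encodes() {
  local csv="$1" encoder="${2:-encode}"   # pass "echo" to dry-run
  tr -d '\r' < "$csv" | xargs -n 1 -P 6 "$encoder"
}

# Dry-run demo: "echo" stands in for the real encode script.
csv=$(mktemp)
printf 'cat-123.mp4\ndog-7.mp4\nelephant-9.mp4\n' > "$csv"
result=$(run_encodes "$csv" echo | sort | xargs)
echo "$result"
```

`xargs -P` handles the queueing itself, so no extra bookkeeping is needed to send the remaining files as the first 6 finish.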
4. copy the final mp4 files to S3
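The final upload step could look like the following sketch; the bucket name and local output directory are hypothetical, and the instance would need an IAM role (or credentials) with write access to the bucket:

```shell
#!/usr/bin/env bash
# Sketch: recursively copy only the finished .mp4 files to S3.
set -euo pipefail

upload_outputs() {
  local src_dir="$1" bucket_uri="$2"
  aws s3 cp "$src_dir" "$bucket_uri" --recursive --exclude '*' --include '*.mp4'
}

# Example invocation (paths/bucket are placeholders, not run here):
# upload_outputs /mnt/instance/output s3://my-encoded-videos/
```

`aws s3 sync` would also work here and only transfers files that changed, which may be preferable if the script is re-run.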
This project needs to be done ASAP.
There are much better ways to accomplish this using an SQS queue and other AWS services, but this is a "baby step" for us.