Looking for a Hadoop professional with core Java and ETL experience
$250-750 USD
Paid on delivery
I need you to develop some software for me. I would like this software to be developed for Mac using Java.
- Development experience with Hadoop: Hive, Oozie, MapReduce, Sqoop.
- Development experience in Teradata
- Experience with design and development of ETL processes
- Proficiency in writing advanced SQL and expertise in performance tuning of SQL/Hive queries.
- Programming/scripting skills (Unix, Java, or Python).
- Experience with version control such as Git.
- Ability to understand the business process, the relationships between data elements, aspects of ETL, and the logic behind a business solution.
- Ability to benchmark application performance periodically and fix performance issues; understanding of Hadoop platform capabilities and limitations.
- Experience in developing large scale data (ETL) platforms, pipelines, warehousing, mining or analytic systems is preferred.
- Experience in developing automated test scripts to help with regression testing.
- Sharp troubleshooting skills to identify and fix issues quickly.
- Strong grasp of core Java concepts such as multi-threading, JDBC, and data serialization, plus experience developing ETL in core Java.
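The core-Java skills listed above (multi-threading, ETL) could be sketched roughly as follows. This is a minimal illustrative example, not part of the job spec: the source and sink are in-memory lists standing in for the JDBC/Hive endpoints a real pipeline would use, and all class and method names are hypothetical.

```java
import java.util.*;
import java.util.concurrent.*;

// Minimal extract-transform-load sketch using core Java multi-threading:
// a producer feeds a BlockingQueue, a loader thread drains it.
public class MiniEtl {
    static final String POISON = "__EOF__"; // sentinel marking end of input

    public static List<String> run(List<String> source) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        List<String> sink = Collections.synchronizedList(new ArrayList<>());

        // Load stage: consumes transformed rows until the sentinel arrives.
        Thread loader = new Thread(() -> {
            try {
                String row;
                while (!(row = queue.take()).equals(POISON)) {
                    sink.add(row); // in real code: a batched JDBC insert
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        loader.start();

        // Extract + transform: trim whitespace, skip blank rows, normalize case.
        for (String row : source) {
            String cleaned = row.trim();
            if (!cleaned.isEmpty()) {
                queue.put(cleaned.toUpperCase());
            }
        }
        queue.put(POISON);
        loader.join();
        return sink;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run(Arrays.asList(" alice ", "", "bob"))); // [ALICE, BOB]
    }
}
```

A production version would replace the in-memory stages with JDBC extracts and batched inserts, and would typically use a thread pool rather than a single loader thread.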
Project ID: #16648257
About the project
14 freelancers are bidding on average $643 for this job
Hello Sir/Ma'am, we are a group of software engineers with 10+ years of experience, expert in Java, C, C++, C#, and Android. Please check our profile for reference. Thank you.
Dear Sir, I have gone through your project details and understand that you need software developed using Hadoop and Java. I have 5 years of experience in Hadoop, Java, and software architecture. Kindly ping me.
Hello, I have a MacBook Air and I work with big data/Hadoop technologies including Spark, Kafka, Cassandra, the ELK stack, etc. Can we talk further about this? Thanks!
Hi, nice to meet you. I am good at Java and my specialty is ETL. I currently work for one of the biggest big data companies in Vietnam. I can provide my CV for review. Regards,