Iic - Dallas, TX
posted 4 months ago
The ETL/Hadoop Developer position is a contract role based in Dallas, TX, or Charlotte, NC, with a duration of 6+ months. The ideal candidate is an experienced ETL developer with proficiency in Hadoop ecosystem tools such as PySpark, Hive, and Sqoop, strong Python programming skills, a solid understanding of data warehousing concepts, and hands-on experience designing, developing, and maintaining ETL processes for large-scale datasets.

The developer will implement ETL processes that extract, transform, and load data from various sources into the data warehouse, ensuring data quality and compliance with governance policies. They will collaborate with data architects, data engineers, and other stakeholders to understand data requirements and deliver effective solutions.

Day-to-day work includes optimizing ETL workflows for performance and scalability, writing complex SQL queries for data transformation, and developing automation scripts for scheduling and monitoring ETL jobs. Troubleshooting data-consistency issues and performance tuning are also key responsibilities, as is documenting ETL processes, data mappings, and data dictionaries for reference and reporting purposes.
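For candidates unfamiliar with the pattern, a minimal sketch of the extract-transform-load cycle described above, with a simple data-quality rule applied in the transform step. This uses Python's stdlib sqlite3 purely as a stand-in for the source system and warehouse; the role itself uses PySpark, Hive, and Sqoop, and all table and column names here are illustrative.

```python
import sqlite3

def extract(conn):
    # Pull raw rows from a hypothetical source table.
    return conn.execute("SELECT id, amount FROM raw_orders").fetchall()

def transform(rows):
    # Data-quality rule: drop rows with non-positive amounts.
    return [(order_id, amount) for order_id, amount in rows if amount > 0]

def load(conn, rows):
    # Write the cleaned rows into the warehouse-side table.
    conn.executemany("INSERT INTO orders_clean VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE orders_clean (id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [(1, 10.0), (2, -5.0), (3, 7.5)],
)

load(conn, transform(extract(conn)))
cleaned = conn.execute("SELECT COUNT(*) FROM orders_clean").fetchone()[0]
print(cleaned)  # 2 -- the negative-amount row is filtered out
```

In a production Hadoop pipeline the same three stages map to Sqoop (extract from relational sources), PySpark/Hive (transform), and writes into warehouse tables (load), with the quality rules expressed as Spark or SQL transformations.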