Avenues - New York, NY

posted 3 months ago

Full-time
New York, NY
Professional, Scientific, and Technical Services

About the position

We are seeking an experienced ETL / Advanced SQL / Dimensional Modeling Developer to join our team for a 6+ month engagement in New York City. This position requires a strong background in ETL tools, particularly Informatica, along with advanced SQL capabilities. The ideal candidate will have over 11 years of experience in ETL processes and a deep understanding of data warehousing methodologies, including dimensional data modeling. The role involves working closely with business users to gather requirements and manage project scope effectively.

The successful candidate will also have experience programming in a Linux/UNIX environment, including shell scripting, and proficiency in Python. Familiarity with big data technologies such as Hadoop, Spark, and HIVE is essential, as the role involves working with extremely large data volumes, potentially exceeding 20 TB. Knowledge of database design techniques and VLDB performance aspects, such as table partitioning and optimization, is advantageous, and experience with reporting tools like QlikSense, Tableau, or Cognos is a plus.

This position is based in NYC and requires the candidate to be onsite three days a week to collaborate with team members and stakeholders. The role is critical to the successful implementation and management of data solutions that meet the needs of the business.

Responsibilities

  • Develop and implement ETL processes using Informatica in an enterprise environment.
  • Utilize advanced SQL for data manipulation and querying.
  • Design and model dimensional data structures for data warehousing.
  • Collaborate with business users to gather requirements and manage project scope.
  • Work with large data volumes and ensure efficient data processing.
  • Program in a Linux/UNIX environment, including shell scripting.
  • Develop applications using Python for data processing and analysis.
  • Engage with big data technologies such as Hadoop, Spark, and HIVE.

Requirements

  • 11+ years of experience in ETL tools, with specific expertise in Informatica.
  • Advanced SQL capabilities are required.
  • Strong understanding of data warehousing methodologies and ETL processing.
  • Experience with dimensional data modeling.
  • Demonstrated ability to work with business users to gather requirements.
  • Experience programming in a Linux/UNIX environment, including shell scripting.
  • Proficiency in Python programming.

Nice-to-haves

  • Experience with large database and data warehouse implementations (20+ TB).
  • Understanding of VLDB performance aspects, such as table partitioning and optimization techniques.
  • Knowledge of reporting tools such as QlikSense, Tableau, or Cognos.