Deloitte - Atlanta, GA

Full-time - Mid Level
Professional, Scientific, and Technical Services

About the position

The Cloud Data Engineer position at Deloitte is designed for experienced professionals who are passionate about technology and eager to work in a collaborative environment. The role focuses on building and managing data systems and pipelines in cloud environments, with an emphasis on data analytics and reporting. It calls for a hands-on approach to problem-solving and innovation, and travel requirements are minimal, making it a good fit for those who want to contribute to impactful projects without spending extensive time on the road.

Responsibilities

  • Evaluate business needs and priorities, liaise with key business partners, and address team needs related to data systems and management.
  • Translate business requirements into technical specifications, defining the details and requirements of applications, components, and enhancements.
  • Participate in project planning, identifying milestones, deliverables, and resource requirements; track activities and task execution.
  • Produce design and development plans, test plans, detailed functional specification documents, user interface designs, and process flow charts to guide programming execution.
  • Develop data pipelines and APIs using Python and SQL, potentially alongside Spark and native AWS, Azure, or GCP services.
  • Build large-scale batch and real-time data pipelines with data processing frameworks on AWS, Azure, or GCP (see the sketch after this list).
  • Move data from on-prem to cloud and manage cloud data conversions.
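
To make the pipeline work above concrete, here is a minimal PySpark sketch of a batch extract-transform-load job of the kind this role describes; the bucket paths, column names, and aggregation logic are hypothetical placeholders for illustration, not details taken from the posting.

    # Minimal batch ETL sketch (hypothetical paths and schema).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily-orders-batch").getOrCreate()

    # Extract: read raw order events landed in cloud object storage.
    orders = spark.read.json("s3://example-raw-zone/orders/")

    # Transform: normalize types, keep completed orders, and aggregate
    # revenue per customer per day for downstream reporting.
    daily_revenue = (
        orders
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .filter(F.col("status") == "completed")
        .groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("amount").alias("daily_revenue"))
    )

    # Load: write a partitioned, analytics-ready table.
    daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-curated-zone/daily_revenue/"
    )

    spark.stop()

The same shape carries over to Azure or GCP; only the storage URIs (e.g., abfss:// or gs://) and cluster setup change.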

Requirements

  • 3+ years of experience in data engineering with an emphasis on data analytics and reporting.
  • 3+ years of experience with at least one of the following cloud platforms: Microsoft Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP).
  • 3+ years of experience in SQL, data transformations, statistical analysis, and troubleshooting across multiple database platforms (Cassandra, MySQL, Snowflake, PostgreSQL, Redshift, Azure SQL Data Warehouse, Databricks, etc.).
  • 3+ years of experience in designing and building data extraction, transformation, and loading processes by writing custom data pipelines.
  • 3+ years of experience with one or more scripting or query languages (such as Python or SQL) or data-streaming technologies such as Kafka.
  • 3+ years of experience designing and building solutions using cloud services such as EC2, S3, EMR, Kinesis, RDS, Redshift/Spectrum, Lambda, Glue, Athena, API Gateway, etc.
  • Bachelor's degree in Computer Science, Information Technology, Computer Engineering, or related IT discipline, or equivalent experience.
  • Must be legally authorized to work in the United States without the need for employer sponsorship.

Nice-to-haves

  • AWS, Azure, and/or Google Cloud Platform Certification.
  • Master's degree or higher.
  • Expertise in one or more programming languages, preferably Scala and/or Python (including PySpark).
  • Experience working with either a MapReduce or an MPP system at any scale.
  • Experience working with agile development methodologies such as Scrum, including sprint-based delivery.

Benefits

  • Broad range of employee benefits including health insurance, retirement plans, and professional development opportunities.
  • Diversity, equity, and inclusion initiatives that empower employees to contribute their unique perspectives.
  • Opportunities for mentorship and leadership development.
  • Commitment to sustainability and community impact.