ByteDance - San Jose, CA

posted about 1 month ago

Full-time - Mid Level
San Jose, CA
Professional, Scientific, and Technical Services

About the position

The Analytics Engineer for Global Internal Audit at ByteDance plays a crucial role in enhancing the audit team's capabilities by developing and maintaining data products that support continuous auditing and risk identification. This position involves leveraging data engineering and analytics skills to create efficient data solutions, ensuring data quality, and fostering collaborative relationships with stakeholders across various business verticals.

Responsibilities

  • Develop and maintain data warehouses to support audit engagements and implement data quality checks.
  • Maintain expert-level knowledge of the team's data tools and systems inventory, serving as a resident expert and trainer.
  • Partner with data analysts and auditors to build and maintain data models for risk-indicator dashboards.
  • Develop AI-enabled tools that automate the evaluation of controls and improve audit efficiency.
  • Map the relationships between business processes, risks, and data to empower the audit team.
  • Develop and maintain collaborative relationships with stakeholders across different business verticals.
  • Provide data engineering support for audit engagements, including developing queries and deploying data quality checks.
  • Continue professional development in data engineering practices, machine learning, AI, and ByteDance products.

Requirements

  • Strong proficiency in SQL and at least one programming language (Python or R).
  • Experience with data integration, ETL processes, and large-scale data processing systems.
  • Working knowledge of cloud-based infrastructure (AWS, GCP, Azure, or Snowflake).
  • Experience in implementing data quality checks or data observability platforms.
  • Experience in building and maintaining data products for continuous audit programs.
  • 5+ years of practical experience in data engineering or analytics engineering.

Nice-to-haves

  • Bachelor's degree or above in a quantitative discipline (Mathematics, Statistics, Computer Science, etc.).
  • Working knowledge of large-scale data processing techniques (Hadoop, Flink, MapReduce).
  • Good understanding of data warehouse and data modeling principles.
  • Experience in a decentralized data environment.
  • Strong business acumen and stakeholder management skills.
  • Good presentation and storytelling skills.

Benefits

  • Hybrid work model with three days in the office per week.
  • Continuous education and professional development opportunities.