USAA - Plano, TX

posted 2 months ago

Full-time - Entry Level
Plano, TX
Credit Intermediation and Related Activities

About the position

At USAA, we are dedicated to facilitating the financial security of millions of U.S. military members and their families. As a Data Engineer at the intermediate level, you will join the Channel Data Hub team, which is responsible for delivering data from various sources to enable reporting and analytics across the organization. The role involves working with a range of technologies, including DataStage, Snowflake, AWS, Python, and Cloudera.

You will engage in all phases of the data management lifecycle, from gathering and analyzing requirements to collecting, processing, storing, securing, and archiving data. Your contributions will help develop and maintain technical systems for data reporting and solutions that align with business objectives. You will participate in the full life cycle of data engineering, including analysis, solution design, data pipeline engineering, testing, deployment, scheduling, and production support, all under the guidance of senior team members.

You will assist in implementing technical solutions for data reporting and analytics systems, design and write test scripts to verify data integrity, and review existing test scripts for understanding. Familiarity with IT Change and Release Management best practices is essential, as you will deploy data pipeline code and participate in design and code review sessions. You will also engage in Agile ceremonies, such as daily standups and iteration planning, to ensure effective collaboration within the team. As you grow in this role, you will develop an understanding of data management best practices through training and documentation review, while also learning about new and emerging technologies in the data engineering space. Your ability to identify, measure, monitor, and control risks associated with business activities will be crucial to maintaining compliance with policies and procedures.
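For context only, and not part of USAA's posting: below is a minimal sketch of the kind of data-integrity test script the role describes, written in Python against a generic DB-API connection. The table names, columns, and checks are hypothetical, and a real pipeline would run similar assertions against Snowflake or Cloudera; in-memory SQLite is used here only so the example is self-contained and runnable.

    # Hypothetical data-integrity check between a source and a target table.
    # Works with any DB-API connection; sqlite3 keeps the sketch runnable.
    import sqlite3

    def row_count(conn, table):
        """Return the number of rows in `table`."""
        return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

    def null_count(conn, table, column):
        """Return the number of NULLs in `column` of `table`."""
        return conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
        ).fetchone()[0]

    def check_integrity(conn, source, target, key_column):
        """Collect basic discrepancies a pipeline test script might assert on."""
        issues = []
        if row_count(conn, source) != row_count(conn, target):
            issues.append("row counts differ between source and target")
        if null_count(conn, target, key_column) > 0:
            issues.append(f"target has NULLs in key column {key_column}")
        return issues

    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE src (id INTEGER, amount REAL);
            CREATE TABLE tgt (id INTEGER, amount REAL);
            INSERT INTO src VALUES (1, 10.0), (2, 20.0);
            INSERT INTO tgt VALUES (1, 10.0);  -- one row missing on purpose
        """)
        print(check_integrity(conn, "src", "tgt", "id"))
        # ['row counts differ between source and target']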

Responsibilities

  • Participates in the full life cycle of data engineering including analysis, solution design, data pipeline engineering, testing, deployment, scheduling, and production support with guidance from senior team members.
  • Assists in the implementation of technical solutions for data reporting and analytic systems.
  • Assists with designing and writing test scripts to verify data integrity and application functionality.
  • Reviews existing test scripts to understand their functionality.
  • Demonstrates familiarity with IT Change and Release Management standard methodologies.
  • Deploys data pipeline code with assistance from senior team members.
  • Participates in design and code review sessions.
  • Actively participates in Agile ceremonies such as daily standup, iteration planning, backlog grooming, and retrospective sessions.
  • Develops intermediate familiarity with data management standard methodologies by participating in training, reviewing documentation, and reading code from existing solutions.
  • Demonstrates knowledge and understanding of business products and processes.
  • Assists senior team members in breaking down business features into technical stories and approaches.
  • Actively learns about new and emerging technologies in the data engineering space and seeks to apply takeaways in current and future projects.
  • Ensures risks associated with business activities are effectively identified, measured, monitored, and controlled in accordance with risk and compliance policies and procedures.

Requirements

  • Bachelor's degree; OR 4 years of related experience may be substituted for the degree; OR an approved certification from CodeUp, Galvanize, VetFIT, or eFIT.
  • 2 years of data engineering, data analysis or software development experience implementing data solutions.
  • Working experience with SQL and relational databases.
  • Basic understanding of Agile methodology practices.
  • Strong analytical and problem-solving skills.
  • Working experience with cloud technologies and tools.

Nice-to-haves

  • 2 years' experience with DataStage, ETL/ELT, data warehousing, and SQL.
  • 1-2 years' experience with Kafka, Python, and Snowflake.

Benefits

  • Comprehensive medical, dental and vision plans
  • 401(k)
  • Pension
  • Life insurance
  • Parental benefits
  • Adoption assistance
  • Paid time off program with paid holidays plus 16 paid volunteer hours
  • Various wellness programs
  • Career path planning and continuing education assistance