Verizon Communications - Irving, TX

posted about 2 months ago

Full-time - Principal
Remote - Irving, TX
Telecommunications

About the position

The Principal Data Engineer at Verizon is a pivotal role responsible for expanding and optimizing the company's data assets, data pipeline architecture, data flow, and data curation. The position enables various network programs, including Network Performance Experience, Operational Excellence, and Workforce Optimization. The role combines IT expertise with architecture and design, requiring a comprehensive understanding of network service assurance and network operations data. This data is used for Business Analytics, Operational Analytics, Text Analytics, Data Services, and the development of Big Data Solutions for different Verizon Business units.

In this role, you will define and drive end-to-end data pipeline deployments, including creating a roadmap, designing the architecture, and ensuring execution excellence with metrics focused on business outcomes. You will also drive data harmonization components that generate business value for network services and operations, enabling timely, proactive action.

Additionally, you will support the building of prototypes and proofs of concept to validate integrated technologies and products, and then lead these POCs through to implementation. Oversight of and collaboration on pipeline implementations will be essential to ensure governance, quality, and compliance. You will also lead junior team members, enhancing their technical and functional skills.

Drawing on your in-depth understanding of data warehousing technologies and broad knowledge of both on-premises and cloud deployments, you will help shape the future of the enterprise-wide network big data ecosystem. Meeting business objectives will involve delivering high-quality, on-time, and on-budget solutions by leveraging a global talent pool, including employees, T&Ms, and SOW labor. You will work closely with onshore teams to oversee the execution of tasks such as the development and population of data/ETL pipelines, testing results with end users, and providing operational support.

Responsibilities

  • Expand and optimize data assets, data pipeline architecture, data flow, and data curation.
  • Define and drive end-to-end data pipeline deployments, including roadmap and design.
  • Ensure execution excellence with business outcome-oriented metrics.
  • Drive data harmonization components for business value generation.
  • Support the building of prototypes and proof of concepts for integrated technologies.
  • Provide oversight and collaborate on pipeline implementations to ensure governance and quality.
  • Lead junior team members in technical and functional skills development.
  • Utilize knowledge of data warehousing technologies across the enterprise-wide network big data ecosystem.
  • Deliver high-quality, on-time, and on-budget solutions utilizing a global talent pool.
  • Work with onshore teams to oversee execution of data/ETL pipeline development and operational support.

Requirements

  • Bachelor's degree or four or more years of work experience.
  • Six or more years of relevant work experience.
  • Strong communication, presentation, and influencing skills.
  • Experience collaborating with data engineers, architects, data scientists, and enterprise platform teams.

Nice-to-haves

  • Bachelor's degree in Computer Science or related Engineering field and 10+ years of work experience.
  • BS/MS in Computer Science, Information Science, Engineering, or related fields.
  • Programming experience in back-end systems development with database technologies (SQL, NoSQL).
  • Hands-on experience in designing and building data pipelines using Python, Spark, Flink, or Java.
  • Fluent understanding of best practices for building Data Lake and analytical architectures on Cloud Big Data toolsets, preferably Google Cloud Platform and Hadoop.
  • Strong ability to apply a business mindset to data issues and initiatives.
  • Knowledge of data governance practices and emerging trends.
  • Familiarity with Agile Development methodologies and enabling tools (Jira, CI/CD, DevOps).
  • Experience identifying areas for performance improvement and collaborating with Engineering/Planning/Operations.