Elanco Animal Health Incorporated · Posted 27 days ago
Bangalore, IN
Chemical Manufacturing

About the position

At Elanco (NYSE: ELAN) - it all starts with animals! As a global leader in animal health, we are dedicated to innovation and delivering products and services to prevent and treat disease in farm animals and pets. We're driven by our vision of 'Food and Companionship Enriching Life' and our approach to sustainability - the Elanco Healthy Purpose - to advance the health of animals, people, the planet and our enterprise. At Elanco, we pride ourselves on fostering a diverse and inclusive work environment. We believe that diversity is the driving force behind innovation, creativity, and overall business success. Here, you'll be part of a company that values and champions new ways of thinking, work with dynamic individuals, and acquire new skills and experiences that will propel your career to new heights. Making animals' lives better makes life better - join our team today!

Responsibilities

  • Provide data engineering support and subject matter expertise for Elanco's Enterprise Data Platforms and Data Products.
  • Monitor the health of our solutions using technologies such as Azure Data Factory, Azure Synapse Analytics, Azure Data Lake and Azure Databricks, resolving issues and delivering both operational improvements and minor enhancements.
  • Provide expertise on general data principles, services and architectures.
  • Participate in and help develop the data engineering community of practice, serving as a global go-to expert panel/resource.
  • Develop and evolve new or existing data engineering methods and procedures to create alternative, agile solutions to moderately complex problems.
  • Stay abreast of new and emerging data engineering technologies, tools, methodologies and patterns on Azure and other major public clouds.
  • Demonstrate ownership in understanding the organization's strategic direction as it relates to your team and individual goals.
  • Work collaboratively and use sound judgment in supporting robust solutions while seeking guidance on complex problems.
  • Follow Elanco's corporate standards for service delivery and service management - capturing appropriate detail in ServiceNow for incidents, changes, service requests, releases and problems.
  • Work a shift-based schedule to provide appropriate time-zone support.

Requirements

  • Bachelor's degree or higher in Computer Science or a related discipline.
  • At least 2 years of experience in data pipeline and data product design, development and delivery, including deploying ETL/ELT solutions on Azure Data Factory.
  • Experience with Azure-native data and big-data tools, technologies and services, including Blob Storage, ADLS, Azure SQL DB, Cosmos DB, NoSQL and SQL Data Warehouse.
  • Sound problem-solving skills in developing data pipelines using Databricks, Stream Analytics and Power BI (see the short sketch after this list).
  • Minimum of 2 years of hands-on experience with programming languages and Azure/big-data technologies such as PowerShell, C#, Java, Python, Scala, SQL, ADLS/Blob, Hadoop, Spark/Spark SQL and Hive, and streaming technologies such as Kafka and Event Hubs.
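
To make the pipeline expectations above concrete, here is a minimal, illustrative PySpark sketch of the kind of ETL step this role involves. It assumes a Spark environment such as Azure Databricks with ADLS access already configured; the storage account, container and column names are placeholders, not actual Elanco resources.

    # Minimal, illustrative PySpark ETL step (hypothetical paths and columns).
    # Assumes a Spark session on an Azure Databricks cluster with ADLS access configured.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example-sales-etl").getOrCreate()

    # Read raw CSV files from a hypothetical ADLS Gen2 container.
    raw = (spark.read
           .option("header", "true")
           .csv("abfss://raw@examplestorage.dfs.core.windows.net/sales/2024/"))

    # Basic cleansing: cast types, drop duplicate orders, stamp the load date.
    cleaned = (raw
               .withColumn("amount", F.col("amount").cast("double"))
               .dropDuplicates(["order_id"])
               .withColumn("load_date", F.current_date()))

    # Write the curated output as Parquet, partitioned by load date.
    (cleaned.write
     .mode("overwrite")
     .partitionBy("load_date")
     .parquet("abfss://curated@examplestorage.dfs.core.windows.net/sales/"))

In Azure Data Factory, a step like this would typically run as a Databricks notebook or job activity inside a scheduled pipeline.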

Nice-to-haves

  • Hands-on experience implementing data migration and data processing using Azure services: Windows/Linux virtual machines, Docker containers and Kubernetes cluster management, autoscaling, Azure Functions, serverless architecture, ARM templates, Logic Apps, Data Factory, network security groups, key management services, etc.
  • ITIL qualifications, or at least a good understanding of incident management, change management, release management, problem management, service requests and bug fixes.
  • Knowledge of the Spark 3.x framework.
  • Cloud migration methodologies and processes, including tools such as Azure Data Factory, Event Hubs, etc.
  • Experience using Hadoop file formats and compression techniques (see the short sketch after this list).
  • DevOps on the Azure platform.
  • Experience with developer tools such as Visual Studio, GitLab, Jenkins, etc.
  • Familiarity with metadata management tools, techniques and technologies such as Data Catalog & Dictionaries, Data Governance, Data Quality, MDM, and Data Lineage.
  • Experience with private and public cloud architectures, pros/cons, and migration considerations.
  • Proven ability to work independently.
  • Proven ability to work in a team-oriented environment and work collaboratively in a problem-solving environment.
  • Excellent written and oral communication and interpersonal skills.
  • Excellent organizational and multi-tasking skills.
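
As a small, purely illustrative sketch of the Hadoop file formats and compression techniques mentioned above, the snippet below writes the same tiny DataFrame as Parquet and ORC with explicit codecs; the sample data and output paths are placeholders.

    # Illustrative only: common Hadoop file formats with explicit compression codecs.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("format-compression-demo").getOrCreate()
    df = spark.createDataFrame([(1, "widget", 9.99), (2, "gadget", 4.50)],
                               ["order_id", "item", "amount"])

    # Parquet with Snappy compression (Spark's default codec for Parquet).
    df.write.mode("overwrite").option("compression", "snappy").parquet("/tmp/demo_parquet")

    # ORC with Zlib compression, another common columnar format in the Hadoop ecosystem.
    df.write.mode("overwrite").option("compression", "zlib").orc("/tmp/demo_orc")

The choice of format and codec typically trades off query performance, storage cost and compatibility with downstream tools.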

Job Keywords

Hard Skills
  • Azure Data Factory
  • Azure Functions
  • Azure Synapse Analytics
  • SQL
  • Windows Service