Elanco Animal Health Incorporated
Bangalore, IN
Chemical Manufacturing

About the position

At Elanco (NYSE: ELAN) - it all starts with animals! As a global leader in animal health, we are dedicated to innovation and delivering products and services to prevent and treat disease in farm animals and pets. We're driven by our vision of 'Food and Companionship Enriching Life' and our approach to sustainability - the Elanco Healthy Purpose - to advance the health of animals, people, the planet and our enterprise. At Elanco, we pride ourselves on fostering a diverse and inclusive work environment. We believe that diversity is the driving force behind innovation, creativity, and overall business success. Here, you'll be part of a company that values and champions new ways of thinking, work with dynamic individuals, and acquire new skills and experiences that will propel your career to new heights. Making animals' lives better makes life better - join our team today!

Responsibilities

  • Provide data engineering support and subject matter expertise for Elanco's Enterprise Data Platforms and Data Products.
  • Monitor the health of our solutions using technologies such as Azure Data Factory, Azure Synapse Analytics, Azure Data Lake and Azure Databricks, resolving issues and delivering both operational improvements and minor enhancements.
  • Provide expertise on general data principles, services and architectures.
  • Participate in and help develop the data engineering community of practice as part of a global go-to expert panel/resource.
  • Develop and evolve new or existing data engineering methods and procedures to create alternative, agile solutions to moderately complex problems.
  • Stay abreast of new and emerging data engineering technologies, tools, methodologies and patterns on Azure and other major public clouds.
  • Demonstrate ownership in understanding the organization's strategic direction as it relates to your team and individual goals.
  • Work collaboratively and use sound judgment in supporting robust solutions while seeking guidance on complex problems.
  • Follow Elanco's corporate standards for service delivery and service management - capturing appropriate detail in ServiceNow for incidents, changes, service requests, releases and problems.
  • Work in a shift-based schedule to allow for appropriate time-zone support.

Requirements

  • Bachelor's or higher degree in Computer Science or a related discipline.
  • At least 2 years of experience in data pipeline and data product design, development and delivery, including deploying ETL/ELT solutions on Azure Data Factory.
  • Experience with Azure-native data/big-data tools, technologies and services, including Storage Blobs, ADLS, Azure SQL DB, Cosmos DB, NoSQL and SQL Data Warehouse.
  • Sound problem-solving skills in developing data pipelines using Databricks, Stream Analytics and Power BI.
  • Minimum of 2 years of hands-on experience with programming languages and Azure/Big Data technologies such as PowerShell, C#, Java, Python, Scala, SQL, ADLS/Blob, Hadoop, Spark/Spark SQL and Hive, and streaming technologies like Kafka and Event Hubs.

Nice-to-haves

  • Hands-on experience implementing data migration and data processing using Azure services: Windows/Linux virtual machines, Docker containers and Kubernetes cluster management, autoscaling, Azure Functions, serverless architecture, ARM templates, Logic Apps, Data Factory, network security groups, key management services, etc.
  • ITIL qualifications, or at least a good understanding of incident management, change management, release management, problem management, service requests and bug fixes.
  • Knowledge of the Spark 3.x framework.
  • Cloud migration methodologies and processes, including tools like Azure Data Factory, Event Hub, etc.
  • Experience in using Hadoop File Formats and compression techniques.
  • DevOps on an Azure platform.
  • Experience working with developer tools such as Visual Studio, GitLab, Jenkins, etc.
  • Familiarity with metadata management tools, techniques and technologies such as data catalogs and dictionaries, data governance, data quality, MDM and data lineage.
  • Experience with private and public cloud architectures, pros/cons, and migration considerations.
  • Proven ability to work independently.
  • Proven ability to work collaboratively in a team-oriented, problem-solving environment.
  • Excellent written and oral communication and interpersonal skills.
  • Excellent organizational and multitasking skills.

Job Keywords

Hard Skills
  • Azure Data Factory
  • Azure Functions
  • Azure Synapse Analytics
  • SQL
  • Windows Service