ADP - Parsippany-Troy Hills, NJ

posted about 2 months ago

Full-time - Senior
Parsippany-Troy Hills, NJ
Professional, Scientific, and Technical Services

About the position

ADP is seeking a Principal Data and AI Platform Engineer to join our dynamic team in Parsippany, New Jersey. In this role, you will be an integral part of a scrum team, transforming innovative designs and ideas into functional web applications for both internal and external clients. Your work will have a significant impact, supporting organizations ranging from small businesses to large enterprises with thousands of employees. You will collaborate closely with clients, product managers, architects, and software engineers to plan, design, develop, test, and implement solutions that enhance operational efficiency and user experience.

This is a hands-on leadership role: you will design, develop, debug, and deploy software solutions, guiding your team through the complexities of software development. You will also mentor junior developers, helping them grow their skills and navigate their projects, and your leadership will be crucial in prioritizing user stories and aligning them with the technical interests of your team members. This position offers exciting challenges and opportunities for career growth within a supportive and collaborative environment.

Your daily responsibilities will include researching new AI technologies, collaborating with various teams to create production-grade AI and ML deployment architectures, and developing GenAI applications using large language models. You will write Python code for data processing tasks, build CI/CD pipelines for ML models, and provide hands-on engineering support. Staying current with AI and ML trends will be essential, as you will identify opportunities for implementing new technologies and models. You will follow an agile development methodology, ensuring that all projects adhere to the Software Development Lifecycle (SDLC) framework.

Responsibilities

  • Research new AI technologies, including cloud managed AI/ML services and generative AI services.
  • Collaborate with Architecture, Infrastructure, and Data Science teams to create production-grade AI and ML deployment architectures.
  • Develop GenAI applications using open-source or proprietary large language models (LLMs).
  • Write Python code in Databricks Notebook for data preprocessing and application orchestration.
  • Build CI/CD pipelines for ML models and data engineering pipelines with a focus on automation and monitoring.
  • Provide hands-on engineering support, including coding, testing, debugging, and deployment of solutions.
  • Evaluate and select third-party vendors, tools, and services for AI solution development.
  • Communicate architectures and technical details effectively with partners.
  • Stay updated with the latest trends in AI and ML and identify opportunities for implementation.
  • Follow an agile development methodology and adhere to the Software Development Lifecycle (SDLC) framework.

Requirements

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field preferred.
  • Minimum of 5 years of experience in development, focusing on cloud computing and AI technologies.
  • Strong knowledge of cloud platforms (e.g., AWS, Azure, Google Cloud) and services.
  • Hands-on experience with cloud services such as EC2, RDS, IAM, Security Groups, and VPC.
  • Proven experience in designing and implementing efficient data processing pipelines.
  • Experience in selecting cloud services based on performance characteristics and operational burden.
  • Ability to design scalable streaming architectures to optimize for latency and cost.
  • Experience in developing configurable frameworks for big data engineers and data scientists.
  • Experience in AI/ML operations for data classification and labeling.
  • Understanding of AI/ML features provided by Databricks and Amazon SageMaker.
  • Good understanding of search and vector search techniques, with hands-on experience in Amazon OpenSearch and Kendra.
  • Experience with graph or ontology databases such as Amazon Neptune.
  • Proficiency in building automations and integrations using CI/CD pipelines with tools like Jenkins and Terraform.
  • Experience with programming languages such as Python and familiarity with software development methodologies.

Nice-to-haves

  • Familiarity with data privacy, security, and compliance requirements in cloud and AI solutions.
  • Continuous learning mindset and a passion for staying updated with cloud and AI technologies.

Benefits

  • Competitive salary and performance bonuses.
  • Comprehensive health insurance coverage.
  • 401(k) retirement savings plan with company matching.
  • Flexible work hours and remote work options.
  • Professional development opportunities and tuition reimbursement.
  • Paid time off and holidays.
  • Diversity and inclusion programs.