Provectus · posted about 2 months ago
Full-time

About the position

Provectus is a leading AI consultancy and solutions provider specializing in Data Engineering and Machine Learning. With a focus on helping businesses unlock the power of their data, we leverage the latest technologies to build innovative data platforms that drive results. Our Data Engineering team consists of top-tier professionals who design, implement, and optimize scalable, data-driven architectures for clients across various industries. Join us if you share our passion for building products with AI/ML technologies, cloud services, and data engineering.

As a Data Solutions Architect, you will lead the design, architecture, and implementation of large-scale data solutions for our clients. You will act as a strategic technical leader, collaborating with cross-functional teams to deliver innovative data platforms that drive business value.

Responsibilities

  • Lead high-impact customer engagements focused on AWS Data Platform solutions.
  • Define and drive technical strategies that align AWS capabilities with customer objectives, incorporating Databricks, GCP, or Azure where appropriate.
  • Architect and design scalable data platforms using AWS, ensuring optimal performance, security, and cost-efficiency.
  • Integrate AWS services with other solutions (Databricks, Snowflake, GCP, or Azure) as needed, selecting the right technologies and tools to meet customer needs.
  • Develop and maintain comprehensive architectural documentation aligned with organizational technical standards.
  • Partner with the sales team, providing technical expertise to position AWS-based data solutions effectively.
  • Participate in customer meetings to assess technical needs, scope solutions, and identify growth opportunities.
  • Create technical proposals, solution architectures, and presentations to support sales efforts and align with customer expectations.
  • Assist in responding to RFPs/RFIs with accurate technical input, aligning solutions to client requirements.
  • Showcase AWS capabilities through proofs of concept (POCs) and technical demonstrations of proposed solutions.
  • Build and maintain strong relationships with key customer stakeholders, acting as a trusted advisor for data platform initiatives.
  • Lead discovery workshops to understand customer requirements, KPIs, and technical constraints.
  • Oversee the end-to-end implementation of AWS-based data platforms, coordinating with engineering teams to ensure successful delivery.
  • Manage technical risks and develop mitigation strategies.
  • Stay up-to-date with the latest developments in AWS, Databricks, GCP, Azure, and cloud technologies.
  • Develop and promote best practices in data platform architecture, data pipelines, and data governance.
  • Collaborate with AI/ML teams to integrate advanced analytics and machine learning capabilities into AWS and other cloud platforms.
  • Work with DevOps teams to implement CI/CD pipelines and automation for data workflows.
  • Mentor junior architects and engineers, fostering a culture of continuous learning and professional development.
  • Contribute to knowledge-sharing initiatives through technical blogs, case studies, and industry event presentations.
  • Ensure that AWS-based data platform solutions comply with relevant security standards and regulations.
  • Implement data governance frameworks to maintain data quality and integrity.

Requirements

  • Experience in data solution architecture.
  • Proven experience in designing and implementing large-scale data engineering solutions on AWS.
  • Experience with Databricks, GCP, or Azure solutions is required.
  • Deep expertise in AWS platform services, including S3, EC2, Lambda, EMR, Glue, Redshift, Amazon MSK, and EKS.
  • Proficiency in programming languages such as Python, SQL, and Scala.
  • Experience with data warehousing, ETL processes, and real-time data streaming.
  • Familiarity with open-source technologies and tools in data engineering.
  • AWS Certified Solutions Architect – Professional (or similar) is required.
  • Excellent communication and presentation skills, with the ability to convey complex technical concepts to non-technical stakeholders.
  • Strong leadership and project management skills.
  • Ability to work collaboratively in a cross-functional team environment.

Nice-to-haves

  • Experience in the Healthcare and Biotech domains.
  • Certifications in Databricks, GCP, or Azure.
  • Experience with the AWS Migration Acceleration Program (MAP).
  • Experience with AI/ML integration in data platforms.
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • Contributions to open-source projects or active participation in the data engineering community.

Job Keywords

Hard Skills
  • AWS Glue
  • Azure Databricks
  • Python
  • Scala
  • SQL