AbbVie - Mettawa, IL

posted 5 months ago

Full-time - Mid Level
Mettawa, IL
Chemical Manufacturing

About the position

As a Data Architect at AbbVie, you will play a crucial role in ensuring compliance with applicable Corporate and Divisional Policies and procedures. You will manage multiple projects, writing Business Requirements Documents (BRDs) and Technical Design Documents (TDDs) to guide the development process. Your expertise in data collection, correlation, and analysis will be essential as you use appropriate tools to gather and interpret data effectively. You will also record and maintain technical data, which will be instrumental in developing operating and instruction manuals.

In this position, you will develop simple to complex ETL mappings in Informatica, ensuring that all business rules applied in the ETL logic align with functional/technical specification documents or other requirements documentation. You will leverage AWS services to implement end-to-end data pipelines that derive valuable insights. You will also use Informatica MDM Hub (Siperian) versions 9.x and 10.x to make necessary design and architecture changes, including configuring and fine-tuning Informatica MDM fuzzy-match logic to meet evolving business needs. Your responsibilities also include developing and testing data warehouse, BI, analytics, and ETL applications using ETL tools such as Informatica PowerCenter, and creating technical documentation, including technical specification documents, technical design documents, data flow diagrams, process diagrams, and process illustrations. Implementing batch and continuous data ingestion pipelines using AWS SQS and Python connectors will be part of your daily tasks. Collaboration is key in this role: you will work closely with various departments, architects, project managers, and technical managers to provide estimates, develop overall implementation plans, and lead the implementation of solutions.

You will implement advanced concepts such as Streams, Tasks, Clustering, and Data purge, and handle semi-structured (XML, JSON) and unstructured data as well as streaming data loads. Your analytical skills will be put to use as you identify the business benefits of alternative strategies and ensure alignment between business strategies and technology directions. You may also prepare testing plans to confirm that requirements and system designs are accurate and complete, and conduct user training sessions. Identifying process disconnects and translating them into improvement opportunities will be essential for achieving cost savings, productivity improvements, or revenue-generating business benefits. Building strong business relationships and integrating activities with other IT areas will be crucial to the successful implementation and support of project efforts. You will write SQL queries to analyze data thoroughly and present your findings to larger groups; write complex SQL, PL/SQL, and Unix shell scripts; and perform performance tuning and troubleshooting. Finally, you will analyze departmental processes and needs, recommend the most effective means of satisfying those needs, and develop data ingestion, processing, and raw data pipelines from various data sources to AWS.
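To give candidates a concrete sense of the SQS-based ingestion work described above, here is a minimal sketch of the batching step such a pipeline might use. The message shape and the `batch_records` helper are illustrative assumptions, not AbbVie's actual code; in practice the message dicts would come from `boto3`'s `sqs.receive_message`.

```python
import json

def batch_records(messages, batch_size=3):
    """Group raw SQS-style message bodies (JSON strings) into
    fixed-size batches ready for a bulk load into the warehouse."""
    records = [json.loads(m["Body"]) for m in messages]
    # Yield successive fixed-size slices; the last batch may be short.
    return [records[i:i + batch_size]
            for i in range(0, len(records), batch_size)]

# Hypothetical messages; a real pipeline would receive these from
# boto3.client("sqs").receive_message(QueueUrl=..., MaxNumberOfMessages=10).
msgs = [{"Body": json.dumps({"id": i})} for i in range(7)]
batches = batch_records(msgs)
```

With seven messages and a batch size of three, this produces two full batches and one short final batch, which a downstream loader would write to AWS storage.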

Responsibilities

  • Ensure compliance with Corporate and Divisional Policies and procedures.
  • Manage multiple projects and write Business Requirements Documents (BRDs) and Technical Design Documents (TDDs).
  • Collect, correlate, and analyze data using appropriate tools.
  • Record and maintain technical data for developing operating and instruction manuals.
  • Develop ETL mappings in Informatica and document business rules applied in ETL logic.
  • Utilize AWS services to implement end-to-end data pipelines.
  • Utilize Informatica MDM Hub (Siperian) for design and architecture changes.
  • Conduct data warehouse, BI, analytics, and ETL application development and testing using ETL tools.
  • Create technical documentation such as technical specification documents and data flow diagrams.
  • Implement batch and continuous data ingestion pipelines using AWS SQS and Python connectors.
  • Collaborate with various departments to provide estimates and develop implementation plans.
  • Implement advanced concepts such as Streams, Tasks, and Data purge.
  • Assist in the development of standards and procedures.
  • Apply standard information systems theories, concepts, and techniques.
  • Identify business benefits of alternative strategies and ensure compliance between business strategies and technology directions.
  • Prepare testing plans and conduct user training sessions.
  • Identify process disconnects and translate them into improvement opportunities.
  • Develop business relationships and integrate activities with other IT areas.
  • Write SQL queries to analyze data and present results to larger groups.
  • Write complex SQL, PL/SQL, and Unix shell scripts; perform performance tuning and troubleshooting.
  • Analyze departmental processes and needs and make recommendations for effective solutions.
  • Develop data ingestion, processing, and raw data pipelines from different data sources to AWS.

Requirements

  • Bachelor's degree in Computer Science, Applied Computer Science, Computer Engineering, Information Technology, or a related field with 5 years of related experience.
  • Alternatively, a Master's degree in the aforementioned fields plus 2 years of related experience.
  • Experience in data warehouse, BI, analytics, and ETL application development and testing using ETL tools such as Informatica PowerCenter.
  • Experience implementing batch and continuous data ingestion pipelines using AWS SQS and Python connectors.
  • Experience with Streams, Tasks, Clustering, Data purge, and handling of semi-structured (XML, JSON) and unstructured data.
  • Experience in analysis, design, development, testing, data analysis, data governance, reporting, impact analysis, application maintenance, and cloud technologies.
  • Proficiency in complex SQL, PL/SQL, Unix Shell Scripting, performance tuning, and troubleshooting.
  • Experience developing data ingestion, data processing, and raw data pipelines from different data sources to AWS.

Benefits

  • Paid time off (vacation, holidays, sick)
  • Medical, dental, and vision insurance
  • 401(k) plan
  • Short-term and long-term incentive programs