University of Michigan - Ann Arbor, MI

posted 5 months ago

Part-time - Entry Level
Ann Arbor, MI
Educational Services

About the position

The University of Michigan is seeking a Research Project Assistant for a temporary, one-year position with the possibility of renewal for an additional one to two years. This part-time role requires 25-30 hours per week on-site, at a pay rate of $20 to $25 per hour depending on experience.

The successful candidate will play a crucial role in a large National Institutes of Health-sponsored multi-institutional study, focusing on the design, development, and maintenance of efficient data pipelines that enable seamless data flow across research sites. The Research Project Assistant will integrate data from multiple sources, including manual entry forms, REDCap, email (both raw text and PDF attachments), and third-party services such as Zapier. The role demands strong technical skills, particularly in Python for scripting and automation, as well as experience with data integration tools and data storage solutions. The candidate will also ensure data quality, consistency, and security throughout the pipeline while collaborating with researchers and stakeholders to understand their data requirements and deliver effective solutions.

In addition to technical competencies, the position requires excellent written and verbal communication skills, as the assistant will work collaboratively within a team environment. The ideal candidate is self-motivated, capable of working independently, and demonstrates strong problem-solving abilities and attention to detail. This role is an excellent opportunity to gain experience in data engineering and project management within a research context.

Responsibilities

  • Design and develop robust data pipelines for data ingestion, processing, and storage.
  • Integrate data from various sources, including manual entry forms, REDCap, email (raw text and PDF attachments), and third-party services (e.g., Zapier).
  • Utilize Python scripts for data extraction, transformation, and loading (ETL).
  • Manage and maintain data in structured formats.
  • Develop automated reporting solutions and interactive dashboards.
  • Ensure data quality, consistency, and security throughout the pipeline.
  • Collaborate with researchers and other stakeholders to understand data requirements and deliver solutions.
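To illustrate the kind of pipeline scripting these responsibilities describe, here is a minimal extract-transform-load sketch in Python. The source data, field names, and table schema are hypothetical, invented for illustration; they are not part of the study.

```python
# Minimal ETL sketch: ingest a CSV export (e.g., from a manual entry form)
# and a JSON payload (e.g., from a third-party webhook), normalize the
# records, and load them into a structured SQLite table.
# All names and values below are illustrative only.
import csv
import io
import json
import sqlite3

CSV_EXPORT = (
    "participant_id,site,enrolled\n"
    "P001,Ann Arbor,2024-01-15\n"
    "P002,Detroit,2024-02-03\n"
)
JSON_PAYLOAD = '[{"participant_id": "P003", "site": "Flint", "enrolled": "2024-02-20"}]'

def extract() -> list[dict]:
    """Pull records from both sources into a common list of dicts."""
    rows = list(csv.DictReader(io.StringIO(CSV_EXPORT)))
    rows.extend(json.loads(JSON_PAYLOAD))
    return rows

def transform(rows: list[dict]) -> list[tuple]:
    """Normalize field order and trim whitespace before loading."""
    return [
        (r["participant_id"].strip(), r["site"].strip(), r["enrolled"].strip())
        for r in rows
    ]

def load(records: list[tuple]) -> sqlite3.Connection:
    """Store the normalized records in a structured table."""
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE enrollment (participant_id TEXT PRIMARY KEY, site TEXT, enrolled TEXT)"
    )
    conn.executemany("INSERT INTO enrollment VALUES (?, ?, ?)", records)
    return conn

conn = load(transform(extract()))
count = conn.execute("SELECT COUNT(*) FROM enrollment").fetchone()[0]
```

In a real deployment the in-memory strings would be replaced by files, API responses, or REDCap exports, and SQLite by whatever storage the study uses, but the extract/transform/load structure is the same.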

Requirements

  • Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
  • 3+ years of experience in data engineering, data pipeline development, or a related role.
  • Proven track record of building and maintaining data pipeline projects.
  • Proficiency in Python for scripting and automation.
  • Experience with data integration tools (e.g., Zapier).
  • Knowledge of data formats (XLSX, CSV, JSON) and data transformation techniques.
  • Familiarity with data storage solutions and database management.
  • Ability to design and implement ETL processes.
  • Experience with REDCap or similar data collection tools.
  • Proficiency in using form-based data entry systems.
  • Competence in parsing unstructured text.
  • Knowledge of automated reporting and dashboarding tools (e.g., Tableau, Power BI).
  • Strong problem-solving abilities and attention to detail.
  • Excellent communication skills, both written and verbal.
  • Ability to work collaboratively in a team environment.
  • Self-motivated and capable of working independently.
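As an example of the "parsing unstructured text" competency listed above, the sketch below pulls labeled fields out of a raw email body with a regular expression. The email format and field names are hypothetical, not taken from the study.

```python
# Illustration of parsing unstructured text: extract structured fields
# from a raw email body using a regular expression.
# The message format and field labels are hypothetical.
import re

RAW_EMAIL = """\
Subject: New enrollment
Participant ID: P017
Site: Ann Arbor
Enrolled: 2024-03-12
"""

# Match "Label: value" lines for the labels we care about.
FIELD_PATTERN = re.compile(r"^(Participant ID|Site|Enrolled):\s*(.+)$", re.MULTILINE)

def parse_email(body: str) -> dict:
    """Return a dict of the labeled fields found in the message body."""
    return {label: value.strip() for label, value in FIELD_PATTERN.findall(body)}

record = parse_email(RAW_EMAIL)
```

Real email parsing would also handle MIME structure and PDF attachments (as the posting mentions), but field extraction from raw text typically reduces to patterns like this.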

Nice-to-haves

  • Experience with API integration and data access.
  • Familiarity with data governance and compliance best practices.
  • Understanding of data visualization techniques and tools.
  • Experience with project management methodologies and tools.