22nd Century Technologies - Richmond, VA


Full-time - Mid Level
5,001-10,000 employees
Professional, Scientific, and Technical Services

About the position

The Data Analyst/Cloud ETL Developer will play a crucial role in supporting teams working in an Agile environment to analyze datasets that will be made available in a cloud-based data management platform. This platform is essential for the agency to produce master data with a focus on data governance.

The analyst will be responsible for analyzing source systems that contain a spatial component for candidate datasets, documenting business processes and the data lifecycle, and developing data requirements, user stories, and acceptance criteria. The role also involves developing testing strategies, building ETL processes to extract business and spatial data and load it into a data warehousing environment, and designing and testing the system's performance. The position requires collaboration with various teams to understand the company's data storage needs and to develop data warehousing options.

The analyst will need deep knowledge of coding languages such as Python, Java, XML, and SQL, as well as a strong understanding of warehousing architecture techniques including MOLAP, ROLAP, ODS, DM, and EDW. Responsibilities also include profiling source data, conducting entity resolution, managing metadata, and creating clear and concise user stories that are easy for technical staff to implement. The analyst will assist the Product Owner in maintaining the product backlog and will be involved in user acceptance testing to ensure that the project meets business requirements.

Responsibilities

  • Work with the Project team members and business stakeholders to understand business processes and pain points
  • Develop expertise in source system datasets and data lifecycle
  • Profile source data which may contain a spatial component; review source data and compare content and structure to dataset requirements; identify conflicts and determine recommendations for resolution
  • Conduct entity resolution to identify matching, merging, and semantic conflicts
  • Elicit, record, and manage metadata
  • Diagram current processes and proposed modifications using process flows, context diagrams and data flow diagrams
  • Decompose requirements into Epics and Features and create clear and concise user stories that are easy to understand and implement by technical staff
  • Utilize progressive elaboration; map stories to data models and architectures to be used by internal staff to facilitate master data management
  • Identify and group related user stories into themes, document dependencies and associated business processes
  • Elicit and document requirements and user stories with a focus on improving both business and technical processing
  • Assist Product Owner in maintaining the product backlog
  • Create conceptual prototypes and mock-ups
  • Collaborate with staff, vendors, consultants, and contractors as they are engaged on tasks to formulate, detail and test potential and implemented solutions
  • Perform Quality Analyst functions such as defining test objectives, test plans and test cases, and executing test cases
  • Coordinate and facilitate User Acceptance Testing with the business and ensure Project Managers/Scrum Masters are informed of progress
  • Design and develop systems for the maintenance of the Data Asset Program (Data Hub), ETL processes, ETL processes for spatial data, and business intelligence
  • Develop new data engineering processes that leverage a new cloud architecture, and extend or migrate existing data pipelines to this architecture as needed
  • Design and support the DW database and table schemas for new and existing data sources for the data hub and warehouse; design and develop Data Marts
  • Work closely with data analysts, data scientists, and other data consumers within the business to gather and populate data hub and data warehouse table structures optimized for reporting
  • Partner with Data modeler and Data architect to refine the business's data requirements, which must be met for building and maintaining Data Assets.

Requirements

  • Advanced understanding of data integrations
  • Strong knowledge of database architectures
  • Strong understanding of ingesting spatial data
  • Ability to negotiate and resolve conflicts
  • Ability to effectively prioritize and handle multiple tasks and projects
  • Excellent computer skills and proficiency in MS Word, PowerPoint, MS Excel, MS Project, MS Visio, and MS Team Foundation Server
  • Experience with key data warehousing architectures including Kimball and Inmon
  • Broad experience designing solutions across a diverse set of data technologies
  • Expertise in Data Factory v2, Data Lake Store, Data Lake Analytics, Azure Analysis Services, Azure Synapse
  • Experience with IBM Datastage, Erwin, SQL Server (SSIS, SSRS, SSAS), ORACLE, T-SQL, Azure SQL Database, Azure SQL Datawarehouse
  • Scripting experience with Windows, Linux shell, and/or Python
  • Experience in Azure cloud engineering