As a Data Engineer at Radiant Digital, you will play a crucial role in supporting the organization's dataset archival needs. You will work closely with OCCS Data Management Leadership and key stakeholders to develop, document, and implement metadata standards and data dictionaries, including taxonomies and data formats. You will also assist in data collection from both existing and new registries, with a focus on improving the value and sustainability of these registries by leveraging electronic data. In this position, you will import new code system data to support the curation of the Centers for Medicare & Medicaid Services (CMS) C-CDA (Consolidated Clinical Document Architecture) value sets in the Value Set Authority Center (VSAC). Collaboration will be key: you will work with both internal and external researchers on a wide variety of research data. In close collaboration with the Data Management Leadership team, you will establish data standards and conventions and develop and update data standards user guides and manuals. As an in-house subject matter expert, you will map research data to metadata standards, ontologies, and formats, drawing on a solid understanding of computer science principles such as data structures and algorithms to continually improve both your skills and the current implementation.

This role requires a minimum of 5 years of experience in data collection, data flow management, data quality, and data standards across a variety of datasets, including those from VSAC and Fast Healthcare Interoperability Resources (FHIR). You will need demonstrated experience retrieving and assembling data from multiple sources and formats, creating derived datasets, and implementing data specifications to create OCCS datasets for research purposes.
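The value set curation described above centers on FHIR ValueSet resources of the kind VSAC serves. As an illustrative sketch only (the sample resource, its URL, and its codes below are made up for demonstration and are not real VSAC content), pulling the code entries out of a ValueSet expansion in Python might look like:

```python
import json

# A minimal FHIR ValueSet resource with an expansion, in the shape a
# terminology server's $expand operation returns (sample content is
# hypothetical, not taken from VSAC).
sample_valueset = json.loads("""
{
  "resourceType": "ValueSet",
  "url": "http://example.org/fhir/ValueSet/demo",
  "expansion": {
    "total": 2,
    "contains": [
      {"system": "http://loinc.org", "code": "8867-4", "display": "Heart rate"},
      {"system": "http://loinc.org", "code": "8480-6", "display": "Systolic blood pressure"}
    ]
  }
}
""")

def extract_codes(valueset: dict) -> list[tuple[str, str, str]]:
    """Return (system, code, display) triples from a ValueSet expansion."""
    expansion = valueset.get("expansion", {})
    return [
        (entry["system"], entry["code"], entry.get("display", ""))
        for entry in expansion.get("contains", [])
    ]

codes = extract_codes(sample_valueset)
for system, code, display in codes:
    print(f"{system}|{code}: {display}")
```

In practice the resource would come from a terminology server rather than an inline string, but the extraction step is the same either way.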
Your expertise in metadata implementation using controlled vocabularies and domain standards will be essential to optimizing the user experience of data discovery systems. You will also establish best practices for interoperability, data authority control, and metadata standards, and contribute to metadata quality control, cleanup, editing, enhancement, migration, and mapping, including the use of automated routines for metadata maintenance. A basic understanding of programming languages such as Python and SQL, databases such as MySQL, Oracle, PostgreSQL, and MongoDB, and data formats such as XML and JSON is required, along with experience using version control systems, project/issue tracking systems, and continuous integration services. You will also be expected to track and evaluate new standards, technologies, and trends in data design, analysis, and strategy. Effective written and oral communication skills are essential, as is the ability to collaborate with scientists, engineers, and users.
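The automated metadata maintenance routines mentioned above often amount to mapping free-text field values onto a controlled vocabulary. A minimal sketch, assuming a hypothetical mapping table and record layout (neither is specified in this posting):

```python
# Hypothetical controlled vocabulary: maps common free-text variants to a
# canonical term. In practice this table would come from a data dictionary.
CANONICAL_TERMS = {
    "blood pressure": "Blood Pressure",
    "bp": "Blood Pressure",
    "heart rate": "Heart Rate",
    "hr": "Heart Rate",
}

def normalize_term(raw: str) -> str:
    """Map a raw metadata value to its canonical form, if one is known."""
    key = raw.strip().lower()
    return CANONICAL_TERMS.get(key, raw.strip())

def clean_records(records: list[dict]) -> list[dict]:
    """Apply vocabulary normalization to the 'measure' field of each record."""
    return [{**rec, "measure": normalize_term(rec["measure"])} for rec in records]

records = [
    {"id": 1, "measure": " BP "},
    {"id": 2, "measure": "heart rate"},
    {"id": 3, "measure": "Respiratory Rate"},  # no mapping: kept as-is
]
cleaned = clean_records(records)
```

A routine like this is typically run as a batch job over exported metadata, with unmapped values logged for a curator to review rather than silently passed through.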