Duneolas is hiring: Data Engineer
Description
About Us
Duneolas is the next transformative leap in the evolution of the practice management (EHR) systems used in healthcare. Our unique team of experienced IT and healthcare professionals aims to deliver a software and infrastructure platform that not only outperforms existing legacy systems on established practice workflows but also provides the foundation for the next paradigm shift in primary care technology. Duneolas will not be a typical EHR but rather something we like to call an IEHR, or intelligent electronic health record. The intelligence comes both from our frontend, which promotes higher-quality data capture, and from our innovative backend, which enables next-level analysis of that data. We believe the coming decades will bring enormous improvements in healthcare's ability to leverage data, raising both the standard of care delivered to patients and the efficiency of every healthcare team member. Duneolas will be an accelerator of this positive change.
Duneolas Ltd has recently closed our seed round at close to a €4 million valuation. Our founding team has significant knowledge and credibility in the target market, having already seen success with Medvault Health in the same space; this is not our first rodeo. Duneolas is now building out our core engineering team to ensure we can execute on our ambitious plan. This is a fully remote role and will require an individual who is self-disciplined, determined, and driven to succeed and grow as an engineer. The data engineer role is a unique opportunity for a high-calibre engineer to get in on the ground floor with an excellent team working on a truly groundbreaking project.
Requirements
Key Responsibilities:
- Lead end-to-end data migration projects, ensuring the seamless transfer of data from one system to another.
- Develop a robust data mapping strategy to transform and migrate data from MSSQL to PostgreSQL, accounting for differences in data types, constraints, and relationships.
- Build ETL processes to efficiently migrate data between systems.
- Ensure that migrated data maintains integrity, completeness, and accuracy; implement data validation checks and troubleshoot any discrepancies (a minimal validation sketch follows this list).
- Oversee the infrastructure supporting data processes, ensuring optimal performance, reliability, and scalability.
- Work within the AWS ecosystem, using AWS services for efficient data processing and storage.
- Leverage open-source technologies to build robust and cost-effective data solutions.
- Maintain thorough documentation of the migration process, including architecture, schemas, transformation rules, and testing procedures.
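To give a flavour of the validation work described above, here is a minimal sketch (our illustration, not Duneolas production code) that compares per-table row counts between a source MSSQL database and a target PostgreSQL database. The connection strings, table names, and ODBC driver name are placeholder assumptions, and the sketch presumes the pyodbc and psycopg2 packages are installed.

```python
import pyodbc
import psycopg2

# Hypothetical table list and placeholder credentials -- not Duneolas systems.
TABLES = ["patients", "appointments", "prescriptions"]

mssql = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=source-host;"
    "DATABASE=legacy_db;UID=user;PWD=secret;TrustServerCertificate=yes"
)
pg = psycopg2.connect("host=target-host dbname=new_db user=user password=secret")

def row_count(cursor, table: str) -> int:
    # COUNT(*) works on both engines; table names come from a trusted list,
    # so simple string interpolation is acceptable for this sketch.
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

src, dst = mssql.cursor(), pg.cursor()
for table in TABLES:
    src_n, dst_n = row_count(src, table), row_count(dst, table)
    status = "OK" if src_n == dst_n else "MISMATCH"
    print(f"{table}: source={src_n} target={dst_n} [{status}]")

mssql.close()
pg.close()
```

A real validation pass would go beyond row counts, for example adding per-column checksums and spot checks on transformed values.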
Qualifications:
- 4+ years of experience in database management and data migration, particularly in MSSQL and PostgreSQL environments.
- Solid understanding of computer science fundamentals and software engineering principles.
- Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
- In-depth understanding of data modeling techniques and the ability to translate schemas between different database systems.
- Proven track record of successfully planning and executing data migration projects.
- Hands-on experience with tools and methodologies for data migration, ensuring a smooth transition while maintaining data integrity.
- Familiarity with data orchestration tools, such as Apache Airflow, and the ability to design and manage complex data workflows (a minimal DAG sketch follows this list).
- Experience in designing, implementing, and optimizing database schemas, queries, and indexing strategies.
- Familiarity with agile methodologies, sprint planning, and retrospectives.
- Proficiency with version control systems, especially Git.
- Ability to work in a fast-paced startup environment and adapt to changing requirements.
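For context on the orchestration point above, here is a minimal Airflow DAG sketch wiring extract, transform, and validate steps for a nightly migration batch. All names are hypothetical, the task bodies are placeholders, and it assumes Airflow 2.4+ (which accepts the `schedule` argument).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    ...  # placeholder: pull a batch of rows from the legacy MSSQL system

def transform():
    ...  # placeholder: map MSSQL types and constraints onto the PostgreSQL schema

def validate():
    ...  # placeholder: run row-count and checksum checks on the migrated batch

with DAG(
    dag_id="mssql_to_postgres_migration",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="validate", python_callable=validate)
    # Each step runs only after the previous one succeeds.
    t1 >> t2 >> t3
```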
Bonus Skills:
- Experience with data migration tools (pg_dump, pgloader, Airbyte, AWS DMS, AWS Glue); a brief pgloader illustration follows this list.
- Experience running database changes through CI/CD pipelines (e.g., Liquibase).
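As a brief illustration of one of these tools, here is a one-shot pgloader invocation wrapped in Python. The connection URIs are placeholders; consult the pgloader documentation for MSSQL-specific casting rules and options before relying on this form.

```python
import subprocess

# Illustrative only: ask pgloader to copy a legacy MSSQL database into
# PostgreSQL in one shot. Both URIs below are placeholders.
result = subprocess.run(
    [
        "pgloader",
        "mssql://user:secret@source-host/legacy_db",
        "postgresql://user:secret@target-host/new_db",
    ],
    capture_output=True,
    text=True,
)
print(result.stdout)
result.check_returncode()  # raise if pgloader exited with an error
```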