Data Modeler

  • Tasks

    • Collaborate with our clients to design and build modern data platforms using a variety of technologies
    • Help lead the design and implementation of complex, cloud-based data ingestion and transformation pipelines
    • Implement scalable and secure Data Lakes / Warehouses
    • Mentor and upskill other engineers, both client and internal
    • Help drive effective development patterns and delivery practices
    • Help maintain and improve our internal tools and design patterns
    • Continually improve through our internal development program, including mentoring and paid training/certifications

  • Requirements

    • Highly proficient with Python
    • Extensive experience developing on AWS, including a broad understanding of the service offerings for that provider
    • Excellent knowledge of software development best practices
    • Experience building CI/CD pipelines using Jenkins
    • Experience using Infrastructure-as-Code tools (e.g. Terraform, CloudFormation)
    • Advanced SQL skills, including query optimisation
    • Experience with Data Vault is a must
    • Experience with Azure is a must
    • Extensive experience with structured, unstructured, and semi-structured data
    • Experience working within an Agile environment (SAFe preferred), as well as a strong understanding of Agile principles and delivery practices
    • Working knowledge of data regulations (e.g. GDPR)
    • Very strong communication skills, including stakeholder management up to CXO level
    • Preferably, experience working with the following:
      • Containers (Docker preferred)
      • Real-time/event-based data
      • Big data products
      • Data quality frameworks
      • Orchestration tools (e.g. Airflow)

    This is a contract position that is only available to candidates authorised to work in the EU.
