Associate Data Migrations Engineer
Indicium AI
Enterprise AI, Delivered.
Description
As an Associate Data Migrations Engineer at Indicium AI, you will join a team of data practitioners building scalable, reliable data solutions for leading organisations across Europe. Working within client engagements, you will contribute to the design and development of data pipelines, support data integration projects, and grow your expertise under the guidance of experienced engineers. This is an opportunity to establish strong data engineering fundamentals while making a tangible impact on clients' data platforms from day one.

Data Ingestion & Integration
- Contribute to data ingestion workflows, pulling from relational and NoSQL databases, REST APIs, and flat files with attention to data quality and consistency.
- Support the design and development of ELT pipelines, following team standards and data engineering best practices.
- Assist with monitoring pipeline executions across cloud and on-premises environments, escalating issues as appropriate.
- Assist in implementing and maintaining data storage solutions, including data warehouses and data lakes.
- Write and optimise SQL queries to support downstream analytics and reporting use cases.
- Work closely with cross-functional teams (analysts, engineers, and client stakeholders) to understand requirements and deliver fit-for-purpose solutions.
- Maintain clear documentation for pipelines, data models, and processes, contributing to a culture of engineering rigour.
- Actively engage with DevOps and DataOps practices, adopting team patterns around CI/CD, version control, and code review.
Requirements
- 1–2 years of hands-on experience in Python (primary) or another language such as Java, Scala, or Ruby.
- Solid intermediate SQL: comfortable with joins, aggregations, and window functions.
- Working knowledge of Git via GitHub, GitLab, or Bitbucket.
- Basic familiarity with dbt: models, sources, and the test/documentation workflow.
- Foundational understanding of at least one public cloud platform (AWS, GCP, or Azure).
- Basic understanding of data structures and algorithms.
- Intellectual curiosity and eagerness to learn in a fast-paced consulting environment.
- Strong written and spoken English.
- Comfortable asking questions, contributing to team discussions, and receiving technical feedback.
Nice to have
- Exposure to orchestration tools such as Apache Airflow or Dagster.
- Awareness of distributed processing frameworks (Spark, Dask, or similar).
- Prior experience in a consulting or client-facing environment.
Perks
- Holiday Entitlement: 25 days holiday plus bank holidays.
- Learning & Development: €1,500 training budget plus 5 training days.
- Company Bonus: Discretionary company and personal bonus paid quarterly.
- Pension Scheme.
- Choose Your Kit: Select from a range of laptops and accessories.
- Social Events: meetups, squad events, summer and Christmas events, and more.
- And many others.