Data Engineer

WeTransfer
Tools to move ideas.

Job details

Intro

Data engineering at WeTransfer

WeTransfer is used by millions every day - from our moms to your favourite artists. The quest to design simple tools for a wide variety of people comes with a lot of decisions. We trust our gut feeling, but we are always looking to back it up with data. That data is collected in a variety of ways and is all brought together in our data pipeline.

Our team of data engineers is responsible for the complete data pipeline, all the way from input (Snowplow, AdZerk, LaunchDarkly) to output (our Redshift data warehouse). Making sure the pipeline is fast, reliable and flexible is our number one priority.
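
To give a flavour of what that looks like in practice (purely an illustrative sketch, not our actual code), a single load step in such a pipeline could be a small Python job that copies gzipped JSON event files from S3 into Redshift. All of the names below - the bucket, table, IAM role and connection settings - are made up for the example; in practice jobs like this are orchestrated rather than run by hand.

    # Purely illustrative: load gzipped JSON event files (e.g. Snowplow enriched
    # events) from S3 into a Redshift table. Bucket, table, IAM role and
    # connection details are all hypothetical.
    import os

    import psycopg2  # Redshift speaks the Postgres wire protocol

    COPY_SQL = """
        COPY analytics.events
        FROM 's3://example-events-bucket/enriched/2020/01/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load'
        FORMAT AS JSON 'auto'
        GZIP;
    """

    def load_events() -> None:
        # One COPY per batch; the connection context manager commits on success
        # and rolls back on failure, so the load is all-or-nothing.
        conn = psycopg2.connect(
            host=os.environ["REDSHIFT_HOST"],
            port=5439,
            dbname="warehouse",
            user=os.environ["REDSHIFT_USER"],
            password=os.environ["REDSHIFT_PASSWORD"],
        )
        try:
            with conn, conn.cursor() as cur:
                cur.execute(COPY_SQL)
        finally:
            conn.close()

    if __name__ == "__main__":
        load_events()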

At the moment the data in our Redshift data warehouse is used by humans. A team of business analysts works with it, finance uses it for reporting, and our engineers use it to debug and do research. We are really happy serving those human customers, but in 2020 we want to add a new group of customers: machines. Yes, we are talking about machine learning and automated decision making. And that is where we need you: an experienced data engineer who is not afraid to experiment and try out new things.

What you’ll be doing

At the moment the data team consists of three data engineers and one team lead. There is a nice mix of team projects and individual projects that engineers work on. As data engineer number four you can expect to work on one of the big team projects, but also to take responsibility for one (or more) projects of your own.

There is also plenty of opportunity to work on projects with people from different teams and departments across the organisation. At the moment data engineers are involved in projects with ad sales, engineering, infrastructure and business analysts. No need to worry that the only thing you will do is write AWS EMR jobs.

Main requirements

We are looking for someone with:

  • Experience dealing with large amounts of data (more than a billion rows).
  • A special interest in making things robust and fault tolerant.
  • A passion for automating things.
  • Deep understanding of database design and SQL.
  • Strong programming skills (Python preferred).
  • Hands-on experience with AWS or similar (EC2, S3, Lambda, EMR, Spark, etc.).
  • Hands-on experience with analytical databases (Redshift or similar).
  • Something to teach us.

Nice to have

Even better if you have:

  • Experience with Airflow, Google Analytics, event data processing, Metabase (or other BI tools).

Perks

WeOffer

  • Diverse workplace with some pretty decent co-workers (we are humble).
  • Opportunity to join us on our mission.
  • Competitive salary and benefits.
  • Personal development (€1000,-), health (€750,-) and conference budgets.
  • Flexible work hours.
  • Catered lunch.
  • Fancy work equipment (MacBook, Bose headphones, big screen, fancy desk and even some skippy balls to sit on).

Apply now
