Senior SQL Developer / DWH Architect
Engineering Remote

Senior SQL Developer / DWH Architect

Founded in 2018, the SegmentStream platform is already trusted by leading brands across the globe, such as United Colors of Benetton, New Balance, Nespresso, MyToys, and Eventim.

Our mission is to make BI and Artificial Intelligence technologies more accessible for marketing teams to achieve their goals faster without developers and data scientists.

We are fortunate to be VC-backed by one of the world’s leading startup accelerators - Techstars - as well as some of the biggest names in the B2B SaaS world, including the founders of Pipedrive, Dynamic Yield, and other great companies.

SQL
Data Warehouses
Machine Learning
Data Mining
Data Transformations
Google BigQuery
Data Pipelines
ETL

To evolve our product and scale our business, we are looking for an experienced Senior SQL Developer / DWH Architect to join our Engineering team. You will help maintain and improve our BI and data transformation architecture, Machine Learning algorithms, and Data Mining processes, and deliver scalable, beautiful data solutions within our core product.

What we do

  • We develop high-load data pipelines that automate data collection across hundreds of different data sources.
  • We automate data transformation flows using our advanced Workflow Management System built on Kubernetes and Argo (the same stack used by Tesla, Nvidia, Google, GitHub, and other great companies).
  • We love Clouds. We use Google BigQuery as our primary data warehouse, and our whole infrastructure is based on the Google Cloud Platform.
  • We develop breakthrough solutions using Machine Learning to solve complex business problems and invent elegant applications for this technology.
  • We believe in using the right tool for the right task.
  • We are proud of the code we write. We prefer clean, structured, customisable and scalable solutions that fit all clients over quick-and-dirty, ad-hoc solutions delivered for just one client.
  • We love it when our products drive revenue for our customers - complex under the hood, but very simple for the end user.

You fit us if

  • You can turn complex business requirements into a working product that our customers will love.
  • You are proud of the SQL code that you write, but at the same time remain pragmatic and self-critical.
  • You know when to refactor and when to release.
  • You are inspired by the search for elegant solutions for complex technical problems.
  • You are passionate about Data, AI and ML.
  • You love elegant and scalable solutions and hate dirty ad-hoc work.
  • You are focused, motivated, independent and able to complete the job, no matter how difficult the task.
  • You’re empathetic, patient, and happy to help your teammates grow.

Examples of future challenges

  • Design metadata structures for our various ETL processes.
  • Troubleshoot ETL processes and resolve issues effectively.
  • Analyse data warehouse architecture goals and identify the skill requirements to meet them.
  • Improve the data schema of our Data Warehouse to make it more scalable and cheaper to process the data.
  • Design database systems and recommend performance improvements.
  • Build data transformations that migrate data from a legacy schema to a new one.
  • Develop various ETL processes and prepare OLAP cubes.
  • Provide support to all data warehouse initiatives.
  • Build reusable and customisable data transformations for visualising user-level metrics like LTV, CAC, Retention Rate, etc.

Requirements

  • You have a degree in Computer Science, Math, Statistics or similar.
  • Expert SQL knowledge.
  • 5+ years in a Data Warehouse environment with varied forms of data infrastructure, including relational databases, Hadoop, and Column Stores.
  • 5+ years of experience with Python, R, Scala, Java, or a similar data processing language.
  • Experience with BI reporting tools (Tableau, QlikView, PowerBI, Looker, Google Data Studio…).
  • Experience in Digital Marketing Analytics and Attribution is a huge bonus.
  • Experience in data pipelining technologies and ETL frameworks.
  • Experience with data analysis, processing, and validation.
  • Experience with Machine Learning is a huge bonus.
  • Proven ability to write code that solves real problems.
  • Experience implementing data solutions in public cloud technologies including configuration and deployment.
  • You value teamwork and agree with the statement that “a team is a group of people who are responsible for each other’s decisions”.
  • You write fluent, error-free English.

Interested in this position?

Leave your contact details and we'll get in touch within 8 business hours.

Apply now

Get started with SegmentStream

Request a personalized demo with our team or try SegmentStream for free today!