Overview
Quartile, the world's largest retail media optimization platform, is a trusted partner for multichannel e-commerce success.
Through unmatched expertise and patented AI technology, we fuel growth for 5,300+ brands and sellers worldwide and manage an annual ad spend exceeding $2 billion.
The award-winning platform covers major marketplaces and ad channels for optimal reach.
The result is unprecedented granularity, smarter budgeting, and bespoke solutions for retailers.
Quartile is proud to be an equal opportunity employer with employees stemming from a wide range of backgrounds and experiences.
As a business, we value the enrichment that diversity brings to our organization and are committed to a culture that creates a sense of inclusion and belonging.
We welcome new perspectives and affirm that all employment decisions are made without regard to race, color, ancestry, religion, national origin, age, familial or marital status, sex, sexual orientation, pregnancy, gender identity or expression, disability, genetic information, veteran status, or any other classification protected by federal, state, or local law.
What You'll Do
Develop data pipelines in Python using APIs from world-leading marketing companies
Create monitoring systems for pipelines
Design architecture for new ETLs in Databricks or Azure Functions
Consume data from Azure Data Lake to generate requests to APIs
Develop integration tests for QA
Requirements
Experience with PySpark, Spark, or Scala
Experience with relational databases (MySQL, PostgreSQL, AWS RDS)
Profiling backend services and databases and improving their performance
Experience with testing (unit, manual, integration, E2E) and good documentation practices
Solid experience with Python (APIs, data modeling, classes)
Knowledge of Cloud Architecture (Azure or AWS)
Experience working with Data Lake / Delta Lakehouse architectures
Proficiency in English (advanced to fluent level) is required
Innovative behavior
Experience with Agile methodologies
Preferred Qualifications
Experience with Machine Learning for engineering
Experience with Databricks
Experience with Terraform
Experience with OAuth Authentication Workflow
Experience with Topics and Queues Architecture (Kafka, Event Hub, Service Bus)
Experience with object-oriented languages such as Scala or Java
Understanding of the e-commerce and marketing domain
Good teamwork and positive mindset
What You'll Need to Know
This is a PJ (pessoa jurídica) contract based in Brazil.
Benefits
Fully remote work
The opportunity to join an international team
Career growth opportunities
Generous Paid Time Off
Data Engineer • Londrina, Paraná, Brazil