Senior DataOps Engineer

dLocal · Worldwide · 2 months ago
DevOps · Full-Time · Senior
Apply on Company Website

Job Overview

Job Title: Senior DataOps Engineer
Company: dLocal
Location: Worldwide
Job Type: Full-Time
Experience: Senior

About This Role

Headquarters: Barcelona / Madrid
URL: http://dlocal.com

Why should you join dLocal?

dLocal enables the biggest companies in the world to collect payments in 40 countries across emerging markets. Global brands rely on us to increase conversion rates and simplify payment expansion. As both a payments processor and a merchant of record where we operate, we make it possible for our merchants to make inroads into the world’s fastest-growing emerging markets.

By joining us you will be a part of an amazing global team that makes it all happen, in a flexible, remote-first dynamic culture with travel, health and learning benefits, among others. Being a part of dLocal means working with 1000+ teammates from 30+ different nationalities and developing an international career that impacts millions of people’s daily lives. We are builders, we never run from a challenge, we are customer-centric, and if this sounds like you, we know you will thrive in our team.

What’s the opportunity? 
As a Senior DataOps Engineer, you'll be a strategic professional shaping the foundation of our data platform. You’ll design and evolve scalable infrastructure on Kubernetes, operate Databricks as our primary data platform, enable data governance and reliability at scale, and ensure our data assets are clean, observable, and accessible.

What will I be doing?

    • Architect and evolve scalable infrastructure to ingest, process, and serve large volumes of data efficiently, using Kubernetes and Databricks as core building blocks.
    • Design, build, and maintain Kubernetes-based infrastructure, owning deployment, scaling, and reliability of data workloads running on our clusters.
    • Operate Databricks as our primary data platform, including workspace and cluster configuration, job orchestration, and integration with the broader data ecosystem.
    • Work on improvements to existing frameworks and pipelines to ensure performance, reliability, and cost-efficiency across batch and streaming workloads.
    • Build and maintain CI/CD pipelines for data applications (DAGs, jobs, libraries, containers), automating testing, deployment, and rollback.
    • Implement release strategies (e.g., blue/green, canary, feature flags) where relevant for data services and platform changes.
    • Establish and maintain robust data governance practices (e.g., contracts, catalogs, access controls, quality checks) that empower cross-functional teams to access and trust data.
    • Build a framework to move raw datasets into clean, reliable, and well-modeled assets for analytics, modeling, and reporting, in partnership with Data Engineering and BI.
    • Define and track SLIs/SLOs for critical data services (freshness, latency, availability, data quality signals).
    • Implement and own monitoring, logging, tracing, and alerting for data workloads and platform components, improving observability over time.
    • Lead and participate in the on-call rotation for data platforms, manage incidents, and run structured postmortems to drive continuous improvement.
    • Investigate and resolve complex data and platform issues, ensuring data accuracy, system resilience, and clear root-cause analysis.
    • Maintain high standards for code quality, testing, and documentation, with a strong focus on reproducibility and observability.
    • Work closely with the Data Enablement team, BI, and ML stakeholders to continuously evolve the data platform based on their needs and feedback.
    • Stay current with industry trends and emerging technologies in DataOps, DevOps, and data platforms to continuously raise the bar on our engineering practices.

What skills do I need?

    • Bachelor’s degree in Computer Engineering, Data Engineering, Computer Science, or a related technical field (or equivalent practical experience).
    • Proven experience in data engineering, platform engineering, or backend software development, ideally in cloud-native environments.
    • Deep expertise in Python and/or SQL, with strong skills in building data or platform tooling.
    • St

Why This Job Might Be a Good Fit

  • Fully remote, full-time position
  • Senior DevOps role at dLocal
  • Open to candidates worldwide



Frequently Asked Questions

Is this position fully remote?

Yes, this role is listed as a remote position. You can work from anywhere within the specified location requirements.

How do I apply for this job?

Click the "Apply on Company Website" button to be redirected to the official application page.

Are international applicants welcome?

Check the location requirements listed above. Some positions are restricted to specific regions.

When was this job posted?

The posting date is shown in the Quick Facts sidebar. We update our listings daily to ensure accuracy.



Quick Facts

Job Type: Full-Time
Experience: Senior
Location: Worldwide
Category: DevOps
Posted: 2 months ago