Sr. Data Engineer

As a Senior Data Engineer at Oscilar, you will design, build, and maintain the data infrastructure that powers our AI-driven decisioning and risk management platform. You will collaborate closely with cross-functional teams to deliver highly reliable, low-latency, and scalable data pipelines and storage solutions that support real-time analytics and mission-critical ML/AI models.

Responsibilities

  • Architect and implement scalable ETL and data pipelines spanning ClickHouse, Postgres, Athena, and diverse cloud-native sources to support real-time risk management and advanced analytics for AI-driven decisioning (a simplified ingestion sketch follows this list).
  • Design, develop, and optimize distributed data storage solutions to ensure both high performance (low latency, high throughput) and reliability at scale—serving mission-critical models for fraud detection and compliance.
  • Drive schema evolution, data modeling, and advanced optimizations for analytical and operational databases, including sharding, partitioning, and pipeline orchestration (batch, streaming, CDC frameworks).
  • Own the end-to-end data flow: integrate multiple internal and external data sources, enforce data validation and lineage, automate and monitor workflow reliability (CI/CD for data, anomaly detection, etc.).
  • Collaborate cross-functionally with engineers, product managers, and data scientists to deliver secure, scalable solutions that enable fast experimentation and robust operationalization of new ML/AI models.
  • Champion radical ownership—identify opportunities, propose improvements, and implement innovative technical and process solutions within a fast-moving, remote-first culture.
  • Mentor and upskill team members, cultivate a learning environment, and contribute to a collaborative, mission-oriented culture.
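
To make the pipeline work concrete, here is a minimal, purely illustrative sketch of a Kafka-to-ClickHouse micro-batch loader in Python. The broker address, the risk_events topic, the table schema, and the batch size are assumptions for the example, not Oscilar's actual stack.

```python
# A minimal sketch (not Oscilar's actual pipeline) of a Kafka -> ClickHouse
# micro-batch loader. Broker, topic, schema, and batch size are assumptions.
import json
from datetime import datetime

import clickhouse_connect
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumed broker address
    "group.id": "risk-events-loader",
    "auto.offset.reset": "earliest",
    "enable.auto.commit": False,             # commit only after a successful insert
})
consumer.subscribe(["risk_events"])          # assumed topic of JSON events

# Assumes a pre-created table:
#   CREATE TABLE risk_events (event_id String, user_id String,
#                             score Float64, ts DateTime)
#   ENGINE = MergeTree ORDER BY ts
ch = clickhouse_connect.get_client(host="localhost")

BATCH_SIZE = 500
batch = []
try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Kafka error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        batch.append([
            event["event_id"],
            event["user_id"],
            float(event["score"]),
            datetime.fromisoformat(event["ts"]),
        ])
        if len(batch) >= BATCH_SIZE:
            # Load one micro-batch, then commit offsets so replays stay bounded.
            ch.insert("risk_events", batch,
                      column_names=["event_id", "user_id", "score", "ts"])
            consumer.commit(asynchronous=False)
            batch.clear()
finally:
    consumer.close()
```

The micro-batching matters because ClickHouse performs best with fewer, larger inserts rather than many single-row writes.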

Qualifications

  • 5+ years in data engineering (or equivalent), including architecting and operating production ETL/ELT pipelines for real-time, high-volume analytic platforms.
  • Deep proficiency with ClickHouse, Postgres, Athena, and distributed data systems (Kafka, cloud-native stores); proven experience with both batch and streaming pipeline design.
  • Advanced programming in Python and SQL, with bonus points for Java; expertise in workflow orchestration (Airflow, Step Functions), CI/CD, and automated testing for data (a minimal orchestration sketch follows this list).
  • Experience in high-scale, low-latency environments; understanding of security, privacy, and compliance requirements for financial-grade platforms.
  • Strong communication, business alignment, and documentation abilities—capable of translating complex tech into actionable value for customers and stakeholders.
  • Alignment with Oscilar’s values: customer obsession, radical ownership, bold vision, efficient growth, and unified teamwork with a culture of trust and excellence.
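
As an illustration of the orchestration work referenced above, here is a minimal sketch of an hourly extract-validate-load DAG, assuming a recent Apache Airflow 2.x with the TaskFlow API. The DAG name, schedule, and task bodies are placeholders rather than a real Oscilar workflow.

```python
# A minimal hourly ETL DAG sketch, assuming Apache Airflow 2.x (TaskFlow API).
# Task names, schedule, and data are illustrative placeholders only.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@hourly", start_date=datetime(2024, 1, 1), catchup=False,
     tags=["risk", "etl"])
def hourly_risk_rollup():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull the last hour of events from an upstream source
        # (e.g. Postgres or an Athena/S3 export in a real pipeline).
        return [{"user_id": "u1", "score": 0.42}]

    @task
    def validate(rows: list[dict]) -> list[dict]:
        # Basic data-quality gate: drop rows with out-of-range scores.
        return [r for r in rows if 0.0 <= r["score"] <= 1.0]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: write validated rows to the analytical store
        # (ClickHouse in the stack described above).
        print(f"loaded {len(rows)} rows")

    load(validate(extract()))


hourly_risk_rollup()
```

In practice each task would reach the warehouse and object stores through Airflow connections rather than passing data between tasks via XCom.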

Nice-to-have

  • Experience integrating Kafka with analytics solutions like ClickHouse.
  • Knowledge of event-driven architecture and streaming patterns like CQRS and event sourcing.
  • Hands-on experience with monitoring tools (e.g., Prometheus, Grafana, Kafka Manager); a small instrumentation sketch follows this list.
  • Experience automating infrastructure with tools like Terraform or CloudFormation.
  • Proficiency with Postgres, Redis, ClickHouse, and DynamoDB. Experience with data modeling, query optimization, and high-transaction databases.
  • Familiarity with encryption, role-based access control, and secure API development.
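
For the monitoring point, here is a minimal sketch of pipeline instrumentation using the prometheus_client library; the metric names, values, and scrape port are illustrative assumptions.

```python
# A minimal sketch of pipeline metrics exposed for Prometheus scraping.
# Metric names, values, and the port are illustrative assumptions.
import random
import time

from prometheus_client import Counter, Gauge, start_http_server

ROWS_LOADED = Counter("pipeline_rows_loaded_total",
                      "Rows successfully written to the analytical store")
LAG_SECONDS = Gauge("pipeline_consumer_lag_seconds",
                    "Estimated end-to-end lag of the ingestion pipeline")

if __name__ == "__main__":
    start_http_server(8000)   # expose /metrics for a Prometheus scrape
    while True:
        # Placeholder work loop: in a real loader these values would come
        # from the Kafka consumer and insert path shown earlier.
        ROWS_LOADED.inc(random.randint(1, 500))
        LAG_SECONDS.set(random.uniform(0.1, 2.0))
        time.sleep(5)
```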

About Oscilar

Oscilar builds an AI-driven decisioning and risk management platform that helps businesses make faster, more accurate risk decisions in real time.