Pagos

Data Engineer

Posted 5 Days Ago
Remote
28 Locations
Senior level

About Us

At Pagos, we’re passionate about empowering businesses to take control of their payments stack and solve the puzzles standing between them and optimized growth. Our global platform provides developers, product teams, and payments leaders with both a deeper understanding of their payments data and access to new payments technology through user-friendly tools that are easy to implement. To succeed in this, we need creative thinkers who are willing to roll up their sleeves and start building alongside us.

About the Role

As a Data Engineer, you’ll play a key part in building and maintaining the platform that powers our products. By collaborating with backend engineers, data analysts, and other engineers, you’ll build and own new features, modules, and extensions of our systems. We’re seeking an action-oriented and collaborative problem solver who thrives in ambiguity and can take on new challenges with optimism in a fast-paced environment. We value team members who are not only skilled in their area of expertise but are also perpetual learners who are committed to growth and contributing to our collective success.


In this role, you will:

  • Craft high-quality code for scale, availability, and performance

  • Design, develop, and maintain scalable data pipelines and processes to extract, process, and transform large volumes of data, both real-time and batched (ELT/ETL)

  • Build and maintain integrations with data providers using various data transfer protocols

  • Drive engineering projects from start to finish with a high level of ownership and autonomy

  • Ensure the quality of our products and data through both manual and automated testing, as well as code reviews

What We’re Looking For

We’re looking for someone with:

  • 8+ years of software engineering experience with an emphasis on Data Engineering

  • Bachelor’s degree or higher in Computer Science or related technical discipline (or equivalent experience)

  • Advanced experience writing complex SQL queries and working with database/lakehouse technologies such as Redshift, Apache Iceberg, and Postgres

  • Deep experience with big data technologies and frameworks such as Apache Spark and dbt, as well as data quality tooling such as dbt tests

  • Familiarity with cloud platforms like AWS, GCP, or Azure, and common data-related services (e.g. S3, Redshift, EMR, Glue, Kinesis, Athena)

  • A bias for action, where no task is too small, and an eagerness to learn and grow with our industry

Nice to have: 

  • Experience with real-time streaming frameworks like Apache Kafka

  • Experience with Great Expectations and/or Soda

  • Comfort and/or past experience working with and managing big data and ELT pipelines

  • Comfort and/or past experience working with Temporal, Apache Airflow, or similar orchestration tools

  • Experience working in high-growth, venture-backed startup(s)

Pagos does not accept unsolicited resumes from third-party recruiting agencies. All interested candidates are encouraged to apply directly. 

Top Skills

Apache Airflow
Apache Iceberg
Apache Kafka
Amazon Redshift
Spark
Athena
AWS
Azure
dbt
EMR
GCP
Glue
Kinesis
Postgres
S3
SQL
Temporal


