Hi! We’re Mercuryo, and we’re on a mission to redefine finance by blending the best of traditional banking with the innovations of decentralized finance (DeFi). We believe that everyone should have easy access to Web3 and traditional financial services — and we’re making that happen by building a robust platform that simplifies dealing with crypto and seamlessly integrates it into the broader financial ecosystem.
Since we launched in 2018, we’ve teamed up with top Web3 projects such as MetaMask, Trust Wallet, Ledger, Jupiter, 1inch, and PancakeSwap, plus 200+ others, to power over 200 dynamic products. Our work also brings us into direct collaboration with major ecosystems such as Solana Labs, Consensys, and BNB Chain. We’re just getting started, and we want you to help us shape the future of money!
Why Mercuryo?
Industry Impact
Join us in helping world-class Web3 projects onboard millions of new users into the next generation of finance.
Innovative Environment
Collaborate with more than 300 talented professionals from diverse backgrounds — including banking, SaaS, and Web3 — all united in delivering outstanding user experiences.
Growth and Learning
Our expanding network of 200+ B2B partnerships and a user base of over 7 million means there’s always room to grow your skills, tackle new challenges, and push boundaries.
Flexible Culture
We’re remote-first, celebrating diversity across 30 countries. At Mercuryo, you’ll be empowered to take ownership of your work, spark creativity, and shape how we move forward together.
About the Role:
We are looking for an experienced Data Engineer to join our Analytics Infrastructure team. Our team is responsible for collecting, processing, and storing all company data that drives business decisions and product analytics. You will participate in developing a high-load analytical platform, work with Big Data, and build scalable solutions for data storage and processing. Your work will directly impact key business decisions and product development.
Your Role:
- Data collection and integration from third-party services (APIs, SFTP, databases, etc.).
- Design and development of ETL/ELT processes for the analytical data warehouse (see the sketch after this list).
- Assisting the analytics team with query refactoring and optimization.
- Code refactoring and performance optimization of existing data pipelines.
- Ensuring data quality and consistency across the warehouse.
- Monitoring and maintaining data infrastructure.
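To give a flavour of the day-to-day work, here is a minimal sketch of an ELT pipeline using Airflow's TaskFlow API (2.x style). The endpoint, DAG name, and target table are hypothetical placeholders for illustration, not a description of Mercuryo's actual stack.

```python
# Minimal sketch of an Airflow ELT DAG: pull data from a hypothetical
# partner API and hand it to a load step. Endpoint and table are placeholders.
from datetime import datetime

import requests
from airflow.decorators import dag, task


@dag(
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["elt", "example"],
)
def partner_api_elt():
    @task
    def extract() -> list[dict]:
        # Hypothetical third-party endpoint; real sources could also be SFTP or databases.
        response = requests.get("https://api.example.com/v1/transactions", timeout=30)
        response.raise_for_status()
        return response.json()

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder for a bulk insert into a columnar warehouse (e.g. StarRocks or ClickHouse).
        print(f"Would load {len(rows)} rows into analytics.transactions")

    load(extract())


partner_api_elt()
```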
Requirements:
- Middle+ or senior-level experience in Python.
- Experience with Apache Airflow (2.8+) for data orchestration.
- Deep knowledge of SQL and experience with database and query optimization.
- Experience with columnar databases (StarRocks, ClickHouse, Vertica, etc.).
- Experience designing and implementing ELT processes.
- Experience working with large datasets and performance optimization.
Nice to Have:
- Experience with StarRocks and BigQuery.
- Infrastructure-as-Code experience with Terraform.
- Experience with DBT.
- Experience with Kubernetes (as a user).
What We Offer:
- Competitive market rate salary and performance-based incentives.
- 22 days annual leave with an additional 6 company days, plus bank holidays.
- Comprehensive health insurance plans.
- Extensive benefits program.
- Flexible work schedule and remote work options.
- Modern offices and co-working spaces across 6 countries.
- Work equipment provided.
- Professional development and training opportunities.
- Opportunity to shape the initiatives you’re working on.
- A diverse, friendly team that's open to new ideas.
Join Us!
If you're driven to be at the forefront of Web3 and keen to leave your mark on this rapidly evolving field, Mercuryo is an excellent choice. Discover our open positions and see how you can contribute to shaping the future!