
Easygo

Data Engineer

Posted 2 Days Ago
Melbourne, Victoria
Mid level
The Data Engineer will design and maintain scalable ETL pipelines using AWS Glue, implement secure data systems, collaborate with cross-functional teams to enable analytics, and optimize data pipelines for performance. Responsibilities include documenting processes, ensuring compliance with data governance, and participating in code reviews.
The summary above was generated by AI

Are you a passionate and ambitious data engineer ready to dive into an environment that fosters innovation, continuous learning, and professional growth? We're seeking talented individuals who are eager to tackle complex big data problems, build scalable solutions, and collaborate with some of the finest engineers in the entertainment industry.

  • Complex Projects, Creative Solutions: Dive into intricate projects that challenge and push boundaries. Solve complex technical puzzles and craft scalable solutions.
  • Accelerate Your Growth: Access mentorship, training, and hands-on experiences to level up your skills. Learn from industry experts and gain expertise in scaling software.
  • Collaborate with Industry Leaders: Work alongside exceptional engineers, exchanging ideas and driving innovation forward through collaboration.
  • Caring Culture, Career Development: We deeply care about your career. Our culture prioritizes your growth with tailored learning programs and mentorship.
  • Embrace Challenges, Celebrate Success: Take on challenges, learn from failures, and celebrate achievements together.
  • Shape the Future: Your contributions will shape the future of entertainment.

About the team

You’ll be joining the Data Engineering team on an exciting mission to build a top-tier data platform that powers everything we do. We manage data from our in-house products like Stake and Kick, as well as third-party tools and services, centralising it and ensuring it’s reliable, scalable, and ready to drive smarter decisions across the business.

We’re focused on advanced data tools and services that truly make a difference. Our goal is to help teams unlock the full potential of data, empowering them to create impactful outcomes for our customers and the business. If you’re someone who loves tackling big challenges and shaping the future of data, you’ll fit right in!

Key Responsibilities:

  • Design, develop, and maintain scalable ETL pipelines using AWS Glue and orchestrate workflows with Airflow to extract, transform, and load data from various sources (e.g., databases, APIs, flat files, streaming services) into the data lake, following medallion architecture principles.
  • Build and implement secure and efficient data systems using AWS services and Terraform, ensuring performance and compliance.
  • Collaborate with cross-functional teams to transform data from the gold layer in the data lake to Redshift using dbt, enabling high-quality analytics and machine learning insights.
  • Monitor and optimise data pipelines for performance, scalability, and cost-efficiency, ensuring observability through monitoring and alerting systems.
  • Document end-to-end processes, including ingestion, transformation, storage and governance, to support knowledge sharing and scalability.
  • Implement data governance practices such as data lineage, classification, access control, and compliance with GDPR and other regulatory requirements.
  • Build and optimise real-time data pipelines using PySpark, Glue Spark, and Kinesis, focusing on Change Data Capture (CDC) for seamless operations and reliability.
  • Ensure pipelines are thoroughly tested and optimised, with comprehensive monitoring and alerting systems for reliability and performance.
  • Participate in peer code reviews to ensure adherence to best practices, coding standards, and high-quality development. 
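As a flavour of the Change Data Capture (CDC) work described above: the actual pipelines use PySpark, Glue Spark, and Kinesis, but the core merge semantics — replaying an ordered stream of insert/update/delete events onto a keyed target table — can be sketched in plain Python. The event shape (`op`, `key`, `row`) is a hypothetical simplification, not the posting's real schema.

```python
# Illustrative sketch of CDC merge semantics only. Real pipelines here
# would run on PySpark/Glue/Kinesis; the event format is hypothetical.

def apply_cdc_events(table, events):
    """Apply an ordered CDC event stream to an in-memory table keyed
    by primary key. Supported ops: insert, update, delete."""
    for event in events:
        op, key = event["op"], event["key"]
        if op in ("insert", "update"):
            # Upsert: merge new columns over any existing row,
            # so later events win (last-write semantics).
            table[key] = {**table.get(key, {}), **event["row"]}
        elif op == "delete":
            # Deletes are idempotent: removing a missing key is a no-op.
            table.pop(key, None)
        else:
            raise ValueError(f"unknown CDC op: {op!r}")
    return table

# Example: replay three events against an empty target table.
target = apply_cdc_events({}, [
    {"op": "insert", "key": 1, "row": {"name": "alice", "tier": "gold"}},
    {"op": "update", "key": 1, "row": {"tier": "silver"}},
    {"op": "insert", "key": 2, "row": {"name": "bob", "tier": "bronze"}},
])
print(target)  # {1: {'name': 'alice', 'tier': 'silver'}, 2: {'name': 'bob', 'tier': 'bronze'}}
```

In a real Glue Spark job the same last-write-wins logic is typically expressed as a windowed dedupe over the CDC stream followed by a merge into the target table.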

Minimum Qualifications:

  • A Bachelor’s degree in Computer Science, Software Engineering, Information Systems, or a related field, or equivalent practical experience.
  • 3-6 years of experience in data engineering, with a focus on ETL development, data modelling, database management, and real-time data pipelines.
  • Proficiency in SQL, Python, or PySpark, with hands-on experience using cloud services such as Glue, Redshift, Kinesis, Lambda, S3 and DMS.
  • Experience with orchestration tools (e.g., Apache Airflow), version control systems (e.g., GitHub), and big data technologies such as Spark or Hadoop.
  • Experience designing and implementing modern cloud-based data platforms, preferably on AWS, using Infrastructure as Code (IaC) tools like Terraform.
  • Knowledge of data governance and compliance standards.
  • Strong problem-solving, analytical, and communication skills for engaging with cross-functional teams.

Preferred Qualifications:

  • Experience with DataOps principles, CI/CD pipelines, and agile development methodologies.
  • Knowledge of machine learning concepts and their application in data engineering.
  • AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified Data Analytics) or similar cloud certifications.

Some of the perks of joining us:

  • Champion engineering excellence and drive data-driven impact across global-scale software products.
  • Work alongside the top 5% of engineering talent in Australia, using a vast AWS cloud-native and big data technology stack.
  • Exposure to building global, large-scale data pipelines, data warehouses, and data lakes that serve thousands of requests per second.
  • EAP access for you and your family
  • Access to over 9,000 courses across our Learning and Development Platform 
  • Lucrative Annual Bonuses
  • Paid volunteer day
  • Two full-time baristas who will make your daily coffee, tea or fresh juice!
  • Daily catered breakfast
  • On-site masseuse on Wednesdays
  • Team lunches and happy hour in the office from 4pm on Fridays
  • Fun office environment with pool tables, table tennis and all your favourite gaming consoles
  • Help-yourself drinks fridges and snack shelves

We believe that the unique contributions of everyone at Easygo are the driver of our success. To make sure that our products and culture continue to incorporate everyone's perspectives and experience we never discriminate on the basis of race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. We are passionate about providing a workplace that encourages great participation and an equal playing field, where merit and accomplishment are the only criteria for success.

Top Skills

PySpark
Python
SQL
HQ

Easygo Melbourne, Victoria, AUS Office

Melbourne, Victoria, Australia

