
Koantek

Data Engineer

Posted Yesterday
Remote
Hiring Remotely in Maharashtra
Mid level
Why Koantek?

Koantek stands at the forefront of Data and GenAI solutions, specializing in healthcare, life sciences, manufacturing, and financial services. As a global provider of technology services and solutions with a focus on Artificial Intelligence and Machine Learning, we deliver tailored solutions that enable businesses to leverage data for growth and innovation. Our team of experts utilizes deep industry knowledge combined with cutting-edge technologies, tools, and methodologies to drive impactful results. By partnering with clients across a diverse range of industries—from emerging startups to established enterprises—we help them uncover new opportunities and achieve a competitive advantage in the digital age.



Data Engineer

Description: 

As a Data Engineer at Koantek, you will leverage advanced data engineering techniques and analytics to support business decisions for our clients. Your role will involve designing and building robust data pipelines, integrating structured and unstructured data from various sources, and developing tools for data processing and analysis. You will play a pivotal role in managing data infrastructure, optimizing data workflows, and guiding data-driven strategies while working closely with data scientists and other stakeholders.

The impact you will have:

  • Guide Big Data Transformations: Lead the implementation of comprehensive big data projects, including the development and deployment of innovative big data and AI applications.

  • Ensure Best Practices: Apply Databricks best practices across all projects to maintain high-quality service and successful implementation.

  • Support Project Management: Assist the Professional Services leader and project managers with estimating efforts and managing risks within customer proposals and statements of work.

  • Architect Complex Solutions: Design, develop, deploy, and document complex customer engagements, either independently or as part of a technical team, serving as the technical lead and authority.

  • Enable Knowledge Transfer: Facilitate the transfer of knowledge and provide training to team members, customers, and partners, including the creation of reusable project documentation.

  • Contribute to Consulting Excellence: Share expertise with the consulting team and offer best practices for client engagement, enhancing the effectiveness and efficiency of other teams.




Requirements

Minimum qualifications:

  • Educational Background: Bachelor’s degree in Computer Science, Information Technology, or a related field (or equivalent experience).

  • Experience:

    • 3+ years of experience as a Data Engineer, with proficiency in at least two major cloud platforms (AWS, Azure, GCP).

    • Proven experience in designing, developing, and implementing comprehensive data engineering solutions using Databricks, specifically for large-scale data processing and integration projects.

    • Experience developing scalable streaming and batch solutions using cloud-native components.

    • Experience performing data transformation tasks, including cleansing, aggregation, enrichment, and normalization, using Databricks and related technologies.

    • Experience in applying DataOps principles and implementing CI/CD and DevOps practices within data environments to optimize development and deployment workflows.

  • Technical Skills:

    • Expert-level proficiency in Spark Scala, Python, and PySpark.

    • In-depth knowledge of data architecture, including Spark Streaming, Spark Core, Spark SQL, and data modeling.

    • Hands-on experience with various data management technologies and tools, such as Kafka, StreamSets, and MapReduce.

    • Proficient in using advanced analytics and machine learning frameworks, including Apache Spark MLlib, TensorFlow, and PyTorch, to drive data insights and solutions.

  • Databricks Specific Skills:

    • Extensive experience in data migration from on-premises to cloud environments and in implementing data solutions on Databricks across cloud platforms (AWS, Azure, GCP).

    • Skilled in designing and executing end-to-end data engineering solutions using Databricks, focusing on large-scale data processing and integration.

    • Proven hands-on experience with Databricks administration and operations, including notebooks, clusters, jobs, and data pipelines.

    • Experience integrating Databricks with other data tools and platforms to enhance overall data management and analytics capabilities.

  • Good-to-have certifications:

    • Databricks Certified Data Engineer Professional

    • Microsoft Certified: Azure Data Engineer Associate

    • Google Cloud Certified - Professional

    • AWS Certified Solutions Architect - Professional
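For candidates wondering what the transformation work above looks like in practice, here is an illustrative sketch of a cleansing-plus-aggregation step. It uses plain Python with hypothetical sample data so it runs anywhere; on Databricks the equivalent would typically be expressed with PySpark DataFrame operations such as `dropna`/`filter` for cleansing and `groupBy().agg()` for aggregation.

```python
from collections import defaultdict

# Hypothetical raw input: records with missing and malformed values,
# as commonly ingested from heterogeneous sources.
raw_records = [
    {"region": "west", "amount": "120.50"},
    {"region": "west", "amount": None},           # missing value -> dropped
    {"region": "east", "amount": "80.00"},
    {"region": "east", "amount": "not-a-number"}, # malformed value -> dropped
]

def cleanse(records):
    """Cleansing: drop records whose amount is missing or non-numeric."""
    clean = []
    for r in records:
        try:
            clean.append({"region": r["region"], "amount": float(r["amount"])})
        except (TypeError, ValueError):
            continue
    return clean

def aggregate(records):
    """Aggregation: total amount per region."""
    totals = defaultdict(float)
    for r in records:
        totals[r["region"]] += r["amount"]
    return dict(totals)

totals = aggregate(cleanse(raw_records))
print(totals)  # {'west': 120.5, 'east': 80.0}
```

The same two-stage shape (validate and coerce first, then aggregate) carries over directly to Spark, where each stage becomes a lazy DataFrame transformation executed across the cluster.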


