
Synechron

Enterprise Data Engineer | Cloud (AWS, Azure) | Big Data (Spark, Hadoop) | ETL & Data Pipelines | SQL & NoSQL

Posted 9 Days Ago
Remote
Hiring Remotely in Hinjawadi, Pune, Maharashtra
Senior level

Job Summary

Synechron is seeking a proficient Data Engineer to support the design, development, and maintenance of scalable, efficient data pipelines and enterprise data solutions. The role involves collaborating with cross-functional teams to gather requirements, implement data management strategies, and ensure data quality, security, and availability. The Data Engineer will leverage experience in cloud platforms, big data tools, and modern development practices to enable data-driven decision-making and operational excellence across the organization.

Software Requirements

Required:

  • Strong understanding of data management concepts, cloud platforms (preferably AWS or Azure), and scalable architectures.

  • Hands-on experience with programming languages such as Python, Java, or JavaScript (Node.js).

  • Practical experience with big data frameworks such as Apache Spark, Hadoop, or Flink.

  • Working knowledge of relational databases (MySQL, SQL Server, PostgreSQL) and NoSQL databases (e.g., MongoDB, DynamoDB).

  • Experience with data orchestration and pipeline tools such as Apache Airflow, Luigi, or comparable frameworks.

  • Familiarity with version control systems such as Git and collaboration tools like JIRA and Confluence.

Preferred:

  • Knowledge of containerization (Docker, Kubernetes) and infrastructure as code (Terraform, CloudFormation).

  • Experience in deploying and managing data pipelines on cloud platforms like AWS Glue, Azure Data Factory, or GCP Dataflow.

  • Familiarity with stream processing tools like Kafka or Kinesis.

  • Exposure to data security protocols and compliance standards (GDPR, HIPAA, etc.).

Overall Responsibilities

  • Design, develop, and maintain large-scale data pipelines, ETL workflows, and data integrations to support analytics, reporting, and operational needs.

  • Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver reliable solutions.

  • Optimize and monitor data pipelines for performance, scalability, and data quality.

  • Implement data governance, validation, and cataloging processes to ensure data integrity and security.

  • Automate deployment, testing, and data infrastructure changes using CI/CD practices.

  • Participate in architecture discussions, technical reviews, and documentation to support data ecosystem growth.

  • Stay informed of emerging data technologies, industry standards, and best practices, and incorporate relevant innovations.

Expected outcomes:
Reliable, scalable, secure, and high-performing data pipelines that support organizational analytics and business intelligence initiatives.

Technical Skills (By Category)

Programming Languages:

  • Essential: Python, Java, or JavaScript (Node.js)

  • Preferred: Spark (PySpark or Spark with Scala); SQL for data manipulation

Databases/Data Management:

  • Essential: SQL database management (MySQL, PostgreSQL, SQL Server)

  • Preferred: NoSQL databases (MongoDB, DynamoDB)

Cloud Technologies:

  • Preferred: AWS (Glue, S3, EMR), Azure Data Factory, GCP Dataflow

Frameworks & Libraries:

  • Essential: Apache Spark, Kafka, Hadoop ecosystem components

  • Preferred: Dask, Flink

Development Tools & Methodologies:

  • Essential: Git, Jenkins, CI/CD pipelines, Agile/Scrum practices

  • Preferred: Terraform, Docker, Kubernetes, DataOps tools

Security & Compliance:

  • Awareness of data encryption, access controls, and compliance frameworks such as GDPR, HIPAA, and data masking best practices.

Experience Requirements

  • 5+ years developing and maintaining enterprise data pipelines and big data solutions.

  • Proven experience in designing scalable ETL workflows, integrating cloud data services, and optimizing data processes.

  • Demonstrable success in deploying data solutions that support reporting, analytics, and machine learning initiatives.

  • Industry experience in finance, healthcare, retail, or enterprise sectors highly desirable; relevant open-source or academic projects also acceptable.

Day-to-Day Activities

  • Develop, test, and deploy scalable data pipelines and ETL workflows.

  • Collaborate with business and data science teams to gather requirements and deliver data solutions.

  • Monitor data pipelines and optimize for performance, reliability, and security.

  • Troubleshoot technical issues, perform root cause analysis, and apply fixes.

  • Automate deployment and infrastructure provisioning procedures.

  • Maintain detailed documentation of data architecture, workflows, and operational guidelines.

  • Proactively research emerging data tools and platforms to recommend innovation.

Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related disciplines.

  • 5+ years of experience supporting enterprise data ecosystems, especially on cloud platforms.

  • Experience with big data frameworks, cloud data services, and automation tools.

  • Certifications in cloud platforms (AWS Data Analytics, Azure Data Engineer, GCP Data Engineer) are advantageous.

  • Strong problem-solving, analytical thinking, and communication skills.

Professional Competencies

  • Critical thinking to design innovative and scalable data architectures.

  • Leadership skills to mentor junior staff and guide data projects.

  • Effective stakeholder management for cross-team collaboration.

  • Adaptability to rapidly evolving data technologies and organizational needs.

  • Ownership of data quality, security, and compliance standards.

  • Time management skills to effectively prioritize tasks and meet project deadlines.

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT

Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative 'Same Difference' is committed to fostering an inclusive culture – promoting equality, diversity, and an environment that is respectful to all. As a global company, we strongly believe that a diverse workforce helps build stronger, more successful businesses. We encourage applicants of all backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and abilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.

All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.

