
Zensar Technologies

Data Engineer

Posted 2 Days Ago
Remote
Hiring Remotely in India
Senior level
Design and build ETL pipelines using Apache Spark/PySpark, implement Iceberg table operations, and validate analytical schema. Requires strong data engineering experience and collaboration with stakeholders.

**Key responsibilities:**


- Design and build the Apache Spark/PySpark ETL pipeline (Bronze → Silver → Gold medallion architecture)


- Implement Apache Iceberg table operations (MERGE, UPSERT, SCD Type 2 logic, incremental loads)


- Design and validate the analytical star schema (fact/dimension tables, conformed dimensions)


- Define and execute three-tier data quality rules, dead-letter handling, and validation logic


- Build business logic connectors, transformation helpers, and custom derivations


- Collaborate with stakeholders to clarify KPIs, query patterns, and analytical use cases


- Write comprehensive unit, integration, and end-to-end tests
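
The SCD Type 2 logic named above can be sketched in plain Python (a minimal illustration only — table shape, column names like `customer_id`/`city`, and the `scd2_merge` helper are all hypothetical; a production version would run as an Apache Iceberg `MERGE INTO` in PySpark):

```python
from datetime import date

def scd2_merge(dim_rows, incoming, key="customer_id", tracked=("city",)):
    """Slowly Changing Dimension Type 2 merge: expire changed rows and
    append new versions rather than updating attributes in place."""
    today = date.today().isoformat()
    # Index the current version of each dimension key
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)
    for row in incoming:
        existing = current.get(row[key])
        new_version = {**row, "valid_from": today, "valid_to": None, "is_current": True}
        if existing is None:
            # New key: insert as the first (current) version
            out.append(new_version)
        elif any(existing[c] != row[c] for c in tracked):
            # A tracked attribute changed: close out the old row, append the new one
            existing["valid_to"] = today
            existing["is_current"] = False
            out.append(new_version)
        # Rows with no tracked change are left untouched
    return out
```

The same expire-and-append pattern is what an Iceberg `MERGE` with a `WHEN MATCHED ... UPDATE` / `WHEN NOT MATCHED ... INSERT` clause expresses declaratively, which is why history-preserving dimensions pair naturally with Iceberg's row-level operations.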



**Required skills:**


- 5+ years data engineering, backend development, or full-stack analytics platform work


- Expert-level PySpark, SQL, and cloud data warehousing (Snowflake/Databricks/similar)


- Apache Iceberg or strong willingness to learn advanced table formats


- Dimensional modelling fundamentals (Kimball star schema, slowly changing dimensions)


- Python for data transformation and microservices


- API design, error handling, and testing discipline


**About Us**

At Zensar, we’re “experience-led everything”. We are committed to conceptualizing, designing, engineering, marketing, and managing digital solutions and experiences for over 130 leading enterprises. We are a company driven by a bold purpose: Together, we shape experiences for better futures. Whether for our clients, our people, or the world around us, this belief powers everything we do. At the heart of our culture is ONE with Client - a set of four core values that reflect who we are and how we work: One Zensar, Nurturing, Empowering, and Client Focus.
Part of the $4.8 billion RPG Group, we’re a community of 10,000+ innovators across 30+ global locations, including Milpitas, Seattle, Princeton, Cape Town, London, Zurich, Singapore, and Mexico City. Explore Life at Zensar and join us to Grow. Own. Achieve. Learn. to be the best version of yourself.
We believe the best work happens when individuality is celebrated, growth is encouraged, and well-being is prioritized. We are an equal employment opportunity (EEO) and affirmative action employer, committed to creating an inclusive workplace. All qualified applicants will be considered without regard to race, creed, color, ancestry, religion, sex, national origin, citizenship, age, sexual orientation, gender identity, disability, marital status, family medical leave status, or protected veteran status.
