
MetLife

Associate Big Data Engineer

Posted Yesterday
Remote
Junior
As an Associate Big Data Engineer at MetLife, you will support the implementation of data ingestion processes using various Big Data tools, maintain ETL code, monitor performance, and ensure data security while collaborating with cross-functional teams on data enablement solutions.

Description and Requirements
Continuing its tradition of innovation, MetLife, as part of its Data and Analytics function, established a dedicated center for advanced analytics and research in India. DnA Hyderabad, also known as the Global Advanced Analytics and Research Center (GARC), is part of MetLife's larger Data and Analytics organization (DnA) and focuses on scaling data governance, data management, data engineering, data science/machine learning/artificial intelligence, visualization, and techno-project management capabilities, enabling a more cost-effective analytics operating model and increasing data and analytics maturity across the MetLife global community.
Driven by passion and purpose, we are looking for you, a high-performing data and analytics professional, to drive and support the development and deployment of actionable, high-impact data and analytics solutions for MetLife's enterprise functions and lines of business across markets. The portfolio of work delivers data-driven solutions across key business functions such as customer acquisition and targeting, engagement, retention and distribution, underwriting, claims service and operations, risk management, investments, and audit, while tackling hard, open-ended problems.
The portfolio of work will support deployment of models to various clusters and environments with the support and guidance of Big Data Engineers, following a set of standards, and will ensure operational readiness by incorporating configuration management, exception handling, and logging for end-to-end batch and real-time model operationalization. The position requires an understanding of data engineering, Azure DevOps, the Atlassian stack, and Container as a Service.
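As a rough illustration of that operational-readiness pattern (not MetLife's actual tooling), the sketch below wraps a hypothetical batch scoring step with configuration loading, logging, and exception handling so a scheduler such as Oozie can detect failures; the config file name, config keys, and run_batch_step function are assumptions.

import json
import logging
import sys

# Illustrative only: the config file name, config keys, and run_batch_step
# are hypothetical placeholders, not part of MetLife's actual stack.
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s - %(message)s",
)
log = logging.getLogger("model_batch_job")

def load_config(path: str) -> dict:
    # Configuration management: read job settings (source table, model version).
    with open(path) as fh:
        return json.load(fh)

def run_batch_step(config: dict) -> None:
    # Placeholder for the actual batch scoring or ingestion step.
    log.info("Scoring model %s against %s",
             config["model_version"], config["source_table"])

def main() -> int:
    try:
        config = load_config("job_config.json")
        run_batch_step(config)
        log.info("Batch run completed successfully")
        return 0
    except Exception:
        # Exception handling: log the failure and signal the scheduler
        # (e.g. Oozie) via a non-zero exit code.
        log.exception("Batch run failed")
        return 1

if __name__ == "__main__":
    sys.exit(main())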
You will work and collaborate with a nimble, autonomous, cross-functional team of makers, breakers, doers, and disruptors who love to solve real problems and meet real customer needs. You will use cutting-edge technologies and frameworks to process data, create data pipelines, and collaborate with the data science team to operationalize end-to-end machine learning and AI solutions.
Responsibilities

  • Contribute towards supporting the build and implementation of data ingestion and curation processes developed using Big Data tools such as Spark (Scala/Python), Hive, HDFS, Kafka, Pig, Oozie, Sqoop, Flume, ZooKeeper, Kerberos, Sentry, Impala, CDP 7.x, etc., under the guidance of Big Data Engineers (an illustrative sketch follows this list).
  • Support the ingestion of huge data volumes from various platforms for analytics needs and prepare high-performance, reliable, and maintainable ETL code with support and review guidance from senior team members.
  • Provide relevant support in monitoring performance and advise senior team members on any necessary infrastructure changes for their review.
  • Understand the defined data security principles and policies developed using Ranger and Kerberos.
  • Gain a broader understanding of how to support application developers and progressively work on efficient Big Data application development using cutting-edge technologies.
  • Collaborate with business systems analysts, technical leads, project managers, and business/operations teams in building data enablement solutions across different LOBs and use cases.
  • Understand and support the creation of reusable frameworks that optimize the development effort involved.


About MetLife
Recognized on Fortune magazine's list of the 2024 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East.
Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services.
At MetLife, it's #AllTogetherPossible. Join us!
#BI-Hybrid

Top Skills

Python
Scala


