
Data Engineer-ETL

Capgemini
Full-time
Remote
Mexico
Programming
Description

Capgemini is seeking a highly motivated and detail-oriented Associate Data Engineer for a top-10 US insurance carrier.

Our client is one of the United States’ largest insurers, providing a wide range of insurance and financial services products with gross written premiums well over US$25 billion (P&C). They proudly serve more than 10 million U.S. households, holding more than 19 million individual policies across all 50 states, through the efforts of over 48,000 exclusive and independent agents and nearly 18,500 employees. Our client is also part of one of the largest insurance groups in the world.

Job Summary:

We are looking for a skilled Data Engineer to design, build, and maintain data pipelines that support analytics and business intelligence initiatives. This role involves both enhancing existing pipelines and developing new ones to integrate data from diverse internal and external sources. The ideal candidate will have advanced SQL and Informatica skills, experience in ETL development, and a foundational understanding of dimensional data modeling. Experience with DBT is a plus.



Requirements

Key Responsibilities:

  • Design, develop, and maintain data pipelines and ETL workflows to ensure reliable data integration across platforms.
  • Enhance and optimize existing data pipelines by adding new attributes, improving performance, or increasing maintainability.
  • Build new data ingestion pipelines from a variety of structured and semi-structured sources.
  • Use Informatica to develop and manage ETL processes in alignment with business requirements.
  • Write and optimize complex SQL queries for data transformation, validation, and extraction.
  • Apply basic knowledge of dimensional data modeling to support reporting and data warehousing needs.
  • Collaborate with data analysts, data scientists, and business teams to understand data needs and deliver clean, structured datasets.
  • Participate in code reviews, documentation, and testing to ensure quality and accuracy in data delivery.
  • Work in agile or project-based environments to deliver on sprint goals and project timelines.

Required Skills & Qualifications:

  • Bachelor's degree in Computer Science, Information Systems, or related field (or equivalent experience).
  • 2–4 years of hands-on experience in data engineering or ETL development.
  • SQL: Advanced-level proficiency in writing, optimizing, and troubleshooting queries.
  • ETL Tools: Intermediate experience building and managing pipelines using ETL platforms.
  • Informatica: Advanced experience with PowerCenter or Informatica Cloud for data integration tasks.
  • Dimensional Data Modeling: Basic understanding of star and snowflake schema designs.
  • Excellent problem-solving and communication skills with an ability to collaborate across teams.

“Nice to Have” Skills:

  • Experience with DBT (Data Build Tool) for modular and scalable transformation logic.
  • Exposure to cloud data platforms (AWS, GCP, Azure).



Benefits

This position comes with a competitive compensation and benefits package:

  1. Competitive salary and performance-based bonuses
  2. Comprehensive benefits package
  3. Career development and training opportunities
  4. Flexible work arrangements (remote and/or office-based)
  5. Dynamic and inclusive work culture within a globally renowned group
  6. Private Health Insurance
  7. Pension Plan
  8. Paid Time Off
  9. Training & Development