ETL Developer


  • Job ID:

    4032
  • Pay rate range:

    $65 - $75
  • City:

    Chicago
  • State:

    Illinois
  • Duration:

    10/09/2022 - 04/09/2023
  • Job Type:

    Contract
  • Job Description

    ETL Developer (Informatica)

    $65/hr to $75/hr (1.5x overtime)

    Remote

    Must have:

      • Python / Lambda
      • AWS
      • Kafka and Kafka streaming
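To give the must-have stack above some shape, here is a minimal, illustrative Python sketch of an AWS Lambda handler for a Kafka (MSK) trigger, which receives records grouped by topic-partition with base64-encoded values. The topic name and payload fields are made up for illustration; this is not code from the posting.

```python
import base64
import json

def handler(event, context):
    """Decode records from an MSK/Kafka-trigger event.

    The event groups records under "records" by "topic-partition";
    each record's "value" is base64-encoded. Illustrative sketch only.
    """
    decoded = []
    for _tp, records in event.get("records", {}).items():
        for rec in records:
            decoded.append(json.loads(base64.b64decode(rec["value"])))
    return {"processed": len(decoded), "items": decoded}

# Local smoke test with a hand-built event (no AWS account needed):
sample_event = {
    "records": {
        "orders-0": [  # hypothetical topic "orders", partition 0
            {"topic": "orders", "partition": 0, "offset": 1,
             "value": base64.b64encode(json.dumps({"id": 1}).encode()).decode()}
        ]
    }
}
print(handler(sample_event, None))  # -> {'processed': 1, 'items': [{'id': 1}]}
```

Because the handler is a plain function, it can be unit-tested locally with synthetic events before deploying.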

     

    Responsibilities: Works closely with Analytics and Marketing users to understand informational needs and business challenges, document those requirements, and translate them into solutions. Partners with the workstream leads to ensure overall cost, delivery, and quality targets are met. Reviews functional and business specifications from the team and defines detailed ETL technical specifications.

    Utilizes the ETL application to analyze data as necessary to resolve source-data issues, ensuring technical specifications are thorough. Defines, develops, documents, and maintains Informatica ETL mappings and scripts.

    Creates and deploys Informatica workflows in Linux environments. Defines, develops, documents, and maintains procedural and SQL programs. Assists in the development, execution, and documentation of system and integration test plans.

    Performs ETL/Informatica tuning and SQL tuning. Optimizes Informatica code performance and processing methods. Performs design validation, reconciliation, and error handling in load and extract processes (initial and incremental). Supports the production batch environment.

     

    Qualifications: Knowledge & Experience: 3+ years of Informatica PowerCenter experience. Strong SQL knowledge.

    Strong understanding of using ETL tools to integrate systems. Knowledge of the System Development Life Cycle (SDLC). 3+ years working with DB2 or an equivalent RDBMS preferred. Informatica PowerCenter 9.6 experience preferred. Java development experience preferred. Shell scripting experience preferred. Autosys experience preferred. Personal Attributes: Strong analytical and problem-solving skills. Strong focus on quality assurance and accuracy.

     

    Project Details: 2 to 5+ years of experience in data engineering or related technical work, including business intelligence and analytics.

    Experience designing and building scalable and robust data pipelines to enable data-driven decisions for the business

    Very good understanding of data warehousing concepts and approaches.

    Hands-on experience building complex business logic and ETL workflows using Informatica IICS and PowerCenter.

    Proficient in SQL and PL/SQL; Snowflake experience preferred.

    Experience in a scripting language: Python or Unix shell scripting.

    Experience in data cleansing, data validation, and data wrangling.

    Hands-on experience in AWS cloud and AWS-native technologies such as Glue, Lambda, Kinesis, Lake Formation, S3, and Redshift.

    Experience using Spark on EMR, RDS, EC2, Athena, API capabilities, CloudWatch, and CloudTrail is a plus.

    Experience with Business Intelligence tools such as Tableau, Cognos, or ThoughtSpot.
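As a small illustration of the data cleansing and validation item above, the following self-contained Python sketch trims whitespace, rejects rows missing required fields, and coerces types before load. The field names and rules are hypothetical, not from the posting.

```python
def clean_rows(rows, required=("id", "email")):
    """Yield cleaned rows; skip rows missing any required field.

    Hypothetical cleansing rules: strip surrounding whitespace from
    string values, drop rows with empty required fields, and cast
    "id" to int so it loads cleanly into a numeric column.
    """
    for row in rows:
        cleaned = {k: v.strip() if isinstance(v, str) else v
                   for k, v in row.items()}
        if all(cleaned.get(k) not in (None, "") for k in required):
            cleaned["id"] = int(cleaned["id"])
            yield cleaned

raw = [
    {"id": " 1 ", "email": "a@example.com "},
    {"id": "2", "email": ""},  # rejected: empty required field
]
print(list(clean_rows(raw)))  # -> [{'id': 1, 'email': 'a@example.com'}]
```

In a real pipeline this step would typically run inside an Informatica mapping, a Glue job, or a Lambda, but the logic is the same.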

    #LI-remote #PCIT

     

     
