ETL Developer – Informatica
$65/hr ($75/hr with 1.5x overtime)
Python / Lambda
Kafka and Kafka Streams
Responsibilities: Works closely with Analytics and Marketing users to understand informational needs and business challenges, document those requirements, and translate them into solutions. Partners with the work stream leads to ensure overall cost, delivery, and quality targets are met. Reviews functional and business specifications from the team and defines detailed ETL technical specifications.
Utilizes the ETL application to analyze data as needed to resolve issues uncovered in source data, ensuring technical specifications are thorough. Defines, develops, documents, and maintains Informatica ETL mappings and scripts.
Creates and deploys Informatica workflows in Linux environments. Defines, develops, documents, and maintains procedural and SQL programs. Assists in the development, execution, and documentation of system and integration test plans.
Performs ETL/Informatica tuning and SQL tuning. Optimizes Informatica code performance and processing methods. Performs design validation, reconciliation, and error handling in load and extract processes (initial and incremental). Supports the production batch environment.
Qualifications: Knowledge & Experience: 3+ years of Informatica PowerCenter software experience. Strong SQL knowledge.
Strong understanding of using ETL tools to integrate systems. Knowledge of the System Development Life Cycle (SDLC). 3+ years working with DB2 or an equivalent RDBMS preferred. Informatica PowerCenter 9.6 experience preferred. Java development experience preferred. Shell scripting experience preferred. AutoSys experience preferred.
Personal Attributes: Strong analytical and problem-solving skills. Strong focus on quality assurance and accuracy.
Project Details: 2 to 5+ years of experience in data engineering or related technical work, including business intelligence and analytics
Experience designing and building scalable and robust data pipelines to enable data-driven decisions for the business
Strong understanding of data warehousing concepts and approaches
Hands-on experience building complex business logic and ETL workflows using Informatica IICS and PowerCenter
Proficient in SQL and PL/SQL; Snowflake experience preferred
Experience in at least one scripting language: Python or Unix shell scripting
Experience in data cleansing, data validation, and data wrangling
Hands-on experience in AWS cloud and AWS native technologies such as Glue, Lambda, Kinesis, Lake Formation, S3, Redshift
Experience using Spark on EMR, RDS, EC2, Athena, API capabilities, CloudWatch, and CloudTrail is a plus
Experience with Business Intelligence tools such as Tableau, Cognos, or ThoughtSpot
© 2023 PeopleCaddie. All rights reserved.