Data Engineer


  • Job ID: 4329
  • Pay rate range: $75/hr - $85/hr
  • City: Chicago
  • State: Illinois
  • Duration: 05/28/2023 - 11/28/2023
  • Job Type: Contract
  • Job Description:

    Senior Data Engineer (Cloud)

    Pay: $75-$85/hr (W-2)

    Contract Length: 6-month contract (potential extension)

    Location: Chicago, IL 

     

    The Senior Cloud/Data Engineer will design, code, test, and analyze software programs and applications. This includes researching, designing, documenting, and modifying software specifications throughout the production lifecycle. The role also involves analyzing and resolving software errors in a timely and accurate fashion and providing status reports where required.

     

    Responsibilities:

    • Work with the Product team to determine requirements and propose approaches to address users' needs.

    • Analyze requirements to determine the approach and proposed solution.

    • Design and build solutions using relevant programming languages.

    • Thoroughly test solutions using relevant approaches and tools.

    • Conduct research into software-related issues and products.

    • Design and develop enterprise data solutions based on architecture and standards, leveraging leading architecture practices and advanced data technologies.

    • Implement solutions for data migration, data delivery, and ML pipelines.

    • Implement Identity and Access Management (IAM) roles and policies.

    • Build resilient, reliable, performant, and secure data platforms and applications.

    • Automate deployments of AWS services and BI applications.

     

    Experience and Skills:

    • 6+ years of experience in application/platform engineering or related technical work, including business intelligence and analytics.

    • 4+ years of experience with AWS cloud data engineering, management, maintenance, or architecture, implementing best practices and industry standards.

    • Experience with data warehousing platforms such as Snowflake, Redshift, or similar.

    • Strong knowledge and established experience with AWS services including but not limited to: S3, EC2, RDS, Lambda, CloudFormation, Kinesis, Data Pipeline, EMR, Step Functions, VPC, IAM, and Security Groups.

    • Experience with database technologies (e.g., SQL, Python, PostgreSQL, Amazon Aurora, Amazon RDS, MongoDB, Redis).

    • Experience with CI/CD tools, pipelines, and scripting for automation (GitHub Actions, Jenkins, AWS CodePipeline, CloudFormation, and Terraform).

    • High degree of knowledge of IAM roles and policies.

    • Strong knowledge of configuring AWS cloud monitoring and alerts for cloud resource availability.

    • Strong scripting experience using PowerShell and/or Python.

    • High degree of knowledge of PaaS and SaaS application performance.

    • Ability to understand enterprise-level application architecture diagrams and IT security requirements.

     

    Must-have skills:

    • Experience designing and building scalable and robust data pipelines to enable data-driven decisions for the business.

    • Effective problem-solving and analytical skills; ability to manage multiple projects simultaneously and report across different stakeholders.

    • Rigorous attention to detail and accuracy.

    • Demonstrated ability to troubleshoot technical problems and issues.

    • Passionate about programming and learning new technologies.

    • Experience planning and executing on-premises-to-AWS migrations.

    • BA or BS degree in Computer Engineering, Computer Science, or a related field.

    • Hands-on experience in Java (Spring, Spring Boot framework).

    • Hands-on experience with AWS cloud architecture and development using AWS resources such as S3, Lambda, API Gateway, DynamoDB, RDS, etc.

    • Hands-on experience in data pipeline development using modern ETL tools, specifically Informatica PowerCenter and/or Informatica IICS.

    • Proficiency in SQL and scripting (Unix shell scripts and/or Python).

    • Hands-on experience troubleshooting complex application, ETL, and SQL problems.

     

    Good-to-have skills:

    • Experience with large-scale enterprise streaming services such as Kafka.

    • Experience with Kubernetes and Docker containers, or AWS Fargate.

    • Experience implementing applications on both Windows and Linux server operating systems.

    • Experience with networking, security groups, or policy management for cloud resources across multiple operating systems, including UNIX, Linux, or Windows.

    • Experience with Informatica MDM.

    • Proficiency in databases (Snowflake, DB2, Redshift, etc.), database concepts, and dimensional modeling.

    • Experience in data cleansing, data validation, and data wrangling.

     

