Hadoop Admin


  • Job ID:

    1781
  • Pay rate range:

    $55 - $60
  • City:

    Dallas
  • State:

    Texas
  • Duration:

    04/15/2019 - 04/15/2020
  • Job Type:

    Contract
  • Job Description

    Pay: $55-$60 per hour

    Duration: 1 year

    Urgent need for a Hadoop Admin/Developer for our client located in Dallas, Texas, for a long-term 1-year contract with possible extension.

    Job description/requirements for the Hadoop Admin/Developer needed:

    Must have the following: 

    • Bachelor’s degree in computer science, information systems, or related technology field or equivalent experience.
    • 10+ years of overall experience
    • 6+ years of Hadoop Administration experience in a big data environment and 2+ years of Splunk Administration experience
    • 4+ years of software development experience in a big data environment
    • Should have a solid understanding of core application architecture design, development, best practices, and application/data security.
    • Experience in Hadoop Administration activities such as installation, configuration, and management of clusters in Cloudera (CDH) and Hortonworks (HDP) distributions using Cloudera Manager and Ambari.
    • Hands-on experience installing, configuring, and using Hadoop ecosystem components such as HDFS, MapReduce, Hive, Impala, Sqoop, Pig, Oozie, ZooKeeper, Spark, Solr, Hue, Flume, Accumulo, Storm, Kafka, and YARN.
    • Administration, Configuration and proactive monitoring of Splunk
    • Need a good understanding of architecture definition and design of integrated IT solutions.
    • Monitoring, troubleshooting, and incident resolution
    • Alerts management, capacity management
    • Backup/archival management & restoration
    • Ability to manage data retention policies and perform index administration, maintenance, optimization, and configuration backups
    • Good working knowledge of the Cloudera big data platform (Cloudera Hadoop implementation on Oracle BDA)
    • Good working knowledge of UNIX, Linux OS
    • Must follow application security best practices and will be responsible for developing and deploying mitigations for any application or security vulnerabilities.
    • Coordinates with the offshore team to ensure that both platform support and data engineering services are delivered optimally
    • This role will also be responsible for architecting end to end Data Engineering solutions
    • Will coordinate with business users for architecting Data Engineering solutions
    • Designs the solution approach and finalizes strategy.

    Candidates from outside the area are acceptable; interviews will be conducted via WebEx.

    #PCIT, #IT 
