- This is a contract to hire position with the client. Only GC holders or citizens may apply.
As a member of our Big Data Services group, you will be responsible for building and supporting large-scale Hadoop environments, including design, capacity planning, cluster setup, performance tuning, monitoring, and development assistance. In addition, you will:
- Work with the Daman Big Data engineers to define the infrastructure requirements
- Build utilities required for development and maintenance of solutions on the Hadoop cluster.
- Interact with users to define system requirements and/or necessary modifications
- Define and develop client specific best practices around data management within a Hadoop environment
- Perform cluster maintenance tasks such as creation and removal of nodes, cluster monitoring and troubleshooting
- Experience in setup, configuration and management of Hadoop clusters
- Certification in one of the Hadoop distributions is a plus (MapR, Cloudera, Hortonworks, or BigInsights)
- Experience with Hive, MapReduce, YARN, Spark, Sentry, Oozie, Sqoop, Flume, HBase, Impala, etc.
- Experience in installing, administering, and supporting Linux operating systems and hardware in an enterprise environment.
- Expertise in typical system administration and programming skills, such as storage capacity management and performance tuning
- Proficiency with UNIX utilities and scripting (Perl, Python, shell, Expect, etc.)
- Experience with high-availability clusters (HACMP)
- Experience with automating production processes and procedures
- Relocation Assistance: Yes
Would you consider a candidate at $75.00 per hour?
Can we submit a third-party candidate as well, i.e., one whose visa is sponsored by another company?
I would like to know if relocation assistance is offered for this job.