
Linux Administrator (Big Data - Hadoop)

Location: Zürich, Switzerland
Sector: Big Data
Job type: Contract
Salary: Negotiable
Reference: BBBH136713

Experis is the global leader in professional resourcing and project-based workforce solutions. Our suite of services ranges from interim and permanent recruitment to managed services and consulting, enabling businesses to achieve their goals. We accelerate organisational growth by attracting, assessing, and placing specialised professional talent.

Key Responsibilities:

Big Data administration and operational support for all Big Data clusters globally, from initial installation through to fully operational clusters.

- Installation of the Big Data platform

- Automation of upgrade and installation steps

- Scripting with Ansible and Python to install and upgrade the Big Data platform

- Troubleshooting and supporting the Big Data platforms across all regions
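For illustration only (not part of the original advert): the upgrade automation described above is often structured as an ordered list of idempotent steps, so that a failed run can be retried without repeating completed work. All names below are hypothetical placeholders; in practice each step would invoke an Ansible playbook or Cloudera tooling.

```python
# Hypothetical sketch: run named upgrade steps in order, skipping any
# step already completed on a previous (failed) run.

def run_upgrade(steps, state=None):
    """Execute each (name, callable) step once; `state` records completed steps."""
    state = set() if state is None else state
    for name, step in steps:
        if name in state:
            continue  # already done on a previous run, skip for idempotency
        step()
        state.add(name)
    return state

# Placeholder steps (real ones would shell out to Ansible playbooks).
log = []
steps = [
    ("stop_services",   lambda: log.append("stop_services")),
    ("upgrade_parcels", lambda: log.append("upgrade_parcels")),
    ("restart_cluster", lambda: log.append("restart_cluster")),
]
done = run_upgrade(steps)
```

Passing the returned `state` back into a later `run_upgrade` call resumes from the first incomplete step.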

Challenges the Contractor will face in this role:


As this is an operational and support role, the candidate should be flexible in handling multiple requests, tickets, issues, and deliverables.

Essential Skills and Qualifications:

  1. 6+ years of experience in enterprise IT at a global organization
  2. 4+ years in Big Data support, with a Big Data administration and engineering background
  3. Strong Scripting knowledge - Python and UNIX
  4. Big Data Administration experience - using Cloudera and Hadoop ecosystem
  5. Fluent English

Desired Skills and Qualifications:

Strong knowledge of CDH, CDP, and Cloudera Hadoop components such as Cloudera Manager, HDFS, Sentry, HBase, Hive, Impala, Hue, Spark, Kafka, Kudu, Phoenix, Ozone, Ranger, Atlas, Knox, NiFi, YARN, Tez, Livy, Oozie, Solr, Sqoop, ZooKeeper, and PostgreSQL. You are proficient in Python, Ansible, Salt, and DevOps technologies, as well as Linux and Bash scripting.

  • You have good knowledge of cluster maintenance, including the creation and removal of nodes using tools like Cloudera Manager, and of performance tuning of Hadoop clusters and Hadoop MapReduce routines. You build and maintain positive relationships with colleagues from other application teams to install Hadoop updates, patches, and version upgrades
  • You design and implement HBase column family schemas, apply different HDFS formats and structures such as Parquet, Avro, and ORC to speed up analytics, fine-tune Hadoop applications for high performance and throughput, and monitor, troubleshoot, and debug Hadoop ecosystem runtime issues
  • You bring experience in service management (e.g. release management and incident management) as well as experience with deployment toolsets like Odyssey, Git, and Tableau
  • You are a dedicated problem solver with production support experience and an ability to cooperate within a multidisciplinary, global team. You are a self-starter with a strong curiosity for extracting knowledge from data and the ability to elicit technical requirements from a non-technical audience
  • You bring 6+ years of experience in enterprise IT at a global organization with recent 5+ years in Big Data support and you are fluent in written and spoken English

Candidate Value Proposition:

An excellent chance to work with the newest technologies: this is a great opportunity for anyone with a passion for learning new technologies and tools in the Big Data area.

Interested in this opportunity? Send us your CV today through the link in the advert. Should you have any questions, please contact Danny Besse on +41 44 229 99 45.

Even if this position is not the perfect fit for you, please reach out to us, as we have hundreds of open positions at Experis IT across Switzerland.

Check out all of Experis' job openings at or visit my personal page and connect with me on LinkedIn.