Lead Hadoop Admin Administrative & Office Jobs - San Mateo, CA at Geebo

Lead Hadoop Admin

Lead Hadoop Admin, Foster City, CA. Onsite from Day 1.
Interview Process: 3 rounds of Zoom video interviews.

Job Description
Responsible for implementation and ongoing administration of Hadoop infrastructure.
Responsible for cluster maintenance, troubleshooting, monitoring, and following proper backup and recovery strategies.
Provisioning and managing the lifecycle of multiple clusters such as EMR and EKS.
Infrastructure monitoring, logging, and alerting with Prometheus/Grafana/Splunk.
Performance tuning of Hadoop clusters and workloads, and capacity planning at the application/queue level.
Responsible for memory management and queue allocation; distribution experience in Hadoop/Cloudera environments.
Should be able to scale clusters in production and have experience with 18/5 or 24/5 production environments.
Monitor Hadoop cluster connectivity and security; manage and monitor the file system (HDFS).
Investigates and analyzes new technical possibilities, tools, and techniques that reduce complexity, create a more efficient and productive delivery process, or create better technical solutions that increase business value.
Involved in fixing issues, performing root cause analysis (RCA), and suggesting solutions for infrastructure/service components.
Responsible for meeting Service Level Agreement (SLA) targets, and collaboratively ensuring team targets are met.
Ensure all changes to the Production systems are planned and approved in accordance with the Change Management process.
Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
Maintain central dashboards for all system, data, utilization, and availability metrics.
Required Skills:
8-12 years of total experience, with at least 5 years developing, maintaining, optimizing, and resolving issues in Hadoop clusters and supporting business users.
Experience in Linux/Unix OS services, administration, and Shell/awk scripting.
Strong knowledge of any one programming language (Python/Scala/Java/R) with debugging skills.
Experience in the Hadoop ecosystem (MapReduce, Hive, Pig, Spark, Kafka, HBase, HDFS, HCatalog, ZooKeeper, and Oozie/Airflow).
Experience in Hadoop security (Kerberos, Knox, TLS).
Hands-on experience in SQL and NoSQL databases (HBase) with performance optimization.
Experience in tool integration, automation, and configuration management with Git and Jira.
Excellent oral and written communication and presentation skills; strong analytical and problem-solving skills.

Recommended Skills:

  • Awk (Programming Language)
  • Administration
  • Airflow
  • Analytical
  • Apache HBase
  • Apache Hadoop

Estimated Salary: $20 to $28 per hour based on qualifications.
