Elevate Your Career: Discover Exciting Opportunities in our Latest Job Openings
Hyderabad/Mangalore
Full Time
Posted Date: 21-06-2024
About Us
Established in 2019, EverExpanse is a team of payment experts specializing in innovative and secure payment solutions. We offer expertise in EMV Kernels, POS Development, payment gateway development, integration, and QR code payment systems. In addition, we provide professional consultants to support businesses in their payment and digitization goals. Our focus is on delivering seamless, reliable solutions that meet industry standards, empowering businesses to integrate advanced payment technologies and achieve their product goals.
Experience: 6 to 10 years
Bachelor’s degree in computer science or an equivalent degree/experience, plus relevant work experience and IT-related training/certifications.
We are seeking a Hadoop DBA to provide expert-level support for our Hadoop cluster environment.
The ideal candidate is articulate, approachable, and has practical knowledge of clusters, log monitoring, database replication, and high availability of resources.
The position involves assisting development staff in planning and execution by recommending solutions to complex data issues while maintaining overall system health.
Must be able to troubleshoot time-sensitive production issues promptly.
Responsible for implementation and ongoing administration of Hadoop infrastructure.
Working with data delivery teams to set up new Hadoop users. This includes setting up Kerberos principals, testing HDFS, Hive, Pig, Spark, Impala, MapReduce, and Hue, and configuring access for the new users.
Cluster maintenance, including creation and removal of nodes, using tools such as Ganglia, Nagios, and Cloudera Manager Enterprise.
Diligently teaming with the infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability.
Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
Monitor Hadoop clusters, including their connectivity and security.
Manage and review Hadoop log files.
File system management and monitoring.
HDFS support and maintenance.
Point of Contact for Vendor escalation.
Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
Screen Hadoop cluster job performance and capacity planning.
Disk space management.
Data modelling, design & implementation based on recognized standards.
Database backup and recovery.
Automate manual tasks.
Superior knowledge of administering Hadoop in a Linux environment.
Expertise in setting up clusters and configuring high availability.
Ability to accurately monitor environments, resolve issues and plan capacity improvements.
Solid understanding of storage environments used by Big Data systems.
Knowledge of cluster monitoring tools such as Ambari, Cloudera SCM, Ganglia, or Nagios.
Knowledge of Hadoop ecosystem components such as HDFS, YARN, Hive, Impala, Hue, and Pig.
Good understanding of OS concepts, network configuration, process management, and resource scheduling.
Good knowledge of different cloud environments.
Highly self-motivated, with excellent time management skills. Strong interpersonal and communication skills.
6-10 years of experience with Hadoop technologies and all aspects of enterprise-grade Big Data solutions.
Proven experience with the full Hadoop ecosystem stack, e.g., Hive, Spark, Isilon, Kerberos.
Experience working in secure environments (PCI, HIPAA, TS-SCI, etc.).
Experience with programming languages such as Python and Java, and with scripting.
Demonstrated ability to collaborate with multiple teams to deliver successful solutions.
Expertise in general operations, including troubleshooting.
Adheres to standards and guidelines.
Flexible with work hours; provides 24x7 support for critical production systems.
Continuously improves skill sets.