Posted 2y ago

Hadoop App Support | NJ, Texas

@ Photon
United States
Onsite | Full Time
Responsibilities: Supporting applications, monitoring production, debugging issues
Requirements Summary: Bachelor's degree in a technical or business-related field, 7 years of experience in data warehousing, and 4 years in big data (Cloudera).
Technical Tools Mentioned: Hadoop, Cloudera, HDFS, MapReduce, Hive, Impala, Spark, Kafka, Linux, Unix, SQL
Job Description

Position Summary: 

    

Hadoop Application Support specialist able to work on one or more applications in the Hadoop data lake.

Able to support existing Hadoop applications.

Able to support the applications in rotating shifts (24x7).

 

 

Job Description: 

 

Essential Duties and Responsibilities: 

Following is a summary of the essential functions for this job. Other duties, both major and minor, may be performed that are not mentioned below, and specific activities may change from time to time.

 

Support multiple projects with competing deadlines 

24/7 monitoring of production applications to ensure SLAs are met

Triage and remediation of job failures

Work with upstream/downstream application teams on any delays or data issues

Ability to generate and submit reports to senior management

Sound understanding of and experience with the Hadoop ecosystem (Cloudera); able to explore the constantly evolving tools within the ecosystem and apply them appropriately to the problems at hand

Experience working with a big data implementation in a production environment

Experience with HDFS, MapReduce, Hive, Impala, Spark, Kafka, and Linux/Unix technologies is mandatory

Experience in Unix shell scripting is mandatory 

Able to analyze existing shell, Python, or Perl code to debug issues or enhance it

Sound knowledge of relational databases (SQL) and experience with large SQL-based systems

Strong IT consulting experience in various data warehousing engagements, handling large data volumes and architecting big data environments

Deep understanding of algorithms, data structures, performance optimization techniques, and software development in a team environment

Benchmark and debug critical issues with algorithms and software as they arise

 

Required Skills and Competencies: 

 

Bachelor's degree in a technical or business-related field, or equivalent education and related training 

Seven years of experience in data warehousing architectural approaches and a minimum of four years in big data (Cloudera)

Exposure to and strong working knowledge of distributed systems  

Excellent understanding of client-service models and customer orientation in service delivery 

Ability to grasp the 'big picture' for a solution by considering all potential options in the impacted area

Aptitude to understand and adapt to newer technologies 

Ability to work with teammates collaboratively to achieve a mission

Presentation skills to prepare and present to large and small groups on technical and functional topics 

 

Desired Skills: 

Previous experience in the financial services industry 

Previous experience in production support 

Broad technical experience, a good understanding of existing testing/operational processes, and an open mind about how to enhance them

Understanding of industry trends and relevant application technologies 

Experience in designing and implementing analytical environments and business intelligence solutions 

 

Additional information: 

 

Hours: 40 per week