Must be US Citizen with ability to obtain public trust clearance
Key Required Skills:
Hadoop, Ansible, DevOps, Flexibility and Initiative, Red Hat Linux
Design and implement Big Data analytic solutions on a Hadoop-based platform. Refine a data-processing pipeline focused on unstructured and semi-structured data. Support both quick-turn, rapid implementations and larger-scale, longer-duration efforts.
Detailed Skills Requirements:
• Experience in MapReduce programming with Hortonworks Hadoop and the Hadoop Distributed File System (HDFS), and with processing large data stores
• Ability to show flexibility, initiative, and innovation when dealing with ambiguous and fast paced situations
• Experience with Hortonworks Hadoop, Amazon EMR, Amazon S3, Spark, and Red Hat IdM
• Experience with R
• Experience with Python
• Experience with deploying applications in a Cloud environment using Ansible
• Eclipse and/or IntelliJ IDE
• Java Enterprise Edition
• Git source code control
• Maven for dependency management, builds, and testing
• Git and Bash scripting
• Atlassian Tool Suite (Confluence/Jira/Bitbucket)

May be scheduled for after-hours on-call support and will be required to apply production packages during non-peak hours.
Education: Bachelor’s degree or equivalent experience.