This position is responsible for developing, integrating, testing, and maintaining existing and new applications; it requires proficiency in one or more programming languages and familiarity with one or more development methodologies / delivery models.
Required Job Qualifications:
Bachelor's degree and 4 years of Information Technology experience, OR Technical Certification and/or college courses and 6 years of Information Technology experience, OR 8 years of Information Technology experience.
Possess ability to sit and perform computer entry for the entire work shift, as required.
Possess ability to manage workload, multiple priorities, and conflicts with customers/employees/managers, as applicable.
Must have extensive hands-on experience designing, developing, and maintaining software solutions on a Hadoop cluster.
Must have strong experience with UNIX shell scripting, Sqoop, Eclipse, and HCatalog.
Must have experience with NoSQL databases such as HBase, MongoDB, or Cassandra.
Must have experience developing Pig scripts, HiveQL, and UDFs for analyzing structured, semi-structured, and unstructured data flows.
Must have working experience developing MapReduce programs that run on the Hadoop cluster, using Java or Python.
Must have working experience with Spark and Scala.
Must have hands-on experience using Talend with Hadoop technologies.
Must have knowledge of cloud computing infrastructure (e.g., Amazon Web Services EC2) and considerations for scalable, distributed systems.
Must demonstrate Hadoop best practices.
Must have working experience with data warehousing and Business Intelligence systems.
SDLC Methodology (Agile / Scrum / Iterative Development).
Systems change / configuration management.
Business requirements management.
Problem solving / analytical thinking.
Preferred Job Qualifications:
Bachelor's degree in Computer Science or Information Technology.