We are searching for an experienced Hadoop Developer who is excited about building large-scale distributed database systems that run across multiple physical data centers. The developer will be responsible for the design, development, and performance of our Hadoop technologies. (Experience with Hortonworks implementations is preferred.)
- Bachelor’s degree in computer science or related field.
- 3+ years of strong hands-on experience developing code in Hadoop using Spark (Scala/Java), Hive, HBase, Pig, Spark SQL, Big SQL, Sqoop, etc.
- 5+ years of strong hands-on experience coding in Java, Python, Scala, Unix shell scripting, relational databases (RDBMS), and PL/SQL.
- Strong foundational knowledge of Unix, Spark, Kafka, NiFi, MiNiFi, Hadoop architecture, and streaming architectures.
- Experience with Test-Driven Development and with building and delivering unit and integration test frameworks.
- Experience setting up pipelines in schedulers such as Airflow, Azkaban, Oozie, etc.
- Must be a self-starter who is detail-oriented and possesses strong problem-solving skills.
- Excellent communication, time management, and technical presentation skills.
- Manage data extraction jobs and build new data pipelines from various structured and unstructured sources into Hadoop.
- Build, execute, or support test certification scripts; analyze and validate results against established success criteria.
- Build a scalable integration framework for data ingress to and egress from Hadoop.
- Enable environment sustainability through technology training and development of user guides and knowledge artifacts.
- Provide work guidance and technical assistance to junior engineers.
- Application design and data modeling experience in Hadoop Ecosystem.
- Analyze and fix data processing issues and troubleshoot production issues.
Gentis Solutions offers excellent compensation and benefits, including wages above regional averages; full health, dental, and vision coverage; a 401(k); and two weeks of paid time off in year one.