In addition to meeting the labor category requirements for which the candidate is submitted, candidates must also meet the following:
• Architect and develop cloud analytics/software hosted in corporate and/or local cloud environments
• Integrate cloud analytics with system internal and external components
• Conduct analysis of alternatives for analytic architectures and technologies, as well as analytic performance analysis and testing
• Transition requirements for processing large datasets into an analytic architecture and a cohesive system of analytics, potentially including streaming and non-streaming processing
• Perform size estimation and storage requirement analysis of analytic input and output
• Develop, maintain, and enhance complex and diverse software systems (e.g., processing-intensive analytics, novel algorithm development, manipulation of extremely large data sets, and real-time systems)
• Translate algorithm specifications into analytic code
• Correlate multiple datasets to generate enriched events
• Develop cloud analytic components, including software requirements analysis and synthesis from the system level down to individual software components
• Additional responsibilities may include support for system internals, data flow, the web user interface (UI), and/or results processing and analysis
• Knowledge of and experience with Portable MapReduce
• Analytic development experience with Java (preferred) or Pig
• Knowledge of and experience with GHOSTMACHINE environments
• Experience in application development and migration into cloud analytic environments and technologies, as well as customer cloud data types, protocols, and policies
• Experience working with open source cloud technologies such as Hadoop and OpenStack
• Direct experience with an intelligence community or signals intelligence activity
• Experience with developing in cloud environments using technologies including Accumulo, Elasticsearch, and Java
• Knowledge of cloud formats (TCLD) and data tagging
• Experience with intelligence community missions, data flow, and data formats (ASDF, CoPilot, etc.)
• Experience with streaming platforms such as Flink, Spark, Storm, IBM Streams, etc.
• Experience with Agile/Scrum methodologies and tools such as JIRA
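To illustrate the kind of analytic development described above, the sketch below shows one way to correlate two datasets into enriched events, as in the responsibilities list. This is a minimal, hypothetical example in Java (the posting's preferred language); the class and method names are illustrative, not part of any program system.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/** Minimal sketch: correlate raw events with reference data to emit enriched events. */
public class EventEnricher {

    /**
     * Joins events (key -> payload) against reference data (key -> context)
     * on their shared key, emitting one enriched record per matched event.
     */
    public static List<String> enrich(Map<String, String> events,
                                      Map<String, String> reference) {
        List<String> enriched = new ArrayList<>();
        for (Map.Entry<String, String> e : events.entrySet()) {
            String context = reference.get(e.getKey()); // correlate on the shared key
            if (context != null) {
                enriched.add(e.getValue() + " | " + context);
            }
        }
        return enriched;
    }

    public static void main(String[] args) {
        Map<String, String> events = new HashMap<>();
        events.put("id-1", "login event");
        Map<String, String> reference = new HashMap<>();
        reference.put("id-1", "user=alice");
        System.out.println(enrich(events, reference)); // [login event | user=alice]
    }
}
```

In a production cloud analytic, the same key-based correlation would typically be expressed as a join in a batch or streaming framework (e.g., MapReduce, Pig, or Flink) rather than in-memory maps.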