Informatica BDM Hadoop Architect

Role: Informatica BDM Hadoop Architect
Location: Pasadena, CA
Interview: Phone/Skype
Emp Type: Permanent job
 
10+ Years 
• Conduct effective JAD/JAR sessions with the IT team and business users to gather and clarify business requirements.
• Experience across the complete Software Development Life Cycle (SDLC): requirements gathering, requirement analysis, data analysis and mapping, system architecture and design, development support, testing and deployment support of business applications.
• Able to create data models, solution designs and data architecture documentation for complex information systems that integrate multiple data sources and data extracts across multiple lines of business.
• Design, test and create data models, and prepare, generate and test install scripts (DDL) for Data Warehouse/Data Mart environments using tools such as Erwin, ER/Studio and PowerDesigner.
• Able to create a physical database design for the Hadoop landing zone from the logical data model.
• Perform data profiling and analysis using the Hue browser for Hive/Impala, working with the ETL team during source-to-PDM data mapping.
• Design end-to-end data flow architectures using HBase and Kafka for streaming data.
• Design, implement and manage an operations framework (security, automation, incident management, monitoring and notification) in a hybrid cloud context.
• Provide technical architecture guidance to teams and support management in identifying and implementing competency development measures.
• Assist the development team by providing business and functional logic for development.
• Identify and implement attributes and relationships in the source model and cleanse unwanted tables/columns as part of data analysis responsibilities.
• Able to perform normalization to create logical/physical data models in Third Normal Form (3NF) in the warehouse area of the Enterprise Data Warehouse.
• Create 3NF business-area data models with denormalized physical implementations, performing data and information requirements analysis using the Erwin tool.
• Apply normalization/denormalization techniques for effective, optimal performance in the Refined Zone model and in the creation of data marts or OLAP environments.
Highly Specialized (10+ yrs.)
• Own development and updates of project artifacts such as the data model, source-to-target mapping and data flow diagrams.
• Perform data source profiling and gap analysis, and work with business and source SMEs to get metadata approved for new data elements added to the project.
• Work closely with the development team while designing internal and external data interfaces to ensure that database development meets client specifications.
• Create detailed source-to-target ETL design mapping documents and assist ETL developers in the detailed design and development of ETL maps using Informatica.
• Work closely with the application development team to implement data strategies, build data flow diagrams and develop conceptual data models; review the data model with the functional and technical teams.
• Contribute to data landscape standards and guidelines.
• Help maintain data integrity and monitor data quality to ensure effective system functioning.
• Integrate data sources with multiple relational databases such as SQL Server, Teradata, Oracle and DB2.
• Assist reporting developers in building reports by creating reporting mapping documents.
• Employ Agile techniques for collaborative dimensional modeling, from requirements to whiteboard to star schema.
• Create ERwin reports in HTML or RTF format as required, publish data models to the model mart, create naming-convention files, coordinate with DBAs to apply data model changes, and create DDL scripts to implement data modeling changes.
• Strong communication skills.
Bachelor's degree in a related field and/or 10+ years of equivalent work experience. 
