- 14 years of IT experience with strong technical skills focused on data warehousing and business intelligence using Informatica PowerCenter, Informatica Data Quality, and Master Data Management as ETL tools, plus Oracle, Teradata, Redshift, and Unix shell scripting, with key emphasis on requirement gathering, data analysis, and the design and architecture of ETL solutions across all project phases.
- Experience in designing and architecting end-to-end ETL solutions. Strong hands-on exposure to Informatica components such as mappings, mapplets, sessions, workflows, reusable components, tasks, and the debugger; skilled in troubleshooting mapping failures and in tuning ETL for better performance with high data volumes.
- Strong expertise in data warehouse architecture, including data modeling skills with experience in relational modeling theory and star and snowflake schemas.
- Extensive experience with Informatica upgrades from 7.1 to 9.6 and with data profiling. Good understanding of IDQ validations and MDM match-and-merge techniques, including the rules and configuration for the survivor/golden record.
- Proficient in logical data models, physical data models, data warehouse concepts, and dimensional modeling using star and snowflake schemas. Experienced with SQL and PL/SQL, including packages, procedures, functions, triggers, tables, views, indexes, constraints, and synonyms. Good hands-on project exposure to data migration from Oracle 8 to Teradata 13.1 and from Netezza to Redshift.
- Hands-on Unix shell scripting experience for handling ETL loads, file handling, and process automation. Good exposure to scheduling tools such as Autosys, Informatica Scheduler, Tidal, and OpCon.
- Good exposure to interacting with business users and business analysts, analyzing business requirements, and translating them into functional and technical design specifications. Experience writing high-level and detailed design documents and mapping documents based on business requirements.
- Experience in Informatica enterprise platform setup: server installation, configuration, and administration of the Informatica PowerCenter client/server. Created and managed users, user groups, folders, and upgrades.
- Handled code migration/deployment activities and QA-related activities, with a very good understanding of quality assurance policies and procedures. Conducted technical training on the ETL process, DW concepts, and the Informatica tool.
- Experience leading onsite and offshore teams in an onsite-offshore model. Provided technical and managerial oversight of all application development projects: developed concepts and strategies; defined scope, procedures, and objectives; and coordinated cross-functional technical team resources. Involved in project planning and estimation; responsible for project tracking, deliverables, and periodic customer status reporting.
- Able to work independently, with strong analytical and technical skills, good communication, and good team-building and management skills. Good team player with excellent interpersonal skills and customer interaction capabilities. A quick learner and mentor with the ability to make solutions-oriented, creative, and innovative contributions in highly demanding situations.
ETL Tools: Informatica PowerCenter 10/9.X/8.X, Informatica Data Quality, Kalido
Database Systems: Oracle 9i/10g/11g (SQL, PL/SQL), Teradata, Redshift, SQL Server
Languages: Unix shell scripting
Operating Systems: Windows, Unix
Scheduling Tools: Autosys, Tidal, OpCon, Informatica Scheduler
Version Control: Bitbucket, Tortoise
Other Tools: Toad, SQL Developer, Oracle Forms and Reports, PuTTY, Visio, Erwin 3.x/4.x, Mercury Quality Center, ServiceNow, Jira
- Upgrade the current ETL platform from Informatica 9.6 to 10.2 on the new Linux host, troubleshooting issues as they arise and evaluating data to make recommendations to management.
- Review and ensure that data loaded into the data warehouse is accurate; analyze the data and propose process improvements.
- Responsible for designing, building, and supporting the components of the data warehouse, such as ETL processes, databases, reports, and the reporting environment.
- Coordinated with all upstream and downstream applications on SSH public key setup and tested connectivity from the remote hosts to the new server; handled cipher settings during SFTP.
- Coordinated between Informatica admins and system admins to tackle challenges during the engagement.
- Migrated components, identified issues, and came up with solutions to address failures.
- Handled the necessary file-system path changes in all shell scripts, addressing hardcoded paths in scripts and Informatica to fit the ETL platform architecture.
- Tested drivers on the new machine, including DB connectivity and the HTTP, Java, and JSON file downloads used in the application.
- Made Autosys JIL changes for the new Linux server: profile and command call updates.
- Prepared several one-off scripts to address issues that arose during the server upgrade:
- Informatica password variable handling for the pmcmd command
- Profile file path changes in scripts and files, fixing hardcoded paths to fit the platform
- Mail command changes
- SFTP command changes
- End-of-line (EOL) character issues with trailing white space
- Autosys JIL updates: download, replace, and upload the revised JIL
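The script-level fixes above can be sketched roughly as follows. This is a minimal illustration, not the actual upgrade scripts: every path, host, service, and variable name here is a placeholder, and the pmcmd, mailx, and sftp invocations are shown as command strings only.

```shell
#!/bin/sh
# Sketch of one-off fixes applied during a 9.6 -> 10.2 server move.
# All paths, hosts, and names are illustrative placeholders.
set -e
work=$(mktemp -d)

# 1. Replace hardcoded file-system paths with the new ETL root.
ETL_HOME=/opt/informatica/etl
printf 'DATA_DIR=/old/infa96/etl/data\n' > "$work/load.sh"
sed "s#/old/infa96/etl#${ETL_HOME}#g" "$work/load.sh" > "$work/load.sh.new"
mv "$work/load.sh.new" "$work/load.sh"

# 2. Pass the Informatica password to pmcmd via an environment
#    variable (-pv) instead of plain text (-p). Command string only;
#    pmcmd is not invoked here.
PMCMD="pmcmd startworkflow -sv INT_SVC -d DOM -u etl_user -pv INFA_PASSWD -f FLDR wf_daily"

# 3. Strip carriage returns and trailing white space (the EOL issues).
printf 'A|1  \r\nB|2\t\r\n' > "$work/extract.dat"
CLEANED=$(tr -d '\r' < "$work/extract.dat" | sed 's/[[:space:]]*$//')

# 4. Updated mail and SFTP invocations (command strings only).
MAIL_CMD='mailx -s "ETL status" ops@example.com'
SFTP_CMD='sftp -o Ciphers=aes256-ctr etl_user@partner-host'

grep DATA_DIR "$work/load.sh"
printf '%s\n' "$CLEANED"
```

Keeping the password in an environment variable rather than on the pmcmd command line also keeps it out of the process list and shell history.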
- Gathered requirements from business, IT, and other vendors; coordinated with other project stakeholders and cross-functional teams on queries, requests, and data deviations. Analyzed gaps in the ETL process and came up with a recommended approach and implementation plan.
- Responsible for translating business requirements into technical specifications. Handled ETL and DB design, data analysis, data modeling, and impact analysis; identified data usage for statistical decisions; documented and maintained the process and flow on the Confluence page.
- Helped the enterprise link all of its critical data to a common point of reference. Profiled data, troubleshot issues as they arose, evaluated data to make recommendations to management, and coordinated with business analysts and developers to translate data requirements into logical data models for better business decisions.
- Designed the data quality framework and data reconciliation. Created processes for maintaining and capturing metadata.
- Handled support operations by monitoring and analyzing master data, key data, and master relationship data within the organization. Ensured master data integrity in key systems and maintained the processes supporting data quality.
- Architected the framework for data quality/data validations, audits, and data age and size monitoring. Tuned long-running ETLs to improve performance; handled data profiling, gap assessment, and implementation of best practices.
- Handled the Informatica repository upgrade from 9.6 (1 node) to 10.2 (3 nodes), including script-level changes and AutoSys changes, and set up the environment.
- Handled the task tracker, code reviews, task assignment to the team, and ownership of deliverables. Performed data validation and data profiling, plus reviews and quality checks for deliverables. Assisted the team with ETL and DB design solutions and addressed technical challenges.
- Assisted clients, business, and other internal teams with process and data/information needs.
ETL Lead / Designer
- Worked with business and IT teams to understand business requirements and transform them into effective technology solutions.
- Worked closely with data modelers to understand the data architecture alongside the functional requirements.
- Handled ETL and DB design for the project requirements; assisted the internal team with technical challenges and solution aspects.
- Responsible for identifying areas for data quality improvement and helping resolve data quality problems through the appropriate choice of error detection and correction, process control and improvement, or process design strategies.
- Managed, analyzed, and resolved data initiative issues and managed revisions needed to best meet internal and customer requirements while adhering to published data standards.
- Assisted in data management, governance, and data quality of master data requirements with other functional data owners to ensure master data integrity across the operation of financial systems is consistent and meets stated business rules and requirements.
- Created high-level and low-level design documents, ETL specification documents, test case documents, and traceability matrices.
- Developed and enhanced Informatica mappings, sessions, and workflows; created reusable objects such as mapplets, Oracle SQL/PL-SQL code, Unix scripts, and OpCon jobs based on business needs.
- Created Unix shell scripts for file handling and OpCon scheduling.
- Created triggers, PL/SQL code, and OpCon jobs for the project requirements.
- Handled day-to-day data analysis of user queries; handled fixes to history and incremental data and ETL code, the task tracker, code reviews, onsite/offshore coordination, and overall deliverables.
- Created deployment groups and rollout plan documents for production migration (mappings, workflows, parameter files, and Unix scripts) and provided post-production support during warranty.
- Handled code review requests for the entire CCL with the Cognizant Platinum tool.
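A file-handling script behind such scheduler-driven loads typically follows an arrival-check, validate, archive pattern. A minimal sketch, where the directory layout and the "T|&lt;rowcount&gt;" trailer convention are assumptions for illustration:

```shell
#!/bin/sh
# Minimal arrival-check / validate / archive pattern for an inbound ETL file.
# Directory layout and the "T|<rowcount>" trailer convention are assumed.
set -e
base=$(mktemp -d)
mkdir -p "$base/inbound" "$base/archive"

# Simulated inbound file: two data rows plus a trailer carrying the count.
printf 'D|100|alpha\nD|101|beta\nT|2\n' > "$base/inbound/orders.dat"

f="$base/inbound/orders.dat"
[ -s "$f" ] || { echo "file missing or empty" >&2; exit 1; }

# Reconcile the data-row count against the trailer count before loading.
data_rows=$(grep -c '^D|' "$f")
trailer=$(tail -n 1 "$f" | cut -d'|' -f2)
[ "$data_rows" -eq "$trailer" ] || { echo "row-count mismatch" >&2; exit 1; }

# Archive with a timestamp so reruns never clobber a processed file.
stamp=$(date +%Y%m%d%H%M%S)
mv "$f" "$base/archive/orders.dat.$stamp"
ls "$base/archive"
```

Exiting nonzero on a missing file or count mismatch lets the scheduler (OpCon, Autosys, etc.) flag the job as failed instead of loading a partial file.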
ETL Lead / Designer
- ETL conversion from on-premises to the AWS cloud: rebuilt SAP IDOC interpreters and connectors on AWS to use Redshift as the DB. Built data movement ETLs to move data from Netezza to S3 and Redshift. As part of the AWS migration, recreated all Netezza-related source definitions, target definitions, and lookups to use the Redshift DB.
- Responsible for requirement gathering from the Confidential business and IT teams, analysis of the existing and new systems, estimation, design, project tracking, offshore coordination, handling deliverables, and code migration to QA, UAT, and Production.
- Defined and finalized the solution design for data loading, data reconciliation, error handling, dashboards, and reports.
- Converted business requirements into technical design documents; designed and developed ETL mappings according to business needs.
- Responsible for project deliverables during all phases of the migration activity and for reporting daily status and progress to Novo and Cognizant management.
- Good hands-on roles in areas such as technical ETL lead, design, ETL development, data validation, data reconciliation, data profiling, and reviews and quality checks for the deliverables.
- Responsible for identifying root causes and performing gap analysis on production issues and providing fixes. Defined the data profiling strategy to provide data quality feedback to the data source owners.
- Involved in the migration activity from Netezza to Redshift; handled the data fixes.
- Responsible for studying and analyzing the impact on the existing FG system and plugging in the required NFG components.
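The per-table hop in such a Netezza-to-Redshift migration generally takes three steps: unload from Netezza to a flat file, stage the file on S3, then COPY into Redshift. A sketch assembling those commands as strings only; the table, bucket, host, and IAM role names are illustrative, and nothing is executed against real systems:

```shell
#!/bin/sh
# Sketch of one table's Netezza -> S3 -> Redshift hop.
# Table, bucket, host, and role names are placeholders; commands are
# built as strings and printed, not executed.
TABLE=orders
BUCKET=s3://etl-landing/netezza-extracts

# 1. Unload from Netezza to a delimited flat file (external table export).
NZ_EXPORT="nzsql -h nz-host -d dwh -c \"CREATE EXTERNAL TABLE '/tmp/${TABLE}.dat' USING (DELIMITER '|') AS SELECT * FROM ${TABLE}\""

# 2. Stage the extract on S3.
S3_STAGE="aws s3 cp /tmp/${TABLE}.dat ${BUCKET}/${TABLE}/"

# 3. Bulk-load into Redshift with COPY (runs inside Redshift, e.g. via psql).
RS_COPY="COPY stage.${TABLE} FROM '${BUCKET}/${TABLE}/' IAM_ROLE 'arn:aws:iam::111122223333:role/rs-load' DELIMITER '|'"

printf '%s\n%s\n%s\n' "$NZ_EXPORT" "$S3_STAGE" "$RS_COPY"
```

Loading through S3 with COPY, rather than row-by-row inserts, is what makes the bulk migration practical at warehouse volumes.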