Sr. Informatica ETL Developer Resume
SUMMARY:
- 9 years of experience in data analysis, design, and development of various business applications.
- Experience in implementing Data Warehousing applications using Informatica PowerCenter.
- Expertise in relational databases (MS SQL Server, DB2, and Oracle).
- Well versed in UNIX shell scripting.
- Complete hands-on experience in all phases of the SDLC.
- Experience in performance tuning of mappings, sessions, workflows.
- Worked on Slowly Changing Dimensions (SCDs); implemented Type 1 and Type 2 to keep track of historical data.
- Extensive experience in extraction, transformation, and loading of data from heterogeneous source systems such as flat files, Excel, and relational databases.
- Worked extensively on complex mappings.
- Strong business knowledge in Compliance, Surveillance, the Financial domain, and Investment Banking.
- Experienced in working on a migration project.
- Extensive knowledge and experience in AGILE (Scrum) Methodology.
TECHNICAL SKILLS:
ETL Tool: Informatica 9.6, 9.1, 8.6, and 8.1 (PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Manager)
Languages: SQL and UNIX Shell Scripting
Databases: IBM DB2, MS SQL Server, Oracle
Other Tools: Autosys, Perforce, JIRA, BCP Utility, WinSCP, Deployment tool, Migration tool, ServiceNow, SharePoint, Toad Data Point, NPR, Tidal 6.2, One Identity TPAM, HP QC
Techniques: Data Warehousing, Business Intelligence and Agile Methodology
PROFESSIONAL EXPERIENCE:
Confidential
Sr. Informatica ETL Developer
Responsibilities:
- Worked closely with the BST team to plan and analyze data requirements and mapping documents.
- Created design documents listing the ETL process flow and flowcharts for each application code.
- Developed ETL mappings using Informatica PowerCenter and performed peer reviews for the team.
- Developed various Autoentry and Reconciliation tools using complex mapping transformations.
- Created complex views which were treated as source to the mapping and as Lookup table.
- Created Unit Test Cases to validate the code.
- Added jobs and created grouping for pre-migration and migration in Tidal job scheduler.
- Managed the Non-Production Request System (NPR) for deploying code, scripts, and jobs from DUT to SIT/UAT.
- Performed Unit Testing, Integration Testing and User Acceptance Testing.
- Worked on various Conversion Pilots (CP) and Migration Rehearsals (MR) before the actual Go Live.
- Maintained Activity Tracker which records the progress of all the tracks during migration cycles.
- Performed defect triage and reworked migration tools ahead of upcoming MRs.
- Participated in reporting Status and metrics to PMO on tasks, schedule and results.
Confidential
Sr. Informatica ETL Developer
Responsibilities:
- Designed and implemented the application for consuming source feed data from upstream systems.
- Designed data models and pipelines for different types of data sources.
- Applied strategic approaches to handle very large data sets.
- Implemented Slowly Changing Dimension (SCD) Type 2 logic using Informatica mappings. Worked on SCD, truncate, and incremental loads.
- Created complex ETL mappings with extensive use of Aggregator, Union, Filter, Router, Normalizer, Joiner, Sequence Generator, Stored Procedure, and Lookup transformations.
- Worked extensively to identify issues and improve performance across various aspects of the application.
- Packaged the components and deployed the solution to QA and UAT environments.
- Led discussions with the upstream and BA teams to streamline job scheduling and execution.
- Addressed production issues by performing root cause analysis, then implementing and deploying fixes within SLAs.
- Coordinated with the development and RTB teams for production deployments. Created CRs and incidents using ServiceNow.
- Performed post-deployment (QA/UAT/PROD) health checks and monitored for issues during initial job executions.
- Performed cross-region and peer reviews of new development and enhancements per business requirements.
- Worked closely with the client on audit queries.
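The SCD Type 2 load mentioned above can be sketched as a small UNIX shell/awk pre-load step that classifies incoming rows against the previous snapshot before the expire-and-insert pass. The pipe-delimited layout, the business-key position, and the `scd2_classify` helper name are illustrative assumptions, not artifacts of the actual project.

```shell
#!/bin/sh
# Sketch: flat-file change detection feeding an SCD Type 2 load.
# Compares yesterday's extract with today's on the business key
# (field 1) and classifies each incoming row, mirroring the
# expire-and-insert logic an Informatica mapping would apply.
scd2_classify() {
    prev="$1"; curr="$2"
    awk -F'|' '
        NR == FNR { old[$1] = $0; next }               # cache previous snapshot
        !($1 in old)  { print "INSERT|" $0; next }     # new business key
        old[$1] != $0 { print "UPDATE|" $0; next }     # changed -> expire + insert
        # unchanged rows require no action
    ' "$prev" "$curr"
}
```

In a real pipeline, the `UPDATE` rows would drive the Type 2 expiry (close the current record's effective-date range) before the new version is inserted.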
Confidential
Informatica ETL Developer
Responsibilities:
- Worked on the architecture to receive and process large volumes of data in the most efficient and optimized manner, considering dependencies on other data loads/feeds.
- Developed the system to consume data feeds received from front office, middle office, and back office upstream systems.
- Worked with Controllers/Users to analyze requirements and translate them into high-level technical designs.
- Performed data quality checks on data consumed and processed by the system.
- Built infrastructure capable of processing huge volumes of data from multiple sources.
- Created a new system/database (COG-US) into which millions of records are loaded and extracted every 15 minutes.
- Created a complex solution to handle extraction of data at run time while auditing every transaction.
- Interacted with business users to analyze production issues, performed root cause analysis, and provided permanent fixes.
- Assisted the QA/BA team with data testing.
- Performed performance tuning of ETL mappings, SQL, stored procedures, and scripts.
- Wrote UNIX shell scripts for file processing and archiving.
- Performed unit testing, integration testing, and deployment of code to upper environments.
- Worked with the Production Support team during code deployments.
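A minimal sketch of the file archiving step described above: processed source files are compressed into a timestamped archive directory after the load completes. The `.dat` extension, directory arguments, and the `archive_processed` helper name are illustrative assumptions.

```shell
#!/bin/sh
# Sketch: post-load archiving of processed source files.
# Each .dat file is gzip-compressed into the archive directory with a
# timestamp suffix, then removed from the landing area.
archive_processed() {
    src_dir="$1"; arch_dir="$2"
    mkdir -p "$arch_dir"
    stamp=$(date +%Y%m%d_%H%M%S)
    for f in "$src_dir"/*.dat; do
        [ -f "$f" ] || continue     # skip when no .dat files match the glob
        gzip -c "$f" > "$arch_dir/$(basename "$f").$stamp.gz" && rm -f "$f"
    done
}
```

A production version would typically also purge archives older than a retention window (e.g. with `find ... -mtime +N -delete`).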
Confidential
Informatica ETL Developer
Responsibilities:
- Contributed to data analysis and design according to the client's business requirements.
- Worked extensively on ETL processes from source to target systems using various tools and languages.
- Developed and implemented UNIX shell scripts for various requirements.
- Tuned and tested mappings using different logic to maximize efficiency.
- Performed unit testing at various levels of the ETL.
- Developed a dynamic mapping to handle a varying number of data elements from the source feed.
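The dynamic-mapping idea above, absorbing a varying number of data elements per record, can be sketched as a shell/awk normalization step: each incoming row is padded or truncated to a fixed column count so downstream logic sees a stable layout. The target width argument and the `normalize_feed` helper name are illustrative assumptions.

```shell
#!/bin/sh
# Sketch: normalize a pipe-delimited feed whose records carry a varying
# number of fields. Rows shorter than the target width are padded with
# empty fields; longer rows are truncated to the target width.
normalize_feed() {
    width="$1"
    awk -F'|' -v w="$width" '{
        out = $1
        for (i = 2; i <= w; i++) out = out "|" (i <= NF ? $i : "")
        print out
    }'
}
```

Usage: `normalize_feed 4 < feed.txt` turns a 2-field row `a|b` into `a|b||` and trims a 6-field row down to its first 4 fields.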