- Senior ETL/Informatica Developer with 8 years of experience processing (migrating, converting, replicating, and supporting) data from various sources into EDW, ODS, and data marts using Informatica Power Center 10.x/9.x/8.x/7.x and Informatica Developer 10.2, with experience in relational and dimensional data modeling (Star and Snowflake schemas).
- Strong exposure to databases such as Oracle, DB2, Hadoop, MS SQL Server, Teradata, and Microsoft Access; experienced in integrating data from flat files (fixed-width and delimited) and XML files.
- Good experience writing UNIX shell scripts to automate ETL processes.
- Used Informatica Data Quality (IDQ) to perform data cleansing, data validation, and other data quality tasks.
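The UNIX automation mentioned above typically wraps PowerCenter's `pmcmd` command-line utility. A minimal sketch, written as a shell function for illustration; the service, domain, folder, and workflow names are hypothetical placeholders, not from any actual project:

```shell
# Hedged sketch of an ETL automation wrapper: pre-session file check,
# workflow launch via pmcmd (the standard Power Center CLI), and a
# post-session archive step. All names below are illustrative.
run_etl() {
    src_file="$1"      # landed source file, e.g. /data/inbound/providers.dat
    archive_dir="$2"   # archive location, e.g. /data/archive

    # Pre-session check: fail fast if the source file has not landed.
    [ -f "$src_file" ] || { echo "ERROR: $src_file not found" >&2; return 1; }

    # Start the workflow (service/domain/folder names are placeholders);
    # guarded so the sketch still runs where pmcmd is not installed.
    if command -v pmcmd >/dev/null 2>&1; then
        pmcmd startworkflow -sv INFA_SERVICE -d INFA_DOMAIN \
              -u "$INFA_USER" -p "$INFA_PASS" \
              -f FOLDER_EDW -wait wf_LOAD_PROVIDERS || return 1
    fi

    # Post-session step: archive the processed file with a timestamp suffix.
    mkdir -p "$archive_dir"
    mv "$src_file" "$archive_dir/$(basename "$src_file").$(date +%Y%m%d%H%M%S)"
    echo "archived $(basename "$src_file")"
}
```

In practice a wrapper like this is what a scheduler (Autosys, Control-M) invokes, so the file check, workflow run, and archival succeed or fail as one job.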
ETL Tools: Informatica Power Center 10.1/9.6/9.5/9.1/9.0/8.6 (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Informatica Server), Informatica Developer 10.2, Informatica Data Quality 9.6, Informatica Power Exchange 9.5.0
Databases: Teradata 15/14, Oracle 12c/11g/10g, Hadoop (Hive), DB2 v8.1, MS SQL Server 2012/2008
Reporting Tools: Cognos 8, Tableau 2018
Languages: SQL, PL/SQL, T-SQL, HiveQL, Sybase SQL, UNIX shell scripting
Scheduling Tools: Autosys, Control-M
Database Utilities: SQL*Plus, SQL*Loader
Testing Tools: QTP, WinRunner, HP Quality Center
Operating Systems: Windows 10/8.x/7/XP/NT, UNIX, MS-DOS
Sr. ETL Developer
- Participated in daily business requirement discussions and documented the technical portions of the requirements
- Used CDMA to map source-to-target logic and generate ETL reports
- Participated in file-layout sessions for the New Directions (NDBH) provider source file to ensure it was received in the format required by the ETL process
- Used AQT and SQL Server Management Studio to analyze data in BlueKC systems
- Designed around 30 mappings in Informatica Power Center to transform source data to BlueKC standards according to business requirements
- Designed mappings in Power Center using transformations such as Lookup, Expression, Joiner, and Normalizer to load data into systems such as FACETS, Visual CACTUS, Sipra Care, and MA-ACA
- Profiled address-related source data in Informatica Data Quality (IDQ)
- Used the Address Validator transformation in IDQ to validate billing and service addresses by configuring the mailability score
- Created mapplets, sessions and workflows to run the jobs
- Wrote complex queries to transform source data and loaded it to SQL Server, DB2, and Sybase platforms
- Created batches of sessions for each system to load the data in series
- Created command tasks to start the desired workflow once the Batch ID session completed
- Used Control-M to automate Informatica jobs by creating dependencies as required
- Performed unit testing on data loaded to target systems and prepared test cases together with the testing team
Environment: Informatica Power Center 10.0, Informatica Data Quality 9.5, Control-M V9, SQL Server Management Studio 17.0, AQT V10, DB2, Sybase, SQL Server, CDMA
Sr. Data Integration/ETL Developer
- Analyzed business requirements and helped write the business requirements document
- Ingested data from around 160 legacy source systems and loaded it into the Hadoop staging environment
- Worked with SME (Subject Matter Expert) to understand the data in each source file
- Analyzed mainframe data by running data profiles in the Hive database
- Used Informatica Power Exchange to create copybooks for variable-block VSAM source files
- Extensively worked in Informatica BDM to build ETL jobs that extract and load data to the Hadoop layer using the Blaze engine
- Developed mappings, mapplets, reusable transformations, and workflows in Informatica Developer and Informatica Power Center using business transformation rules
- Used the Normalizer transformation to load data into a single master table in Hadoop
- Loaded Oracle reference tables from mainframe files using Power Center
- Created Control-M jobs for both Power Center and Developer workflows to automate the runs
- Optimized and performance-tuned long-running BDM mappings
- Unit tested data in Hadoop target tables using Hue Editor
Environment: Informatica Power Center 10.1, Informatica Developer (BDM) 10.2.1, Informatica Power Exchange 9.5.0, CDC, Mainframes, VSAM, IBM Data Studio 14, SQL Server Management Studio 17, Toad for Oracle 12.9, Putty, Jira, One Jira, Hue Editor, Informatica Administrator, Informatica Analyst, Control-M V9
Sr. Informatica Developer/Data Quality Architect
- Analyzed raw source data from enabler, transformed it, and loaded it into Teradata tables in eCDW, incorporating business rules using various objects and functions.
- Used the Address Validator transformation in IDQ to complete partial addresses and populate full addresses in the target table.
- Extracted data from mainframe COBOL sources, using copybooks with the COBOL file layout to import them into Designer
- Created Tables, views, synonyms, and test data in Oracle.
- Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the workflow manager.
- Configured static and dynamic memory caches for better throughput of sessions containing Rank, Lookup, Joiner, Sorter, and Aggregator transformations.
- Extensively worked on SQL tuning to increase Source Qualifier throughput by analyzing the queries.
Environment: Informatica Power Center 9.5.0, Informatica Power Exchange 9.5.0, CDC, Mainframes, VSAM, Informatica Data Quality 9.5, Oracle 11g, Teradata SQL Assistant, Cognos, Quality Center, HP PPM
Sr. Informatica Developer
- Coordinated with Verizon business users to understand business needs and translate them into functional specifications
- Extracted data from Verizon files and loaded it into systems such as Oracle and SQL Server using UNIX scripts
- Migrated data landed in Oracle and AS400 systems to Frontier system interface files using Informatica Power Center mappings
- Verified the data quality of landed files and performed data validation through Informatica Data Quality (IDQ) by integrating IDQ mappings and rules as mapplets within Power Center mappings.
- As part of the conversion, converted subscriber information, high-speed internet, and treatment files to Frontier systems using Informatica Power Center.
- Wrote complex SQL queries involving multi-table joins to check data consistency and update tables per business requirements
- Extensively used SQL tools like TOAD, Rapid SQL and Query Analyzer to run SQL queries to validate the data.
- Implemented restart strategies and error-handling techniques to recover failed sessions and reprocess errored/rejected data
- Used Unix Shell Scripts to automate pre-session and post-session processes
- Used Autosys scheduler to schedule & run the Informatica workflows on a daily/weekly/monthly basis.
Environment: Power Center 9.6/9.6.1, Oracle 11g, UNIX, AS400, PL/SQL, SQL Server, TOAD 12.8, Autosys, Putty.
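The error-handling and restart work described above usually starts with a pre-session file validation. A minimal sketch, assuming a delimited feed whose trailer record carries the expected detail count in the form `T|<count>` (that layout is a hypothetical example, not a documented Verizon format):

```shell
# Hedged sketch of a pre-session feed validation: compare the number of
# detail records in a pipe-delimited file against the count carried in
# its trailer record ("T|<count>"). The layout is an assumption.
validate_feed() {
    feed="$1"
    # Count every line except the trailer itself.
    detail=$(grep -cv '^T|' "$feed")
    # The trailer carries the expected count in its second field.
    expected=$(awk -F'|' '/^T\|/ {print $2}' "$feed")
    if [ "$detail" -eq "$expected" ]; then
        echo "OK: $detail records"
        return 0
    else
        echo "MISMATCH: file=$detail trailer=$expected" >&2
        return 1
    fi
}
```

A nonzero return here lets the scheduler hold the session and alert, rather than loading a short or truncated file and rejecting rows downstream.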
Sr. Informatica Developer
- Moved data from Confidential systems to Frontier systems (Oracle and DB2 databases) using SQL*Loader on UNIX, creating control files to load source files such as txt, csv, delimited, and Oracle dump files
- Responsible for the creation of several Informatica mappings, sessions and workflows to load data from Oracle to DB2 databases according to the business specification document
- Converted Confidential subscriber data such as customer personal information, high-speed internet plans, and service orders to load into Frontier system tables
- Extensively worked with XML files as source and target, using the XML Generator and XML Parser transformations in Informatica Power Center
- Implemented Informatica Data Quality solutions for data cleansing, matching, and reporting on fields such as phone numbers and addresses
- Worked with tools such as TOAD, SQL Developer, and SQL*Plus to connect to Oracle and DB2 databases, write queries, and analyze data
- Worked with business users, the testing team, and application development teams to analyze, resolve, and document defects using HP ALM
Environment: Power Center 9.5/9.5.1, UNIX, Oracle 11g, DB2, Flat files, SQL Developer, WinSCP, Putty
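The SQL*Loader control files mentioned above follow a standard pattern for delimited feeds. A sketch of one such control file; the table and column names are hypothetical, not from the actual conversion:

```
-- Hypothetical SQL*Loader control file for a delimited subscriber feed.
-- Table and column names are illustrative only.
LOAD DATA
INFILE 'subscribers.txt'
BADFILE 'subscribers.bad'
APPEND
INTO TABLE STG_SUBSCRIBER
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( SUBSCRIBER_ID,
  CUST_NAME,
  HSI_PLAN_CODE,
  SERVICE_ORDER_DT DATE "YYYY-MM-DD"
)
```

The BADFILE clause captures rejected rows for the error-handling pass, and TRAILING NULLCOLS tolerates short records instead of rejecting them outright.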