- Over 11 years of IT experience in ETL architecture, analysis, design, development, testing, implementation, maintenance, and support of enterprise-level Data Integration, Data Warehouse (EDW), and Business Intelligence (BI) solutions using Operational Data Store (ODS), Data Warehouse (DW)/Data Mart (DM), ETL, OLAP, ROLAP, and client/server and web applications on Windows and UNIX platforms
- Worked on projects for insurance clients such as The Hartford and Blue Shield of CA; life sciences clients such as Pfizer Inc., Amgen Inc., and AstraZeneca (with IMS data); and retail clients such as Ahold Inc., 7-Eleven, and Staples Inc.
- Over 10 years of relational and dimensional data modeling using Star and Snowflake schemas, normalization, denormalization, and aggregation. Designed databases using Erwin 4.5
- Over 9 years of experience using Informatica PowerCenter 6.x/7.x/8.1.1/8.5.1/8.6.1 (Data Stencil, Designer, Repository Manager, Server Manager, Workflow Manager, Workflow Monitor), including administration and performance tuning of Informatica transformations, mappings, sessions, and workflows
- Over 2 years of experience generating complex reports, dashboards, and scorecards using Business Objects 6.x/XI/XIR2 Designer, Reports, InfoView, Supervisor, and Web Intelligence (WebI), including security administration
- Over 11 years of experience using SQL, PL/SQL, dynamic SQL, stored procedures/functions, triggers and packages, complex joins, correlated sub-queries, aggregate and analytic functions, materialized views, indexing, and partitioning, and performance tuning the same using EXPLAIN PLAN and TKPROF analysis
- Over 11 years of experience working on large-scale data warehouses using databases and sources such as Oracle 11g/10g/9i, MS Access 2000/2002, XML, IBM UDB DB2 8.2, SQL Server 2000, SalesForce.com (SFDC), MS Excel, and flat files
- Good understanding of ETL/Informatica standards and best practices, conformed dimensions, and Slowly Changing Dimensions (SCD Type 1, 2, and 3)
- Experience in writing UNIX Korn shell scripts
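As a minimal sketch of the kind of Korn shell wrapper this SQL and shell scripting experience involves, the snippet below launches a SQL script and fails fast if the log shows an Oracle error. The script name, log naming convention, and the commented-out sqlplus call are illustrative assumptions, not taken from an actual engagement.

```shell
# Hypothetical Korn shell wrapper for running a SQL script with basic
# error checking. The sqlplus invocation is shown as a comment; here a
# simulated log line stands in so the sketch is self-contained.
run_sql() {
    script=$1
    log="${script%.sql}.log"
    # Real wrapper would run: sqlplus -s "$DB_USER/$DB_PASS" @"$script" > "$log"
    echo "Executing $script" > "$log"     # simulated run
    if grep -q "ORA-" "$log"; then        # any Oracle error code in the log?
        echo "FAILED: $script"
        return 1
    fi
    echo "OK: $script"
    return 0
}

run_sql load_stage.sql
```

The same pattern extends naturally to chaining multiple scripts and aborting a batch on the first failure.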
Technical Skills: Informatica PowerCenter, ETL, Business Intelligence, Data Analysis, Data Migration, Data Warehousing, Teradata, Teradata SQL Assistant, Oracle, DB2, Netezza, SQL, PL/SQL, Big Data (Hadoop HDFS, MapReduce, Pig, Hive)
Programming Languages: T-SQL, C, Core Java, J2EE, JSP, VBScript, Perl, Visual Studio 2005/2008/R2, XML, HTML, UNIX Shell Scripting, JDBC
Databases: MS SQL Server, MySQL, DB2, Oracle
Applications: Microsoft Office Suite, Acrobat Reader/Writer.
Senior ETL Informatica Developer / Technical Project Lead
- Involved in the analysis of source-to-target mappings provided by data analysts and prepared functional and technical design documents
- Extracted data from Excel files and high-volume data sets in flat files, Oracle, DB2, and SalesForce.com (SFDC) using Informatica ETL mappings and SQL/PLSQL scripts, and loaded them into the data store area
- Created complex Informatica mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, dynamic Lookup, and Router transformations to extract, transform, and load data into the data mart area
- Created reusable transformations/mapplets and used them across various mappings
- Wrote complex PL/SQL scripts, functions, procedures, and packages
- Developed Informatica workflows/worklets/sessions associated with the mappings using Workflow Manager
- Created Tidal jobs to schedule Informatica Workflows
- Executed workflows using Tidal scheduler
- Wrote Unit Test cases and executed unit test scripts successfully
- Involved in performance tuning of Informatica code using standard Informatica tuning steps
- Involved in performance tuning of SQL/PLSQL scripts based on EXPLAIN PLAN output
- Supported during QA/UAT/PROD deployments and bug fixes
- Wrote UNIX Korn shell scripts for file transfers, archiving, and email notifications
- Involved in code reviews per ETL/Informatica standards and best practices
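The archiving step mentioned above can be sketched as a small shell function: move a processed flat file into a dated archive directory and compress it. Directory layout, file names, and the commented-out mailx notification are assumptions for illustration, not details from the actual project.

```shell
# Illustrative sketch (not production code) of a post-load archive step:
# the processed file is moved under archive/YYYYMMDD and gzipped.
archive_file() {
    src=$1
    arch_dir="archive/$(date +%Y%m%d)"
    mkdir -p "$arch_dir"
    mv "$src" "$arch_dir/" && gzip "$arch_dir/$(basename "$src")"
    # A real script would then notify support, e.g.:
    #   mailx -s "Archived $src" etl-support@example.com < /dev/null
    echo "$arch_dir/$(basename "$src").gz"   # print final archive path
}

echo "sample data" > daily_feed.dat
archive_file daily_feed.dat
```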
ETL Tech Lead & Architect
- As an ETL Tech Lead, led the ETL development for enhancements to the insurance data warehouse.
- Worked closely with business users to understand requirements and convert them into project-level technical capabilities.
- Worked with business analysts to identify the appropriate data elements for required capabilities.
- Updated status and planned releases through scrum meetings.
- Coordinated with the offshore team and provided inputs.
- Worked with source teams to identify upstream source changes.
- The project involved developing mappings to move data from AS/400 and flat files to the Staging Area (STG), then to the Data Warehouse (DWH), and then to the Data Mart.
- Developed ETL detailed design documents for each target table (fact and dimension tables).
- Created primary objects (tables, views, indexes) required for the application.
- Designed ETL jobs, using Informatica as the ETL tool.
- Designed and developed complex mappings for varied transformation logic using Expression, Filter, Aggregator, Router, Joiner, Update Strategy, and Unconnected and Connected Lookup transformations.
- Used Informatica Debugger to troubleshoot logical errors and runtime errors.
- Designed and developed common modules for error checking (e.g., verifying that the reject-records output file is empty and that a given table contains no duplicate natural keys).
- Performed tuning at the source, target, and Informatica mapping levels using indexes, hints, and partitioning in DB2, SQL Server, and Informatica.
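The two error checks described above (empty reject file, no duplicate natural keys) can be sketched in shell. The file names, pipe delimiter, and key column position are assumptions for the example, not taken from the actual modules.

```shell
# Minimal sketch of two reusable error checks, assuming pipe-delimited
# extracts with the natural key in column 1.
check_rejects() {
    [ ! -s "$1" ]    # succeeds only when the reject file is empty
}

dup_keys() {
    # Print any natural key that appears more than once.
    cut -d'|' -f1 "$1" | sort | uniq -d
}

printf 'K1|a\nK2|b\nK1|c\n' > extract.dat
: > rejects.dat                       # empty reject file
check_rejects rejects.dat && echo "no rejects"
dup_keys extract.dat                  # flags K1 as a duplicate key
```

In a real load, a non-empty output from either check would fail the batch before the data mart load runs.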
ETL Tech Lead & Architect
- Analyzed the specifications provided by the clients.
- Prepared the HLD and project plan based on the business functional spec.
- Reviewed ETL detailed design documents for mappings and DDL specification documents covering table creation, key/constraint definitions, and data types.
- Analyzed the data model to verify table constraints and columns in the Data Mart.
- Extracted data from sources such as Oracle, mainframe DB2, and flat files using PowerCenter Designer and PowerExchange, transformed it per the business logic, and loaded it into the target warehouse.
- Designed mappings per the business requirements using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, Sequence Generator, Router, Union, and Update Strategy.
- Coordinated system/integration/UAT testing with the other teams involved in the project and reviewed the test strategy.
- Built complex data transformations, with more than 50 transformations in some mappings.
- Fine-tuned mappings and sessions to improve performance.
- Coordinated with business and development teams to close UAT defects and created ad hoc reports for end users.
- Helped the team fix technical issues and tuned database queries for better performance.
- Kept track of the CI list, maintained versions, and managed change requests.
ETL Informatica Developer
- Analyzed existing information sources and methods, understood customer expectations, and identified problem areas
- Extensively used DataStage Designer, DataStage Director, DataStage Administrator, and QualityStage.
- Involved in the analysis, planning, design, development, and implementation stages of projects using IBM WebSphere software (QualityStage, IBM Information Analyzer).
- Designed, coded, and tested the Information Server components
- Designed and developed DataStage jobs using parallel processing techniques, implementing pipeline and partition parallelism on an MPP system.
- Used DataStage Enterprise Edition/Parallel Extender stages, namely Dataset, Sort, Lookup, Change Capture, Funnel, and Row Generator, in the ETL coding.
- Developed user-defined routines and transformations using Ascential DataStage Manager
- Tuned jobs for higher performance by dumping the lookup data into hash files.
- Coordinated with Database Admin to create appropriate indexes for faster execution of jobs.
- Used Administrator to administer the locks on the jobs and other Administration activities for DataStage Server.
- Developed UNIX shell scripts to automate the Data Load processes to the target
- Developed jobs to load data in slowly changing dimensions.
- Involved in performance tuning of jobs.
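The UNIX shell automation of data loads mentioned above can be sketched as a simple driver that processes staged files in order and stops on the first failure. The function names and file arguments are hypothetical, and the actual DataStage invocation (`dsjob -run`) is shown only as a comment, since the real project/job names are not known here.

```shell
# Hypothetical shell driver for automated load runs: each staged file is
# handed to a load step; a failure aborts the remaining loads.
run_load() {
    # Stand-in for: dsjob -run -jobstatus "$PROJECT" "load_job.$1"
    echo "loading $1"
}

load_all() {
    for f in "$@"; do
        run_load "$f" || { echo "load failed for $f"; return 1; }
    done
    echo "all loads complete"
}

load_all stage1.dat stage2.dat
```

The stop-on-first-failure pattern keeps a bad file from cascading into downstream dimension and fact loads.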
ETL Informatica Developer
- Involved in all phases of the SDLC: requirements gathering, design, development, unit testing, UAT, production roll-out, enhancements, and production support.
- Interacted with business representatives to understand requirements and determine the best approach for timely delivery of information; wrote the Software Requirement Specification for the business requirements.
- Ensuring timely deliveries of work items to the Client.
- Involved in Implementing ETL standards and Best practices.
- Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into data warehouse database.
- Obtained detailed understanding of data sources, flat files and complex data schemas
- Developed jobs using different stage types: Sequential File, Transformer, Aggregator, Merge, IPC, Link Partitioner, Link Collector, and Hashed File.
- Extensively worked on error handling and data cleansing, creating hash files and performing lookups for faster data access.
- Used DataStage Manager to import metadata from the repository, create new job categories, and create new data elements. Created reusable repository objects and routines using DataStage Manager.
- Performance Tuning of Jobs, Stages, Sources and Targets.
- Used DataStage Administrator to setup DataStage projects and defined DataStage user profiles and assigned privileges.
- Extensively used DataStage Director for job scheduling and for emailing production support with troubleshooting details from log files.
- Developed Routines using DataStage BASIC programming language.
- Migrated jobs from development to QA to Production environments.