- To pursue a challenging career that provides ample opportunities to learn and grow, while allowing me to fully utilize my capabilities, dedicate myself to the organization, and add value to it through sincere effort.
- 7+ years of extensive experience in end-to-end implementation of Data Warehouse/Business Intelligence (BI) projects using the ETL tools Informatica Power Center 9.5/9.1 and SSIS in the Banking, Insurance & Retail domains.
- Excellent knowledge of the data warehouse development life cycle (SDLC).
- Experienced in designing and creating logical & physical models of Data Marts/Data Warehouses using multi-dimensional data models (star, snowflake, and fact constellation schemas) and relational databases, with the Erwin/Toad data modeling tools.
- Worked on relational database table design techniques: normalization, de-normalization, and creation of unique, primary, and foreign keys, partitions, and clustered and non-clustered indexes.
- Extensive experience in data warehousing techniques for Data cleansing, Slowly Changing Dimension (SCD) Type I, II & III and Change Data Capture (CDC).
- Experience with T-SQL/PL-SQL on SQL Server/Oracle, creating tables, views, scripts, indexes, triggers, complex stored procedures, user-defined functions, and relational database models, and enforcing data integrity in line with business rules.
- Experience in database activities such as maintenance, performance monitoring and tuning, and troubleshooting using tools like Index Tuning Wizard and SQL Profiler.
- Expert in designing and developing ETL pipelines for integrating data from heterogeneous sources (Excel, XML, CSV, IBM MQ, Relational databases, flat files) by using multiple transformations provided in Informatica Power Center and SSIS.
- Experience in writing Python and C# code to extend the functionalities of ETL Tools.
- Experience in designing Error handling and Audit framework for ETL Loads.
- Hands-on experience in all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
- Expert in generating Tabular, Matrix, Drill-down, Parameterized, Ad-hoc, and styled reports using SQL Server Reporting Services (SSRS).
- Experience in designing re-usable components that can be used across multiple projects.
- Experience in Informatica Performance tuning for mappings and sessions to overcome performance bottlenecks.
- Experience in designing visualizations and dashboards using Tableau and Qlik Sense.
- Extensive experience in writing UNIX shell scripts and Python scripts for automation of the ETL processes.
- Experience in 24X7 production support, ETL executions, and root causes analysis.
- Trained in Big Data technologies such as Hadoop, Apache Spark, Pig, Hive, and AWS.
- Confident in working independently and proactively. Strong leadership, good communication and excellent interpersonal skills, problem-solving and multi-tasking abilities.
- Demonstrated experience seamlessly learning new technologies in a short span of time, integrating well into any environment, and working well under pressure.
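The SCD Type 2 pattern noted above can be sketched in a few lines of Python. This is an illustrative outline only, not code from any listed project; the business key, attributes, and column names are hypothetical:

```python
from datetime import date

# Minimal SCD Type 2 sketch: compare incoming source rows with the current
# dimension rows, expire changed versions, and open new current versions.
HIGH_DATE = date(9999, 12, 31)  # conventional "open" end date

def scd2_merge(dim_rows, src_rows, key, attrs, load_date):
    """Return (expired, inserts): rows to close out and new versions to add."""
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    expired, inserts = [], []
    for src in src_rows:
        cur = current.get(src[key])
        if cur is None or any(cur[a] != src[a] for a in attrs):
            if cur is not None:
                # close out the previous version of this business key
                expired.append({**cur, "end_date": load_date, "is_current": False})
            # open a new current version
            inserts.append({key: src[key], **{a: src[a] for a in attrs},
                            "start_date": load_date, "end_date": HIGH_DATE,
                            "is_current": True})
    return expired, inserts
```

In an ETL tool the same comparison is expressed with lookup and update-strategy transformations; the sketch only shows the change-detection logic itself.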
Languages: SQL, PL/SQL, Unix Shell Scripting, Core Java, C#, and Python.
ETL Tools: Informatica 9.x, SQL Server Integration Services (SSIS) and Hummingbird Genio.
Reporting and Profiling Tools: SSRS, Tableau and Qlik Sense.
Operating Systems: Windows 7, Windows 10 and Red Hat Linux
Databases: SQL Server 2012/2008R2, Oracle 11g, DB2, HBase, Cassandra.
Database Tools: TOAD, SQL Server Management Studio (SSMS) and SQL Developer
Other Tools: XML Spy, MS Office, B2B IBM Sterling, Jenkins, Subversion, TFS, and SSDT.
- Participated in technical solution design sessions with the team on assigned projects.
- Worked with the Project Manager and Business Analyst to understand business requirements, and was involved in planning, defining, and designing solutions based on those requirements.
- Held planning sessions with the Business/Product owners and the development team to provide estimates, road maps, and implementation steps for the project.
- Worked on data modeling, creating dimension and fact data structures using star and snowflake schemas for the Data Mart/Data Warehouse with Toad Data Modeler.
- Conducted data profiling and analyzed the results (e.g. analyzed data quality, identified data gaps) for the source data.
- Performed Logical and Physical Data Modeling and delivering Normalized, De-Normalized and Dimensional schemas.
- Developed technical design documents (both low-level and high-level), mapping specifications (source-to-target mapping docs), and unit testing documents.
- Involved in building the ETL architecture and source-to-target mappings (STTM) to load data into the Data Warehouse.
- Designed and developed automation for creating flat file source definition in Informatica for very wide files to save development efforts required.
- Extensively used Informatica Power Center and SSIS to extract the data from different sources like Flat files, XML files, IBM MQ Sources and MS SQL Server and Transform them as per business logic and load into Oracle and SQL Server.
- Used various transformations like Source Qualifier, Joiner, Lookup, Router, Filter, Expression and Update Strategy etc. in Informatica Power Center.
- Used Aggregate, Conditional Split, Union All, Lookup, Derived Column, Data Conversion, Multicasting and Merge Join Data Flow transformations in SSIS.
- Created STG to ODS and ODS to DW mappings using Informatica 9.5 and SSIS.
- Implemented CDC and SCD (Type 1, Type 2) mappings.
- Developed the ETL deployment plan and ETL operations manual to coordinate and drive ETL activities in the Test and Production environments.
- Generated sub-reports and real-time, drill-down, drill-through, summary, and parameterized reports using SSRS based on user requests.
- Implemented changes and enhancements to existing SSRS reports.
- Created a POC for automating object migration to other environments via the command prompt.
- Performed peer reviews of ETL code.
- Designed Error Handling and Audit Framework for ETL pipelines.
- Worked on the Data Quality certification queries for data marts.
- Successfully delivered migration of ETL code from Legacy Hummingbird Genio to Informatica 9.5.
- Implemented CRs/enhancements to the existing DW/ETL code.
- Analyzed and resolved defects in Production.
- Wrote Python programs for automation and simplification of ETL tasks.
Environment: Informatica 9.5, TFS, Jenkins, Tortoise SVN, MS SQL Server 2008R2/2012, SSIS, SSRS, Qlik Sense, Oracle 11g, SSMS 2012, Toad, Toad Data Modeler, CA IDMS, WebFOCUS, UNIX, and MS Office.
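The error-handling and audit framework mentioned above can be sketched as a Python wrapper around load steps. This is a minimal illustration, not the actual framework; the step name and audit-record layout are assumptions:

```python
import functools
import traceback
from datetime import datetime

audit_log = []  # stand-in for an audit table in the warehouse

def audited(step_name):
    """Record start/end time, row count, status, and error detail per load step."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            rec = {"step": step_name, "start": datetime.now(),
                   "status": "RUNNING", "rows": 0, "error": None}
            audit_log.append(rec)
            try:
                rows = fn(*args, **kwargs)
                rec.update(status="SUCCESS", rows=rows)
                return rows
            except Exception:
                # capture the failure detail, then re-raise so the job fails loudly
                rec.update(status="FAILED", error=traceback.format_exc())
                raise
            finally:
                rec["end"] = datetime.now()
        return wrapper
    return decorator

@audited("load_customers")
def load_customers(batch):
    # hypothetical load step: returns the number of rows processed
    return len(batch)
```

The same idea applies whether the steps are Python tasks, Informatica workflow invocations, or SSIS package runs: every step leaves one audit row with its outcome.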
- Participated in validation and review of data models.
- Created the conceptual, logical & physical data models of the data warehouse using snowflake schema and conducted architectural design overview sessions with business and development team.
- Designed and maintained data warehousing artifacts and data dictionary.
- Created Complex Stored Procedures, Triggers, Functions (UDF), Indexes, Views, Scripts and other T-SQL code for the SQL applications following SQL code standards.
- Wrote various complex SQL queries for data profiling, data validation, and data quality checks critical for the data marts.
- Designed and developed Informatica ETL mappings to extract master and transactional data from heterogeneous data feeds and load into relational tables.
- Optimized SQL queries and long-running Stored Procedures for improved performance and availability, and for effective data retrieval.
- Monitored strategies, processes, and procedures to ensure data integrity, optimized and reduced data redundancy, and maintained the required level of security for all production and test databases.
- Performed optimization, bug fixing, and enhancement during the various quality assurance testing cycles and for the production dashboard reports; coordinated with the testing team to fix defects raised during testing.
- Responsible for Coding, testing, performance tuning in developing Informatica mappings and workflows.
- Provided accurate analysis in a timely fashion for any problem/ issues.
- Migrated SSIS packages to Informatica ETL code.
- Built and implemented drill-through, drill-down, parameterized, and real-time BI reports using SSRS.
Environment: Informatica 9.5, Informatica PowerExchange, TFS, MS SQL Server 2008R2, SSIS, SSRS, Tableau 8.1, DB2, UNIX, and MS Office.
- Involved in design, development, and maintenance of star schema for Data warehouse project.
- Participated in the creation of the logical & physical data models of the data warehouse as per the business requirements using Erwin.
- Involved in Business Users/Architect Meetings to understand their requirements.
- Designed, Developed and Supported Extraction, Transformation and Load Process (ETL) for data migration with Informatica 9.5.
- Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
- Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
- Worked extensively with the connected lookup Transformations using the dynamic cache.
- Worked with complex mappings having an average of 15 transformations.
- Created and scheduled jobs on demand and on time-based triggers.
- Monitored Workflows and Sessions using Workflow Monitor.
- Performed Unit testing, Integration testing and System testing of Informatica ETL code/Pipelines.
- Designed and tested PL/SQL and T-SQL scripts.
- Designed UNIX scripts for ETL Automation.
- Created and maintained runbook for production support team.
Environment: Informatica 9.5, TFS, Jenkins, Tortoise SVN, MS SQL Server 2008R2, SSIS, Tableau 8.1, Oracle 11g, UNIX, and MS Office.
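The dynamic lookup cache behavior described above (the first occurrence of a key in a run inserts; later occurrences update the cached row) can be sketched in plain Python. A minimal illustration with hypothetical row and key names, not project code:

```python
def upsert_with_cache(rows, key):
    """Mimic a connected Lookup with a dynamic cache: dedupe/upsert within
    one load run. Returns (cache, inserted_keys, updated_keys)."""
    cache, inserts, updates = {}, [], []
    for row in rows:
        k = row[key]
        if k in cache:
            # key already seen this run: treat as an update to the cached row
            cache[k].update(row)
            updates.append(k)
        else:
            # first occurrence this run: treat as an insert
            cache[k] = dict(row)
            inserts.append(k)
    return cache, inserts, updates
```

This is why the dynamic cache matters when the same business key can arrive more than once in a single source extract: without it, duplicate inserts would reach the target.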
- Responsible for converting business requirements into functional and technical documents.
- Involved in preparation of High-Level Design Document (HLD), Low-Level Design Documents (LLD).
- Maintained LDM and PDM documents for every sprint in a project.
- Worked with the end users to analyze data flow process and understand the requirement for the reporting.
- Applied Innovative/Problem Solving techniques in supporting this project.
- Extensively used XML parser transformation for parsing the XML data and load to a staging area in Informatica Designer for Guidewire Payloads.
- Involved in a POC for pushdown optimization (PDO) to analyze performance improvement on existing ETL code.
- Wrote stored procedures to clean up partial loads in case of failure, providing restartability.
- Rewrote existing code, queries, and stored procedures for long-running queries by replacing cursors with various joins, common table expressions (CTEs), and indexing, applying MS best practices for query tuning.
- Coding, testing, performance tuning of ETL Informatica mappings and workflows.
- Involved in Unit testing and SIT.
- Created monthly and quarterly sales dashboards using Tableau.
- Interacted with testing team to fix the defects raised during the testing period.
- Performed peer reviews of mappings.
- Responsible for creating data mapping and data dictionary and ETL documentation.
Environment: Informatica 9.1, TFS, Erwin, Jenkins, Tortoise SVN, Altova XML Spy, MS SQL Server 2008R2, Tableau 8.1, Oracle, UNIX, and MS Office.
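The restartability approach above (clean up a partial load before re-running) can be illustrated with a small Python/sqlite3 sketch. The staging table, columns, and batch-id convention are hypothetical; the real implementation was in stored procedures:

```python
import sqlite3

def restartable_load(conn, batch_id, rows):
    """Delete any partially loaded rows for this batch, then reload them,
    so a failed run can simply be re-executed (idempotent per batch)."""
    cur = conn.cursor()
    # idempotent cleanup: remove leftovers from a failed earlier attempt
    cur.execute("DELETE FROM stg_orders WHERE batch_id = ?", (batch_id,))
    cur.executemany(
        "INSERT INTO stg_orders (batch_id, order_id, amount) VALUES (?, ?, ?)",
        [(batch_id, oid, amt) for oid, amt in rows],
    )
    conn.commit()
    return cur.rowcount  # rows inserted in this run

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (batch_id INT, order_id INT, amount REAL)")
```

Because the cleanup keys on the batch id, re-running a failed batch never double-loads rows, which is the whole point of the pattern.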
- Involved in Planning, Defining and Designing database based on business requirements and provided documentation.
- Analyzed the sources, targets, transformed the data, mapped the data and loaded the data into the data warehouse.
- Extensively used ETL/Informatica Power Center for loading the data from various sources involving flat files and Oracle tables to Data warehouse.
- Modified existing mappings for enhancements of new business requirements.
- Worked on different tasks in Workflow Manager like Sessions, Event-Raise, Event-Wait, Decision, E-mail, Command, Worklets, Assignment, and Timer, and on scheduling of the workflows.
- Involved in warranty production support.
- Responsible for optimizing all indexes, SQL queries, stored procedures to improve the performance of reports and ETL.
- Wrote T-SQL queries and stored procedures and used them to build data pipelines.
- Created/Updated database objects like tables, views, stored procedures, functions.
- Handled slowly changing dimensions to maintain the history of the data.
- Created Shell Scripts for generic use.
Environment: Informatica 9.1, MS SQL Server 2008, MS SQL Server Integration Services (SSIS), MS SQL Server Reporting Services (SSRS), Oracle, UNIX, and MS Office.
MS SQL Developer
- Responsible for creating database objects like Tables, Stored Procedures, Triggers, Functions etc. using T-SQL to provide structure to store data and to maintain database efficiently.
- Implemented the error handling in the stored procedures and SQL objects and modified already existing stored procedures, triggers, view, indexes depending on the requirement.
- Responsible for optimizing all indexes, SQL queries, stored procedures to improve performance.
- Reverse-engineered the backend database to change T-SQL scripts and create views, indexes, stored procedures, triggers, CTEs, constraints, and UDFs, drastically improving performance.
- Used DDL and DML triggers and stored procedures to check data integrity and perform verification at early stages, before downstream processing.
- Developed report using SSRS based on end user’s requirements.
- Worked on batch scripts.
- Developed and maintained ETL and data mapping documentation.
Environment: Informatica 9.1, MS SQL Server 2008, MS SQL Server Integration Services (SSIS), MS SQL Server Reporting Services (SSRS), SQL, and Excel.
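The stored-procedure error-handling pattern above (validate, then commit or roll back as a unit) can be mirrored in a small Python/sqlite3 sketch, analogous to TRY...CATCH with ROLLBACK in T-SQL. The accounts table and the no-negative-balance rule are hypothetical:

```python
import sqlite3

def transfer(conn, from_acct, to_acct, amount):
    """All-or-nothing two-leg update: roll back both legs if any check fails."""
    try:
        with conn:  # commits on success, rolls back on exception
            conn.execute("UPDATE accounts SET bal = bal - ? WHERE id = ?",
                         (amount, from_acct))
            conn.execute("UPDATE accounts SET bal = bal + ? WHERE id = ?",
                         (amount, to_acct))
            # simple integrity rule enforced before commit: no negative balances
            if conn.execute(
                    "SELECT COUNT(*) FROM accounts WHERE bal < 0").fetchone()[0]:
                raise ValueError("insufficient funds")
        return True
    except ValueError:
        return False

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INT PRIMARY KEY, bal REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100.0), (2, 0.0)])
conn.commit()
```

The key design point is that the validation runs inside the transaction, so a failed check leaves the data exactly as it was before the call.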
- Created the Software Requirement Specification (SRS) based on formal meetings/discussions with department heads, supporting documents, etc.
- Creating Software Use Cases based on SRS.
- Designed the user interface based on paper documents/templates provided by the end user.
- Coded form design and form validations throughout the application.
- Implemented database connectivity for the user interface.
- Performed unit testing as per the SRS, in addition to writing user test cases.
Environment: Java, J2ME, MS-Access, MS SQL, Java Scripts.