- 11+ years of IT experience in Software Analysis, Data Warehouse and ETL Development, Testing, Enhancements, Production Support, Reporting, and Implementation of business applications across domains such as Railways, Media, Education, Banking, Energy, Retail, and Utility services.
- Involved in all phases of the complete life cycle of large Data Warehouse and decision-support systems, handling tasks such as Logical and Physical Data Modeling, designing ETL strategies, Development, Testing, Validation using use cases and scripts, Implementation, scheduling ETL jobs, and Production Support.
- Sound knowledge of Big Data, OLAP, and Multi-Dimensional Data Modeling: designing star/snowflake schemas, Data Warehouses, Data Marts, Data Quality Analysis, Data Integration, Slowly Changing Dimensions (SCD), Change Data Capture (CDC), and creating job flow diagrams.
- Extensive experience in Informatica 7.x through 9.6 with PowerCenter, PowerExchange, Informatica IDQ, and Data Transformation Studio, plus reporting tools such as Business Objects 4.1 and Tableau 10.x.
- Hands-on experience in IDQ developing profiles and performing Data Analysis, Data Cleansing, Data Matching, and error handling. Able to configure AddressDoctor and Trillium for address cleansing and standardization.
- Expertise in Big Data integration across cloud environments such as AWS/GCP/Salesforce, between databases such as Oracle, Teradata, PostgreSQL, SQL Server, and DB2, and with file formats such as XML and JSON.
- Extensively worked on Unstructured and B2B transformations using Data Transformation Studio, handling file formats such as XML, JSON, and CSV. Created many Worklets, Mapplets, Reusable Transformations, and partitions; implemented pushdown optimization logic; and made extensive use of variables and parameter files.
- Hands-on experience writing, testing, and implementing Oracle/PostgreSQL PL/SQL programs, T-SQL, Stored Procedures, Functions, and Views, and performance tuning SQL queries.
- Good working experience in PL/SQL programming (creating packages, procedures, and functions) and UNIX shell scripting. Created scripts for calling the pmcmd command, file watchers, FTP, and file archiving.
- Good experience in a Production Support role, managing ETL projects independently and coordinating with offshore teams. Handled problem tickets on 24/7 on-call support and maintained SLAs by fixing issues on time based on priority.
- Created various reports using Business Objects 4.1 and Tableau 10.x, and worked with scheduling tools such as Tivoli and Control-M.
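The file-watcher and pmcmd-wrapper scripts mentioned above can be sketched in Python roughly as follows (the feed path, domain/service names, and workflow name are hypothetical placeholders, not values from any actual project):

```python
import os
import subprocess
import time

def wait_for_file(path, timeout_secs=3600, poll_secs=30):
    """File watcher: poll until the feed file lands or the timeout expires."""
    deadline = time.time() + timeout_secs
    while time.time() < deadline:
        if os.path.exists(path):
            return True
        time.sleep(poll_secs)
    return False

def start_workflow(folder, workflow):
    """Kick off an Informatica workflow via pmcmd.

    Integration service and domain names below are illustrative only.
    """
    cmd = ["pmcmd", "startworkflow",
           "-sv", "IntSvc", "-d", "Domain_DW",
           "-f", folder, "-wait", workflow]
    return subprocess.run(cmd, check=True)

# Typical use (hypothetical feed file and workflow name):
#   if wait_for_file("/data/inbound/subs_feed.csv", timeout_secs=7200):
#       start_workflow("SUBSCRIPTIONS", "wf_load_subs_daily")
```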
Data Warehousing & BI: Informatica PowerCenter 9.x/8.x/7.x, PowerExchange, IDQ
Reporting Tools: Business Objects 4.1, Tableau 10.x, Crystal Reports 9/X
Databases: Oracle 12c/11g, Teradata 14, PostgreSQL 10, SQL Server, DB2
Programming: PL/SQL, T-SQL, ANSI SQL, Python scripting, UNIX Shell Scripting
OS / Environment: UNIX, AIX, Windows, AWS, GCP, Salesforce
Other tools: Toad, SQL Assistant, PL/SQL Developer, SQL*Plus, Aqua Data Studio, Jira, Confluence, Remedy, FileZilla, PuTTY, Tidal, Apex Explorer, Apex Data Loader
Senior DW ETL/BI Developer
Confidential, New York
Responsibilities:
- Involved in designing and implementing multiple subscription-related ETL projects and creating various Business Objects/Tableau reports.
- Responsible for technical architecture, creation of technical specs, and design of ETL processes, including Informatica mappings and the source, target, and staging databases.
- Maintained standards to ensure database scripts and components met business/technical requirements and performance goals. Created database objects, Informatica workflows, and reports using best practices.
- The DW has 25+ dimension tables, with Snapshot/Milestone as the main transactional tables. The data mart uses these tables to generate rollup/summary tables that drive various reports for Digital & Print subscriptions.
- The event tracker feed delivers 300+ million records daily; the ETL workflow is scheduled to run continuously, receiving the data feeds on a regular basis and picking up only the delta between runs.
- Involved in migrating data from Oracle/PostgreSQL to AWS S3 and GCS cloud storage, then loading Redshift and BigQuery tables using Informatica tasks and UNIX/Python scripts where necessary.
- Used IDQ for parsing, cleansing, and standardizing new customer addresses using AddressDoctor/Trillium. Also worked on POC projects using IDQ.
- Handled various IDQ transformations, including Address Validator (AddressDoctor), Merge, and Consolidation.
- Used the Unstructured Data transformation for single-copy subscription data feeds and B2B transformations to handle XML inputs and produce output formats such as XML and JSON.
- Generated health check and Informatica job status reports in Business Objects, scheduled hourly, to ensure ETL jobs ran smoothly with no break in data integrity.
- Created various Business Objects reports, such as the Daily Digital & Print Subscriptions Starts/Stops/Actives report, Forecast Trend report, Tenure-wise Permanent Stops, Pay Model Executive Dash report, and Actuals vs. Forecast Waterfall report.
- Involved in migrating Business Objects reports to Tableau, re-creating 20+ subscription-related dashboards with Pie, Bar, Line, and Scatter plots and advanced charts such as Geospatial Maps and Gantt Charts.
Environment: Informatica PowerCenter 9.6, Oracle 12c, PostgreSQL 10, AWS, GCP, Business Objects 4.1, SQL Developer, FileZilla
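The delta-only pickup described above is a change-data-capture step; a minimal Python sketch of the idea follows (field names are hypothetical, and the real implementation ran inside Informatica rather than as a standalone script):

```python
import hashlib

def pick_delta(records, state, key_field="event_id"):
    """Change data capture between runs: emit only rows that are new or whose
    content changed since the last run, as judged by a per-row hash, then
    update the saved state for the next run."""
    delta = []
    for rec in records:
        key = rec[key_field]
        digest = hashlib.md5(
            "|".join(str(rec[k]) for k in sorted(rec)).encode()
        ).hexdigest()
        if state.get(key) != digest:
            delta.append(rec)
            state[key] = digest
    return delta
```

In practice the `state` mapping would live in a control table rather than in memory, so a restarted job resumes from the last committed run.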
Senior Informatica Developer
- Salon Centric Synergy - Salesforce.com: This project involves migrating data from L'Oréal USA enterprise systems (SAP and EIW) to Salesforce.com to support the needs of SalonCentric Field Sales. Interfaces include Account, Customer, Contact, Account Parent, Account Team, License, Product, BOM, Rewards, and Opportunity.
- Salon Centric Offline: Whenever the SAP system is down for maintenance, L'Oréal company stores (around 270) use offline data fed from SAP to Oracle; once SAP is back up, the data is synced from Oracle back to SAP using Informatica interfaces.
- File Interface Agreement: L'Oréal provides controlled SalonCentric data in flat-file format to Harte-Hanks for analysis and marketing.
- Luxe Beauty Advisor Talent Analysis: This project leverages SAP data for all required Customers and Salesforce Hierarchy data, with inbound and outbound feeds to Demo, the key system used to track the financial obligations of Beauty Advisors to the Retailers they work for.
- Involved in understanding the business requirements and designing various projects involving Salesforce.com, SAP, Oracle and SQL databases.
- Responsible for technical architecture, creation of technical specs, and design of Informatica mappings and the source, target, and staging databases.
- Designed and developed to ensure that database scripts and components met business/technical requirements and performance goals, creating database objects according to best practices.
- Handled joins across millions of records in Salesforce objects by bringing key data on premise for optimization, and used the Bulk API for efficient loading to Salesforce.
- Used cross-reference tables to look up Salesforce data from staging and keep staging consistent with Salesforce.com.
- Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections, relational connections and application connections.
- Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks, along with various batch and UNIX scripts.
- Worked on a complex de-duplication process for contact addresses, using audit and control tables to track changes made to records and the corresponding row counts.
- Prepared low level technical design document and participated in build/review of the business requirements and creating various test cases.
- Performed performance analysis, monitoring, and SQL query tuning using EXPLAIN PLAN, Collect Statistics, Hints, and SQL Trace in both Oracle and MS SQL Server.
- Used Informatica error tables to capture rejections while loading to Salesforce, then used these tables to re-load the records to SFDC once the issues were fixed.
- Implemented source- and target-based partitioning for existing production workflows to improve performance and reduce run time.
- Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
- Involved in version migration of workflows from Informatica 8.6 to Informatica 9.1
Environment: Informatica PowerCenter 9.5, SQL Server 2008, PL/SQL, Salesforce CRM, SAP, SQL Developer, Apex Data Loader, Force.com Explorer, Globalscape FTP, Tidal Scheduler, Remedy
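The contact de-duplication work described above can be illustrated with a simplified Python sketch. The real cleansing used IDQ/AddressDoctor; the normalization rules here are illustrative stand-ins only:

```python
import re

def normalize_address(addr):
    """Standardize an address string before matching (a much-simplified
    stand-in for IDQ/AddressDoctor cleansing)."""
    addr = addr.upper().strip()
    addr = re.sub(r"[.,]", "", addr)          # drop punctuation
    replacements = {"STREET": "ST", "AVENUE": "AVE", "ROAD": "RD"}
    words = [replacements.get(w, w) for w in addr.split()]
    return " ".join(words)

def dedupe_contacts(contacts):
    """Keep the first contact seen per normalized (name, address) pair;
    return the survivors and the duplicates that were dropped, so the
    dropped set can be written to an audit table."""
    seen, survivors, dropped = {}, [], []
    for c in contacts:
        key = (c["name"].upper().strip(), normalize_address(c["address"]))
        if key in seen:
            dropped.append(c)
        else:
            seen[key] = c
            survivors.append(c)
    return survivors, dropped
```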
Senior ETL Analyst
- Involved in understanding the business requirements and designing data integration and migration from multiple legacy systems to Salesforce.com.
- Involved in requirement review meetings with Business users & SFDC team.
- Developed technical specification requirements and Informatica mapping documents from the FS and requirement review meetings.
- Performed CDC when updating SFDC so that only incremental data is loaded during integration.
- Used transformations like Application Source Qualifier, Expression, Normalizer, Idoc transformation, Sorter, Sequence Generator, Aggregator, Joiner, Union, Filter, Lookup (Connected & Unconnected), Update strategy etc.
- Optimized SOQL (Salesforce Object Query Language) queries in ETL sessions to eliminate query timeout issues in Informatica where indexing was not possible on certain SFDC objects.
- Performed Unit testing after every piece of development and conducted code review sessions.
- Performed complete validation of data on staging before loading into SFDC objects.
- Performed UAT with SFDC functional team & Legacy CRM team
- Handled and troubleshot various data rejections caused by Salesforce triggers, timeout issues, and other unexpected scenarios.
- Created Application, Relational and FTP connections in Workflow manager in DEV area.
- Maintained versioning of development, technical, and functional documents to preserve change history.
- Used Informatica error tables to capture Salesforce rejections and reprocess the rejected records after fixing the issues.
- Created various Oracle Stored Procedures, Views and Functions based on business rules and validations
- Optimized complicated Oracle queries for better performance and to handle growing data volumes.
- Various tests like Unit testing, System Integration testing and User acceptance testing were done before moving data to Production environment.
- Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
- Coordinated with the offshore team to ensure code-delivery deadlines were met.
Technical Environment: Informatica PowerCenter 8.6.1, Oracle 10g, DB2, PL/SQL, Salesforce CRM, SQL Developer, Mercury Quality Center (MQC), Polytron Version Control System (PVCS), Apex Explorer, Apex Data Loader, Appworks Scheduler
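The error-table reprocessing pattern above can be sketched as a small re-drive loop; `load_fn` stands in for the actual Salesforce load step, and the row layout is hypothetical:

```python
def reprocess_rejects(error_rows, load_fn):
    """Re-drive rejected rows from an error table through the load function.

    Rows that now succeed are cleared; rows that fail again keep their
    latest error message and remain queued for the next pass.
    """
    still_failing = []
    for row in error_rows:
        try:
            load_fn(row)
        except Exception as exc:
            row["last_error"] = str(exc)
            still_failing.append(row)
    return still_failing
```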
Senior Informatica Developer
Confidential, Fort Worth, Texas
Responsibilities:
- Played a major role in understanding the business requirements and designing and loading the data into the SAP system.
- Extracted data from Oracle, DB2 & Mainframe and loaded into SAP using BAPI /RFC & IDOC transformations.
- Performance tuning using round-robin, hash auto-key, and key-range partitioning.
- Created Interface and Conversion mappings to load data into and from SAP system.
- Implemented Slowly Changing Dimensions - Type I, II & III in different mappings as per the requirements to keep track of historical data.
- Created mappings and mapplets using various transformations such as Normalizer, SQL, Update Strategy, Router, Union, Joiner, Rank, Stored Procedure, and XML transformations.
- Worked on unstructured data and XML documents.
- Created Oracle Stored procedures, Functions, PL/SQL programs, Packages and views wherever required.
- Used TOAD Explain Plan statistics to optimize mappings
- Implemented pipeline partitioning concepts like hash-key, round robin, key-range, pass-through techniques.
- Generated various reports using Crystal Reports X functionality such as Queries, Slice and Dice, Drill Down, @Functions, Cross Tab, Master/Detail, and Formulas, per client requirements.
Technical Environment: Informatica PowerCenter 8.5.1, SAP, Oracle 10g, DB2, PL/SQL, T-SQL, UNIX Scripting, Erwin data modeler, MS Visio, Core-FTP, VSS, Crystal Reports X, AIX Server
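The SCD Type II tracking mentioned above (close the current row when an attribute changes, insert a new current row) can be sketched in-memory like this; column names are hypothetical, and the real implementation was done in Informatica mappings rather than Python:

```python
from datetime import date

def apply_scd2(dimension, incoming, key="cust_id", today=None):
    """Slowly Changing Dimension Type II: when an incoming record differs
    from the current row for its key, expire the current row (end_date set,
    is_current cleared) and append a fresh current row; unseen keys are
    simply inserted as new current rows."""
    today = today or date.today().isoformat()
    current = {r[key]: r for r in dimension if r["is_current"]}
    for rec in incoming:
        row = current.get(rec[key])
        changed = row and any(row[k] != v for k, v in rec.items() if k != key)
        if row is None or changed:
            if changed:
                row["is_current"] = False
                row["end_date"] = today
            dimension.append(dict(rec, start_date=today,
                                  end_date=None, is_current=True))
    return dimension
```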
Senior ETL Developer
- Played a major role as a member of global product support: understanding business requirements, designing and loading data into the data warehouse (ETL), and maintaining the scheduled jobs.
- Used ETL (Informatica) to load data from source to ODS.
- Trouble shooting and escalation of issues to concerned teams.
- Created PowerCenter and PowerExchange connections to Oracle, SQL Server, and Mainframe files.
- Performed various data mining analyses and reporting.
- Performed daily data loads through scheduled ODI jobs using SCD Types I, II & III.
- Defined Target Load Order Plan for loading data into Target Tables
- Worked on SQL Server DTS packages for ETL into staging area.
- Generated various reports using Business Objects functionality such as Queries, Slice and Dice, Drill Down, @Functions, Cross Tab, Master/Detail, and Formulas, per client requirements.
- Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies
- Performed debugging and tuning of mappings and sessions.
- Tested mappings to ensure data is loaded per the ETL specification.
Technical Environment: Informatica 7.1, SQL Server 2000, Oracle 9i, TOAD, Business Objects 3.1, ClearCase & ClearQuest, VSS, Erwin Data modeler, EARS, PL/SQL Developer, Control-M
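The Target Load Order Plan mentioned above amounts to a dependency ordering over target tables (parents before children). A small sketch with hypothetical table names:

```python
def load_order(deps):
    """Target load order: topologically sort targets so each table loads
    after everything it depends on. `deps` maps each target to the list of
    targets it depends on (e.g. a fact table depends on its dimensions).
    Assumes the dependency graph is acyclic, as a load plan must be."""
    order, seen = [], set()

    def visit(t):
        if t in seen:
            return
        seen.add(t)
        for d in deps.get(t, []):
            visit(d)           # load dependencies first
        order.append(t)

    for t in deps:
        visit(t)
    return order
```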
- Coordinated project activities and worked closely with clients to identify business requirements and develop solutions.
- Produced extensive documentation of the design, development, implementation, daily loads, and Visio process flows of the mappings.
- Data extracted from Oracle and Siebel ERP was migrated to Salesforce CRM and the BigMachines application.
- Created staging tables for looking up Salesforce data from staging, both for better performance and to keep data in sync between the two systems.
- Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.
- Worked on complex de-duplication and roll-up logic (identifying primaries), maintaining master-detail relationships on the SFDC objects involved.
- Implemented re-parenting logic: when a child record's parent changes, the ETL creates a new record and captures the old and new records' SFDC IDs; an Apex job then updates the parent using the re-parented child's external ID and deletes the old child.
- Used error, audit, and control tables along with partitions to track data changes and, in case of a job failure, restart the job from where it left off.
- Various tests like Unit testing, System Integration testing and User acceptance testing were done before moving code to Production environment.
Technical Environment: Informatica PowerCenter 8.6.1, Oracle 10g, DB2, PL/SQL, Salesforce CRM, SQL Developer, Aqua Data Studio, Apex Data Loader, WinSCP, ESP Scheduler
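The detection half of the re-parenting logic above can be sketched as a comparison between the previous and current runs; the field names are hypothetical, and the actual relink/delete was performed by an Apex job downstream:

```python
def detect_reparenting(previous, current):
    """Flag child records whose parent changed between runs, emitting the
    old and new parent IDs so a downstream job can relink via the child's
    external ID and retire the old child record."""
    prev_parent = {c["child_ext_id"]: c["parent_id"] for c in previous}
    moves = []
    for c in current:
        old = prev_parent.get(c["child_ext_id"])
        if old is not None and old != c["parent_id"]:
            moves.append({"child_ext_id": c["child_ext_id"],
                          "old_parent": old,
                          "new_parent": c["parent_id"]})
    return moves
```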
- As a senior member of the Development/Support/Enhancement team, responsible for handling job failures and mapping defects.
- Creation of tables, indexes, view, stored procedures, functions and granting roles and permissions to various users using synonyms.
- Used Erwin to prepare organization charts and roles, schedule activities, track project progress, and visualize business processes.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
- Extracted data from DB2, COBOL, and Mainframe sources, wrote the data to flat files, then loaded it into Oracle 9i/10g using SQL*Loader; extracted data from Teradata using Optiload.
- Created various PL/SQL programs, Stored procedures, Functions and Views.
- Used ODI for integrating data on target database to match with the real time sources.
- Maintained Development, Test, and Production mapping migrations using Repository Manager, which was also used to maintain metadata, security, and reporting.
- Efficiently handled Problem and Change Management tickets based on the priority of tickets.
- Worked on SLAs and ensured above 99% SLA compliance on the applications throughout the duration of production support
Technical Environment: Informatica PowerCenter 8.1/7.1, SQL Server 2005, Oracle 9i, ODI, Mainframe, DB2, COBOL, Erwin data modeler, T-SQL, MS Access, TOAD, VSS, Remedy, EARS, PL/SQL Developer, UNIX