- Overall 5 years of experience in the IT industry, spanning the full data warehouse development lifecycle: requirements gathering, design, implementation, testing, maintenance, and production support for new and existing applications.
- Strong technical and development expertise in Informatica PowerCenter, Informatica Data Quality, MDM, Oracle SQL and PL/SQL, Teradata, and UNIX shell scripting.
- Experience in Agile methodology and sprint planning.
- Experience with dimensional modeling using star schema and snowflake schema.
- Developed mappings in Informatica PowerCenter to load data from various sources into the Data Warehouse, using transformations such as Source Qualifier, Expression, Lookup, Aggregator, Filter, Router, Update Strategy, and Joiner.
- Strong Data cleansing, Data quality and Data/Code Migration experience using Informatica.
- Responsible for error handling, supporting and scheduling ETL jobs.
- Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of sources, targets, mappings, transformations, and sessions.
- Expertise in implementing complex business rules by creating reusable transformations.
- Highly skilled in integrating various Operational Data Sources (ODS) with multiple relational database management systems (RDBMS) such as Oracle, DB2, and SQL Server, as well as flat files, Excel, and XML files.
- Worked extensively with slowly changing dimensions (SCD) and change data capture (CDC).
- Troubleshot and debugged applications with the Informatica Debugger and performed code migration from Dev to Test.
- Experience in using IDQ for profiling, applying rules, and developing mappings to move data from source to target systems.
- Experience in Informatica MDM.
- Good understanding of entity-relationship and data models.
- Experience in design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries and packages.
- Good knowledge in creating Mappings, Trust and Validation rules, Match Path, Match Column, Match rules, Merge properties and Batch Group creation.
- Expertise in writing large/complex SQL queries using Joins, Views, volatile tables, Sub Queries, Indexes, Constraints and OLAP functions.
- Extensively worked with Teradata utilities such as BTEQ, TPT, TPump, FastExport, and FastLoad, and with indexing (PI, SI, JI, HI, PPI).
- Proficient in SQL query tuning using EXPLAIN PLAN, Collect Statistics, and hints in both Teradata and Oracle.
- Extensive experience in writing UNIX shell scripts and automating ETL processes with them.
- Performed Data validation, Data Integration and End-to-End testing.
- Experience in developing SQL scripts to validate reports.
- Hands-on experience with Tableau Desktop 10.1, Tableau Reader, and Tableau Server.
- Experience in publishing of various kinds of live, interactive data visualizations, dashboards, reports and workbooks from Tableau Desktop to Tableau servers.
- Experience in creating Scatter Plots, Stacked Bars, Box-and-Whisker plots with reference lines, Bullet charts, Heat Maps, Filled Maps, and Symbol Maps.
- Expert level capability in Tableau calculations and applying complex, compound calculations to large, complex data sets.
- Experience in creating and scheduling jobs using Tidal and Control-M.
- Good knowledge of Hadoop Ecosystem, HDFS, Pig, Hive & HBase.
- Used HP ALM for storing and maintaining the test repository, bug tracking, and reporting.
- Experience in preparing documents on Detailed Technical Design, Migration process, Test cases and ETL Run Book.
ETL & BI Tools: Informatica PowerCenter 8.x/9.x/10.x, Informatica Data Quality 9.1.0, Informatica MDM 10.1, Oracle Data Integrator 10g/11g, Tableau 10.1
Databases: Oracle 9i/10g/11g, Teradata
Programming Language: SQL, PL/SQL
Scripting Language: UNIX Shell Scripting
Scheduling Tools: Tidal Enterprise Scheduler, Control M
DB Utilities: SQL*Plus, SQL Developer, Toad, Teradata SQL Assistant, Teradata Studio
Operating Systems: Windows 2000/NT/XP, UNIX (AIX), Linux
Confidential, Irving, TX
Environment: Informatica 10.1.0, IDQ 10.0.0, Teradata, MSSQL, Tidal Scheduler and Transporter, ServiceNow.
- Used transformations such as Expression, HTTP, Update Strategy, Union, Lookup, Joiner, XML Parser, and XML Generator, with MQ targets.
- Developed an error handling framework in Informatica to handle session failures and deadlock situations in Teradata.
- Created dynamic parameter file for each workflow.
- Revalidated the existing mapping/mapplets, workflows/worklets.
- Created audit, error, and log tables in Teradata.
- Analyzed the existing workflow sessions and identified dependencies for scheduling.
- Implemented a procedure to create an incident in ServiceNow via a mapping when a job failed.
- Managed multi source data extraction using Informatica.
- Involved in necessary training and knowledge transfer to assigned project members.
- Created sessions and batches for data movement using Workflow Manager.
- Created reusable transformations and mapplets and used them across mappings wherever the same logic was needed.
- Developed and documented Data Mappings/Transformations, and Informatica sessions as per the business requirement.
- Involved in migration of the mappings from IDQ to PowerCenter.
- Performed data profiling and scorecarding in collaboration with the Data Architect.
- Developed complex quality rules and implementation patterns covering cleansing, parsing, standardization, validation, exception handling, notification, and reporting, with both ETL and real-time considerations.
- Developed Exception Handling strategies to capture errors during loading processes, to notify the exception records to the source team and automating the processes for loading the failed records to warehouse.
- Applied the rules and transformation logic as per the requirements and profiled the source and target tables' data using IDQ.
- Developed mappings, Process Sequence, Dictionaries, reference tables and rules.
- Used various transformations such as Address Validator, Parser, Labeler, Standardizer, Filter, Case, and Match to develop the mappings.
- Worked with a Tidal professional services consultant to install the Tidal Enterprise Scheduler master, client, and agent in a Windows environment.
- Performed analysis, design, and implementation of batch processing workflows using Tidal Enterprise Scheduler.
- Worked with REST and SOAP APIs to check daily failed jobs and restart them from Informatica at a given time interval.
- Sent incident-close requests from Informatica to ServiceNow on successful execution for any open incidents.
- Created job events, variables, alerts, and email notifications to monitor jobs in Tidal.
- Created jobs in the Tidal scheduler and defined inter- and intra-group dependencies.
- Used Tidal Transporter to migrate jobs from UAT to Production.
- Participated in sprint planning, story grooming, design discussion and daily scrum.
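As an illustration of the failed-job restart logic described above, a minimal shell sketch is shown below. The service, domain, folder, and workflow names are placeholders rather than project values, and `pmcmd` is stubbed with `echo` so the sketch runs without an Informatica installation; the real script also queried the scheduler's REST API to identify the failed jobs.

```shell
#!/bin/sh
# Sketch: restart a failed Informatica workflow with a bounded retry loop.
# PMCMD defaults to a dry-run stub so the script runs anywhere.
PMCMD="${PMCMD:-echo pmcmd}"            # replace with real pmcmd path in production
RETRY_INTERVAL="${RETRY_INTERVAL:-1}"   # seconds between restart attempts
MAX_RETRIES=3

restart_workflow() {
    wf="$1"
    attempt=1
    while [ "$attempt" -le "$MAX_RETRIES" ]; do
        # startworkflow returns 0 when the workflow starts successfully
        if $PMCMD startworkflow -sv INFA_SERVICE -d INFA_DOMAIN -f FOLDER "$wf"; then
            echo "restarted $wf on attempt $attempt"
            return 0
        fi
        attempt=$((attempt + 1))
        sleep "$RETRY_INTERVAL"
    done
    echo "giving up on $wf after $MAX_RETRIES attempts"
    return 1
}

restart_workflow wf_daily_load
```

With the stub in place the script simply prints the command it would run; pointing `PMCMD` at a real `pmcmd` binary turns it into a live restart wrapper.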
Environment: Informatica 9.1/8.6.1, UNIX, Putty, WinSCP, Oracle SQL Developer, Teradata SQL Assistant, Tidal
- Designed, Developed and Supported ETL Process for data migration with Informatica.
- Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
- Worked extensively with the connected lookup Transformations using dynamic cache.
- Wrote shell scripts for dynamic generation of parameter files and for controlling the ETL flow.
- Used TIDAL enterprise scheduler for scheduling the Daily, Weekly & Monthly jobs.
- Automated ETL process using Tidal Enterprise Scheduler.
- Created job event and email notification using Tidal Client to alert users.
- Involved in the creation of new objects (tables/views, indexes, keys) in Teradata and modified the existing ETLs to point to the appropriate environment.
- Prepared mapping document for Hadoop team.
- Fixed invalid mappings, debugged mappings in the Designer, and performed unit and integration testing of Informatica sessions, worklets, workflows, and target data.
- Used EXPLAIN PLAN, Collect Statistics for performance analysis, monitoring and query tuning.
- Used Teradata utilities like BTEQ, Fast Export, Fast Load, Multi Load to export and load data to/from different source systems including flat files.
- Prepared test environments, recorded test results, and tracked bugs.
- Documented the migration process, test cases, and the ETL Run Book.
- Provided support to the Hadoop team for dependent jobs.
- Implemented Email tasks for session and workflow failures to notify the support team of any failure.
- Developed UNIX Shell Scripts for running Informatica workflows and sessions (pmcmd commands), File manipulations and NDM utility program.
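The Teradata utility work described above can be sketched as a shell script that generates a BTEQ control script with basic error handling. The database, table, and logon file names below are illustrative only; `.QUIT 8` surfaces a nonzero return code to the calling shell so a load failure can be detected and handled.

```shell
#!/bin/sh
# Sketch: generate a BTEQ script that loads a staging table and aborts
# with return code 8 on any SQL error.  Names are hypothetical.
BTEQ_SCRIPT=stg_load.bteq

cat > "$BTEQ_SCRIPT" <<'EOF'
.RUN FILE=logon.txt;
.SET ERROROUT STDOUT;

DELETE FROM stg_db.stg_customer;
.IF ERRORCODE <> 0 THEN .QUIT 8;

INSERT INTO stg_db.stg_customer
SELECT * FROM lnd_db.lnd_customer;
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
EOF

echo "generated $BTEQ_SCRIPT"
# a real run would be: bteq < "$BTEQ_SCRIPT"; then check $? for the .QUIT code
```

Keeping credentials in a separate logon file (referenced by `.RUN FILE`) avoids embedding passwords in the generated script.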
Environment: Informatica 9.0.1, Informatica Data Quality 9.1.0, Oracle 10g, PLSQL Developer, UNIX, Putty
- Extensively worked on Informatica IDQ/PowerCenter.
- Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
- Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
- Used various transformations like Address validator, Parser, Standardizer, Match, Merge, Labeler, Keygen, Case, Comparison, Consolidation, Expression etc. to develop the mappings.
- Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.
- Applied the rules and profiled the source and target tables' data using IDQ.
- Developed business rules for cleansing/validating/standardization of data.
- Worked on Power Center Tools like designer, workflow manager, workflow monitor and repository manager.
- Involved in migration of the mapplets from IDQ to PowerCenter.
- Developed the mappings and applied rules and transformation logic as per the source and target system requirements.
- Effectively used Informatica parameter files for defining mapping variables, workflow variables and relational connections.
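The parameter-file usage mentioned above can be sketched as a small generator script that writes a run-specific file before the workflow starts. The folder, workflow, and parameter names are hypothetical, and the load date here comes from the system clock, whereas a real implementation might derive it from an audit table.

```shell
#!/bin/sh
# Sketch: generate a per-run Informatica parameter file so each workflow
# execution picks up the current load date and source file.
RUN_DATE=$(date +%Y%m%d)
PARAM_FILE="wf_customer_load_${RUN_DATE}.par"

# \$ escapes keep the literal $$/$ prefixes Informatica expects,
# while ${RUN_DATE} is expanded by the shell.
cat > "$PARAM_FILE" <<EOF
[FOLDER_EDW.WF:wf_customer_load]
\$\$LOAD_DATE=${RUN_DATE}
\$\$SRC_FILE=/data/inbound/customer_${RUN_DATE}.dat
\$DBConnection_TGT=EDW_TGT
EOF

echo "created $PARAM_FILE"
```

The workflow would then be started with this file passed via `pmcmd -paramfile`, so no mapping change is needed when run-specific values change.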
Environment: Informatica 8.1.1, ODI 10g, Oracle 9i/10g, PLSQL Developer, Toad, UNIX, Putty, Control M
- Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) process using Informatica.
- Wrote SQL overrides and used filter conditions in the Source Qualifier, thereby improving mapping performance.
- Designed and developed mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, Joiner, and Rank transformations.
- Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks and used Debugger to debug the complex mappings and fix them.
- Exported/Imported the mappings/sessions/workflows from development to Test Repository and promoted to Production.
- Used session parameters and mapping variables, and created parameter files to allow flexible workflow runs based on changing variable values.
- Worked with Static, Dynamic and Persistent Cache in lookup transformation for better throughput of Sessions.
- Extensively used debugger to test the logic implemented in the mappings.
- Performed error handling using session logs.
- Tuned performance of Informatica session for large data files by increasing block size, data cache size and target-based commit interval.
- Created mappings to capture changes in the audit table.
- Created UNIX shell scripts for Informatica ETL tool to automate sessions.
- Performed unit testing of code in QA and UAT.
- Provided support to the support team during the outbound process.
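The session-automation scripts mentioned above might look roughly like the following wrapper, which starts a workflow with `-wait` and mails the support team on failure. The `pmcmd` and `mailx` invocations are stubbed with `echo` so the sketch runs anywhere; the service, domain, folder, workflow, and mailing-list names are assumptions, not project values.

```shell
#!/bin/sh
# Sketch: run a workflow to completion and notify support on failure.
PMCMD="${PMCMD:-echo pmcmd}"       # dry-run stub; point at real pmcmd in production
MAILER="${MAILER:-echo mailx}"     # dry-run stub for the mail command
SUPPORT_DL="support-team@example.com"   # illustrative address

run_workflow() {
    wf="$1"
    # -wait blocks until the workflow finishes and returns nonzero on failure
    if $PMCMD startworkflow -sv INFA_SERVICE -d INFA_DOMAIN -f FOLDER -wait "$wf"; then
        echo "$wf completed"
    else
        $MAILER -s "$wf failed" "$SUPPORT_DL" < /dev/null
        echo "$wf failed, support notified"
        return 1
    fi
}

run_workflow wf_outbound_extract
```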