- 9 years of strong Data Warehousing experience in the analysis, design, development, and implementation of software solutions for various business applications using ETL/Informatica PowerCenter 10.0/9.5.1/8.6.1
- 2.5 years of real-time data warehouse development experience using the CDC tool Informatica PowerExchange 9.5.1.
- Extensively worked with various components of the Informatica Power Center - Power Center Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation developer), Repository Manager, Workflow Manager, and Workflow Monitor to create mappings for the extraction of data from various source systems.
- Good knowledge of business/process dataflow in organizations and proficient in data modeling using ERWIN.
- Experience in implementing complex business rules by creating reusable transformations and robust mappings/mapplets using transformations such as Connected and Unconnected Lookup, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, and Normalizer.
- Proficient in full life cycle development of data warehousing, ETL strategies, and reporting, with hands-on experience in performance tuning of sources, targets, transformations, mappings, and sessions using database tuning, partitioning, index usage, aggregate tables, session partitioning, load strategies, commit intervals, and transformation tuning.
- Extensive experience in integrating data from various Heterogeneous sources like Relational database (Oracle, SQL Server, Teradata, DB2), Flat Files (Fixed Width and Delimited), COBOL files, XML Files and Excel files into Data Warehouse and Data Mart.
- Experience in Database Design, Entity-Relationship modeling, Dimensional modeling like Star schema and Snowflake schema, Fact and Dimension tables.
- Worked with Slowly Changing Dimensions type 1 and type 2.
- Experience in Oracle PL/SQL Programming (Stored procedures, Triggers, Functions, Packages) and UNIX shell scripting to perform job scheduling.
- Worked with Teradata loading utilities like MultiLoad, FastLoad, TPump, and BTEQ.
- Good experience in documenting the ETL process flow for better maintenance and analyzing the process flow.
- Demonstrated ability to take initiative, grasp and expand on ideas, and tackle and follow through on difficult assignments in fast-paced, changing environments.
- Independent yet team-oriented, with excellent analytical, problem-solving, multi-tasking, and oral and written communication skills.
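The Slowly Changing Dimension work summarized above can be sketched in miniature. The example below is purely illustrative (table, column, and function names are invented, and SQLite stands in for the warehouse database); it shows the Type 2 expire-and-insert pattern of versioning a dimension row when a tracked attribute changes:

```python
import sqlite3

# Hypothetical SCD Type 2 dimension table; SQLite is a stand-in for the
# actual warehouse database (Oracle/Teradata/SQL Server).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    cust_key INTEGER PRIMARY KEY AUTOINCREMENT,
    cust_id INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, current_flag INTEGER)""")

def scd2_upsert(cust_id, city, load_date):
    # Expire the current row only if the tracked attribute changed.
    cur.execute("""UPDATE dim_customer
                   SET end_date = ?, current_flag = 0
                   WHERE cust_id = ? AND current_flag = 1 AND city <> ?""",
                (load_date, cust_id, city))
    # Insert a new current version if a row was expired, or if this key
    # has no current row at all (first load).
    if cur.rowcount or not cur.execute(
            "SELECT 1 FROM dim_customer WHERE cust_id = ? AND current_flag = 1",
            (cust_id,)).fetchone():
        cur.execute("""INSERT INTO dim_customer
                       (cust_id, city, eff_date, end_date, current_flag)
                       VALUES (?, ?, ?, '9999-12-31', 1)""",
                    (cust_id, city, load_date))

scd2_upsert(101, "Denver", "2015-01-01")   # initial load
scd2_upsert(101, "Austin", "2015-06-01")   # city change -> new version
rows = cur.execute(
    "SELECT city, current_flag FROM dim_customer WHERE cust_id = 101 "
    "ORDER BY cust_key").fetchall()
print(rows)  # [('Denver', 0), ('Austin', 1)]
```

A Type 1 dimension would instead overwrite the changed attribute in place, keeping a single row per business key.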
ETL Tools: Informatica PowerCenter 10.0/9.5.1/8.6.1
CDC tool: Informatica PowerExchange 9.5.1
Databases: Oracle 11g/10g/9i/8i, DB2, Teradata 14/13/V2R6, SQL Server 2008/2000.
Data Modeling tools: ERWIN 4.x, MS Visio
Languages: SQL, PL/SQL, UNIX Shell scripts, C, C++, Java, VB Script
Development Tools: SQL *Plus, TOAD, SQL Navigator, SQL Developer
Reporting Tools: Business Objects XI, OBIEE 10g.
Scheduling Tools: Autosys, Tidal
Packages: MS Word, MS Excel, MS Visio, MS PowerPoint
Operating Systems: Sun Solaris, UNIX, LINUX, MS-DOS
Processes/Methodologies: Agile Software development, Scrum/Sprint, Pair programming.
Confidential, Denver, CO
- Analyzed source data for potential data quality issues and addressed them in ETL procedures.
- Developed technical design documents and mappings specifications to build Informatica Mappings to load data into target tables adhering to the business rules.
- Designed, developed, tested, maintained, and organized complex Informatica mappings, sessions, and workflows.
- Translated the PL/SQL logic into Informatica mappings.
- Provided support for existing systems and worked on new development initiatives simultaneously.
- Actively participated in design and analysis sessions to ensure sound team decision-making.
- Supported system and user acceptance testing activities, including issue resolution.
- Completed technical documentation to ensure the system was fully documented.
- Designed and developed CDC-based ETL using PowerExchange 8.6 in a Mainframe DB2 environment.
- Created registrations and data maps for mainframe sources.
- Created restart tokens for various workflows.
- Demonstrated in-depth understanding of Data Warehousing (DWH) and ETL concepts, including ETL loading strategies.
- Investigated the current system and created requirements for a new user management application.
- Created an application for migrating data between systems/databases using PHP and MySQL.
- Attended weekly meetings to discuss future goals.
- Learned valuable communication and development skills by working with other developers and administrators.
- Experience in writing SQL and PL/SQL code, stored procedures, and packages.
- Experience with full and partial pushdown optimization.
- Worked on Data Modeling using Star/Snowflake Schema Design, Data Marts, Relational and Dimensional Data Modeling, Slowly Changing Dimensions, Fact and Dimensional tables, Physical and Logical data modeling using Erwin.
Environment: Informatica PowerCenter 9.1, PowerExchange 9.1, SQL Server 2008, Teradata 13, UNIX, Toad, Harvest, Sun Solaris, Erwin 7.5, OBIEE 10g.
Confidential, Lincolnshire, IL
Sr. Informatica Developer
- Worked with business analysts on requirements gathering, business analysis, testing, and project coordination.
- Created Detail Design Documents containing the ETL technical specifications for the given functionality (overall process flow for each process, flow diagrams, mapping spreadsheets, issues, assumptions, configurations, Informatica code details, shell scripts, etc.) and conducted meetings with the clients for approval of the process.
- Analyzed the mapping logic to determine the reusability of the code.
- Handled versioning and dependencies in Informatica.
- Developed complex Informatica mappings using various transformations: Source Qualifier, Normalizer, Filter, Connected Lookup, Unconnected Lookup, Update Strategy, Router, Aggregator, and Sequence Generator (including reusable sequence generator transformations).
- Extensively used SCDs (Slowly Changing Dimensions) to handle incremental loading for dimension and fact tables.
- Worked with Business analysts and OBIEE Developers for requirements gathering, business analysis and designing of the data marts.
- Responsible for managing and maintaining the OBIEE servers and services; coordinated with the infrastructure team on bouncing services during planned/unplanned server downtimes.
- Configured Informatica Power Exchange connection and navigator.
- Created Registration, DataMap, configured Real-Time mapping and workflows for real-time data processing using CDC option of Informatica PowerExchange.
- Wrote scripts to load multiple tables using the MLOAD utility.
- Used Debugger utility of the Designer tool to check the errors in the mapping and made appropriate changes in the mappings to generate the required results.
- Performed ETL and database code migrations across environments.
- Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.
- Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
- Created and maintained the Shell Scripts and Parameter files in UNIX for the proper execution of Informatica workflows in different environments.
- Created Unit test plans and did unit testing using different scenarios separately for every process. Involved in System test, Regression test & supported the UAT for the client.
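As an illustration of the parameter files mentioned above: a PowerCenter parameter file is a plain-text file of sections keyed by folder, workflow, and session, with one variable assignment per line. The folder, workflow, session, and variable names below are hypothetical:

```
[MyFolder.WF:wf_daily_load.ST:s_m_load_customers]
$$LOAD_DATE=2015-06-01
$DBConnection_Source=ORA_SRC_DEV
$InputFile_Customers=/data/inbound/customers.dat
```

A UNIX wrapper script typically selects or generates such a file per environment and passes it to the workflow via pmcmd's -paramfile option.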
Environment: Informatica PowerCenter 10.1/9.5.1, Informatica PowerExchange 9.5.1, OBIEE, Toad, UNIX Shell Scripting, Oracle Exadata, SQL Server 2008, Oracle 11g, Mercury Quality Center, Autosys, Linux.
- Designed ETL processes using Informatica to load data from flat files and Excel files into the target Oracle Data Warehouse database.
- Performed data manipulations using various transformations like Joiner, Expression, Lookup, Aggregator, Filter, Update Strategy, Sequence Generator, and Stored Procedure.
- Used tuned SQL overrides in the Source Qualifier to meet business requirements.
- Wrote pre-session and post-session scripts in mappings.
- Created sessions and workflows for designed mappings. Redesigned some of the existing mappings in the system to meet new functionality
- Created and used different reusable tasks like command and email tasks for session status
- Used Workflow Manager to create Sessions and scheduled them to run at specified time with required frequency
- Monitored and configured the sessions that are running, scheduled, completed and failed.
- Created test cases, participated in system integration testing and UAT, and migrated code from Dev to Test to QA to Prod.
- Compiled report presentations using tools like Business Objects reports.
- Involved in writing UNIX shell scripts for Informatica ETL tool to fire off services and sessions
Environment: Informatica PowerCenter 9.1, Oracle 11g, Flat files, Excel, SQL Server, PL/SQL, Toad, UNIX Shell Scripting, Windows XP, Business Objects XI R2.
Confidential, Birmingham, AL
ETL/ Informatica Developer
- As a team member, was involved in collecting the requirements for building the database.
- Involved in creating and designing the required tables, indexes, and constraints for all activities.
- Played a key role in designing the application and migrating existing data from relational sources to the corporate warehouse using Informatica PowerCenter.
- Responsible for Extracting, Transforming and Loading data from Oracle, Flat files and placing them into targets.
- Developed various mappings using the Mapping Designer and worked with Source Qualifier, Aggregator, Connected/Unconnected Lookup, Filter, and Sequence Generator transformations.
- Involved in data modeling and design of the Data Warehouse and Data Marts in Star Schema methodology with conformed and granular dimensions and fact tables.
- Designed and developed UNIX shell scripts to report job failure alerts.
- Worked with the Data Quality group to identify and research data quality issues.
- Coded, debugged, and resolved technical problems as they arose.
- Extensive experience in performance tuning: identified and fixed bottlenecks and tuned complex Informatica mappings for better performance.
- Involved in Unit testing, System testing and UAT to check data consistency.
- Scheduled Informatica Jobs through Autosys scheduling Tool.
- Involved in designing the ER diagrams, logical model (relationship, cardinality, attributes, and, candidate keys) and physical database (capacity planning, object creation and aggregation strategies) in Oracle as per business requirements using ERWin.
- Extensive experience in Oracle Packages, Stored Procedures, Functions and Database Triggers using PL/SQL.
- Executed sessions, both sequential and concurrent for efficient execution of mappings and used other tasks like event wait, event raise, email, command and pre/post SQL.
Environment: Informatica PowerCenter 8.5, Oracle 10g, SQL Server 7.0, SQL, Sybase, Teradata, PL/SQL, Erwin, UNIX, Windows XP.
Confidential, Edgewood, NY
ETL/ Informatica Developer
- Involved in identification of facts, measures, dimensions and hierarchies for the DM model.
- Analyzed source system data. Worked with Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
- Imported data from various Sources (Oracle, fixed width flat files, XML), transformed and loaded into Targets using Informatica.
- Used Teradata load utilities such as TPump, MLoad, and FastLoad.
- Created Stored Procedures to transform the Data and worked extensively on PL/SQL for various needs of the transformations while loading the data.
- Tuned mappings, transformations and recommended tuning options of source/target database to DBA team for obtaining optimum performance.
- Created a Universe and Business Objects reports using the Universe as the main data provider, writing complex queries (including subqueries, unions, intersects, and aliases) and using Business Objects features like Slice and Dice, Drill Down, and formulas.
- Created Mapplets and used them in different mappings.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
- Used Informatica Server manager to create Sessions and Workflows to run with the logic embedded in the mappings.
- Performance tuning of the Informatica mappings using various components like Parameter files, variables and dynamic Cache.
- Involved in Documentation regarding the ETL process.
- Coordinated with offshore site in developing and deploying mappings.
- Created Informatica mappings with PL/SQL stored procedures/functions to incorporate critical business functionality into data loads.
- Created database tables, indexes in Oracle Database by implementing various business rules using appropriate constraints and database triggers.
Environment: Informatica PowerCenter 8.1, Informatica Server Manager, Informatica Repository Manager, Workflow manager, Workflow monitor, Oracle 9i, Erwin 4.1, SQL 2005/2000, Flat Files, Windows XP.
Confidential, Nashville, TN
- Created Informatica mappings using various transformations like XML, Source Qualifier, Expression, Lookup, Stored Procedure, Aggregator, Update Strategy, Joiner, Normalizer, Union, Filter, and Router in the Informatica Designer.
- Extensively worked with Teradata database using BTEQ scripts.
- Involved in all phases of the SDLC, i.e. designing, coding, testing, and deploying ETL components of the data warehouse and integrated Data Mart.
- Created subscriptions for source to target mappings and replication methods using IBM CDC tool.
- Used NZSQL scripts, NZLOAD commands to load data.
- Experience in Data Stage Upgrade and Migration projects - from planning to execution.
- Analyzed heterogeneous data from various systems like pm and Salesforce.com and validated it in the ODS (Operational Data Store).
- Worked with Informatica IDQ Data Analyst and Developer tools, applying various data profiling techniques to cleanse and match/remove duplicate data.
- Identified and eliminated duplicate datasets and performed Columns, Primary Key, Foreign Key profiling using IDQ.
- Designed and developed IDQ solutions for data profiling. Implemented Address Doctor as Address Validator transformation for data profiling in IDQ.
- Worked extensively with Netezza scripts to load the data from flat files to Netezza database.
- Created mappings using pushdown optimization to achieve good performance in loading data into Oracle and Teradata.
- Developed various SQL queries using joins, sub-queries & analytic functions to pull the data from various relational DBs i.e. Oracle, Teradata & SQL Server.
- Created Web Services mappings for consumer and provider; used the Web Services Consumer transformation and XML Parser to parse incoming data.
- Created and edited custom objects and custom fields in Salesforce and checked the field level Securities.
- Worked in Informatica Cloud to replicate data from Salesforce.
- Responsible for writing Unix Shell Scripts to schedule the jobs.
- Involved in analyzing, defining, and documenting data requirements by interacting with the client and Salesforce team for the Salesforce objects.
- Worked with cleanse, parse, standardization, validation, scorecard transformations.
- The team would pick up the Data Extract TXT files from the MBOX server for processing into the SAP system; the SAP system would send error/reconciliation data back to the MBOX server in TXT format.
- Created pre-session, post-session, pre-SQL, and post-SQL commands in Informatica.
- Used UNIX scripts for file management as well as in FTP process.
- Worked closely with DBAs, application, database, and ETL developers, and change control management to migrate developed mappings to PROD.
- Production support for the Informatica process, troubleshoot and debug any errors.
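The analytic-function queries described above often follow a "latest row per key" pattern. Below is a minimal, hypothetical sketch (invented table and data), with SQLite 3.25+ standing in for Oracle, Teradata, or SQL Server:

```python
import sqlite3

# Illustrative data set; the orders table and its rows are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, cust_id INTEGER,
                         amount REAL, order_date TEXT);
    INSERT INTO orders VALUES
        (1, 101, 50.0, '2015-01-10'),
        (2, 101, 75.0, '2015-03-05'),
        (3, 102, 20.0, '2015-02-01');
""")

# ROW_NUMBER() over a per-customer window picks each customer's most
# recent order; the outer query keeps only that row.
latest = conn.execute("""
    SELECT cust_id, amount
    FROM (SELECT cust_id, amount,
                 ROW_NUMBER() OVER (PARTITION BY cust_id
                                    ORDER BY order_date DESC) AS rn
          FROM orders)
    WHERE rn = 1
    ORDER BY cust_id
""").fetchall()
print(latest)  # [(101, 75.0), (102, 20.0)]
```

The same subquery-plus-window-function shape works in Oracle, Teradata, and SQL Server, which all support ROW_NUMBER().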
Environment: Informatica Data Quality 9.1.0/9.5.1, Flat Files, Mainframe Files, Oracle 11i, Netezza, Quest Toad Central 9.1, UNIX Shell Scripting, Windows 2000, Windows 2003, SQL Server 2005, SQL Server 2008, Salesforce.com, Web Services.