- 8+ years of professional experience in the IT industry, spanning design, analysis, development, documentation, coding, and implementation of databases, reporting, data warehouses, ETL design, and BI applications across a wide range of industry verticals.
- Demonstrated experience designing and implementing Informatica Data Quality applications for business and technology users across the full development life cycle.
- Expertise in data warehousing concepts and architecture, with thorough knowledge of Dimensional, Conceptual, Logical, and Physical data models.
- Excellent experience in designing, modeling, performance tuning, analyzing, and implementing ETL processes using Informatica Power Center for data extraction, transformation, and loading.
- Strong expertise in dimensional design models such as the Star Schema and Snowflake models.
- Expertise in building Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), Data Marts, and Decision Support Systems (DSS) using the ERWIN data modeling tool and dimensional modeling.
- Strong knowledge and experience working with Oracle, SQL Server, DB2, SAP, Netezza, and Teradata databases and XML sources, using SQL/PL-SQL programming.
- 2 years of strong experience in Systems Integration using Informatica B2B Data Exchange.
- Exposure to Informatica B2B Data Exchange, which supports an expanding diversity of customers, partners, and data formats with capabilities that surpass typical B2B solutions.
- Worked with Teradata utilities such as BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to and from different source systems, including flat files.
- Experience in performance tuning of mappings, identifying and resolving bottlenecks at the source, target, and mapping levels.
- Experience in all phases of Software Development Lifecycle (SDLC) using Waterfall, Agile and Scrum Methodologies.
- Thorough understanding of design, data profiling, data cleansing, data matching, and implementation of data quality services.
- Extensive Knowledge on the development life cycle process from requirements gathering to deployment of code to production.
- Experienced in database design, data analysis, development, SQL performance tuning, data warehousing ETL process and data conversions.
- Experience in UNIX Operating System and Shell scripting.
- Hands-on experience with scheduling tools such as Control-M, Autosys, and ActiveBatch.
- Experience in DVO data validation by creating complex views against the target data.
- Ability to communicate requirements effectively to team members and manage applications. Extensive experience in providing 24/7 on-call support.
ETL Tools: Informatica Power Center 9.X/8.X/7.X/6.X, Power Exchange 9.X/8.X, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), ILM Work Bench, DVO (data validation tool).
Data Modeling Tools: Data Mart, Dimensional, Snow Flake, Star Schema, CA Erwin 4.0/3.5.5/3.5.2, Microsoft Visio
RDBMS: Oracle 11g/10g, Teradata, SQL Server 2012/2008, DB2, Netezza
O/S: Microsoft Windows 98/NT/2000/XP, Sun Solaris, UNIX, Linux Red Hat
Scheduling Tools: Control - M, Autosys, Tivoli, Active Batch
Confidential, Nashville, TN
Environment: Informatica Power Center 9.5, Informatica Data Quality, Informatica B2B, Oracle 11g, SQL Server 2012, DVO (data validation tool), PL/SQL, MSBI Stack, SQL*Loader, XML, Toad, Unix, WinSCP, B2B DX/DT, Teradata 13.X, Active Batch.
- Extensively used Informatica Power Center 9.5 to extract data from various sources and load in to staging and target database.
- Worked with the Informatica Data Quality (IDQ) toolkit, using its analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities.
- Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
- Involved in migrating data from flat files, SQL Server, XML files, COBOL files, mainframe sources, and SAP sources to a Teradata target database.
- Extensively worked with XML files as the Source and Target, used transformations like XML Generator and XML Parser to transform XML files, used Oracle XMLTYPE data type to store XML files.
- Involved in complete SDLC phase in collecting and documenting requirements. Prepared technical design/specifications for data Extraction, Transformation and Loading.
- Extensively designed, developed, and tested Informatica mappings to extract data from external flat files and Oracle tables using Informatica.
- Extensively used Mapping Variables, Mapping Parameters, Workflow variables and Session Parameters.
- Extensively used the Expression, Router, Filter, Lookups (Connected/Unconnected), Update strategy and Aggregator transformations.
- Used Type 1 SCD and Type 2 SCD mappings to update slowly changing Dimension tables.
- Implemented Informatica pass-through partitions extensively to improve the performance of the mappings.
- Performed Informatica code migration from development to testing and testing to production systems.
- Contributed to Data Migration and the Implementation of Roadmap defined by the business case.
- Wrote Shell Scripts for event automation and scheduling.
- Created UNIX scripts to automate activities such as starting, stopping, and aborting Informatica workflows using the pmcmd command.
- Created the ETL exception reports and validation reports after the data is loaded into the warehouse database.
- Closely worked with the reporting team to ensure that correct data is presented in the reports.
- Provide technical assistance to the team on various MS BI stack technologies.
- Wrote, tested, and implemented Teradata FastLoad, MultiLoad, TPump, and BTEQ scripts, including DML and DDL, to load data.
- Migrated workflows, mappings, and repository objects from development to QA to production.
- Used the Pushdown Optimization technique to push transformation logic to the source and target databases to increase session performance.
- Used various Informatica Error handling techniques to debug failed session.
- Implemented standards for naming Conventions, Mapping Documents, Technical Documents, and Migration form.
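The pmcmd-based workflow automation described in this role can be sketched as a small shell wrapper. The integration service (INT_SVC), domain (DOM_DEV), folder (ETL_FOLDER), and workflow names below are placeholders, and the final command is echoed rather than executed so the sketch runs on a host without Informatica installed; on a real server the `echo` would be dropped.

```shell
#!/bin/sh
# Sketch of a pmcmd wrapper for starting/stopping/aborting workflows.
# INT_SVC, DOM_DEV, ETL_FOLDER, and wf_LOAD_DW are placeholder names.

ACTION="${1:-start}"          # start | stop | abort
WF="${2:-wf_LOAD_DW}"         # placeholder workflow name

# Map the requested action onto the matching pmcmd subcommand.
case "$ACTION" in
  start) CMD=startworkflow ;;
  stop)  CMD=stopworkflow ;;
  abort) CMD=abortworkflow ;;
  *) echo "Usage: $0 {start|stop|abort} <workflow>" >&2; exit 1 ;;
esac

# Dry run: echo the command instead of executing it, so the sketch
# works anywhere. Remove 'echo' on an actual Informatica host.
echo pmcmd "$CMD" -sv INT_SVC -d DOM_DEV \
     -u "$INFA_USER" -p "$INFA_PWD" -f ETL_FOLDER -wait "$WF"
```

Wrapping pmcmd this way lets a scheduler such as Active Batch or Control-M call one script with an action and workflow name instead of embedding connection details in every job.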
Confidential, Detroit, MI
Senior Informatica Developer
Environment: Informatica Power Center 9.1, Informatica Data Quality, ILM Work Bench, Oracle 10g/9i, PL/SQL, Toad, Flat files, XML, SQL Server 2008, Unix shell scripting, Power exchange, Control M.
- Performed installation, configuration, applying hot fixes, patches and version upgrades of Informatica products and provided on-call support for the ETL applications.
- Responsible for all aspects of data warehouse operations using Informatica 9.1, Microsoft SQL Server, Oracle and DB2 environments.
- Worked with the business analysts and the DBA on requirements gathering, business analysis, testing, metrics, and project coordination.
- Created Mappings using Mapping Designer to load the data from various sources, using different transformations like Source Qualifier, Expression, Lookup (Connected and Unconnected), Aggregator, Update Strategy, Joiner, Filter, and Sorter transformations.
- Migrated code to QA & Production environments and monitored the jobs.
- Enforced coding standards, best practices and formalized the code reviews.
- Developed complex Informatica mappings, mapplets, transformations and work flows.
- Worked extensively on the performance tuning of ETL workflows and SQL scripts.
- Provided on-call support to resolve production issues in time and meet the SLA.
- Worked with the Informatica Data Quality (IDQ) toolkit. Designed IDQ mappings that were used as mapplets in Power Center.
- Performed data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
- Developed and re-engineered database objects like Tables, Views, Stored Procedures, Triggers, Functions, Indexes and Constraints in T-SQL and PL/SQL.
- Involved in migration project to migrate data from data warehouses on Oracle/DB2 to Teradata.
- Designed data warehouse and data marts using Relational and Dimensional data modeling.
- Created technical specifications using the Informatica Analyst tool and Excel spreadsheets.
- Supported Business Objects reports with Informatica Data Services and PowerCenter.
Confidential, Dallas, TX
Environment: Informatica Power Center 8.6/9.1, Oracle 10g, Windows NT, Flat files, TOAD, SQL, PL/SQL, XML, MS Access, SQL Server 2008, Active Batch.
- Created new mappings and updated old mappings according to changes in business logic.
- Performed major role in understanding the business requirements and designing and loading the data into data warehouse (ETL).
- Used Informatica client tools - Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, and Transformation Developer for defining Source & Target definitions and coded the process of data flow from source system to data warehouse.
- Performed extraction, transformation and loading of data from RDBMS tables and Flat File sources into Oracle RDBMS in accordance with requirements and specifications.
- Extensively used various transformations such as Lookup, Update Strategy, Expression, Aggregator, Filter, Stored Procedure, and Joiner.
- Performed Unit Testing and tuned the mappings for better performance.
- Created reusable transformations and mapplets.
- Used SQL to test various reports and ETL Jobs load in development, testing and production.
- Responsible for generating weekly progress reports and updates for the project lead, including test scenario status, concerns, and outstanding functionality.
- Wrote PL/SQL procedures (e.g., days-of-supply calculations) to process business logic in the database.
- Worked on SQL tools like TOAD to run SQL queries to validate the data.
- Worked on database connections and SQL joins at the database level.
- Extensively used SQL to load data from flat files to database tables in Oracle.
- Used Workflow manager for session management, database connection management and scheduling of jobs to be run in the batch process.
- Developed workflow tasks like reusable Email, Event wait, Timer, Command, Decision.
- Created Oracle Stored Procedure to implement complex business logic for good performance and called from Informatica using Stored Procedure transformation.
- Used UNIX shell scripting to schedule Informatica workflows.
- Used PMCMD command to start, stop and ping server from UNIX and created UNIX Shell scripts to automate the process.
- Created UNIX shell scripts and invoked them as pre-session and post-session commands. Used WinSCP for FTP from Windows.
- Used calculations, variables, sorting, drill-down, and slice-and-dice features to create the Stock Status report.
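A pre-session command like the ones above typically verifies that the expected source flat file has arrived and is non-empty before the session runs. The sketch below creates its own stand-in file under /tmp so it is self-contained; the path and record layout are illustrative, not from an actual project.

```shell
#!/bin/sh
# Sketch of a pre-session flat-file check. SRC is a placeholder path;
# a real script would point at the landing directory for the feed.
SRC=/tmp/demo_src.dat

# Stand-in for the delivered file, so the sketch runs anywhere.
printf 'row1|2015-01-31|100\n' > "$SRC"

# -s: file exists and has a size greater than zero.
if [ -s "$SRC" ]; then
  STATUS="OK"
  echo "OK: $SRC present and non-empty"
else
  STATUS="MISSING"
  echo "MISSING: $SRC" >&2
fi
```

Failing fast here (a non-zero exit from a pre-session command) stops the Informatica session before it processes a missing or empty feed.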
Environment: Oracle 8i, SQL, PL/SQL, SQL*Plus, HP-UX 10.20, Informatica Power Center 8.6, DB2, Cognos, Windows 2000.
- Developed mapping using Mapping Designer & worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure and Sequence Generator transformations.
- Implemented Slowly Changing Dimensions of Type 1 and Type 2 to store history as per business needs.
- Used Parameter files to pass mapping and session parameters to the session.
- Tuned the Informatica mappings to reduce the session run time.
- Developed PL/SQL procedures to update the database and to perform calculations.
- Worked with SQL*Loader to load data into the warehouse.
- Developed list reports and chart reports in Cognos Reportnet.
- Contributed to the design and development of Cognos framework model.
- Wrote UNIX shell scripts to work with flat files, to define parameter files and to create pre and post session commands.
- Performed Unit testing and System testing of Informatica mappings.
- Involved in migrating the mappings and workflows from Development to Testing and then to Production environments.
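Informatica parameter files such as the ones referenced above use one bracketed section per folder/workflow/session, with $$ mapping parameters and $ session parameters beneath it. A minimal illustrative fragment (all folder, workflow, session, and connection names here are hypothetical):

```
[ETL_FOLDER.WF:wf_LOAD_DW.ST:s_m_LOAD_SALES]
$$LOAD_DATE=2015-01-31
$$SOURCE_SYS=ORDERS
$DBConnection_Source=ORA_SRC
$DBConnection_Target=ORA_DW
```

A session picks this file up through the parameter-file attribute on the session or workflow, or via the -paramfile option when the workflow is started with pmcmd.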