- 8+ Years of IT experience in Data Warehouse/Data Mart design with extensive knowledge of the SDLC: Data Analysis, Design, Development, Implementation, and Testing using Data Extraction, Data Transformation, and Data Loading (ETL) with Informatica PowerCenter 9.6.1/9.5.1/8.1.1/7.x, and maintenance of Data Warehouses in Teradata, Oracle, and Netezza using SQL and PL/SQL programming.
- Experience in working with Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer, Worklet Designer, Gantt Chart, Task View, Mapplets, Mappings, Workflows, Sessions, Reusable Transformations, Shortcuts, and Import and Export utilities.
- Experience in Data Warehouse development working with Extraction/Transformation/Loading using Informatica Power Mart/Power Center with flat files, Oracle, SQL Server, and Teradata.
- Thorough knowledge of various tools and technologies, including different versions of Informatica PowerCenter (10.1/9.6.1/9.1) and Informatica Developer.
- Experience working on Data quality tools Informatica IDQ 9.1.
- Experience working in multi-terabytes data warehouse using Databases like Oracle 11g/10g/9i, MS Access 2000/2002, XML, IBM UDB DB2 8.2, SQL Server 2008, MS Excel and Flat files.
- Experience in Relational Modeling and Dimensional Data Modeling using Star and Snowflake schemas, Normalization, De-normalization, and Aggregations.
- Very strong in SQL and PL/SQL, extensive hands on experience in creation of database tables, triggers, sequences, functions, procedures, packages, and SQL performance-tuning.
- Proficiency in data warehousing techniques such as data cleansing, Slowly Changing Dimensions, surrogate key assignment, and change data capture.
- Good understanding of ETL/Informatica standards and best practices, including Slowly Changing Dimensions (SCD1, SCD2, and SCD3).
- Experience in testing coordination: writing test cases, executing test scripts, and logging defects in Quality Center (QC).
- Working closely with ETL developers and other leads during development and support of BI application.
- Experience with Data Extraction, Transformation, and Loading (ETL) from disparate data sources and multiple relational databases such as Oracle and DB2-UDB; integrated data from flat files, CSV files, and XML files into a common reporting and analytical data model using Erwin.
- Creation of tables, packages, mappings, batch jobs, roles and users in Informatica MDM Hub.
- Created personalized version of reports as well as statements for customers using the data from Informatica metadata and then generated Business Objects reports using slice and dice capabilities.
- Worked extensively with various kinds of queries, such as Sub-Queries, Correlated Sub-Queries, and Union Queries, for query tuning.
- Extensively worked on Data migration, Data cleansing and Data Staging of operational sources using ETL processes and providing data mining features for data warehouses.
- Hands-on experience with Informatica Power Exchange and Informatica IDQ.
- Hands-on experience using query tools such as TOAD, PL/SQL Developer, Teradata SQL Assistant, and Queryman.
- Developed UNIX scripts for dynamic generation of Files & for FTP/SFTP transmission.
- Developed Complex mappings from various transformation logics like Unconnected /Connected lookups, Router, Filter, Expression, Aggregator, Normalizer, Joiner, Union, Update Strategy.
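The Slowly Changing Dimension handling mentioned above (SCD Type 2) boils down to expiring the current dimension row on change and inserting a new version. A minimal Python sketch of that logic (an illustration only, not tied to any specific Informatica mapping; the `cust_id`/`city` column names are hypothetical):

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Apply SCD Type 2: expire the current row on change, insert a new version.

    `dimension` is a list of dicts keyed by business key `cust_id` with a tracked
    attribute `city` and effective-dating columns; `incoming` is the latest
    source snapshot.
    """
    today = today or date.today()
    for row in incoming:
        current = next((d for d in dimension
                        if d["cust_id"] == row["cust_id"] and d["end_date"] is None),
                       None)
        if current is None:
            # New business key: insert the first version.
            dimension.append({**row, "start_date": today, "end_date": None})
        elif current["city"] != row["city"]:
            # Tracked attribute changed: close the old version, open a new one.
            current["end_date"] = today
            dimension.append({**row, "start_date": today, "end_date": None})
        # Unchanged rows require no Type 2 action.
    return dimension

dim = [{"cust_id": 1, "city": "Austin",
        "start_date": date(2020, 1, 1), "end_date": None}]
dim = apply_scd2(dim, [{"cust_id": 1, "city": "Dallas"}], today=date(2021, 6, 1))
```

After the run the dimension holds both the expired Austin row and the new active Dallas row, preserving history.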
Operating System: UNIX, Windows, MS-DOS
Language/Tools: SQL, PL/SQL, C, C++
Scheduling Tools: Autosys, Control-M, Informatica Scheduler
ETL Tools: Informatica Power Center 10.x/9.x/8.x, ETL Informatica Cloud, SSIS
Database: MS SQL Server, Oracle 8i/9i/10g, RDBMS DB2, Netezza, Teradata, PostgreSQL, Redshift
Scripting: Shell Scripting, Python
Data Modeling Tools: Microsoft Visio, ERWIN 9.3/7.5
Data Modeling: ER (OLTP) and Dimensional (Star, Snowflake Schema)
Data Profiling Tools: Informatica IDQ 10.0, 9.5.1, 8.6.1
Tools & Utilities: TOAD, SQL Developer, SQL*Loader, PuTTY
Cloud Computing: Amazon Web Services (AWS), S3, RDS, Redshift, SNS
Other Tools: Notepad++, Toad, SQL Navigator, Teradata SQL Assistant, Rally, AWS CLI
Defect Tracking Tools: ALM, Quality Center
Reporting Tools: IBM Cognos, Tableau 9
Confidential, Centreville, VA
Sr. Informatica/ETL Developer
- Coordinated with business analysts to analyze the business requirements and designed and reviewed the implementation plan.
- Responsible for designing, developing, and testing the processes necessary to extract data from operational databases, transform it, and load it into the data warehouse using Informatica PowerCenter.
- Followed ETL standards -Audit activity, Job control tables and session validations.
- Created Complex Mappings to load data using transformations like Source Qualifier, Expression, Aggregator, Dynamic Lookup, Connected and unconnected lookups, Joiner, Sorter, Filter, Stored Procedures, Sequence, Router and Update Strategy.
- Created different jobs using UNIX shell scripting to call the workflow by using Command tasks.
- Wrote Oracle SQL queries for joins and other table modifications.
- Designed and developed complex Informatica mappings, including SCD Type 2 (Slowly Changing Dimension Type 2).
- Worked on complex mapping for the performance tuning to reduce the total ETL process time.
- Extensively used TOAD to test and debug SQL and PL/SQL scripts, packages, stored procedures, and functions.
- Designed and developed ETL code using Informatica mappings to load data from heterogeneous source systems such as flat files, XMLs, and CSV files into Oracle staging tables, then into the data warehouse, and then into Data Mart tables for reporting.
- Extracted and transformed data from various sources like Teradata and relational databases (Oracle, SQL Server).
- Analyzed source data coming from multiple source systems. Designed and developed the data warehouse model in a flexible way to cater to future business needs.
- Ability to analyze existing systems, conceptualize and design new ones, and deploy innovative solutions with high standards of quality.
- Developed ETL code to extract data from multiple sources and load it into the data warehouse using Informatica, and loaded data into AWS Redshift.
- Involved in enhancements and maintenance activities of the data warehouse including tuning, code enhancements.
- Automated and scheduled Oracle and Informatica batch jobs using the Control-M application, with file watchers, daily and weekly schedules, and special on-demand requests.
- Parameterized all variables and connections at all levels in UNIX.
- Performed Developer testing, Functional testing, Unit testing and created Test Plans and Test Cases.
- Create Unit Test Case document and capture Unit test results for each source system.
- Working in Agile environment and experienced with daily stand-ups, sprints, and tracking stories using JIRA application.
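The UNIX jobs above that launch workflows typically wrap a `pmcmd startworkflow` call. A minimal sketch of assembling that command line (the service, domain, folder, and workflow names are hypothetical; the function only builds the argument list and does not execute anything):

```python
def build_pmcmd_start(service, domain, user, folder, workflow, pwd_var="PM_PWD"):
    """Assemble a pmcmd startworkflow command as an argument list.

    Uses -pv so the password is read from an environment variable
    rather than appearing on the command line.
    """
    return [
        "pmcmd", "startworkflow",
        "-sv", service,        # Integration Service name
        "-d", domain,          # Informatica domain
        "-u", user,            # repository user
        "-pv", pwd_var,        # env var holding the password
        "-f", folder,          # repository folder
        "-wait",               # block until the workflow completes
        workflow,              # workflow name comes last
    ]

cmd = build_pmcmd_start("IS_DEV", "Domain_DEV", "etl_user",
                        "DWH_LOADS", "wf_daily_load")
```

A wrapper script would then pass `cmd` to the shell (e.g. via `subprocess.run`) and branch on the exit code to drive downstream Command tasks.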
Environment: Informatica PowerCenter 10, Control-M, Oracle 11g, Toad, Redshift, Razor SQL, WinSCP, Composite, UNIX and TWS.
Confidential, Parsippany, NJ
Sr. Informatica Developer
- Interacted with end users to gather business requirements and reporting needs, and created the Business Requirement Document.
- Parsed high-level design specification to simple ETL coding and mapping standards.
- Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
- Created mapping documents to outline data flow from sources to targets.
- Extracted the data from the flat files, DB2, SQL server and other RDBMS databases into staging area and populated onto Data warehouse. Worked on Flat Files and XML, DB2, Oracle as sources.
- Used Type 1 SCD and Type 2 SCD mappings to update Slowly Changing Dimension tables.
- Used Debugger to test the mappings and fixed the bugs.
- Wrote UNIX shell Scripts & PMCMD commands for FTP of files from remote server and backup of repository and folder.
- Maintained stored definitions, transformation rules and targets definitions using Informatica repository Manager.
- Generated ABAP programs to load data into Oracle from SAP source systems.
- Customized ABAP programs according to business requirements to load data from SAP source systems.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Developed mapping parameters and variables to support SQL override.
- Worked on performance tuning by creating views in Oracle and implemented transformation logics in database using views.
- Developed mappings to load into staging tables and then to Dimensions and Facts.
- Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
- Extensively used SQL* loader to load data from flat files to the database tables in Oracle.
- Modified existing mappings for enhancements of new business requirements.
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Prepared migration document to move the mappings from development to testing and then to production repositories.
- End-to-end ETL development of the Data Mart. Data Quality Analysis to determine cleansing requirements. Designed and developed Informatica mappings for data loads.
- Created various rules in IDQ to satisfy Completeness, Conformity, Integrity, and Timeliness requirements.
- Cleansed, standardized, labeled, and fixed data gaps in IDQ, checking against reference tables to resolve major business issues.
- Identified issues, performance bottlenecks, and optimized the Business Intelligence Dashboards and Reports.
- Exposed IDQ mappings/mapplets as web services.
- Worked on enhancements for stored procedures and IDQ web services.
- Worked on Informatica advanced concepts, including implementation of Informatica Pushdown Optimization.
- Performed Source System Data analysis as per the Business Requirement. Distributed data residing in heterogeneous data sources is consolidated onto target Enterprise Data Warehouse database.
- Designed ETL Flow diagram in Visio to arrive at the schedule for TWS Jobs
- Designed and developed ETL strategies and mappings from source systems to target systems. ETL strategies were designed to cater initial load and incremental load.
- Tested all the modules and transported data to target Warehouse tables, scheduled, ran extraction and load process and monitor sessions and batches by using Informatica Workflow Manager and log files.
- Precise Documentation was done for all mappings and workflows.
- Responsible for ETL process under development, test and production environments.
- Wrote test plans, test cases, test scripts, and test scenarios for quality releases in the SOA and maintenance releases.
- Wrote test plans and executed them during unit testing; also supported system testing, volume testing, and user testing, and provided production support by monitoring daily processes.
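The initial-versus-incremental load strategy mentioned above usually hinges on a last-extract watermark: with no watermark, take everything; afterwards, take only the delta. A minimal Python sketch (column and variable names are hypothetical):

```python
from datetime import datetime

def select_for_load(source_rows, last_extract=None):
    """Choose rows for the current ETL run.

    With no watermark (first run) every row is taken: the initial load.
    Afterwards only rows updated since `last_extract` are taken: the
    incremental load.
    """
    if last_extract is None:
        return list(source_rows)                   # initial (full) load
    return [r for r in source_rows
            if r["updated_at"] > last_extract]     # incremental delta

rows = [
    {"id": 1, "updated_at": datetime(2023, 1, 1)},
    {"id": 2, "updated_at": datetime(2023, 3, 15)},
]
full = select_for_load(rows)                            # first run: both rows
delta = select_for_load(rows, datetime(2023, 2, 1))     # later run: row 2 only
```

In a PowerCenter setup the watermark would typically come from a job control table or a mapping variable persisted between runs.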
Environment: Informatica PowerCenter 9.6.1, Control-M, Oracle 11g, SAP, Toad, DB2, WinSCP, WinSQL, ERWIN, UNIX and TWS.
Confidential, Portland, OR
- Provided technical leadership and developed new business opportunities.
- Supported day to day activities of the Data Warehouse.
- Analyzed the business requirements and functional specifications.
- Extracted data from oracle database and spreadsheets and staged into a single place and applied business logic to load them in the central oracle database.
- Used Informatica Power Center 9.1/8.6 for extraction, transformation and load (ETL) of data in the data warehouse.
- Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression and Lookup, Update strategy and Sequence generator and Stored Procedure.
- Developed complex mappings in Informatica to load the data from various sources.
- Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
- Parameterized the mappings and increased the re-usability.
- Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
- Created procedures to truncate data in the target before the session run.
- Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
- Used the PL/SQL procedures for Informatica mappings for truncating the data in target tables at run time.
- Experienced with Teradata utility scripts such as BTEQ, FastLoad, MultiLoad, and FastExport to load data from various source systems into Teradata.
- Working knowledge of Oracle 11g, SQL Server, Teradata, Netezza, DB2, MySQL, and UNIX shell scripting.
- Extensively used Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs.
- Involved in Unit testing, System testing to check whether the data loads into target are accurate.
- Experience in building, enhancing, and managing Universes, and in creating WebI, DeskI, complex, ad-hoc, and canned reports and charts using Business Objects.
- Experience in complex data analysis in Designer, used data from multiple data providers, performed extensive data analysis with complex queries, drill up, drill down, slice and dice.
- Strong hands-on experience in creating and enhancing Universes using BO Designer, and in creating and maintaining reports using Business Objects, Crystal Reports, Web Intelligence, and Desktop Intelligence. Wrote documentation to describe program development, logic, coding, testing, changes, and corrections.
- Created Test cases for the mappings developed and then created integration Testing Document.
- Followed Informatica recommendations, methodologies and best practices.
- Fine-tuned Informatica transformations and workflows for better performance.
- Involved in the performance tuning of Informatica mappings by using Informatica Push Down optimization (PDO).
- Using SSRS, deployed and generated quarterly reports onto the Report Server for access through a browser.
- Created drill-through, drill-down, cross-tab reports and sub-reports using SSRS.
- Formatted the SSRS reports using global variables, expressions, and functions.
- Created job performance reports in SSRS that query system tables to track the duration of each job and its weekly average duration.
- Created reports in SSRS using different types of properties, such as chart controls, filters, interactive sorting, and SQL parameters.
- Installed and configured Informatica Power Center 9.1 and 8.6
- Migrated Informatica code and managed security and permissions.
- Designed, created and modified SQL database objects.
- Developed data models from technical and functional specifications.
- Excellent knowledge in data analysis, data cleansing and data validation.
- Represented Informatica and SQL server on Enterprise architecture board.
- Developed SSIS packages to load data into warehouse and benchmark the performance.
- Supported multiple lines of business and IT initiatives.
- Used Informatica Metadata Manager to show data lineage.
- Collaborated with remote offshore team, creating the requirement documents, verifying coding standards and conducting code reviews.
Environment: Informatica PowerCenter 9.1/8.6, Power Exchange, DB2, Tivoli, SQL Server 2005/2008, Linux.
Confidential, Nashville, TN
- Created Informatica mappings using various transformations such as XML, Source Qualifier, Expression, Lookup, Stored Procedure, Aggregator, Update Strategy, Joiner, Normalizer, Union, Filter, and Router in Informatica Designer.
- Extensively worked with the Teradata database using BTEQ scripts.
- Involved in all phases of the SDLC: designing, coding, testing, and deploying ETL components of the data warehouse and integrated Data Mart.
- Created subscriptions for source to target mappings and replication methods using IBM CDC tool.
- Used NZSQL scripts and NZLOAD commands to load data.
- Experience in Data Stage Upgrade and Migration projects - from planning to execution.
- Analyzed heterogeneous data from various systems like pm and Salesforce.com and validated it in the ODS (Operational Data Store).
- Worked with Informatica IDQ Data Analyst and Developer, using various data profiling techniques to cleanse and match/remove duplicate data.
- Identified and eliminated duplicate datasets and performed Columns, Primary Key, Foreign Key profiling using IDQ.
- Designed and developed IDQ solutions for data profiling. Implemented Address Doctor as Address Validator transformation for data profiling in IDQ.
- Worked extensively with Netezza scripts to load data from flat files into the Netezza database.
- Created mappings using pushdown optimization to achieve good performance in loading data into Oracle and Teradata.
- Developed various SQL queries using joins, sub-queries & analytic functions to pull the data from various relational DBs i.e. Oracle, Teradata & SQL Server.
- Created Web Services mappings for consumer and provider; used the Web Services Consumer transformation and XML Parser to parse incoming data.
- Created and edited custom objects and custom fields in Salesforce and checked field-level security.
- Worked in Informatica Cloud to replicate data from Salesforce.
- Responsible for writing Unix Shell Scripts to schedule the jobs.
- Involved in analyzing, defining, and documenting data requirements by interacting with the client and the Salesforce team about the Salesforce objects.
- Worked with Cleanse, Parse, Standardization, Validation, and Scorecard transformations.
- The team picked up data extract TXT files from the MBOX server for processing into the SAP system; the SAP system sent error/reconciliation data back to the MBOX server in TXT file format.
- Created pre-session, post-session, pre-SQL, and post-SQL commands in Informatica.
- Used UNIX scripts for file management as well as in the FTP process.
- Worked closely with DBAs, application, database, and ETL developers, and change control management to migrate developed mappings to PROD.
- Provided production support for the Informatica process; troubleshot and debugged any errors.
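As one illustration of the join/sub-query work described above, a correlated sub-query can often be rewritten as a join against a pre-aggregated derived table, which many optimizers handle in a single aggregation pass. The sketch below uses SQLite purely for demonstration; the `orders` table and its columns are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, cust_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10, 50.0), (2, 10, 150.0), (3, 20, 75.0);
""")

# Correlated sub-query: the inner SELECT is re-evaluated per outer row.
correlated = conn.execute("""
    SELECT o.order_id FROM orders o
    WHERE o.amount = (SELECT MAX(amount) FROM orders i
                      WHERE i.cust_id = o.cust_id)
""").fetchall()

# Equivalent join against a pre-aggregated derived table.
joined = conn.execute("""
    SELECT o.order_id
    FROM orders o
    JOIN (SELECT cust_id, MAX(amount) AS max_amt
          FROM orders GROUP BY cust_id) m
      ON o.cust_id = m.cust_id AND o.amount = m.max_amt
""").fetchall()

assert sorted(correlated) == sorted(joined)  # same result set, different plan
```

Whether the rewrite is faster depends on the database and statistics, which is why explain plans are checked before and after such tuning.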
Environment: Informatica Data Quality 9.1.0/9.5.1, Flat Files, Mainframe Files, Oracle 11i, Netezza, Quest Toad Central 9.1, UNIX Shell Scripting, Windows 2000/2003, SQL Server 2005/2008, Salesforce.com, Web Services.
DBA /ETL Developer
- Performed business analysis and requirements gathering, and converted requirements into technical specifications.
- Involved in designing the data mart (star schema dimensional modeling) after analyzing various source systems and the final Business Objects reports.
- Designed and developed all the slowly changing dimensions to hold history data in the data mart.
- Developed all the ETL data loads in Informatica PowerCenter to load data from the source database into various dimensions and facts in the MIS data mart.
- Implemented Slowly Changing Dimensions (Type 2) while loading data into dimension tables to hold history.
- Worked with power center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Used Informatica data services to profile and document the structure and quality of all data.
- Extensively used Informatica Transformation like Source Qualifier, Rank, SQL, Router, Filter, Lookup, Joiner, Aggregator, Normalizer, Sorter etc. and all transformation properties.
- Extensively used Various Data Cleansing and Data Conversion Functions in various transformations.
- Translated Business processes into Informatica mappings for building Data marts by using Informatica Designer which populated the Data into the Target Star Schema on Oracle 10g Instance.
- Followed the required client security policies and required approvals to move the code from one environment to other.
- Worked on Informatica Cloud to create Source/Target SFDC connections and to monitor and synchronize data in SFDC.
- Worked on SFDC session log error files to look into errors and debug issues.
- Worked with Informatica Power Exchange as well as Informatica Cloud to load data into Salesforce.com.
- Developed Informatica mappings, mapping configuration tasks, and taskflows using Informatica Cloud Services (ICS).
- Extensively used PL/SQL programming in backend and front-end functions, procedures, packages to implement business rules.
- Created Informatica complex mappings with PL/SQL procedures/functions to build business rules to load data.
- Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.
- Created automated scripts to perform data cleansing and data loading.
- Performed complex defect fixes in various environments, such as UAT and SIT, to ensure proper delivery of the developed jobs into the production environment.
- Attended daily status call with internal team and weekly calls with client and updated the status report.
- Created reusable transformations, Mapplets and used them in the mappings and workflows
- Designed Control-M scheduler jobs to invoke the UNIX shell scripts
- Supported various testing cycles during the SIT & UAT phases.
- Involved in creation of initial data set up in the Production environment and involved in code migration activities to Production.
- Helped during the migration of reports from Business Objects 3.1 to Business Intelligence 4.1 and performed regression testing on many of the reports after migration.
- Supported the daily/weekly ETL batches in the Production environment
- Responded promptly to business user queries and change requests.
Environment: Informatica PowerCenter 9.x, Informatica Cloud, Oracle 10g, PL/SQL, Teradata, Linux, Control-M.
Confidential, St. Louis, MO
- Developed internal and external Interfaces to send the data in regular intervals to Data warehouse systems.
- Extensively used Power Center to design multiple mappings with embedded business logic.
- Involved in discussion of user and business requirements with business team.
- Performed data migration in different sites on regular basis.
- Involved in upgrade of Informatica from 9.1 to 9.5.
- Portfolio Management Enhancement: analyzed and developed business requirements; designed and created the database and processes to load data from Broadridge using Informatica, SQL Server, etc.; fixed defects; wrote complex stored procedures to generate data for PM web reports.
- Created complex mappings using Unconnected Lookup, Sorter, and Aggregator and Router transformations for populating target tables in efficient manner.
- Attended the meetings with business integrators to discuss in-depth analysis of design level issues.
- Provided work-bucket hour estimation and budgeting for each story (Agile process) and communicated status to the PM.
- Was responsible for Performance Tuning at the transformation Level and Session level.
- Creation of tables, packages, mappings, batch jobs, roles and users in Informatica MDM Hub.
- Worked with business users and analysts to understand the requirements for mastering the data obtained from various sources and loading the golden records to the targetMDM database.
- Involved in MDM Design and Developmental activities.
- Added new sources to the existing MDM implementation.
- Experienced in creating and configuring Address Doctor and Identity Match reference data components for the MDM Hub and Cleanse Server(s).
- Experienced in integrating Informatica MDM with Informatica PowerCenter and IDQ.
- Involved in data design and modeling by specifying the physical infrastructure, system study, design, and development.
- Extensively involved in performance tuning of the Informatica ETL mappings by using the caches and overriding the SQL queries and by using Parameter files.
- Analyzed session log files in session failures to resolve errors in mapping or session configuration.
- Written various UNIX shell Scripts for scheduling various data cleansing scripts, loading process and automating the execution of maps.
- Created transformations like Expression, Lookup, Joiner, Rank, Update Strategy and Source Qualifier Transformation using the Informatica designer.
- Created mapplets and used them in different mappings.
- Written PL/SQL Procedures and functions and involved in change data capture (CDC) ETL process.
- Implemented Slowly Changing Dimension Type II for different Dimensions.
- Involved in the Informatica, Teradata, and Oracle upgrade process and tested the environment during the upgrade.
- Written Unit test scripts to test the developed interfaces.
- Managed enhancements and coordinated each release of Informatica objects.
- Provided support for the production department in handling the data warehouse.
- Worked under Agile methodology and used the Rally tool to track tasks.
- Written thorough design docs, unit test documentation, Installation and configuration guide documents.
- Performed bulk data imports and created stored procedures, functions, views and queries.
Environment: Informatica MDM, Autosys, Oracle 11g, SAP, Toad, WinSQL, ERWIN, UNIX.
- Co-ordinated Joint Application Development (JAD) sessions with Business Analysts and source developer for performing data analysis and gathering business requirements.
- Developed technical specifications of the ETL process flow.
- Designed the Source-to-Target mappings and was involved in designing the Selection Criteria document.
- Worked on design and development of Informatica mappings, workflows to load data into staging area, data warehouse and data marts in Teradata.
- Used Informatica PowerCenter to create mappings, sessions and workflows for populating the data into dimension, fact, and lookup tables simultaneously from different source systems (SQL server, Oracle, Flat files).
- Created mappings using various Transformations like Source Qualifier, Aggregator, Expression, Filter, Router, Joiner, Stored Procedure, Lookup, Update Strategy, Sequence Generator and Normalizer.
- Deployed reusable transformation objects such as Mapplets to avoid duplication of metadata, reducing the development time.
- Used T-SQL for querying the SQL Server 2000 database for data validation and data conditioning.
- Worked extensively with SSIS to import, export, and transform data between systems.
- Implemented Informatica Framework for (dynamic parameter file generation, start, failed and succeeded emails for an integration, Error handling and Operational Metadata Logging).
- Implemented sending of Post-Session Email once data is loaded.
- Worked with DBA for partitioning and creating indexes on tables used in source qualifier queries.
- Involved in Performance/Query tuning. Generation/interpretation of explain plans and tuning SQL to improve performance.
- Scheduled various daily and monthly ETL loads using Autosys.
- Involved in writing UNIX shell scripts to run and schedule batch jobs.
- Involved in unit testing and documentation of the ETL process.
- Involved in Production Support in resolving issues and bugs.
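The dynamic parameter file generation in the framework described above can be sketched as follows. This is a simplified illustration; the folder, workflow, and parameter names are hypothetical, though the `[Folder.WF:workflow]` section heading follows the usual PowerCenter parameter-file layout:

```python
def render_param_file(folder, workflow, params):
    """Render one workflow section of a PowerCenter-style parameter file.

    `params` maps parameter/variable names (e.g. $$RUN_DATE) to values;
    each becomes a name=value line under the workflow's section heading.
    """
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"{name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"

text = render_param_file(
    "DWH_LOADS", "wf_daily_load",
    {"$$RUN_DATE": "2023-06-01", "$DBConnection_SRC": "ORA_SRC_DEV"},
)
```

A scheduler step would write `text` to the path named in the session's parameter filename before each run, so run dates and connections change without touching the mapping.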
Environment: Informatica PowerCenter 8.6, PL/SQL, Oracle 10g/9i, Erwin, Autosys, SQL Server 2005, Sybase, UNIX AIX.