- 7+ years of experience in the Data Warehouse/Business Intelligence domain on projects involving Informatica and other ETL tools.
- Extensive experience working with Data Warehouses and Data Marts using Informatica Power Center 10.x, 9.x, 8.x, 7.x (Designer, Repository Manager, Workflow Manager and Workflow Monitor).
- Responsible for interacting with different Business Partners, Vendors & Customers to identify information needs and business requirements for reports.
- Served as a mentor, training newly hired contractors on business knowledge and data flow concepts.
- Knowledge in designing Dimensional models for Data Mart and Staging Database.
- Extensive knowledge in designing functional and detailed design documents for data warehouse development.
- Experience in writing UNIX Shell scripts.
- Experience in writing Stored Procedures and Functions (PL/SQL and T-SQL).
- Performance tuning of Oracle using SQL Trace, explain plans, SQL hints, Oracle partitioning, various index and join types, and PL/SQL tuning.
- Working knowledge of a variety of Relational DBMS products, with experience in designing and programming for relational databases, including Oracle, SQL Server, Teradata, DB2.
- Experience with Teradata utilities such as TPump, FastLoad, MultiLoad (MLoad) and BTEQ.
- Knowledge of Informatica PowerConnect and PowerExchange for importing sources from external systems such as Mainframe (IMS, DB2, Copybooks and VSAM) or ERP.
- Knowledge of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) for data profiling.
- Applied the Address Doctor tool in IDQ plans to obtain gold-source/master data (MDM).
- Worked with XML source and target schema and object definitions.
- Experience in creating various transformations using Aggregator, Look Up, Update Strategy, Joiner, Filter, Sequence Generator, Normalizer, Sorter, Router, XML, Stored procedure in Informatica Power Center.
- Expertise in OLTP/OLAP System Study, Analysis and E-R modeling, developing Database Schemas like Star Schema and Snowflake Schema used in relational, dimensional and multidimensional modeling.
- Extensive knowledge in handling Slowly Changing Dimensions (SCD) Type 1/2/3.
- Experience working with different OLAP BI Reporting tools like Business Objects, Cognos.
- Skilled in developing Test Plans, Creating and Executing Test Cases.
- Effectively followed industry standards including HIPAA, ANSI 837, ANSI 834, ANSI 4010, ANSI 5010, NCPDP and X12/837 layouts.
- Experience working with SDLC, Scrum, RUP, Waterfall and Agile methodologies.
- Reliable, responsible, hardworking and a good team player.
ETL: Informatica Power Center 10.1/9.x/8.x/7.x (Repository Manager, Designer, Server Manager, Workflow Monitor, Workflow Manager), PowerMart 6.2/6.1/5.1.x/4.7, Oracle Warehouse Builder 9.2/10g, Pentaho Data Integration (PDI), SSIS.
Tools: Informatica Data Quality (IDQ), Address Doctor, MDM, Informatica Data Explorer (IDE).
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005/2003, Mainframe, SQL*Loader, TOAD.
Environment: Windows 95/98/2000/NT/2003/XP/7, UNIX, MS-DOS, SQL*Plus.
Web Technologies: ASP.NET, ASP Classic, HTML, DHTML.
Languages: SQL, PL/SQL, C, C++, UNIX Shell scripts, XML, CSS, Java Script.
Confidential, Ann Arbor, MI
Sr ETL/Informatica Developer
- Studied the design of the data warehouse model using Star Schema.
- Handled the full project life cycle, from analysis to production implementation, with emphasis on identifying sources and validating source data, developing logic and transformations per requirements, and creating mappings to load data into different targets.
- Involved in designing the ETL process using Informatica to populate the BI Data Mart from flat files and SQL Server into Oracle and Teradata databases.
- Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
- Used IDQ’s standardized plans for addresses and names clean ups.
- Applied the rules and profiled the source and target table's data using IDQ.
- Worked on MDM concepts in large-scale implementation environments.
- Extracted data from sources such as SQL Server and fixed-width and delimited flat files, transformed the data according to business requirements, and loaded it into Oracle and Teradata databases.
- Experience in writing SQL queries and optimizing the queries in SQL Server.
- Performed data analysis and data profiling using SQL on various source systems, including SQL Server.
- Designed and developed complex mappings to load the Historical and Weekly data from the Legacy Systems to Oracle database.
- Created Reusable Transformations and Mapplets, used them in Mappings to develop the business logic for transforming source data into target.
- Enriched transactions and loaded them into the Transaction Enrichment Database (TED).
- Implemented Type II Slowly Changing Dimension methodology for accessing the full history of customers.
- Wrote PL/SQL statements and stored procedures in Oracle for both extracting and writing data.
- Created and Monitored Workflows to run Session tasks using Informatica Server.
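The Type II Slowly Changing Dimension approach mentioned above can be sketched in Oracle SQL; this is a minimal expire-and-insert illustration, and the table, column, and sequence names (dim_customer, stg_customer, dim_customer_seq) are hypothetical placeholders, not the actual project objects:

```sql
-- Sketch of SCD Type 2 (hypothetical names).
-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE dim_customer d
   SET d.eff_end_dt  = SYSDATE,
       d.current_flg = 'N'
 WHERE d.current_flg = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.status <> d.status));

-- Step 2: insert a new current version for new or changed customers.
INSERT INTO dim_customer
       (customer_sk, customer_id, address, status,
        eff_start_dt, eff_end_dt, current_flg)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
       SYSDATE, DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flg = 'Y');
```

In a PowerCenter implementation, the same expire-and-insert logic is typically realized with Lookup, Expression, and Update Strategy transformations rather than standalone SQL.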
Environment: Informatica Power Center 9.1.1/8.6.1, Oracle 11g/10g, Teradata Database 13.10/12.10, XML, XSL (XSLT and XPath), SQL Server 2008, Flat files.
Confidential, Grand Rapids, MI
Sr Informatica Developer
- Interacted with business users to understand their requirements.
- Created technical specification documents, STM documents, detailed ETL designs, unit test plans, deployment plans, turnover documents, etc.
- Involved in Data Modeling using different techniques.
- Involved in ETL architecture meetings and any new tool implementations.
- Used complex Informatica mappings to extract data from the source system to the Staging DB.
- Converted complex Oracle PL/SQL Packages to Informatica Mapping/Mapplets.
- Converted complex Transformation requirements to Informatica Transformations to transform the Source data to match the target data.
- Implemented Type 2 Dimensions (SCD Type 2) to maintain history in the Dimensions.
- Attended requirement specification meetings to add/change requirement as per the changing ETL Scenarios.
- Attended change control meetings every week.
- Created test plans and scenarios for development unit testing and data validation.
- Performance tuned slow running mappings by using best practices like Partitioning, SQL Overrides, etc.
- Generated fixed and variable length flat files and read XML files.
- Highly motivated and goal-oriented individual with a strong background in SDLC Project Management and Resource Planning using AGILE methodologies.
- Prepared claim data for risk adjustment by extracting, cleansing, and merging data from different tables using SQL.
- Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction for various projects.
- Wrote and executed SQL queries using Query Analyzer to provide custom reports to marketing and sales.
- Developed advanced SQL queries with multi-table joins, group functions, subqueries, set operations and T-SQL stored procedures, user defined functions (UDFs) for data analysis.
- Loaded flat and Excel files into Access database and developed SQL scripts to create desired queries (views) for data analysis.
- Knowledge of implementing hierarchies, relationship types, packages and profiles for hierarchy management in MDM Hub implementations.
- Hands-on experience building MDM composite services using the Services Integration Framework (SIF), including setting up and configuring the SIF SDK.
- Used the Hierarchies tool to configure entity base objects, entity types, relationship base objects, relationship types, profiles, and put and display packages, and used the entity types as subject areas in IDD.
- Experience with the Teradata database, analyzing clients' business needs, developing effective and efficient solutions, and ensuring client deliverables within committed timelines.
- Experience in Creating Database Objects such as Tables, Views, Functions, Stored Procedures, Indexes, Triggers, Cursors in Teradata.
- Worked on Teradata stored procedures and functions to conform the data and load it into the tables.
- Worked on FTP commands for file transfer.
- Worked with SQL Developer, TOAD and DbVisualizer tools to write complex SQL queries.
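A BTEQ load script of the kind referenced above can be sketched as follows; the logon string, database, table, column, and file names are all placeholders, not actual project values:

```sql
-- Minimal BTEQ sketch (placeholder logon, database, table, and file names).
.LOGON tdprod/etl_user,password;

-- Load a staging table from a pipe-delimited flat file.
.IMPORT VARTEXT '|' FILE = /data/in/customer.dat;
.QUIET ON
.REPEAT *
USING (in_id VARCHAR(18), in_name VARCHAR(60))
INSERT INTO stg_db.stg_customer (customer_id, customer_name)
VALUES (:in_id, :in_name);

-- Fail the job with a nonzero return code on any SQL error.
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
```

BTEQ is suited to modest row volumes; for bulk loads the FastLoad and MultiLoad utilities noted elsewhere in this resume are the usual choice.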
Environment: Informatica PowerCenter (PWC) 10.1.0/9.6.1, Power Exchange (PWX), Data Quality (IDQ), Oracle 11g, PL/SQL Developer, Toad, Flat Files, MS Excel 2010, MS Visual Studio 2010, UNIX, WinSCP, MS Access, Jira, Co-pilot, Master Job Scheduler, Big Data Edition 10.1.0, Hadoop Hortonworks, Hive 2.4
Confidential, Des Moines, IA
Sr Informatica Developer
- Analyzed and thoroughly studied various data sources and different development environments within the organization.
- Extensively worked on extracting data from various flat files (fixed-width, delimited), applying the business logic, and loading the data into the Oracle databases.
- Extensively worked with Source qualifier, Filter, Joiner, Expression, Lookups, Aggregator, Router, Sequence Generator, and Update Strategy.
- Used various Informatica transformations in the development of complex mappings.
- Extracted data from heterogeneous source systems, applied business logic using transformations, and loaded the data into target systems using Informatica PowerCenter.
- Used Informatica Data Quality (IDQ) to profile the data and apply rules to Membership & Provider subject areas to get Master Data Management (MDM).
- Worked on multiple projects using the Informatica Developer tool (IDQ).
- Involved in migration of mappings from IDQ to PowerCenter.
- Designed reference data and data quality rules using IDQ and was involved in cleansing the data using IDQ.
- Used IDQ to profile project source data, define or confirm metadata definitions, cleanse and accuracy-check the data, check for duplicate or redundant records, and provide guidance on how to proceed with ETL processes.
- Responsible for maintaining the integrity of the SQL database and reporting any issues to the database architect.
- Assisted in mining data from the SQL database that was used in several significant presentations.
- Responsible for designing advanced SQL queries, procedures, cursors, triggers, and scripts.
- Wrote Unix shell scripts for system administration.
- Processed claims through EDI 837 files into the FACETS system and worked on scenarios covering the complete claims lifecycle.
- Offered hands-on support for development and maintenance of the Hadoop platform and its associated components for data ingestion, transformation and processing.
- Experience in various stages of System Development Life Cycle (SDLC) and its approaches like Waterfall, Agile.
- Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another; involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.
- Worked with the Teradata MultiLoad and FastLoad utilities to load data from Oracle and SQL Server into Teradata.
- Worked closely with business for requirement gathering and to understand the project needs.
- Interacted with the environmental team and data architects on the design and implementation of data models.
- Designed and developed complex mappings to load the Historical, Weekly and Daily files to Oracle database.
- Developed UNIX shell scripts and created command and email tasks to handle pre-session and post-session requirements for various Informatica jobs.
- Provided database coding to support business applications using T-SQL.
- Worked on automating Informatica job flows using Autosys boxes/jobs.
- Extensively worked on SQL queries such as creating and altering tables, indexes and views, and worked with PL/SQL stored procedures; queried various tables to produce result sets per business requirements.
- Prepared ETL mapping documents explaining complete mapping logic.
- Prepared unit test document and performed unit testing, regression testing.
- Provided QA/UAT support during code promotion and worked with QA to resolve any defects found.
- Worked with different teams as Release Management, DBA, and UNIX team for smooth code promotions.
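The FastLoad usage noted above can be illustrated with a minimal control script; the server, credentials, database, table, column, and file names below are hypothetical placeholders:

```sql
/* Minimal Teradata FastLoad sketch (placeholder names throughout). */
LOGON tdprod/etl_user,password;
DATABASE stg_db;

/* FastLoad requires an empty target table and two error tables. */
BEGIN LOADING stg_db.stg_claims
      ERRORFILES stg_db.stg_claims_err1, stg_db.stg_claims_err2;

SET RECORD VARTEXT '|';
DEFINE in_claim_id (VARCHAR(18)),
       in_amount   (VARCHAR(12))
FILE = /data/in/claims.dat;

INSERT INTO stg_db.stg_claims (claim_id, claim_amount)
VALUES (:in_claim_id, :in_amount);

END LOADING;
LOGOFF;
```

FastLoad targets empty tables only, which is why it pairs naturally with the staging-then-transform pattern described in these bullets; MultiLoad handles inserts and updates against populated tables.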
Environment: Informatica PowerCenter 9.6.1/9.5/8.6.1, Informatica Data Quality (IDQ), Oracle 11g, PL/SQL Developer, Flat Files, MS Excel 2010, MS Visual Studio 2010, UNIX, WinSCP, MS Access, Autosys, Big Data Edition 10.1.0, Hadoop Hortonworks, Hive 2.4
Confidential, Atlanta, GA
- Designed and developed ETL strategies and mappings from source systems to target systems; the ETL strategies were designed to cater to both initial and incremental loads.
- Worked with source teams to resolve data quality issues raised by end users.
- Applied Slowly Changing Dimension and Dynamic lookup techniques.
- Used debugger to validate the mappings and gain troubleshooting information about data and error conditions.
- Implemented performance-tuning techniques by identifying and resolving the bottlenecks in source, target, transformations, mappings and sessions.
- Responsible for identifying the missed records in different stages from source to target and resolving the issue.
- Extensively used PL/SQL procedures/functions to build business rules.
- Used parameters and variables (sessions/mappings) extensively for incremental loads.
- Used SQL tools like TOAD to run SQL queries and validate the data.
- Analyzed Session Log files in case the session failed to resolve errors in mapping or session configurations.
- Set up tasks to schedule the loads at the required frequency using the PowerCenter Workflow Manager.
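Incremental loads driven by session and mapping parameters, as described above, are typically fed from a PowerCenter parameter file; a minimal sketch follows, in which the folder, workflow, session, and parameter names are hypothetical:

```
[ETL_Folder.WF:wf_daily_load.ST:s_m_load_orders]
$$LAST_EXTRACT_DATE=2010-01-01 00:00:00
$$SOURCE_SYSTEM=ORDERS
```

A mapping parameter such as $$LAST_EXTRACT_DATE is then referenced in the Source Qualifier filter and advanced after each successful run, so only rows changed since the previous load are extracted.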
Environment: Informatica PowerCenter 8.6.1/8.1, Oracle 10g/9i, SQL Server 2005, Toad, UNIX.
Confidential, New York, NY
- Used update strategy to effectively migrate data from source to target.
- Migrated mappings from development environment to test environment.
- Imported and created source definitions from Oracle, SQL Server, Sybase and flat files.
- Created Informatica mappings using various transformations like Joiner, Aggregator, Expression, Filter and Update Strategy.
- Involved in performance improvement project.
- Involved in designing of testing plan (Unit testing and System testing).
- Tested scripts by running workflows and assisted in debugging the failed sessions.
- Used persistent caches whenever data from workflows were to be retained.
- Created tasks and workflows in the Workflow Manager and monitored the sessions in the Workflow Monitor.
- Performed maintenance, including managing space, removing bad files and monitoring services.
- Set up permissions for groups and users in all Development Environments.
- Migration of developed objects across different environments.
Environment: Informatica Power Center 7.1, Oracle 9i, PL/SQL, Windows, ERwin, DB2, Sybase 12.x/11.x and UNIX.