- 8+ years of strong IT experience in the analysis, design, development and testing of client/server environments, with a focus on Data Warehousing applications using tools such as Informatica PowerCenter 9.x/8.x/7.x/6.x, Business Objects and IDQ with Oracle and SQL Server databases.
- Experience in using ETL methodologies to support Data Extraction, Data Migration, Data Transformation and Master Data development using Informatica PowerCenter and Teradata.
- Involved in the full Software Development Life Cycle (SDLC), including Application Development, Data Modeling, Business Data Analysis and ETL/OLAP processes.
- Strong Data Warehousing ETL experience using Informatica PowerCenter 9.6.1/9.1/8.6.1/8.5/8.1/7.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
- Experience in Microsoft Business Intelligence technologies like SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS) and SQL Server Analysis Services (SSAS).
- Experience in using ETL methodologies to support Data Extraction, Data Migration, Data Transformation and loading using Informatica PowerCenter 9.x/8.x/7.x/6.x, IDQ and Trillium.
- Performed data profiling and analysis using Informatica Data Quality (IDQ).
- Experience working with Informatica transformations such as Normalizer, Lookup, Source Qualifier, Expression, Aggregator, Sorter, Rank and Joiner.
- Worked on the PROVIDER, CLAIMS and MEMBER subject areas.
- Experienced in designing Conceptual, Logical and Physical data models using the Erwin and ER Studio data modeling tools.
- Experienced in developing metadata repositories.
- Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
- Strong experience in Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange and PowerConnect as ETL tools on Oracle, DB2 and SQL Server databases.
- Worked closely with business users to develop reports according to user requirements.
- Strong experience in Informatica Data Quality (IDQ): creating data profiles and custom filters, cleansing data and developing scorecards.
- Expertise in working with relational databases such as Oracle 11g/10g/9i/8.x, SQL Server, DB2 8.0/7.0, UDB, MS Access and Teradata, as well as NoSQL databases such as MongoDB.
- Performed extensive Data profiling and analysis for detecting and correcting inaccurate data from the databases and to track data quality.
- Strong experience in writing complex SQL Queries, Stored Procedures and Triggers.
- Experience in Agile/Scrum, TDD and BDD methodologies.
- Experience in resolving on-going maintenance issues and bug fixes, monitoring Informatica sessions as well as performance tuning of mappings and sessions.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
- Excellent problem solving and analytical skills, committed team player with multitasking capabilities.
- Strong communication skills, both verbal and written, with an ability to express complex business concepts in technical terms.
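The IDQ-style column profiling mentioned above (null counts, distinct counts, value patterns) can be illustrated with a minimal Python sketch; the column data below is hypothetical and the pattern notation (digits as `9`, letters as `X`) mirrors a common profiling convention rather than any specific IDQ output:

```python
from collections import Counter

def profile_column(values):
    """Basic column profile: null count, distinct count, and the
    frequency of each value pattern (digit -> '9', letter -> 'X')."""
    nulls = sum(1 for v in values if v is None or v == "")
    non_null = [v for v in values if v not in (None, "")]
    patterns = Counter(
        "".join("9" if c.isdigit() else "X" if c.isalpha() else c
                for c in str(v))
        for v in non_null
    )
    return {"nulls": nulls, "distinct": len(set(non_null)), "patterns": patterns}

# Hypothetical MEMBER_ID sample data
stats = profile_column(["A1023", "A1024", None, "9999", "A1023"])
print(stats["nulls"])     # 1
print(stats["distinct"])  # 3
```

A profile like this quickly surfaces outliers: the lone `9999` pattern among `X9999` values would be flagged for a cleansing rule.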
Operating Systems: Linux, Unix, Windows 2010/2008/2007/2005/NT/XP
ETL Tools: Informatica PowerCenter 10.1/9.x/8.x/7.x (Designer, Workflow Manager, Workflow Monitor, Repository manager and Informatica Server), IDQ.
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005, DB2/UDB, MongoDB, Teradata.
Data Modeling Tools: Erwin, MS Visio, E/R Studio, Excel, SAS, R.
Reporting Tools: SQL Server Reporting Services (SSRS), SSIS, Tableau and MicroStrategy.
Scheduling Tools: Autosys, Control-M.
Tools: Selenium, QTP, Win Runner, Load Runner, Quality Center, Test Director, TOAD
Confidential, Houston, Texas
Sr Informatica Developer
- Worked in an Agile development environment and interacted with users and Business Analysts to collect and understand the business requirements.
- Worked on building the ETL architecture and Source to Target mapping to load data into Data warehouse.
- Involved in the installation and configuration of Informatica PowerCenter 9.6 and evaluated partitioning concepts in PowerCenter 9.6.
- Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
- Worked in an Agile methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter, peer-reviewed their development work and provided technical solutions. Proposed ETL strategies based on requirements.
- Created stored procedures, views, user defined functions and common table expressions.
- Generated underlying data for reports through SSIS; exported cleansed data from Excel spreadsheets, text files, MS Access and CSV files to the data warehouse.
- Managed the Metadata associated with the ETL processes used to populate the Data Warehouse.
- Involved in IDS services such as building business logic, analyzing structure and data quality, and creating a single view of the data.
- Worked on Informatica cloud for creating source and target objects, developed source to target mappings.
- Involved in importing the existing PowerCenter workflows as Informatica Cloud Service tasks by utilizing Informatica Cloud Integration.
- Involved in data integration, monitoring and auditing using Informatica Cloud Designer.
- Worked on Data Synchronization and Data Replication in Informatica cloud.
- Worked with Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor and Repository Manager.
- Wrote PL/SQL scripts, created stored procedures and functions, and debugged them.
- Created Mapplets and reusable transformations and used them in different mappings. Used Type 2 SCD mappings to update Slowly Changing Dimension tables.
- Involved in production support by performing Normal, Bulk, Initial, Incremental, Daily and Monthly loads, and developed reports on issues related to the data warehouse.
- Used various Informatica Data Quality transformations in the Developer tool and configured match properties: match paths, fuzzy match keys, and fuzzy and exact match columns.
- Created profiles, rules, scorecards for data profiling and quality using IDQ.
- Used Informatica Data Quality for address and name clean-up and developed error handling and data quality checks to pull out the right data.
- Used IDQ to cleanse and verify the accuracy of project data and to check for duplicate or redundant records.
- Used the Debugger to test mappings and fix bugs, identified bottlenecks at all levels to tune performance, and resolved production support tickets using Remedy.
- Developed monitoring scripts in UNIX and moved data files to another server using SCP.
- Extensively used Teradata utilities such as FastLoad, MultiLoad, BTEQ and FastExport.
- Created Teradata external loader connections such as MultiLoad (Upsert and Update) and FastLoad while loading data into target tables in the Teradata database.
- Involved in creating tables in Teradata and setting up the various environments such as DEV, SIT, UAT and PROD.
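The Type 2 SCD loads mentioned above follow a standard pattern: expire the current dimension row when attributes change and insert a new current version. A minimal Python sketch of that logic (the column names `sk`, `nk`, `eff_date`, `end_date`, `is_current` are illustrative, not from any specific project):

```python
from datetime import date

def scd2_apply(dim_rows, natural_key, incoming, today, next_key):
    """Apply one incoming source record to a Type 2 dimension:
    no-op if nothing changed, otherwise expire the current row
    and append a new current version with a fresh surrogate key."""
    current = next(
        (r for r in dim_rows if r["nk"] == natural_key and r["is_current"]),
        None,
    )
    if current and current["attrs"] == incoming:
        return dim_rows  # unchanged record: keep history as-is
    if current:
        current["is_current"] = False
        current["end_date"] = today  # expire the old version
    dim_rows.append({
        "sk": next_key, "nk": natural_key, "attrs": incoming,
        "eff_date": today, "end_date": None, "is_current": True,
    })
    return dim_rows

dim = []
scd2_apply(dim, "CUST1", {"city": "Houston"}, date(2015, 1, 1), 1)
scd2_apply(dim, "CUST1", {"city": "Dallas"}, date(2016, 1, 1), 2)
print(len(dim))                 # 2 versions of the customer
print(dim[0]["is_current"])     # False
print(dim[1]["attrs"]["city"])  # Dallas
```

In PowerCenter the same decision is typically made by a Lookup on the dimension plus an Expression/Router comparing incoming attributes against the current row.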
Environment: Informatica PowerCenter 9.6, Oracle 12c, Informatica Cloud, IDS 9.6.1, IDQ 9.6.1, Teradata 14.0, SQL Server 2014, Teradata Data Mover, Autosys, Netezza, UNIX, Toad, PL/SQL, SSIS, PowerConnect, DB2, Business Objects XI 3.5.
Confidential, Charlotte, NC
Informatica/ IDQ Developer
- Upgraded Address Doctor in PowerCenter and Developer.
- Configured ActiveVOS with the Hub and ensured that the default workflows were integrated.
- Configured BE workflows with IDD.
- Installed IDQ in the windows development environment.
- Handled Address Cleansing before populating the data into landing tables.
- Worked on ETL process for bringing in the data into IDQ landing tables.
- Worked on profiling the data using Developer tool/Analyst Tool for identifying the data integrity from different sources.
- Cleansed most of the data in PowerCenter/Developer before placing it in the IDQ landing tables.
- Defined trusted scores for the source data based on data quality analysis and discussions with stakeholders.
- Developed validation rules based on the profiled data quality and data analysis.
- Reached a conclusion on key fields after discussions with people knowledgeable about the data.
- Defined Match rules in Match and Merge settings of the base tables by creating Match Path Components, Match Columns and Rule sets.
- Configured match rule set filters for meeting the different data scenarios.
- Performed match/merge and ran match rules to check the effectiveness of IDQ on data and fine-tuned the match rules.
- Developed ad hoc queries, such as Execute Batch Delete SOAP requests, for deleting specific data from the concerned underlying tables.
- Developed an Unmerge user exit for reprocessing some of the records that needed to be processed differently.
- Closely worked with Data Steward Team for designing, documenting and configuring Informatica Data Director.
- Used ActiveVOS for configuring workflows like One step approval, merge and unmerge tasks.
- Configured static lookups, dynamic lookups, bulk uploads, extended search and Smart search in IDD.
- Configured JMS Message Queues and appropriate triggers for passing the data to the contributing systems.
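The fuzzy/exact match column configuration described above can be approximated in plain Python with the standard library's `difflib`; the real match engine and its scoring are proprietary, so the records, thresholds and equal column weighting below are illustrative assumptions only:

```python
from difflib import SequenceMatcher

def match_score(rec_a, rec_b, fuzzy_cols, exact_cols):
    """Score two candidate records: exact columns contribute 1.0 only
    on equality; fuzzy columns contribute a 0..1 similarity ratio.
    Returns the average across all configured match columns."""
    scores = []
    for col in exact_cols:
        scores.append(1.0 if rec_a[col] == rec_b[col] else 0.0)
    for col in fuzzy_cols:
        ratio = SequenceMatcher(None, rec_a[col].lower(),
                                rec_b[col].lower()).ratio()
        scores.append(ratio)
    return sum(scores) / len(scores)

a = {"name": "Jon Smith", "zip": "28202"}
b = {"name": "John Smith", "zip": "28202"}
score = match_score(a, b, fuzzy_cols=["name"], exact_cols=["zip"])
print(score > 0.9)  # True -> a merge candidate at a 0.9 threshold
```

Rule-set filters then decide which record pairs are even scored, which is why narrowing filters (as in the bullet above) matters for both accuracy and match performance.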
Environment: Multi-Domain IDQ 10.0, IDD, Oracle 11g, Oracle PL/SQL, Windows Application Server, ActiveVOS, Informatica PowerCenter 10.1, Informatica Developer, Address Doctor 5.1, PowerShell.
Confidential, Marietta, GA
Informatica/ ETL Developer
- Actively involved in gathering business requirements and performing GAP analysis.
- Translated Functional requirements into Technical specification documents.
- Developed ETL mapping spreadsheet and Fact-Dimension matrix prior to performing the ETL process.
- Involved in data modeling for the GL Data Mart to bring various sets of books, assets and balances into the warehouse.
- Used complex transformations such as Connected and Unconnected Lookup, Joiner, Router, Filter and Stored Procedure transformations for data loads.
- Worked with memory management for the best throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations, and was involved in pipeline partitioning.
- Debugged and implemented best practices in mappings, sessions and workflows for data extraction and loading into Type 1 and Type 2 Slowly Changing Dimensions.
- Created test plans and test cases and performed Unit, Volume and Performance tests.
- Created and scheduled Informatica workflows using the Crontab utility.
- Imported physical tables and created the required aliases, views, joins in the Physical layer of Siebel Analytics Admin tool.
- Developed the facts, dimensions, necessary hierarchies and the joins between the logical tables, added aggregates in the BMM layer.
- Implemented various performance tuning mechanisms: performed query optimization with hints, ran advice plans and explain plans against indexes, and created Materialized Views so dashboard prompts execute faster.
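Crontab scheduling of Informatica workflows, as mentioned above, typically wraps the `pmcmd` command line. A sketch of such a crontab entry follows; the paths, service, domain, folder, workflow and credential names are all placeholders:

```shell
# Run the nightly GL load workflow at 2:00 AM every day (all names illustrative)
0 2 * * * /opt/informatica/server/bin/pmcmd startworkflow \
    -sv INT_SVC -d DOMAIN_DEV -u etl_user -p '***' \
    -f GL_MART wf_daily_gl_load >> /var/log/etl/gl_load.log 2>&1
```

Redirecting stdout/stderr to a log file gives the monitoring scripts something concrete to scan for failures.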
Environment: Informatica PowerCenter, IDQ, Microsoft Visio 2003, Oracle 8i/9i, PL/SQL, UNIX, Windows NT/2000 and Siebel.
SQL - ETL Developer
- Gathered functional and business requirements and wrote the technical specifications for building the data model.
- Created new database objects like Procedures, Functions, Packages, Triggers, Indexes and Views using T-SQL in SQL Server 2005/2008.
- Created and executed several SSIS packages to perform ETL operations of the data from source server to destinations server and OLTP to OLAP.
- Created complex SSIS packages using appropriate control flow and data flow elements with error handling.
- Provided logging, debugging and error handling using event handlers, custom logging, breakpoints, data viewers and checkpoints for SSIS packages.
- Involved in complete SSIS life cycle in creating SSIS packages, building, deploying and executing the packages in both the environments (Development and Production).
- Transferred data from databases to the Data Warehouse using SSIS ETL packages with transformations such as Lookup, Union All and Multicast.
- Involved in monitoring and tuning report performance by analyzing the execution plan of the reports.
- Developed back-end PL/SQL packages and built UNIX shell scripts for data migration and batch processing.
- Built, published and scheduled SSRS reports for both the Dev and Test environments.
- Performed database backups and troubleshot production database, cube sync processing and SSAS issues.
- Created a Star Schema for OLAP cube generation and worked extensively on dimensional modeling for Analysis Services using SSAS.
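The procedures, views and common table expressions mentioned in this role follow the same shape across SQL dialects. A minimal, self-contained sketch of a CTE using Python's built-in sqlite3 module (the `sales` table and its data are made up for illustration; T-SQL syntax for the CTE itself is identical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES ('East', 100), ('East', 50), ('West', 70);
""")

# A CTE pre-aggregates per region, then the outer query filters
# on the aggregate -- something a plain WHERE clause cannot do.
row = conn.execute("""
    WITH region_totals AS (
        SELECT region, SUM(amount) AS total
        FROM sales
        GROUP BY region
    )
    SELECT region, total FROM region_totals WHERE total > 100
""").fetchone()
print(row)  # ('East', 150)
```

The same query without a CTE would need either a subquery in the FROM clause or a HAVING clause; the CTE form names the intermediate result, which keeps larger reporting queries readable.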
Environment: SQL Server 2008/2005, SSIS, SSRS, T-SQL, Crystal Reports 7/8, Windows 2008 Advance Server, VB.NET, MS Excel, and MS Office.
Junior SQL Developer
- Performed complete analysis, requirement gathering and functional design document creation.
- Collaborated with the application developers in data modeling and E-R design of the systems.
- Created and managed schema objects such as tables, views, indexes and referential integrity constraints depending on user requirements.
- Used DDL and DML commands for writing triggers, stored procedures and data manipulation.
- Involved in tuning system stored procedures for better performance.
- Involved in data modeling and the physical and logical design of the database.
- Configured Exchange Server to send automatic emails to the appropriate people when a job failed or succeeded.
- Worked with the application developers and provided the necessary SQL scripts using SQL and PL/SQL.
- Responsible for the management of the database performance, backup, replication, capacity and security.
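The DML triggers written in this role share a common shape: an AFTER UPDATE trigger that writes an audit row automatically. A self-contained sketch using Python's built-in sqlite3 (the `accounts`/`audit_log` tables are invented for illustration; T-SQL trigger syntax differs in detail but the concept is the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER);
    CREATE TABLE audit_log (account_id INTEGER,
                            old_balance INTEGER,
                            new_balance INTEGER);
    -- DML trigger: log every balance change without touching app code
    CREATE TRIGGER trg_audit_balance AFTER UPDATE OF balance ON accounts
    BEGIN
        INSERT INTO audit_log VALUES (OLD.id, OLD.balance, NEW.balance);
    END;
""")
conn.execute("INSERT INTO accounts VALUES (1, 500)")
conn.execute("UPDATE accounts SET balance = 650 WHERE id = 1")
log = conn.execute("SELECT * FROM audit_log").fetchone()
print(log)  # (1, 500, 650)
```

Keeping audit logic in a trigger rather than the application guarantees the log is written no matter which client issues the update, which is why it is a standard choice for compliance tables.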
Environment: MS SQL Server 2008, T-SQL, SQL Server Management Studio (SSMS), SQL Profiler, Visual Source Safe (VSS), Windows XP.