- Over 8 years of experience in Information Technology with a strong background in analyzing, designing, developing, testing, and implementing Data Warehouse solutions using Informatica across domains such as Banking and Health Care.
- Extensively worked on Informatica Designer components (Source Analyzer, Warehouse Designer, Mapping Designer, Metadata Manager, Mapplet Designer and Transformation Developer), Repository Manager, Repository Server, Workflow Manager and Workflow Monitor.
- Strong knowledge of Data Warehousing concepts such as the Ralph Kimball and Bill Inmon methodologies, Logical Data Modeling, Physical Modeling, and Dimensional Data Modeling (Star Schema, Snowflake Schema, Fact Tables, Dimension Tables, Slowly Changing Dimensions), OLTP and OLAP.
- Expertise in designing and developing complex mappings using Informatica PowerCenter transformations - unconnected and connected Lookups, Source Qualifier, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, Sequence Generator, XML transformation, etc.
- Strong experience in integrating data from heterogeneous sources such as Oracle, Teradata, Netezza, SQL Server, DB2, MySQL, Flat Files (fixed width and delimited), COBOL files, XML files, and Excel files into Data Warehouses and Data Marts.
- Implemented Slowly Changing Dimensions (SCD Type 1 and Type 2) and Change Data Capture using Informatica PowerCenter.
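The SCD Type 2 pattern mentioned above (expire the current row, insert a new version) can be sketched as a pair of SQL statements. The script below only generates them into a file; DIM_CUSTOMER, STG_CUSTOMER, and the column names are illustrative placeholders, not taken from any actual project:

```shell
#!/bin/sh
# Sketch: generate SQL for a Type 2 SCD load (expire changed rows, insert new versions).
# All table and column names here are hypothetical examples.
cat > scd2_load.sql <<'EOF'
-- Step 1: expire current dimension rows whose attributes changed in staging
UPDATE DIM_CUSTOMER d
   SET d.EFF_END_DT   = CURRENT_DATE - 1,
       d.CURRENT_FLAG = 'N'
 WHERE d.CURRENT_FLAG = 'Y'
   AND EXISTS (SELECT 1 FROM STG_CUSTOMER s
                WHERE s.CUSTOMER_ID = d.CUSTOMER_ID
                  AND s.ADDRESS <> d.ADDRESS);

-- Step 2: insert a new current version for new or changed customers
INSERT INTO DIM_CUSTOMER
  (CUSTOMER_ID, ADDRESS, EFF_START_DT, EFF_END_DT, CURRENT_FLAG)
SELECT s.CUSTOMER_ID, s.ADDRESS, CURRENT_DATE, DATE '9999-12-31', 'Y'
  FROM STG_CUSTOMER s
 WHERE NOT EXISTS (SELECT 1 FROM DIM_CUSTOMER d
                    WHERE d.CUSTOMER_ID = s.CUSTOMER_ID
                      AND d.CURRENT_FLAG = 'Y');
EOF
echo "Wrote scd2_load.sql"
```

In PowerCenter the same logic is usually built with a Lookup on the dimension plus an Update Strategy transformation, rather than hand-written SQL.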
- Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
- Experience in performance tuning of sources, targets, mappings, transformations, and sessions by identifying performance bottlenecks and applying techniques such as partitioning and pushdown optimization (PDO).
- Worked on ETL procedures and processes. Reconciled disparate master data from distributed systems and synchronized this reliable data with analytical and operational systems using Master Data Management.
- Involved in SQL tuning and Informatica performance tuning. Tuned performance of Informatica Sessions for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
- Expertise in database development using Oracle SQL and PL/SQL objects: stored procedures, functions, triggers, views, and packages.
- Good exposure to Teradata utilities BTEQ, FastLoad, MultiLoad, and FastExport, as well as SQL Assistant and SQL Administrator.
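As an illustration of the BTEQ usage listed above, the sketch below writes a minimal BTEQ script and submits it only if the `bteq` utility is actually installed; the server name, credentials, and table are placeholders:

```shell
#!/bin/sh
# Sketch: build and (conditionally) run a minimal Teradata BTEQ script.
# tdserver, etl_user/etl_password, and EDW.DAILY_DEPOSITS are placeholders.
cat > load_check.bteq <<'EOF'
.LOGON tdserver/etl_user,etl_password;
SELECT COUNT(*) FROM EDW.DAILY_DEPOSITS;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# Only invoke bteq where the Teradata client tools exist.
if command -v bteq >/dev/null 2>&1; then
    bteq < load_check.bteq
else
    echo "bteq not found; script written to load_check.bteq"
fi
```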
- Experience writing daily batch jobs and complex UNIX shell scripts for ETL automation, including scheduled queue processing and pmcmd invocations.
- Experience using the Informatica command line utility pmcmd to execute workflows on UNIX.
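A typical pmcmd invocation from a shell wrapper looks like the sketch below; the service, domain, folder, and workflow names are hypothetical, and the password is supplied through an environment variable via pmcmd's -pv option rather than on the command line:

```shell
#!/bin/sh
# Sketch: start an Informatica workflow via pmcmd from UNIX.
# IS_ETL, Domain_ETL, DW_LOADS, and wf_daily_load are placeholder names.
INFA_IS="IS_ETL"
INFA_DOMAIN="Domain_ETL"
FOLDER="DW_LOADS"
WORKFLOW="wf_daily_load"

CMD="pmcmd startworkflow -sv $INFA_IS -d $INFA_DOMAIN -u etl_user -pv INFA_PASSWD -f $FOLDER -wait $WORKFLOW"
echo "$CMD" > start_wf.cmd   # keep the generated command for audit/reuse

# Execute only where the PowerCenter client utilities are installed.
if command -v pmcmd >/dev/null 2>&1; then
    sh start_wf.cmd
else
    echo "pmcmd not found; command written to start_wf.cmd"
fi
```

The -wait flag makes the wrapper block until the workflow finishes, so the scheduler can react to pmcmd's exit code.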
- Expertise in doing Unit Testing, Integration Testing, System Testing and Data Validation for developed Informatica Mappings.
- Experience with industry-standard software development methodologies such as Waterfall and Agile within the software development life cycle.
- Able to work independently, a good team player and interacts with all levels of management.
- Possesses good communication skills; self-motivated, proactive, task-oriented, and quick to learn new technologies.
ETL: SQL Server Integration Services, SQL Server DTS, DataStage
Operating Systems: Windows NT/2000/XP, HP-UX, Sun Solaris, UNIX/Linux
OLAP/Reporting: Business Objects, SQL Server Analysis Services, SQL Server Reporting Services
Tools: TOAD, Visual Studio .NET, Informatica, SQL*Loader, MicroStrategy, Queryman, File-AID, QMF
Software: JBuilder, Microsoft Office Suite, Rational Rose, Trillium
Databases/RDBMS: MySQL, MS SQL Server 2005/2000, Oracle 10g, DB2
Confidential, Minneapolis, MN
Sr. ETL/Informatica Developer / Teradata Developer
- Created various Informatica mappings to validate the transactional data against Business rules, extract look up values and enrich the data as per the mapping documents.
- Developed various Informatica workflows to load data from upstream systems using different methodologies, i.e., trigger-based pull, direct pull, and file-based push.
- Designed the ETL architecture for the Deposits product to process huge volumes of Deposits data on a daily basis.
- Fine-tuned several long-running Informatica workflows, implementing techniques such as parallel partitions and Teradata FastExport for faster processing of high-volume data.
- Developed various SQL queries using joins, sub-queries, and analytic functions to pull data from relational databases, i.e., Oracle, Teradata, and SQL Server.
- Created complex DataMart views for the corresponding products.
- Monitored performance using Teradata Viewpoint; notified batch and report users running problem queries and consulted with them on query changes and possible improvement areas.
- Created various complex PL/SQL stored procedures to manipulate/reconcile the data and generate the dashboard reports.
- Performed Unit Testing and prepared the deployment plan for various objects by analyzing their interdependencies.
- Responsible for Teradata code review, i.e., ensuring code promoted to LIVE follows data warehousing standards, and provided feedback to the DEV team.
- Attended weekly status meetings and represented the MDM Data platform.
- Developed several UNIX shell scripts for file archival and compression.
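An archival-and-compression job of the kind described above can be sketched as follows; the directory names and retention window are illustrative defaults, not actual production paths:

```shell
#!/bin/sh
# Sketch: archive and gzip processed ETL files older than a retention window.
# ./landing and ./archive are placeholder paths; a real job would read them
# from a config file or environment settings.
SRC_DIR="${SRC_DIR:-./landing}"
ARCH_DIR="${ARCH_DIR:-./archive}"
RETENTION_DAYS=7

mkdir -p "$SRC_DIR" "$ARCH_DIR"

# Move files past the retention window into the archive, then compress them.
find "$SRC_DIR" -type f -mtime +"$RETENTION_DAYS" | while read -r f; do
    mv "$f" "$ARCH_DIR/"
    gzip -f "$ARCH_DIR/$(basename "$f")"
done

echo "Archived files in $ARCH_DIR:"
ls "$ARCH_DIR"
```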
- Created various AutoSys jobs for the scheduling of the underlying ETL flows.
- Designed, developed, and implemented ETL processes and MDM handling for AHM and Complete Spend health care providers.
- Coordinated with various team members across the globe, i.e., Application teams, Business Analysts, Users, DBAs, and the Infrastructure team, to resolve technical and functional issues in UAT and PROD.
- Created various technical documents required for the knowledge transition of the application which includes re-usable objects (Informatica & UNIX).
- Maintained the MDM Hub console and users; developed mappings for stage, load, and match-and-merge jobs; validated user exits.
- Worked on IDQ for data cleansing, data matching, data conversion and address standardization.
- Integrated workflow changes to test and enable error handling using Informatica IDQ.
- Created data objects, Quick Profiles, Custom Profiles, and drill-down on profile results using IDQ.
Confidential, New York, NY
Sr. ETL / Informatica Developer
- Created new reports based on requirements. Worked extensively in documenting the Source to Target Mapping documents with data transformation logic.
- Functioned as a coordinator, working with the Business team and the offshore team.
- Analyzed source systems, data nature, and business rules.
- Reviewed functional and technical design documents.
- Coordinated with different infrastructure teams - Data Center, Windows Administration, Web Application Support, and Database/Informatica Admins - for implementations.
- Parsed high-level design specification to simple ETL coding and mapping standards.
- Created mapping documents to outline data flow from sources to targets.
- Extracted data from flat files and other RDBMS sources into the staging area and populated the Data Warehouse.
- Designed, developed, and tested Informatica mappings following best practices, making the development phase more efficient and establishing the development standards for developers to follow.
- Involved in the MDM process, including data modeling and ETL, and prepared data mapping documents based on graph requirements.
- Used Informatica designer to create mappings using transformations like lookup, router, filter, sequence generator, joiner, aggregator, source qualifier and expression transformations to transform the data as per business logic.
- Designed and Created complex Informatica mappings with shared objects/Reusable Transformations/Mapplets using mapping/mapplet Parameters/Variables.
- Used SCD Type 1 and Type 2 mappings to update Slowly Changing Dimension tables.
- Developed and maintained database objects using PL/SQL: control structures, composite data types, explicit cursors, exceptions, procedures, functions, and packages.
- Extensively involved in System Testing and User Testing.
- Used Session parameters, Mapping variable/parameters and created Parameter files for flexible runs of workflows based on changing variable values.
- Used Debugger to test the mappings and fixed the bugs.
- Wrote UNIX shell scripts and pmcmd commands to FTP files from remote servers and back up the repository and folders.
- Created and documented Test Plans, Test Procedures, Expected Results, Assumptions, and Validations.
- Prepared document to move the mappings from development to testing and then to production repositories.
- Ability to translate verbal requirements from client meetings into requirements documents, statements of work, and proposals.
- Proactively took responsibilities for internal organizational processes like Quality Management & Employee Performance management.
Confidential, St. Louis, MO
Informatica ETL Developer
- Analyzed Business Requirements and functional specifications.
- Used Informatica ETL to load data from flat files (both fixed width and delimited) and SQL Server into the data mart on an Oracle database.
- Created mappings to cleanse data and populate staging tables, moved data from staging to Archive and then to the Enterprise Data Warehouse while transforming it to business needs, and populated the Data Mart with only the required information.
- Created complex mappings in Informatica PowerCenter Designer using Aggregator, Expression, Filter, Sequence generator, Update strategy, Rank, Sorter, Lookup and Joiner Transformation.
- Implemented SCD (Slowly Changing Dimensions) Type I and Type II for data loads.
- Involved in performance tuning by tracking reader, writer, and transformation threads in session logs; set the tracing level to verbose only during development and only with small data sets.
- Developed UNIX Shell Scripts and used them in Informatica Pre-session, Post session command tasks and standalone command tasks.
- Worked extensively with Mapping Parameters, Mapping Variables and Parameter files for incremental loading.
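Parameter-file-driven incremental loading of the kind described above usually means regenerating the parameter file before each run. The sketch below writes one; the folder, workflow, session, and connection names - and the $$LAST_RUN_DATE variable - are hypothetical examples of the PowerCenter parameter-file layout:

```shell
#!/bin/sh
# Sketch: generate an Informatica parameter file for an incremental load.
# DW_LOADS, wf_incr_load, s_m_load_orders, and the connection names are
# placeholders; $$LAST_RUN_DATE is an example mapping variable.
PARAM_FILE="wf_incr_load.param"
LAST_RUN=$(date '+%Y-%m-%d')

cat > "$PARAM_FILE" <<EOF
[DW_LOADS.WF:wf_incr_load.ST:s_m_load_orders]
\$\$LAST_RUN_DATE=$LAST_RUN
\$DBConnection_SRC=ORA_SRC
\$DBConnection_TGT=ORA_DW
EOF

echo "Parameter file written:"
cat "$PARAM_FILE"
```

The mapping's source qualifier can then filter on rows changed since $$LAST_RUN_DATE, so each run picks up only the incremental delta.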
- Created PL/SQL stored procedures and implemented them through stored Procedure transformation.
- Created complex workflows with multiple sessions, worklets with consecutive and concurrent sessions.
- Created test cases and detailed documentation for Unit, Integration, System, and UAT testing.
- Scheduled Informatica jobs and implemented dependencies where necessary using AutoSys.
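An AutoSys job with a dependency, as described above, is defined in JIL. The sketch below only writes an example JIL file; the job, machine, path, and predecessor names are all placeholders:

```shell
#!/bin/sh
# Sketch: write an AutoSys JIL definition that schedules a shell wrapper for
# an Informatica workflow, dependent on a file-arrival job having succeeded.
# DW_DAILY_LOAD, DW_FILE_ARRIVAL, etlhost01, and the paths are placeholders.
cat > daily_load.jil <<'EOF'
insert_job: DW_DAILY_LOAD
job_type: CMD
command: /apps/etl/scripts/run_wf_daily_load.sh
machine: etlhost01
owner: etl_user
start_times: "02:00"
condition: s(DW_FILE_ARRIVAL)
std_out_file: /apps/etl/logs/DW_DAILY_LOAD.out
std_err_file: /apps/etl/logs/DW_DAILY_LOAD.err
EOF
echo "JIL written: daily_load.jil"
```

The condition: s(DW_FILE_ARRIVAL) line is what implements the dependency: the load job only starts after the predecessor job succeeds.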
- Understanding the user requirement through Functional Specification Document.
- Attended meetings with the Business and analysts to understand the requirements.
- Converted the Functional Specification Document into a Technical Specification Design Document containing all code-level details, i.e., which objects will be created or which existing objects will be modified to meet the requirement.
- Developed UNIX shell scripts and created Oracle objects such as control files, procedures, functions, and packages per the requirement logic.
- Developed code using Oracle SQL and PL/SQL.
- Solved data-related production issues, working with upstream teams to receive missing data.
- Optimized long-running processes that were consuming large amounts of temp space.
- Periodic index rebuilds of production tables.
- Periodic data refresh of DEV, ITG and PROD data on UNIX server.
- Integrating the new developed software with the existing application.
- Unit testing and integration testing of developed software.
- Created complex stored procedures, functions, triggers, tables, indexes, views, SQL joins, and T-SQL queries to test and implement business rules.
- Created Stored Procedures for commonly used complex queries involving join and union of multiple tables.
- Created Views to enforce security and data customization.
- Created non-clustered indexes to improve query performance and optimization, and wrote SQL queries to extract and compare data across different sources.
- Maintained and managed databases and stored procedures using SQL Server tools such as Performance Tuner and SQL Profiler.
- Worked closely with DBA team to regularly monitor system for bottlenecks and implement appropriate solutions.
- Involved in package migration from DTS to SSIS: ran the Upgrade Advisor against DTS packages before migration, troubleshot issues, and converted packages to SSIS through the wizard or manually.
- Extracted data from flat/Excel files and loaded it into the SQL Server database using BULK INSERT.
- Created ETL Packages to validate, extract, transform and load data to data warehouse and data marts.
- Developed SSIS packages to extract data from the file system, transform it, and load it into OLAP.
- Used SSIS to populate data from various data sources, creating packages for different data loading operations for applications.
- Created reports using Global Variables, Expressions and Functions using MS SQL Server Reporting Services.
- Designed and delivered dynamic reporting solutions using MS SQL Server Reporting Services.