- Five years of IT experience in the analysis, design, development, implementation, and troubleshooting of data warehouse applications.
- Expertise in building Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), Data Marts, and Decision Support Systems (DSS) using multidimensional and dimensional modeling (Star and Snowflake schema) concepts.
- Demonstrated expertise in using the ETL tool Informatica PowerCenter 9.x/8.x/7.x/6.x to develop data warehouse loads per client requirements.
- Excellent knowledge in identifying performance bottlenecks and tuning Informatica loads for better performance and efficiency.
- Excellent knowledge of Informatica administration, including grid management, creating and upgrading repository contents, and managing folders, users, and their permissions.
- Significant multidimensional and relational data modeling experience, including data flow diagrams, process models, and ER diagrams, with modeling tools such as Erwin and Visio.
- Involved in all phases of the data warehouse project life cycle. Designed and developed ETL architecture to load data from various sources such as DB2 UDB, Oracle, flat files, XML files, Teradata, Sybase, and MS SQL Server into Oracle, Teradata, XML, and SQL Server targets.
- Extensive knowledge in developing Teradata FastExport, FastLoad, MultiLoad, and BTEQ scripts; coded complex scripts and fine-tuned queries to enhance performance.
- Profound knowledge of Teradata database architecture.
- Experience using the Data Masking transformation to mask sensitive data.
- Extensively involved in both manual and automated testing of object-oriented, client-server, web-based, and N-tier complex large-scale applications.
- Extensively worked on writing and executing test plans, test cases, and test scripts.
- Extensive experience in implementing data cleanup procedures, transformation scripts, triggers, and stored procedures, and in executing test plans for loading data successfully into the targets.
- Experienced in working with scheduling tools such as Autosys, Control-M, Informatica Scheduler, and cron.
- Experience in UNIX shell scripting, CRON, FTP and file management in various UNIX environments.
- Strong understanding of Data warehouse project development life cycle. Expertise in documenting all the phases of DWH projects.
- Demonstrated ability to work and communicate effectively with both business and technical audiences.
- Excellent analytical, programming, written and verbal communication skills with ability to interact with individuals at all levels.
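The BTEQ scripting mentioned above typically runs from a UNIX shell wrapper. The sketch below is illustrative only: the logon string, database, and table names are hypothetical placeholders, not details from any actual engagement.

```shell
#!/bin/sh
# Hypothetical BTEQ wrapper: build the script, run it where the client
# exists, and surface a failure. All names below are placeholders.

BTEQ_SCRIPT=/tmp/load_check.bteq

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.SET ERROROUT STDOUT;

/* Fail the job if the staging load left zero rows behind */
SELECT COUNT(*) FROM staging_db.customer_stg
HAVING COUNT(*) > 0;
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
EOF

# Run BTEQ only where the Teradata client is installed
if command -v bteq >/dev/null 2>&1; then
    bteq < "$BTEQ_SCRIPT" || echo "BTEQ load check failed"
fi
```

Keeping the BTEQ body in a quoted here-document avoids shell expansion inside the SQL, which is a common convention for such wrappers.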
Informatica: Informatica PowerCenter 9.x/8.x/7.x/6.x, Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer, Worklet Designer, mappings, reusable transformations, mapplets, shortcuts, sessions, workflows, User Defined Functions
Data Modeling: Dimensional Modeling, Star Schema, Snowflake Schema, Fact Tables, Dimension Tables, Erwin and Microsoft Visio
Databases: Oracle 11g/10g/9i, Teradata, MS SQL Server, MS Access, SQL, PL/SQL, DB2
Languages: C, C++, HTML, PHP
Job Scheduling: Tivoli, Control-M
Environment: UNIX (Sun Solaris, HP, Linux), Windows
Utilities & Tools: SQL Loader, TOAD 9.5
Others: Microsoft Word, Microsoft Excel
Confidential, Milwaukee, WI
Informatica Developer/ Tester
- Involved in the analysis, design, coding, and testing of various applications from vendors such as Coram and MEDD.
- Developed and used the Data Masking transformation across applications to mask sensitive data.
- Developed ETL mappings to mask sensitive data from various sources (Oracle, SQL Server, Teradata, DB2, and flat files).
- Deployed reusable transformation objects such as mapplets and User Defined Functions to avoid duplication of metadata, reducing the development time.
- Created non-reusable and reusable sessions in Workflow Designer and Task Developer and defined schedules for sessions.
- Created an address dictionary and SSN data used to mask the PII data of customers and prescribers.
- Used Informatica Power Center to create mappings, sessions and workflows for populating the data into dimension, fact, and lookup tables simultaneously from different source systems.
- Used Debugger to test the mappings and fixed the bugs.
- Tuned the performance of mappings and sessions by resolving source and target bottlenecks, and implemented pipeline partitioning.
- Worked with DBA for partitioning and creating indexes on tables used in source qualifier queries.
- Involved in unit testing and documentation of the ETL process.
- Maintained legacy applications that mask PHI and PII data, and automated several of them.
- Extensively used Toad for executing SQL scripts and worked on SQL to enhance the performance of the conversion mappings.
- Used PL/SQL procedures in Informatica mappings to truncate data in target tables at run time.
- Designed workflows with many sessions using Decision, Assignment, Event-Wait, and Event-Raise tasks and email notifications.
- Created a list of inconsistencies in the data load for the client to review and correct on their side.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
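Informatica's Data Masking transformation handled the database sources in this role; for flat-file feeds, the same idea can be sketched in plain shell. The record layout and masking format below are illustrative assumptions, not the actual Coram/MEDD file formats.

```shell
#!/bin/sh
# Minimal flat-file masking sketch: replace SSNs (NNN-NN-NNNN) with a
# fixed-format token, keeping the last four digits for reconciliation.
# The sample record layout is hypothetical.

mask_ssn() {
    sed -E 's/[0-9]{3}-[0-9]{2}-([0-9]{4})/XXX-XX-\1/g'
}

printf 'John,123-45-6789,Milwaukee\n' | mask_ssn
# prints: John,XXX-XX-6789,Milwaukee
```

Preserving the last four digits is one common masking design; a stricter policy would replace the full value or substitute a consistent surrogate from a masking dictionary.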
Environment: Informatica PowerCenter 9.6 HF4, 9.6, Oracle 11g/10g, UNIX, Win XP, SQL*Plus, DB2 8.x, Control-M, Putty, FileZilla FTP client, Teradata SQL Assistant.
Confidential, Charlotte, NC
Informatica Developer/ Tester
- Analyzed business requirements and worked closely with various application and business teams to develop ETL procedures consistent across all applications and systems.
- Interacted with Business Users and Managers in gathering business requirements.
- Wrote Informatica ETL design documents, established ETL coding standards, and performed Informatica mapping reviews.
- Extensively used Teradata utilities such as FastLoad and MultiLoad to load data into the target database.
- Extensively worked on PowerCenter client tools such as PowerCenter Designer, Workflow Manager, and Workflow Monitor.
- Extensively worked on PowerCenter 9.x Designer client tools such as Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
- Analyzed source data coming from different sources (Oracle and SAP flat files) and developed ETL mappings.
- Created Parameter files and validation scripts.
- Created and used custom load scripts invoked from Command tasks.
- Created process control and metadata for Informatica jobs.
- Created reusable and non-reusable Command tasks in Workflow Manager.
- Created sessions, Command tasks, reusable worklets, and workflows in Workflow Manager.
- Involved in providing 24/7 production support to resolve critical issues.
- Responsible for the Performance tuning at the Source Level, Target Level, Mapping Level and Session Level.
- Assisted with peer code reviews and testing of development team's T-SQL code.
- Performed Unit testing and Data validation testing using Validation scripts.
- Demonstrated accountability through professional documentation and weekly status reports.
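The parameter files mentioned in this role generally follow PowerCenter's sectioned text format, scoping parameter values to a folder, workflow, and session. A minimal sketch, in which the folder, workflow, session, connection, and file names are all hypothetical:

```ini
[SALES_DW.WF:wf_daily_load.ST:s_m_load_customer]
$DBConnection_Source=ORA_SRC_PROD
$DBConnection_Target=TD_TGT_PROD
$$LoadDate=2015-06-30
$InputFile_Customer=/data/inbound/customer_20150630.dat
```

Session parameters (single `$`) and mapping parameters/variables (double `$$`) are resolved when the Integration Service starts the session, which lets one set of mappings serve multiple environments.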
Environment: Informatica Power Center 9.1, Oracle 11g/10g, UNIX, Win XP, SQL * Plus, Transact-SQL, Control-M, Putty, BTEQ.
ETL/Informatica Developer/ Tester
- Involved in understanding the architecture design, business processes, and data modeling design.
- Prepared requirement-understanding documents during estimation and planning for the design and implementation of ETL flows per the business flow.
- Working with clients to create technical design documents and field-to-field level mapping spreadsheets.
- Working as a developer creating mappings, sessions, and workflows in Informatica PowerCenter.
- Writing DB SQL scripts using Oracle SQL developer.
- Performing Unit testing of the developed mappings and workflows and documenting the results.
- Developing UNIX shell scripts to invoke Informatica workflows and testing them.
- Performing data population tests against target system to ensure accuracy and quality of data.
- Involved in extensive performance tuning by identifying bottlenecks at various points such as targets, sources, and sessions, leading to better session performance.
- Provided production support for the Executive Management Information System (EMIS), a web reporting portal backed by Informatica and Teradata data sources.
- Created Data Validation document, Unit Test Case Document, Technical Design Document, Informatica Migration Request Document and Knowledge Transfer Document.
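Shell scripts that launch Informatica workflows typically wrap the `pmcmd` command-line client. A hedged sketch, in which the domain, Integration Service, folder, and workflow names are placeholders, and credentials are assumed to arrive via environment variables:

```shell
#!/bin/sh
# Hypothetical pmcmd wrapper: starts a workflow and waits for completion.
# Domain, service, folder, and workflow names are illustrative.

DOMAIN=Domain_ETL
INT_SVC=IS_PROD
FOLDER=SALES_DW
WORKFLOW=wf_daily_load

run_workflow() {
    # INFA_USER / INFA_PASS are expected to come from the environment
    pmcmd startworkflow \
        -sv "$INT_SVC" -d "$DOMAIN" \
        -u "$INFA_USER" -p "$INFA_PASS" \
        -f "$FOLDER" -wait "$WORKFLOW"
}

# Invoke only where the Informatica client is installed
if command -v pmcmd >/dev/null 2>&1; then
    run_workflow || echo "workflow $WORKFLOW failed"
fi
```

The `-wait` flag makes `pmcmd` block until the workflow finishes and return its status, which lets a scheduler such as Control-M key downstream jobs off the exit code.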
Environment: Informatica Power Center 9.5/9.1, Oracle 11g/10g, Teradata 13, UNIX, Win XP, SQL * Plus, Transact-SQL, Control-M, Putty, BTEQ
- Interacted with the Business users to identify the process metrics and various key dimensions and measures. Involved in the complete life cycle of the project.
- Performed extraction, transformation, and loading into the database using stored procedures. Involved in logical and physical modeling of the drugs database.
- Based on the requirements created Functional design documents and Technical design specification documents for ETL.
- Created tables, views, indexes, sequences and constraints.
- Developed stored procedures, functions and database triggers using PL/SQL according to specific business logic.
- Transferred data to the database using SQL*Loader.
- Developed FRD (Functional requirement Document) and data architecture document and communicated with the concerned stakeholders. Conducted Impact and feasibility analysis.
- Worked on dimensional modeling to design and develop Star schemas by identifying facts and dimensions. Designed logical models per business requirements using Erwin.
- Designed and developed ETL mappings using transformation logic to extract data from various source systems.
- Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.
- Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading.
- Developed processes on both Teradata and Oracle using shell scripting and RDBMS utilities such as MultiLoad, FastLoad, FastExport, and BTEQ (Teradata) and SQL*Plus and SQL*Loader (Oracle).
- Created, Tested and debugged the Stored Procedures, Functions, Packages, Cursors and triggers using PL/SQL developer.
- Used EXPLAIN PLAN to find bottlenecks in a given query, improving the performance of the job.
- Involved in unit testing and user acceptance testing to verify that data extracted from different source systems was loaded into the target according to user requirements.
- Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.
- Automated UNIX shell scripts to verify the count of records added every day by the incremental data load for a few of the base tables, to check for consistency.
- Involved in the production and deployment phase to ensure job schedules and dependencies were set up so that daily SLAs were not missed.
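The daily count verification described above can be sketched as a small shell check. To keep the sketch self-contained, the row count here comes from a flat extract file rather than a live database query, and all file paths and sample data are illustrative:

```shell
#!/bin/sh
# Sketch of a daily incremental-load consistency check: compare today's
# row count against yesterday's stored count and flag any decrease.
# Paths and seeded data are hypothetical.

COUNT_FILE=/tmp/customer_prev_count
EXTRACT=/tmp/customer_extract.dat

# Seed sample data so the sketch runs standalone
printf 'row1\nrow2\nrow3\n' > "$EXTRACT"
echo 2 > "$COUNT_FILE"

today=$(wc -l < "$EXTRACT" | tr -d ' ')
prev=$(cat "$COUNT_FILE")

if [ "$today" -lt "$prev" ]; then
    echo "ALERT: row count dropped from $prev to $today"
    STATUS=1
else
    echo "OK: $((today - prev)) row(s) added since last load"
    STATUS=0
fi

# Persist today's count for tomorrow's comparison
echo "$today" > "$COUNT_FILE"
```

In a real environment the count would come from a `SELECT COUNT(*)` via BTEQ or SQL*Plus, and the alert branch would page the on-call support rotation instead of just echoing.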
Environment: Informatica PowerCenter 8.6, Business Objects XI, Oracle 11g/10g, SQL Server, XML files, TOAD, SQL, PL/SQL, Windows XP, UNIX.