- 8+ years of experience in Information Technology with a strong background in Data Warehousing, Business/Data Analysis, ETL design, development, and application support.
- Extensively worked on projects with Informatica PowerCenter, Teradata, and reporting tools such as Cognos, Business Objects, OBIEE, and Tableau.
- Worked in various domains like Utilities, Pharmaceuticals, Trading, Finance and Government.
- Extensive experience in Data Warehousing, Data Modeling, Data Integration, Data Migration, ETL processes, Business Intelligence migration, enhancement, testing, and production support projects, and SQL Server Integration Services (SSIS).
- Expertise in Informatica: extensive experience designing and developing complex Mappings, Mapplets, Transformations, Workflows, and Worklets, and scheduling Workflows and sessions.
- Full Software Development Life Cycle (SDLC) experience including Analysis, Design and Review of Business and Software Requirement Specifications; Development and Testing as per the SDLC waterfall methodology.
- Requirement Gathering and Data Analysis of all supporting systems. Designed Informatica mappings and data flows.
- Expertise in SQL performance tuning using Query Plan Analysis, HINTS and query cost optimization.
- Deep understanding of the Data Warehousing SDLC and architecture of ETL tools.
- Created Datamaps, Registrations, and Extractions in PowerExchange for Datacom using the PWX Navigator.
- Experience in real-time processing using PowerExchange for Datacom.
- Strong understanding of Data Modeling (relational, dimensional, Star and Snowflake schemas), data analysis, and Data Warehousing implementations on Windows and UNIX.
- Developed mappings in Informatica to load the data from various sources into the Data warehouse using different transformations like Source Qualifier, Expression, Lookup, Aggregate, Update Strategy and Joiner.
- Extensive experience developing Stored Procedures, Functions, Views, Triggers, and complex SQL queries. Also loaded large files using SQL*Loader.
- Scheduled jobs using cron and the built-in Informatica scheduler.
- Experience resolving issues and fixing bugs, monitoring Informatica sessions, and performance tuning mappings and sessions.
- Experience with Data Cleansing, Data Profiling and Data analysis.
- UNIX Shell Scripting, SQL and PL/SQL coding.
- Database / ETL Performance Tuning: broad experience in database development including effective use of database objects, SQL Trace, Explain Plan, different types of optimizers, Hints, Indexes, Table Partitions, Sub-Partitions, Materialized Views, Global Temporary Tables, Autonomous Transactions, Bulk Binds, and Oracle built-in functions.
- Performance tuning of Informatica mappings and workflows, point-to-point integration, and interfaces with legacy systems.
- Dimensional Data Modeling experience using Erwin 4.5/7.3 and the Ralph Kimball approach: Star/Snowflake modeling, data marts, OLAP, fact and dimension tables, physical and logical data modeling, and Microsoft Visio.
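As a sketch of the SQL*Loader bulk-loading work mentioned above, the script below generates a minimal control file for a direct-path load. The table, file, and column names are illustrative placeholders, not taken from any actual project.

```shell
#!/bin/sh
# Hypothetical example: bulk-load a pipe-delimited flat file into a
# staging table with SQL*Loader. STG_CUSTOMER, customers.dat, and the
# column list are placeholders for illustration.

cat > customers.ctl <<'EOF'
LOAD DATA
INFILE 'customers.dat'
APPEND INTO TABLE STG_CUSTOMER
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
(CUST_ID, CUST_NAME, CREATED_DT DATE 'YYYY-MM-DD')
EOF

# On a host with an Oracle client, a direct-path load for large files
# would then be invoked as:
#   sqlldr userid=$DB_USER/$DB_PASS control=customers.ctl direct=true log=customers.log
echo "control file written: customers.ctl"
```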
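The cron-based workflow scheduling mentioned above typically wraps Informatica's `pmcmd` command-line client in a shell script. The sketch below uses placeholder service, folder, and workflow names, and only prints the command (a dry run) rather than executing it.

```shell
#!/bin/sh
# Hypothetical cron wrapper for launching an Informatica workflow via
# pmcmd. Service, domain, folder, and workflow names are assumptions
# for illustration; credentials come from environment variables.

INFA_SERVICE="IS_DEV"            # integration service name (assumed)
INFA_FOLDER="SALES_DW"           # repository folder (assumed)
WORKFLOW="wf_load_daily_sales"   # workflow to run (assumed)

CMD="pmcmd startworkflow -sv $INFA_SERVICE -d Domain_Dev -u \$INFA_USER -p \$INFA_PASS -f $INFA_FOLDER -wait $WORKFLOW"

# Dry run: print the command instead of executing it.
echo "$CMD"

# A matching crontab entry for a nightly 1:30 AM run might look like:
#   30 1 * * * /opt/etl/bin/run_daily_load.sh >> /var/log/etl/daily_load.log 2>&1
```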
ETL Tools: Ascential DataStage 7.1/7.5/7.5.1/8.1, IBM InfoSphere DataStage 8.x, QualityStage, Informatica PowerCenter 7.x/8.x/9.x, Informatica Data Quality, DIH
Programming Language: C, SQL, PL/SQL
Databases: Oracle 11g/10g/9i/8i/8.0/7, MS SQL Server 2000/2005, Sybase IQ Adaptive Server, DB2 8.0/7.0/6.0, Teradata, Netezza
Business Intelligence Tools: OBIEE, Crystal Reports, MSBI, Information Analyzer, Analytic Server Data Modeling Tool, BI, Business Objects (BO)
Data Modeling Tools: Erwin, Oracle Designer, Power Designer, Tableau (visualization)
GUI: Oracle D2K with Forms 3.0, Reports 3.0; TortoiseSVN, GitHub
Operating Systems: Windows XP Professional, Windows 95/98/2000, Windows NT 4.0, UNIX (AIX/Solaris)
Confidential - Seattle, WA
Sr. ETL Informatica Developer
- Gathered requirements from the business and converted them into technical documents, working as a member of an agile cross-functional Scrum team.
- Implemented and maintained ETL jobs using Informatica PowerCenter and BI/BO.
- Created UNIX shell scripts for database connectivity and ran loads via the Tidal job scheduler.
- Worked closely with business analysts on data-related issues; modified and created stored procedures in Teradata, creating new tables as per requirements.
- Created ETL mappings, sessions, and workflows using Informatica PowerCenter to move data from multiple sources (XML, DB2, SQL Server, Oracle) into a common target Enterprise Data Warehouse on IBM WebSphere.
- Promoted jobs from Development to Test, and coordinated promotion to Production.
- Worked on a single application's business needs, developed code based on data analysis, and fixed production bugs.
- Improved existing UNIX scripts to handle global variables and updated new code automations.
- Worked in Informatica PowerCenter to develop, migrate code to Test, and debug job failures in multiple environments.
- Worked on FastExport and TPT scripts to import and export large amounts of data.
- Extensively worked in the Teradata database on data-related issues and modified SQL scripts accordingly.
- Prepared implementation documents with all required details and created change requests to promote changes to higher environments.
- Performed Smoke Testing, GUI-Testing, Functional-Testing, Backend Testing, System Integration Testing, Sanity Testing, and User Acceptance Testing (UAT)
Environment: Informatica PowerCenter 9.6, Oracle 10g, Teradata, UNIX/Linux, JIRA, Tidal, Teradata SQL Assistant, ServiceNow, Oracle Data Integrator (ODI) 11g, IBM WebSphere
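A minimal sketch of the FastExport work described above: a shell step that generates a Teradata FastExport script. The database, table, columns, and logon details are hypothetical placeholders; on a Teradata client host the generated script would be fed to `fexp`.

```shell
#!/bin/sh
# Sketch of generating a Teradata FastExport script for a large extract.
# EDW.ORDERS, the column list, and the logon string are placeholders.

EXPORT_FILE="orders_extract.out"

cat > orders.fx <<EOF
.LOGTABLE utillog.orders_fexp_log;
.LOGON tdprod/\$TD_USER,\$TD_PASS;
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE $EXPORT_FILE MODE RECORD FORMAT TEXT;
SELECT ORDER_ID, ORDER_DT, AMOUNT
FROM EDW.ORDERS
WHERE ORDER_DT >= DATE '2015-01-01';
.END EXPORT;
.LOGOFF;
EOF

# On a host with Teradata utilities installed:  fexp < orders.fx
echo "FastExport script written: orders.fx"
```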
Sr. ETL Informatica Developer
- Performed the migration of mappings from the DataStage tool to the Informatica ETL tool, and visualization using Tableau.
- With limited guidelines and no documentation available, studied the existing DataStage jobs and created specifications for the corresponding mappings.
- Developed the corresponding low-, medium-, and high-complexity mappings, addressing defects found in the old processes.
- Extracted data from heterogeneous sources such as Oracle, DB2, XML, and flat files; performed data validation and cleansing in the staging area; then loaded into the Teradata data warehouse using Teradata utilities and Informatica.
- Broke the work into a task list, estimated using a simple/medium/complex methodology in TFS, and assigned tasks to ETL developers.
- Created Triggers and Stored Procedures using PL/SQL.
- Generated Parameter Files for all the mappings using KSH.
- Developed the nightly batch run KSH to update param files using batch id.
- Developed the nightly integrity run to email succinct report to Integrity group.
- Performed the Unit Testing and Integration testing with the help of Integrity queries.
- Validated the results using MicroStrategy reports with both ETL tools.
- Involved in migrating the newly developed Informatica processes to the QA and PROD environments.
- Created and maintained documentation related to production batch jobs.
Environment: Informatica PowerCenter 8.1.3 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Ascential DataStage, MicroStrategy 7.1.5, Oracle 9i, TOAD 8.6.1, Sun Solaris & Windows NT, Shell Scripting, Tableau, Team Foundation Server (TFS), DB2, XML
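The nightly KSH step that regenerates parameter files with a batch id, as described above, can be sketched as follows. The folder, workflow, and parameter names are illustrative placeholders, and the script is written POSIX-sh compatible so it also runs under ksh.

```shell
#!/bin/sh
# Sketch of a nightly step that regenerates an Informatica parameter
# file stamped with the current batch id. SALES_DW, wf_load_sales, and
# the $$ parameter names are hypothetical examples.

BATCH_ID=$(date +%Y%m%d)          # derive the batch id from the run date
PARAM_FILE="wf_load_sales.param"

cat > "$PARAM_FILE" <<EOF
[SALES_DW.WF:wf_load_sales]
\$\$BATCH_ID=$BATCH_ID
\$\$LOAD_DT=$(date +%Y-%m-%d)
\$\$SRC_DIR=/data/incoming
EOF

echo "parameter file written: $PARAM_FILE"
```

The workflow's session properties would then point at this file path, so each nightly run picks up the freshly stamped `$$BATCH_ID`.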