Informatica Lead Developer Resume
Parsippany, NJ
SUMMARY
- 9 years of overall IT industry experience in data warehousing, OLAP reporting tools, and ETL tools, applying industry best practices and procedures.
- Expert knowledge of Data Warehousing ETL using Informatica Power Center 10.x/9.x/8.x (Designer, Repository Manager, Repository Server Administration Console).
- Experience in Data Warehouse/Data Mart Development Life Cycle and worked on Dimensional modeling of STAR, SNOWFLAKE schema, OLAP, Fact and Dimension tables.
- Experienced in writing UNIX shell scripts/commands for pmcmd and file handling operations like file manipulation, file watch, file pattern search, file archive and purge.
- Good knowledge of data warehouse concepts and principles (Kimball/Inmon): Star Schema, Snowflake, Enterprise Data Vault, SCDs, surrogate keys, normalization/denormalization, and data marts.
- Experience in integration of various data sources with relational databases like Oracle, SQL Server, DB2, and Teradata, and worked on integrating data from flat files.
- Experience and expertise in Operational Data Store (ODS) design and data warehouse/mart design methodologies such as Star Schema, Snowflake, and designing slowly changing dimensions and fact tables. Also, expertise in loading and extracting data from different source/target systems like Oracle, SAP R/3, S/4HANA, Native HANA, Salesforce, web services, XML, flat files, and SQL Server.
- Extensive experience in using various Informatica Designer Tools like Source Analyzer, Mapping Designer, Transformation Developer, Mapplet Designer.
- Comprehensive knowledge and experience in process improvement, normalization/de-normalization, data extraction, data cleansing, data manipulation.
- Extensively worked on Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator, Normalizer, Union, and XML Source Qualifier.
- Experience in extracting, transforming and loading (ETL) data from spreadsheets, database tables and other sources using Data Stage and Data services.
- Highly experienced in developing, designing, reviewing and documenting Informatica work products like Mappings, Mapplet, Shortcuts, Reusable transformations, Sessions, Workflows, Worklets, Schedulers and experienced in using Mapping parameters, Mapping variables, Session parameter files.
- Familiarity with SAP Data Services, Data Provisioning, flowgraphs, calculation views, and stored procedures on SAP HANA.
- Excellent working knowledge of Shell Scripting and job scheduling on platforms like UNIX.
- Extensive experience in error handling and problem fixing using Informatica.
- Extensively worked on data extraction, Transformation and loading data from various sources like Oracle, SQL Server, Teradata and files like Flat files.
- Expertise in implementing complex business rules by creating complex mappings/mapplets, shortcuts, reusable transformations, and partitioning sessions.
- File transfers using SFTP & NDM.
- Experience in troubleshooting by tuning mappings and identifying and resolving performance bottlenecks at various levels such as source, target, mapping, and session.
- Involved in implementing Slowly Changing Dimensions, Star Schema modeling, Snowflake modeling, fact tables, dimension tables, and denormalization.
- Exposure to Informatica B2B Data Exchange, which supports an expanding diversity of customers and partners and their data with capabilities that surpass the usual B2B solutions.
- Exposure to Informatica B2B Data Transformation, which supports transformation of structured, unstructured, and semi-structured data types while complying with the multiple standards that govern the data formats.
- Experience in Oracle 9i/10g/11g, SQL Server, Teradata, SQL*Loader, and DB2.
- Extensive Experience in UNIX (AIX/Solaris/ HP-UX 11.x), and Windows Operating system.
- Extensively used SQL and PL/SQL to write Stored Procedures, Functions, and Packages.
- Developed excellent overall Software Development Life Cycle (SDLC) professional skills by working independently and as a team member to analyze functional/business requirements and to prepare test plans and test scripts. Collaborated with onsite teams and managed various offshore teams.
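The file-watch, archive, and purge operations described above can be sketched as a small shell script. This is a minimal illustration only; the directory names, file pattern, and 30-day retention window are assumptions, not details from any specific project.

```shell
#!/bin/sh
# Illustrative sketch of the file-watch / archive / purge pattern.
# Paths and names are hypothetical.
INBOX=/tmp/etl_inbox
ARCHIVE=/tmp/etl_archive
mkdir -p "$INBOX" "$ARCHIVE"

touch "$INBOX/orders_20240101.csv"   # simulate file arrival for the demo

# File watch: poll for an incoming file (bounded loop for demo purposes).
i=0
while [ $i -lt 10 ]; do
    if ls "$INBOX"/orders_*.csv >/dev/null 2>&1; then
        break
    fi
    sleep 1
    i=$((i + 1))
done

# Archive: compress processed files, then remove the originals.
for f in "$INBOX"/orders_*.csv; do
    [ -e "$f" ] || continue
    gzip -c "$f" > "$ARCHIVE/$(basename "$f").gz" && rm "$f"
done

# Purge: drop archived files older than 30 days.
find "$ARCHIVE" -name '*.gz' -mtime +30 -exec rm {} \;
echo "archived: $(ls "$ARCHIVE" | wc -l) file(s)"
```

In practice a script like this would run as a pre-session step or a scheduler job, with `pmcmd startworkflow` invoked once the expected file lands.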
TECHNICAL SKILLS
ETL Tools: Informatica Power Center 10.2/10.1/9.6.1/9.1/8.x, Informatica Power Exchange 10.2/9.5.1, Informatica IDQ.
Databases: Oracle 11g/10g/9i/8i, SQL Server 2000/2005/2008, Teradata, DB2.
Languages: SQL, T-SQL, PL/SQL, UNIX Shell Scripting, Windows Batch Scripting.
Operating Systems: UNIX, Linux, Windows NT/2000/2003/XP/7.
Business Intelligence Tools: COGNOS, Business Objects (BO).
DB Tools: TOAD, SQL*Plus, PL/SQL Developer, SQL*Loader, DBeaver
Web Technologies: HTML, XML, Java Script
Data Modeling Tools: Erwin, Embarcadero Studio
Other Tools: WinSCP, PuTTY, JIRA, MS Office, AutoSys, Tidal, ServiceNow, and Control-M.
PROFESSIONAL EXPERIENCE
Confidential, Parsippany, NJ
Informatica Lead Developer
Responsibilities:
- Worked on various applications such as Informatica Power Exchange CDC, IBM CDC, Informatica Power Center, and ServiceNow.
- Built critical design flows from different sources to one target, including CDC (Change Data Capture), ETL, and SQL replication.
- Designed and developed CDC setup for Oracle, DB2, and SQL Server sources.
- Created critical mappings to handle huge data loads from multiple sources (Oracle, DB2, SQL Server) to target.
- Designed the staging table structures in Oracle and HANA to load the data.
- Created Power Exchange table registrations through the Power Exchange Navigator and imported the tables as sources and targets.
- Worked on migration from data warehouse to Snowflake and re-platforming the ETL to Informatica Cloud IICS.
- Scheduled and monitored jobs using the Data Services Management Console.
- Involved in Informatica upgrade project from Informatica 9.6 to 10.2 version.
- Created data flow designs for ETLs in the Control-M job scheduling tool.
- Worked on creating multiple Change Management Requests in ServiceNow.
- Worked closely with business team to finalize on the validation rules that need to be applied outside Informatica ETLs.
- Migrated data from legacy systems SQL server 2000, AS400 to Snowflake and SQL server.
- Extensive experience in error handling and problem fixing using Informatica.
- Developed excellent overall software Development life cycle (SDLC) professional skills by working independently and as a team lead to analyze the Functional/ Business requirements and to prepare test plans, test scripts.
- Developed processes for Teradata using shell scripting and Teradata RDBMS utilities such as MultiLoad and FastLoad.
- Collaborated with onsite teams and managed various offshore teams to deliver the product to the client.
- Having exposure to ETL tools like SAP Data Services.
- Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
- Extensively worked on Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator, Normalizer, Union, and XML Source Qualifier.
- Defined database Data Stores to allow Data Services to connect to heterogeneous source or target databases.
- Experience with Power Exchange Navigator handling registration/extraction groups from different Databases (Oracle, DB2, SQL) and performing row tests to check the data.
- Created technical specification documents/runbooks for each ETL code component and overall activities in exhaustive detail, listing all technical and business validation rules for the production go-live.
- Extensive experience in using SQL, functions, complex joins, aggregate functions, materialized views, indexing in databases like Oracle 10g, MS SQL Server, Netezza and HANA.
- Highly experienced in developing, designing, reviewing and documenting Informatica work products like Mappings, Mapplet, Shortcuts, Reusable transformations, Sessions, Workflows, Worklets, Schedulers and experienced in using Mapping parameters, Mapping variables, Session parameter files.
- Creating/updating Unix/Linux scripts and creating/updating Control-M, Autosys files for new/existing jobs
- Resolving recurring issues by finding the root cause for permanent fix
- Performing Production support/maintenance activities like code changes, code review, testing, Performance tuning, creating Change Requests and production deployment
- Performed basic Informatica admin activities like exporting and importing objects, deploying objects to other environments, configuring parameter files, and writing and running scripts to clean up disk space.
- Knowledge of Analysis, Design, Development, Implementation, Deployment and maintenance of Business Intelligence applications.
- Involved in gathering, analyzing, and documenting business requirements, functional requirements and data specifications for Business Objects Universes and Reports.
Environment: HANA, Informatica Power Center 10.2, Informatica Data Quality (IDQ), Informatica Power Exchange 10.2, Oracle 11g, MS SQL Server 2016, SQL, PL/SQL, SQL*Plus, Control-M, Windows 10, UNIX.
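The disk clean-up scripting mentioned in this role can be sketched as below. The log directory layout and 14-day retention window are illustrative assumptions, not details from the project.

```shell
#!/bin/sh
# Hedged sketch of a session-log clean-up script; names are hypothetical.
LOG_DIR=/tmp/infa_sesslogs
RETENTION_DAYS=14
mkdir -p "$LOG_DIR"

# Simulate one stale log (backdated mtime) and one recent log for the demo.
touch "$LOG_DIR/s_m_load_orders.log"
touch -t 202001010000 "$LOG_DIR/s_m_load_orders.log"
touch "$LOG_DIR/s_m_load_customers.log"

# Remove session/workflow logs older than the retention window;
# recent logs are left in place for troubleshooting.
find "$LOG_DIR" -name '*.log' -mtime +"$RETENTION_DAYS" -exec rm {} \;
ls "$LOG_DIR"
```

A script like this would typically run from cron or the site's scheduler against the integration service's log directory.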
Confidential, Miami, FL
Informatica Developer
Responsibilities:
- Extensively Used Informatica Power Center as an ETL tool to extract data from various source systems to Target system.
- Extracted data from multiple operational sources for loading into the staging area and data warehouse using SCD (Type 1 and Type 2) loads.
- Worked closely with business team to finalize on the validation rules that need to be applied outside Informatica ETLs.
- Design matching plans, help determine best matching algorithm, configure identity matching and analyze duplicates using Informatica Data Quality (IDQ).
- Created ETL processes to extract data from SQL, Oracle, Netezza and HANA tables for loading into various relational and non-relational staging areas and develop complex mappings in Informatica to load the data from various sources into the Data Warehouse.
- Created ETL mappings, sessions and workflows using Informatica Power Center to move data from multiple sources like XML, DB2, SQL Server, and Oracle into an enterprise data warehouse on Teradata.
- Used direct and indirect flat files (delimited and fixed-width) as sources of data.
- Identified dimensions and Involved in building Facts and Dimension tables. Created and deployed cubes in Star and Snowflake schema.
- Participated in meetings with business analysts to understand the user requirements.
- Created technical specification document for each workflow in exhaustive detail listing all technical and business validation rules, look-up tables and error messages.
- Created Mappings with Transformations, Sessions and Workflows.
- Created various transformations such as Expression, Lookup, Joiner, Router, Filter, Aggregator and Sequence Generators.
- Experience in Snowflake modeling: roles, schemas, and databases.
- Imported various Application Sources, created Targets and Transformations using Informatica Power Center Designer (Source analyzer, Warehouse developer, Transformation developer, Mapplet designer, and Mapping designer)
- Created and used reusable transformations when required.
- Extensively used mapping parameters, mapping variables and parameter files.
- Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
- Applied partitioning at the session level for mappings that loaded data to the target using a target lookup to avoid duplicate records.
- Configured and ran the Debugger from within the Mapping Designer to troubleshoot predefined mappings.
- Worked with the DBA team to resolve database-related tasks like deployment of table scripts and developing procedures to create and drop table indexes.
- Worked with Informatica administrator to deploy the code from development to QA and production.
- Experience with moving data from SQL Server to Oracle databases.
Environment: HANA, Informatica Power Center 10.1, Informatica Power Exchange 10.0, Oracle 11g, MS SQL Server 2012, SQL, PL/SQL, SQL*Plus, TOAD, Windows 7, UNIX
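The mapping parameters and session parameter files used extensively in this role follow Power Center's parameter file format. A minimal fragment for illustration; the folder, workflow, session, connection, and parameter names are all hypothetical:

```
[DEV_EDW.WF:wf_daily_stage_load.ST:s_m_stage_orders]
$$LOAD_DATE=2020-01-01
$$SOURCE_SYSTEM=ORDERS
$DBConnection_Src=Conn_Ora_Src
$InputFile_Orders=/data/inbound/orders.csv
```

The section header scopes the values to one session within a workflow; mapping parameters and variables take the `$$` prefix, while built-in session parameters such as connection and source-file overrides take a single `$`.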
Confidential, Miami, FL
ETL Developer
Responsibilities:
- Design, develop, maintain, monitor and execute production ETL processes using Informatica Power Center.
- Extensively used Informatica Power Center to extract data from various sources and load it into the staging database.
- Investigate production ETL issues / problems. Verify and validate ETL deliverables.
- Translate concepts to requirements, and development into an automated production process.
- Act as a subject matter expert and resource for others on assigned ETL processes.
- Designed and developed IDQ mappings for address validation/cleansing, doctor master data matching, data conversion, exception handling, and exception-data reporting.
- Built the Physical Data Objects and developed various mappings and mapplets/rules using Informatica Data Quality (IDQ), based on requirements, to profile, validate, and cleanse the data. Identified and eliminated duplicate datasets and performed column, primary key, and foreign key profiling using IDQ.
- Complete projects and development activities timely and accurately while following the System Development Life Cycle (SDLC).
- Suggest changes and enhancements for ETL processes.
- Worked with various transformations like Source Qualifier, Lookup, Stored Procedure, Sequence Generator, Router, Filter, Aggregator, Joiner, Expression, and Update Strategy.
- Helped the team in analyzing the data to be able to identify the data quality issues.
- Created Workflow, Worklets and Tasks to schedule the loads at required frequency using Workflow Manager.
- Extensively worked with aggregate functions like Avg, Min, Max, First, Last, and Count in the Aggregator Transformation.
- Extensively used SQL Override, Sorter, and Filter in the Source Qualifier Transformation.
- Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner Transformation.
- Document hours and development activities by following SDLC and IS Change Management guidelines
- Scheduled Informatica workflows using power center Scheduler
- Used SQL tools like TOAD to run SQL queries and validate the data in warehouse and mart
- Involved in debugging mappings, recovering sessions and developing error-handling methods
- Resolved memory related issues like DTM buffer size, cache size to optimize session runs
- Used Session Parameters to increase the efficiency of the sessions in the Workflow Manager
- Wrote complex SQL scripts to avoid Informatica Look-ups to improve the performance as the volume of the data was heavy.
- Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length, and target-based commit interval.
- Used Unix Shell Scripts to automate pre-session and post-session processes
- Developed workflow tasks like reusable Email, Event wait, Timer, Command, Decision
- Created test cases for unit testing, system integration testing, and UAT to check data quality.
- Worked cooperatively with the team members to identify and resolve various issues relating to Informatica and other database related issues.
- Developed complex reports that involved multiple data providers, master/detail charts, complex variables and calculation contexts.
Environment: Informatica Power Center 10.1/9.6.1, Informatica Power Exchange 10.0, Oracle 11g, MS SQL Server 2012, SQL, PL/SQL, T-SQL, SQL*Plus, TOAD, AutoSys, Windows 7, Sybase, Teradata, UNIX
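The pre-session shell automation mentioned in this role typically validates inputs before a session runs; a non-zero exit from a pre-session Command task fails the task and stops the load. A hedged sketch, with a hypothetical file path:

```shell
#!/bin/sh
# Sketch of a pre-session check: verify the source file exists and is
# non-empty before the session starts. The path is an illustrative example.
SRC_FILE=/tmp/etl_src/claims.dat
mkdir -p /tmp/etl_src
printf 'row1\nrow2\n' > "$SRC_FILE"   # simulate the delivered file

if [ ! -s "$SRC_FILE" ]; then
    echo "ERROR: source file missing or empty: $SRC_FILE" >&2
    exit 1    # non-zero exit fails the pre-session Command task
fi
echo "pre-session check passed: $(wc -l < "$SRC_FILE") rows"
```

A matching post-session script would usually archive the processed file and emit a status notification.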
Confidential, Reston, VA
Informatica Developer
Responsibilities:
- Conducted JAD sessions with business users and SMEs for a better understanding of the reporting requirements.
- Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to data marts.
- Developed high-level and detailed technical and functional documents, consisting of detailed design documentation and functional test specifications with use cases and unit test documents.
- Analyzed source systems and worked with business analysts to identify, study, and understand requirements and translate them into ETL code.
- Handled technical and functional calls across the teams.
- Responsible for the Extraction, Transformation and Loading (ETL) Architecture & Standards implementation.
- Responsible for offshore Code delivery and review process
- Used Informatica to extract data from DB2, UDB, XML, Flat files and Excel files to load the data into the Warehouse.
- Worked in all phases of Data Integration from heterogeneous sources, legacy systems to Target Database.
- Worked on Informatica Power Center tool - Source Analyzer, Warehouse designer, Mapping and Mapplet Designer, Transformations, Informatica Repository Manager, Informatica Workflow Manager and Workflow Monitor.
- Designed and developed complex Type 1 and Type 2 mappings for the stage to integration layer in the project.
- Involved in Design Review, code review, test review, and gave valuable suggestions.
- Worked with different Caches such as Index cache, Data cache, Lookup cache (Static, Dynamic and Persistence) and Join cache while developing the Mappings.
- Created partitions for parallel processing of data and also worked with DBAs to enhance the data load during production.
- Performance tuned Informatica session, for large data files by increasing block size, data cache size, and target-based commit.
- Involved in writing a procedure to check the up-to-date statistics on tables.
- Used Informatica command task to transfer the files to bridge server to send the file to third party vendor.
- Took part in migration of jobs from UIT to SIT and to UAT.
- Created FTP scripts and Conversion scripts to convert data into flat files to be used for Informatica sessions.
- Involved in Informatica Code Migration across various Environments.
- Performed troubleshooting on the load failure cases, including database problems.
Environment: Informatica Power Center 9.1, Oracle 11g, MS SQL Server 2008, Teradata, HP Quality Control.
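The conversion scripts mentioned in this role, which prepare flat files for Informatica sessions, might look like this fixed-width-to-delimited sketch. The field widths, column names, and file paths are assumptions for illustration:

```shell
#!/bin/sh
# Illustrative conversion script: turn a fixed-width extract into a
# pipe-delimited flat file. Layout: id (4 chars), name (10), state (2).
mkdir -p /tmp/etl_conv
printf 'A001John      NY\nA002Jane      VA\n' > /tmp/etl_conv/cust_fixed.dat

awk '{
    id    = substr($0, 1, 4)
    name  = substr($0, 5, 10)
    state = substr($0, 15, 2)
    gsub(/ +$/, "", name)          # trim trailing padding from the name
    print id "|" name "|" state
}' /tmp/etl_conv/cust_fixed.dat > /tmp/etl_conv/cust_delim.dat

cat /tmp/etl_conv/cust_delim.dat
```

The delimited output would then be staged for an Informatica session, or handed to an FTP/SFTP step for transfer to the bridge server.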