Senior ETL Informatica Developer Resume Profile
NJ
Professional Summary:
- 8 years of IT experience in data warehousing, Teradata, Informatica ETL and data quality tools, IBM InfoSphere DataStage, and UNIX, working with relational databases such as Oracle, DB2, SQL Server, and Teradata, with emphasis on business requirements, analysis, design, development, and testing of client/server data warehouse systems.
- Strong understanding of and experience in the SDLC and project models including Waterfall, JAD/RAD, ILM, and Agile.
- Excellent experience in ETL design using tools such as Informatica PowerCenter 9.x/8.x/7.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager), PowerExchange, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), Metadata Manager, Informatica MDM, and other products in the Informatica suite.
- Experience in creating complex Informatica mappings, mapplets, sessions, worklets, and workflows to extract, transform, and load data from various sources using transformations such as Source Qualifier, Lookup, Joiner, Sorter, Aggregator, Update Strategy, Filter, Router, Normalizer, Java, Stored Procedure, Web Services, Transaction Control, and XML Parser/Generator.
- Strong understanding of data warehouse concepts and dimensional modeling with Star and Snowflake schemas.
- Experience in all phases of data modeling (conceptual, logical, and physical) and hands-on experience with ERwin, using forward engineering and reverse engineering for operational as well as analytical systems.
- Hands-on experience with Informatica Analyst and Informatica Developer performing column profiling, join analysis, primary key analysis, cleansing, match/merge, standardization, and creating rules, scorecards, and reference tables.
- Experience with Informatica multidomain Master Data Management (MDM) in creating data models, base objects, landing and staging tables, creating match and merge rule sets, consolidation, and creating the Best Version of Truth (BVT).
- Experience in SQL query tuning and optimization using EXPLAIN plan, Collect Statistics in both Oracle and Teradata.
- Experience in using Change Data Capture (CDC) strategies for incremental loading to capture delta records.
- Hands-on experience in designing slowly changing dimensions (SCD Types 1, 2, and 3).
- Experience in performance tuning of Informatica sessions using session partitioning techniques such as round-robin, pass-through, key-range, hash auto keys, hash user keys, and database partitioning, and in optimizing the usage of Joiner and Lookup caches (persistent and dynamic cache).
- Teradata 14 certified professional as recognized by Teradata Corporation, with a deep understanding of Teradata concepts.
- Strong DDL and DML writing skills and experience writing complex SQL for data analysis using advanced SQL functions such as PARTITION BY, RANK, DENSE_RANK, ROW_NUMBER, PIVOT, and LISTAGG (see the example following this summary).
- Experience in writing complex SQL, PL/SQL procedures, functions, triggers, and materialized views.
- Experience in using SQL*Loader for conventional-path and direct-path loading into Oracle tables.
- Proficient in extracting data from and loading data to various RDBMSs such as Oracle 11g/10g/9i/8i, MS SQL Server 2005/2008/2012, Teradata 13/12/V2R5, Netezza, and DB2 UDB 9.x/8.x, as well as XML and flat files.
- Experience in designing Informatica ETL mappings to load data into Teradata tables, using ODBC connections for smaller loads and external loader connections (TPT, FastLoad, MultiLoad) for large loads.
- Profound knowledge of Teradata's MPP architecture, nodes, AMPs, hash maps, parallelism, and scalability.
- Experience in active data warehousing on the Teradata platform and hands-on experience with Teradata utilities such as Teradata Parallel Transporter (TPT), TPump, FastLoad (FLOAD), MultiLoad (MLOAD), FastExport, and BTEQ, as well as collect stats, index optimization, and EXPLAIN plans.
- Expertise in identifying Teradata skewed redistributions, join order, optimizer statistics, and physical design considerations (UPI/NUPI/USI/NUSI/Join Index/Sparse Index, etc.).
- Strong experience working in data migration and conversion projects.
- Experience with UNIX shell scripting (Bash, Korn), writing shell scripts to automate data flows, create batch processes, and invoke external loaders and utilities.
- Experience in performing data lineage using Informatica Metadata Manager.
- Experience in working with various job scheduling tools such as Autosys, Control-M, TWS, and Informatica Scheduler.
- Expertise in writing the required documentation per company standards for ETL processes, developing unit test plans and test cases, and preparing migration/deployment documents.
- Excellent communication, presentation, and project management skills; a good team player and self-starter with the ability to work independently and as part of a team.
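A brief, hypothetical example of the analytic SQL referenced above (table and column names are illustrative assumptions), ranking orders by amount within each region:

```sql
-- Hypothetical example only: table and column names are assumptions.
SELECT region,
       customer_id,
       order_amt,
       RANK()       OVER (PARTITION BY region ORDER BY order_amt DESC) AS amt_rank,
       DENSE_RANK() OVER (PARTITION BY region ORDER BY order_amt DESC) AS amt_dense_rank,
       ROW_NUMBER() OVER (PARTITION BY region ORDER BY order_amt DESC) AS amt_row_num
  FROM orders;
```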
Technical Skills:
BI/ETL Tools | Informatica PowerCenter 9.5.1/9.1/8.x/7.1, Informatica Data Quality (IDQ), Informatica Metadata Manager, PowerExchange/Connect, Informatica Master Data Management (MDM), IBM InfoSphere DataStage |
Databases | Oracle 11g/10g/9i/8i, Teradata V2R5/R6/R12/R13, SQL Server 2008/2005, IBM Netezza, DB2 9.5 |
Loaders | Teradata FastLoad (FLOAD), MultiLoad (MLOAD), FastExport, TPump, TPT, BTEQ, SQL*Loader |
Languages | ANSI SQL, T-SQL, PL/SQL, Stored Procedures, UNIX Shell Scripts, Java, XML |
Data Modeling | ERwin r7.3/4/3.5, MS Visio 2010/2007 |
Other Tools | SQL*Plus, Teradata SQL Assistant, Oracle SQL Developer, PL/SQL Developer, Web Services, MS Office, MS Visio, TOAD, FTP, SVN, PuTTY, WinSCP, HP QC |
Operating Systems | Windows NT/2000/03/XP/Vista/7, Linux (Red Hat), UNIX (AIX 5.0/5.2/6.0), Solaris |
Mainframe | MVS, z/OS, Natural 4GL, COBOL, JCL, TSO, ISPF, DFSORT/SYNCSORT |
Scheduler Tools | Autosys, Control-M, CA7, Informatica Scheduler, DataStage Director |
Professional Experience:
Confidential
Senior ETL Informatica Developer
Responsibilities:
- Designed various mappings using transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Stored Procedure, Sorter, Lookup, Aggregator, and Joiner.
- Captured defects in HP QC and performed fallout analysis and data validation by writing complex SQL queries.
- Created design specifications, unit test plans, and deployment documents for implementation.
- Created Informatica Stored Procedure transformations to collect statistics and analyze large tables.
- Performed data lineage analysis using Informatica Metadata Manager to ensure metadata changes were handled smoothly without disrupting downstream tools.
- Designed Oracle schemas, created relational tables, optimized indexes, and collected statistics on large tables.
- Created an Informatica audit session to capture session start and end times.
- Used IDQ to perform data profiling and data cleansing and to create reference tables and scorecards.
- Created Web Services Consumer transformations to invoke web services for address scrubbing.
- Migrated Informatica code between environments and performed Informatica admin tasks such as folder copies and writing Informatica repository queries to query metadata information.
- Developed SQL programs and stored procedures as required for the ETL processes.
- Implemented recovery and restart strategies for Informatica workflows.
- Implemented business rules in Oracle PL/SQL procedures and functions and ran them in multiple threads.
- Developed slowly changing dimension (SCD) Type 2 logic to capture delta changes and run iterative loads (see the sketch after this list).
- Used Informatica PowerCenter 9.5.1 to extract data from various sources such as flat files, XML files, Oracle tables, SQL Server tables, and DB2 tables, transform it, and load it into Oracle pre-stage and staging tables and ultimately into target databases.
- Ran Informatica workflows in advanced mode with concurrent execution to run multiple instances, corresponding to data from multiple markets, at the same time.
- Developed Informatica mappings, sessions, worklets, and workflows to extract data from a variety of sources.
- Coded UNIX shell scripts and Perl scripts to automate data loading tasks.
- Developed mappings based on Change Data Capture (CDC) techniques.
- Used Oracle SQL Developer, PL/SQL Developer, TOAD, and Teradata SQL Assistant to analyze existing data and design SQL queries for mappings.
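A minimal sketch of the SCD Type 2 delta pattern noted above, assuming a hypothetical CUSTOMER_DIM target, STG_CUSTOMER staging table, and CUSTOMER_DIM_SEQ sequence; in the actual mappings this logic was built with Lookup, Router, and Update Strategy transformations rather than hand-written SQL.

```sql
-- Hypothetical illustration only: table, column, and sequence names are
-- assumptions, not the project's actual schema.

-- 1. Expire the current dimension row when the staging row carries a change.
UPDATE customer_dim d
   SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.status <> d.status));

-- 2. Insert a new current version for changed or brand-new customers.
INSERT INTO customer_dim
       (customer_key, customer_id, address, status,
        eff_start_dt, eff_end_dt, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.status,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM customer_dim d
                    WHERE d.customer_id  = s.customer_id
                      AND d.current_flag = 'Y'
                      AND d.address      = s.address
                      AND d.status       = s.status);
```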
Environment: Informatica Power Center 9.5.1, IDQ, Oracle 11g, DB2, SQL Server, SQL Loader, SQL developer, PLSQL developer, TOAD, Putty, PL/SQL, UNIX Shell Scripting, Flat Files, Control-M, HP QC, FTP
Confidential
Senior ETL Informatica Developer
Responsibilities:
- Performed extensive data analysis along with subject matter experts and designed the ETL data specifications.
- Used Informatica PowerCenter 9.5.1 to extract, transform, and load data into the Oracle data warehouse from various sources such as flat files, XML files, Oracle tables, and DB2 tables.
- Analyzed data from various sources and transformed it according to the business rules.
- Designed various mappings using transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, and Joiner.
- Coded UNIX shell scripts and Perl scripts to automate data loading tasks.
- Used IDQ to perform data quality operations such as data profiling, column profiling, join analysis, developing scorecards, and creating reference tables.
- Developed SQL*Loader control files and designed conventional-path and direct-path loading techniques.
- Designed Oracle schemas, created relational tables, optimized indexes, and collected statistics on large tables.
- Created and scheduled sessions and jobs to run on demand and on schedule using Workflow Manager.
- Ran and monitored complex ETL workflows and sessions.
- Created Command and Email tasks to control the sequence of Informatica workflows and configure failure notifications.
- Developed mappings based on Change Data Capture (CDC) techniques (see the sketch after this list).
- Developed complex SQL and PL/SQL procedures according to business rules.
- Installed and maintained Informatica PowerCenter and PowerExchange on the UNIX platform.
- Used Oracle SQL Developer, TOAD, and Teradata SQL Assistant to analyze existing data and design complex SQL queries.
- Performed column profiling, structural profiling, and dependency profiling using IDE/IDQ.
- Loaded data from relational tables, flat files, XML files, etc. into Oracle and DB2 databases.
- Scheduled jobs for automatic execution at specific times in sequential and parallel modes.
- Documented every aspect of data migration and reported fallouts.
- Addressed defects logged by the business team and issues/tickets raised by the user community, maintaining proper tracking in HP QC.
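A minimal sketch of the timestamp-based CDC pattern referenced above, assuming hypothetical SRC_ORDERS and ETL_BATCH_CONTROL tables; in the mappings, the equivalent filter was driven through a Source Qualifier override and mapping parameters.

```sql
-- Hypothetical illustration only: table and column names are assumptions.
-- ETL_BATCH_CONTROL holds the high-water mark from the previous successful run.
SELECT o.order_id,
       o.customer_id,
       o.order_amt,
       o.last_update_ts
  FROM src_orders o
 WHERE o.last_update_ts > (SELECT c.last_extract_ts
                             FROM etl_batch_control c
                            WHERE c.source_name = 'SRC_ORDERS');

-- After a successful load, advance the high-water mark.
UPDATE etl_batch_control
   SET last_extract_ts = (SELECT MAX(last_update_ts) FROM src_orders)
 WHERE source_name = 'SRC_ORDERS';
```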
Environment: Informatica Power Center 9.5.1, IDQ, Oracle 11g, DB2, SQL Server, SQL Loader, SQL developer, TOAD, Putty, PL/SQL, UNIX Shell Scripting, XML, Flat Files, RPG Legacy Systems, Control-M, HP QC
Confidential
Sr ETL Informatica Developer
Responsibilities:
- Involved in the complete life cycle of developing an enterprise data warehouse application and in developing ETL applications using Informatica.
- Performed extensive data analysis along with subject matter experts, identified source data, and implemented a data cleansing strategy.
- Designed the data model structure and E-R model, with all related entities and the relationships between them, based on the rules provided by the business manager, using ERwin.
- Used Informatica PowerCenter 9.0.1 to extract, transform, and load data into the Oracle data warehouse from various sources such as Netezza, Teradata, and flat files.
- Developed complex mappings and mapplets using Informatica Designer to integrate data from varied sources such as Oracle 10g and flat files and loaded it into the Teradata data warehouse.
- Designed various mappings using transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, and Joiner.
- Created connections to databases such as SQL Server, Oracle, and Netezza, as well as application connections.
- Created Teradata external loader connections such as MLoad Upsert, MLoad Update, FastLoad, and TPump in Informatica Workflow Manager for loading data into the target tables in the Teradata database.
- Created and scheduled sessions and jobs to run on demand and on schedule using Workflow Manager.
- Monitored workflows and sessions using Workflow Monitor.
- Installed and maintained Informatica PowerCenter and PowerExchange on the UNIX platform.
- Maintained warehouse metadata, naming standards, and warehouse standards for future application development.
- Performed SQL tuning to handle large volumes of data (see the sketch after this list).
- Designed the ETL mappings and jobs according to business specifications and mentored developers on the ETL design and architecture.
- Analyzed bugs and the performance of PL/SQL queries and provided solutions to improve them.
- Scheduled jobs for automatic execution at specific times in sequential and parallel modes.
- Documented every aspect of data migration, reporting any problems, missing data, or ambiguity in an exception report.
- Actively involved in production support; implemented fixes/solutions to issues/tickets raised by the user community.
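A minimal sketch of the Teradata SQL tuning workflow referenced above, assuming a hypothetical edw.daily_sales table: inspect the optimizer plan, refresh statistics, and add a NUSI for a frequently filtered column.

```sql
-- Hypothetical illustration only: database, table, and column names are assumptions.
EXPLAIN
SELECT s.store_id,
       SUM(s.sales_amt)
  FROM edw.daily_sales s
 WHERE s.sale_dt BETWEEN DATE '2013-01-01' AND DATE '2013-01-31'
 GROUP BY s.store_id;

-- Refresh optimizer statistics on the filter and grouping columns.
COLLECT STATISTICS ON edw.daily_sales COLUMN (sale_dt);
COLLECT STATISTICS ON edw.daily_sales COLUMN (store_id);

-- NUSI to support reporting queries that filter on store_id.
CREATE INDEX idx_daily_sales_store (store_id) ON edw.daily_sales;
```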
Environment: Informatica Power Center 9.1, Teradata 13, Oracle 10g, Teradata SQL Assistant, PL/SQL, UNIX Shell Scripting, XML, MS Excel, Flat Files, Legacy Systems, Control-M, HP QC
Confidential
Informatica Developer/Technology Analyst
Responsibilities:
- Gathered requirements and created functional and technical specification documents.
- Extracted data from sources such as DB2, Oracle, and fixed-width and delimited flat files; transformed the data according to the business requirements; and loaded it into the Oracle database.
- Modified several of the existing mappings and created several new mappings based on the user requirement.
- Used mapping parameters and variables for incremental loading of data and session parameters for migrating workflows.
- Involved in developing complex ETL mappings in different client locations.
- Validated Informatica mappings for source compatibility after version changes at the source.
- Performed data cleansing and scrubbing before applying transformations.
- Created Mappings using Mapping Designer to load the data from various sources using different transformations like Aggregator, Expression, Stored Procedure, External Procedure, Filter, Joiner, Lookup, Router, Sequence Generator, Source Qualifier, and Update Strategy transformations.
- Created Mapping Parameters and Variables.
- Handled operating system tasks by generating pre- and post-session UNIX shell scripts.
- Created and Scheduled Sessions and Batch Processes based on demand using Informatica Server Manager.
- Maintained existing mappings by resolving performance issues.
- Improved the performance of Informatica jobs by using session partitioning, stage partitioning, and the persistent cache model.
- Participated in weekly end-user meetings to discuss data quality, performance issues, ways to improve data accuracy, and new requirements.
Environment: Informatica Power Center 8.6, DB2 9.5, Oracle 10g, Flat files, Shell Scripts, Mainframe Technologies, SQL, PL/SQL, Third party vendor systems.
Confidential
ETL Informatica Developer
Responsibilities:
- Interacted with Business Analysts to understand the business requirements.
- Involved in developing the conceptual, logical and physical data models using Erwin.
- Involved in staging data from external sources and responsible for moving the data into the warehouse using Informatica.
- Analyzed source data and formulated the transformations to achieve the customer requested reports.
- Created reusable transformations and mapplets and used them in mappings.
- Established reusable components and implemented Informatica best practices and standards.
- Designed and implemented mappings using SCD Type 1, Type 2, and CDC methodologies.
- Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, and Stored Procedure transformations.
- Developed BTEQ scripts to load data from the Teradata staging area to the data warehouse, and from the data warehouse to data marts for specific reporting requirements (see the sketch after this list).
- Created MLOAD, FastLoad, and TPump control scripts to load data into the IDW.
- Optimized high-volume tables, including collection tables, in Teradata using various join index techniques, secondary indexes, join strategies, and hash distribution methods.
- Analyzed data quality and performed data profiling, column profiling, join analysis, and scorecard development.
- Handled Unix operating system tasks by creating Pre and Post-Session UNIX Shell Scripts.
- Coordinated with the middleware team to migrate Informatica objects into the test environment.
- Produced thorough system documentation and unit test plan documentation.
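A minimal sketch of the kind of INSERT..SELECT embedded in the BTEQ staging-to-warehouse scripts mentioned above, assuming hypothetical stg and edw objects; in the real scripts these statements run between .LOGON and .QUIT with ERRORCODE checks.

```sql
-- Hypothetical illustration only: database, table, and column names are assumptions.
INSERT INTO edw.sales_fact
       (sale_dt, store_id, product_id, qty_sold, sales_amt, load_ts)
SELECT s.sale_dt,
       s.store_id,
       s.product_id,
       SUM(s.qty_sold),
       SUM(s.sales_amt),
       CURRENT_TIMESTAMP(0)
  FROM stg.sales_daily s
 WHERE s.sale_dt = CURRENT_DATE - 1
 GROUP BY s.sale_dt, s.store_id, s.product_id;
```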
Environment: Informatica Power Center 8.1, Teradata V2R6, Oracle 10g, MS SQL Server 2005, Erwin, PL/SQL, FastLoad, MultiLoad, BTEQ, Autosys, TOAD, SQL, Unix Shell Scripts
Confidential
Informatica/ETL Developer
Responsibilities:
- Worked closely with the business team to gather requirements and design the ETL specifications.
- Designed and coded the complete ETL process using Informatica for various transactions, loading data from different sources such as flat files and relational databases.
- Created complex mappings and mapplets using transformations such as Filter, Aggregator, Lookup, Expression, Sequence Generator, Sorter, Joiner, and Update Strategy.
- Extracted data from heterogeneous sources and used Joiner transformations to join it efficiently.
- Used persistent cache with Lookups for static tables whose data does not change between runs.
- Dropped and re-created indexes using pre-SQL and post-SQL commands while loading tables in bulk mode (see the sketch after this list).
- Used Informatica session partitioning techniques to increase performance.
- Created audit tables and mappings to capture Informatica session run times.
- Used Sequence Generator transformations to generate surrogate keys before loading into the data warehouse.
- Designed Type 2 slowly changing dimensions to maintain the old record along with the new record in the DW.
- Created reusable sessions, mapplets, and worklets with tasks such as Event Wait and Event Raise.
- Performed unit testing for mappings and created test scenarios before deploying the code for QA testing.
- Troubleshot long-running sessions and performed performance tuning of Informatica sessions.
- Created workflows using Session, Command, and Email tasks.
- Implemented Change Data Capture (CDC) using mapping parameters and variables for incremental loading of data.
- Involved in performance tuning of sources, mappings, sessions, and workflows, and in identifying the bottlenecks.
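A minimal sketch of the pre-SQL/post-SQL index handling used with bulk-mode loads, assuming a hypothetical SALES_FACT table and index name; in practice these statements were attached to the target session's pre-SQL and post-SQL properties.

```sql
-- Hypothetical illustration only: object names are assumptions.

-- Pre-SQL on the target session: drop the index so the bulk (direct-path)
-- load is not slowed by index maintenance.
DROP INDEX idx_sales_fact_dt;

-- ... Informatica session loads SALES_FACT in bulk mode ...

-- Post-SQL on the target session: rebuild the index after the load.
CREATE INDEX idx_sales_fact_dt ON sales_fact (sale_dt) NOLOGGING;
```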
Environment: Informatica Power Center 7.1, Power Exchange 8.1, Toad 9.1, Oracle 10g / 9i, PL/SQL, SQL LOADER and UNIX Shell Scripting.
Confidential
Teradata developer/Programmer Analyst
Responsibilities:
- Involved in the complete SDLC, from gathering system design requirements through design, development, testing, deployment, and documentation.
- Involved in the development of the conceptual, logical, and physical data models of the star schema using ERwin.
- Created secondary indexes (USI, NUSI) for columns used frequently in reporting queries (see the sketch after this list).
- Developed FastLoad scripts to load huge volumes of data into empty staging tables.
- Developed MultiLoad scripts to load the data into different target tables.
- Designed partitioned primary indexes (PPIs) for range-based queries.
- Developed FastExport scripts to export data from Teradata tables and generate reports.
- Created Teradata macros and used various Teradata analytic functions.
- Wrote Teradata BTEQ scripts to implement the business logic and move data between Teradata tables.
- Performed tuning and optimization of complex SQL queries using Teradata Explain. Created several custom tables, views and macros to support reporting and analytic requirements.
- Used various Teradata Index techniques to improve the query performance.
- Excellent knowledge of ETL tools such as Informatica, making various connections to load and extract data to and from Teradata efficiently.
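A minimal sketch of the physical design choices described above, assuming a hypothetical edw.claim_fact table: a partitioned primary index for range-based date filters plus a NUSI on a frequently queried column.

```sql
-- Hypothetical illustration only: database, table, and column names are assumptions.
CREATE MULTISET TABLE edw.claim_fact
(
    claim_id   INTEGER NOT NULL,
    member_id  INTEGER NOT NULL,
    claim_dt   DATE    NOT NULL,
    claim_amt  DECIMAL(12,2)
)
PRIMARY INDEX (claim_id)
PARTITION BY RANGE_N(claim_dt BETWEEN DATE '2005-01-01'
                              AND     DATE '2009-12-31'
                              EACH INTERVAL '1' MONTH);

-- NUSI to support reporting queries that filter on member_id.
CREATE INDEX idx_claim_member (member_id) ON edw.claim_fact;
```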
Environment: Teradata V2R5, Teradata SQL Assistant, DB2, Oracle 9i, Informatica Power center, VSS, Outlook, Putty, MLOAD, TPUMP, FAST LOAD, FAST EXPORT, Erwin, UNIX Shell Scripts, Windows