
Sr. Etl /informatica Developer Resume


Charlotte, NC

OBJECTIVE

  • ETL Informatica Developer with 6+ years of IT experience in the Analysis, Design, Development, Testing and Implementation of business application systems.
  • Strong work experience with large-scale Data Warehouse implementations using Informatica PowerCenter 9.x/8.x/7.x, Oracle and SQL Server on Windows platforms.
  • Strong experience with Ralph Kimball and Inmon data modelling methodologies. Experience in project management, estimations, and resource management activities.

SUMMARY

  • Over 6 years of IT experience in all phases of the Software Development Life Cycle (SDLC), including user interaction, business analysis/modeling, design, development, integration, planning, testing and documentation in data warehouse applications, ETL processing and distributed applications.
  • Experienced in writing SQL and PL/SQL programs: stored procedures, functions, cursors, triggers, views, materialized views, indexes, table partitions and query performance tuning.
  • Strong expertise in using ETL Tools Informatica Power Center 9.6 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), IDQ, Power Exchange and ETL concepts.
  • Data Processing experience in designing and implementing Data Mart applications, mainly transformation processes using ETL tools Informatica Power Center, IDQ
  • Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, and Queryman), Teradata parallel support, Perl and UNIX shell scripting.
  • Hands on experience in tuning mappings, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings, and sessions.
  • Extensive experience in the design and development of Data Warehouse applications, primarily in Oracle using PL/SQL programming and IBM DataStage for ETL, Erwin for data modelling, and Perl/shell scripting for batch processing with Unix/Linux as the solution environment.
  • Extensive experience with Data Extraction, Transformation, and Loading (ETL) from disparate data sources such as multiple relational databases (Teradata, Oracle, SQL Server, DB2), VSAM, XML and flat files.
  • Worked with various transformations like Normalizer, Expression, Rank, Filter, Aggregator, Lookup, Joiner, Sequence Generator, Sorter, SQL, Stored Procedure, Update Strategy, and Source Qualifier.
  • Data modelling knowledge in Dimensional Data modeling, Star Schema, Snow-Flake Schema, FACT and Dimensions tables.
  • Experience in working with PowerExchange to process VSAM files.
  • Designing and developing Informatica mappings including Type-I, Type-II, Type-III slowly changing dimensions (SCD).
  • Worked with various IDQ transformations like Standardizer, Match, Association, Parser, Weighted Average, Comparison, Consolidation, Decision, and Expression.
  • Worked with the Informatica Data Quality (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, and exception handling.
  • Identified and eliminated duplicates in datasets using IDQ matching algorithms such as Bigram Distance, Edit Distance, and Hamming Distance.
  • Expertise in using both connected and unconnected Lookup Transformations.
  • Basic knowledge and understanding of Informatica Cloud.
  • Excellent understanding of Star Schema Data Models; Type 1 and Type 2 Dimensions.
  • Experienced in using advanced concepts of Informatica like PUSH DOWN OPTIMIZATION (PDO), PIPELINE PARTITIONING.
  • Applied various techniques at both the database level and the application level to find bottlenecks and improve performance.
  • Experienced in working with version control systems like Git, and used source code management client tools like Git Bash, GitHub, and GitLab.
  • Coordinated with business users, BI teams, functional design teams and testing teams during the different phases of project development and resolved issues.
  • Good hands on experience in writing UNIX shell scripts to process Data Warehouse jobs.
  • Executed software projects for Banking and financial services.
  • Good skills in defining standards, methodologies and performing technical design reviews.
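For illustration, the Type-2 slowly changing dimension pattern mentioned above can be sketched as follows (a minimal, hypothetical Python example; the customer table, column names and dates are assumptions, not taken from any client system):

```python
from datetime import date

def apply_scd2(dim_rows, incoming, load_date):
    """Apply a Type-2 SCD change: when a tracked attribute differs,
    expire the current row for the business key and append a new
    current version. Rows are dicts with cust_id, city, eff_date,
    end_date and is_current (names are illustrative)."""
    HIGH_DATE = date(9999, 12, 31)  # conventional "open-ended" end date
    current = next((r for r in dim_rows
                    if r["cust_id"] == incoming["cust_id"] and r["is_current"]),
                   None)
    if current is None:
        # Brand-new business key: insert as the first current version.
        dim_rows.append({**incoming, "eff_date": load_date,
                         "end_date": HIGH_DATE, "is_current": True})
    elif current["city"] != incoming["city"]:
        # Tracked attribute changed: expire old version, add new one.
        current["end_date"] = load_date
        current["is_current"] = False
        dim_rows.append({**incoming, "eff_date": load_date,
                         "end_date": HIGH_DATE, "is_current": True})
    return dim_rows

dim = []
apply_scd2(dim, {"cust_id": 1, "city": "Charlotte"}, date(2020, 1, 1))
apply_scd2(dim, {"cust_id": 1, "city": "Raleigh"}, date(2020, 6, 1))
```

In PowerCenter the same logic is typically built with a Lookup on the dimension, an Expression comparing attributes, and two Update Strategy paths (DD_UPDATE to expire, DD_INSERT for the new version).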

TECHNICAL SKILLS

Data Warehousing: Informatica Power Center 10/9.6/9.1/8.6.1/8.5/8.1.1/7.1.2/7.1.1/6.1/5.1.2, Informatica Power Mart 6.2/5.1.2/5.1.1/5.0/4.7.2, Informatica MDM 10.1/9.x, Power Connect, Power Exchange, XML, IDQ, IDE, OBIEE 11g/10g, Oracle Data Integrator (ODI) 12c/11g, OBIA/BI Apps 11g/7.9.6.x/7.9.5, Oracle Warehouse Builder (OWB), Talend RXT4.1, Informatica CDC, Informatica BDE, Informatica B2B DX/DT (version 10), Informatica On Demand (IOD), Flat Files (Fixed Width, CSV, Tilde Delimited), Data Transformation Services (DTS), Exadata, Metadata Manager, SQL*Loader, MS SQL Server Integration Services (SSIS).

Dimensional Data Modelling: Dimensional Data Modelling, Star Schema Modelling, Snow-Flake Modelling, FACT and Dimensions Tables, Physical and Logical Data Modelling.

Scheduling Tool: OBIEE DAC, Puppet, Chef, Control M, Autosys, Tidal.

Reporting Tools: Tableau, Cognos 8, SSRS, Power BI, Business Intelligence Tools, MS Access.

Database and related tools: MS SQL Server 2000/7.0/6.5, Oracle 10g/9i/8i/8/7.x, Vertica, Teradata, Netezza, Sybase ASE, PL/SQL, T-SQL, Amazon S3, Amazon Redshift, NoSQL, TOAD 8.5.1/7.5/6.2, DB2 UDB, Red Hat Enterprise Linux.

Languages: SQL, Dynamic SQL, PL/SQL, SQL*Plus, C, C#, Working knowledge of Unix Shell Scripting, Perl scripting, Java

Web Technologies: HTML, XML and XHTML

Operating Systems: Microsoft Windows XP/NT/2000/98/95, UNIX, Sun Solaris 5

Cloud Technologies: Informatica Cloud

PROFESSIONAL EXPERIENCE

Confidential, Charlotte, NC

Sr. ETL /Informatica Developer

Responsibilities:

  • Involved in the Requirements gathering, Analysis and support of Data Warehousing efforts.
  • Upgraded Informatica from 10.1.0 to 10.2.0.
  • Involved in designing the procedures for getting the data from all the source systems into the data warehousing system.
  • Worked with various transformations including Router, Update Strategy, Expression, Lookup, Sequence Generator, Aggregator and Sorter transformations.
  • Worked on Informatica Power Center tool - Source Analyzer, Data warehousing designer, Mapping & Mapplet Designer and Transformation Designer.
  • Installation and configuration of Power center Repository service, Integration service.
  • Performed performance tuning at the source and target levels using indexes, hints and partitioning in DB2, Oracle and Informatica.
  • Performed LDAP configuration and synced users from the external Active Directory.
  • Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and target-based commit interval.
  • Set and followed Informatica best practices, such as creating shared objects in shared folders for reusability and standard naming conventions for ETL objects; designed complex Informatica transformations, mapplets, mappings, reusable sessions, worklets and workflows.
  • Extracted data from SAP system to Staging Area and loaded the data to the target database by ETL process using Informatica Power Center.
  • Designed and developed various PL/SQL stored procedures to perform various calculations related to fact measures.
  • Worked with various transformations like Router, Expression, SQL, Lookup, Normalizer, and Aggregator in building complex mappings.
  • Converted the PL/SQL Procedures to Informatica mappings and at the same time created procedures in the database level for optimum performance of the mappings.
  • Involved in developing test plan cases, documented test cases and test plan queries for all the tables involved. Also, involved in preparing migration checklists in a timely manner meeting conversion schedules for effective cutovers.
  • Solved T-SQL performance issues using Query Analyzer. Parsed data from XML/XSD files using the XML transformation. Participated in the detailed-level and high-level design documentation of the ETL system and the mapping of business rules.
  • Created SSIS packages to extract data from OLTP to OLAP systems and scheduled jobs to call the packages and stored procedures.
  • Used UNIX to navigate around the system and check for specific files, the files’ content, change permissions and see who the current users are.
  • Created reusable transformations and Mapplets to use in multiple mappings.
  • Developed Error Handling mechanism in Informatica using Lookup, Expression, Normalizer and Filter transformations by generating Error Codes and Error Messages for various error scenarios and passed clean data into the next level.
  • Involved in preparing high-level design, testing and ETL Mapping Specification documents.
  • Wrote Oracle packages and functions to implement business rules and transformations after staging is done.
  • Investigated and fixed bugs that occurred in the production environment and provided on-call support.
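The error-handling mechanism described above (routing rows by error code before passing clean data on) can be sketched like this; a hypothetical Python example where the field names and error codes are assumptions:

```python
def route_rows(rows):
    """Split incoming rows into clean and error sets, attaching an
    error code and message per failed validation. This mirrors a
    Lookup/Expression/Filter-based error-handling mapping: rejected
    rows go to an error table, clean rows continue to the target."""
    clean, errors = [], []
    for row in rows:
        if not row.get("account_id"):
            errors.append({**row, "err_code": "E001",
                           "err_msg": "missing account_id"})
        elif row.get("amount", 0) < 0:
            errors.append({**row, "err_code": "E002",
                           "err_msg": "negative amount"})
        else:
            clean.append(row)
    return clean, errors

clean, errors = route_rows([
    {"account_id": "A1", "amount": 100.0},
    {"account_id": "",   "amount": 50.0},
    {"account_id": "A2", "amount": -5.0},
])
```

Keeping the validation rules in one place like this makes it straightforward to add new error codes without touching the main load path.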

Environment: Informatica Power Center 10.1/10.2, Wherescape, Oracle, SQL server, Agile, Teradata, DB2, MVS, SQL, Netezza, PL/SQL, Toad, SQL Loader, Unix, Flat files.

Confidential, CA

ETL/Informatica Developer

Responsibilities:

  • Involved in full life cycle development including Design, ETL strategy, troubleshooting Reporting, and Identifying facts and dimensions.
  • Analyzed the relationships of flat files and source systems, meeting with end users and business units to define the requirements.
  • Worked on different tasks in workflows like Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, Assignment and Timer, and the scheduling of the workflow.
  • Developed logical data models and physical data models, with experience in forward and reverse engineering using Erwin.
  • Designed the ETL processes using Informatica to load data from Mainframe Flat Files (Fixed Width and Delimited) to staging database and from staging to the target Oracle database.
  • Reviewing the requirements wif business and other application partners, doing regular follow ups and obtaining sign offs.
  • Extracted data from Excel files and high-volume data sets from data files, Oracle, DB2 and Salesforce.com (SFDC) using Informatica ETL mappings and SQL/PLSQL scripts, and loaded it into the data store area.
  • Used various transformations like Filter, Expression, Sequence Generator, Source Qualifier, Lookup, Router, Rank, Update Strategy, Joiner, Stored Procedure and Union to develop robust mappings in the Informatica Designer.
  • Created sessions and configured workflows to extract data from various sources, transform the data and load it into the data warehouse.
  • Moved data from source systems to different schemas and staging tables, based on the dimension and fact tables, using slowly changing dimensions (SCD) Type 2 and Type 1.
  • Worked extensively on the performance tuning of the programs, ETL procedures and processes.
  • Developed ETL procedures to ensure conformity, compliance with standards and lack of redundancy; translated business rules and functionality requirements into ETL procedures.
  • Responsible for monitoring all sessions that were running, scheduled, completed or failed, including debugging session failures.
  • Used Debugger to test the mappings and fixed the bugs.
  • Implemented various processes of data cleanup and data validation using Informatica Data Quality.
  • Designed the Process Control Table that would maintain the status of all the CDC jobs and thereby drive the load of the derived master tables.
  • Developed various mappings by using reusable transformations.
  • Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Involved in Unit and integration Testing of Informatica Sessions, Batches, fixing invalid Mappings.
  • Involved in Informatica administration, including creating new users and groups, backing up the repository and domain, and handling various upgrades.
  • Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
  • Used SQL to query databases, performed various validations and mapping activities, and built the Informatica workflows to load tables as part of the data load.
  • Developed and executed scripts to schedule loads, for calling Informatica workflows using PMCMD command.
  • Worked with SQL, PL/SQL procedures and functions, stored procedures and packages within the mappings.
  • Wrote queries, procedures and functions that were used as part of different application modules.
  • Created Informatica Technical and mapping specification documents according to Business standards.
  • Implemented the best practices for the creation of mappings, sessions and workflows and performance optimization.
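Calling Informatica workflows from scheduler scripts via pmcmd, as described above, usually means assembling a `pmcmd startworkflow` command line. A hedged sketch (the flag names follow standard pmcmd syntax; the service, domain, folder and workflow names are placeholders):

```python
def pmcmd_start_workflow(service, domain, user, pwd_env, folder, workflow):
    """Build a pmcmd startworkflow argument list. Returning the list
    (rather than running it) lets a scheduler script hand it to
    subprocess.run() and also makes the command easy to log/test.
    -pv reads the password from an environment variable instead of
    putting it on the command line; -wait blocks until completion."""
    return ["pmcmd", "startworkflow",
            "-sv", service, "-d", domain,
            "-u", user, "-pv", pwd_env,
            "-f", folder, "-wait", workflow]

cmd = pmcmd_start_workflow("IS_DEV", "Domain_DEV", "etl_user",
                           "PM_PASSWORD", "SALES", "wf_load_sales")
# a real script would then do: subprocess.run(cmd, check=True)
```

Building the command as a list (not a shell string) avoids quoting problems when folder or workflow names contain spaces.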

Environment: Informatica developer 9.5.1, Informatica PowerCenter 8.6/9.0.1, SSIS, Oracle 10g/11g, SQL Server, UNIX, Windows XP.

Confidential, Carol Stream, IL

ETL/Informatica Developer

Responsibilities:

  • Developed new and maintained existing Informatica mappings and workflows based on specifications.
  • Created different parameter files and changed session parameters, mapping parameters, and variables at run time.
  • Extracted data from various heterogeneous sources like Oracle, SAP, MS SQL Server and Teradata.
  • Created mapplets, reusable transformations and worklets, and used them in different mappings and workflows.
  • Expertise in Informatica - Power Center Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Successfully loaded data into different targets from various source systems like Oracle databases, flat files and XML files, first into the staging table and then into the target database.
  • Analyzed data and built reports using the Informatica data profiling tool and Toad for Data Analysts so that UHC members could make informed decisions.
  • Developed complex mapping using Informatica Power Center tool 9.6.
  • Created Rich dashboards using Tableau Dashboard and prepared user stories to create compelling dashboards to deliver actionable insights.
  • Extensive experience in developing stored procedures, functions, views, triggers and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
  • Responsible for interaction with business stakeholders, gathering requirements and managing the delivery.
  • Connected Tableau server to publish dashboard to a central location for portal integration.
  • Resolving design/development issues and interacting wif infrastructure support partners (DBA, Sys Admins).
  • Planned and executed deployments across all environments and wrote Hive table scripts using AWS S3 for the L1, L2 and L3 layers.
  • Developed workflow dependency in Informatica using Event Wait Task, Command Wait Task and Email Task.
  • Designed and developed methodologies to migrate multiple development/production databases from Sybase to Oracle 11g.
  • Installed, configured and upgraded Informatica PowerCenter TDM/ILM to 10.1 with all EBFs.
  • Analyzed the business requirements with Business Analysts to develop ETL procedures that are consistent across all systems.
  • Designed and developed a dynamic, data-driven and scalable object-oriented framework in C#, using Visual Studio 2010, that runs Crystal Reports 2010 and SSRS 2008 R2 reports.
  • Extracting data from Oracle and SAP, MS SQL and performed Delta mechanism using Expression, Aggregate, Lookup, Stored procedure, Filter, Router transformations and Update strategy transformations to load data into the target systems.
  • Used TOAD, SQL Developer and SQL Server management Studio to develop and debug procedures and packages.
  • Involved in developing the deployment groups for deploying the code between various environments (Dev, QA, and Prod).
  • Responsible for creating reports based on the requirements using SSRS 2005, and created pre-SQL and post-SQL scripts to be run at the Informatica level.
  • Worked extensively with session parameters, mapping parameters, mapping variables and parameter files for incremental loading.
  • Extensively worked with various lookup caches like static cache, dynamic cache and persistent cache.
  • Developed, tested and maintained all ETL maps/scripts and physical data models.
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD.
  • Collaborated and worked with business analysts and data analysts to support their data warehousing and data analysis needs.
  • Monitored and improved query performance by creating views, indexes, hints and subqueries.
  • Created Technical Design Documents and detailed design documents.
  • Prepared unit test cases, performed unit testing, and did the bug fixing and re-testing.
  • Used Debugger to fix the defects/ errors and data issues.
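Incremental loading with parameter files, as described above, is typically driven by regenerating the parameter file before each run with the last extract timestamp. A hedged sketch (the section-header format follows Informatica's `[Folder.WF:workflow.ST:session]` convention; the folder, workflow, session and variable names are illustrative):

```python
from datetime import datetime

def build_param_file(folder, workflow, session, last_run):
    """Render an Informatica parameter file section that feeds a
    mapping variable ($$LAST_EXTRACT_DATE) used as the incremental
    filter in the Source Qualifier. A wrapper script would write
    this text to the file named in the session properties before
    starting the workflow."""
    header = f"[{folder}.WF:{workflow}.ST:{session}]"
    line = f"$$LAST_EXTRACT_DATE={last_run.strftime('%m/%d/%Y %H:%M:%S')}"
    return "\n".join([header, line])

text = build_param_file("SALES", "wf_daily_load", "s_m_load_orders",
                        datetime(2020, 1, 1, 2, 30, 0))
```

Persisting the high-water mark outside the mapping (e.g. in a control table) keeps reruns idempotent: a failed load can be restarted with the same parameter file.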

Environment: Informatica 9.6, Teradata 14, Oracle 11i, SQL Assistant, TOAD, Tidal, UNIX, Citrix, JIRA, FIT, Dynamic SQL, Oracle SQL*Loader, Sybase ASE, Sun Solaris UNIX, OBIEE, Windows XP

Confidential, Los Angeles, CA

ETL Developer

Responsibilities:

  • Worked closely with the business users to understand the requirements and converted them into project-level technical capabilities.
  • Performance Tuning of SQL Queries and ETL Mappings. Extensively used PDO for optimizing the loads.
  • Converted procedures into mappings using various transformations like Source Qualifier, Expression, sorter, Update Strategy, Filter, Lookup, Aggregator, etc.
  • Involved in designing the tables structure and relations based on the requirements and created the new tables based on the design.
  • Working knowledge of HIPAA standards, EDI (Electronic data interchange), EDIFACT, Implementation and Knowledge of HIPAA code sets, ICD-9, ICD-10 coding and HL7.
  • Trained the offshore team on Informatica, PL/SQL, Oracle coding, Tidal and shell scripting standards, and on debugging the code.
  • Prepared coding standards, ETL build peer review checklists and unit test case templates for different work packages.
  • Coded PL/SQL stored procedures and successfully used them in the mappings.
  • Coded Unix Scripts to capture data from different relational systems to flat files to use them as source file for ETL process and also to schedule the automatic execution of workflows.
  • Scheduled the Jobs by using Informatica scheduler & Jobtrac.
  • Created Oracle 11g databases and replicated Sybase schema objects to Oracle.
  • Troubleshot PL/SQL procedures and functions to support corresponding Sybase functionality.
  • Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
  • Worked with various transformations like Normalizer, Expression, Rank, Filter, Aggregator, Lookup, Joiner, Sequence Generator, Sorter, SQL, Stored Procedure, Update Strategy, and Source Qualifier.
  • Designed mapplets and reusable transformations according to the business needs.
  • Worked on complex SQL queries, PL/SQL packages and triggers.
  • Designed and developed Informatica mappings including Type-I, Type-II and Type-III slowly changing dimensions.
  • Created the Tidal jobs and scheduled them to run daily, weekly, and monthly and to run in hourly intervals to automate the loads.
  • Supported the Production, UAT, QA, SIT deployments.
  • Supported and worked wif QA to test the stories in QA environment.
  • Coordinated the onsite and offshore teams on a daily basis.
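Capturing relational data into flat files for the ETL process, as described above, comes down to writing result-set rows in an agreed delimited layout to a landing directory. A minimal sketch (in-memory here for brevity; the pipe delimiter and column layout are assumptions):

```python
import csv
import io

def rows_to_flat_file(rows, delimiter="|"):
    """Write result-set rows to a pipe-delimited flat-file image.
    In a real job the cursor fetch would replace the literal rows
    and the output would go to a landing directory for the
    Informatica session to pick up as a flat-file source."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=delimiter, lineterminator="\n")
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()

flat = rows_to_flat_file([("1001", "ACTIVE", "2020-01-01"),
                          ("1002", "CLOSED", "2020-02-15")])
```

Fixing the delimiter and line terminator explicitly keeps the file consistent with the flat-file definition in the Source Analyzer, regardless of the platform the script runs on.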

Environment: Informatica 9.6 (Power Center & Data Quality IDQ), Teradata 13, Oracle 11i, Sybase, Teradata SQL Assistant, SQL Developer, Unix, Citrix.

Confidential

ETL Developer

Responsibilities:

  • Design & Development of ETL mappings using Informatica 9.1.
  • Provide technical support to ETL applications on Informatica 9.1, UNIX and Oracle.
  • Preparation and Review of Project Macro & Micro design based on the LM solution outline document.
  • Validating data files against their control files and performing technical data quality checks to certify source file usage.
  • Profiled the data using Informatica Data Quality (IDQ) and performed Proof of Concept.
  • Worked with the Informatica Data Quality (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, and exception handling.
  • Used reference tables and rules created in Analyst tool.
  • Used various IDQ transformations like Standardizer, Match, Association, Parser, Weighted Average, Comparison, Consolidation, Decision, Expression.
  • Involved in designing the Mapplet and reusable transformation according to the business needs.
  • Designing and developing Informatica mappings including Type-I, Type-II, Type-III slowly changing dimensions (SCD).
  • Effectively used various tasks (Reusable & Non-Reusable), Command, Assignment, Decision, Event Raise, Event wait, Email, etc.
  • Identify performance bottlenecks, tuning queries, suggesting and implementing alternative approaches like range partitioning of tables.
  • Coding & testing the Informatica Objects & Reusable Objects as per Liberty Mutual BI standards.
  • Prepared high-level and low-level design documents.
  • Worked with Teradata sources and targets.
  • Used Sqoop to export data from HDFS to Teradata database.
  • Created Hive managed and external tables.
  • Performance tuned the Hive queries and created Pig scripts to process the files.
  • Copied files from the local file system to HDFS.
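Exporting data from HDFS to Teradata with Sqoop, as described above, is normally wrapped in a script that assembles the `sqoop export` command. A hedged sketch (the flags are standard Sqoop export options; the host, database, table and paths are placeholders, and a Teradata JDBC driver/connector is assumed to be on the Sqoop classpath):

```python
def sqoop_export_cmd(host, database, table, export_dir, password_file):
    """Build a sqoop export command that pushes delimited HDFS files
    into a Teradata table. Using --password-file keeps credentials
    off the command line; returning the argument list lets the
    caller log it or pass it to subprocess.run()."""
    return ["sqoop", "export",
            "--connect", f"jdbc:teradata://{host}/Database={database}",
            "--username", "etl_user",
            "--password-file", password_file,
            "--table", table,
            "--export-dir", export_dir,
            "--input-fields-terminated-by", "\t"]

cmd = sqoop_export_cmd("td-prod", "EDW", "FACT_CLAIMS",
                       "/data/edw/fact_claims", "/user/etl/.td_pass")
# a wrapper script would then do: subprocess.run(cmd, check=True)
```

The field terminator must match the Hive table's serialization; a mismatch is the most common cause of exports loading NULLs.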

Environment: Informatica 9.1(Power Center & Data Quality IDQ), Teradata 13, Oracle, MS SQL SERVER 2008, OBIEE, Unix, Hadoop 1.1.0 and HDFS, Sqoop, HIVE, Pig.
