
Informatica ETL Developer & ETL Tester Resume


San Francisco, CA

PROFESSIONAL SUMMARY:

  • Over 13 years of experience in requirements gathering, gap analysis, design, coding, testing, implementation, production support, and resource management in data warehousing, with business domain knowledge of banking, insurance, pharmaceuticals, and telecom.
  • Good working experience with SDLC (Software Development Life Cycle) methodologies such as Waterfall and Agile.
  • Experience in development and design of ETL methodology for supporting data transformations and processing.
  • Experienced in using ETL tools including PowerCenter 9.5/9.1/8.6/8.1, PowerMart, and PowerExchange, as well as Repository Manager and the Administration Console.
  • Worked with various transformations such as Normalizer, Expression, Rank, Filter, Aggregator, Lookup, Joiner, Sequence Generator, Sorter, SQL, Stored Procedure, Update Strategy, and Source Qualifier.
  • Experienced in Teradata SQL Programming.
  • Worked with Teradata utilities such as FastLoad, MultiLoad, TPump, and Teradata Parallel Transporter (TPT).
  • Experienced with the Vertica MPP data warehouse.
  • Exposure to other ETL and reporting tools such as DataStage and SSIS/SSRS.
  • Experience in using Transformations, creating Informatica Mappings, Mapplets, Sessions, Worklets, Workflows and processing tasks using Informatica Designer / Workflow Manager.
  • Experienced in scheduling Informatica jobs using scheduling tools like Tidal, Autosys and Control - M.
  • Extensive experience in Netezza database design and workload management.
  • Experience in managing users and user groups in Netezza.
  • Developed effective working relationships with client team to understand support requirements.
  • Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Good command of databases such as Oracle 11g/10g/9i/8i, Teradata 13, SQL Server 2008, and MS Access 2003.
  • Experience in all phases of data warehouse development, from requirements gathering through coding, unit testing, and documentation.
  • Good knowledge of data modeling techniques such as dimensional/star schema and snowflake modeling, and of slowly changing dimensions (SCDs).
  • Extensive experience in writing UNIX shell scripts and automating ETL processes with them (see the sketch after this list).
  • Excellent interpersonal and communication skills, and experienced in working with senior level managers, business users and developers across multiple disciplines.
  • Effectively managed globally dispersed teams of up to 20 members.
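
As a minimal illustration of the shell-based ETL automation described above, the sketch below launches an Informatica workflow with pmcmd and propagates failure to the calling scheduler. The service, domain, folder, and workflow names are placeholders, not actual project values.

```sh
#!/bin/ksh
# Minimal sketch: launch an Informatica workflow with pmcmd and fail the
# batch if the workflow does not finish cleanly. Service, domain, folder,
# and workflow names are placeholders, as are the INFA_USER/INFA_PASS
# environment variables referenced via -uv/-pv.

INFA_SERVICE="is_dw"          # integration service (placeholder)
INFA_DOMAIN="dom_dw"          # Informatica domain (placeholder)
INFA_FOLDER="DW_LOADS"        # repository folder (placeholder)
WORKFLOW="wf_stg_load"        # workflow to launch (placeholder)

# -wait blocks until the workflow completes; pmcmd exits 0 on success.
pmcmd startworkflow -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
    -uv INFA_USER -pv INFA_PASS \
    -f "$INFA_FOLDER" -wait "$WORKFLOW"
rc=$?

if [ "$rc" -ne 0 ]; then
    echo "ERROR: workflow $WORKFLOW failed (pmcmd rc=$rc)" >&2
    exit "$rc"
fi
echo "workflow $WORKFLOW completed successfully"
```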

TECHNICAL SKILLS:

Data Warehousing/ETL: Informatica PowerCenter 9.5/9.1/8.6/8.1/7.1.2/7.1.1/7.0, Informatica PowerMart 9/8.x/7.x (Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager), Metadata, Data Mart, OLAP, OLTP, Cognos 7.0/6.0 and Erwin 4.x/3.x.

Dimensional Data Modeling: Dimensional Data Modeling, Data Modeling, Star Schema Modeling, Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, Erwin and Oracle Designer.

Databases & Tools: Oracle 11g/10g/9i/8i/8.x, Teradata, Vertica, DB2 UDB 8.5, SQL Server, OEM Grid Control, MS SQL Server 2005/2000/7.0/6.5, SQL*Plus, SQL*Loader and TOAD.

Scheduling & Infrastructure Tools: Appworx 6.0, Informatica Workflow Manager, Autosys, Tidal, Maestro/Tivoli, Apache, Tomcat, IIS, Connect:Direct (NDM), etc.

Programming Languages: Unix Shell Scripting, SQL, PL/SQL, Java, HTML, DHTML and C.

Reporting Tools: Business Objects XI/6.5/6.0, Business Objects Universe Developer, Business Objects Supervisor, Business Objects Set Analyzer 2.0 and Cognos Series 7.0.

Environment: UNIX, Win XP/NT 4.0, Sun Solaris 2.6/2.7, HP-UX 10.20/9.0 and IBM AIX 4.2/4.3.

PROFESSIONAL EXPERIENCE:

Confidential, San Francisco, CA

Informatica ETL Developer & ETL Tester

Environment: Informatica PowerCenter 9.6.1, SQL, Netezza, Oracle 11g, Tidal, GoldenGate, Unix, JIRA and SVN.

Responsibilities:

  • Designed and developed ETL mappings using Informatica PowerCenter 9.6.1.
  • Designed and developed the architectural ETL framework used to load CLAIM objects.
  • Developed complex ETL code to implement business rules for Member and Provider data.
  • Developed and maintained ETL (Extract, Transform and Load) mappings to extract the data from source Netezza database and loaded it into Oracle tables.
  • Developed Informatica Workflows and sessions associated with the mappings using Workflow Manager.
  • Worked extensively on Netezza database in Windows platform and contributed to building the customized ELT framework using Shell scripting.
  • Worked on transformations like Transaction Control, Lookup, Router, Sequence Generator and Update Strategy.
  • Involved in implementation of SCD1 and SCD2 data load strategies (see the SCD2 sketch after this list).
  • Incremental build-out of the data to support evolving data needs by adopting the agile methodology.
  • Performed Gap Analysis, conducted walkthroughs and acted as a liaison between the business users, stakeholders and the team to perform requirements, quality and risk analysis.
  • Wrote data validation scripts against Netezza using complex SQL queries (see the validation sketch after this list).
  • Developed SQL scripts using NZSQL for Netezza, and implemented procedures for the business rules using Unix shell.
  • Used analytical and Windowing functions of Netezza to implement complex business logic.
  • Replicated and extracted data and applied it to production using GoldenGate.
  • Reviewed all development queries and performed optimization and query performance tuning for the Netezza database using various techniques.
  • Successfully conducted gap analysis to estimate the benefits of the new system versus the existing system.
  • Performed testing; identified and documented bugs; and debugged and applied fixes to applications.
  • Involved in Unit, Integration, System, and Performance testing levels.
  • Migrated code into the QA environment and supported the QA team through testing and UAT.
  • Created detailed Unit Test Document with all possible Test cases/Scripts.
  • Conducted code reviews on code developed by teammates before moving it into QA.
  • Actively participated in Scrum meetings, review meetings and developed test scenarios.
  • Created a team-specific agile process flow in JIRA to move tasks from one activity to another.
  • Maintained documents for mapping logic, developers' code, and unit/integration test SQL scripts.
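
A minimal sketch of the SCD2 strategy referenced above, expressed as SQL run through nzsql from a shell script. The database, table, and column names (dwdb, stg_member, dim_member, and so on) are invented for illustration; the production logic itself lived in Informatica mappings.

```sh
#!/bin/ksh
# Minimal SCD2 sketch run through nzsql. Database, table, and column
# names are invented for illustration.
nzsql -d dwdb <<'EOF'
-- Step 1: expire the current row for members whose attributes changed.
UPDATE dim_member
   SET end_dt    = CURRENT_DATE - 1,
       curr_flag = 'N'
  FROM stg_member s
 WHERE dim_member.member_id   = s.member_id
   AND dim_member.curr_flag   = 'Y'
   AND dim_member.member_name <> s.member_name;

-- Step 2: insert a fresh current row for new and changed members.
INSERT INTO dim_member (member_id, member_name, eff_dt, end_dt, curr_flag)
SELECT s.member_id, s.member_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
  FROM stg_member s
  LEFT JOIN dim_member d
    ON d.member_id = s.member_id
   AND d.curr_flag = 'Y'
 WHERE d.member_id IS NULL;
EOF
```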
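A sketch of the kind of post-load validation mentioned above, using a Netezza window function to flag members with more than one open SCD2 row; all names are placeholders.

```sh
#!/bin/ksh
# Minimal validation sketch: flag members that have more than one open
# SCD2 row using a window function. Names are placeholders.
nzsql -d dwdb -c "
SELECT member_id, row_cnt
  FROM (SELECT member_id,
               COUNT(*) OVER (PARTITION BY member_id) AS row_cnt
          FROM dim_member
         WHERE curr_flag = 'Y') chk
 WHERE row_cnt > 1
 LIMIT 100;"
```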

Confidential, San Ramon, CA

Sr. Informatica Developer

Environment: Informatica 9.5/9.x, Oracle 11g, SQL Server 2005, Netezza, HP-UX, Tidal and UNIX.

Responsibilities:

  • Involved in all phases of the SDLC: requirements gathering, design, development, testing, production rollout, user training, and production support.
  • Created new mapping designs using various tools in Informatica Designer, including Source Analyzer, Warehouse Designer, Mapplet Designer, and Mapping Designer.
  • Developed mappings with the required transformations in Informatica according to technical specifications.
  • Created complex mappings that implemented business logic to load data into the staging area.
  • Used Informatica reusability at various levels of development.
  • Involved in database migrations from legacy systems and SQL Server to Oracle and Netezza.
  • Developed mappings/sessions using Informatica Power Center 9.5 for data loading.
  • Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
  • Designed table structure in Netezza.
  • Created mappings and mapplets per business requirements using the Informatica big data edition, deployed them as applications, and exported them to PowerCenter for scheduling.
  • Created an NZLOAD process to load data into the data mart in Netezza.
  • Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor.
  • Built reports according to user requirements.
  • Extracted data from Hadoop, modified it according to business requirements, and loaded it back into Hadoop.
  • Experienced in loading data between Netezza tables using NZSQL utility.
  • Extracted data from Oracle and SQL Server, and used Teradata for data warehousing.
  • Implemented slowly changing dimension methodology for accessing the full history of accounts.
  • Wrote shell scripts to run workflows in the UNIX environment.
  • Wrote UNIX shell scripts to load data from flat files into the Netezza database (see the nzload sketch after this list).
  • Performed performance tuning at the source, target, mapping, and session levels.
  • Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
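
A minimal sketch of the flat-file load referenced above, using nzload from a shell script; the database, table, file paths, and error threshold are placeholders.

```sh
#!/bin/ksh
# Minimal sketch: load a pipe-delimited flat file into a Netezza staging
# table with nzload. Names, paths, and thresholds are placeholders.
DATA_FILE=/data/in/member.dat
LOG_DIR=/data/log

nzload -db dwdb -t stg_member \
       -df "$DATA_FILE" \
       -delim '|' \
       -lf "$LOG_DIR/member.nzlog" \
       -bf "$LOG_DIR/member.nzbad" \
       -maxErrors 10
rc=$?

if [ "$rc" -ne 0 ]; then
    echo "ERROR: nzload of $DATA_FILE failed (rc=$rc)" >&2
    exit "$rc"
fi
```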

Confidential, Cincinnati, OH

ETL Informatica Developer

Environment: Informatica Power Center 9.1/8.6.1, Oracle 11g/10g, Teradata V2R5, Vertica, TOAD for Oracle, SQL Server 2008, PL/SQL, DB2, Netezza, SQL, Erwin 4.5, Business Objects, Unix Shell Scripting (PERL), UNIX (AIX), Windows XP and Autosys.

Responsibilities:

  • Interacted with business users to analyze the business requirements, the High Level Document (HLD), and the Low Level Document (LLD), and transformed the business requirements into technical requirements.
  • Contributed to documenting EDW process flows per the Agile methodology.
  • Developed database Schemas like Star schema, Snowflake schema used in relational, dimensional and multidimensional data modeling using ERWIN and XSD (XML SCHEMA DEFINITION).
  • Designed and Developed Oracle PL/SQL Package for initial loading and processing of Derivative Data.
  • Worked with various Informatica client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Workflow Manager.
  • Implemented weekly error tracking and correction process using Informatica.
  • Developed and executed load scripts using the Teradata client utilities MultiLoad, FastLoad, and BTEQ (see the FastLoad sketch after this list).
  • Created data load process to load data from OLTP sources into Netezza.
  • Designed table structure in Netezza.
  • Created external tables as part of the NZLOAD process in Netezza.
  • Installed and configured core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, Cleanse Match Server, and Cleanse Adapter on Windows.
  • Performed Gap Analysis to identify and document the gaps between the existing system and new compliant system.
  • Used massively parallel processing (MPP) architectures to provide high query performance and platform scalability.
  • Developed Unix Korn shell wrapper scripts to accept parameters and scheduled the processes using Autosys.
  • Conducted workflow, process diagram, data analysis and Gap Analysis to derive requirements for existing systems enhancements.
  • Documented "As-is" and "To-be" process maps as part of Gap Analysis for new functionality requirements, and thereafter prioritized them in order to align them with the project requirement.
  • Developed Business Objects in accordance to client’s needs and requirements and implement Business Objects development and testing.
  • Performed Data Quality checks, and developed ETL and Unix Shell Script processes to ensure flow of data of the desired quality.
  • Performed Unit and Integration Testing of Informatica Sessions, Batches and Target Data.
  • Actively involved in production support and transferred knowledge to other team members.
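
A minimal FastLoad sketch of the kind referenced above: bulk-loading a pipe-delimited file into an empty Teradata staging table. The TDPID, credentials, and object names are placeholders; a real job would source credentials securely rather than inline.

```sh
#!/bin/ksh
# Minimal FastLoad sketch. TDPID, credentials, table, and file names are
# placeholders.
fastload <<'EOF'
LOGON tdprod/etl_user,etl_pass;

BEGIN LOADING dw_stg.stg_acct
      ERRORFILES dw_stg.stg_acct_err1, dw_stg.stg_acct_err2;

SET RECORD VARTEXT "|";

DEFINE acct_id   (VARCHAR(18)),
       acct_name (VARCHAR(60)),
       open_dt   (VARCHAR(10))
FILE = /data/in/acct.dat;

INSERT INTO dw_stg.stg_acct (acct_id, acct_name, open_dt)
VALUES (:acct_id, :acct_name, :open_dt);

END LOADING;
LOGOFF;
EOF
```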

Confidential, Columbus, OH

ETL Developer

Environment: Informatica PowerCenter 8.6, Business Objects 6.0/5i, DB2 7.0, Oracle 9i, Sybase, SQL Server, SQL*Loader, Windows NT/2000, Erwin 3.5.2.

Responsibilities:

  • Used Informatica Designer to develop mappings using transformations including aggregation, update, lookup, and summation.
  • Developed sessions using Workflow Manager and improved session performance.
  • Created stored procedures to transform the data, and worked extensively in PL/SQL on the various transformations needed while loading the data (see the sketch after this list).
  • Involved in data modeling and design of the data warehouse using star schema methodology, with conformed and granular dimensions and fact tables.
  • Analyzed source data coming from Oracle, flat files, and DB2, and coordinated with the data warehouse team in developing the dimensional model.
  • Used transformations like Aggregate, Expression, Filter, Sequence Generator, Router, Joiner, Lookup and Stored procedure transformations.
  • Used Repository Manager to grant permissions to users and to create new users and repositories.
  • Monitored custom business logic to address the challenges of managing the application server infrastructure, as well as the performance of the business logic itself.
  • Fine-tuned Transformations and mappings for better performance.
  • Involved in the process design documentation of the Data Warehouse Dimensional Upgrades. Extensively used Informatica for loading the historical data from various tables for different departments.
  • Responsible for presenting data warehousing concepts and tools to prospective clients.
  • Created database connections, repositories, and domains.
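
A minimal sketch of a transformation stored procedure of the kind described above, created through SQL*Plus from a shell script; the connection string, tables, and columns are invented for illustration.

```sh
#!/bin/ksh
# Minimal sketch: a transformation stored procedure created via SQL*Plus.
# Connection string, tables, and columns are placeholders.
sqlplus -s etl_user/etl_pass@orcl <<'EOF'
CREATE OR REPLACE PROCEDURE load_customer_stg AS
BEGIN
  -- Standardize names while moving source rows into staging.
  INSERT INTO stg_customer (cust_id, cust_name, load_dt)
  SELECT cust_id, UPPER(TRIM(cust_name)), SYSDATE
    FROM src_customer;
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;   -- surface the error to the calling session
END load_customer_stg;
/
EXIT;
EOF
```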

Confidential, Conshohocken, PA

Informatica Developer

Environment: Informatica Power Center 8.1, Windows XP, PL/SQL, Excel, SQL Server 2005.

Responsibilities:

  • Designed and created ETL mappings using Informatica Mapping Designer.
  • Followed an iterative Waterfall model for the software development life cycle (SDLC) process.
  • Worked extensively on performance tuning by modifying the SQL override in the Source Qualifier.
  • Used various Informatica transformations such as Expression, Filter, Joiner, Aggregator, Router, and Lookup to load clean, consistent data into targets.
  • Fine-tuned existing mappings for better performance, and documented mappings and transformations.
  • Responsible for Technical documentation of ETL process.
  • Interacted with end users and gathered requirements.
  • Wrote procedures and functions in PL/SQL, and performed troubleshooting and performance tuning of PL/SQL scripts.
  • Created fact and dimension tables according to the business requirements (see the DDL sketch after this list).
  • Monitoring and improving performance of daily jobs on Oracle database.
  • Identifying integrity constraints for tables.
  • Developed procedures and functions using PL/SQL and developed critical reports.
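
A minimal sketch of the fact and dimension creation referenced above: one dimension and one fact keyed to it, with all object and column names invented for illustration.

```sh
#!/bin/ksh
# Minimal star-schema DDL sketch. All object and column names are
# placeholders.
sqlplus -s etl_user/etl_pass@orcl <<'EOF'
CREATE TABLE dim_product (
  product_key  NUMBER        PRIMARY KEY,  -- surrogate key
  product_code VARCHAR2(20)  NOT NULL,     -- natural key
  product_name VARCHAR2(100)
);

CREATE TABLE fact_sales (
  product_key NUMBER       NOT NULL REFERENCES dim_product (product_key),
  sale_dt     DATE         NOT NULL,
  qty_sold    NUMBER,
  sale_amt    NUMBER(12,2)
);
EXIT;
EOF
```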

Confidential, Rocky Hill, CT

Data Modeler/ Data Analyst

Environment: Informatica PowerCenter 8.6.1, Teradata, SQL Server, Oracle, PL/SQL, SQL Developer, TOAD and UNIX.

Responsibilities:

  • Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
  • Worked on the team responsible for analyzing business requirements and designing and implementing the business solution.
  • Developed logical and physical data models for central model consolidation.
  • Worked with DBAs to create a best fit physical data model from the logical data model.
  • Conducted data modeling JAD sessions and communicated data-related standards.
  • Used Erwin r8 for effective model management, sharing, dividing, and reusing model information and designs to improve productivity.
  • Redefined many attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of data analysis responsibilities.
  • Developed process methodology for the Reverse Engineering phase of the project.
  • Used reverse engineering to connect to existing database and create graphical representation (E-R diagram).
  • Utilized Erwin’s reverse engineering and target database schema conversion process.
  • Involved in logical and physical design, transforming logical models into physical implementations.
  • Created 3NF business-area data models with denormalized physical implementations, and performed data and information requirements analysis using the Erwin tool.
  • Performed extensive data analysis on Teradata and Oracle systems, querying and writing SQL in TOAD (see the profiling sketch after this list).
  • Used the ETL tool Informatica to populate the database, transforming data from the old database to the new one using Oracle and SQL Server.
  • Involved in different team review meetings.
  • Involved in the creation and maintenance of the data warehouse and of repositories containing metadata.
  • Developed star and snowflake schema based dimensional models for the data warehouse.
  • Performed unit testing and tuned for better performance.
  • Involved in the critical design review of the finalized database model.
  • Studied the business logic to understand the physical system and the terms and conditions for the database.
  • Worked closely with the ETL SQL Server Integration Services (SSIS) Developers to explain the Data Transformation.
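
A sketch of the kind of profiling query used in the data analysis described above: row, distinct, and null counts for a candidate key column. The schema and column names are placeholders.

```sh
#!/bin/ksh
# Minimal profiling sketch. Schema and column names are placeholders.
sqlplus -s analyst/analyst_pass@orcl <<'EOF'
SELECT COUNT(*)                                             AS total_rows,
       COUNT(DISTINCT customer_id)                          AS distinct_keys,
       SUM(CASE WHEN customer_id IS NULL THEN 1 ELSE 0 END) AS null_keys
  FROM legacy_schema.customer;
EXIT;
EOF
```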

Confidential, Dallas, TX

Data Analyst

Environment: Erwin, MS Excel, Oracle 10g, SQL, PL/SQL, MS VISIO, SQL*Loader and UNIX.

Responsibilities:

  • Worked extensively in Workflow Manager, Workflow Monitor, and Worklet Designer to create, edit, and run workflows, tasks, and shell scripts.
  • Used various transformations like Stored Procedure, Connected and Unconnected lookups, Update Strategy, Filter transformation, Joiner transformations to implement complex business logic.
  • Developed complex mappings/sessions using Informatica Power Center for data loading.
  • Extensively used aggregators, lookup, update strategy, router and joiner transformations.
  • Analyzed the business requirements of the project by studying the Business Requirement Specification document.
  • Created logical and physical designs in Erwin.
  • Created FTP connections and database connections for the sources and targets.
  • Maintained security and data integrity of the database.
  • Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools like PL/SQL, SQL*Plus, and SQL*Loader, and handled exceptions.
  • Involved in database development by creating Oracle PL/SQL Functions, Procedures and Collections.
  • Worked extensively with XML schema generation.
  • Participated in performance tuning using EXPLAIN PLAN and TKPROF (see the sketch after this list).
  • Extensively used Erwin for developing data model using star schema methodologies.
  • Created Unix Shell Scripts for automating the execution process.
  • Created PL/SQL packages and Database Triggers and developed user procedures and prepared user manuals for the new programs.
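
A minimal sketch of the EXPLAIN PLAN step referenced above: capture the optimizer plan for a candidate query and display it with DBMS_XPLAN. The table names are placeholders.

```sh
#!/bin/ksh
# Minimal tuning sketch. Table names are placeholders; DBMS_XPLAN.DISPLAY
# reads the plan just explained in the same session.
sqlplus -s etl_user/etl_pass@orcl <<'EOF'
EXPLAIN PLAN FOR
SELECT f.sale_dt, SUM(f.sale_amt) AS total_amt
  FROM fact_sales f
  JOIN dim_product d ON d.product_key = f.product_key
 GROUP BY f.sale_dt;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
EXIT;
EOF
```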
