
Sr. ETL Informatica Developer Resume

Los Angeles, CA / Columbus, OH

PROFESSIONAL SUMMARY:

  • Over 8 years of experience in Information Technology with a strong background in database development, data warehousing, and ETL processes using Informatica Power Center 9.6/9.1/8.x/7.x/6.x/5.x.
  • Good knowledge of data warehouse concepts and principles (Kimball/Inmon) - Star Schema, Snowflake Schema, SCDs, surrogate keys, normalization/denormalization.
  • Experience integrating various data sources, including relational databases such as Oracle, SQL Server, and Sybase, as well as COBOL files, XML files, and fixed-width and delimited flat files.
  • Worked extensively on developing ETL processes supporting data extraction, transformation, and loading using Informatica Power Center.
  • Well acquainted with Informatica Designer Components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet and Mapping Designer.
  • Worked extensively with complex mappings using different transformations like Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected / Connected Lookups and Aggregators.
  • Strong Experience in developing Sessions/tasks, Worklets, Workflows using Workflow Manager Tools - Task Developer, Workflow & Worklet Designer.
  • Experience with slowly changing dimension and slowly growing target methodologies.
  • Experience using Informatica command-line utilities such as pmcmd to execute workflows in non-Windows environments.
  • Diversified experience in data warehousing. Worked on various projects using Informatica Power Center 7.1 and Power Center 9.1/8.1.6 (Workflow Manager, Workflow Monitor, Server Manager, Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer).
  • Extensively used Informatica Repository Manager and Workflow Monitor.
  • Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Hands on experience in Performance Tuning of sources, targets, transformations and sessions.
  • Good experience in documenting the ETL process flow for better maintenance and analyzing the process flow.
  • Experienced in Agile development environments, including the Scrum methodology, and in following Agile processes in application development.
  • Worked with Stored Procedures, Triggers, Cursors, Indexes and Functions.
  • Strong analytical and problem solving skills with the ability to quickly adapt to new environments and learn new technologies.
  • Maintained summaries or aggregates on data using Materialized Views.
  • Expertise in analyzing system specifications and business requirements.
  • Involved in various stages of Analysis, Design, Development, Testing, and Implementation of application development process.
  • Experienced in database design, data modeling, migration, systems architecture, planning, testing, query optimization, and troubleshooting.
  • Extensive experience in creating new reports and customization of existing Reports, Forms and PL/SQL Procedures in Oracle Applications.
  • Extensive experience in writing data conversion scripts for migrating legacy data into Oracle Applications using SQL*Loader and PL/SQL.
  • Snowflake schema architecture is a more complex variation of the star schema design. The main difference is that the dimension tables in a snowflake schema are normalized, so they have a typical relational database design.
  • Snowflake schemas are generally used when a dimension table becomes very big or when a star schema cannot represent the complexity of a data structure. For example, if a PRODUCT dimension table contains millions of rows, a snowflake schema can significantly improve performance by moving some data out to another table (BRANDS, for instance).
  • Star schema architecture is the simplest data warehouse design. Its main feature is a table at the center, called the fact table, surrounded by dimension tables that allow browsing of specific categories, summarizing, drill-downs, and specifying criteria.
  • A dimension table is a collection of hierarchies and categories along which the user can drill down and drill up. It contains only textual attributes.
  • A fact table contains the measurements, metrics, or facts of a business process. If the business process is "Sales", then a measurement of that process, such as a monthly sales number, is captured in the fact table. The fact table also contains the foreign keys to the dimension tables.
  • Wrote SQL overrides and used filter conditions in the Source Qualifier, thereby improving mapping performance.
  • Excellent communication skills.
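
As a sketch of the star-schema concepts described above, the following uses hypothetical table and column names (`fact_sales`, `dim_product`, `dim_date` are illustrative, not from any project listed here) to show a fact table holding measurements and foreign keys, with a drill-down summarized along a dimension; SQLite stands in for the warehouse database:

```python
import sqlite3

# Minimal star-schema sketch: one central fact table with foreign keys
# into two dimension tables (all names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, brand TEXT, name TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_sales  (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    amount      REAL     -- the measurement captured by the fact table
);
INSERT INTO dim_product VALUES (1, 'Acme', 'Widget'), (2, 'Acme', 'Gadget');
INSERT INTO dim_date VALUES (10, '2024-01'), (11, '2024-02');
INSERT INTO fact_sales VALUES (1, 10, 100.0), (2, 10, 50.0), (1, 11, 75.0);
""")

# "Monthly sales number": summarize the fact table along the date dimension.
rows = conn.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_sales f JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.month ORDER BY d.month
""").fetchall()
print(rows)  # [('2024-01', 150.0), ('2024-02', 75.0)]
```

In the snowflake variant described above, the `brand` attribute would be normalized out of `dim_product` into its own BRANDS table referenced by key.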

TECHNICAL SKILLS:

Operating Systems: Windows 2008/2007/2005/NT/XP, UNIX, MS-DOS

ETL Tools: Informatica Power Center 10.1/9.6.1/9.1/8.6/8.5/8.1/7.1 (Designer, Workflow Manager, Workflow Monitor, Repository manager and Informatica Server)

Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005, DB2 v8.1, Teradata 13.0/14.0.

Data Modeling tools: Erwin, MS Visio

Languages: SQL, PL/SQL, UNIX shell scripts, C++, SOAP UI, JSP, Web Services, JavaScript, HTML, Eclipse

Scheduling Tools: Autosys, Control-M, TWS

Testing Tools: Quality Center, ClearCase.

PROFESSIONAL EXPERIENCE:

Confidential, Los Angeles, CA / Columbus, OH

Sr. ETL Informatica Developer

Responsibilities:

  • ETL process performed using Informatica; the databases used were SQL Server and Oracle 11g.
  • Extensively used Informatica Power Center 10.1 version to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, re-usable transformations, etc.
  • Provided support to develop the entire warehouse architecture and plan the ETL process.
  • Wrote SQL overrides and used filter conditions in the Source Qualifier, thereby improving mapping performance.
  • Implemented CDC mappings for salesforce source data.
  • Converted BusinessObjects Data Services (BODS) jobs into Informatica mappings.
  • Extensive UNIX shell scripting knowledge.
  • Optimized performance by tuning the Informatica ETL code as well as SQL.
  • Based on the requirements, used various transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, XML, Lookup, Salesforce Lookup, Aggregator, Joiner, and Stored Procedure in the mappings.
  • Used Informatica Designer to create complex mappings using transformations such as Filter, Router, Connected & Unconnected Lookups, Joiner, Stored Procedure, Update Strategy, Expression, and Aggregator to pipeline data to the Data Warehouse.
  • Experienced in Agile development environments, including the Scrum methodology, and in following Agile processes in application development.
  • Involved in all kinds of testing: unit, QA, and user acceptance testing.
  • Wrote queries and procedures, created indexes and primary keys, and performed database testing.
  • Defects were tracked, reviewed and analyzed.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
  • Strong expertise in designing and developing Business Intelligence solutions in staging, populating Operational Data Store (ODS), Enterprise Data Warehouse (EDW), Data Marts / Decision Support Systems using Informatica Power Center 9.x/8.x/7.x/6.x ETL tool.
  • Expertise in Data Modeling using Star Schema/Snowflake Schema, OLAP/ROLAP tools, Fact and Dimensions tables, Physical and logical data modeling using ERWIN 4.x/3.x
  • Experience in documenting High-Level Design, Low-Level Design, STMs, unit test plans, unit test cases, and deployment documents.
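
The CDC mappings mentioned above can be sketched in outline; this is a hedged illustration of one common approach (a last-modified high-water mark), not the PowerCenter/Salesforce connector itself, and all table and column names are hypothetical:

```python
import sqlite3

# Sketch of timestamp-based change data capture (CDC): extract only rows
# modified since the last successful run. Real Salesforce CDC in
# PowerCenter relies on connector metadata; this only shows the idea.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_account (id INTEGER, name TEXT, last_modified TEXT);
INSERT INTO src_account VALUES
    (1, 'Alpha', '2024-01-01'),
    (2, 'Beta',  '2024-02-15'),
    (3, 'Gamma', '2024-03-01');
""")

def extract_changes(conn, high_water_mark):
    """Return rows changed after the stored high-water mark."""
    return conn.execute(
        "SELECT id, name, last_modified FROM src_account "
        "WHERE last_modified > ? ORDER BY id",
        (high_water_mark,),
    ).fetchall()

changed = extract_changes(conn, '2024-02-01')
print(changed)  # [(2, 'Beta', '2024-02-15'), (3, 'Gamma', '2024-03-01')]
```

Pushing the `last_modified` filter into the source query, rather than filtering after extraction, is the same idea as the Source Qualifier filter conditions noted above: less data moves through the pipeline.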

Environment: Informatica Power Center 10.1, Oracle 11g, Salesforce, BusinessObjects Data Services (BODS), Active Batch, Windows XP, UNIX

Confidential, San Francisco, CA

ETL Informatica Lead Developer

Responsibilities:

  • Reviewed existing code and led efforts to tweak and tune the performance of existing Informatica processes.
  • Conducted code reviews of code developed by teammates before it was moved into QA.
  • Provided support to develop the entire warehouse architecture and plan the ETL process.
  • Involved in meetings with key stakeholders of the business and discussed on strategies and roadmaps for Informatica Power Center implementation.
  • Architected and designed end to end process automation to resolve any reconciliation discrepancies based on technology pull.
  • Prepared a plan for OFSAA Informatica ETL and worked closely with Informatica Administrators and IT Teams to perform implementation.
  • Documented the OFSAA Informatica ETL process and worked closely with Informatica Technical support on some of the issues.
  • Extensively worked on Informatica Mappings, Sessions, Workflows and UNIX Shell scripts to provide end to end solution to ID Recertification process.
  • Implemented the Data masking in Dev and SIT environment.
  • Set up Best Practices standards in line with State Department and Velocity methodology.
  • Maintained naming standards for ETL and DB objects as per the client requirement.
  • Involved in performance tuning of ETL objects and Oracle in order to meet SLAs.
  • Automated the processing-layer UNIX scripts.
  • Created the test cases document and handed the test case report over to the QA team as part of Dev exit.
  • Created the Unit test document and validated the data as part of Dev exit.
  • Created 'On boarding process' document for new members.
  • Created 'OFSAA survival document' for production support and new members.
  • Created 'Operating Level Agreement for file support' for post go-live.
  • Created 'Autosys Details of ETL and processing layer' for post go-live.
  • Responsible for creating the non-production requests to move the QC and data validation changes.
  • Analyzed data lineage and prepared documents at table-level granularity for pre-stage to stage and vice versa.
  • Conducted code walkthroughs of documents related to the assignments worked on.
  • Responsible for creating the CMs for migrating code from one environment to another.
  • Made quick, sound decisions to fix failures in order to meet SLAs. Ensured that all support requests were properly approved, documented, and communicated. Documented common issues and resolution procedures.
  • Created and modified UNIX shell scripts.
  • Introduced innovations in the project to improve client satisfaction.
  • Experienced in Agile development environments, including the Scrum methodology, and in following Agile processes in application development.

Environment: PowerCenter Designer 9.5.1, Oracle 11g, Toad 10.0.0.41, Greenplum, Confidential Service Manager (HPSM), Confidential Quality Center (HPQC), MKS, Non Production Request (NPR), Autosys, UNIX and Windows XP

Confidential, Brentwood, NY

Sr. ETL Informatica Developer

Responsibilities:

  • Extensive UNIX shell scripting knowledge.
  • ETL process performed using Informatica; the database used was SQL Server.
  • Extensively used Informatica Power Center 9.6 to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, re-usable transformations, etc.
  • Conducted code reviews of code developed by teammates before it was moved into QA.
  • Involved in design and development of complex ETL coding in an optimized manner.
  • Created and developed different types of profiles, such as column-level profiles, summary profiles, and drill-down profiles, as well as scorecards and reports, using IDE.
  • Provided support to develop the entire warehouse architecture and plan the ETL process.
  • Redesigned some of the existing mappings in the system to meet new functionality.
  • Optimized performance by tuning the Informatica ETL code as well as the SQL.
  • Based on the requirements, used various transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, XML, Lookup, Aggregator, Joiner, and Stored Procedure in the mappings.
  • Used Informatica Designer to create complex mappings using transformations such as Filter, Router, Connected & Unconnected Lookups, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the Data Warehouse.
  • Wrote SQL overrides and used filter conditions in the Source Qualifier, thereby improving mapping performance.
  • ETL process performed using Informatica; the databases used were SQL Server and Oracle 11g.
  • Developed Type 1 and Type 2 slowly changing dimensions (SCDs).
  • Wrote queries and procedures, created indexes and primary keys, and performed database testing.
  • Defects were tracked, reviewed and analyzed.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
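
The Type 1 vs. Type 2 SCD distinction mentioned above can be sketched as follows; this is a minimal illustration with hypothetical table and column names (`dim_customer`, `eff_date`, `is_current`), not the mapping logic from any project listed here:

```python
import sqlite3

# Sketch of SCD handling on a hypothetical customer dimension:
# Type 1 overwrites the attribute in place (no history); Type 2 expires
# the current row and inserts a new version, preserving history.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_customer (
    cust_id INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, is_current INTEGER
)""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Austin', '2023-01-01', NULL, 1)")

def scd_type1(conn, cust_id, new_city):
    # Type 1: overwrite in place; prior value is lost.
    conn.execute("UPDATE dim_customer SET city = ? WHERE cust_id = ?",
                 (new_city, cust_id))

def scd_type2(conn, cust_id, new_city, change_date):
    # Type 2: close out the current row, then insert the new version.
    conn.execute(
        "UPDATE dim_customer SET end_date = ?, is_current = 0 "
        "WHERE cust_id = ? AND is_current = 1",
        (change_date, cust_id))
    conn.execute("INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
                 (cust_id, new_city, change_date))

scd_type2(conn, 1, 'Dallas', '2024-06-01')
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer WHERE cust_id = 1 ORDER BY eff_date"
).fetchall()
print(rows)  # [('Austin', 0), ('Dallas', 1)]
```

In PowerCenter, the same expire-and-insert pattern is typically built with Lookup and Update Strategy transformations rather than hand-written SQL.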

Environment: Informatica Power Center 9.6, SQL Server, Flat Files, SQL Developer, Greenplum, Autosys, Windows XP, UNIX

Confidential, Reston, VA

Sr. ETL Informatica Developer

Responsibilities:

  • Understood the business point of view to implement coding using Informatica Power Center 9.6/9.1.
  • Extensively used Informatica Power Center 9.6/9.1 to create and manipulate source definitions, target definitions, mappings, mapplets, transformations, re-usable transformations, etc.
  • Involved in the design and development of complex ETL coding in an optimized manner.
  • Created and developed different types of profiles, such as column-level profiles, summary profiles, and drill-down profiles, as well as scorecards and reports, using IDE.
  • Redesigned some of the existing mappings in the system to meet new functionality.
  • Optimized performance by tuning the Informatica ETL code as well as the SQL.
  • Based on the requirements, used various transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, XML, Lookup, Aggregator, Joiner, and Stored Procedure in the mappings.
  • Used Informatica Designer to create complex mappings using transformations such as Filter, Router, Connected & Unconnected Lookups, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the Data Warehouse.
  • Worked with the B2B Operation Console to create partners and configure Partner Management, Event Monitors, and Events.
  • Designed mappings using B2B Data Transformation Studio.
  • Exposure to Informatica B2B Data Exchange, which supports an expanding diversity of customers and partners and their data with capabilities that surpass the usual B2B solutions.
  • Exposure to Informatica B2B Data Transformation, which supports transformation of structured, unstructured, and semi-structured data types while complying with the multiple standards that govern the data.
  • Have RDBMS experience with Oracle 11i/10g/9i/8i/7.x, WinSQL, SQL*Loader, and Netezza.
  • Extensively worked with Netezza database to implement data cleanup, performance tuning techniques.
  • Wrote SQL overrides and used filter conditions in the Source Qualifier, thereby improving mapping performance.
  • ETL process performed using Informatica; the databases used were SQL Server and Oracle 11g.
  • Developed Type 1 and Type 2 slowly changing dimensions (SCDs).
  • Wrote queries and procedures, created indexes and primary keys, and performed database testing.
  • Worked extensively in Informatica on processing structured and unstructured data.
  • Design and develop ETL programs to extract data from transactional database systems and load to the Data Warehouse using SSIS (SQL Server Integration System).
  • Efficient in creating SSIS packages. Created packages to extract data from flat files, Teradata, Oracle, and DB2, transform the data according to the business requirements, and load the data into SQL Server tables.
  • Experience in building Data Integration, Workflow Solutions and Extract, Transform, and Load (ETL) solutions for data warehousing using SQL Server Integration Service (SSIS) and Informatica.
  • Create and support data load processes via SSIS (SQL Server Integration System) and database refreshes.
  • Experienced in performing Incremental Loads and Data cleaning in SSIS. Expert in handling errors, logging using Event Handlers using SSIS.
  • Experienced in ETL (extracting, transforming, and loading) development using SSIS for migration of data between legacy systems and ERP to SQL Server 2000/2005/2008.
  • All SSIS standards were followed to maintain reliability and scalability in the extraction.
  • Expert in creating SSIS packages for ETL (extracting, transforming, and loading) of data.
  • Knowledge of deploying SSIS projects in the Integration Services Catalog and deploying SSIS packages with the File System and SQL Server.
  • Researched and analyzed complex SSIS packages used to populate the Staging, Mapping, and Data Warehouse databases.
  • Actively involved in performance improvement of the SSIS packages for the Staging, Mapping, and Data Warehouse builds.
  • Deployed SSIS packages using Project Deployment Model.
  • Conducted code reviews developed by my team mates before moving the code into QA.
  • Provided support to develop the entire warehouse architecture and plan the ETL process.
  • Informatica Data Transformation supports transformations and mappings via XML.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading.
  • Developed workflow tasks like reusable Email, Event wait, Timer, Command and Decision.
  • Documented Enterprise Business Intelligence solutions.
  • Gathered data requirements and coordinated with business users.
  • Analyzed the different data sources and identified the relationships.
  • Involved in creating logical data models.
  • Created and scheduled sessions and jobs to run on demand, on time, or only once using Workflow Manager.
  • Created parameter files for the Dev, Test, and Prod environments.
  • Created Perl scripts and called them in pre-session and post-session commands.
  • Involved in various testing activities like database testing, unit testing, system testing, performance testing and was also responsible for maintaining of testing metrics, defect tracking.
  • Used the BTEQ, FastExport, FastLoad, and MultiLoad Teradata utilities to export and load data to/from flat files.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Wrote BTEQ scripts to transform data.
  • Wrote FastExport scripts to export data.
  • Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
  • Involved as primary on-site ETL Developer during the analysis, planning, design, development, and implementation stages of projects using IBM Web Sphere software (Quality Stage v8.1, Web Service, Information Analyzer, Profile Stage, WISD of IIS 8.0.1).
  • Extensive UNIX shell scripting knowledge.
  • ETL process performed using Informatica; the databases used were Oracle and Teradata.
  • Actively participated in database testing, such as checking constraints, correctness of the data, stored procedures, field size validation, etc.
  • Identified and debugged the errors before deploying and worked on migration of the maps and workflows from development to UAT and from UAT to Production.
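
The database-testing activities above (row counts, constraint checks) can be sketched as a small post-load reconciliation; this is an illustrative pattern only, and the staging/warehouse table names (`stg_orders`, `dw_orders`) are hypothetical:

```python
import sqlite3

# Sketch of a post-load database test: reconcile source and target row
# counts and check for NULL business keys in the loaded target.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.0);
INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.0);
""")

def validate_load(conn):
    """Compare staging vs. warehouse counts and check key completeness."""
    src = conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]
    nulls = conn.execute(
        "SELECT COUNT(*) FROM dw_orders WHERE order_id IS NULL").fetchone()[0]
    return {"counts_match": src == tgt, "null_keys": nulls}

result = validate_load(conn)
print(result)  # {'counts_match': True, 'null_keys': 0}
```

In practice checks like these often run as post-session commands or scheduled validation jobs, with a failure halting the workflow before promotion to UAT or Production.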

Environment: Informatica Power Center 9.6/9.1, SSIS, Oracle 11g, Flat Files, TOAD, Teradata, Autosys, Windows XP, UNIX, Web Services, DataStage - IBM WebSphere DataStage and QualityStage 8

Confidential, Palo Alto, CA

Sr. ETL Informatica Consultant

Responsibilities:

  • Designed data structures for storage and programmed logic for data flow between the various stages of toll transaction processing.
  • Attended multiple requirement gathering sessions with source system teams and business users.
  • Maintained and tracked weekly status call with team and finished the deliveries within given timeline.
  • Developed mappings to extract data from SQL Server, Oracle, Flat files, XML files, and loaded into Data warehouse using the Mapping Designer.
  • Used Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Experienced with Informatica PowerExchange for Loading/Retrieving data from mainframe systems.
  • Understanding & Working knowledge of Informatica CDC (Change Data Capture).
  • Extensive UNIX shell scripting knowledge.
  • Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Warehouse.
  • Developed Slowly Changing Dimensions for Type 1 SCD and Type 2 SCD.
  • Developed various Stored Procedures, Functions, Packages and Materialized views for ETL needs.
  • Developed unit test plans for each scenario and programmed for data validation and proper error handling to manage the transactions without any data loopholes.
  • Responsible for tuning the Informatica Mappings and Sessions for optimum performance.
  • Enhanced the system by delivering change requests and fault fixes.
  • Responsible for loading flat files into the Oracle database using SQL*Loader and Informatica.
  • Documented Enterprise Business Intelligence solutions.
  • Gathered data requirements and coordinated with business users.
  • Analyzed the different data sources and identified the relationships.
  • Involved in creating logical data models.
  • Improved database performance by optimizing back-end queries, tuning PL/SQL, and implementing performance improvements through analysis of indexes, table partitioning, parallelism, etc.
  • Documentation of all changes made to enhance the code for new requirements.
  • Used external tables to manipulate data obtained on a monthly basis for vehicle information before loading it into the tables.
  • Utilized Quest tool (Toad 8.0) for database monitoring and tuning.
  • Involved in writing complex report SQL queries and Oracle Forms and Oracle Reports solutions.
  • Involved in post-production support activities.
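
The materialized views used for ETL needs above rest on one idea: precompute an aggregate into a summary table and refresh it on a schedule. The sketch below emulates that with a plain table (Oracle materialized views do this natively with REFRESH clauses); the toll-plaza table and column names are hypothetical:

```python
import sqlite3

# Sketch of the materialized-view idea: a precomputed summary table
# rebuilt on demand from the detail (transaction) table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE toll_txn (plaza TEXT, amount REAL);
INSERT INTO toll_txn VALUES ('North', 2.5), ('North', 2.5), ('South', 3.0);
CREATE TABLE mv_plaza_revenue (plaza TEXT, total REAL);
""")

def refresh_mv(conn):
    # Full refresh: drop the old summary rows and rebuild from detail.
    conn.execute("DELETE FROM mv_plaza_revenue")
    conn.execute("""
        INSERT INTO mv_plaza_revenue
        SELECT plaza, SUM(amount) FROM toll_txn GROUP BY plaza
    """)

refresh_mv(conn)
rows = conn.execute(
    "SELECT plaza, total FROM mv_plaza_revenue ORDER BY plaza").fetchall()
print(rows)  # [('North', 5.0), ('South', 3.0)]
```

Reports then query the small summary table instead of scanning every toll transaction, which is the performance benefit the summary/aggregate bullets above refer to.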

Environment: Informatica 8.6, Erwin 7.1, Oracle 11g, PL/SQL developer, SQL*PLUS, SQL, SQL*Loader, Toad, ER-Studio, UNIX, CVS, HTML, XML.
