
Sr. Informatica Developer Resume

El Segundo, CA

SUMMARY:

  • Over 8 years of experience in Information Technology as an ETL Developer, covering all phases of the life cycle: Requirement Analysis, Functional Analysis, Design, Development, Implementation, Testing, Debugging, Production Support, and Maintenance of various Data Warehousing applications.
  • 8 years of experience in the design and development of ETL methodology using Informatica PowerCenter 9.x/8.x, Informatica PowerExchange 9.x, and Informatica Data Quality 9.x.
  • Experience in creating Transformations and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets and Data marts and Data warehouse.
  • Expert in performance tuning of Informatica mappings, identifying source and target bottlenecks.
  • Extensive experience in integration of various data sources like Oracle 12c/11g/10g, Teradata 14/13, Netezza, UDB DB2, Mainframes, SQL Server 2012, SAP, Sybase, Informix, MySQL, Flat Files, MQ Series, and XML.
  • Expertise in Teradata utilities like MultiLoad, FastLoad, FastExport, BTEQ, TPump, TPT and tools like SQL Assistant, Viewpoint
  • Well-versed in tuning Teradata ETL queries, remediating excessive statistics, resolving spool space issues, and applying compression for space reclamation.
  • Experience with both Inmon and Kimball data warehouse design and implementation methodologies
  • Expertise in OLTP/OLAP System Study, E-R modeling, developing Database Schemas (Star schema and Snowflake schema) used in relational and dimensional modeling
  • Experience on Informatica Cloud Integration for Amazon Redshift and S3
  • Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 for initial and history load using Informatica.
  • Expert in Performance tuning, troubleshooting, Indexing and partitioning techniques on Sources, Targets, Mappings and Workflows in Informatica.
  • Experienced in Teradata Parallel Transporter (TPT). Used full PDO on Teradata and worked with different Teradata load operators.
  • Designing and developing Informatica mappings including Type-I, Type-II, Type-III slowly changing dimensions (SCD).
  • Experienced in using advanced concepts of Informatica like push down optimization (PDO).
  • Validating data files against their control files and performing technical data quality checks to certify source file usage.
  • Strong data modeling knowledge: Dimensional Data Modeling, Star Schema, Snowflake Schema, Fact and Dimension tables.
  • Experienced in writing SQL, PL/SQL programming, Stored Procedures, Package, Functions, Triggers, Views, Materialized Views.
  • Experience in Performance Tuning and Debugging of existing ETL processes.
  • Experience in working with Power Exchange to process the VSAM files.
  • Hands on experience in writing UNIX shell scripts to process Data Warehouse jobs.
  • Coordinating with Business Users, functional Design team and testing team during different phases of project development and resolving the issues.
  • Good Knowledge of Hadoop Ecosystem (HDFS, HBase, Spark, Scala, Hive, Pig, Flume, NoSQL, MapReduce etc.)
  • Good skills in defining standards, methodologies and performing technical design reviews.
  • Excellent communication skills, interpersonal skills, self-motivated, quick learner, team player.
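
For illustration, the Type 1/Type 2 slowly changing dimension (SCD) pattern mentioned above can be sketched in plain Python. This is an illustrative sketch only, with a hypothetical record layout (`cust_id`, `city`); in Informatica itself this logic is typically built from Lookup and Update Strategy transformations rather than code.

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended end date for current rows

def apply_scd2(dimension, incoming, load_date):
    """Apply Type 2 SCD logic (hypothetical layout, for illustration).

    dimension: list of dicts with keys 'cust_id', 'city',
               'eff_date', 'end_date', 'is_current'.
    incoming:  list of dicts with keys 'cust_id', 'city';
               assumes at most one row per natural key per batch.
    """
    current = {r["cust_id"]: r for r in dimension if r["is_current"]}
    for row in incoming:
        old = current.get(row["cust_id"])
        if old is None:
            # New natural key: insert the first version as current.
            dimension.append({**row, "eff_date": load_date,
                              "end_date": HIGH_DATE, "is_current": True})
        elif old["city"] != row["city"]:
            # Tracked attribute changed: expire the old version, add a new one.
            old["end_date"] = load_date
            old["is_current"] = False
            dimension.append({**row, "eff_date": load_date,
                              "end_date": HIGH_DATE, "is_current": True})
        # Unchanged rows produce no new version (history is preserved).
    return dimension
```

A Type 1 load would instead overwrite the tracked attribute in place, keeping no history.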

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 9.6.x/8.x, Informatica PowerExchange 9.x/8.x, Informatica Data Quality 9.x

Languages: C, C++, SQL, PL/SQL, HTML, XML, UNIX Shell Scripting, Python, Javascript

Methodology: Agile, RUP, Scrum, Waterfall

Databases: Oracle 11g/10g, SQL Server 2012/2008/2005/2000, UDB DB2, Teradata 14/13, Sybase, MySQL

Operating Systems: Windows NT, 2003, 2007, UNIX, Linux, macOS High Sierra (10.13.4)

IDEs: PL/SQL Developer, TOAD, Teradata SQL Assistant

Modelling Tool: Erwin 9.1/7.2, MS Visio

Scheduling Tools: Control-M, Autosys

Hadoop / Big Data: Cloudera, HDFS, HBase, Spark, Scala, Hive, Pig, Flume, NoSQL, MapReduce

Reporting: Tableau 9.2, Cognos 9/8

Web Technologies: HTML, CSS, JavaScript, JQuery, API and Ajax, OOP, WordPress, Bootstrap, Flask and Django

Cloud Technologies: Amazon Web Services

Other Tools: JIRA, Notepad++, Teradata SQL Assistant, Teradata Viewpoint, MS Office, T-SQL, TOAD, SQL Developer, XML Files, IceScrum, Control-M, Autosys, GitHub, Oracle ERP, PuTTY, SharePoint, SVN

PROFESSIONAL EXPERIENCE:

Confidential, El Segundo, CA

Sr. Informatica Developer

Responsibilities:

  • Analyzed business requirements, technical specifications, source repositories, and data models for ETL mapping and process flow.
  • Worked extensively with mappings using expressions, aggregators, filters, lookup, joiners, update strategy and stored procedure transformations.
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement.
  • Developed mapping to load Fact and Dimension tables, for type 1 and type 2 dimensions and incremental loading and unit tested the mappings.
  • Extracted data from a web service source, transformed it using a web service, and loaded it into a web service target.
  • Built real-time web services that perform a lookup operation using a key column as input and return multiple rows of data belonging to that key.
  • Used the Web Service Provider Writer to send flat file targets as attachments and to send email from within a mapping.
  • Coordinate and develop all documents related to ETL design and development.
  • Involved in designing the Data Mart models with ERwin using Star schema methodology.
  • Used Repository Manager to create the repository and user groups, and managed users by setting up privileges and profiles.
  • Used the Debugger to debug mappings and corrected the errors found.
  • Performed Database tasks such as creating database objects (tables, views, procedures, functions).
  • Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
  • Optimized the mappings and implemented complex business rules by creating reusable transformations and Mapplets.
  • Involved in writing BTEQ, MLOAD and TPUMP scripts to load the data into Teradata tables.
  • Optimized the source queries to control temp space and added delay intervals, per business requirements, to improve performance.
  • Used Informatica Workflow Manager for creating and running batches and sessions, and for scheduling them to run at specified times.
  • Executed sessions, sequential and concurrent batches for proper execution of mappings and set up email delivery after execution.
  • Preparing Functional Specifications, System Architecture/Design, Implementation Strategy, Test Plan & Test Cases.
  • Implemented and documented all the best practices used for the data warehouse.
  • Improving the performance of the ETL by indexing and caching.
  • Created Workflows, tasks, database connections, FTP connections using workflow manager.
  • Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations and fixing bugs.
  • Conducted code walkthroughs with team members.
  • Developed stored procedures using PL/SQL and driving scripts using Unix Shell Scripts.
  • Created UNIX shell scripting for automation of ETL processes.
  • Used UNIX commands for check-ins and check-outs of workflows and config files into ClearCase.
  • Automated ETL workflows using Control-M Scheduler.
  • Involved in production deployment and later moved into warranty support until transition to production support team.
  • Monitored and reported issues for the daily, weekly, and monthly processes; resolved issues on a priority basis and reported them to management.

Environment: Informatica Power Center 9.6.1, IDQ 9.6.1, Oracle 11g, Teradata 14.1.0, WebService, Teradata SQL Assistant, MSSQL Server 2012, DB2, Erwin 9.2, Control-M, Putty, Shell Scripting, Clearcase, WinSCP, Notepad++, JIRA, Hyperion Server, OBIEE Reporting.

Confidential, San Jose, CA

Informatica Developer/ Data Analyst

  • Participated in the Design Team and user requirement gathering meetings.
  • Extensively used Informatica Designer to create and manipulate source and target definitions, mappings, Mapplets, transformations, re-usable transformations.
  • Created different source definitions to extract data from flat files and relational tables for Informatica Power Center.
  • Used the Star Schema approach for designing the database for the data warehouse.
  • Developed a standard ETL framework to enable the reusability of similar logic across the board.
  • Created different target definitions using warehouse designer of Informatica Power center.
  • Created different transformations such as Joiner Transformations, Look-up Transformations, Rank Transformations, Expressions, Aggregators and Sequence Generator.
  • Created stored procedures, packages, triggers, tables, views, synonyms, and test data in Oracle.
  • Extracted source data from Oracle, SQL Server, Flat files, XML files using Informatica, and loaded into Netezza target Database.
  • Extensively transformed the existing PL/SQL scripts into stored procedures to be used by Informatica Mappings with the help of Stored Procedure Transformations.
  • Used PL/SQL whenever necessary inside and outside the mappings.
  • Created Models based on the dimensions, levels and measures required for the analysis.
  • Validated the data in the warehouse and data marts after the loading process by balancing it against the source data.
  • Created, launched & scheduled sessions.
  • Fixed Performance issue in Informatica mappings.
  • Used Teradata utilities such as BTEQ, FastLoad, MultiLoad, and FastExport.
  • Customized UNIX shell scripts for file manipulation, ftp and to schedule workflows.
  • Created design specification which includes BI dependency plan, job scheduling and cycle management documents
  • Worked closely with the business analysts and Production Support to resolve JIRA issues.
  • Coordinated with the offshore team on a daily basis to enable faster development.
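
The source-to-target balancing step described above can be sketched as a simple row-count reconciliation in Python. This is an illustrative sketch only; the table names are hypothetical, and in practice the counts would come from SQL queries against the source and target databases.

```python
def balance_check(source_counts, target_counts):
    """Reconcile per-table row counts between source and target.

    source_counts / target_counts: dicts mapping table name -> row count.
    Returns a list of (table, source_rows, target_rows) tuples for every
    table that is missing from the target or whose counts disagree.
    """
    mismatches = []
    for table, src_rows in source_counts.items():
        tgt_rows = target_counts.get(table)  # None if the table was never loaded
        if tgt_rows != src_rows:
            mismatches.append((table, src_rows, tgt_rows))
    return mismatches
```

An empty result certifies that the load is balanced; any tuple returned flags a table for investigation before the data mart is signed off.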

Environment: Informatica Power Center 9.0.1, Informatica Power Exchange 9.0.1, Informatica Data Quality 9.0.1, Cognos 9.0, Linux, SQL, PL/SQL, Oracle 11g, TOAD, Teradata, Aginity Workbench for Netezza, SQL Server 2012, Control M, Shell Scripting, XML, SQL Loader

Confidential

Informatica Developer

  • Using Informatica Designer, designed mappings that populated the data into the target star schema on an Oracle instance.
  • Optimized Query Performance, Session Performance and Reliability.
  • Extensively used Router, Lookup, Aggregator, Expression and Update Strategy Transformations.
  • Tuning the Mappings for Optimum Performance, Dependencies and Batch Design.
  • Scheduled and ran extraction and load processes and monitored sessions using Informatica Workflow Manager.
  • Scheduled the batches to be run using the Workflow Manager.
  • Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs.
  • Performed Unit testing and Integration testing on the mappings in various schemas.
  • Optimized the mappings that had shown poor performance.
  • Monitored sessions that were scheduled, running, completed or failed. Debugged mappings for failed sessions.
  • Involved in writing UNIX shell scripts for Informatica ETL tool to run the sessions.
  • Coordinated between the development and testing teams for robust and timely development of a fully integrated application.
  • Constantly monitored application attributes to ensure conformance to functional specifications.
  • Mentored development team members on ETL logic and performed code and document reviews.

Environment: Informatica Power Center 8.6, Informatica, SQL Server 2008, Oracle 10g, Shell Scripts, Erwin, TOAD, UNIX, Cognos 9, SQL, PL/SQL, UNIX, Toad, SQL Developer, HP Quality Center

Confidential

Informatica Developer

Responsibilities:

  • Used Informatica ETL to load data from flat files, including fixed-length as well as delimited files, and from SQL Server into the data mart on an Oracle database.
  • Used reverse engineering in Erwin 4.x to understand the existing data model of the data warehouse.
  • Worked with creating Dimensions and Fact tables for the data mart.
  • Created Informatica mappings, sessions, workflows, etc., for loading fact and dimension tables for data mart presentation layer.
  • Have implemented SCD (Slowly Changing Dimensions) Type I and II for data load.
  • Did performance tuning of Informatica components for daily and monthly incremental loading tables.
  • Developed Mapplets, reusable transformations, source and target definitions and mappings using Informatica 7.1.
  • Developed mapping using parameters and variables.
  • Created complex workflows, with multiple sessions, worklets with consecutive or concurrent sessions.
  • Used Timer, Event Raise, Event Wait, Decisions, and Email tasks in Informatica Workflow Manager.
  • Used Workflow Manager for creating, validating, testing, and running sequential and concurrent batches.
  • Implemented source- and target-based partitioning for existing workflows in production to improve performance and reduce run time.
  • Analyzed workflow, session, event, and error logs for troubleshooting the Informatica ETL process.
  • Worked with Informatica Debugger to debug the mappings in Informatica Designer.
  • Involved in creating test plans, test cases to unit test Informatica mappings, sessions and workflows.
  • Involved in migrating Informatica ETL application and Database objects through various environments such as Development, Testing, UAT and Production environments.
  • Documented and presented the production support documents for the components developed when handing over the application to the production support team.
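
The parameter-and-variable pattern behind the incremental loads above can be sketched in Python, where a persisted high-water mark stands in for an Informatica mapping variable (such as a $$LastRunDate-style variable). The field names are illustrative only.

```python
from datetime import datetime

def incremental_extract(rows, last_run):
    """Select only rows modified since the previous run.

    rows:     list of dicts, each with an 'updated_at' datetime
              (hypothetical source layout).
    last_run: the high-water mark persisted from the previous run.
    Returns (changed_rows, new_high_water_mark); the new mark is
    persisted for the next run, as a mapping variable would be.
    """
    changed = [r for r in rows if r["updated_at"] > last_run]
    # If nothing changed, carry the old mark forward unchanged.
    new_mark = max((r["updated_at"] for r in changed), default=last_run)
    return changed, new_mark
```

Only the delta since `last_run` flows into the load, which is what keeps the daily and monthly incremental runs small.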

Environment: Informatica PowerCenter 8.1, Workflow Manager, Workflow Monitor, Erwin 4.0/3.5.2, TOAD 8.6.1.0, PL/SQL, Flat files, XML, Oracle 10g/9i
