
Senior Informatica Developer Resume


Omaha, NE

PROFESSIONAL SUMMARY:

  • IT professional with 8 years of experience in Data Warehousing using Informatica PowerCenter 9.x/8.x/7.x (Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Transformation Developer) and PowerExchange
  • Heavily involved in complete life cycle of enterprise data warehouse
  • Experienced in OLTP/OLAP system study, analysis and ER modeling; developed database schemas such as Star schema and Snowflake schema used in relational and multidimensional modeling using Erwin/MS Visio
  • Strong knowledge of relational database concepts, entity relationship diagrams, and normalization and de-normalization concepts
  • Ability to write complex SQL needed for ETL jobs and data analysis; proficient with databases such as Oracle 12c/11g/10g/9i, SQL Server 2012/2008/2005, Teradata, Netezza, DB2 and Sybase, as well as flat files, COBOL files and XML files
  • Experienced in creating interactive UI designs (Siebel Answers, Delivers, and Dashboards), including user interface design using CSS, HTML, XML and JavaScript
  • Developed complex mappings using various transformations such as Unconnected/Connected Lookup, Router, Filter, Expression, Sorter, Aggregator, Joiner, Union, Update Strategy, and less commonly used transformations such as Java and Stored Procedure
  • Good exposure to Development, Testing, Debugging, Implementation, Documentation, End-user training and Production support
  • Expertise in Unit Testing, Integration Testing, System Testing and Data Validation for developed Informatica Mappings
  • Well versed in using OBIEE Answers to create queries, format views and charts, and add user interactivity and dynamic content to enhance the user experience
  • Extensively worked with DBAs and BAs toward performance tuning in OBIEE and Data Warehouse environments using cache management, aggregate tables and indexes
  • Research-oriented; raises issues upfront and addresses them as soon as they are identified
  • Experienced in doing Error Handling and Troubleshooting using various log files
  • Experienced in using Batch, Perl and UNIX shell scripts
  • Strong knowledge of mainframe environments and CA7 scheduling
  • Quick learner with strong communication and people skills who can work in a team or independently

TECHNICAL SKILLS:

Data Warehousing Tools: Informatica PowerCenter 9.5/9.1/8.x/7.x, Informatica Developer/Data Quality 9.5/9.1, Informatica PowerExchange

Business Intelligence Tools: OBIEE 10.1.3.x, Siebel Analytics 7.x

Schedulers: Cisco Tidal and Control-M

Databases: SQL Server 2012/2008/2005, Oracle 12c/11g/10g/9i, Teradata V2R6, Netezza, Sybase and DB2

Languages: SQL, PL/SQL, C, C++, Java and HTML

ERP Tool: SAP

Database Utilities: TOAD 8.0/7.1, Aginity, Oracle SQL Developer, SQL Server Management Studio and Teradata Studio

Operating Systems: Red Hat Linux and Windows XP/Vista/7

Data Modeling: Ralph Kimball Methodology, Bill Inmon Methodology, Star Schema, Snowflake Schema, Physical and Logical Modeling, Dimensional Data Modeling, Fact Tables, Dimension Tables, Normalization and De-normalization

WORK EXPERIENCE:

Confidential, Omaha, NE

Senior Informatica Developer

Responsibilities:

  • Understood business requirements and enhanced the existing data warehouse architecture
  • Used Informatica Designer to create load and update mappings with different transformations to move data into the different data marts in the data warehouse
  • Loaded data from various source systems such as Oracle, flat files, ODS and SQL Server into the staging area and then into the Netezza target
  • Developed and documented data mappings, transformations and Informatica sessions
  • Developed PL/SQL stored procedures and triggers for various data cleansing activities
  • Used SQL*Loader to load data from flat files
  • Created source, target, transformations, sessions, batches and defined schedules for the sessions
  • Worked on Cisco Tidal for scheduling and automating Informatica workflows
  • Used Shell Scripting to automate the loading process
  • Used an incremental loading technique to load only new and changed data into the enterprise data warehouse (see the SQL override sketch after this list)
  • Identified and tracked slowly changing dimensions, handled heterogeneous sources and determined the hierarchies in dimensions
  • Performed data profiling using Informatica Developer to empower data analysts to investigate and document data quality issues
  • Involved in version control of code from development to test and Production environments.
  • Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Source, Target, Mapplet, and Transformation objects.
  • Involved in Unit & Integration Testing of Mappings & Sessions.
  • Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks and used Debugger to debug the complex mappings and fix them.
  • Involved in migrating Mappings, Sessions and Workflows between development, test and production environments.
  • Scheduled sessions and batch processes (on demand, at a set time, or run only once) using the Informatica scheduler
  • Troubleshot various reports and different charts, including line and pie charts, used for analysis
  • Designed OBIEE reports with slice-and-dice and drill-down analysis
  • Worked with DBAs and BAs toward performance tuning in OBIEE and Data Warehouse environments using cache management, aggregate tables and indexes
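
A minimal sketch of the kind of Source Qualifier SQL override used for the incremental loads above. The table, columns and the $$LAST_LOAD_DATE mapping parameter are illustrative; the parameter would be supplied from a parameter file and advanced after each successful run.

    -- Pull only the rows changed since the previous successful load
    SELECT ord.order_id,
           ord.customer_id,
           ord.order_amt,
           ord.last_update_ts
    FROM   stg_orders ord
    WHERE  ord.last_update_ts > TO_DATE('$$LAST_LOAD_DATE', 'YYYY-MM-DD HH24:MI:SS')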

Environment: ETL/Informatica Power Center 9.5/9.1, Informatica Power Exchange 9.1, Oracle 12c, Teradata, Netezza, PL/SQL, XML, Flat files, UNIX, Toad, SQL Developer, Cisco Tidal, DB2, Perl, Batch, Windows Server 2008

Confidential, Miami, FL

Senior Informatica Developer

Responsibilities:

  • Responsible for gathering business requirements and translating them into technical documents
  • Developed mappings using various transformations to apply business user requirements and business rules, loading data from Oracle and other sources into the Teradata target and eliminating data that was not required
  • Extensively used Teradata SQL Assistant to extract data from the data warehouse
  • Expertise in debugging and production support
  • Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Control, Command, Decision, Session in the workflow manager
  • Made use of Post-Session success and Post-Session failure commands in the Session task to execute scripts needed for cleanup and update purposes
  • Extensively worked with Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer and Mapplet Designer
  • Developed mappings and sessions using Informatica Power Center 9.1/Power Exchange for data loading
  • Implemented error handling techniques
  • Involved in data quality profiling, standardization and testing
  • Monitored daily loads and provided on-call (L2) production support during data loads (a sample validation query is sketched after this list)
  • Migrated repository objects and scripts from the development environment to the QA and production environments; gained extensive experience troubleshooting and resolving migration and production issues
  • Used transformations such as Joiner, Expression, Connected and Unconnected Lookup, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Router, Sorter and Sequence Generator
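
A hypothetical example of the kind of validation query run in Teradata SQL Assistant while monitoring daily loads; the staging and warehouse table names are purely illustrative.

    -- Compare staged row counts against the warehouse target by load date;
    -- any row returned points at a load discrepancy to investigate.
    SELECT stg.load_dt,
           stg.row_cnt              AS staged_rows,
           COALESCE(tgt.row_cnt, 0) AS loaded_rows
    FROM   (SELECT load_dt, COUNT(*) AS row_cnt FROM stg_claims GROUP BY load_dt) stg
    LEFT JOIN
           (SELECT load_dt, COUNT(*) AS row_cnt FROM dw_claims GROUP BY load_dt) tgt
      ON   stg.load_dt = tgt.load_dt
    WHERE  COALESCE(tgt.row_cnt, 0) <> stg.row_cnt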

Environment: ETL/Informatica Power Center 9.1, Informatica Power Exchange, Oracle 11g, Teradata, PL/SQL, XML files, Flat files, UNIX, Toad, DB2, Perl, Shell Scripting, HPQC

Confidential, Boston, MA

ETL/Informatica Developer

Responsibilities:

  • Involved in transferring data from the OLTP systems that formed the extraction sources
  • Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system
  • Analyzed the sources, transformed the data, mapped the data and loaded the data into targets using Power Center Designer
  • Designed and developed Oracle PL/SQL procedures and shell scripts (a sample cleansing procedure is sketched after this list)
  • Participated in the design of Snowflake schema data model
  • Designed and developed end-to-end ETL process from various source systems to Staging area, from staging to Data Marts
  • Worked on Informatica Utilities - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer
  • Worked with various transformations like Source Qualifier, Expression, Filter, Aggregator, Lookup, Update Strategy, Stored Procedure, Sequence generator, Joiner transformations
  • Responsible for creating business solutions for Incremental and full loads
  • Involved in creating Shell Scripts to automate Pre-Session and Post-Session Processes
  • Developed workflows, sessions and job groups to schedule the loads at the required frequency using the Cisco Tidal scheduler
  • Involved in data quality profiling, standardization and testing
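
A minimal PL/SQL sketch of the kind of cleansing procedure developed here; the staging table and columns are hypothetical.

    -- Standardize customer names and blank out phone numbers that are not
    -- ten digits before the staging data is picked up by the Informatica load.
    CREATE OR REPLACE PROCEDURE cleanse_stg_customer IS
    BEGIN
      UPDATE stg_customer
         SET cust_name = UPPER(TRIM(cust_name)),
             phone_num = CASE
                           WHEN REGEXP_LIKE(phone_num, '^[0-9]{10}$') THEN phone_num
                           ELSE NULL
                         END;
      COMMIT;
    END cleanse_stg_customer;
    /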

Environment: Informatica Power Center 8.6.1, Oracle 10g, SQL, PL/SQL, Quest TOAD and Windows Server 2003

Confidential, Dublin, Ohio

ETL/Informatica Developer

Responsibilities:

  • Acted as a key contributor for Production support and involved in supporting the Data Integration application for ETL
  • Referenced Functional Design Specifications (FDS) and Systems Design Specifications (SDS) to understand the overall application
  • Worked as onsite lead in the Onsite/Offshore Model Production Support
  • Monitored data loads and enhanced the existing application for business changes, improvements and audits
  • Performed performance tuning of the mappings to handle increasing data volumes
  • Resolved different issues in the support environment such as code defects, connectivity problems, data issues, scheduling conflicts, dependency-job management, space issues on the UNIX server or databases, capacity planning and system maintenance, identifying each issue and working with the appropriate teams to resolve production problems in a timely manner
  • Responsible for proper knowledge transfer from the development team, ensuring all required documents were handed over
  • Responsible for maintaining various logs such as the mapping status document, issue log and defect detection log
  • Responsible for analyzing error table data as well as rejected data and fixing them
  • Played an active role in preparing the ETL specification documents for customization
  • Managed the workload by distributing it appropriately among the team and prioritizing issues by severity in a timely manner
  • Quickly investigated and identified the root causes of issues, engaging other teams when an issue was external, and fixed them as soon as possible

Environment: Informatica 8.1, Oracle 9.1, Teradata V2R5/ V2R6, SAP R/3, SQL server 8.0, UNIX and Unicenter.

Confidential, Detroit

Sr. Informatica Developer

Responsibilities:

  • Understood the customer requirements and analyzed and resolved discrepancies in the business requirements
  • Worked to design an architecture that isolates the application component (business context) of the data integration solution from the technology, which also promotes reuse of skills, design objects and knowledge
  • Prepared the Technical Design documents as per the Functional Requirements document and Logical Data Model
  • Worked with IDE/IDQ to create profiles identifying different patterns in source data, and reviewed the results with the business team for analysis and correction
  • Extensively used Informatica to load data from VSAM, flat files and DB2
  • Developed Power Exchange data maps that were used to pull the data from files on the Mainframe / VSAM files
  • Developed shell scripts for retrieving files from the FTP server, archiving the source files, concatenating files and delivering them to a remote shared drive
  • Developed error logic to streamline and automate the data loads and cleanse incorrect data, and developed an auditing mechanism to maintain load statistics for transactional records (a sample audit insert is sketched after this list)
  • Designed and developed standard load strategies to load data from source systems into the Atomize database, the final target system
  • Prepared documentation on the design, development, implementation, daily loads and process flow of the mappings and participating in review design documents
  • Used IBM DB2 Control Center and Command Editor to check table designs and the records populated in the database, and tested the data to verify it was loaded correctly into the required schemas per the business requirements
  • Extensively involved in performance tuning of mappings, sessions and database tables, using parameter files, variables, caching mechanisms and SQL overrides
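
A hypothetical sketch of the auditing mechanism mentioned above: one row recorded per session run to retain load statistics. The DB2 table, columns and values are purely illustrative; in practice they would be populated from session metadata in a post-session step.

    -- Record the outcome and row counts of each load for later auditing
    INSERT INTO etl_load_audit
           (workflow_name, session_name, load_dt,
            src_row_cnt, tgt_row_cnt, rejected_cnt, load_status)
    VALUES ('wf_claims_daily', 's_m_load_claims', CURRENT DATE,
            152340, 152298, 42, 'SUCCESS')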

Environment: Informatica Power Center 7.1.1, Mainframe source system, DB2, AIX OS as platform and CA7 Scheduling tool.
