
Sr. Informatica Developer Resume

Tucson, AZ

SUMMARY:

  • 9+ years of ETL development experience, including Informatica PowerCenter data warehousing implementations and data integration experience developing ETL mappings and scripts using Informatica PowerCenter 9.5.1/8.6.1/7.1 and PowerExchange 9.5.1/8.6. Experience in all phases of the data warehouse life cycle: requirement analysis, design, coding, testing, and deployment.
  • Extensively worked on ETL mappings and on the analysis and documentation of OLAP reporting requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
  • Strong knowledge of Entity-Relationship concepts, fact and dimension tables, slowly changing dimensions, and dimensional modeling (Star Schema and Snowflake Schema).
  • Extensively worked on data extraction, transformation, and loading from various sources such as Oracle, SQL Server, and flat files (.txt, .dat, .csv & .xml).
  • Proficient in optimizing database queries, data manipulation, and population using SQL, PL/SQL, and utilities in Oracle 11g/10g/9i, Teradata 13/12/V2R6, DB2 UDB, SQL Server 2008/2000, and Sybase databases.
  • Experience in integration of various data sources like Oracle, DB2, Sybase, SQL server and MS access and non-relational sources like flat files into staging area and DWH DB.
  • Experience in creating Reusable components like transformation, mapplets and tasks.
  • Loaded data from various data sources and legacy systems into the Teradata production warehouse using the BTEQ, FastExport, MultiLoad, and FastLoad utilities.
  • Experience in writing, testing and implementation of the PL/SQL triggers, stored procedures, functions, packages.
  • Excellent working knowledge of UNIX shell scripting, job scheduling on multiple platforms, experience with UNIX command line and LINUX.
  • Worked on CDC using Informatica PowerExchange to load data in Real-time mode.
  • Proficient in interaction with the business users by conducting meetings with the clients in Requirements Analysis phase.
  • Assigned work and provided technical oversight to onshore and offshore developers.
  • Excellent analytical/communication skills and good team player.
  • Worked on workflow parameters & variables to run the workflow dynamically.
  • Good experience communicating with customers regarding requirements, issues, and tasks.
  • Good understanding of Data Warehousing Concepts.
  • Created test cases for the developed mappings and prepared unit test plan documents.
  • Hands-on experience in team leading, team coordination, and problem solving.
  • Hands-on experience in database programming and complex SQL queries.
  • Worked at client locations and coordinated team activities.
  • Excellent communication, documentation and presentation skills working in a team and onsite-offshore model.

WORK EXPERIENCE:

Sr. Informatica Developer

Confidential, Tucson, AZ

Responsibilities:

  • Prepared design specification documents as per the inputs received from the Architect and the Business Analyst.
  • Extracted data from heterogeneous source systems such as Oracle, SQL Server, and flat files (fixed-width and delimited).
  • Involved in cleansing and extraction of data and defined the quality process for the warehouse.
  • Developed Informatica ETL mappings, sessions and workflows based on the technical specification document.
  • Created mappings using transformations such as Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
  • Designed and developed the logic for loading Slowly Changing Dimension tables, flagging records with an Update Strategy transformation to populate the desired target tables.
  • Developed reusable mapplets and transformations for reusable business calculations.
  • Used exception-handling logic in all mappings to handle null values and rejected rows.
  • Tuned the ETL components to improve performance and avoid disruptions to business continuity.
  • Worked with Persistent Caches for Conformed Dimensions for the better performance and faster data load to the data warehouse
  • Involved in performance tuning and optimization of Informatica Mappings and Sessions using partitions and data/index cache to manage very large volume of data
  • Performed query overrides in Lookup Transformation as and when required to improve the performance of the Mappings
  • Developed Oracle PL/SQL components for row level processing.
  • Dropped and recreated indexes before and after loading through pre-SQL and post-SQL.
  • Developed Unix scripts for processing Flat files.
  • Scheduled the jobs in the Appworx.
  • Prepared Test Data and loaded it for Testing, Error handling and Analysis
  • Prepared the test cases and tested the ETL components for end to end process.
  • Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for Unit Testing, Systems Testing, expected results
  • Created an Issue Log to identify the errors and used it for preventing any such errors in future development works.
  • Worked on the production code fixes and data fixes
  • Responsible to troubleshoot the problems by monitoring all the Sessions that are scheduled, completed, running and used Debugger for complex problem troubleshooting.
  • Worked with Application support team in the deployment of the code to UAT and Production environments
  • Involved in production support, working on mitigation tickets created while users were retrieving data from the database.
  • Worked as part of a production support team and provided 24x7 support.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the Mappings
  • Deployed Metadata Manager features such as metadata connectors for data-integration visibility, advanced search and browse of the metadata catalog, data lineage, and visibility into data objects, rules, transformations, and data.
  • Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
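
The Slowly Changing Dimension loads described above flag each incoming record before applying it through an Update Strategy. A minimal Python sketch of that Type 2 flagging decision is below; the key and attribute column names (`cust_id`, `city`) are illustrative assumptions, not the actual mapping logic.

```python
from datetime import date

def scd2_flag(incoming, dimension):
    """Flag each incoming row as INSERT (new key), EXPIRE+INSERT (changed
    attributes create a new version), or IGNORE (no change).
    Column names are hypothetical examples."""
    actions = []
    for row in incoming:
        current = dimension.get(row["cust_id"])
        if current is None:
            actions.append(("INSERT", {**row, "eff_date": date.today(), "curr_flag": "Y"}))
        elif current["city"] != row["city"]:
            actions.append(("EXPIRE", current))  # close out the old version
            actions.append(("INSERT", {**row, "eff_date": date.today(), "curr_flag": "Y"}))
        else:
            actions.append(("IGNORE", row))
    return actions
```

In PowerCenter this decision would feed the Update Strategy's DD_INSERT/DD_UPDATE flags; the sketch only illustrates the branching.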

Environment: Informatica Power Center 8.6, Informatica Data Quality 9.x, IDQ admin console, UC4, Oracle 11g, Toad for Oracle, MS SQL Server 2008, AIX server, Unix, Appworx, Winscp, Putty, Serena Version Manager

Informatica Developer

Confidential, Omaha, NE

Responsibilities:

  • Involved in creating mappings, sessions and workflows as per user requirements and tested it.
  • Performed Data conversions using various transformations such as Source Qualifier, Filter, Router, Aggregator, Union, Lookup, Update Strategy, Sequence Generator, Stored Procedure and configured according to Business Requirements.
  • Extensively used mapplets and Reusable Transformations for reusability of mapping Logic.
  • Implemented complex business logic by writing PL/SQL stored procedures.
  • Involved in writing queries for verifying the targets data using TOAD.
  • Worked on databases like Oracle, Sybase and DB2 to extract and load data.
  • Developed different reports using Oracle Application Express and PL/SQL.
  • Developed PL/SQL Scripts to write data into flat files and then loaded data from flat files to custom tables.
  • Designed and Created mappings for Initial/Incremental Loads.
  • Used debugger to debug mappings to gain troubleshooting information about data and error conditions.
  • Worked with SQL*Loader tool to load the bulk data into Staging Database.
  • Prepared UNIX shell scripts and scheduled them in Autosys for automatic execution at specific times.
  • Prepared unit test cases (UTC) for unit testing of mappings, system testing, and user acceptance testing.
  • Tracked defects and produced reports using Rational ClearQuest.
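
The bullets above describe PL/SQL scripts writing data to flat files that are then bulk-loaded (e.g. with SQL*Loader). A small Python sketch of producing such a delimited flat file follows; the pipe delimiter and column list are illustrative choices, not the project's actual layout.

```python
import csv
import io

def write_flat_file(rows, columns, delimiter="|"):
    """Render rows (a list of dicts) as a delimited flat file suitable for
    a bulk loader such as SQL*Loader. Returns the file contents as a
    string so the result is easy to inspect; in practice it would be
    written to disk and referenced from a loader control file."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=delimiter, lineterminator="\n")
    for row in rows:
        writer.writerow([row.get(c, "") for c in columns])
    return buf.getvalue()
```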

Informatica Developer

Confidential. Newark, DE

Responsibilities:

  • Participated in understanding the business needs and implemented them in a functional database design and data integration.
  • Involved in designing a real-time data warehouse schema and created logical and physical data models. Designed dependency constraints and indexes on OLAP tables.
  • Created the high-level design doc (HLD), ETL design doc (LLD), unit test cases (UTC), and naming template.
  • Guided team members in developing the mappings.
  • Identified reusable logic across the table-load logic and created reusable components such as reusable expressions, lookups, mapplets, tasks, and worklets.
  • Used Informatica user-defined functions (UDFs) to reduce code duplication.
  • Handled versioning and dependencies of various jobs in Informatica.
  • Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
  • Created Informatica mappings, workflows and reviewed codes of teammates.
  • Performing ETL & database code migrations across environments.
  • Created Informatica PowerExchange Registration and Datamap to capture data in real-time and batch mode.
  • Designed real-time data flows to various tables in 24/6 mode. Created restart tokens and configured real-time workflow connections, recovery, and the error-logging process.
  • Provided production support by running the jobs and fixing the bugs.
  • Took backups of the repository at regular intervals depending on the amount of work done.
  • Knowledge of schedulers such as Tivoli Workload Scheduler (TWS) and the Job Scheduling Console (Maestro).
  • Involved in admin tasks such as analyzing tablespace requirements, load balancing, and performance.
  • Used Teradata utilities such as FastLoad, MultiLoad, and TPump to load data to the staging area.
  • Created Shared and Distributed Catalogs for the Users, also Created Calculations, conditions and Prompts in catalogs by using Oracle Procedures and Packages.
  • Created Materialized Views, Synonyms, standard SQL Hints template, Partitioning of Tables, Clustered and Non-clustered indexing.
  • Extensive work on the performance tuning of Oracle and ETL Process.
  • Developed Standard Reports, Cross-tab Reports Charts, and Drill through Crystal Reports.
  • Created a Data feed for EPIC project that would provide Charges, Encounters at the Unit level.
  • Coordinated Onsite and Offshore team and tracked the work.
  • Created various types of reports using OLAP tool OBIEE.
  • Worked with Agile methodology and played the Scrum Master role on a rotation basis.
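
The PowerExchange registrations and datamaps above capture source changes for real-time and batch loads. As a simplified stand-in (real PowerExchange CDC reads source change logs, not table snapshots), the idea of change data capture can be sketched by diffing two keyed snapshots:

```python
def capture_changes(previous, current):
    """Return (inserts, updates, deletes) between two snapshots keyed by
    a surrogate id. A toy illustration of CDC semantics only; the
    snapshot-diff approach here is an assumption for demonstration."""
    inserts = [current[k] for k in current if k not in previous]
    updates = [current[k] for k in current
               if k in previous and current[k] != previous[k]]
    deletes = [previous[k] for k in previous if k not in current]
    return inserts, updates, deletes
```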

Environment: Informatica Power Center 8.6/8.1 (Designer, Repository Manager, Workflow Manager, Workflow Monitor), HL7 3.x/2.4, HIPAA, Epic Systems, Oracle 11g/10g, AS400, DB2 8.0, SQL Server 2008/2005 (Enterprise Manager, Query Analyzer), Maestro, Reflection FTP, OBIEE, BOXI, T-SQL, UNIX-AIX version 5, Visio 2003, Erwin, Siebel

Informatica Developer

Confidential, Somers, NY

Responsibilities:

  • Involved in the analysis of the user requirements and identifying the sources.
  • Created technical specification documents based on the requirements using source-to-target (S2T) documents.
  • Involved in the preparation of High level design documents and Low level design documents.
  • Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, ODS and Mart.
  • Prepared ETL standards, Naming conventions and wrote ETL flow documentation for Stage, ODS and Mart.
  • Administered the repository by creating folders and logins for the group members and assigning necessary privileges.
  • Used Trillium as a Data Cleansing tool to correct the data before loading into the staging area.
  • Designed process flows to link metadata from diverse sources, including relational databases and flat files.
  • Designed and developed Informatica Mappings and Sessions based on business user requirements and business rules to load data from source flat files and oracle tables to target tables.
  • Worked on various kinds of transformations like Expression, Aggregator, Stored Procedure, Java, Lookup, Filter, Joiner, Rank, Router and Update Strategy.
  • Developed reusable Mapplets, Transformations, email task and command task.
  • Used debugger to debug mappings to gain troubleshooting information about data and error conditions.
  • Extensively worked on Informatica tuning: customized caches, partitioned sessions, and created session configurations for a few selected sessions for cache and error-logging setup.
  • Involved in monitoring the production system and optimizing the load time.
  • Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
  • Involved in writing stored procedures, functions and Packages in PL/SQL.
  • Developed mappings in Informatica using BAPI and ABAP function calls in SAP.
  • Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
  • Worked with SQL*Loader tool to load the bulk data into Database.
  • Prepared UNIX shell scripts and scheduled them in Autosys for automatic execution at specific times.
  • Involved in the Informatica upgrade and code modification from version 8.1 to 8.6.
  • Used Rational ClearCase to control versions of all files and folders (check-out, check-in).
  • Created Informatica PowerExchange registration, data map to extract data in real-time and batch mode from DB2 and Mainframe files.
  • Prepared Test Scenarios and Test cases in HP Quality Center and involved in unit testing of mappings, system testing and user acceptance testing.
  • Tracked defects and produced reports using Rational ClearQuest.
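
The performance-tuning bullets above describe isolating bottlenecks at the source, transformation, target, or session level. The general technique of timing each stage in isolation to find the slowest one can be sketched as follows; the stage names are placeholders, not actual session components.

```python
import time

def find_bottleneck(stages):
    """Time each pipeline stage (name -> callable run in isolation) and
    return the slowest stage plus all timings. This mirrors the practice
    of swapping in flat-file targets or pass-through mappings to measure
    one component at a time."""
    timings = {}
    for name, fn in stages.items():
        start = time.perf_counter()
        fn()
        timings[name] = time.perf_counter() - start
    slowest = max(timings, key=timings.get)
    return slowest, timings
```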

Informatica Developer

Confidential, Marlborough, MA

Responsibilities:

  • Participated in Data Modeling meetings and worked on Dimension modeling.
  • Created LDM and PDM through ERWIN and documented description of each components like Tables, columns, Indexes, Constraints and Dependencies.
  • Created Dimension Tables and Fact Tables based on the warehouse design.
  • Wrote Triggers and Stored Procedures using PL/SQL for Incremental updates.
  • Integrated various sources into the staging area to consolidate and cleanse data.
  • Involved in identifying bugs in existing mappings by analyzing data flow, evaluating transformations using Debugger.
  • Worked extensively on designing Data Mapping flow from various source systems to target systems
  • Involved in identifying Informatica bottleneck and Tuning various components of designer and workflow manager.
  • Developed Teradata scripts to write data into flat files and then loaded data from the flat files into custom tables through FastExport, FastLoad, and MultiLoad.
  • Created a report-generator script using Teradata BTEQ scripts.
  • Created Stored Procedures, Triggers, Functions, Views and Packages using PL/SQL.
  • Developed different reports using Oracle Application Express and PL/SQL.
  • Created scripts using SQL and PL/SQL to generate bulk data for testing team to test the performance of the application and the same data was used by DBA team to test the server performance.
  • Participating in QA and UAT process to clarify questions and fixing identified issues.
  • Performed production support activities in Data Warehouse including monitoring and resolving production issues, pursue information, bug - fixes and supporting end users.
  • Created Reusable Transformations and Mapplets to use in Multiple Mappings.
  • Documented run processes of system for effective production system monitoring, support, tracking and resolving issues.

Informatica Developer

Confidential, Somers, NY

Responsibilities:

  • Developed mappings to move data from Source systems and populate in the Data Warehouse.
  • Developed PL/SQL Procedures, Functions and Packages to load data.
  • Member of the Production Support rotation and Informatica Administration teams.
  • Designed and developed Type 1 and Type 2 SCDs (Slowly Changing Dimensions).
  • Created sessions, Command task, Email task, Event wait, Event raise and workflows.
  • Conducted the meeting with business representatives for requirement analysis and to define business and functional specifications.
  • Involved in testing the mappings and debugging process of the queries.
  • Performance tuning, maintain and fix production issues of existing code. Modify existing code as per the new business requirements.
  • Created reusable transformations and used them in the mappings.
  • Involved in preparing documents - TSD and UTP.
  • Prepared individual mapping specification documents and Test case.
  • Involved in identifying bugs in existing mappings by analysing data flow, evaluating transformations using Debugger.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
  • Worked on XML sources to extract data and send it to the corresponding applications.
  • Conducted database testing to check constraints, field sizes, indexes, and stored procedures.
  • Defects were tracked, reviewed, analysed, and resolved through Jira.
  • Conducted VAT (volume analysis testing), SIT (system integration testing), and UAT (user acceptance testing) with the user community.
  • Involved in creating various UNIX scripts that help schedule the ETL jobs and assist in the initial validation of various tasks.
  • Involved in SQL and PL/SQL tuning using TOAD and the query block builder.
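
Extracting data from XML sources, as described above, amounts to flattening repeating record elements into rows before loading. A small Python sketch using the standard library is below; the `order` record tag and field names are illustrative assumptions.

```python
import xml.etree.ElementTree as ET

def extract_rows(xml_text, record_tag, fields):
    """Parse an XML document and flatten each <record_tag> element into a
    dict of the requested child-element texts, ready to hand to a loader.
    Missing children come back as None rather than raising."""
    root = ET.fromstring(xml_text)
    rows = []
    for rec in root.iter(record_tag):
        rows.append({f: rec.findtext(f) for f in fields})
    return rows
```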

Informatica Developer

Confidential, San Francisco, CA

Responsibilities:

  • Coordinated with various business users, stakeholders, and SMEs for functional expertise, design and business test-scenario reviews, UAT participation, and validation of data from multiple sources.
  • Defined and developed brand-new standard design patterns, ETL frameworks, data model standards and guidelines, and ETL best practices.
  • Provided technical design leadership to this project to ensure the efficient use of offshore resources and the selection of appropriate ETL/CDC logic.
  • Performed detailed data investigation and analysis of known data quality issues in related databases through SQL
  • Actively involved in the analysis phase of the business requirements and the design of the Informatica mappings.
  • Performed data validation, data profiling, data auditing, and data cleansing activities to ensure high-quality Cognos report deliveries.
  • Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 for initial and history load using Datastage.
  • Used various transformations of Informatica, such as Source Qualifier Transformation, Expression Transformation, Look-up transformation, Update Strategy transformation, Filter transformation, Router transformation, Joiner transformation etc. for developing Informatica mappings.
  • Developed Informatica mappings for TYPE 2 Slowly Changing Dimensions.
  • Used Teradata utilities such as BTEQ, FastLoad, MultiLoad, and FastExport.
  • Created sessions and work-flows for the Informatica mappings.
  • Heavily used Informatica Cloud integration with the Amazon Redshift connector to integrate data from various sources.
  • Configured sessions for different situations including incremental aggregation, pipe-line partitioning etc.
  • Created mappings with different look-ups like connected look-up, unconnected look-up, Dynamic look-up with different caches such as persistent cache etc.
  • Created various Mapplets as part of mapping design.
  • Involved in writing Oracle stored procedures and functions for calling during the execution of an Informatica mapping or as pre- or post-session execution.
  • Created effective Test Cases and performed Unit and Integration Testing to ensure the successful execution of data loading process.
  • Documented Mappings, Transformations and Informatica sessions.
  • Analyzed Session Log files in case the session failed to resolve errors in mapping or session configurations.
  • Involved in designing the ETL testing strategies for functional, integration and system testing for Data warehouse implementation.
  • Extensively involved in testing by writing some QA procedures, for testing the target data against source data.
  • Worked on Data Profiling, Data standardization, error handling and exception handling.
  • Worked on match and merge logic to find duplicates and generate reports.
  • Wrote UNIX shell scripts for file manipulation, FTP, and scheduling workflows.
  • Coordinated with the offshore team on a daily basis to enable faster development.
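
The match and merge work above boils down to grouping rows by a normalized match key and reporting groups with more than one member. A minimal Python sketch is below; the normalization rule (trim and lower-case) and the key fields are assumptions for illustration, whereas real matching in IDQ also supports fuzzy comparisons.

```python
from collections import defaultdict

def find_duplicates(rows, key_fields):
    """Group rows by a normalized match key (trimmed, lower-cased values
    of key_fields) and return only the groups containing duplicates."""
    groups = defaultdict(list)
    for row in rows:
        key = tuple(str(row[f]).strip().lower() for f in key_fields)
        groups[key].append(row)
    return {k: v for k, v in groups.items() if len(v) > 1}
```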

Environment: Informatica PowerCenter 9.6.1, Informatica PowerExchange 9.6.1, Informatica Data Quality 9.6.1, Datastage 8.0, Amazon Redshift, Cognos 9.0, Sun Solaris, SQL, PL/SQL, Oracle 11g, TOAD, SQL Server 2012, Autosys, Shell Scripting, Icescrum, JIRA, Teradata 14, Control-M, GitHub, Hadoop, Hive
