
Informatica Developer/ Data Analyst Resume


Irving, TX

SUMMARY

  • Over 8.5 years of strong experience in all phases of the application life cycle: Requirement Analysis, Functional Analysis, Design, Development, Implementation, Testing, Debugging, Production Support, and Maintenance of various Data Warehousing applications.
  • 8 years of experience in designing and developing ETL processes using Informatica PowerCenter 9.6.1/9.0.1/8.x, Informatica PowerExchange 9.6.1/9.0.1, and Informatica Data Quality 9.6.1.
  • Experience in creating Transformations and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets, Data Marts, and the Data Warehouse.
  • Expert in performance tuning of Informatica mappings, identifying source and target bottlenecks.
  • Experience in Bill Inmon and Ralph Kimball data warehouse design and implementation methodologies.
  • Expertise in OLTP/OLAP System Study, E-R modeling, and developing Database Schemas (Star Schema and Snowflake Schema) used in relational and dimensional modeling.
  • Extensive experience in integration of various data sources like Oracle 12c/11g/10g, Teradata 14/13, Netezza, UDB DB2, Mainframes, SQL Server 2012, SAP, Sybase, Informix, MySQL, MQ Series, XML, and flat files.
  • Experience with Informatica Cloud integration for Amazon Redshift and S3.
  • Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 for initial and history load using Informatica.
  • Expert in Performance tuning, troubleshooting, Indexing and partitioning techniques on Sources, Targets, Mappings and Workflows in Informatica.
  • Expertise in Teradata utilities like MultiLoad, FastLoad, FastExport, BTEQ, TPump, and TPT, and tools like SQL Assistant and Viewpoint.
  • Well-versed in tuning Teradata ETL queries, remediating excessive statistics, resolving spool space issues, and applying compression to reclaim space.
  • Proficient in Data Analysis, Data Validation, Data Lineage, Data Cleansing, Data Verification, and identifying data mismatches.
  • Extensive experience in data analysis, ETL techniques, and MD5 logic for loading CDC data (a minimal SQL sketch of this pattern follows this list).
  • Experienced in writing UNIX shell scripts, SQL*Loader scripts, Procedures/Functions, Triggers, and Packages.
  • Superior SQL skills: able to write and interpret complex SQL statements and to mentor developers on SQL optimization, ETL debugging, and performance tuning.
  • Exposure to end-to-end SDLC and Agile methodology.
  • Skilled in creating and maintaining ETL Specification Documents, Use Cases, Source-to-Target Mappings, Requirement Traceability Matrices, and deployment artifacts, performing Impact Assessments, and providing Effort Estimates.
  • Worked for many years in an onsite-offshore model, providing technical design leadership to ensure the efficient use of offshore resources and the selection of appropriate design and ETL/CDC logic.
  • Extensive experience providing IT services in the Healthcare, Financial, and Banking industries.
  • Knowledge of the Hadoop Ecosystem (Spark, HDFS, Hive, Pig, etc.).
  • Good problem-solving, time-management, and communication skills; self-motivated, able to work independently or cooperatively in a team, and eager to learn.
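
A minimal sketch of the MD5-driven SCD Type 2 pattern referenced above, in Oracle-flavored SQL. All table and column names (customer_stg, customer_dim, row_md5, etc.) are hypothetical placeholders, not objects from any client system; inside Informatica the same comparison is typically built with the MD5() expression function feeding an Update Strategy transformation.

    -- Step 1: expire the current dimension row when the source hash differs
    -- (the MD5 compare is what drives change data capture).
    UPDATE customer_dim d
       SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND STANDARD_HASH(s.cust_name || '|' || s.cust_addr, 'MD5')
                          <> d.row_md5);

    -- Step 2: insert a new current version for new or changed keys.
    INSERT INTO customer_dim
          (customer_id, cust_name, cust_addr, row_md5,
           eff_start_dt, eff_end_dt, current_flag)
    SELECT s.customer_id, s.cust_name, s.cust_addr,
           STANDARD_HASH(s.cust_name || '|' || s.cust_addr, 'MD5'),
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id  = s.customer_id
                          AND d.current_flag = 'Y'
                          AND d.row_md5 = STANDARD_HASH(s.cust_name || '|' || s.cust_addr, 'MD5'));

Two statements are used because one MERGE pass cannot both close out the old row and insert its replacement for the same matched key.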

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.6.1/9.0.1/8.x/7.x, Informatica PowerExchange (CDC) 9.6.1/9.0.1, Informatica Data Quality 9.6.1, PowerConnect for SAP BW, PowerConnect for JMS, PowerConnect for IBM MQ Series, PowerConnect for Mainframes, DTS

BI Tools: Business Objects 5.1, Cognos Impromptu 6.0, Cognos PowerPlay 6.6, and Oracle 9i Discoverer

Data Modeling: ERwin 9.5.2/7.3/4.1, MS Visio 2013, UML, Oracle Designer

Databases: Teradata 14/V2R6/V2R5, Oracle 12c/11g/10g/9i, SQL Server 2005/2008, DB2, MySQL, Sybase IQ, Informix, Netezza, Amazon Redshift, MS Access

Languages: SQL, PL/SQL, C, C++, Java, XML, HTML, JavaScript, UNIX Shell Scripting

WEB Services: SOAP, WSDL

Big Data: Hadoop Ecosystem (HDFS, Hive, Pig)

OS: MS-DOS, HP-UX, Windows, and Sun OS

Methodologies: Ralph Kimball’s Star Schema and Snowflake Schema.

Others: MS Word, MS Access, T-SQL, TOAD, SQL Developer, Microsoft Office, Teradata Viewpoint, Teradata SQL Assistant, Scrum tools, JIRA, Control-M, Autosys, GitHub

PROFESSIONAL EXPERIENCE

Confidential, Irving, TX

Informatica Developer/ Data Analyst

Responsibilities:

  • Coordinated with various business users, stakeholders, and SMEs (Subject Matter Experts) to obtain functional expertise, review designs and business test scenarios, participate in UAT, and validate data from multiple sources.
  • Defined and developed new standard design patterns, ETL frameworks, Data Model standards and guidelines, and ETL best practices.
  • Provided technical design leadership for the project to ensure the efficient use of offshore resources and the selection of appropriate ETL/CDC logic.
  • Performed detailed data investigation and analysis of known data quality issues in related databases through SQL
  • Actively involved in Analysis phase of the business requirement and design of the Informatica mappings.
  • Performed data validation, data profiling, data auditing and data cleansing activities to ensure high quality Cognos report deliveries.
  • Used various Informatica transformations, such as Source Qualifier, Expression, Lookup, Update Strategy, Filter, Router, and Joiner, to develop Informatica mappings.
  • Developed Informatica mappings for TYPE 2 Slowly Changing Dimensions.
  • Used Teradata utilities such as BTEQ, FastLoad, MultiLoad, and FastExport.
  • Created sessions and workflows for the Informatica mappings.
  • Heavily used Informatica Cloud integration with the Amazon Redshift connector and integrated data from various sources.
  • Configured sessions for different situations, including incremental aggregation, pipeline partitioning, etc.
  • Created mappings with different lookups (connected, unconnected, and dynamic) and different caches, such as the persistent cache.
  • Created various Mapplets as part of mapping design.
  • Involved in writing Oracle stored procedures and functions called during the execution of Informatica mappings or as pre- or post-session tasks.
  • Created effective Test Cases and performed Unit and Integration Testing to ensure the successful execution of data loading process.
  • Documented Mappings, Transformations and Informatica sessions.
  • Analyzed Session Log files in case the session failed to resolve errors in mapping or session configurations.
  • Involved in designing the ETL testing strategies for functional, integration and system testing for Data warehouse implementation.
  • Extensively involved in testing, writing QA procedures to verify target data against source data (see the reconciliation sketch after this list).
  • Wrote UNIX shell scripts for file manipulation, FTP, and workflow scheduling.
  • Coordinated with the offshore team on a daily basis to speed development.
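
A sketch of the kind of source-to-target reconciliation query used in the QA procedures above (ANSI SQL; the stg/dw schemas, table names, and load date are illustrative assumptions):

    -- Compare row counts and a control total between source and target
    -- for one load date; a mismatch flags the load for investigation.
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(txn_amount) AS amt_total
      FROM stg.transactions
     WHERE load_dt = DATE '2015-06-30'
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(txn_amount)
      FROM dw.fact_transactions
     WHERE load_dt = DATE '2015-06-30';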

Environment: Informatica PowerCenter 9.6.1, Informatica PowerExchange 9.6.1, Informatica Data Quality 9.6.1, Amazon Redshift, Cognos 9.0/10, Sun Solaris, SQL, PL/SQL, Oracle 11g, TOAD, SQL Server 2012, Shell Scripting, iceScrum, JIRA, Teradata 14, Control-M, GitHub, Hadoop, Hive

Confidential, Oakland, CA

Informatica Developer/ Data Analyst

Responsibilities:

  • Participated in the Design Team and user requirement gathering meetings.
  • Extensively used Informatica Designer to create and manipulate source and target definitions, mappings, mapplets, transformations, and reusable transformations.
  • Created different source definitions to extract data from flat files and relational tables for Informatica Power Center.
  • Used a Star Schema approach for the design of the data warehouse database (see the query sketch after this list).
  • Developed a standard ETL framework to enable the reusability of similar logic across the board.
  • Created different target definitions using the Warehouse Designer in Informatica PowerCenter.
  • Created different transformations such as Joiner Transformations, Look-up Transformations, Rank Transformations, Expressions, Aggregators and Sequence Generator.
  • Created stored procedures, packages, triggers, tables, views, synonyms, and test data in Oracle.
  • Extracted source data from Oracle, SQL Server, flat files, and XML files using Informatica, and loaded it into the Netezza target database.
  • Extensively transformed the existing PL/SQL scripts into stored procedures to be used by Informatica Mappings with the help of Stored Procedure Transformations.
  • Used PL/SQL whenever necessary inside and outside the mappings.
  • Created Models based on the dimensions, levels and measures required for the analysis.
  • Validated the data in the warehouse and data marts after the loading process, balancing it against source data.
  • Created, launched & scheduled sessions.
  • Fixed performance issues in Informatica mappings.
  • Used Teradata utilities such as BTEQ, FastLoad, MultiLoad, and TPump scripts to load the data into Teradata tables.
  • Customized UNIX shell scripts for file manipulation, FTP, and workflow scheduling.
  • Created design specification which includes BI dependency plan, job scheduling and cycle management documents
  • Worked closely with the business analysts and Production Support to resolve any JIRA issues.
  • Coordinated with the offshore team on a daily basis to speed development.
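
For illustration, a query shaped by the Star Schema design mentioned above (ANSI SQL; the fact and dimension names are hypothetical): the fact table carries surrogate keys that join out to each dimension.

    -- Monthly sales by product category, read straight off the star:
    -- one fact table joined to its date and product dimensions.
    SELECT d.calendar_month,
           p.product_category,
           SUM(f.sales_amount) AS total_sales
      FROM fact_sales  f
      JOIN dim_date    d ON d.date_key    = f.date_key
      JOIN dim_product p ON p.product_key = f.product_key
     GROUP BY d.calendar_month, p.product_category;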

Environment: Informatica PowerCenter 9.0.1, Informatica PowerExchange 9.0.1, Informatica Data Quality 9.0.1, Cognos 9.0, Linux, SQL, PL/SQL, Oracle 11g, TOAD, Teradata, Aginity Workbench for Netezza, SQL Server 2012, Control-M, Shell Scripting, XML, SQL*Loader

Confidential, Bentonville, AR

Informatica Developer

Responsibilities:

  • Involved in the design and development of the data warehouse environment; served as liaison to business users and technical teams, gathering requirement specification documents and identifying data sources, targets, and report generation needs.
  • Using Informatica Designer, designed mappings that populated data into the target Star Schema on an Oracle instance.
  • Optimized Query Performance, Session Performance and Reliability.
  • Extensively used Router, Lookup, Aggregator, Expression and Update Strategy Transformations.
  • Tuned mappings for optimum performance, dependencies, and batch design.
  • Scheduled and ran extraction and load processes and monitored sessions using Informatica Workflow Manager.
  • Scheduled the batches to be run using the Workflow Manager.
  • Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs.
  • Performed Unit testing and Integration testing on the mappings in various schemas.
  • Optimized the mappings that had shown poor performance (see the lookup-override sketch after this list).
  • Monitored sessions that were scheduled, running, completed or failed. Debugged mappings for failed sessions.
  • Involved in writing UNIX shell scripts for Informatica ETL tool to run the sessions.
  • Coordinated between the development and testing teams for robust and timely development of a fully integrated application.
  • Constantly monitored application attributes to ensure conformance to functional specifications.
  • Mentored development team members on ETL logic and performed code and document reviews.
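
A sketch of one tuning technique applied to poorly performing mappings: a Lookup SQL override that shrinks the lookup cache (table and column names are hypothetical). Caching only the current rows and the needed columns, instead of the default select over every port, cuts cache build time and memory.

    -- Hypothetical override for a Lookup on a large customer dimension.
    SELECT cust_key,            -- return port
           cust_id              -- lookup condition port
      FROM dim_customer
     WHERE current_flag = 'Y'   -- cache current rows only
     ORDER BY cust_id           -- ordering on the condition port is
                                -- typically retained for the cache build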

Environment: Informatica PowerCenter 8.6, SQL Server 2008, Oracle 10g, Shell Scripts, ERwin, TOAD, UNIX, Cognos 9, SQL, PL/SQL, SQL Developer, HP Quality Center

Confidential, Carmel, IN

Informatica Developer/ Data Analyst

Responsibilities:

  • Understood the problem description stated in the BRD and designed and wrote ETL specifications.
  • Imported source/target tables from the respective databases and created reusable transformations and mappings using the Informatica Designer tool set.
  • Prepared Test Cases and performed system and integration testing
  • Created and monitored sessions.
  • Created Informatica mappings to load the data from staging to dimension and fact tables (see the fact-load sketch after this list).
  • Configured the mappings to handle the updates to preserve the existing records using update strategy transformation.
  • Scheduled Sequential and Concurrent sessions and batches for loading from source to target database through server manager.
  • Developed Informatica mappings, mapplets, and transformations.
  • Used most of the transformations such as the Aggregators, Filters, Routers, Sequence Generator, Update Strategy, Rank, Expression and lookups (connected and unconnected) while transforming the Data according to the business logic.
  • Created sessions, reusable worklets and workflows in Workflow Manager
  • Used the Event-Wait task to wait for the trigger file and run the process.
  • Improved performance by identifying the bottlenecks in Source, Target, Mapping and Session levels.
  • Conducted unit testing and prepared unit test specification requirements.
  • Used PowerConnect to extract data from the mainframe database.
  • Prepared database request forms and production release documents.
  • Created, updated, ran, and scheduled batches and sessions.
  • Created PL/SQL Scripts and Stored Procedures for data transformation on the data warehouse.
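
A minimal sketch of the staging-to-fact load pattern noted above (ANSI SQL; all object names are hypothetical), resolving natural keys to dimension surrogate keys at load time; this is the set-based equivalent of the connected Lookup transformations used in the mappings.

    -- Load the fact table from staging, swapping each natural key
    -- for the surrogate key of the current dimension row.
    INSERT INTO fact_claims (date_key, member_key, claim_amount)
    SELECT d.date_key,
           m.member_key,
           s.claim_amount
      FROM stg_claims s
      JOIN dim_date   d ON d.calendar_dt  = s.claim_dt
      JOIN dim_member m ON m.member_id    = s.member_id
                       AND m.current_flag = 'Y';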

Environment: Informatica PowerCenter 8.1, Business Objects 6.0, Oracle 10g, SQL Loader, PL/SQL, SQL Server 2004, UNIX Shell Programming, Linux and Windows NT
