
Informatica Developer Resume

St. Louis, MO

SUMMARY:

  • Over 8 years of extensive experience in ETL (Extract, Transform, Load), Data Integration, Data Profiling, Data Mapping and Data Warehousing using Informatica, Teradata and Oracle technologies across industry segments such as Banking, Insurance, Finance, Retail and Healthcare, using Informatica Power Center 10.x/9.x/8.x.
  • Six years of strong Data Warehousing experience using Informatica Power Center 9.1/8.6 (Workflow Manager, Workflow Monitor, Repository Manager, Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Data Marts, ETL, OLAP and OLTP.
  • Experienced in the full Software Development Life Cycle (SDLC) of data warehousing projects: project planning, business requirement analysis, data analysis, logical and physical database design, setting up the warehouse physical schema and architecture, developing reports, implementing security and deploying to end users.
  • Experienced in Data Warehouse/Data Mart, OLTP and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation and production support.
  • Worked on Informatica Cloud and developed jobs like Data Synchronization Tasks, Data Replication Tasks, Mappings and Task Flows.
  • Expertise in working with the databases Oracle 11g/10g/9i, Teradata 14/12, SQL Server 2000/2005/2008, DB2, MySQL, Sybase and Netezza.
  • Experience using Teradata utilities (BTEQ, FASTLOAD, FASTEXPORT, MULTILOAD, Teradata Administrator, SQL Assistant, Data Mover) and UNIX.
  • Experience in requirements gathering, proof of concept, gap analysis, design, development and implementation of ETL/data warehouse projects.
  • Experience in performance tuning of mappings, ETL procedures and processes, and involvement in complete life-cycle implementations of data warehouses.
  • Knowledge of Facets, EDI and ITS.
  • Excellent capabilities in integration mappings, including dynamic cache lookups and shared and persistent cache lookups, for Type I, Type II and Type III slowly changing dimensions (a minimal SQL sketch of the Type 2 pattern appears at the end of this summary).
  • Experience in developing complex Informatica mappings and mapplets using various transformations for extraction, transformation and loading of data from multiple sources into data warehouses, and in creating workflows with worklets and tasks and scheduling them using Workflow Manager.
  • Very strong in implementing data profiling, creating scorecards, creating reference tables and documenting data quality metrics/dimensions such as accuracy, completeness, duplication, validity and consistency.
  • Very strong in Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, Standardizer and other significant transformations.
  • Extensively used SQL and PL/SQL for development of Procedures, Functions, Packages and Triggers.
  • Very strong knowledge of the end-to-end process of Data Quality and MDM requirements and their implementation.
  • Strong understanding of data modeling, including Star and Snowflake schemas and the identification of facts and dimensions.
  • Solid experience with the Informatica and Teradata combination in an enterprise data warehouse environment.
  • Experience working across SDLC methodologies, including Agile and Scrum.
  • Experience in planning, designing and building dashboards and analytics reporting solutions using BI tools (OBIEE, Cognos, QlikView) and databases (Oracle, DB2, Microsoft SQL Server), with data analytics and dashboard visualization skills supporting evidence-based business decision making.
  • Proficiency in data warehousing techniques for Slowly Changing Dimensions, surrogate key assignment, normalization and de-normalization, cleansing and performance optimization, along with CDC (Change Data Capture).
  • Developed complex mappings using varied transformation logic such as Unconnected/Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Union, Update Strategy and more.
  • Experienced in Informatica Data Quality (IDQ) and Power Center: data cleansing, data profiling, data quality measurement and data validation processing, including Match, Merge, weighted scoring and deduplication processes.
  • Worked with Oracle Stored Procedures, Triggers, Indexes, DML, DDL, Database Links, Sequences.
  • Experience in integrating various data sources with multiple relational databases such as Oracle, Teradata and MS SQL Server, and worked on integrating data from XML files and flat files (fixed-width and delimited).
  • Experience in using automation scheduling tools such as Autosys, Control-M and Maestro.
  • Performed Web Services Testing using SOAP.
  • Expertise in Unit Testing, Integration Testing, System Testing and Data Validation for developed Informatica mappings.
  • Solid diagnostic, debugging and troubleshooting abilities.
  • Excellent Communication skills, Self-starter with ability to work with minimal guidance.
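
A minimal SQL sketch of the SCD Type 2 pattern referenced above, assuming an Oracle target; all table, column and sequence names are invented for illustration and are not from any specific project:

    -- Step 1: expire the current version of any dimension row whose tracked
    -- attributes have changed (hypothetical tables stg_customer / dim_customer).
    UPDATE dim_customer d
       SET d.effective_end_dt = SYSDATE,
           d.current_flag     = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.status <> d.status));

    -- Step 2: insert a new current version for changed and brand-new customers.
    INSERT INTO dim_customer
           (customer_sk, customer_id, address, status,
            effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
           SYSDATE, DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');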

TECHNICAL SKILLS:

Operating Systems: Windows, UNIX, Linux, MS-DOS

Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, Erwin, Microsoft Visio.

RDBMS: Oracle 10g/11g, Teradata 14/12, DB2, SQL Server 2000/2008 R2/2012/2014, MySQL, Sybase, Netezza

QA Tools: Quality Center, SOAP

ETL Tools: Informatica Power Center 10.1/9.6.1/9.5.1/8.6.1/8.1.1

Reporting Tools: Cognos, Business Objects, Tableau, OBIEE, SSRS, SSAS, MicroStrategy 9.4.1.

Languages: Java, XML, UNIX Shell Scripting, SQL, PL/SQL

PROFESSIONAL EXPERIENCE:

Confidential, St. Louis, MO

Informatica Developer

Responsibilities:

  • Gathered user Requirements and designed Source to Target data load specifications based on Business rules.
  • Used Informatica Power Center 9.6.1 for extraction, transformation and loading (ETL) of data into the Data Mart.
  • Designed and developed ETL Mappings to extract data from Flat files, MS Excel and Oracle to load the data into the target Teradata database.
  • Developed several complex mappings in Informatica using a variety of Power Center transformations, Mapping Parameters, Mapping Variables, Mapplets and Parameter Files in Mapping Designer.
  • Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter and created POC data quality mappings in the Informatica Data Quality tool, importing them into Informatica Power Center as mappings and mapplets.
  • Designed, developed and implemented Informatica Developer mappings for data cleansing, using the Address Validator to validate addresses and the Standardizer, Labeler, Association, Parser, Expression, Filter, Router and Lookup transformations.
  • Used most of the transformations, such as Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Source Qualifier, Connected and Unconnected Lookup, Update Strategy and Stored Procedure.
  • Created drill maps for end users to drill down into detailed data.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
  • Designed, developed and implemented ETL Informatica mappings, as well as data load and data validation processes using SQL and PL/SQL.
  • Extensively used ETL processes to load data from source systems such as Oracle, Teradata, flat files and XML files into the target Oracle system, applying business logic in transformation mappings to insert and update records during loads.
  • Worked with Teradata utilities such as MultiLoad, FastLoad and FastExport to load or extract tables.
  • Worked on developing UNIX shell scripts for automation of the ETL process.
  • Used the Autosys scheduler to create, schedule and control batch jobs.
  • Involved in error handling, debugging and troubleshooting sessions using session logs, the Debugger and Workflow Monitor.
  • Performed operational support and maintenance of ETL bug fixes and defects.
  • Maintained the target database in the production and testing environments.
  • Created Mapping Parameters and Variables.
  • Created repository scripts automating the export and import of Informatica objects.
  • Supported migration of ETL code from development to QA and QA to production environments.
  • Migration of code between the Environments and maintaining the code backups.
  • Designed and developed Unix Shell Scripts, FTP, sending files to source directory & managing session files.
  • Created BI Publisher reports using OBIEE analysis and query builder.
  • Created several data models, reports, lists of values and parameters, and integrated them with OBIEE dashboards.
  • Performed testing and wrote SQL queries to validate the loaded data (a sample validation query is sketched after this list).
  • Involved in jobs scheduling, monitoring and production support in a 24/7 environment.
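
A minimal sketch of the post-load validation query referenced above; the schema, table and column names are invented, and the check simply compares row counts and an amount total between staging and target for the current load date:

    -- Hypothetical validation query: staging vs. target counts and totals.
    SELECT 'STG' AS layer, COUNT(*) AS row_cnt, SUM(txn_amount) AS total_amount
      FROM stg.claims_stage
     WHERE load_dt = CURRENT_DATE
    UNION ALL
    SELECT 'TGT', COUNT(*), SUM(txn_amount)
      FROM edw.fact_claims
     WHERE load_dt = CURRENT_DATE;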

Environment: Informatica Power Center 9.6.1, Power Exchange, Oracle 11g, Teradata 14.0, Flat files, XML, OBIEE 11.1.1.7.0, ERWIN 9, SQL Assistant, Toad, WinSCP, PuTTY, Autosys, UNIX shell scripts, Linux.

Confidential, Los Angeles, CA

Informatica Developer

Responsibilities:

  • Interacted actively with Business Analysts and Data Modelers on Mapping documents and Design process for various Sources and Targets.
  • Designed logical and physical data model. Designed ETL process from source to target.
  • Used technical and functional expertise of the system to help other teams understand and develop new systems that make the business more productive and efficient.
  • Extensively worked on Informatica IDE/IDQ.
  • Used IDQ's standardized plans for addresses and names clean ups.
  • Designed reference data and data quality rules using IDQ and was involved in cleansing the data using IDQ in the Informatica Data Quality 9.1 environment.
  • Created stored procedures to handle selection criteria such as Address, Provider, Specialty, Chapters and Credentialing, and to load the data for the Extract and Exclusion reports based on the business requirements (a minimal PL/SQL sketch of this kind of procedure follows this list).
  • Worked on IDQ file configuration at user's machines and resolved the issues.
  • Used IDQ to complete initial data profiling and removing duplicate data.
  • Extensively used ETL to load Flat Files, Oracle, XML Files, DB2 and legacy data as sources and SQL Server, Teradata as Targets.
  • Designed and developed Mappings using different transformations such as Source Qualifier, Expression, Aggregator, Filter, Joiner, and Lookup to load data from source to target tables.
  • Performed multiple tasks effectively and was involved in troubleshooting issues.
  • Created reference tables, applications and workflows, and deployed them to the Data Integration Service for execution.
  • Performed performance tuning of new or existing applications wherever required.
  • Handled complete configuration of customized/new tasks and loading process through Control-M jobs.
  • Worked on generating mapplets from connected transformations and Logical Data Objects from SQL queries.
  • Created drill maps for end users to drill down and retrieve data.
  • Customization/Creation of ETL components as part of development.
  • Created unit test cases and performed SIT testing for ETL and reporting.
  • Created database objects like staging tables, target tables, synonyms, sequences, triggers and stored procedures to move data to target.
  • Tuning of SQL queries for better performance.
  • Developed Informatica mappings, unit tested ETL mapping code and validated the result set data.
  • Performed Manual Testing of the application as well as identified the critical test scripts to be automated.
  • Extensively worked on Report Services to create performance dashboard reporting.
  • Executed and managed test cases and reported bugs in Quality Center.
  • Developed Shell scripts to run ETL batch through Control-M schedule.
  • Documented validation rules, exception processing and test strategy of the mappings
  • Migrated ETL codes from Development to Test to Production.
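
A minimal PL/SQL sketch of the kind of extract-selection procedure referenced above; the procedure, table and column names are invented and the criteria are simplified for illustration:

    -- Hypothetical procedure: stage provider rows matching the requested
    -- specialty and credentialing status for the extract report.
    CREATE OR REPLACE PROCEDURE load_provider_extract (
        p_specialty    IN VARCHAR2,
        p_credentialed IN CHAR DEFAULT 'Y'
    ) AS
    BEGIN
        DELETE FROM provider_extract_stg;   -- refresh the extract staging table

        INSERT INTO provider_extract_stg (provider_id, provider_name, specialty, address)
        SELECT p.provider_id, p.provider_name, p.specialty, a.full_address
          FROM provider p
          JOIN provider_address a ON a.provider_id = p.provider_id
         WHERE p.specialty = p_specialty
           AND p.credentialing_status = p_credentialed;

        COMMIT;
    END load_provider_extract;
    /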

Environment: Informatica 9.6.1, Control-M, Rapid SQL, Flat Files, Cognos, Oracle10g, PL/SQL, IDQ 9.6.1, SQL Server 2012, ERWIN 9, DB2, UNIX Shell Script, Toad.

Confidential, NJ

Informatica Developer

Responsibilities:

  • Interacted with business analyst to understand the business requirements.
  • Involved in gathering requirements from business users.
  • Designed and Implemented the ETL Process using Informatica power center.
  • Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
  • Developed ETL flows from source to stage, stage to work tables, and stage to target tables.
  • Designed various mappings using transformations like Look Up, Router, Update Strategy, Filter, Sequence Generator, Joiner, Aggregator, and Expression Transformation.
  • Created Work Flows with Command Tasks, Worklets, Decision, Event Wait and Monitored sessions by using workflow monitor.
  • Migrated Informatica Folders from Development Environment to Test and System Test Environment and Worked with Admins to migrate the same to Production environments.
  • Wrote PL/SQL procedures for reconciliation of financial data between source and target to automate testing phases and help the business with preliminary validation (a reconciliation sketch follows this list).
  • Wrote UNIX scripts, environment files for Informatica.
  • Developed Metadata driven code for effective utilization and maintenance using technical metadata, business metadata and process metadata.
  • Used Informatica parameter files to externalize business logic instead of hard-coding it in mappings.
  • Generated Cognos reports to test standardized reports as per business requirements.
  • Tuned Mappings and Mapplets for best Performance on ETL Side and Created Indexes and Analyzed tables periodically on Database side.
  • Organized the data flow, developed Control-M jobs for scheduling, and moved them to production.
  • Served as a primary resource on the production support team, joining emergency calls during application outages and resolving defects as they were raised.
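
A minimal PL/SQL sketch of the kind of source-to-target reconciliation referenced above; the table, column and procedure names are invented for illustration:

    -- Hypothetical reconciliation: compare financial totals between source and
    -- target for a load date and record the outcome for preliminary validation.
    CREATE OR REPLACE PROCEDURE reconcile_gl_load (p_load_dt IN DATE) AS
        v_src_total NUMBER;
        v_tgt_total NUMBER;
    BEGIN
        SELECT NVL(SUM(amount), 0) INTO v_src_total
          FROM src_gl_transactions
         WHERE TRUNC(posting_dt) = TRUNC(p_load_dt);

        SELECT NVL(SUM(amount), 0) INTO v_tgt_total
          FROM fact_gl
         WHERE load_dt = TRUNC(p_load_dt);

        INSERT INTO recon_log (load_dt, src_total, tgt_total, status)
        VALUES (TRUNC(p_load_dt), v_src_total, v_tgt_total,
                CASE WHEN v_src_total = v_tgt_total THEN 'MATCH' ELSE 'MISMATCH' END);
        COMMIT;
    END reconcile_gl_load;
    /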

Environment: Informatica Power Center 9.6.1/9.5.1, Power Exchange 5.1, SQL Server 2008, Flat Files, Cognos 8.0, Toad, UNIX, Windows XP, ERWIN, Control-M, Maestro.

Confidential, NYC, NY

Informatica Developer

Responsibilities:

  • Responsible for Extraction, Transformation and Loading the data into Data warehouse by using Informatica Power center 8.6.1.
  • Worked on several transformations in Informatica, such as Filter, Joiner, Rank, Sequence Generator, Stored Procedure, Lookup and Expression.
  • Created batches to run several sessions sequentially and concurrently.
  • Involved in performance tuning of Informatica mappings and sessions (a typical tuning technique is sketched after this list).
  • Based on requirements, developed the source-to-target mapping document with business rules and the ETL specification documentation.
  • Performed code reviews and unit tested ETL mappings; mentored developers and helped them with issues.
  • Created Functional Requirements Document (FRD) containing the glossary, Actors List, Functional requirements, Business Rules, QA Planning and Testing Scenarios, Current and Future Process flows.
  • Developed ETL mapping for the reporting requirement and for data feeds. Changed ETL Mapping provided by vendor to suit business requirement.
  • Performed gap analysis by mapping functional requirements to business requirements and worked on report design to create user interface mock-ups.
  • Simulated a real-time transaction feed by fabricating transactions required for UAT and manually calculating expected results to establish expectations going into the testing process.
  • Verified the deliverables of the testing performed by the QA team and provided sign-off for production.
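
One common tuning step on this stack is to push joins and filters into a Source Qualifier SQL override so that only the required rows and columns reach the mapping; the sketch below is illustrative only, with invented table and column names:

    -- Hypothetical Source Qualifier override: join and filter at the database
    -- so the session processes only the incremental window.
    SELECT o.order_id,
           o.customer_id,
           o.order_dt,
           i.line_amount
      FROM orders o
      JOIN order_items i
        ON i.order_id = o.order_id
     WHERE o.order_dt >= TRUNC(SYSDATE) - 1     -- previous day onward
       AND o.status   <> 'CANCELLED';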

Environment: Informatica 8.6.1, Power Exchange, Windows XP/2000, Oracle 9i, MS Access 2000, DB2, PL/SQL, UNIX, MS Excel, MS Word.

Confidential, San Francisco, CA

Informatica developer

Responsibilities:

  • Participated in the Design Team and user requirement gathering meetings.
  • Performed business analysis and requirements gathering with end users and managers.
  • Created stored procedures, tables, views, synonyms, and test data in Oracle.
  • Extracted source data from Oracle, Flat files, using Informatica, and loaded into target Database.
  • Created medium-to-complex PL/SQL stored procedures for integration with Informatica using Oracle 11g.
  • Developed complex mappings, including SCD Type I and Type II mappings, in Informatica to load data from various sources.
  • Transformed existing PL/SQL scripts into stored procedures used by Informatica mappings through Stored Procedure transformations (a minimal sketch follows this list).
  • Involved in extensive Performance Tuning by determining bottlenecks in sources, mappings and sessions.
  • Designed parameter files after creating mappings.
  • Scheduled jobs on the Tidal scheduler.
  • Created Models based on the dimensions, levels and measures required for the analysis.
  • Validated the data in the warehouse and data marts after the load process by balancing it against source data.
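
A minimal PL/SQL sketch of the kind of routine that can be invoked through an Informatica Stored Procedure transformation, as referenced above; the function, sequence and table names are invented for illustration:

    -- Hypothetical function: assign a surrogate batch id from a sequence and
    -- log the run, returning the id to the calling mapping.
    CREATE OR REPLACE FUNCTION get_batch_id (p_job_name IN VARCHAR2)
        RETURN NUMBER
    AS
        v_batch_id NUMBER;
    BEGIN
        SELECT batch_id_seq.NEXTVAL INTO v_batch_id FROM dual;

        INSERT INTO etl_batch_log (batch_id, job_name, start_ts)
        VALUES (v_batch_id, p_job_name, SYSTIMESTAMP);
        COMMIT;

        RETURN v_batch_id;
    END get_batch_id;
    /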

Environment: Informatica Power Center 9.1.1/8.6.1, Tidal, Flat files, Oracle 11g, Teradata 12/13, SQL, Windows XP, PL/SQL, SQL Server 2005/2000, Toad.

Confidential

Informatica Developer

Responsibilities:

  • Performed impact analysis, detailed design and traceability matrix preparation.
  • Created tasks for team members and handled management-related work for the Technical Design Document, functional documents and code reviews across projects.
  • Created Functional Requirements Document (FRD) containing the glossary, Actors List, Functional requirements, Business Rules, QA Planning and Testing Scenarios, Current and Future Process flows.
  • Developed ETL mapping for the reporting requirement and for data feeds. Changed ETL Mapping provided by vendor to suit business requirement.
  • Performed gap analysis by mapping functional requirements to business requirements and worked on report design to create user interface mock-ups.
  • Coordination with the onsite coordinator for delivery to client.
  • Verified the deliverables of the testing performed by the QA team and provided sign-off for production.
  • Performed unit testing and production implementation for new CRs.
  • Provided daily production support for scheduled jobs, resolving and fixing failed jobs.
  • Created daily and weekly workflows and scheduled to run based on business needs.
  • Analyzed the enhancements coming from the individual Client for application and implemented the same.
  • Creation of technical documents.
  • Wrote unit and integration test cases, shared them with the client, and performed unit tests on client sample data.
  • Simulated a real-time transaction feed by fabricating transactions required for UAT and manually calculating expected results to establish expectations going into the testing process.

Environment: Informatica Power Center 7.1.1/8.1.1, Oracle, SQL, PL/SQL, ERWIN.
