
Data Services/Informatica Developer Resume

Springfield, MA

SUMMARY

  • 6+ years of experience in data architecture, data integration, and data warehousing using the ETL tool Informatica PowerCenter 10.1/9.6.1/9.5.1/9.1.1/8.6/8.1/7.1 (Source Analyzer, Warehouse Designer, Mapping/Mapplet Designer, Sessions/Tasks, Worklets/Workflow Manager).
  • Extensively used Informatica PowerCenter 10.1/9.6/9.1/8 and Informatica Data Quality (IDQ) 9.6.1 as ETL tools for extracting, transforming, loading, and cleansing data from various source inputs to various targets, in batch and in real time.
  • Performed data cleansing, data matching, standardization, Address Doctor validation, profiling, and scorecards using the Informatica IDQ tool.
  • Coordinated with business leads to help them understand match and merge rules.
  • Integrated most major databases, including Oracle, Teradata, MySQL, SQL Server, DB2, and Sybase, as well as flat files and XML, as both sources and targets.
  • Excellent knowledge of the data warehouse development life cycle, including the Software Development Life Cycle (SDLC), on platforms such as Windows XP/2000 and UNIX.
  • Extensive knowledge of dimensional data modeling, star schema, snowflake schema, fact and dimension tables, and process mapping using both top-down and bottom-up approaches.
  • Extensively worked on performance tuning of SQL and ETL processes.
  • Experience designing and developing mappings with various transformations such as Source Qualifier, Unconnected and Connected Lookup, Router, Filter, Expression, Aggregator, Union, Joiner, Java, Transaction Control, and Update Strategy.
  • Proficient in implementing complex business rules by creating reusable transformations, workflows/worklets, and mappings/mapplets.
  • Hands-on experience identifying and resolving performance bottlenecks at various levels: sources, targets, mappings, transformations, and sessions.
  • Experienced in using Teradata utilities such as BTEQ, MultiLoad, FastLoad, TPump, FastExport, and TPT.
  • Experience writing and fine-tuning Teradata BTEQ scripts.
  • Experience with Teradata administration tasks while in support roles, including effective use of EXPLAIN plans for performance-related query tuning.
  • Good experience with Teradata data recovery options and with releasing locks after MultiLoad, FastLoad, and TPump utility failures.
  • Good experience with Teradata index options, partitioned primary indexes (PPIs), and join indexes.
  • Worked with different Teradata table types: MULTISET, SET, volatile, and global temporary tables.
  • Developed slowly changing dimension (SCD) mappings of Type 1, Type 2, and Type 3 (version, flag, and timestamp); a SQL sketch of the Type 2 pattern follows this list.
  • Implemented Change Data Capture (CDC) for extracting delta data.
  • Worked with Informatica mapping variables, mapping parameters, and parameter files.
  • Scheduled sessions to extract, transform, and load data into the warehouse database according to business requirements.
  • Wrote SQL, PL/SQL, and stored procedures to implement business rules for ad hoc data.
  • Responsible for Unit testing of mappings and workflows.
  • Experience performing unit testing and working with the QA team on system testing.
  • Good experience with several scheduling tools: Maestro, crontab, AutoSys, Control-M, and the Informatica Scheduler.
  • Extensively worked on code migration across Development, UAT, and Production environments.
  • Involved in code deployment, release management, and change management processes and documentation.
  • Worked on Informatica Cloud to create source/target SFDC connections and to monitor and synchronize data in SFDC.
  • Performed data profiling and analysis of various objects in Salesforce (SFDC) and MS Access database tables for an in-depth understanding of source entities, attributes, relationships, domains, source data quality, and hidden or potential data issues.
  • Heavily involved in on-call production support, monitoring ETL production jobs and resolving trouble tickets raised by end users and customers.
  • Experience with both Waterfall and Agile methodologies in implementing DWH projects, with good experience in the Scrum process.
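
A minimal SQL sketch of the SCD Type 2 (flag and end-date) pattern referenced above; the DIM_CUSTOMER and STG_CUSTOMER names and columns are hypothetical, and in practice this logic was built as Informatica mappings with Lookup and Update Strategy transformations rather than hand-written SQL.

    -- Expire the currently active dimension row when a tracked attribute changes.
    UPDATE DIM_CUSTOMER
    SET    current_flag = 'N',
           effective_end_dt = CURRENT_DATE
    WHERE  current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   STG_CUSTOMER s
                   WHERE  s.customer_id = DIM_CUSTOMER.customer_id
                   AND   (s.customer_name <> DIM_CUSTOMER.customer_name
                      OR  s.address       <> DIM_CUSTOMER.address));

    -- Insert the new current version (surrogate key assumed to be an identity column).
    INSERT INTO DIM_CUSTOMER
          (customer_id, customer_name, address,
           effective_start_dt, effective_end_dt, current_flag)
    SELECT s.customer_id, s.customer_name, s.address,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   STG_CUSTOMER s
    LEFT JOIN DIM_CUSTOMER d
           ON  d.customer_id = s.customer_id
           AND d.current_flag = 'Y'
    WHERE  d.customer_id IS NULL               -- brand-new customer
       OR  s.customer_name <> d.customer_name  -- or changed attributes
       OR  s.address       <> d.address;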

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.1/9.6.1/9.5.1, Informatica IDQ 9.6, Informatica MDM 10.1.

Databases: Oracle 11g/10g, MS SQL Server 2005/2008, Teradata 15/14/13/12, MySQL, DB2, Sybase, flat files, XML.

Programming Languages/Scripting: SQL, PL/SQL, UNIX Shell, Perl, C, Java

Tools: Toad, MS Office, SQL*Plus, SQL Assistant, SQL Developer, AOTS, ServiceNow, Rally, Bitbucket.

Operating Systems: Windows, Unix, Linux

PROFESSIONAL EXPERIENCE

Data Services/Informatica Developer

Confidential, Springfield, MA

Responsibilities:

  • Extensively worked on data extraction, transformation, and loading with CSV files, flat files, and mainframe files.
  • Developed SQL scripts using SQL functions, grouping operations, subqueries, analytical functions, and joins to test DW projects.
  • Worked with the release team to perform migration activities.
  • Created build requests to check code and promote it to higher environments.
  • Created deployment groups to perform migration activities.
  • Integrated various applications with the consolidated routing system.
  • Created ISQL scripts that load data into staging tables and final tables.
  • Created Maestro scripts that schedule the Informatica workflows.
  • Involved in all vital phases of the software development life cycle, including business requirements analysis, application design, development, testing, implementation, and support for enterprise data warehouse and client/server applications.
  • Populated the staging tables from various sources, including flat files (fixed-width and delimited), DB2, and Sybase.
  • Analyzed SCD1 and SCD2 implementations per project requirements to load dimension and fact tables.
  • Participated in requirement-gathering meetings with Confidential and translated client requirements into highly specified project briefs.
  • Effectively used JIRA to document requirements, technical designs, and test cases.
  • Conducted meetings with end users to detail business needs, determine and define the scope of data and code fixes, and support user acceptance testing.
  • Worked with SAs to refine requirements and present them to the client.
  • Used Sybase load utilities for data loads and UNIX shell scripting for the file validation process.
  • Conducted and organized due diligence meetings with clients.
  • Worked on TWS scheduling and UNIX scripting.
  • Defined trust and validation rules and set up match/merge rule sets to produce the right master records.
  • Configured match rule set properties by enabling search by rules in MDM according to business rules.
  • Performed data profiling using IDQ to create profile tables; a SQL sketch of an equivalent profiling query follows this list.
  • Created scorecards to measure data quality before applying the rules.
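
A rough SQL equivalent of the column profiling done in IDQ above, assuming a hypothetical CUSTOMER_STG staging table; the actual profiles and scorecards were built in the IDQ tools rather than with hand-written queries.

    -- Illustrative column profile: row count, distinct keys, and null rates.
    SELECT COUNT(*)                                              AS total_rows,
           COUNT(DISTINCT customer_id)                           AS distinct_customer_ids,
           SUM(CASE WHEN email       IS NULL THEN 1 ELSE 0 END)  AS null_emails,
           SUM(CASE WHEN postal_code IS NULL THEN 1 ELSE 0 END)  AS null_postal_codes
    FROM   CUSTOMER_STG;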

Environment: Informatica PowerCenter 10.1/9.6.1, Informatica IDQ 9.6.1, Informatica MDM 10.1, DB Viewer, Rapid SQL, XML, flat files, JIRA, PuTTY, TWS/Informatica Scheduler, AutoSys, UNIX, Linux, Bitbucket, Bamboo, SCCS.

Test Data Management, ETL Developer

Confidential, Richmond, VA

Responsibilities:

  • Extensively worked on data extraction, transformation, and loading with CSV files, flat files, and mainframe files.
  • Developed SQL scripts using SQL functions, grouping operations, subqueries, analytical functions, and joins to test DW projects; a sample validation query follows this list.
  • Worked with the release team to perform migration activities.
  • Worked with the DBA team to increase space in the respective databases to perform DML operations.
  • Heavily involved in performing DCR (data copy request) activities to copy data from production to SIT.
  • Involved in all vital phases of the software development life cycle, including business requirements analysis, application design, development, testing, implementation, and support for enterprise data warehouse and client/server applications.
  • Populated the staging tables from various sources, including flat files (fixed-width and delimited), Oracle, and MySQL.
  • Analyzed SCD1 and SCD2 implementations per project requirements to load dimension and fact tables.
  • Participated in requirement-gathering meetings with Confidential and translated client requirements into highly specified project briefs.
  • Effectively used JIRA to document requirements, technical designs, and test cases.
  • Conducted meetings with end users to detail business needs, determine and define the scope of data and code fixes, and support user acceptance testing.
  • Worked with SAs to refine requirements and present them to the client.
  • Used various Teradata load utilities for data loads and UNIX shell scripting for the file validation process.
  • Conducted and organized due diligence meetings with clients.
  • Analyzed data warehouse data to build various data marts for reporting purposes.
  • Developed and supported extraction, transformation, and load (ETL) processes using Informatica PowerCenter to populate Teradata tables and flat files.
  • Worked on Control-M scheduling and UNIX scripting.
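
A sample of the kind of validation SQL mentioned above, with hypothetical SRC_ORDERS/TGT_ORDERS table names; the first query reconciles row counts, and the second uses an analytical function to flag duplicate business keys in the target after a load.

    -- Row-count reconciliation between source and target.
    SELECT src.src_cnt, tgt.tgt_cnt
    FROM  (SELECT COUNT(*) AS src_cnt FROM SRC_ORDERS) src,
          (SELECT COUNT(*) AS tgt_cnt FROM TGT_ORDERS) tgt;

    -- Duplicate check: any row numbered beyond 1 per order_id is an unexpected duplicate.
    SELECT order_id, load_dt
    FROM  (SELECT order_id, load_dt,
                  ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY load_dt DESC) AS rn
           FROM   TGT_ORDERS) t
    WHERE  rn > 1;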

Environment: Informatica PowerCenter 10.1/9.6.1, Teradata 16, SQL Assistant, Toad, flat files, JIRA, PuTTY, Control-M/Informatica Scheduler, UNIX, Bitbucket, Bamboo.

Informatica Developer

Confidential

Responsibilities:

  • Worked on requirements with business analysts and business users, and collaborated with data modelers.
  • Worked closely with data population developers, multiple business units, and a data solution engineer to identify key information for implementing the data warehouses.
  • Analyzed logical and physical data models for business users to determine common data definitions and establish referential integrity of the system.
  • Translated high-level design specs into simple ETL coding and mapping standards.
  • Used Informatica PowerCenter as the ETL tool to create source/target definitions, mappings, and sessions to extract, transform, and load data into staging tables from various sources.
  • Wrote Teradata BTEQ scripts as well as Informatica mappings using TPT to load data from staging to base tables.
  • Fine-tuned Teradata BTEQ scripts as necessary using EXPLAIN plans and collected statistics; a BTEQ-style tuning sketch follows this list.
  • Wrote various loader scripts to move data between Teradata systems.
  • Effectively used primary indexes, PPIs, and join indexes as necessary.
  • Developed ETL procedures to ensure conformity, compliance with standards and lack of redundancy, translating business rules and functionality requirements into ETL procedures.
  • Developed and tested all the backend programs, Informatica mappings, and update processes.
  • Populated the staging tables from various sources, including flat files (fixed-width and delimited), Oracle, MySQL, and Informix.
  • Created mappings using various transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Rank, Sequence Generator, Java, and Update Strategy.
  • Created and used the Normalizer transformation to normalize the flat files in the source data.
  • Extensively built mappings with SCD1 and SCD2 implementations per project requirements to load dimension and fact tables.
  • Used the Evaluate Expression option in the Debugger to validate and fix code while testing Informatica mappings.
  • Handled initial (history) and incremental loads into the target database using mapping variables.
  • Used Debugger to debug mappings and created breakpoints for better analysis of mappings.
  • Worked with Workflow Manager for maintaining Sessions by performing tasks such as monitoring, editing, scheduling, copying, aborting, and deleting.
  • Worked on performance tuning at both the Informatica and database levels by identifying bottlenecks.
  • Worked on Maestro job scheduling and UNIX scripting.
  • Developed UNIX shell scripts that use pmcmd to start and stop sessions and batches and to schedule workflows.
  • Involved in migrating the ETL code across environments, from Dev to UAT and then to Production, with the ETL admins.
  • Performed unit testing, created unit test plans for the developed code, and was involved in system and integration testing, coordinating with the testers throughout.
  • Heavily involved in production support on a rotational basis, supporting the DWH system through the ticketing tool for issues raised by business users.
  • Resolved tickets of varying severity within SLAs for data issues raised by customers using the trouble ticket system.
  • Worked with the reporting team to build the collection layer for reporting purposes.
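
A small BTEQ-style sketch of the tuning steps above (COLLECT STATISTICS, EXPLAIN, and a partitioned primary index); the object names, credentials, and column choices are illustrative assumptions, not taken from an actual project.

    .LOGON tdprod/etl_user,password;    -- BTEQ session; tdpid and credentials are placeholders

    COLLECT STATISTICS ON SALES_FACT COLUMN (sale_dt);
    COLLECT STATISTICS ON SALES_FACT COLUMN (store_id);

    EXPLAIN
    SELECT store_id, SUM(sale_amt)
    FROM   SALES_FACT
    WHERE  sale_dt BETWEEN DATE '2016-01-01' AND DATE '2016-01-31'
    GROUP BY store_id;

    /* PPI example: monthly partitions keep date-range scans to a few partitions. */
    CREATE MULTISET TABLE SALES_FACT_PPI
    ( sale_id   BIGINT,
      store_id  INTEGER,
      sale_dt   DATE,
      sale_amt  DECIMAL(18,2) )
    PRIMARY INDEX (sale_id)
    PARTITION BY RANGE_N (sale_dt BETWEEN DATE '2015-01-01' AND DATE '2020-12-31'
                          EACH INTERVAL '1' MONTH);

    .LOGOFF;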

Environment: Informatica PowerCenter 10.1/9.6.1/9.5.1, Oracle, MySQL, Informix, Teradata 14/13/12, flat files, Cognos, Rally, PuTTY, Maestro/Informatica Scheduler, UNIX, JIRA.

Informatica Developer

Confidential

Responsibilities:

  • Extensively used ETL to load data from Oracle databases and flat files into the data warehouse.
  • Involved in production support for the Data Warehouse Informatica jobs.
  • Modified existing mappings and workflows as requirements changed and migrated them back to the production environment.
  • Created robust and complex workflows and worklets using Informatica Workflow Manager and troubleshot data load problems.
  • Wrote SQL queries, triggers, and shell scripts to apply and maintain business rules; a sample rule-check query follows this list.
  • Translated business requirements into Informatica mappings/workflows.
  • Used the Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Used Informatica Designer to create complex mappings with transformations such as Source Qualifier, Expression, Lookup, Aggregator, Update Strategy, Joiner, Filter, and Router to pipeline data to the data warehouse and data marts.
  • Developed mappings to extract data from the ODS to the data mart, and monitored daily, weekly, and monthly loads.
  • Created and monitored sessions/batches using Informatica Workflow Manager/Workflow Monitor to load data into the target Oracle database.
  • Worked with mapping variables, mapping parameters, and variable functions such as SetVariable, SetCountVariable, SetMinVariable, and SetMaxVariable.
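
An illustrative example of the business-rule SQL mentioned above; the ORDERS/CUSTOMER tables and the credit-limit rule are hypothetical stand-ins for the project-specific rules.

    -- Flag orders whose amount exceeds the customer's credit limit.
    SELECT o.order_id, o.customer_id, o.order_amt, c.credit_limit
    FROM   ORDERS o
    JOIN   CUSTOMER c
           ON c.customer_id = o.customer_id
    WHERE  o.order_amt > c.credit_limit;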

Environment: Informatica PowerCenter 9.5.1, SQL Server 2005/2008, Oracle 12.1, PL/SQL, Business Objects XI R2, Oracle EBS, Toad, UNIX shell scripts.
