
Sr. ETL Informatica Developer Resume

Boston

SUMMARY

  • 8+ years of IT experience in all phases of SDLC in DW/BI (Analysis, Design, Development, Test, Implementation and Operations).
  • 8 years of experience in Data Warehousing using ETL and OLAP tools: Informatica PowerCenter 9.5/9.1/8.x/7.x, Informatica PowerExchange 9.x/8.x, OBIEE 11g/10g, Teradata, Informatica Data Quality 8.x/9.x, Informatica Big Data Edition, and Informatica Cloud.
  • Worked across various domains including Healthcare, Banking, Financial, and Retail, with extensive work on marketing data.
  • Very strong in implementing data profiling, creating scorecards and reference tables, and documenting data quality metrics/dimensions such as accuracy, completeness, duplication, validity, and consistency.
  • Designed and Developed the ETL processes for various domains like HealthCare, Finance, Logistics and Insurance.
  • Very strong knowledge of Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, and Standardizer, among other significant transformations.
  • Skilled in understanding and developing business rules for the standardization, cleansing, and validation of data in various formats.
  • Proficient in Dimensional Data modeling and Relational Star Join Schema/ Snowflake models, FACT & Dimensions tables, Physical & logical data modeling and Ralph Kimball Methodologies.
  • Responsible for creating jobs in Informatica using the TPT operator, pushdown optimization (PDO), and Change Data Capture techniques.
  • Interacted with end users and functional analysts to identify and develop Business Specification Documents (BSD) and transform them into technical requirements.
  • Proficient with RDBMS systems, SQL, Oracle PL/SQL, database design, and optimization.
  • Skilled in data warehousing, scripting, and data modeling, with 7 years' experience in dealing with various data sources such as Oracle, SQL Server, MS Access, Teradata, and DB2.
  • Strong knowledge of Data warehouse design such as dimensional data modeling, logical and physical design.
  • Extensively worked with Teradata utilities like BTEQ, FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files.

TECHNICAL SKILLS

Data warehousing: Informatica PowerCenter 9.5/9.1/8.6/8.5/8.1/7.1, Informatica PowerExchange 9.5/8.6/8.1, Informatica Big Data Edition, Informatica IDQ 9.x, Informatica PowerCenter Visio 9.x, Informatica Cloud (IOD)

Databases: Oracle 11g/10g/9i (Native SQL), Sybase, Teradata 13/12/V2R6, MS Access, DB2 8.0/7.0, MS SQL Server

Languages: XML, SQL, T-SQL, PL/SQL, UNIX Shell Scripting, Perl, Core Java

Operating System: HP-UX 11/10/9, IBM AIX 6.1/5.3, Sun Solaris 9/8/7, SCO UNIX, Linux, Windows 95/98/2000/XP Professional/Vista/8

Other Tools: MS Visual SourceSafe, ZENA, Autosys, Control-M, Unicenter, Remedy, Clarity, TIDAL, JIRA, SVN Repository

DB Tools: SQL*Plus, SQL*Loader, Export/Import, TOAD, SQL Navigator, SQL Trace, MLOAD, FLOAD, FEXPORT, TPUMP

Microsoft Tools: MS Office, MS FrontPage, MS Outlook, MS Project, MS Visio

PROFESSIONAL EXPERIENCE

Confidential

Sr. ETL Informatica Developer

Responsibilities:

  • Responsible for developing automated processes to FTP files for various jobs to different destinations.
  • Worked closely with Business Analysts and developers during planning, analyzing, developing and testing phase of the project.
  • Responsible for developing ETL jobs and automating them using the Control-M job utility.
  • Responsible for Promoting the ETL Code and UNIX scripts to the QA and Production Environments.
  • Performed unit, regression, integration, and system testing of Informatica mappings.
  • Developed complex Informatica mappings using various transformations: Source Qualifier, Normalizer, Filter, Connected and Unconnected Lookup, Update Strategy, Router, Aggregator, Sequence Generator, and reusable Sequence Generator transformations.
  • Responsible for creating shared and reusable objects in the Informatica shared folder and updating them with new requirements and changes.
  • Involved in user acceptance testing; reported and tracked defects in JIRA and resolved them.
  • Widely used JIRA Dashboard to track the issues and communicate with the team members upon the particular issue.
  • Worked on performance tuning to optimize session performance, utilizing partitioning, pushdown optimization, and pre- and post-session stored procedures to drop and rebuild constraints.
  • Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
  • Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
  • Effectively worked in Informatica version based environment and used deployment groups to migrate the objects.
  • Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
  • Effectively worked in an onsite/offshore work model.
  • Used pre- and post-session assignment variables to pass variable values from one session to another.
  • Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks, and used the Informatica scheduler to schedule jobs.
  • Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.

Environment: Informatica 9.6/10.x, Informatica IDQ, Unix, Oracle (Native SQL & PL/SQL), SQL Developer, Quick Build, SVN Tortoise, JIRA, Control-M, TOAD

Confidential

Sr. ETL Informatica Developer

Responsibilities:

  • Participated in all phases of the Development Life Cycle
  • Responsible for requirements gathering, technical system design, development, testing, code review, code migration, UAT, and job scheduling.
  • Responsible for developing the ETL Code (Mappings) using Informatica.
  • Actively participated in understanding business requirements from Business Analysts and in analyzing and designing the data migration/integration process.
  • Used Informatica PowerCenter and UNIX scripts, scheduled through the Control-M tool, to extract, transform, and load data.
  • Used Data Validation Option for Testing.
  • Responsible for updating the CRM (Salesforce) data using IOD jobs with Informatica Cloud.
  • Performed validation, standardization, and cleansing of data as part of implementing the business rules.
  • Loaded large data sets into Oracle using partitions and sub-partitions.
  • Responsible for loading data to Type-1, Type-2 Dimensions and fact tables in Mart layer.
  • Responsible for SQL performance tuning through query optimization and hints.
  • Used various performance techniques, such as pushdown optimization and session partitioning, to enhance ETL load performance.
  • Responsible for writing views and materialized views for the semantic layer.
  • Wrote UNIX scripts to pull source files from remote systems to the Informatica server.
  • Extensively worked on Informatica Cloud to pull data from Salesforce and update the same.
  • Responsible for developing stored procedures and invoking them from the ETL (NFA).
  • Very good knowledge of all the data quality transformations used throughout development.
  • Performed multiple tasks effectively and was involved in troubleshooting issues.
  • Created reference tables, applications, and workflows and deployed them to the Data Integration Service for workflow execution.
  • Responsible for code migration, code review, test plans, test scenarios, and test cases as part of unit/integration testing.
  • Automated production jobs using Informatica workflows and shell scripts through the Control-M tool.
  • Responsible for creating DDLs and granting privileges on DB objects to users and roles.
  • Responsible for syncing ETL code, UNIX scripts, and DB objects to higher environments.

Environment: Informatica 9.6/10.x, Informatica IDQ, Informatica Cloud, Unix, Oracle (Native SQL & PL/SQL), SQL Developer, Quick Build, SVN Tortoise, JIRA, Control-M, TOAD, Tableau

Confidential

Sr. ETL Informatica Developer

Responsibilities:

  • Participated in all phases of the Development Life Cycle
  • Worked on all the Unix/Informatica setups required for Informatica upgrade.
  • Responsible for requirements gathering, technical system design, development, testing, code review, code migration, UAT, and job scheduling.
  • Responsible for developing the ETL Code (Mappings) using Informatica.
  • Actively participated in understanding business requirements from Business Analysts and in analyzing and designing the data migration/integration process.
  • Interacted with the Data Architecture group, PM, and project team to finalize TSD design strategies; created various diagrams as part of the project deliverables and the physical database design.
  • Used Informatica PowerCenter and UNIX scripts, scheduled through the Control-M tool, to extract, transform, and load data.
  • Used Data Validation Option for Testing.
  • Performed validation, standardization, and cleansing of data as part of implementing the business rules.
  • Handled data belonging to various members and providers throughout development.
  • Implemented match and merge rules in Informatica MDM 10.1 to find duplicates and analyze the golden record.
  • Extensively worked on Informatica IDE/IDQ.
  • Implemented Change Data Capture at source Side to capture the changes before the ETL layer.
  • Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
  • Created business rules in Informatica Developer and imported them into Informatica PowerCenter to load standardized, well-formatted data into staging tables.
  • Very good knowledge of all the data quality transformations used throughout development.
  • Knowledge of Informatica MDM concepts and implementation of the de-duplication process.
  • Used IDQ's standardized plans for address and name cleanups.
  • Worked on IDQ file configuration on users' machines and resolved issues.
  • Used IDQ to complete initial data profiling and remove duplicate data.
  • Extensively worked on IDQ admin tasks, serving as both IDQ admin and IDQ developer.
  • Performed multiple tasks effectively and was involved in troubleshooting issues.
  • Created reference tables, applications, and workflows and deployed them to the Data Integration Service for workflow execution.
  • Responsible for code migration, code review, test plans, test scenarios, and test cases as part of unit/integration testing.
  • Automated production jobs using Informatica workflows and shell scripts through the Control-M tool.
  • Responsible for creating DDLs and granting privileges on DB objects to users and roles.
  • Responsible for syncing ETL code, UNIX scripts, and DB objects to higher environments.

Environment: Informatica 9.5/9.6, Informatica IDQ, Unix, Oracle, SQL Server, SQL, Quick Build, SVN Tortoise, JIRA, Control-M, TOAD

Confidential, Boston

Informatica/IDQ Developer

Responsibilities:

  • Involved in effort estimation for the requirements in the project and prepared mapping documents based on client requirement specifications.
  • Used File Bridge for flat file automatic DQ validations.
  • Involved in the design and development of Informatica mappings using various transformations such as Expression, Lookup, Router, Joiner, Union, Sorter, and Aggregator.
  • Involved in developing SCD Type 1 and Type 2 mappings in Informatica, and in writing stored procedures to perform Type 1 and Type 2 operations.
  • Performed validation, standardization, and cleansing of data as part of implementing the business rules.
  • Handled data belonging to various members and providers throughout development.
  • Implemented match and merge rules in Informatica MDM 10.1 to find duplicates and analyze the golden record.
  • Extensively worked on Informatica IDE/IDQ.
  • Used the Java transformation in Informatica, coding the transformation with core Java concepts.
  • Involved in developing automated stored procedures, used as pre- and post-SQL at the Informatica session level, to load dimension and fact tables based on table type.
  • Used SQL Server as the database and designed various stored procedures in SQL Server.
  • Used SVN to store all the automated scripts and regular insert scripts to run automatically after every data model deployment.
  • Used the TIDAL auto-scheduling tool to automate the process.
  • Involved in writing shell scripts to automate the jobs used in TIDAL.
  • Widely used JIRA Dashboard to track the issues and communicate with the team members upon the particular issue.
  • Responsible for building Stored Procedures using Teradata and MS-SQL Server.
  • Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
  • Created business rules in Informatica Developer and imported them into Informatica PowerCenter to load standardized, well-formatted data into staging tables.
  • Very good knowledge of all the data quality transformations used throughout development.

Confidential, Cumberland

Informatica/IDQ Developer

Responsibilities:

  • Responsible for developing ETL Informatica mappings to mask target data.
  • Responsible for analyzing the source data (files) sent by users prior to development and communicating with the application team as needed.
  • Responsible for testing the masked data and providing unit test documents.
  • Built customized functions and reusable objects to facilitate reusability of Informatica objects.
  • Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
  • Used IDQ's standardized plans for address and name cleanups.
  • Worked on IDQ file configuration on users' machines and resolved issues.
  • Responsible for running and monitoring the weekly load and fixing issues if the load fails.
  • Worked on fixed width Source files extensively and generated the masked files accordingly.
  • Used IDQ to complete initial data profiling and remove duplicate data.
  • Used IDQ to profile the project source data, define or confirm metadata definitions, cleanse and accuracy-check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
  • Worked with Expression, Data Masking, and Lookup transformations and stored procedures to mask production data.
  • Worked in HP Quality Center to handle defects raised for various applications.
  • Worked on performance tuning and reduced the data load times of several weekly jobs.
  • Responsible for coding and implementing SCD Type 1 and Type 2 logic to load the dimensions.
  • Responsible for developing mappings using PowerCenter and the Teradata TPT operator.
  • Responsible for writing BTEQ scripts and automating them using Informatica mappings.
  • Responsible for loading the detail and summary fact tables in the data mart.
  • Responsible for writing the SQL to generate KPI reports for end users in OBIEE and Tableau.
  • Performed ETL-vs-user data reconciliation for the reports generated in OBIEE.
  • Used transformations such as Source Qualifier, Normalizer, Aggregator, Connected and Unconnected Lookup, Filter, Sorter, Stored Procedure, Router, and Sequence Generator.
  • Checked and tuned application performance.

Confidential, Columbus

Informatica/IDQ Developer

Responsibilities:

  • Responsible for developing the ETL source code using the mapping requirement documents.
  • Responsible for running the jobs using the Control-M scheduler.
  • Extensively worked on data extraction, transformation, and loading from source to target systems using Informatica PowerCenter.
  • Imported source/target tables from the respective databases and created reusable transformations and mappings using the Informatica Designer tool set.
  • Worked with heterogeneous sources including relational sources and flat files.
  • Worked with the data modeler to understand the data warehouse architecture and mapping documents.
  • Designed mappings implementing complex business logic provided by the data modeler, including dimension and fact tables.
  • Designed reference data and data quality rules using IDQ, and was involved in cleansing data in the Informatica Data Quality 9.1 environment.
  • Extensively involved in monitoring jobs to detect and fix unknown bugs and track performance.
  • Designed complex mapping logic to implement SCD Type 1 and Type 2 dimensions, and worked on critical dimensional modeling, which structures and organizes data in a uniform manner with constraints placed within the structure, a core concept of data modeling.
  • Extensively worked on Informatica IDE/IDQ.
  • Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
  • Used IDQ's standardized plans for address and name cleanups.
  • Worked on IDQ file configuration on users' machines and resolved issues.
  • Used IDQ to complete initial data profiling and remove duplicate data.
  • Involved in designing the dimensional model and created the star schema using ER/Studio.
  • Extensively worked with Data Analyst and Data Modeler to design and to understand the structures of dimensions and fact tables and Technical Specification Document.
  • OLTP systems provide source data to the data warehouse, which in turn supports OLAP data analysis.
  • Interacted with front-end users to present the proof of concept and gather the team's deliverables.
  • Worked on deploying Metadata Manager capabilities such as metadata connectors for data integration visibility, advanced search and browse of the metadata catalog, data lineage, and visibility into data objects, rules, transformations, and reference data.
  • Researched and resolved production issues and data discarded during workflow runs.

Environment: Informatica PowerCenter 9.1.0/9.5, Workflow Manager, Workflow Monitor, PL/SQL, SQL, Oracle 11g/10g, Toad 10.6
