
Sr. Etl Developer Resume

SUMMARY

  • 9+ years of extensive experience with Informatica Power Center in all phases of Analysis, Design, Development, Implementation and support of Data Warehousing applications using Informatica Power Center 10.x/9.x/8.x/7.x.
  • Have clear understanding of Data warehousing concepts with emphasis on ETL and life cycle development including requirement analysis, design, development and implementation.
  • Extensive working experience in design and development of data warehouse, marts and ODS.
  • Hands-on working experience in data warehousing development, including data migration, data conversion and extraction/transformation/loading, using Informatica Power Center to extract and load data into relational databases like SQL Server, Oracle, Teradata and DB2.
  • Good working experience with Informatica data integration tools such as Repository Manager, Designer, Workflow Manager and Workflow Monitor; scheduled workflows using Workflow Manager.
  • Extensively worked with Informatica mapping variables, mapping parameters and parameter files.
  • Experience in several facets of MDM implementations, including Data Profiling, Data Extraction, Data Validation, Data Cleansing, Data Match, Data Load, Data Migration and Trust Score validation.
  • Experience in working with business analysts to identify and understand requirements in order to create Technical Design Documents.
  • Expertise in Informatica B2B Data Transformation (DT and DX), including pre-built transformations for most versions of industry-standard messages, covering EDI standards and derivatives.
  • Experienced in executing complex queries using stored procedures, Triggers, Cursors, User Defined Functions using T-SQL.
  • Experienced in Data Analysis, Data Modeling, Development, Testing and Documentation of projects.
  • Experience working with Informatica Data Quality (IDQ) for data cleansing, data matching and data conversion.
  • Experienced in creating IDQ mappings using Consolidation, Key Generator, Match, Merge, Exception, Labeler, Standardizer and Address Validator transformations with Informatica Developer, and migrated them to Informatica Power Center.
  • Extensively worked with Informatica MDM (Master Data Management) in order to manage the high-volume data.
  • Hands-on experience running projects using scheduling tools like AutoSys.
  • Extensive knowledge in optimizing techniques such as concurrent caching, Auto memory calculations, partitioning and using push down optimization.
  • Experience implementing Data Profiling, Data Stewardship and Data Lineage using Big Data Management.
  • Solid understanding of Hadoop, Hive and HBase implementation. Knowledge of scheduling tools such as Oozie and AutoSys.
  • Strong Knowledge on Power BI.
  • Experienced in using Cloud Data Integration (CDI), Cloud Integration Hub (CIH) and Cloud Application Integration (CAI) Informatica components/products to integrate and design solutions across multiple applications.
  • Experienced in Google Cloud, and in BODS Designer components like Formats, Projects, Data Flows, Jobs, Workflows, Scripts and Data Stores.
  • Experience in using Informatica Connectors, REST, SOAP and other APIs for integration.
  • Proficient in the integration of various data sources with multiple relational databases like Oracle 12c/11g, MS SQL Server, DB2, XML files and flat files into the staging area, data warehouse and data marts. Strong experience with star and snowflake schemas, dimensional data modeling, facts, dimensions and slowly changing dimensions.
  • Experienced in ETL process using PL/SQL to populate the tables in OLTP and OLAP Data Warehouse Environment.
  • Experience in IICS Application Integration components like Processes, Service Connectors and Process Objects.
  • Expertise in implementing Slowly Changing Dimensions (SCD) Type 1 and Type 2 for initial and history load using Informatica.
  • Understanding and working knowledge of Informatica CDC (Change Data Capture); implementation experience with CDC using stored procedures, triggers and Informatica Power Exchange.
  • Experience in debugging mappings; identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Expertise in RDBMS, database Normalization and Denormalization concepts and principles.
  • Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting.
  • Strong knowledge of Big Data Management (BDM).
  • Developed shell scripts for invoking Informatica workflows and for running batch process.
  • Extensive experience in designing the architecture for extract, transformation, load environment and development of mapping and process using Informatica power center.
  • Experienced in Performance Tuning at Mapping, Session and database level, tuning mappings, Pipeline Partitioning to speed up data processing.
  • Extensive experience in developing Stored Procedures, Functions, Views and Triggers, Complex SQL queries using SQL Server, T-SQL and Oracle PL/SQL.
  • Experienced in using different scheduling tools - Control-M, Tidal, AutoSys, Maestro/TWS and cron.
  • Excellent analytical skills in understanding client’s organizational structure.
  • Excellent problem-solving skills with a strong technical background and result oriented team player with excellent communication and interpersonal skills.
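
To illustrate the SCD Type 2 pattern mentioned above, here is a minimal Python sketch of the expire-and-insert logic (the customer key, column names and high date are hypothetical; in an actual Informatica implementation this runs as a mapping with Lookup and Update Strategy transformations):

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # assumed "open-ended" end date

def apply_scd2(dim_rows, incoming, today):
    """Expire changed dimension rows and append new versions (SCD Type 2).

    dim_rows: list of dicts with keys cust_id, name, eff_date, end_date, current
    incoming: list of dicts with keys cust_id, name (the tracked attribute)
    """
    current = {r["cust_id"]: r for r in dim_rows if r["current"]}
    out = list(dim_rows)
    for rec in incoming:
        old = current.get(rec["cust_id"])
        if old is None or old["name"] != rec["name"]:
            if old is not None:
                old["current"] = False       # expire the prior version
                old["end_date"] = today
            out.append({"cust_id": rec["cust_id"], "name": rec["name"],
                        "eff_date": today, "end_date": HIGH_DATE, "current": True})
    return out
```

Unchanged records pass through untouched, so only genuinely new or changed rows generate history.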

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 10.x/9.x/8.x, IICS, Informatica Cloud, Informatica Enterprise Data Catalog (EDC), Master Data Management (MDM), Informatica Data Quality (IDQ), AWS

Database: Oracle 12c/11g/10g/9i, SSIS, SQL Server 2014/2008/2005 and Teradata 14.0

Data Modelling Tools: Erwin 4.1, TOAD, MS Visio, SQL*Loader, Star and Snowflake schemas, Snowflake

Scheduling Tools: TWS, AutoSys, Maestro, Cron, Control-M, Informatica Scheduler, Tidal

Languages: SQL, T-SQL, PL/SQL, C, C++

Scripting Languages: UNIX Shell Scripting, Korn Shell, Bash, Python

Operating Systems: Windows, MS-DOS, Linux, UNIX, Sun Solaris

Reporting Tools: OBIEE, QlikView, Cognos Report Studio, IBM Cognos 11.0, Tableau

PROFESSIONAL EXPERIENCE

Confidential

Sr. ETL Developer

Responsibilities:

  • Responsible for Developing, support and maintenance for the ETL (Extract, Transform and load) processes Using Oracle and Informatica Power Center.
  • Worked extensively with Informatica Power Center tools - Source Analyzer, Data Warehouse Designer, Mapping & Mapplet Designer and Transformation Designer. Developed Informatica mappings and tuned them for better performance.
  • Analyzing the source data and working with business users to develop the Model.
  • Shared requirements with the offshore team and ensured they were understood; set up environments for brand-new projects and for the Informatica team.
  • Extracted data from flat files, SQL Server, Sybase and Oracle databases, applied business logic, and loaded the data into Oracle and Sybase warehouses.
  • Developed mappings/Reusable Objects/Transformation by using mapping designer, transformation developer in Informatica power center.
  • Implemented Informatica recommendations, methodologies and best practices.
  • Implemented slowly changing dimensions to maintain current information and history information in dimension tables.
  • Used Informatica Power Center Workflow manager to create sessions, batches to run with the logic embedded in the mappings.
  • Worked on Informatica Intelligent Cloud Services (IICS) Application Integration components like Processes, Service Connectors and Process Objects to integrate with other applications, along with CIH, CDI and CAI.
  • Developed Cloud mappings to extract the data for different regions.
  • Worked on different data sources such as DB2, Teradata, Oracle, Sybase, SQL Server, flat files, etc.
  • Involved in data validation, data integrity, database performance, field size validations, check constraints, and data manipulation and updates using SQL single-row functions.
  • Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, and Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored procedure transformations.
  • Experience in building REST, API Integration using IICS.
  • Strong understanding across Cloud and infrastructure components (server, storage, network, data, and applications) to deliver end to end Cloud Infrastructure architectures and designs.
  • Experience in IICS Application Integration components like Processes, Service Connectors and process object.
  • Experienced in installing the Enterprise Data Catalog and Enterprise Data Lake binaries in the same directory on a gateway node, and in enabling Informatica to register Enterprise Data Lake with the domain.
  • Build Service connectors for Real Time integration with Third party applications.
  • Experience in using JSON file to call web services.
  • Worked with the internal and external IT teams to create REST APIs and python scripts as needed as well as recognized and developed effective technical solutions regarding system deficiencies and architectural needs
  • Experience in Integrating Informatica cloud with other applications like Salesforce, Workday, ServiceNow
  • Experience in installing Add-on connectors and drivers for IICS.
  • Ability to install Informatica Intelligent Cloud Services (IICS) Secure Agent components and subsystems for a multi-server platform capable of scaling with growing number of components and applications
  • Extensively used Transformations like Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update strategy and Sequence generator, Stored Procedure.
  • Created Triggers and Store procedures to drop and recreate the indexes in the target Data warehouse before and after the sessions.
  • Have Hands on experience in Google cloud.
  • Created E-mail notifications tasks using post-session scripts. Created deployment groups, migrated the code into different environments.
  • Wrote documentation describing program development, logic, coding, testing, changes and corrections. Gave demo presentations of Web UI results, and completed SIT, UAT and production deployments.
  • Provided support to develop the entire warehouse architecture and plan the ETL process.
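
As a sketch of how wrapper scripts typically launch the PowerCenter workflows described above, the following builds a pmcmd startworkflow command line (the service, domain, folder and workflow names are placeholders; a real script would pass the list to subprocess.run and check the exit code):

```python
def pmcmd_startworkflow(service, domain, user, password, folder, workflow, wait=True):
    """Build the argument list for pmcmd startworkflow.

    All identifier values here are hypothetical examples; pmcmd itself is the
    standard PowerCenter command-line client.
    """
    cmd = ["pmcmd", "startworkflow",
           "-sv", service,     # Integration Service name
           "-d", domain,       # Informatica domain
           "-u", user, "-p", password,
           "-f", folder]       # repository folder
    if wait:
        cmd.append("-wait")    # block until the workflow finishes
    cmd.append(workflow)
    return cmd
```

Returning the list rather than a string keeps the call safe to hand to subprocess without shell quoting issues.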

Environment: Informatica Power Center 10.1, Informatica Cloud, IICS, Enterprise Data Catalog (EDC), Oracle 11.5, Teradata, SQL Server, Python, Tidal, Toad, Sybase, Control-M, Snowflake, AWS, PuTTY, WinSCP, UNIX, Jira, Agile Methodology, Bitbucket, REST API, Postman, GCP, UI, SNOW, Cognos Report Studio, EDI, XML and JSON files, Tableau

Confidential

Sr. ETL Developer

Responsibilities:

  • Responsible for Developing, support and maintenance for the ETL (Extract, Transform and load) processes Using Oracle and Informatica Power Center.
  • Worked extensively with Informatica Power Center tools - Source Analyzer, Data Warehouse Designer, Mapping & Mapplet Designer and Transformation Designer. Developed Informatica mappings and tuned them for better performance.
  • Extracted Data from different flat files, MS Excel, MS Access and transformed the data based on user requirement using Informatica Power Center and loaded data into target, by scheduling the sessions.
  • Designed the ETL mappings between sources to operational staging targets, then to the data warehouse using Power center Designer.
  • Used Informatica Power Center 10.1 to create mappings, sessions and workflows for populating the data into dimension, fact and lookup tables simultaneously from different source systems (SQL Server, Oracle, Flat files).
  • Experience in Hadoop and good understanding of Hive and HBase implementation. Knowledge of scheduling tools such as Oozie and AutoSys.
  • Experience in using Informatica Connectors, REST, SOAP and other APIs for integration.
  • Responsible for creating and maintaining Architecture document for PL/SQL ETL Framework.
  • Worked as Data modeler and created Data model for warehouse and involved in ODS and Data Mart data models.
  • Involved in design and development of data warehouse environment, liaison to business users and technical teams gathering requirement specification documents and presenting and identifying data sources, targets and report generation.
  • Extensively used Oracle ETL process for address data cleansing.
  • Used ETL tools Informatica 10.1 to extract data from source systems, cleanse Transform, and Load data into XML and Flat files.
  • Created DDL scripts to create database schema and database objects like tables, stored procedures, views, functions, and triggers using T-SQL.
  • Identified & documented data integration issues and other data quality issues like duplicate data, non-conformed data, and unclean data.
  • Logical and Physical data modeling was done using Erwin for data warehouse database in STAR SCHEMA.
  • Designed and developed standard re-usable mapping, transformation logic and processes in Informatica for implementing business rules and standardization of source data from multiple systems into the data warehouse.
  • Experience in IICS Application Integration components like Processes, Service Connectors and Process Objects.
  • Developed detailed specifications, including data conversion, interfaces and custom reporting, to ensure business requirements were met.
  • Defined and documented setup and configuration parameters while following eBO/Cognos development best practices.
  • Managed report security, folder security, database security, report schedules and distribution.
  • Created common reusable objects for the ETL team, oversaw coding standards, and reviewed high-level design specifications, ETL coding and mapping standards.
  • Optimized query performance by modifying T-SQL queries, normalizing tables and establishing joins.
  • Wrote UNIX Shell Scripts for Informatica Pre-Session, Post-Session and Autosys scripts for scheduling the jobs (work flows).
  • Created sessions, reusable worklets and workflows in Workflow Manager, and scheduled workflows and sessions at specified frequencies.
  • Created Deployment Document for Migration of code from Dev to Test and from Test to Production environment.
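
The workflow and session parameters referenced above are usually delivered through Informatica parameter files, which use a simple INI-like layout. This hypothetical Python helper renders one session section (the folder, workflow, session and parameter names are made up for illustration):

```python
def write_param_file(folder, workflow, session, params):
    """Render one Informatica parameter-file section for a session.

    Header format: [Folder.WF:workflow.ST:session]; names prefixed with $$
    are mapping parameters resolved at run time.
    """
    lines = [f"[{folder}.WF:{workflow}.ST:{session}]"]
    for name, value in params.items():
        lines.append(f"$${name}={value}")
    return "\n".join(lines) + "\n"
```

Generating these files from a script keeps per-environment values (dates, regions, connection hints) out of the mappings themselves.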

Environment: Informatica Power Center 10.1, Informatica Cloud, SSIS, SQL Server, Oracle 11g, SQL, PL/SQL, Oracle RDBMS 12c, Autosys.

Confidential, IL

Sr. ETL Developer

Responsibilities:

  • Responsible for Business Analysis and Requirements Collection.
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Created reference tables, applications and workflows, and deployed them to the Data Integration Service for workflow execution.
  • Created new database objects like Tables, Procedures, Functions, Indexes and Views using T-SQL in Development and Production environment for SQL Server 2005
  • Extracted data from flat files and other RDBMS databases into the staging area and ODS, and populated the data warehouse.
  • Published records from Power Center to downstream applications via message queues. Created reports on the Informatica CDC workflows' performance.
  • Worked with Power Center Designer tools in developing mappings to extract and load the data from XML files into different Flat File formats.
  • Using Informatica Power Center created mappings and mapplets to transform the data according to the business rules.
  • Maintained stored definitions, transformation rules and target definitions using Informatica Repository Manager.
  • Involved in development and debugging of complex batch T-SQL Procedures and functions.
  • Developed mapping parameters and variables to support SQL override.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Created sessions, configured workflows to extract data from various sources, transformed data, and loading into data warehouse.
  • Experienced with change data capture (CDC) by using the MD5 function of Informatica.
  • Experienced with Cognos administration - server administration (install, upgrade, troubleshoot Cognos environments) and content administration (report/package promotion, user security).
  • Experience in all aspects of development in Cognos - Modeling in Cognos Framework manager, map rendering, performance tuning etc.
  • Confer with programmers and architects to gain understanding of needed changes or modifications of existing framework and reports.
  • Involved in enhancements and maintenance activities of the data warehouse, including tuning and modifying stored procedures for code enhancements. Involved in data modeling to identify gaps with respect to business requirements and to transform the business rules.
  • Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the Business requirements.
  • Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
  • Used Debugger to test the mappings and fixed the bugs.
  • Extensively involved in writing SQL queries (Sub queries and Join conditions), PL/SQL programming.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Developed Shell scripts using UNIX to drop and recreate indexes and key constraints.
  • Used TOAD to develop oracle PL/SQL Stored Procedures.
  • Worked in the Agile (Scrum) software development process for better, more accurate sign-off.
  • Designed and developed complex procedures to handle errors and exceptions at both the application and database levels using PL/SQL and shell scripts; experienced in SQL*Loader.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
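
The MD5-based change data capture mentioned above compares a hash of the tracked columns against the hash stored with the target row; only rows whose fingerprint differs are updated. A minimal Python equivalent of Informatica's MD5() approach (the column list and '|' delimiter are assumptions, and the delimiter must not occur in the data):

```python
import hashlib

def row_hash(row, columns):
    """MD5 fingerprint over the tracked columns, in a fixed column order."""
    payload = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def changed(new_row, stored_hash, columns):
    """True when the incoming row differs from the stored fingerprint."""
    return row_hash(new_row, columns) != stored_hash
```

Comparing one 32-character hash instead of every column keeps the lookup cache small and the change test cheap.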

Environment: Informatica Power Center 9x, SQL, PL/SQL, Data Warehouse, Oracle, Git, Cognos, Control-M.

Confidential, Dearborn, MI

Informatica Developer/Project Coordinator

Responsibilities:

  • Worked with business analysts for requirement gathering, business analysis, and translated the business requirements into technical specifications to build the Enterprise data warehouse.
  • Extensively worked on Power Center 9.1 Designer client tools like Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Involved in gathering and analyzing the requirements and preparing business rules.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup and Router transformations to populate target tables efficiently.
  • Experience in Designing ETL & Data Integration Solutions, Development and Support using Informatica PowerCenter.
  • Extensively Worked with Data Modeler, Data Analysts and Business Users to gather, verify and validate various business requirements. Write Stored Procedures and T-SQL statements for retrieval of data and Involved in performance tuning of T-SQL queries and Stored Procedures.
  • Developed database applications to meet business needs using Oracle PL/SQL features. Created packages, procedures, functions and triggers. Involved in creating new table structures and modifying existing tables to fit the existing data model.
  • Extracted data from different databases like Oracle and external source systems like flat files using ETL tool.
  • Worked on performance tuning of the ETL processes. Optimized/tuned mappings for better performance and efficiency.
  • Created reports on the Informatica CDC workflows' performance.
  • Wrote PL/SQL code for data integration between Informatica and other source systems.
  • Extensively used Informatica Power Center for extracting, transforming and loading into different databases.
  • Defined Target Load Order Plan and to load data correctly into different Target Tables.
  • Experienced in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.
  • Involved in designing the ETL testing strategies for functional, integration & system testing for Data warehouse implementation.
  • Recommended tweaks and changes to the data model design based on the challenges in implementing ETL code. Developed complex mappings using Informatica Power Center Designer to transform and load data from different sources and loaded in to Oracle using Informatica.
  • Involved in enhancements and maintenance activities of the data warehouse including performance tuning.
  • Designed Fact tables and Dimension tables for star schemas and snowflake schemas using ERWIN tool and used them for building reports.
  • Performed Unit testing and created Unix Shell Scripts and provided on call support.
  • Created sessions and workflows to run with the logic embedded in the mappings.
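
A target load order plan like the one defined above amounts to a topological sort of table dependencies, so dimensions load before the facts that reference them. A small Python sketch (the table names are hypothetical; requires Python 3.9+ for graphlib):

```python
from graphlib import TopologicalSorter

def load_order(dependencies):
    """Order target tables so that each table's parents load first.

    dependencies maps each table to the set of tables it depends on.
    """
    return list(TopologicalSorter(dependencies).static_order())
```

In Power Center the same effect is configured declaratively via the Target Load Plan, but the ordering principle is identical.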

Environment: Informatica Power Center 8x/9x, MDM, Workflow Manager, Workflow Monitor, Informatica Power Connect/Power Exchange, Toad, Data Warehouse, SQL Developer, Oracle 11g, SQL*Loader, PL/SQL, Erwin, Linux.

Confidential, Hartford, CT

Informatica Developer

Responsibilities:

  • Analyzed the business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica Power Center.
  • Created complex Informatica mappings to load the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator.
  • Responsible for the definition, development and testing of processes/programs necessary to extract data from the operational database and syndicated file systems, including IMS, SWIFT, HL7 and EDI X12 flat files.
  • Responsible for the design of mapping and delivering the Business data for the data warehouse from existing data sources.
  • Worked extensively with HL7 Data, EDI X12 Messages, HIPAA Transactions, ICD Codes.
  • Worked with HIPAA transactions and EDI transaction sets (834, 835, 837, 824, 820, 270, 276, 271, 278). Created workflows and worklets for Informatica mappings.
  • Worked on SQL coding to override the generated SQL query in Informatica.
  • Involved in unit testing for the validity of the data from different data sources.
  • Extensively used Informatica to load data from various Data Sources like Flat files, Oracle, SQL Server, into the Enterprise Data Warehouse.
  • Design and develop PL/SQL packages, stored procedure, tables, views, indexes and functions.
  • Experience dealing with partitioned tables and automating the process of partition drop and create in oracle database.
  • Involved in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
  • Involved in migrating the ETL application from the development environment to the testing environment.
  • Performed data validation in the target tables using complex SQL to make sure all the modules were integrated correctly.
  • Pushed the compressed and encrypted XML files and flat files generated to the external vendor using MFT.
  • Involved in Unit testing and system integration testing (SIT) of the projects.
  • Assisted team members with the mappings developed, as part of knowledge transfer.
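
At the lowest level, the EDI X12 files handled here are segments terminated by '~' with '*' element separators (the real delimiters are declared in each interchange's ISA header). A minimal Python sketch of the split, using an assumed 834-style fragment:

```python
def parse_x12(message, seg_term="~", elem_sep="*"):
    """Split a raw X12 message into segments, each a list of elements.

    Delimiters default to the common '~' and '*', but production code reads
    them from the ISA envelope instead of assuming them.
    """
    segments = [s.strip() for s in message.split(seg_term) if s.strip()]
    return [seg.split(elem_sep) for seg in segments]
```

Downstream mapping logic (e.g. routing 834 enrollment vs. 837 claim content) then dispatches on the first element of each segment.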

Environment: Informatica Power Center 8x, Windows Server 2008, Data Warehouse, Oracle, MS SQL Server 2005, Batch Scripting, Perl Scripting, XML Targets, Flat Files, UNIX.
