
Informatica ETL Developer Analyst Resume


Rocky Hill, CT

SUMMARY:

  • Around 7 years of experience in data warehousing using ETL methodologies to support data extraction, data migration, data transformation, and data loading with Informatica PowerCenter / IDQ.
  • Extensively used Informatica PowerCenter 9.6/9.1/8, Informatica Data Quality (IDQ) 9.6/9.1 as ETL tool for extracting, transforming, loading and cleansing data from various source data inputs to various targets, in batch and real time.
  • Excellent knowledge of implementing Slowly Changing Dimensions (SCD Type 1, Type 2, Type 3) in line with business requirements, Change Data Capture, dimensional data modeling, entity-relationship concepts, Star/Snowflake modeling, data marts, fact and dimension tables, and OLAP and OLTP concepts.
  • Well-versed in the data warehouse life cycle; performed ETL procedures to load data from different sources like SQL Server, Oracle, Mainframe, Teradata, and flat files into data marts and the data warehouse using Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Worked broadly on development of various ETL mappings in Informatica Designer and processing tasks using Workflow Manager to load data from various source systems into the data warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Router, Java, Lookup, Sequence Generator, Filter, and Sorter.
  • Proficient in developing reports using business intelligence tools like Excel, Business Objects, and Tableau.
  • Adept and knowledgeable in the retail domain: order processing systems, purchase orders, sales orders, purchasing behavior, etc.
  • Worked on the Informatica Data Quality (IDQ) 9.6/9.1 toolkit; performed data profiling, cleansing, and matching, and imported data quality files as reference tables.
  • Involved in the analysis, design, development, testing, and implementation of business application systems for various sectors.
  • Knowledge and academic experience in developing business flow diagrams, UML diagrams etc.
  • Adept at understanding Agile software development methodologies and framework.
  • Academic project experience in SAS Enterprise Miner, Base SAS, SAP Business Objects, and Tableau.
  • Excellent in designing ETL procedures and strategies to extract data from heterogeneous source systems like Oracle 11g/10g, SQL Server 2008/2005, DB2 10, flat files, MS Access, XML, etc.
  • Experience in debugging, error handling and performance tuning of sources, targets, mappings and sessions with the help of error logs generated by Informatica server.
  • Experience in SQL, PL/SQL and UNIX shell scripting.
  • Highly proficient in using T-SQL for developing complex stored procedures, triggers, tables, views, user-defined functions, user profiles, relational database models and data integrity, SQL joins, indexing, and query writing.
  • Extensive experience using database tools such as SQL*Plus and SQL Developer.
  • Designed and Developed IDQ mappings for address validation / cleansing, data conversion, exception handling, and report exception data at staging area.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Excellent communication, presentation, project management skills, a very good team player and self-starter with ability to work independently and as part of a team. Have proven to be highly effective in interfacing across business and technical groups.
  • Identified and fixed bottlenecks and tuned the complex Informatica mappings for better Performance.
  • Involved in the software development life cycle (SDLC) (Waterfall, Scrum/Agile) of building a data warehouse on Windows.

TECHNICAL TOOLS:

ETL Tools: Informatica Power Center 10.1/9.1/8.x/7.x/6.x, Power Exchange 9.1/8.x/7.x, Data Quality, Power Mart 5.x (Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplet, Transformation Developer, Repository Manager, Workflow Monitor).

Databases: Oracle 12C/11g/10g/9i/8i, DB2, MS SQL Server 2000/2005, Teradata.

DB Tools: Oracle SQL Developer, Toad, Oracle SQL*Plus.

Operating Systems: Windows NT/98/2000/XP/2003/7, UNIX, MS-DOS.

Scheduling Tools: Informatica Scheduler.

Languages: SQL, PL/SQL, UNIX/Linux shell scripting, HTML, C, C++.

Office Suite: MS Word, MS Power Point, MS Excel, MS Access.

Methodologies: Agile and Waterfall methodologies, Data Mart, Dimensional, Snowflake, Star schema.

Other Tools: Tableau, SAS.

WORK EXPERIENCE:

Informatica ETL Developer Analyst

Confidential, Rocky Hill, CT

Responsibilities:

  • Worked with Business Analysts and Data Architects in analyzing business requirements, specifications and business requirement documents in order to define the optimal solution and identify responsibilities for delivering solutions for data loading and maintenance.
  • Engaged in preparing LLD and HLD documents along with the business analysts and got those reviewed by customers in line with detailed project time lines.
  • Performed data cleansing and analysis operations on the incoming data and maintained a report of the analysis performed.
  • Extensively worked on data extraction, cleansing, and data integration from various heterogeneous source systems such as DB2, MS SQL Server, and XML files; applied business logic and loaded the target Oracle system as per business requirements.
  • Implemented SCD Type 1 and Type 2 to maintain historical data as per business requirements.
  • Developed complex mappings using Informatica Power Center Designer to transform and load the data to Target system.
  • Experience in using Stored Procedures, TOAD, Explain Plan, Ref Cursors, Constraints, Triggers, Indexes-B-tree Index, Bitmap Index, Views, Inline Views, Materialized Views, Database Links, Export/Import Utilities.
  • Used different algorithms like Bigram, Edit, Jaro, Reverse, and Hamming Distance to determine the threshold values to identify and eliminate duplicate datasets and to validate, profile, and cleanse the data. Created/modified reference tables for valid data using the Analyst tool.
  • Used various transformations such as expression, filter, rank, source qualifier, joiner, aggregator and Normalizer in the mappings and applied surrogate keys on target table.
  • Developed and implemented UNIX shell script for the start and stop procedures of the sessions.
  • Used Informatica debugger for handling data errors in mapping designer and fixed the bugs in DEV.
  • Performed unit testing and documented unit test plan and its results.
  • Responsible for test case execution and adhoc testing.
  • Used HP Quality Center for storing, maintaining the test repository, bug tracking and reporting.
  • Performed data validation testing writing SQL queries.
  • Investigated and provided solutions to problems and queries relating to reports and also to incidents raised through support.
  • Created connection pools, physical tables, defined joins and implemented authorizations in the physical layer of the repository.
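The session start/stop shell scripts mentioned above could look roughly like the following minimal sketch. All service, domain, folder, and workflow names are illustrative assumptions, and credentials are omitted; `PMCMD` defaults to `echo pmcmd` here purely so the sketch is runnable anywhere, whereas on a real Informatica server it would be the pmcmd binary itself.

```shell
#!/bin/sh
# Hedged sketch of a workflow start/stop wrapper; names are placeholders.
PMCMD="${PMCMD:-echo pmcmd}"   # dry-run by default; set PMCMD=pmcmd on a server

INFA_SERVICE="IS_ETL"          # assumed Integration Service name
INFA_DOMAIN="Domain_ETL"       # assumed domain name
FOLDER="DW_LOADS"              # assumed repository folder
WORKFLOW="wf_daily_load"       # assumed workflow name

start_workflow() {
  # -wait blocks until the workflow finishes so the caller sees its status
  $PMCMD startworkflow -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
    -f "$FOLDER" -wait "$WORKFLOW"
}

stop_workflow() {
  $PMCMD stopworkflow -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
    -f "$FOLDER" "$WORKFLOW"
}

start_workflow
```

A scheduler or operator can then call the one script with `start` or `stop` semantics rather than remembering the full pmcmd invocation.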

Environment: Informatica Power Center 9.1, Oracle 9i/10g/11g, DB2, SQL Server 2005/2008, Windows 2003/2008, UNIX/LINUX, Shell Scripting, HP Quality center, TOAD

Informatica ETL Developer

Confidential, Birmingham, AL

Responsibilities:

  • Analyzed the business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica PowerCenter.
  • Experienced in Parsing high-level design specs to simple ETL coding and mapping standards.
  • Worked on Agile Methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter/Data Quality (IDQ), peer reviewed their development works and provided the technical solutions. Proposed ETL strategies based on requirements.
  • Coded Teradata BTEQ scripts to load and transform data, fix defects such as SCD Type 2 date chaining, and clean up duplicates.
  • Worked with team to convert Trillium process into Informatica IDQ objects.
  • Extensively used DQ transformations like Address Validator, Exception, Parser, and Standardizer. Solid experience in debugging and troubleshooting sessions using the Debugger and Workflow Monitor.
  • Extensively worked on UNIX shell scripts for server Health Check monitoring such as Repository Backup, CPU/Disk space utilization, Informatica Server monitoring, UNIX file system maintenance/cleanup and scripts using Informatica Command line utilities.
  • Extensively worked on CDC to capture data changes in sources for delta loads. Used the Debugger to validate the mappings and gained troubleshooting information about the data and error conditions.
  • Developed workflows with worklets, Event Wait, Assignment, conditional flows, Email, and Command tasks using Workflow Manager.
  • Proficient in System Study, Data Migration, Data integration, Data profiling, Data Cleansing / Data Scrubbing and Data quality.
  • Worked on Informatica Data Quality (IDQ) toolkit, analysis, data cleansing, data matching, data conversion, address standardization, exception handling, reporting and monitoring capabilities of IDQ.
  • Experience in creation of ETL Mappings and Transformations using Informatica PowerCenter to move data from multiple sources into target area using complex transformations like Expressions, Routers, Lookups, Source Qualifiers, XML generator, XML Parser, Aggregators, Filters, Joiners.
  • Responsible for preparing logical as well as physical data models and documenting the same.
  • Performed ETL code reviews and Migration of ETL Objects across repositories.
  • Developed ETLs for masking the data when made available to the offshore development team.
  • Developed UNIX scripts for dynamic generation of parameter files and for FTP/SFTP transmission.
  • Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and provided support for the ETL jobs running in production in order to meet the SLAs.
  • Integrated IDQ mappings through IDQ web service applications as cleanse functions in Informatica IDQ cleanse Adapters.
  • Migrated codes from Dev to Test to Pre-Prod. Created effective Unit, Integration test of data on different layers to capture the data discrepancies/inaccuracies to ensure successful execution of accurate data loading.
  • Scheduled Informatica workflows using OBIEE.
  • Involved in implementing change data capture (CDC) and Type I, II, III slowly changing Dimensions.
  • Developed functions and stored procedures to aid complex mappings
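Dynamic parameter-file generation of the kind described above can be sketched as follows. The folder, workflow, and parameter names are assumptions for illustration, not values from this project; the file layout follows the standard PowerCenter convention of a `[Folder.WF:workflow]` section header followed by `$$parameter=value` lines.

```shell
#!/bin/sh
# Illustrative sketch only: folder, workflow, and parameter names are assumed.
RUN_DATE=$(date +%Y%m%d)
PARAM_FILE="/tmp/wf_daily_load_${RUN_DATE}.par"

# PowerCenter parameter files: [Folder.WF:workflow] header, then $$name=value.
# The \$ escapes keep the literal $$ prefix from being expanded by the shell.
cat > "$PARAM_FILE" <<EOF
[DW_LOADS.WF:wf_daily_load]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_FILE=/data/inbound/orders_${RUN_DATE}.csv
\$\$TGT_SCHEMA=EDW
EOF

echo "generated $PARAM_FILE"
```

The workflow is then started with this file via pmcmd's `-paramfile` option, so each run picks up the current date and source file without manual edits.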

Environment: Informatica PowerCenter 10.x/9.6, Informatica Data Quality (IDQ) 9.6, Oracle 11g, Teradata, PL SQL, SQL developer, TOAD, Putty, Unix

Informatica ETL Developer

Confidential, Alpharetta, GA

Responsibilities:

  • Extracted data from Order management systems and loaded into the data marts.
  • Assisted in preparing design specifications for data extraction, transformation, and loading; developed mappings enabling the extract, transport, and load of the data into target tables.
  • Created workflows, worklets, and tasks in Workflow Manager to schedule the loads at the required frequency.
  • Implemented SCD Type 1 and Type 2 to maintain historical data as per business requirements.
  • Prepared reusable transformations to load data from operational data source to Data Warehouse.
  • Wrote complex SQL Queries involving multiple tables with joins.
  • Scheduled and ran workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
  • Used debugger, session logs and workflow logs to test the mapping and fixed the bugs.
  • Analyzed the dependencies between the jobs and scheduling them accordingly using the Work Scheduler.
  • Improved the performance of the mappings, sessions using various optimization techniques.
  • Created source and target connections in workflow manager with assistance from admin teams.
  • Deployed the code to SIT and QA environments adhering to the process stipulated by the organization.
  • Constructed workflows having Command, Email, Session, Decision, and a wide variety of tasks.
  • Maintained development, test, and production mapping migration using Repository Manager.
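The job-dependency analysis and sequencing described above can be sketched as a simple chained script: the dependent load runs only if its upstream load succeeds. Workflow names are illustrative, and `RUN` defaults to `echo` purely so the sketch runs without an Informatica server; on a real server it would wrap a blocking `pmcmd ... -wait` call whose exit code reflects the workflow result.

```shell
#!/bin/sh
# Sketch of dependency-aware load sequencing; all names are placeholders.
RUN="${RUN:-echo}"   # stand-in for a blocking pmcmd call

run_load() {
  # Assumed real form:
  #   pmcmd startworkflow -sv IS_ETL -d Domain_ETL -f DW_LOADS -wait "$1"
  $RUN "running $1"
}

# wf_mart_load depends on wf_stage_load completing successfully
if run_load wf_stage_load; then
  run_load wf_mart_load
else
  echo "stage load failed; skipping wf_mart_load" >&2
  exit 1
fi
```

Because `pmcmd -wait` returns a non-zero exit status when a workflow fails, the same `if` pattern extends naturally to longer dependency chains.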

Environment: Informatica Power center 9.1, Repository Manager, Oracle 11g, SQL Server 2005/2008, Windows 2003/2008, Service Now

ETL Informatica Developer

Confidential, King of Prussia, PA

Responsibilities:

  • Worked on Business support request (BSR) raised by business users for critical/high priority issues.
  • Used SSIS to design ETL process using control flows and data flows.
  • Using data flows in SSIS, extracted data from external data sources, flowed that data through several transformations such as sorting, filtering, merging it with other data, and converting data types, and finally stored the result at a destination, usually a table in the data warehouse.
  • Involved in building the ETL architecture using Informatica 9.6.1/ 9.5.1 and Source to Target mapping to load data into Data warehouse.
  • Provided periodic updates to the customer on coding and unit-testing releases and acted as coordinator between the development and business teams.
  • Used the Tableau data visualization tool to analyze and obtain insights into large datasets and to create visually compelling, actionable interactive reports and dashboards.
  • Performed data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Used Metadata manager for validating, promoting, importing and exporting repositories from development environment to testing environment.
  • Performed data quality analysis, gathered information to determine data sources, data targets, data definitions, data relationships, and documented business rules.
  • Prepared technical specifications for the development of extraction, transformation, and loading of data into various stage tables.
  • Created Stored procedures, collections and packages.
  • Created mapping documents to outline data flow from sources to targets. Parsed high-level design specification to simple ETL coding and mapping standards.

Environment: Informatica 9.6.1/9.5.1, Teradata, DB2, Oracle, Flat Files, UNIX, Windows, SQL Assistant, Tableau.

ETL Developer

Confidential

Responsibilities:

  • Interacted actively with business analysts and data modelers on mapping documents and the design process for various sources and targets.
  • Extracted data from flat files, Oracle (via SQL*Plus), and MS SQL Server 2008 and loaded the data into the target database.
  • Extensively used Informatica Power Center 7.1, an ETL tool, to extract, transform, and load data from remote sources to the DW.
  • Involved in designing high-level technical documentation based on specifications provided by the manager.
  • Basic Informatica administration such as creating folders, users, privileges, server setting optimization, and deployment groups, etc.
  • Developed complex joins in the mappings to process data from different sources.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors of target data load.
  • Developed workflow tasks like reusable Email, Event wait, Timer, Command, Decision.
  • Performed unit testing of Informatica sessions, batches and the target Data.
  • Involved in performance tuning of mappings, transformations and (workflow) sessions to optimize session performance.
  • Designed and developed UNIX shell scripts as part of the pre-session and post-session command to automate the process of loading, pulling, renaming and pushing data from and to different servers.
  • The extraction, transformation, and loading (ETL) process was achieved with Informatica.
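A pre/post-session command script of the kind described above, which renames and moves processed flat files, could be sketched as follows. Directory names are placeholders, and the push to a remote server (e.g. via sftp) is omitted here for brevity.

```shell
#!/bin/sh
# Sketch of a post-session file-archiving step; paths are illustrative.
SRC_DIR="${SRC_DIR:-/tmp/etl_inbound}"
ARC_DIR="${ARC_DIR:-/tmp/etl_archive}"
STAMP=$(date +%Y%m%d_%H%M%S)

mkdir -p "$SRC_DIR" "$ARC_DIR"

archive_files() {
  for f in "$SRC_DIR"/*.dat; do
    [ -e "$f" ] || continue                  # nothing to archive
    base=$(basename "$f" .dat)
    mv "$f" "$ARC_DIR/${base}_${STAMP}.dat"  # timestamped rename + move
  done
}

# example run with a sample file
touch "$SRC_DIR/orders.dat"
archive_files
```

Hooking such a script into the session's post-session success command keeps the inbound directory clean so the next load never reprocesses an old file.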

Environment: Informatica Power Center Designer 7, Oracle 9.x/10g, Flat Files, UNIX, MS SQL Server 2008, SQL, PL/SQL, and SQL PLUS

Data Engineer

Confidential

Responsibilities:

  • Involved in the requirements definition and analysis in support of Data Warehousing efforts.
  • Worked on ETL design and development, creation of the Informatica source to target mappings, sessions and workflows to implement the Business Logic.
  • Created Informatica mappings with PL/SQL Procedures/Functions to build business rules to load data.
  • Used most of the transformations such as the Source qualifier, Aggregators, Lookups, Filters, Sequence and Update strategy, Router.
  • Extensive knowledge of and work with Informatica Data Quality (IDQ 8.6.1) for data analysis, data cleansing, data validation, data profiling, and matching/removing duplicate data.
  • Designed and developed Informatica DQ jobs and mapplets using different transformations like Address Validator, matching, consolidation, rules, etc. for data loads and data cleansing.
  • Scheduled and Run Workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
  • Involved in Data Quality checks by interacting with the business analysts.
  • Performed unit testing and tuned the mappings for better performance.
  • Maintained documentation of ETL processes to support knowledge transfer to other team members.
  • Created various UNIX Shell Scripts for scheduling various data cleansing scripts and automated execution of workflows.
  • Validated and tested the mappings using Informatica Debugger, session logs and workflow logs

Environment: Informatica PowerCenter 8.6.1, Informatica Data Quality (IDQ 8.6.1), SQL Server, Oracle 11g, Flat files, MySQL, Teradata 13, Notepad++, Toad, UNIX scripting, Windows NT.
