
ETL Developer / Informatica Developer Resume


SUMMARY

  • 5+ years of IT experience in design, analysis, development, documentation, coding, and implementation, including databases, data warehouses, ETL design, Oracle, PL/SQL, SQL Server databases, SSIS, Informatica PowerCenter 9.x/8.x/7.x, and Informatica Data Quality
  • Expertise in Master Data Management concepts and methodologies, and the ability to apply this knowledge in building MDM solutions
  • Very good experience in developing ETL processes to facilitate data archival activities from a variety of source systems
  • Good experience in data warehousing technology using the ETL tool Informatica PowerCenter.
  • Good IT experience in system analysis, design, development, testing, and implementation of databases and data warehouses using IBM InfoSphere DataStage 11.5/9.1/8.7
  • Experience in developing parallel jobs using various DataStage stages such as Lookup, Join, Transformer, Copy, Filter, Sort, Aggregator, Funnel, Remove Duplicates, SQL Server Enterprise, Oracle Connector, Sequential File, and Data Set.
  • Excellent knowledge of Python collections and multi-threading
  • Data processing knowledge in designing and implementing data warehouse applications, mainly transformation processes, using the ETL tool Informatica.
  • Experience and knowledge in data warehousing using data extraction, data transformation, and data loading (ETL)
  • Experienced in Agile methodologies, Scrum stories, and sprints in a Python-based environment, along with data analytics, data wrangling, and Excel data extracts.
  • Experience in using SQL Server Integration Services (SSIS) to build Extract, Transform and Load (ETL) solutions, providing logging and error handling using Event Handlers and custom logging for SSIS packages.
  • Skilled in Python, with proven expertise in using new tools and technical developments
  • Knowledge of collecting data onto the AWS platform by subscribing to an MQTT topic, processing it with the rules engine, and sending the data to other services such as Amazon S3 (see the rule sketch after this list)
  • Good experience in OBIEE Oracle BI Analytics 11g, Oracle BI Applications 7.9.4, and Siebel Analytics, including building and configuring repositories and creating and managing reports and dashboards
  • Involved in complete software development life cycle (SDLC) of project with experience in domains like Healthcare, Banking, Insurance
  • Expertise in the ETL tool Informatica, with extensive experience in PowerCenter client tools including Designer, Repository Manager, and Workflow Manager/Workflow Monitor
  • Extensively worked with complex mappings using various transformations like Filter, Joiner, Router, Source Qualifier, Expression, Union, Unconnected/Connected Lookup, Aggregator, Stored Procedure, XML Parser, Normalizer, Sequence Generator, Update Strategy, Reusable Transformations, User Defined Functions, etc.
  • Extensively worked on relational database systems like Oracle 11g/10g/9i/8i, MS SQL Server, and Teradata, and source files like flat files, XML files, and COBOL files
  • Excellent background in the implementation of business applications and in using RDBMS and OOP concepts.
  • Experience in writing complex SQL queries and in ETL performance improvement
  • Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities.
  • Expertise in Data warehousing concepts like OLTP/OLAP System Study, Analysis, and E-R modeling, developing database Schemas like Star schema and Snowflake schema used in relational, dimensional, and multidimensional data modeling.
  • Experienced in mapping techniques for Type 1, Type 2, and Type 3 Slowly Changing Dimensions (see the Type 2 sketch after this list).
  • Hands-on experience in Informatica upgrade from 8.6 to 9.1
  • Extensive experience in debugging mappings, identifying bottlenecks/bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations
  • Solid experience in implementing business requirements, error handling, job control & job auditing using Informatica Power Center tools
  • Strong experience in SQL, PL/SQL, Tables, Database Links, Materialized Views, Synonyms, Sequences, Stored Procedures, Functions, Packages, Triggers, Joins, Unions, Cursors, Collections, and Indexes in Oracle
  • Proficient in Oracle tools and utilities such as TOAD and SQL*Loader.
  • Experienced in scheduling sequence and parallel jobs using UNIX scripts and scheduling tools: Control-M v7/v8, CA WA Workstation (ESP), and UC4.
  • Expert in analyzing Business & Functional Specifications, creating Technical Design Document and Unit Test cases
  • Experience in performance tuning of targets, sources, mappings, workflows, and systems.
  • Identified and fixed bottlenecks and tuned the complex Informatica mappings for better Performance.
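
A minimal sketch of the Type 2 technique referenced above, with hypothetical table, column, and sequence names: a changed record first has its current dimension row expired, then a new current row is inserted; brand-new records fall through to the insert alone.

    -- Step 1: expire the current dimension row when a tracked attribute changes.
    UPDATE customer_dim d
       SET d.eff_end_date = SYSDATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.status <> d.status));

    -- Step 2: insert a new current row for new and changed records.
    INSERT INTO customer_dim
        (customer_key, customer_id, address, status, eff_start_date, eff_end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.status, SYSDATE, NULL, 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');

In a PowerCenter mapping, the same logic is typically built with a Lookup on the dimension plus an Update Strategy that routes rows to DD_UPDATE or DD_INSERT.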
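
The AWS IoT rules engine mentioned above filters MQTT messages with a SQL-like statement; a minimal hypothetical rule (topic and field names are assumptions) that selects over-threshold readings for a configured Amazon S3 action:

    SELECT deviceId, temperature, timestamp() AS received_at
    FROM 'sensors/+/telemetry'
    WHERE temperature > 60

The S3 destination itself is attached to the rule as an action rather than expressed in the SQL.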

TECHNICAL SKILLS

Databases: Oracle, MS SQL Server, DB2, Teradata

Languages: SQL, PL/SQL, Unix Shell Script, Visual Basic, XML, PERL

Tools: Toad, SQL*Loader, WinSQL, Teradata SQL Assistant

Operating Systems: Windows, AIX, Sun Solaris, Linux

Testing Tools: Quality Center, QTP, WinRunner, LoadRunner, Test Director, Rational ClearQuest

Version Control: Rational ClearCase, CVS, VSS, PVCS

PROFESSIONAL EXPERIENCE

Confidential

ETL Developer/ Informatica Developer

Responsibilities:

  • Performed extraction, transformation, and loading (ETL) of data using Informatica PowerCenter.
  • Worked in an Agile methodology with a team of ETL developers, database developers, and testers.
  • Assisted with the creation of reusable components as they relate to the ETL framework.
  • Developed, tested, and deployed data integration solutions.
  • Transferred data from data centers to the cloud using the AWS Import/Export Snowball service.
  • Wrote complex SQL queries to fulfill business requirements; troubleshot and performance-tuned SQL, SSIS, and BizTalk applications involved in the Encounters outbound file load and file extract process.
  • Prepared and analyzed reports using Python libraries and was involved in environment setup.
  • Used shell scripting and various workflow tasks and pre-session/post-session utilities to automate the ETL process, such as formatting .CSV files to include headers/footers and triggering emails to end users after report creation or in case of session failure.
  • Used Python and Django to interface with the jQuery UI and manage the storage and deletion of content.
  • Worked on Informatica, an ETL tool used to transform and populate the data, which is further processed for client reporting purposes.
  • Developed ETL code in DataStage parallel jobs utilizing Sequential file, Join, Merge, Lookup, Aggregator, Row generator, Peek, Dataset, Funnel, Remove Duplicates, Copy and Column Generator stages.
  • Worked on the Informatica Designer components: Source Analyzer, Transformation Developer, and Mapping Designer.
  • Involved in the designing of Landing, Staging and Base tables in Informatica MDM.
  • Created MDM mapping and configured match and merge rules to integrate the data received from different sources.
  • Worked on OBIEE 10g/11g version upgrades and modified the reports
  • Developed the reports against Oracle 11g database
  • Created PL/SQL stored procedures and various functions using Oracle.
  • Used collections and bulk processing in performance-tuned PL/SQL stored procedures (see the bulk-processing sketch after this list).
  • Implemented Type 2 slowly changing dimensions to maintain dimension history and tuned the mappings for optimum performance.
  • Created and worked with generic stored procedures for various purposes, like truncating data from stage tables, inserting a record into the control table, generating parameter files, etc.
  • Designed and developed Staging and Error tables to identify and isolate duplicates and unusable data from source systems.
  • Designed and developed ETL processes that load data for the data warehouse using ETL tools and PL/SQL.
  • Simplified the development and maintenance of ETL by creating Mapplets, Re-usable Transformations to prevent redundancy.
  • Developed standard mappings and reusable Mapplets using various transformations like expression, aggregator, joiner, source qualifier, router, lookup and filter.
  • Wrote and implemented generic UNIX and FTP scripts for various purposes, like running workflows, archiving files, executing SQL commands and procedures, and moving inbound/outbound files.
  • Maintained existing ETL Oracle PL/SQL procedures, packages, and Unix scripts.
  • Designed data warehouse schema and star schema data models using Kimball methodology.
  • Designed and executed test scripts to validate end-to-end business scenarios.
  • Used session partitions, dynamic cache memory, and index cache to improve the performance of ETL jobs.
  • Designed and developed reporting end user layers (Business Objects Universes) and Business Objects reports.
  • Automated and scheduled Workflows, UNIX scripts and other jobs for the daily, weekly, monthly data loads using Autosys Scheduler.
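
A minimal sketch of the generic stored-procedure and bulk-processing patterns referenced above, with hypothetical object names: a dynamic truncate procedure for stage tables, and a batched load using BULK COLLECT and FORALL to reduce context switches between the SQL and PL/SQL engines.

    -- Generic stage-table truncate; the name is validated before use.
    CREATE OR REPLACE PROCEDURE truncate_stage (p_table_name IN VARCHAR2) AS
    BEGIN
        EXECUTE IMMEDIATE 'TRUNCATE TABLE '
                          || DBMS_ASSERT.SIMPLE_SQL_NAME(p_table_name);
    END truncate_stage;
    /

    -- Batched bulk load; assumes claims_fact matches claims_stg%ROWTYPE.
    DECLARE
        CURSOR c_src IS SELECT * FROM claims_stg;
        TYPE t_rows IS TABLE OF claims_stg%ROWTYPE;
        l_rows t_rows;
    BEGIN
        OPEN c_src;
        LOOP
            FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;
            EXIT WHEN l_rows.COUNT = 0;
            FORALL i IN 1 .. l_rows.COUNT
                INSERT INTO claims_fact VALUES l_rows(i);
        END LOOP;
        CLOSE c_src;
    END;
    /

The LIMIT clause keeps memory bounded on large loads; 1000 is an arbitrary batch size, not a figure from the original project.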

Environment: Informatica PowerCenter 10.1, PowerExchange, Oracle 12c, PL/SQL, Business Objects XI R2, Teradata 15, Erwin 9.7, Autosys, DB2, UNIX.

Confidential

ETL Developer / Informatica Developer

Responsibilities:

  • Involved in designing high-level technical documentation based on specifications provided by the manager.
  • Performed troubleshooting and fixed and deployed many Python bug fixes for the two main applications
  • Worked on extraction, transformation, and loading of data directly from different heterogeneous source systems like Oracle and flat files (ETL) using the Informatica PowerCenter 9.x tool.
  • Worked on packages, control flow, data flow, containers, data flow tasks, transformations, parallel execution, error handling, logging, debugging, SQL jobs and maintenance in SSIS
  • Extensively used ETL Tool Informatica to load data from Flat Files, Oracle, SQL Server, Teradata
  • Performed agile data loading for Amazon Redshift using the Informatica drag-and-drop cloud designer to create integrations with multiple source objects and targets.
  • Used Python scripts to update the content in database and manipulate files.
  • Worked on completing upgrade from Informatica MDM 9.7 to 10.1.
  • Loaded data from large data files into Hive tables (see the HiveQL sketch after this list).
  • Used the Control-M scheduling tool to run and monitor DataStage jobs.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Designed and developed the DataStage parallel jobs for extracting, cleansing, transforming, integrating, and loading data using DataStage Designer.
  • Created all the target table DDLs as well as the source table DDLs in Teradata.
  • Converted the data mart from logical design to physical design: defined data types, constraints, and indexes, generated the schema in the database, created automated scripts, and defined storage parameters for the objects in the database.
  • Gathered requirements and developed an understanding of the functional business processes and requirements given by the business analyst.
  • Modified the design of an ETL package to avoid a long-running transformation.
  • Designed and developed Informatica mappings for data loads and data cleansing.
  • Performance tuned ETL processes at the mapping, session and Database level.
  • Integrated sources from different databases and flat files.
  • Used Mapplets and reusable transformations to prevent redundant transformation usage and improve maintainability.
  • Involved in end-to-end system testing, performance and regression testing, data validation, and unit testing.
  • Performed basic Informatica administration, such as creating folders, users, privileges, and deployment groups, and optimizing server settings.
  • Managed change control implementation and coordinated daily and monthly releases and reruns.
  • Used Teradata utilities such as MLoad, FLoad, and TPump.
  • Created BTEQ scripts (see the BTEQ sketch after this list).
  • Used UNIX scripts for automating processes.
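
A minimal sketch of the Hive file load mentioned above, with hypothetical HDFS path, table, and partition names; LOAD DATA INPATH moves the file from its landing directory into the table's warehouse location:

    LOAD DATA INPATH '/landing/claims/2016-05-01/claims.csv'
    INTO TABLE claims_raw
    PARTITION (load_dt = '2016-05-01');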
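
And a minimal BTEQ sketch of the kind referenced above (TDPID, credentials, and table names are hypothetical); the .IF ERRORCODE checks surface SQL failures to the scheduler as a non-zero return code:

    .LOGON tdprod/etl_user,etl_password;

    DELETE FROM stg_db.claims_stg;
    .IF ERRORCODE <> 0 THEN .QUIT 8;

    INSERT INTO dw_db.claims_fact (claim_id, member_id, paid_amt, load_dt)
    SELECT claim_id, member_id, paid_amt, CURRENT_DATE
      FROM stg_db.claims_stg;
    .IF ERRORCODE <> 0 THEN .QUIT 8;

    .LOGOFF;
    .QUIT 0;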

Environment: Java, Informatica PowerCenter 10.1/9.6.1/9.1.0, Informatica Developer Client, IDQ, MDM, PowerExchange, DB2, SAP, Oracle 11g, Hadoop HDFS, Hive, Sqoop, Syncsort, PL/SQL, TOAD, SQL Server 2005/2008, XML, UNIX, Windows XP, OBIEE, Teradata.

Confidential, Houston TX

ETL/ Informatica Developer

Responsibilities:

  • Worked closely with business analysts and data analysts to understand and analyze the requirement to come up with robust design and solutions.
  • Involved in the standardization of data, such as changing a reference data set to a new standard.
  • Data validated by a third party before being provided to the internal transformations should still be checked for accuracy (DQ).
  • Used Informatica PowerCenter 9.1 for extraction, transformation, and load (ETL) of data in the data warehouse.
  • Reviewed the developed ETL packages against coding and business standards.
  • Involved in massive data profiling prior to data staging.
  • Created profiles and scorecards for the users.
  • Created Technical Specification Documents and Solution Design Documents to outline the implementation plans for the requirements.
  • Designed the mappings according to the OBIEE specifications, such as SDE (Source Dependent Extract) and SIL (Source Independent Load).
  • Involved in various phases of ETL analysis of the existing system.
  • Involved in testing of Stored Procedures and Functions, Unit and Integrating testing of Informatica Sessions, Batches and the Target Data.
  • Responsible for the development, support, and maintenance of ETL (Extract, Transform and Load) processes using Informatica PowerCenter 9.5, with various transformations like Expression, Source Qualifier, Filter, Router, Sorter, Aggregator, Update Strategy, and Connected and Unconnected Lookup.
  • Created the Informatica components required to operate Data Quality content (PowerCenter required).
  • Designed best practices on Process Sequence, Dictionaries, Data Quality Lifecycles, Naming Convention, and Version Control.
  • Developed the required Informatica mappings to process the data into Dimension and facts tables which satisfy the OBIEE reporting rules by interacting with reporting team.
  • Extensive data modeling experience using dimensional data modeling, star schema modeling, snowflake modeling, and fact and dimension tables (see the schema sketch after this list).
  • Created PL/SQL programs like procedures, functions, packages, and cursors to extract data from the target system.
  • Utilized dimensional and star-schema modeling to come up with new structures to support drill-down.
  • Converted business requirements into highly efficient, reusable and scalable Informatica ETL processes.
  • Data sourced from a database whose columns carry valid NOT NULL constraints should not undergo the DQ completeness check (see the completeness sketch after this list).
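
A minimal sketch of the completeness check described in the two DQ items above, with a hypothetical staging table and columns; third-party feeds get a null count per required column, while columns already guaranteed NOT NULL by the source database can skip it:

    -- Null counts per required column in the staging feed.
    SELECT COUNT(*)                                           AS total_rows,
           SUM(CASE WHEN member_id IS NULL THEN 1 ELSE 0 END) AS member_id_nulls,
           SUM(CASE WHEN claim_dt IS NULL THEN 1 ELSE 0 END)  AS claim_dt_nulls,
           SUM(CASE WHEN paid_amt IS NULL THEN 1 ELSE 0 END)  AS paid_amt_nulls
      FROM claims_stg;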
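
And a minimal star-schema sketch of the kind described above, with hypothetical names: a fact table keyed to conformed dimensions, so reports drill down by joining out to dimension attributes.

    -- Dimensions carry the drill-down attributes.
    CREATE TABLE date_dim (
        date_key    NUMBER PRIMARY KEY,
        calendar_dt DATE NOT NULL
    );

    CREATE TABLE member_dim (
        member_key  NUMBER PRIMARY KEY,
        member_id   VARCHAR2(20) NOT NULL,
        member_name VARCHAR2(100)
    );

    -- Fact rows reference the dimensions by surrogate key.
    CREATE TABLE claims_fact (
        claim_id   VARCHAR2(20) NOT NULL,
        date_key   NUMBER NOT NULL REFERENCES date_dim (date_key),
        member_key NUMBER NOT NULL REFERENCES member_dim (member_key),
        paid_amt   NUMBER(12,2)
    );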

Environment: Informatica Power Center 9.1/8.6, PL/SQL Developer, OBIEE, IDQ, Oracle 11g, UNIX, Microsoft SQL Server, TOAD, Teradata, Netezza.
