
ETL Developer Resume


Harrisburg, PA

SUMMARY

  • Over 6 years of diversified experience in Information Technology with an emphasis on data warehousing, using Informatica PowerCenter/PowerMart 8.6/8.1/7.1/6.2 to load target databases and to develop strategies for Extraction, Transformation and Loading (ETL).
  • Extensive knowledge of dimensional data modeling, star/snowflake schemas, and fact and dimension tables.
  • Experience in building operational data stores (ODS), data marts, and enterprise data warehouses.
  • Extensive experience with ETL from disparate data sources such as Oracle, DB2, SQL Server, XML files, flat files and VSAM files to target (warehouse/data mart) databases such as Teradata and Oracle.
  • 6+ years of extensive experience with databases (MS SQL Server 2000/2005/2008, Oracle 8i/9i/10g, Teradata, and IBM DB2).
  • Expertise in SQL/PL-SQL programming: developing and executing packages, stored procedures, functions, triggers, table partitioning, and materialized views.
  • Extensive work with PL/SQL and performance tuning of Oracle using TKPROF, SQL trace, SQL plans, SQL hints, Oracle partitioning, and various index and join types.
  • Experience in Performance Tuning of sources, targets, mappings, transformations and sessions.
  • Experienced in writing UNIX shell scripts to schedule Informatica sessions automatically (see the shell sketch at the end of this summary).
  • Understanding & working knowledge of Informatica CDC (Change Data Capture).
  • Experienced with Informatica Data Explorer (IDE) / Informatica Data Quality (IDQ) tools for Data Analysis / Data Profiling and Data Governance.
  • Hands-on experience with Informatica PowerExchange for loading/retrieving data from mainframe systems.
  • Experience in working with COBOL files, XML, and Flat Files.
  • Proficient knowledge in ETL using SQL Server Integration Services (SSIS).
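
For illustration, here is a minimal sketch of the kind of UNIX shell wrapper used to schedule an Informatica workflow with pmcmd, as referenced above; the integration service, domain, folder, workflow names and mail address are hypothetical placeholders, not actual project values:

    #!/bin/ksh
    # run_wf_daily.sh - start a PowerCenter workflow via pmcmd and report the outcome.
    # All connection values below are illustrative placeholders.
    INFA_USER=etl_user
    INFA_PWD='XXXXXXXX'                # in practice, read from a secured parameter file
    DOMAIN=Domain_DEV
    INT_SVC=IS_DEV
    FOLDER=EDW_LOADS
    WORKFLOW=wf_daily_load

    pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
          -u "$INFA_USER" -p "$INFA_PWD" \
          -f "$FOLDER" -wait "$WORKFLOW"

    if [ $? -ne 0 ]; then
        echo "Workflow $WORKFLOW failed" | mailx -s "ETL FAILURE" etl_support@example.com
        exit 1
    fi
    echo "Workflow $WORKFLOW completed successfully"

A wrapper like this is typically driven by cron or an enterprise scheduler, so the -wait exit code can feed alerting and downstream dependencies.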

TECHNICAL SKILLS

ETL Tools: Informatica 8.6/8.5/8.1/7.1/6.2 (PowerCenter/PowerMart) (Designer, Workflow Manager, Workflow Monitor, Server Manager).

Databases: Oracle 10g/9i/8i, MS SQL Server 2005/2000, Sybase, Teradata.

Database Skills: Cursors, Stored Procedures, Functions, Views, Triggers, and Packages

Client-Side Skills: SQL, T-SQL, PL/SQL, UNIX shell scripting, Java, HTML, XML, CSS, JavaScript, C, C++, VB 6.0

Methodologies: Data Modeling (Logical/Physical), Star/Snowflake Schema, FACT & Dimension Tables, ETL, OLAP, Agile, RUP, Software Development Life Cycle (SDLC)

BI Tools: Cognos BI Suite, SSRS, Business Objects 6.5

Tools: Toad, Erwin, Rational Rose, MS Project, JIRA, Test Director, TestTrack Pro, MS Visio, Autosys, Turbo Data Generator, Advance Data Generator.

PROFESSIONAL EXPERIENCE

Confidential, Harrisburg, PA

ETL Developer

Responsibilities:

  • Mapped business requirements to source systems to determine which data elements would be extracted.
  • Created logical and physical models of the star and snowflake schemas based on the business rules in Erwin.
  • Developed the logical model by iteratively refining it based on user feedback to match analysis and reporting requirements.
  • Worked extensively in the development of Extract, Transformation and Load (ETL) routines.
  • Extensively used Informatica partitioning (pass-through, key range, hash, and round-robin).
  • Used Teradata as a source and a target for a few mappings; worked with Teradata loaders within Workflow Manager to configure FastLoad and MultiLoad sessions.
  • Coordinated between the onsite and offshore ETL teams as a lead offshore developer to meet scheduled project milestones and deadlines.
  • Heavily involved with performance tuning of the Oracle database: using the TKPROF utility, working with partitioned tables, implementing a layer of materialized views to speed up lookup queries, using the DBMS_STATS package to update statistics, and applying SQL hints, Oracle partitioning (range/hash/list), indexes (bitmap, B-tree, reverse key, etc.) and various join types (hash join, sort merge, nested loops) to improve performance.
  • Collected, normalized and integrated source data from the operational databases, as well as from third-party hospital, laboratory and physician disease reports.
  • Involved in working with Facets 4.48 and 4.51 and different EDI transaction files such as 837, 834, 835, 270, and 271 to understand the source structure and the source data patterns.
  • Worked closely with the data analyst on ICD-9/4010 data analysis to define the structure of dimensions and facts.
  • Created mappings using various active and passive transformations such as Source Qualifier, Lookup, Router, Stored Procedure, Aggregator, Filter, Joiner, Expression, Normalizer, Java and SQL, as well as reusable mappings in Informatica.
  • Documented all the mappings and the transformations involved in the ETL process; unit and integration tested Informatica sessions, batches, workflows and the target data.
  • Implemented optimization and performance tuning of mappings to improve response times.
  • Developed shell scripts to automate file manipulation and data loading procedures (see the sketch after this list).
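
As referenced in the last bullet above, a simplified sketch of this type of file-handling script is shown below; the directories, file pattern and fixed staging name are hypothetical examples, not the actual project values:

    #!/bin/ksh
    # prep_feed.sh - validate the newest incoming flat file, stage it under the fixed
    # name the Informatica session expects, and archive the original.
    # Paths and file names are illustrative only.
    INBOX=/data/edw/inbox
    STAGE=/data/edw/srcfiles
    ARCHIVE=/data/edw/archive

    FEED=$(ls -t $INBOX/claims_feed_*.dat 2>/dev/null | head -1)

    # Fail fast if no feed has arrived or the newest one is empty
    if [ -z "$FEED" ] || [ ! -s "$FEED" ]; then
        echo "$(date): no usable feed found in $INBOX" >&2
        exit 1
    fi

    # Strip Windows carriage returns and stage the file under the fixed name the session reads
    tr -d '\r' < "$FEED" > "$STAGE/claims_feed.dat"

    # Keep the original, timestamped, so reruns never overwrite history
    mv "$FEED" "$ARCHIVE/$(basename "$FEED" .dat)_$(date +%Y%m%d%H%M%S).dat"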

Environment: Informatica PowerCenter 8.6, Oracle 10g, SQL Server, DB2, MS Access 2000, SQL, PL/SQL, Teradata, MS SQL Server 2000, Erwin, Business Objects, Windows NT.

Confidential, Grand Rapids, MI

ETL Developer

Responsibilities:

  • Worked heavily with the Business Analyst to understand the business and to design the architecture of the data flow.
  • Worked on IDQ (Informatica Data Quality) file configuration on users' machines and resolved the issues.
  • Created mappings to cleanse the data and populate it into staging tables, to populate the Enterprise Data Warehouse by transforming the data to meet business needs, and to populate the Data Mart with only the required information.
  • Used transformations such as Source Qualifier, Expression, Filter, Lookup, Sequence Generator, Router, Aggregator and Normalizer, and used SQL overrides extensively.
  • Created reusable transformations to clean the data, which were used in several mappings.
  • Worked on PowerExchange to create data maps, pull the data from the mainframe, and transfer it into the staging area.
  • Developed and tested stored procedures, functions and packages in PL/SQL.
  • Extracted data from various sources such as MS SQL Server 2005, DB2, flat files, Excel spreadsheets, Oracle and XML files and loaded the data into the Oracle database.
  • Involved in production support by performing normal loads, bulk loads, initial loads, incremental loads, daily loads and monthly loads.
  • Designed and implemented stored procedures and triggers for automating tasks in SQL Server 2008/Oracle 10g.
  • Developed flowcharts for complex stored procedures.
  • Created and executed workflows and worklets using Workflow Manager to load the data into the Oracle database.
  • Responsible for monitoring, scheduling, and rerunning completed and failed sessions; involved in debugging the failed mappings and developing error-handling methods.
  • Worked with the Informatica Administrator to set up project folders in the development, test and production environments.
  • Wrote UNIX shell scripts for Informatica pre-session and post-session commands, and AutoSys scripts for scheduling the jobs (workflows); see the sketch after this list.
  • Extensively worked on HIPAA 4010 and HIPAA 5010 transactions as source data, such as 837, 837P, 835, 277, 276 and more.
  • Involved in analyzing ICD-9 and ICD-10 for data mapping from ICD-9 to ICD-10 and from ICD-10 to ICD-9 at the source and target levels.
  • Worked with the Quality Assurance team to build test cases for unit, integration, functional and performance testing.
  • Performed migration of mappings and workflows from the Development server to the Test and Production servers.
  • Used Informatica Version Control to check in all versions of the objects used in creating the mappings and workflows, keeping track of changes across the development, test and production environments.
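
The pre-session shell work mentioned above followed a pattern along these lines; this is only a sketch, and the connect string and staging table name are placeholders rather than the real ones:

    #!/bin/ksh
    # pre_session_stage.sh - pre-session command: clear the staging table before the load runs.
    # The connect string and table name are placeholders.
    ORA_CONN='etl_user/XXXXXXXX@EDWDEV'   # credentials would normally come from a secured file

    # "whenever sqlerror exit failure" makes SQL*Plus return a non-zero code on any SQL error
    printf 'whenever sqlerror exit failure\ntruncate table stg_claims;\nexit\n' \
        | sqlplus -s "$ORA_CONN"

    if [ $? -ne 0 ]; then
        echo "$(date): failed to truncate stg_claims" >&2
        exit 1                            # a non-zero return makes the calling session fail fast
    fi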

Environment: Informatica PowerCenter 8.1, PowerExchange 8.6.1, IDQ, Oracle 10g, SQL Server 2005, Mainframes, SSIS, PL/SQL, SQL, DB2, UNIX, Windows Server 2003 R2, Cognos, Shell scripts.

Confidential, MA

Informatica Consultant

Responsibilities:

  • Extensively involved in interacting with Project coordinators, Business Analysts and End-users to gather requirements.
  • Prepared the ETL spreadsheet after identifying the Facts and Dimensions.
  • Designed the Star Schema after identifying the Facts and Dimensions.
  • Prepared the Fact Dimension Matrix spreadsheet prior to the development of the RPD for the required reports.
  • Used the Informatica PowerCenter Designer to develop the required mapping based on the ETL spreadsheet.
  • Used the Informatica PowerCenter Workflow Manager to develop the required sessions for the mappings developed based on the ETL spreadsheet.
  • Used the Workflow Manager to create tasks, sessions, worklets and batches.
  • Debugged the failed mappings.
  • Implemented the best practices for creating the mappings, sessions and workflows.
  • Identified bottlenecks in the ETL process and edited/corrected the mappings and sessions to improve ETL performance and achieve the required performance; the full ETL run used to take 18 hours to complete and was fine-tuned down to 12 hours.
  • Analyzed and performed performance tuning of Oracle 10g using the SQL optimizer and created several indexes to improve its performance.
  • Unit tested the mappings before including the developed sessions into the already existing batch.
  • Integration tested the entire batch after including the individual sessions into the existing batch.
  • Involved in configuring new jobs and importing the newly developed Informatica sessions into DAC.

Environment: Informatica PowerCenter 7.1/8.1, OBIEE 10.1.3, DAC, Oracle 9i/10g, Toad, MS SQL Server 2005, Flat files, UNIX, Windows XP, Microsoft Word, Microsoft Excel, PowerPoint.

Confidential, Raleigh, NC

ETL/Informatica Developer

Responsibilities:

  • Analyzed business process workflows and developed ETL procedures to move data from various source systems to target systems.
  • Developed database schemas such as star and snowflake schemas used in relational, dimensional and multidimensional data modeling using Erwin.
  • Involved in design using Erwin Data Modeler and in the development of Informatica mappings using transformations such as Source Qualifier, Aggregator, connected and unconnected Lookups, Filter, Update Strategy, Rank, Stored Procedure, Expression and Sequence Generator, as well as reusable transformations.
  • Created PL/SQL procedures to transform data from staging into the Data Warehouse fact and summary tables.
  • Extensively used stored procedures, functions and packages written in PL/SQL when creating Stored Procedure transformations.
  • Developed, modified and tested UNIX shell scripts and the necessary test plans to ensure the successful execution of the data loading process.
  • Created various UNIX shell scripts for automation of events, file validation, and file archiving.
  • Developed scripts using the pmcmd command to run the workflows and sessions from the UNIX environment, which are called from AutoSys jobs (see the sketch after this list).
  • Developed a batch file to automate the task of executing the different workflows and sessions associated with the mappings on the development server.
  • Involved in testing, debugging, data validation and performance tuning of the ETL process, and helped develop optimum solutions for data warehouse deliverables.
  • Performed tuning of Informatica mappings for large data files by managing block sizes, data cache sizes, sequence buffer lengths and the commit interval.
  • Created unit test plans, Test cases and reports on various test cases for testing the Informatica mappings.
  • Worked on integration testing and regression testing to verify load order, time window and lookup with full load.
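
For reference, the pmcmd-driven scripts mentioned above followed roughly this pattern; the connect options, folder and workflow names are placeholder values, and the exact pmcmd connection flags differ between PowerCenter releases (host:port style on 7.x versus service/domain style on 8.x):

    #!/bin/ksh
    # run_batch.sh - run a list of workflows in sequence and stop at the first failure.
    # Invoked from an AutoSys job; every name below is a placeholder.
    CONNECT_OPTS="-s pcserver:4001 -u etl_user -p XXXXXXXX"   # adjust flags to the PowerCenter version
    FOLDER=EDW
    WF_LIST="wf_load_dims wf_load_facts wf_load_summary"

    for WF in $WF_LIST
    do
        echo "$(date): starting $WF"
        pmcmd startworkflow $CONNECT_OPTS -f "$FOLDER" -wait "$WF"
        if [ $? -ne 0 ]; then
            echo "$(date): $WF failed; aborting batch" >&2
            exit 1                # a non-zero exit puts the AutoSys job into FAILURE
        fi
    done
    echo "$(date): batch completed"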

Environment: Informatica PowerCenter 7.1, Erwin, Oracle 9i, SQL Server, Oracle Forms 6i, DB2 UDB, PL/SQL, UNIX, TOAD, AutoSys, Microsoft VSS.

Confidential

Database Developer

Responsibilities:

  • Actively involved in gathering requirements from Business Users, and converting them into system requirement specifications and creating detailed use-cases and design documents
  • Designed, developed, and managed the workflow processes to reflect business requirements with several adapters, exceptions and rules.
  • Was involved in data modeling. Designed data flows using UML.
  • Designed and developed User Group Management modules to implement complex business rules for permissions.
  • Coordinated in setting up the development, test, production and contingency environment.
  • Designed, developed, and managed the database star schema, with various hierarchical and lookup tables.
  • Developed and maintained complex stored procedures.
  • Involved in setting up of application server clustered environment.
  • Underwent training in standard software processes as part of implementing CMM Level 5 in an enterprise organization.

Environment: Oracle 8i, Shell Scripts, UML, Test Director, SunOS
