
Sr. ETL Developer Resume

Los Angeles, CA

SUMMARY

  • Ten (10)+ years of IT experience with extensive Data Warehousing implementations across the Financial, Insurance, Healthcare and Automotive industries.
  • Experience in Information Technology with a strong background in Data Warehousing using Informatica PowerCenter 9.x/8.x/7.x and IDQ (Informatica Data Quality).
  • Strong knowledge of Entity-Relationship concepts, fact and dimension tables, Slowly Changing Dimensions (SCD) and Dimensional Modeling (Kimball/Inmon methodologies, Star Schema and Snowflake Schema).
  • Experience in all phases of the data warehouse life cycle, including Requirement Analysis, Design, Coding, Testing and Deployment.
  • Good understanding of software development processes such as Waterfall and Agile within the SDLC (Software Development Life Cycle).
  • Expertise in business model development with Dimensions, Hierarchies, Measures, Partitioning and Aggregation Rules.
  • Experience in creating High-Level Design and Detailed Design documents in the design phase.
  • Experience in integrating various data sources with multiple relational databases such as Oracle, DB2, SQL Server and MS Access, and with non-relational sources such as flat files (fixed-width, delimited).
  • Solid understanding of OLAP concepts and challenges, especially with large data sets.
  • Well versed in OLTP Data Modeling and Data Warehousing concepts.
  • Extensive knowledge of data cleansing and query performance processing using containers, lookups, Derived Column, the DQS knowledge base, etc. in SSIS packages.
  • Experience with Informatica Designer components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
  • Developed complex mappings using different transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected/Connected Lookup, Normalizer and Aggregator.
  • Experience in creating Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and mappings using Informatica Designer, and in processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Experience in creating Reusable Tasks (Session, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
  • Extensively used Informatica Repository Manager (admin and deployment tasks) and Workflow Monitor.
  • Experience in debugging mappings; identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Experience in using the Informatica command-line utilities, such as pmcmd, to execute workflows in non-Windows environments.
  • Experienced with Informatica Data Quality (IDQ) tools for Data Analysis/Data Profiling and Data Governance.
  • Performed data profiling, standardization and deduplication using the Informatica Data Quality (IDQ) product.
  • Implemented batch and real-time Account, Contact and Prospect data cleansing and de-duplication techniques within IDQ.
  • Experience in UNIX shell scripting, FTP and file management in various UNIX environments.
  • Worked on performance tuning, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings and sessions.
  • Involved in unit testing to verify that data loads into the targets are accurate.
  • Extensive experience with database languages such as SQL and PL/SQL, including writing triggers, stored procedures, functions, views and cursors.
  • Experience with the scheduling tools Control-M, CA Autosys and CA-7.
  • Excellent analytical and communication skills, with the ability to work in a team and communicate effectively at all levels of the development process.
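
The pmcmd usage mentioned above typically takes the form of a small launcher script. The sketch below is illustrative only: the folder, workflow, domain and credential names are hypothetical, and `pmcmd` is stubbed with a shell function so the example is self-contained; on a real Integration Service host the stub would be removed.

```shell
#!/bin/sh
# Illustrative pmcmd wrapper; all names below are hypothetical.
# Stub pmcmd so the sketch runs without an Informatica server.
pmcmd() { echo "pmcmd $*"; return 0; }

INFA_SERVICE="IS_PROD"            # Integration Service (hypothetical)
INFA_DOMAIN="Domain_PROD"         # Informatica domain (hypothetical)
INFA_USER="etl_user"              # placeholder credentials
INFA_PASSWD="changeit"
FOLDER="DW_LOADS"
WORKFLOW="wf_daily_sales_load"

# -wait blocks until the workflow finishes, so the shell exit status
# can drive a downstream scheduler such as Control-M or Autosys.
pmcmd startworkflow -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PASSWD" \
      -f "$FOLDER" -wait "$WORKFLOW"
rc=$?
if [ "$rc" -ne 0 ]; then
    echo "Workflow $WORKFLOW failed with return code $rc" >&2
    exit "$rc"
fi
echo "Workflow $WORKFLOW completed successfully"
```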

TECHNICAL SKILLS

Data Warehousing/ETL Tools: Informatica PowerCenter 9.6.1/9.5.1/8.x/7.x, Informatica PowerExchange 9.x, Informatica Data Quality 9.x

Databases: Oracle 11g/10g, IBM DB2 v9.5, MS SQL Server, Teradata V2R12/V2R6

Data Modeling: Dimensional Modeling, Star Schema Modeling, Snowflake Modeling

Reporting Tools: OBIEE 11.1.1.6

Programming Skills: SQL, PL/SQL, UNIX Shell Scripting (K-Shell)

Operating Systems: UNIX, Windows

Scheduling Tools: BMC Control-M, CA Autosys, CA-7

Other Tools: PuTTY, Teradata SQL Assistant 12.0, TOAD, DB2 Command Editor

PROFESSIONAL EXPERIENCE

Confidential, Los Angeles, CA

Sr. ETL Developer

Responsibilities:

  • Worked in the complete software development life cycle, including design, implementation, deployment and production support.
  • Conducted JAD sessions with SMEs and business users for a better understanding of the requirements.
  • Reviewed business requirements and developed technical and functional specifications within time and cost constraints.
  • Interfaced with users and the project manager to ensure that implemented solutions satisfy business requirements and are delivered in a timely manner.
  • Worked with the BA/BSA and Data Modeler/Architect teams in designing the Data Model for the project.
  • Prepared Technical Requirements Documents along with Design and Mapping documents.
  • Designed, developed and tested software applications and systems according to project requirements using tools and technologies such as Informatica PowerCenter, PowerExchange, Oracle, DB2, SQL Server, UNIX, SQL and PL/SQL.
  • Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 and loaded data into dimension and fact tables.
  • Used Informatica PowerCenter 9.5.1 to create mappings, mapplets and user-defined functions, including the MD5 function.
  • Developed SSIS packages to extract data from various data sources such as Excel and flat files into SQL Server using transformations such as Derived Column, Sort and Aggregate.
  • Implemented batch and real-time Account, Contact and Prospect data cleansing and de-duplication techniques within IDQ.
  • Utilized IDQ to build mapplets, mappings, profiles and scorecards.
  • Performed data profiling, standardization and deduplication using the Informatica Data Quality (IDQ) product.
  • Used Fuzzy Lookup, Derived Column, Conditional Split, Aggregate, Lookup, Execute SQL Task, Merge Join, etc. when creating SSIS packages.
  • Monitored SSIS package jobs and optimized the jobs for maximum efficiency.
  • Created workflows with worklets, sessions, Event Wait, Decision, Email, Control and Command tasks using Workflow Manager and monitored them in Workflow Monitor.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update Strategy and Router transformations to populate target tables efficiently.
  • Created reusable components such as mapplets, UDFs (User Defined Functions), lookups, worklets and tasks.
  • Used transformations such as Source Qualifier, Expression, Aggregator, Joiner, Dynamic Lookup, Union, Router, Update Strategy, Sequence Generator, Normalizer and Filter when creating mappings.
  • Created pre-/post-session SQL commands in sessions and mappings on the target instance.
  • Filled in process-related documents such as estimates, PD, SEMR and unit test exit reports.
  • Created UNIX shell scripts for FTP/MFT, error handling, file checks, error reports, parameter files, and zip/unzip of source/target files.
  • Performed data integration and unit testing, and worked closely with the offshore testing team in preparing test cases for System Integration testing.
  • Involved in debugging and troubleshooting Informatica mappings.
  • Resolved production failures by debugging mappings and sessions/workflows and by checking the data after going through the session/workflow logs.
  • Worked on SSIS migrations to migrate DTS packages across environments.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Performed code reviews to ensure that ETL development was done according to the company's ETL standards and that ETL best practices were followed.
  • Worked on performance tuning of lengthy SQL and fine-tuning at the mapping/workflow level by investigating source-, target- and session-level bottlenecks.
  • Involved in 24x7 on-call ETL (Informatica & SSIS) production support, which included deployment/migration tasks, folder creation, user access, server maintenance and clean-up of files on the server.
  • Involved in ETL team meetings to discuss project status and the latest ETL techniques.
  • Used tools such as IBM DB2 Command Editor, WinSQL, MS SQL Studio, Toad, PuTTY and WinSCP for UNIX.
  • Involved in taking backups of Informatica and migrating the code from the development to the testing environment.
  • Scheduled jobs using Control-M in production.

Environment: Informatica PowerCenter 9.6.1/9.5.1, Informatica Data Quality (IDQ), SQL, PL/SQL, SQL Server 2012/2008, Oracle 11g/10g, IBM DB2 v9.5, UNIX Shell Scripting, Control-M Scheduler, WinSQL, PuTTY, TOAD 9.7.2, SQL Server Studio, UNIX, XML.
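
The SCD Type 1/Type 2 loads described above follow a close-and-insert pattern. The SQL sketch below is illustrative only: the dimension, staging and sequence names are hypothetical, and the actual implementation used Informatica mappings rather than hand-written SQL.

```sql
-- Step 1: close out current rows whose attributes changed (Type 2).
UPDATE dim_customer d
   SET d.eff_end_date = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.cust_name <> d.cust_name
                       OR s.cust_addr <> d.cust_addr));

-- Step 2: insert new versions for changed rows and brand-new customers.
INSERT INTO dim_customer
       (customer_key, customer_id, cust_name, cust_addr,
        eff_start_date, eff_end_date, current_flag)
SELECT seq_dim_customer.NEXTVAL, s.customer_id, s.cust_name, s.cust_addr,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y');
```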

Confidential, Jacksonville, FL

Sr. ETL Developer

Responsibilities:

  • Analyzed source systems and worked with business analysts to identify, study and understand requirements and translate them into ETL code.
  • Performed analysis on the quality and source of data to determine the accuracy of information being reported, and analyzed data relationships between systems.
  • Worked on the complete life cycle of Extraction, Transformation and Loading of data using Informatica.
  • Involved in data modeling and database design of the new P&C Legal management database, applying the Ralph Kimball methodology of dimensional modeling and using Erwin for data modeling.
  • Prepared high-level design documents for extracting data from complex relational database tables, data conversion, transformation and loading into specific formats.
  • Designed and developed mappings using various transformations to suit business user requirements and business rules, loading data from Oracle, SQL Server, flat file and XML file sources into views (views on the target tables) in the target database (Oracle).
  • Worked with the Informatica PowerCenter tools - Source Analyzer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Developed standard and reusable mappings and mapplets using various transformations such as Expression, Lookup (Connected and Unconnected), Joiner, Filter, Source Qualifier, Sorter, Update Strategy and Sequence Generator.
  • Configured the IDQ baseline architecture, including installation and configuration of IDQ.
  • Provided demos on the capabilities of the IDQ tool - Analyst Profiling, Address Validation, Identity Match, etc.
  • Worked with Business and IT teams to educate them on the Data Quality process and showcase IDQ capabilities in developing an enterprise-wide Data Quality solution.
  • Developed PL/SQL stored procedures for Informatica mappings.
  • Created sessions and workflows to load data from the SQL Server, flat file and XML file sources that exist on servers located at various locations all over the country.
  • Scheduled the workflows to pull data from the source databases at weekly intervals, to maintain the most current and consolidated data for P&C Legal management reporting.
  • Involved in unit testing, integration testing and user acceptance testing of the mappings.
  • Used various performance enhancement techniques to improve the performance of the sessions and workflows.
  • Performed performance tuning on sources, targets and mappings, as well as SQL (optimization) tuning.
  • Responsible for creating business solutions for incremental and full loads.
  • Actively participated in problem solving and troubleshooting for the applications implemented with Informatica.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update Strategy and Router transformations to populate target tables efficiently.
  • Performed unit testing and worked closely with the offshore testing team in preparing ETL Test Plans, Test Cases and Test Scripts for unit and System Integration testing.
  • Used Workflow Manager to create the sessions that transport the data to the target warehouse.
  • Involved in team meetings and project meetings to discuss project status and technical issues.
  • Used UNIX shell scripting to generate user IDs.
  • Scheduled Informatica jobs in production using Control-M V8.

Environment: Informatica PowerCenter 9.1/8.6.1, Informatica PowerExchange 9.1, Informatica Data Quality 9.1, SQL, PL/SQL, SQL Server 2008, IBM DB2 v9.5, UNIX Shell Scripting, Control-M, Informatica Scheduler, WinSQL, PuTTY, TOAD 8.0, SQL Server Studio, UNIX, WinSCP.

Confidential, Wilmington, DE

Sr. ETL Developer

Responsibilities:

  • Conducted JAD sessions with SMEs and business users for a better understanding of the requirements.
  • Prepared Technical Requirements Documents along with Design and Mapping documents.
  • Created data maps in Informatica PowerExchange using COBOL copybooks for source files landed in a mainframe location.
  • Worked with the BA/BSA and Data Modeler/Architect teams in designing the Data Model for the project.
  • Used tools such as IBM DB2 v9.5, IBM Data Studio and Toad, as the target database is DB2.
  • Used Informatica PowerCenter to create mappings, mapplets, user-defined functions, workflows, worklets, sessions and tasks.
  • Migrated data from Oracle, flat files and Excel files into the staging area in MS SQL Server, and designed ETL processes that span multiple projects.
  • Designed ETL processes using Informatica to load data from sources to targets through data transformations.
  • Designed an ETL system by creating refresh mappings and workflows for the Siebel-OBIEE interface to load daily data into the corresponding dimension and fact tables.
  • Involved in maintaining Repository Manager - creating repositories, user groups and folders, and migrating code from Dev to Test and Test to Prod environments.
  • Performed load balancing of ETL processes, database performance tuning and capacity monitoring.
  • Involved in the creation of SQL, packages, functions, procedures, views and database triggers.
  • Involved in data validation, data integrity, DB-related performance, field size validations, check constraints, and data manipulation and updates using SQL single-row functions.
  • Designed and developed ODS-to-Data-Mart mappings, sessions and workflows.
  • Created various Oracle database objects such as indexes, stored procedures, materialized views, synonyms and functions for data import/export.
  • Created reusable mapplets, worklets and workflows.
  • Used TOAD and MS SQL Server to run SQL queries and validate the data in the warehouse and mart.
  • Involved in debugging and troubleshooting Informatica mappings.
  • Populated error tables as part of the ETL process to capture the records that failed the migration.
  • Automated and scheduled loads for batch and regular files.
  • Extensively worked on the STAGING, BASE and AGG levels and moved data successfully from one environment to another.
  • Developed test cases for unit, integration and system testing.
  • Partitioned the sessions for better performance.
  • Trained end users in using the full OBIEE client for analysis and reporting.
  • Produced extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Extensively used the Informatica client tools Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Repository Manager and Workflow Manager.
  • Designed and developed ETL routines using Informatica PowerCenter; within the Informatica mappings, extensively used Lookup, Aggregator, Java, XML and Rank transformations, mapplets, connected and unconnected stored procedures/functions/lookups, SQL overrides in lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update Strategy and Router transformations to populate target tables efficiently.
  • Created Type 1 and Type 2 SCD mappings.
  • Created UNIX shell scripts for FTP/MFT, error handling, error reports, parameter files, etc.
  • Created stored procedures, packages and triggers for complex requirements.
  • Experience in writing and optimizing SQL code across different databases.
  • Executed sessions, both sequential and concurrent, for efficient execution of mappings, and used other tasks such as Event Wait, Event Raise, Email, Command and pre/post SQL.
  • Used the command-line program pmcmd to run Informatica jobs from the command line, and used these commands in shell scripts to create, schedule and control workflows, tasks and sessions.
  • Performed unit testing and worked closely with the offshore testing team in preparing ETL Test Plans, Test Cases and Test Scripts for unit and System Integration testing.
  • Involved in team meetings and project meetings to discuss project status and technical issues.
  • Worked with the scheduling team in creating and scheduling jobs in Control-M.

Environment: Informatica PowerCenter 8.6.1/8.1.1, Oracle 11g, SQL, PL/SQL, SQL Server 2008, OBIEE Platform 10.1.3.3, UNIX Shell Scripting, Autosys Workload Scheduler, TOAD 9.7.2, SQL Server tool, UNIX.
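
The file-check and error-handling shell scripting described above follows a validate-then-archive pattern. The sketch below is illustrative: the landing directory and file names are hypothetical, a temporary directory stands in for the real landing area, and gzip stands in for the zip step to keep the example portable and self-contained.

```shell
#!/bin/sh
# Illustrative file-check/error-handling wrapper; names are hypothetical.
LANDING=$(mktemp -d)              # stand-in for the real landing area
ARCHIVE="$LANDING/archive"
mkdir -p "$ARCHIVE"

# Simulate an inbound source file so the sketch runs on its own.
SRC_FILE="$LANDING/customers_20240101.dat"
printf '1|ACME\n2|GLOBEX\n' > "$SRC_FILE"

fail() { echo "ERROR: $1" >&2; exit 1; }

# File checks: the file must exist and be non-empty before the load runs.
[ -f "$SRC_FILE" ] || fail "source file missing: $SRC_FILE"
[ -s "$SRC_FILE" ] || fail "source file empty: $SRC_FILE"

RECORDS=$(wc -l < "$SRC_FILE")
echo "Validated $SRC_FILE ($RECORDS records)"

# Archive the processed file (gzip here; zip in the real process).
gzip -c "$SRC_FILE" > "$ARCHIVE/$(basename "$SRC_FILE").gz" \
    || fail "archive step failed"
echo "Archived to $ARCHIVE"
```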

Confidential

ETL Developer

Responsibilities:

  • Involved in requirements gathering and analysis, and in creating high-level design documents and application design documents.
  • Used various transformations such as Source Qualifier, Filter, Aggregator, Expression, Lookup, Sequence Generator, Joiner, Union, Router, Sorter and Update Strategy to create mappings.
  • Created reusable transformations, mappings, mapplets, various tasks (Session, Command, Email, Timer, Control, Event Wait, Event Raise, Assignment), worklets and workflows.
  • Involved in code reviews and scheduling measures, and performed unit testing for the developed mappings.
  • Performed FPA analysis, filled in the process documents according to company standards, and calculated the estimated hours for the work.
  • Used Informatica Designer to design mappings and mapplets that extract data from SQL Server, Sybase and Oracle sources.
  • Created different parameter files and changed session parameters, mapping parameters and variables at run time.
  • Extensively used the Source Qualifier transformation to filter data at the source level rather than at the transformation level. Created different transformations such as Source Qualifier, Joiner, Expression, Aggregator, Rank, Lookup, Filter, Stored Procedure, Update Strategy and Sequence Generator.
  • Used the Debugger to test the data flow and fix the mappings.
  • Partitioned sessions for concurrent loading of data into the target tables.
  • Tuned the workflows and mappings.
  • Involved in identifying bottlenecks and tuning to improve performance.
  • Created workflows using Workflow Manager for different tasks, such as sending email notifications, timers that trigger when an event occurs, and sessions to run a mapping.
  • Executed workflows and sessions using Workflow Monitor.
  • Dealt with data issues in the staging flat files; after the data was cleaned up, it was sent to the targets.
  • Actively coordinated with the testing team in the testing phase and helped the team understand the dependency chain of the whole project.
  • Executed workflows using the pmcmd command in UNIX.
  • Participated in daily status calls with the client, distributed work to the team, and reviewed the code developed by the team.
  • Participated in weekly and monthly meetings with project managers to discuss work status and required process documents.
  • Experienced in coordinating the team and resolving the team's issues, both technically and functionally.

Environment: Informatica PowerCenter 8.6.1, UNIX, Oracle, VSAM, Flat files, HP Quality Center, CA-7, SQL Developer.
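
The parameter files mentioned above follow PowerCenter's section-per-session layout, where a header names the folder, workflow and session that the parameters apply to. The fragment below is illustrative only; the folder, workflow, session, connection and parameter names are hypothetical.

```ini
[DW_LOADS.WF:wf_daily_sales_load.ST:s_m_load_sales]
$DBConnection_Source=ORA_SRC_PROD
$DBConnection_Target=ORA_DW_PROD
$$LOAD_DATE=2024-01-01
$$SRC_FILE_DIR=/data/inbound/sales
```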

Confidential

ETL Developer

Responsibilities:

  • Involved in the development of Informatica mappings and mapplets, and tuned them for optimum performance, dependencies and batch design.
  • Implemented the extraction process of flat files into the target database.
  • Designed and developed the end-to-end ETL process from various source systems to the staging area, and from staging to the Data Marts.
  • Extensively worked with Informatica to extract data from flat files, Excel files and Oracle and load it into the target database.
  • Implemented the incremental loading of dimension and fact tables.
  • Created stored procedures for data transformation purposes.
  • Created tasks, workflows and sessions to move the data at specific intervals on demand using Workflow Manager.
  • Created PL/SQL stored procedures and implemented them through the Stored Procedure transformation.
  • Developed, tested and implemented break/fix change tickets for maintenance.
  • Migrated mappings from Development to Testing and from Testing to Production.
  • Used parameter files for multiple DB connections for the sources and for defining variable values.
  • Developed the documentation and plan for transforming data and loading the warehouse.
  • Analyzed the data from various sources for data quality issues.
  • Participated in all facets of the software development cycle, including providing input on requirement specifications, high-level design documents and user guides.
  • Wrote Oracle stored procedures and SQL scripts and called them at pre- and post-session.
  • Created concurrent tasks in the workflows using Workflow Manager and monitored jobs in Workflow Monitor.
  • Designed mapping templates to specify the high-level approach.
  • Involved in unit testing and documentation.

Environment: Informatica PowerCenter 7.1.4, UNIX, Oracle, Teradata, HP Quality Center, Teradata SQL Assistant

Confidential

ETL Developer

Responsibilities:

  • Developed ETL for data extraction, data mapping and data conversion using Informatica PowerCenter 8.6.0.
  • Used Informatica PowerCenter to extract the base tables in the data warehouse; the source databases include Oracle, flat files and SQL Server.
  • Involved in the design and development of complex ETL mappings and workflows in an optimized manner. Used PowerExchange for mainframe sources.
  • Used PowerExchange to source copybook definitions and then row-test data from data files and VSAM files.
  • Extensively worked on fixing poorly designed mappings and developed schedules to automate the Informatica workflows.
  • Developed Oracle PL/SQL, DDLs and stored procedures, and worked on performance and fine-tuning of SQL & PL/SQL stored procedures and ICD-10 codes.
  • Involved in performance tuning at various levels, including target, source, mapping and session, for large data files.
  • Prepared the documentation for the mappings and workflows.
  • Worked with Teradata utilities such as Teradata Parallel Transporter (TPT), BTEQ, FastLoad, MultiLoad and TPump. Worked with work tables, log tables and error tables in Teradata.
  • Created test plans and did unit testing for the Informatica mappings and stored procedures.
  • Involved in unit testing, integration testing and UAT by creating test cases and test plans, and helped the Informatica administrator deploy code across the Dev, Test and Prod repositories.
  • Worked on various data issues in the production environment.
  • Good experience using the Maestro scheduling tool.
  • Involved in production scheduling to set up jobs in order, and provided 24x7 production support. Involved in troubleshooting and resuming failed jobs.
  • Designed and developed logical and physical models to store data retrieved from other sources, including legacy systems.
  • Extensively used transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control and Stored Procedure.
  • Involved in the development of Informatica mappings and mapplets, and tuned them for optimum performance, dependencies and batch design.
  • Worked with pre- and post-sessions, and extracted data from the transaction system into the staging area. Knowledge of identifying fact and dimension tables.
  • Tuned sources, targets, mappings and sessions to improve the performance of data loads.
  • Worked on various Informatica tuning issues and fine-tuned the transformations to make them more efficient in terms of performance.
  • Implemented Slowly Changing Dimension methodology for accessing the full history.
  • Created several Informatica mappings to populate the data into dimension and fact tables.
  • Migrated repository objects, services and scripts from the development environment to the production environment.
  • Involved in unit testing and documentation.

Environment: Informatica PowerCenter 7.1.2, UNIX, Flat Files, Oracle 9i, SQL, CA-Autosys, SQL Plus, SQL Server.
