
Sr. ETL Developer Resume


Los Angeles, CA

SUMMARY

  • Ten (10)+ years of IT experience with extensive Data Warehousing implementations across the Financial, Insurance, Healthcare, and Automotive industries.
  • Experience in Information Technology with a strong background in Data warehousing using Informatica Power Center 9.x/8.x/7.x and IDQ (Informatica Data Quality).
  • Strong knowledge of Entity-Relationship concepts, fact and dimension tables, Slowly Changing Dimensions (SCD), and Dimensional Modeling (Kimball/Inmon methodologies, Star Schema and Snowflake Schema).
  • Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.
  • Good understanding of software development processes like Waterfall and Agile within the SDLC (Software Development Life Cycle).
  • Expertise in Business Model development with Dimensions, Hierarchies, Measures, Partitioning, Aggregation Rules.
  • Experience in creating High Level Design and Detailed Design in the Design phase.
  • Experience in integrating various data sources with multiple relational databases like Oracle, DB2, SQL Server, and MS Access, and non-relational sources like flat files (fixed-width, delimited).
  • Solid understanding of OLAP concepts and challenges, especially with large data sets.
  • Well versed in OLTP Data Modeling, Data warehousing concepts.
  • Extensive knowledge of data cleansing and query performance tuning using containers, lookups, derived columns, the DQS knowledge base, etc. in SSIS packages.
  • Experience with Informatica Designer Components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet and Mapping Designer.
  • Developed complex mappings using different transformations like Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected / Connected Lookups, Normalizer and Aggregator.
  • Experience in creating Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
  • Extensively used Informatica Repository Manager (Admin, Deployment tasks) and Workflow Monitor.
  • Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Experience in using Informatica command-line utilities like pmcmd to execute workflows in non-Windows environments.
  • Experienced with Informatica Data Quality (IDQ) tools for Data Analysis / Data Profiling and Data Governance.
  • Performed data profiling, standardization, and deduplication using Informatica Data Quality (IDQ) product.
  • Implemented batch and real-time Account, Contact, and Prospect data cleansing and de-duplication techniques within IDQ.
  • Experience in UNIX shell scripting, FTP and file management in various UNIX environments.
  • Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.
  • Involved in unit testing to verify that data loaded into targets was accurate.
  • Extensive experience with database languages such as SQL and PL/SQL which includes writing triggers, Stored Procedures, Functions, Views and Cursors.
  • Experience in Scheduling tools Control-M, CA-Autosys and CA-7.
  • Excellent analytical and communication skills; a strong team player, able to communicate effectively at all levels of the development process.
  • ITIL V3 Foundation certified.
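
The pmcmd usage noted in the bullets above can be sketched as a small wrapper script. This is an illustrative sketch, not code from any of the projects below; the domain, integration service, environment-variable, folder, and workflow names are placeholders.

```shell
#!/bin/sh
# Sketch: run an Informatica workflow from a non-Windows host via pmcmd.
# INT_SVC, INFA_DOMAIN, INFA_USER/INFA_PWD, and the folder/workflow names
# are placeholders -- substitute site-specific values.
PMCMD="${PMCMD:-pmcmd}"   # allow overriding the binary (e.g. for testing)

run_workflow() {
    wf_folder="$1"
    wf_name="$2"
    # -uv/-pv name environment variables that hold the credentials,
    # so passwords never appear on the command line
    "$PMCMD" startworkflow -sv INT_SVC -d INFA_DOMAIN \
        -uv INFA_USER -pv INFA_PWD \
        -f "$wf_folder" -wait "$wf_name"
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "Workflow $wf_name failed with exit code $rc" >&2
        return "$rc"
    fi
    echo "Workflow $wf_name completed successfully"
}
```

A scheduler such as Control-M or Autosys would typically call a wrapper like this rather than pmcmd directly, so failures surface as non-zero exit codes.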

TECHNICAL SKILLS

Data Warehousing/ETL Tools: Informatica PowerCenter 9.6.1/9.5.1/8.x/7.x, Informatica Power Exchange 9.x, Informatica Data Quality 9.x

Databases: Oracle 11g/10g, IBM DB2 v9.5, MS SQL Server, Teradata V2R12/V2R6

Data Modeling: Dimensional Modeling, Star Schema Modeling, Snow Flake Modeling

Reporting Tools: OBIEE 11.1.1.6

Programming Skills: SQL, PL/SQL, Unix Shell Scripting (K-Shell)

Operating Systems: UNIX, Windows

Scheduling Tools: BMC Control-M, CA-Autosys, CA-7

Other Tools: Putty, Teradata SQL Assistant 12.0, TOAD, DB2 Command

PROFESSIONAL EXPERIENCE

Confidential, Los Angeles, CA

Sr. ETL Developer

Responsibilities:

  • Worked in complete life cycle of software development process including designing, implementation, deployment and production support.
  • Conducted JAD sessions with SMEs and Business users for better understanding of the requirements.
  • Reviewed business requirements and developed technical and functional specifications within time and cost constraints.
  • Interfaced with users and the project manager to ensure that implemented solutions satisfied business requirements and were delivered in a timely manner.
  • Worked with BA/BSA, Data Modeler/Architect team in designing the Data Model for the project.
  • Prepared technical requirements documents along with design and mapping documents.
  • Designed, developed, and tested software applications and systems according to project requirements using various tools and technologies such as Informatica PowerCenter, Power Exchange, Oracle, DB2, SQL Server, UNIX, SQL, and PL/SQL.
  • Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 and loaded data into dimension and fact tables.
  • Used Informatica Power Center 9.5.1 to create mappings, mapplets, User defined functions, MD5 function.
  • Developed SQL SSIS Packages to extract data from various data sources such as Excel and flat files into SQL server using different transformations like derived column, sort, and aggregate transformations.
  • Implemented batch and real-time Account, Contact, and Prospect data cleansing and de-duplication techniques within IDQ.
  • Utilized IDQ to build Mapplets, Mappings, Profiles and Scorecards.
  • Performed data profiling, standardization, and deduplication using Informatica Data Quality (IDQ) product.
  • Extensively used Fuzzy Lookup, Derived Column, Conditional Split, Aggregate, Lookup, Execute SQL Task, Merge Join, etc. when creating SSIS packages.
  • Monitored SSIS package jobs and optimized the jobs for maximum efficiency.
  • Created Workflows with worklets, sessions, event wait, decision box, and email, control and command tasks using Workflow Manager and monitored them in Workflow Monitor.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update strategy and Router transformations for populating target table in efficient manner.
  • Created reusable components like Mapplets, UDF’s (User Defined Functions), Lookups, Worklets, and Tasks etc.
  • Used different transformations like Source Qualifier, Expression, Aggregator, Joiner, Dynamic Lookup, Union, Router, update Strategy, sequence generator, Normalizer and Filter etc. in creation of mappings.
  • Created Pre/Post Session/SQL commands in sessions and mappings on the target instance.
  • Filled in process-related documents such as estimates, PD, SEMR, unit test exit reports, etc.
  • Created Unix Shell scripts for FTP/MFT, Error handling, File Checks, Error reports, Parameter files, Zip/Unzip of Source/Target files etc.
  • Performed data integration, Unit testing and worked closely with offshore testing team in preparing test cases for System Integration testing.
  • Involved in Debugging and Troubleshooting Informatica mappings.
  • Resolved production failures by debugging mappings, sessions, and workflows, and by checking the data after going through the session/workflow logs.
  • Worked on SSIS migrations to migrate DTS packages across environments.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Performed Code review to ensure that ETL development was done according to the company’s ETL standard and that ETL best practices were followed.
  • Worked on performance tuning of lengthy SQL’s and fine tuning at mapping/workflow level by looking into Source/Target/Session level bottle necks.
  • Involved in 24X7 on call ETL (Informatica & SSIS) production support which involves deployment/migration tasks, folder creations, access to users, server maintenance, clean up of files on server.
  • Participated in ETL team meetings to discuss project status and the latest ETL techniques.
  • Used tools like IBM DB2 command editor, WINSQL, MS SQL Studio, Toad, Putty, and WINSCP for UNIX.
  • Took backups of Informatica repositories and migrated code from the development to the testing environment.
  • Scheduled jobs by using Control-M in Production.
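
The shell-script responsibilities listed above (file checks, error handling, archiving of source/target files) can be illustrated with a minimal sketch; the file names and archive path are hypothetical, not taken from the project.

```shell
#!/bin/sh
# Sketch: pre-load source-file checks and post-load archiving of the kind
# described above. Paths below are illustrative placeholders.

check_source_file() {
    src="$1"
    if [ ! -f "$src" ]; then
        echo "ERROR: source file $src not found" >&2
        return 1
    fi
    if [ ! -s "$src" ]; then
        echo "ERROR: source file $src is empty" >&2
        return 2
    fi
    echo "OK: $src ready for load"
}

archive_file() {
    src="$1"
    archive_dir="$2"
    mkdir -p "$archive_dir"
    # Timestamp and compress the processed file so reruns start clean
    gzip -c "$src" > "$archive_dir/$(basename "$src").$(date +%Y%m%d%H%M%S).gz" \
        && rm -f "$src"
}
```

Distinct exit codes for "missing" versus "empty" let the calling job (or scheduler) report the failure cause without parsing logs.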

Environment: Informatica PowerCenter 9.6.1/9.5.1, Informatica Data Quality (IDQ), SQL, PL/SQL, SQL Server 2012/2008, Oracle 11g/10g, IBM DB2 v9.5, UNIX Shell Scripting, Control-M Scheduler, WINSQL, Putty, TOAD 9.7.2, SQL Server Studio, UNIX, XML.

Confidential, Jacksonville, FL

Sr. ETL Developer

Responsibilities:

  • Analyzed source systems and worked with business analysts to identify, study, and understand requirements and translate them into ETL code.
  • Performed analysis on the quality and source of data to determine the accuracy of information being reported, and analyzed data relationships between systems.
  • Worked on complete life cycle from Extraction, Transformation and Loading of data using Informatica.
  • Involved in data modeling and database design of the new P&C Legal management database by applying Ralph Kimball methodology of dimensional modeling and using Erwin for data modeling.
  • Prepared high-level design document for extracting data from complex relational database tables, data conversions, transformation and loading into specific formats.
  • Designed and developed the Mappings using various transformations to suit the business user requirements and business rules to load data from Oracle, SQL Server, flat file and XML file sources targeting the views (views on the target tables) in the target database (Oracle).
  • Worked on Informatica PowerCenter tool - Source Analyzer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Developed standard and re-usable mappings and mapplets using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, SourceQualifier, Sorter, Update strategy and Sequence generator.
  • Configured the IDQ baseline architecture, including installation and configuration of IDQ.
  • Provided demos on the capabilities of the IDQ tool - Analyst Profiling, Address Validation, Identity Match, etc.
  • Worked with business and IT teams to educate them on the Data Quality process and showcase IDQ capabilities in developing an enterprise-wide Data Quality solution.
  • Developed PL/SQL stored procedures for Informatica mappings.
  • Created Sessions and Workflows to load data from the SQL server, flat file and XML file sources that exist on servers located at various locations all over the country.
  • Scheduled the workflows to pull data from the source databases at weekly intervals, to maintain most current and consolidated data for P&C Legal management reporting.
  • Involved in unit testing, Integration testing and User acceptance testing of the mappings.
  • Used various performance enhancement techniques to enhance the performance of the sessions and workflows.
  • Performance tuning on sources, targets mappings and SQL (Optimization) tuning.
  • Responsible for creating business solutions for Incremental and full loads.
  • Actively Participated in problem solving and troubleshooting for the applications implemented with Informatica.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update strategy and Router transformations for populating target table in efficient manner.
  • Performed Unit testing and worked closely with the offshore testing team in preparing ETL Test Plans, Test Cases, and Test Scripts for Unit and System Integration testing.
  • Used Workflows Manager to create the Sessions to transport the data to target warehouse.
  • Involved in team meetings and project meetings to discuss the project status and technical issues.
  • Used UNIX shell scripting to generate User-Id’s.
  • Scheduled Informatica jobs in production using Control-M V8.

Environment: Informatica PowerCenter 9.1/8.6.1, Informatica Power Exchange 9.1, Informatica Data Quality 9.1, SQL, PL/SQL, SQL Server 2008, IBM DB2 v9.5, UNIX Shell Scripting, Control-M, Informatica Scheduler, WINSQL, Putty, TOAD 8.0, SQL Server Studio, UNIX, WINSCP.

Confidential, Wilmington, DE

Sr. ETL Developer

Responsibilities:

  • Conducted JAD sessions with SME’s and Business users for better understanding of the requirements.
  • Prepared technical requirements documents along with design and mapping documents.
  • Created Data maps in Informatica Power Exchange using COBOL copy books for Source files which are landed in Mainframe location.
  • Worked with BA/BSA, Data Modeler/Architect team in designing the Data Model for the project.
  • Used tools like IBM DB2 v9.5, IBM Data Studio, and Toad, as the target database is DB2.
  • Used Informatica Power Center to create mappings, mapplets, User defined functions, workflows, worklets, sessions and tasks.
  • Migrated data from Oracle, flat files, and Excel files into the staging area in MS SQL Server, and designed ETL processes that span multiple projects.
  • Designed ETL process using Informatica Tool to load from Sources to Targets through data Transformations.
  • Designed an ETL system by creating refresh mappings and workflows for the Siebel-OBIEE interface to load daily data into the corresponding dimension and fact tables.
  • Involved in Maintaining the Repository Manager for creating Repositories, user groups, folders and migrating code from Dev to Test, Test to Prod environments.
  • Load balancing of ETL processes, database performance tuning and capacity monitoring.
  • Involved in Creation of SQL, Packages, Functions, Procedures, Views, and Database Triggers.
  • Involved in data validation, data integrity, DB-related performance, field size validations, check constraints, and data manipulation and updates using SQL single-row functions.
  • Designed and Developed ODS to Data Mart Mappings/Sessions/Workflows.
  • Created various Oracle database objects like Indexes, stored procedures, Materialized views, synonyms and functions for Data Import/Export.
  • Created reusable mapplets, worklets and workflows.
  • Used TOAD and MS SQL Server to run SQL queries and validate the data in the warehouse and mart.
  • Involved in Debugging and Troubleshooting Informatica mappings.
  • Populated error tables as part of the ETL process to capture the records that failed the migration.
  • Automated and scheduled loads for batch and regular files.
  • Extensively worked on STAGING, BASE, AGG levels and moved data successfully from one environment to other.
  • Developed test cases for Unit, Integration and system testing.
  • Partitioned the Sessions for better performance.
  • Trained end users in using full client OBIEE for analysis and reporting.
  • Extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Extensively used Informatica client tools Source Analyzer, Warehouse designer, Mapping Designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Workflow Manager.
  • Designed and developed ETL routines using Informatica PowerCenter, with extensive use of Lookup, Aggregator, Java, XML, and Rank transformations, mapplets, connected and unconnected stored procedures/functions/lookups, SQL overrides in lookups, source filters in Source Qualifiers, and Routers to manage data flow into multiple targets.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update strategy and Router transformations for populating target table in efficient manner.
  • Created Type1 and Type2 SCD mappings.
  • Created Unix Shell scripts for FTP/MFT, Error handling, Error reports, Parameter files etc.
  • Created Stored Procedures, Packages and Triggers for complex requirements.
  • Experience in writing and optimizing SQL code across different databases.
  • Executed sessions, both sequential and concurrent for efficient execution of mappings and used other tasks like event wait, event raise, email, command and pre/post SQL.
  • Used the command-line program pmcmd to run Informatica jobs, and used these commands in shell scripts to create, schedule, and control workflows, tasks, and sessions.
  • Performed Unit testing and worked closely with offshore testing team in preparing ETL Test Plans, Test Cases, Test Scripts for Unit and System Integration testing.
  • Involved in team meetings and project meetings to discuss the project status and technical issues.
  • Involved with Scheduling team in creating and scheduling jobs in Control-M.

Environment: Informatica PowerCenter 8.6.1/8.1.1, Oracle 11g, SQL, PL/SQL, SQL Server 2008, OBIEE Platform 10.1.3.3, UNIX Shell Scripting, Autosys Workload Scheduler, TOAD 9.7.2, SQL Server tool, UNIX.

Confidential

ETL Developer

Responsibilities:

  • Gathered and analyzed requirements; involved in creating high-level design documents and application design documents.
  • Used various transformations like Source Qualifier, Filter, Aggregator, Expression, Lookup, Sequence Generator, Joiner, Union, Router, Sorter and update Strategy to create mappings
  • Created reusable transformations, mappings, and mapplets, and various tasks like Session, Command, Email, Timer, Control, Event Wait, Event Raise, and Assignment, as well as worklets and workflows.
  • Involved in code reviews and scheduling measures, and performed unit testing for the developed mappings.
  • Performed FPA analysis, filled in process documents according to company standards, and calculated estimated hours for the work.
  • Used Informatica designer for designing mappings and mapplets to extract data from SQL Server, Sybase and Oracle sources.
  • Created different parameter files and changed Session parameters, mapping parameters, and variables at run time.
  • Extensively used Source Qualifier Transformation to filter data at Source level rather than at Transformation level. Created different transformations such as Source Qualifier, Joiner, Expression, Aggregator, Rank, Lookups, Filters, Stored Procedures, Update Strategy and Sequence Generator.
  • Used Debugger to test the data flow and fix the mappings.
  • Partitioned Sessions for concurrent loading of data into the target tables.
  • Tuned the workflows and mappings.
  • Involved in identifying the bottle necks and tuning to improve the Performance.
  • Created workflows using Workflow manager for different tasks like sending email notifications, timer that triggers when an event occurs, and sessions to run a mapping.
  • Executed Workflows and Sessions using Workflow Monitor.
  • Dealt with data issues in the staging flat files; once the data was cleaned up, it was sent to the targets.
  • Actively coordinated with testing team in the testing phase and helped the team to understand the dependency chain of the whole project.
  • Executed the workflow using pmcmd command in UNIX.
  • Participated in daily status calls with the client, distributed work to the team, and reviewed the code developed by the team.
  • Participated in weekly and monthly meetings with project managers to discuss work status and required process documents.
  • Experienced in coordinating the team and resolving the team's issues both technically and functionally.
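
The parameter files mentioned above (session parameters, mapping parameters, and variables changed at run time) follow a simple sectioned format. A minimal sketch, with placeholder folder, workflow, session, connection, and path names rather than actual project values:

```
[Global]
$PMSessionLogCount=5

[DWH_FOLDER.WF:wf_daily_load.ST:s_m_load_customers]
$DBConnection_Source=ORA_SRC_DEV
$DBConnection_Target=ORA_DWH_DEV
$$LoadDate=01/31/2015
$InputFile_Customers=/data/inbound/customers.dat
```

The file is passed to the workflow via pmcmd's -paramfile option or set in the session/workflow properties; $-prefixed names are service/session parameters, while $$-prefixed names are user-defined mapping parameters.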

Environment: Informatica PowerCenter 8.6.1, UNIX, Oracle, VSAM, Flat files, HP Quality Center, CA-7, SQL Developer.

Confidential

ETL Developer

Responsibilities:

  • Involved in the development of Informatica mappings and mapplets, and tuned them for optimum performance, dependencies, and batch design.
  • Implemented the extraction process of flat files into the target database.
  • Designed and developed end-to-end ETL process from various source systems to Staging area, from staging to Data Marts.
  • Extensively worked on Informatica to extract data from Flat files, Excel files and Oracle to load the data into the target database.
  • Implemented the Incremental loading of Dimension and Fact tables.
  • Created Stored Procedures for data transformation purpose.
  • Created Tasks, Workflows, and Sessions to move the data at specific intervals on demand using Workflow Manager.
  • Created PL/SQL Stored procedures and implemented them through the Stored Procedure transformation.
  • Develop, test and implement break/fix change tickets for maintenance.
  • Migrated Mappings from Development to Testing and Testing to Production.
  • Used Parameter Files for multiple DB connections for the sources and defining variable values.
  • Developed the documentation and plan for transforming and loading the warehouse.
  • Analyzed the data from various sources for data quality issues.
  • Participated in all facets of the software development cycle including providing input on requirement specifications, high level design documents, and user’s guides.
  • Writing Oracle Stored Procedures, SQL scripts and calling at pre and post session.
  • Created concurrent tasks in workflows using Workflow Manager and monitored jobs in Workflow Monitor.
  • Designed mapping templates to specify the high-level approach.
  • Involved in Unit testing and documentation.

Environment: Informatica PowerCenter 7.1.4, UNIX, Oracle, Teradata, HP Quality Center, Teradata SQL Assistant

Confidential

ETL Developer

Responsibilities:

  • Developed ETL for data extraction, data mapping, and data conversion using Informatica PowerCenter 8.6.0.
  • Used Informatica PowerCenter to extract the base tables in the data warehouse; the source databases included Oracle, flat files, and SQL Server.
  • Involved in the design and development of complex ETL mappings and workflows in an optimized manner; used Power Exchange for mainframe sources.
  • Used Power Exchange to source copybook definitions and then row-test data from data files and VSAM files.
  • Extensively worked on fixing poorly designed mappings and developed schedules to automate the Informatica workflows.
  • Developed Oracle PL/SQL, DDLs, and stored procedures, and worked on performance and fine-tuning of SQL and PL/SQL stored procedures and ICD-10 codes.
  • Involved in Performance Tuning at various levels including Target, Source, Mapping and Session for large data files.
  • Prepared the documentation for the mappings and workflows.
  • Worked with Teradata utilities like Teradata Parallel Transporter (TPT), BTEQ, FastLoad, MultiLoad, and TPump; worked with work tables, log tables, and error tables in Teradata.
  • Created test plans and did unit testing for the Informatica mappings and stored procedures.
  • Involved in Unit testing, Integration testing, UAT by creating test cases, test plans and helping Informatica administrator in deployment of code across Dev, Test and Prod Repositories.
  • Worked on various data issues in Production environment.
  • Good experience using the Maestro scheduling tool.
  • Involved in production scheduling to set up jobs in order and provided 24x7 production support; involved in troubleshooting and resuming failed jobs.
  • Designed and developed logical and physical models to store data retrieved from other sources including legacy systems.
  • Extensively used transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Joiner, Transaction Control and Stored Procedure.
  • Involved in the development of Informatica mappings and mapplets, and tuned them for optimum performance, dependencies, and batch design.
  • Worked with pre and post sessions, and extracted data from Transaction System into Staging Area. Knowledge of Identifying Facts and Dimensions tables.
  • Tuned sources, targets, mappings and sessions to improve the performance of data load.
  • Worked with different Informatica tuning issues and fine-tuned transformations to make them more efficient in terms of performance.
  • Implemented Slowly Changing Dimension methodology for accessing the full history.
  • Created Several Informatica Mappings to populate the data into dimensions and fact tables.
  • Migrated repository objects, services and scripts from development environment to production environment.
  • Involved in Unit testing and documentation.

Environment: Informatica PowerCenter 7.1.2, UNIX, Flat Files, Oracle 9i, SQL, CA-Autosys, SQL Plus, SQL Server.
