
Sr. ETL Developer Resume


Virginia Beach, VA

SUMMARY

  • 9+ years of experience in Information Technology with a strong background in analyzing, designing, developing, testing, and implementing data warehouses.
  • Data warehousing experience with proficiency in dimensional data modeling and star schemas using Informatica PowerCenter, Informatica Cloud, Power Exchange, and MDM across business domains including finance, banking, insurance, and healthcare.
  • Experience working with various source systems and files, including Azure SQL, RDBMS sources (DB2, Oracle, SQL Server, Teradata), flat files, XML, COBOL files, and AWS S3 raw buckets.
  • Proficient in databases including DB2, Oracle 8i, Teradata, MS SQL Server, MS Access.
  • Proficient in using tools like Microsoft Visio, Microsoft Project, Microsoft Office (Excel, Access, Word, PowerPoint).
  • Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON-formatted data into Snowflake tables (see the Snowflake sketch after this list).
  • Experience in writing T-SQL queries, complex stored procedures, and user-defined functions (UDFs).
  • Extensively worked on various data warehouse projects using the Informatica client tools (Source Analyzer, Transformation Developer, Workflow Manager, Workflow Monitor, Server Manager, Mapping Designer, Mapplet Designer, and Warehouse Designer) and on multidimensional data modeling using star and snowflake schema designs.
  • Expert knowledge in troubleshooting and performance tuning at various levels, such as source, mappings, target, and sessions.
  • Working experience using Informatica Workflow Manager to create sessions, batches, scheduled workflows, and worklets; developing complex mapplets, worklets, reusable tasks, and reusable mappings; defining workflows and tasks; monitoring sessions; exporting and importing mappings and workflows; backup and recovery; and Power Exchange.
  • Expertise in unit testing, integration testing, system testing, and data validation for developed Informatica mappings.
  • Excellent knowledge in identifying performance bottlenecks and tuning the Informatica Load for better performance and efficiency.
  • Experience working with Informatica IICS, using it effectively for data integration and data migration from multiple source systems into Azure SQL Data Warehouse.
  • Extensively involved in performance tuning of Informatica mappings and sessions by reducing data flow, eliminating I/O bottlenecks, optimizing resources, optimizing common ETL processes, and exploiting data parallelism.
  • Good experience with agile methodologies.
  • Good knowledge of UNIX Shell Scripting.
  • Ability to prioritize multiple tasks.
  • Self-motivated and able to work independently and as a member of a team.
  • Excellent communication, interpersonal and analytical skills.
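
A minimal Snowflake SQL sketch of the nested-JSON load pattern mentioned above; the file format, table, and stage names are hypothetical placeholders, and the external stage is assumed to already point at the S3 raw bucket.

    -- Hypothetical names throughout; STRIP_OUTER_ARRAY splits a top-level JSON array into rows.
    CREATE OR REPLACE FILE FORMAT json_ff
      TYPE = 'JSON'
      STRIP_OUTER_ARRAY = TRUE;

    -- Land each raw document in a single VARIANT column.
    CREATE OR REPLACE TABLE raw_events (payload VARIANT);

    -- Bulk-load nested JSON files from the S3-backed external stage.
    COPY INTO raw_events
      FROM @raw_s3_stage/events/
      FILE_FORMAT = (FORMAT_NAME = 'json_ff');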

TECHNICAL SKILLS

ETL: Informatica PowerCenter, Informatica Cloud, Power Exchange and MDM

Database: Oracle 12c/11g/10g, SQL Server 2012, DB2, Teradata 13.0, AWS S3, and AWS Redshift

BI Tool: Tableau

Software & Tools: SQL Server, T-SQL, SQL Tools, PL/SQL

Scheduling Tool: Control-M, Tidal.

Operating Systems: Windows Server 2003/2008 R2, Windows 7/8, UNIX

Scripting: UNIX Shell Scripting

PROFESSIONAL EXPERIENCE

Confidential, Virginia Beach, VA

Sr. ETL Developer

Responsibilities:

  • Responsible for Business Analysis and Requirements Collection.
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Moved the company from a SQL Server database structure to an AWS Redshift data warehouse; responsible for ETL and data validation using SQL Server Integration Services.
  • To increase performance, balanced the input file count against the cluster's slice count for large files, loaded the files into the AWS S3 refine bucket, and used the COPY command to achieve micro-batch loads into Amazon Redshift (see the Redshift sketch after this list).
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Developed Cloud mappings to extract the data.
  • Utilized SQL and T-SQL programming to develop relational databases.
  • Extensively worked on Star Schema, Snowflake Schema, Data Modeling, Logical and Physical Models, Data Elements, Issue/Question Resolution Logs, Source-to-Target Mappings, Interface Matrix, and Design Elements.
  • Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.
  • Generated automated stats for the staging loads, comparing present-day counts to previous-day counts.
  • Created file-watcher jobs to set up the dependency between Cloud and PowerCenter jobs.
  • Performed bulk loads of JSON data from the S3 bucket to Snowflake.
  • Used Snowflake functions to perform semi-structured data parsing entirely with SQL statements (see the parsing sketch after this list).
  • Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.
  • Modified existing mappings for enhancements of new business requirements.
  • Used Debugger to test the mappings and fixed the bugs.
  • Wrote UNIX shell Scripts & PMCMD commands for FTP of files from remote server and backup of repository and folder.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Built reports according to user requirements.
  • Extracted data from Oracle and SQL Server, then used Teradata for data warehousing.
  • Implemented slowly changing dimension methodology for accessing the full history of accounts.
  • Involved in implementing error handling Process.
  • Good understanding of Azure SQL Data Warehouse concepts relating to storage, distribution, DWU units, resource user groups, connection strings, etc.
  • Created the Forecast Data Mart: data model, physical tables, and mappings.
  • Experience in integrating various data sources like Oracle, DB2, flat files, and XML files, with good knowledge of Teradata 12.0/13.0 and SQL Server 2016.
  • Used Teradata Parallel Transporter (TPT) for loading pipe-delimited, comma-delimited, and fixed-width data files onto the Teradata EDW.
  • Experience in working with COBOL files, XML, and Flat Files.
  • Analyzed the session logs and loader logs for Teradata MLoad and FLoad errors and troubleshot them.
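
A hedged Amazon Redshift sketch of the S3 micro-batch COPY described above; the table, bucket, and IAM role are hypothetical, and the input files are assumed to be split so their count is a multiple of the cluster's slice count.

    -- Hypothetical names throughout; each slice loads files in parallel,
    -- so the file count should be a multiple of the slice count.
    COPY analytics.stg_orders
    FROM 's3://refine-bucket/orders/dt=2019-06-01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftLoad'
    FORMAT AS CSV
    GZIP
    COMPUPDATE OFF
    STATUPDATE OFF;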
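
A Snowflake sketch of the semi-structured parsing done entirely in SQL, assuming a raw_events table with a single VARIANT column named payload; the JSON paths are hypothetical.

    -- Colon/dot paths drill into the VARIANT; LATERAL FLATTEN unnests the items array.
    SELECT
        payload:orderId::STRING       AS order_id,
        payload:customer.name::STRING AS customer_name,
        item.value:sku::STRING        AS sku,
        item.value:qty::NUMBER        AS qty
    FROM raw_events,
         LATERAL FLATTEN(input => payload:items) item;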

Environment: Informatica 10.x/9.x/8.x, Informatica Power Exchange, AWS S3, AWS Redshift, IDQ Developer 9.6, shell scripting (including sed), COBOL, Oracle 12c, UNIX, SQL, T-SQL, PL/SQL, Teradata 13.0, Active Batch, SQL Server

Confidential, Nashville, TN

Sr. ETL Consultant

Responsibilities:

  • Requirements Analysis, cost estimates, technical design creation and design reviews
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor
  • Worked with ETL developers in creating external batches to execute mappings and mapplets using the Informatica workflow designer, integrating Shire's data from varied sources like Oracle, DB2, flat files, and SQL databases and loading it into the landing tables of the Informatica MDM Hub.
  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Normalizer, Lookup, Filter, Joiner, Rank, Router and Update Strategy
  • Implemented the concept of slowly changing dimensions (SCD) Type I and Type II to maintain current and historical data in the dimension.
  • Experience working with Azure SQL Data Warehouse integration with IICS using the native (v2 and v3) connectors and the Microsoft connector with PDO (pushdown optimization) support for Azure SQL.
  • Created critical reusable transformations, mapplets, and worklets wherever necessary
  • Integrated IDQ mappings, rules as mapplets within Power Center Mappings
  • Created stored procedures, functions, packages, and triggers using PL/SQL (see the PL/SQL sketch after this list)
  • Designed, Installed, Configured core Informatica/Siperian MDM Hub components such as Informatica MDM Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, Data Modeling.
  • Extensively used SQL tools like TOAD, Rapid SQL and Query Analyzer, to run SQL queries to validate the data
  • Worked with Teradata utilities (MLoad, FLoad) to load large tables
  • Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data
  • Deployed new MDM Hub for portals in conjunction with user interface on IDD application.
  • Configured match rule set property by enabling search by rules in MDM according to Business Rules.
  • Worked on data cleansing using the cleanse functions in Informatica MDM.
  • Used Unix Shell Scripts to automate pre-session and post-session processes
  • Used Active Batch scheduler to schedule and run Informatica workflows on a daily/weekly/monthly basis
  • Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis, and delivering the transformed data to coordinated data marts
  • Reviewed the defects raised by the UAT team and followed up on the critical defects with the team to ensure they were fixed
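
A minimal PL/SQL sketch of the kind of procedure created here; the customers table and its columns are hypothetical placeholders.

    -- Hypothetical upsert: update the row if it exists, insert it otherwise.
    CREATE OR REPLACE PROCEDURE upsert_customer (
        p_cust_id   IN customers.cust_id%TYPE,
        p_cust_name IN customers.cust_name%TYPE
    ) AS
    BEGIN
        UPDATE customers
           SET cust_name = p_cust_name
         WHERE cust_id = p_cust_id;

        IF SQL%ROWCOUNT = 0 THEN
            INSERT INTO customers (cust_id, cust_name)
            VALUES (p_cust_id, p_cust_name);
        END IF;

        COMMIT;
    END upsert_customer;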

Environment: Informatica 9.6, Informatica Power Exchange, Informatica MDM, Azure SQL Data Warehouse, IDQ Developer 9.6, Oracle 12c, UNIX, SQL, PL/SQL, Teradata, Active Batch, SQL Server

Confidential, Tarrytown, NY

Sr. ETL Consultant

Responsibilities:

  • Requirements Analysis, cost estimates, technical design creation and design reviews
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor
  • Extracted data from various centers residing in different systems, such as mainframe files and flat files, and loaded the data into Oracle staging using Informatica PowerCenter 9.1
  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Normalizer, Lookup, Filter, Joiner, Rank, Router and Update Strategy
  • Implemented the concept of slowly changing dimensions (SCD) Type I and Type II to maintain current and historical data in the dimension
  • Created critical reusable transformations, mapplets, and worklets wherever necessary
  • Integrated IDQ mappings, rules as mapplets within Power Center Mappings
  • Created Stored Procedures, Functions, Packages and Triggers using PL/SQL
  • Wrote complex SQL queries involving multiple tables with joins, and generated queries to check the consistency of the data in the tables and to update the tables per the business requirements (see the validation sketch after this list)
  • Extensively used SQL tools like TOAD, Rapid SQL and Query Analyzer, to run SQL queries to validate the data
  • Implemented restart strategy and error handling techniques to recover failed sessions
  • Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis, and delivering the transformed data to coordinated data marts
  • Reviewed the defects raised by the UAT team and followed up on the critical defects with the team to ensure they were fixed
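
Sketches of the kind of validation queries run here; stg_orders and dw_orders are hypothetical stand-ins for the real staging and target tables.

    -- Keys present in staging but missing from the target.
    SELECT s.order_id
    FROM   stg_orders s
    LEFT JOIN dw_orders t ON t.order_id = s.order_id
    WHERE  t.order_id IS NULL;

    -- Row-count reconciliation by load date.
    SELECT s.load_dt,
           COUNT(*) AS stg_cnt,
           (SELECT COUNT(*) FROM dw_orders t WHERE t.load_dt = s.load_dt) AS tgt_cnt
    FROM   stg_orders s
    GROUP  BY s.load_dt;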

Environment: Informatica 9.1, Informatica Power Exchange, IDQ Developer 9.1, Oracle 12c, DB2, UNIX, SQL, PL/SQL, DbVisualizer, PuTTY, Mainframe, Teradata

Confidential, Los Angeles

Sr. ETL Consultant

Responsibilities:

  • Requirements Analysis, cost estimates, technical design creation and design reviews
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor
  • Extracted data from various centers residing in different systems, such as Oracle Database and SQL Server, and loaded the data into Teradata staging using Informatica PowerCenter 9.5.
  • Involved in the migration from Informatica PowerCenter 9.3 to Informatica PowerCenter 9.5
  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Normalizer, Lookup, Filter, Joiner, Rank, Router and Update Strategy.
  • Implemented slowly changing dimensions (SCD) Type I and Type II to maintain current and historical data in the dimension (see the SCD sketch after this list).
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, duplicate elimination, exception handling, and the monitoring capabilities of IDQ.
  • Created critical reusable transformations, mapplets, and worklets wherever necessary
  • Integrated IDQ mappings, rules as mapplets within Power Center Mappings.
  • Created Stored Procedures, Functions, Packages and Triggers using PL/SQL.
  • Wrote complex SQL queries involving multiple tables with joins, and generated queries to check the consistency of the data in the tables and to update the tables per the business requirements.
  • Extensively used SQL tools like TOAD, Rapid SQL and Query Analyzer, to run SQL queries to validate the data.
  • Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.
  • Implemented restart strategy and error handling techniques to recover failed sessions.
  • Used Unix Shell Scripts to automate pre-session and post-session processes.
  • Worked with Teradata utilities (MLoad, FLoad) to load large tables
  • Used Autosys scheduler to schedule and run Informatica workflows on a daily/weekly/monthly basis.
  • Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis, and delivering the transformed data to coordinated data marts
  • Reviewed the defects raised by the UAT team and followed up on the critical defects with the team to ensure they were fixed
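
A hedged SQL sketch of the SCD Type II pattern (hypothetical dim_customer and stg_customer names; surrogate-key generation is omitted for brevity): expire the changed current rows, then insert fresh current versions.

    -- Step 1: close off current rows whose tracked attribute changed.
    UPDATE dim_customer d
       SET d.eff_end_dt  = CURRENT_DATE - 1,
           d.current_flg = 'N'
     WHERE d.current_flg = 'Y'
       AND EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.cust_id = d.cust_id
                     AND  s.address <> d.address);

    -- Step 2: insert a new current version for new and changed customers alike.
    INSERT INTO dim_customer (cust_id, address, eff_start_dt, eff_end_dt, current_flg)
    SELECT s.cust_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.cust_id = s.cust_id
                         AND  d.current_flg = 'Y');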

Environment: Informatica 9.3/9.5, Informatica Data Quality (IDQ) 9.1, Business Objects, Oracle 11g, UNIX, PL/SQL, SQL, TOAD, PuTTY, Teradata

Confidential

ETL Developer

Responsibilities:

  • Interacted with end users to clarify incomplete requirements and developed code that satisfied the client
  • Performed source system data analysis per the business requirements; distributed data residing in heterogeneous data sources was consolidated into Teradata staging using Informatica PowerCenter 8.3
  • Used heterogeneous data sources (Oracle, DB2, XML files, and flat files) as sources, and imported stored procedures from Oracle for transformations.
  • Developed Mappings, Sessions, Workflows and Shell Scripts to extract, validate, and transform data according to the business rules
  • Identified the fact tables and slowly changing dimension (SCD) tables
  • Extensively used SQL tools like TOAD, Rapid SQL and Query Analyzer, to run SQL queries to validate the data
  • Sourced the data from XML files, flat files, SQL server tables and Oracle tables
  • Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Mapplets and Transformation objects. Partitioned the sessions to reduce the load time
  • Performed data cleansing and cache optimization
  • Implemented change data capture (CDC) for incremental aggregation (see the CDC sketch after this list)
  • Involved in review of the mappings and enhancements for better optimization of the Informatica mappings, sessions and workflows
  • Extensively worked in performance tuning of programs, ETL procedures and processes
  • Performed unit, system, and regression testing of the mappings; involved in writing the test cases and assisted the users in performing UAT
  • Extensively used UNIX shell scripts to create the parameter files dynamically and scheduling jobs using Autosys
  • Created integration services, repository services and migrated the repository objects
  • Wrote PL/SQL procedures for processing business logic in the database
  • Provided production support and maintenance for all the applications with the ETL process
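
A simplified SQL sketch of the change-data-capture approach behind the incremental aggregation (the etl_control watermark table and column names are hypothetical): pull only rows stamped after the last successful load, then advance the watermark.

    -- Extract only rows changed since the last successful run.
    SELECT *
    FROM   src_transactions t
    WHERE  t.update_ts > (SELECT last_load_ts
                          FROM   etl_control
                          WHERE  job_name = 'agg_transactions');

    -- After a successful load, move the watermark forward.
    UPDATE etl_control
       SET last_load_ts = CURRENT_TIMESTAMP
     WHERE job_name = 'agg_transactions';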

Environment: Informatica Power Center 8.3, Oracle 10g, Teradata, SQL, PL/SQL, TOAD, UNIX, Windows 2000/XP

Confidential

ETL Developer

Responsibilities:

  • Involved in requirements gathering, developing detailed designs and technical specifications
  • Used Informatica as an ETL tool to extract data from multiple source systems by creating mappings using various transformations
  • Developed and tested mappings, sessions, worklets and workflows
  • Developed and modified mappings according to business logic and tuned them for better performance
  • Worked on complex data loading (implemented the batch data cleansing and data loading)
  • Used BCP utility to publish table output to text files
  • Worked on DTS Package, DTS Import/Export for transferring data from Heterogeneous Database to SQL Server
  • Used Data Manipulation Language (DML) to insert and update data, satisfying the referential integrity constraints
  • Performance tuning of SQL queries and stored procedures using SQL Profiler and Index Tuning Wizard
  • Developed SQL scripts to Insert/Update and Delete data in MS SQL database tables
  • Created business-critical stored procedures and functions to support efficient data storage and manipulation (see the T-SQL sketch after this list)
  • Maintained a good client relationship by communicating daily status and weekly status of the project
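
A minimal T-SQL sketch of such a procedure; dbo.daily_sales, dbo.stg_sales, and their columns are hypothetical placeholders.

    -- Hypothetical set-based refresh: update existing rows, then insert new ones.
    CREATE PROCEDURE dbo.usp_load_daily_sales
        @load_date DATETIME
    AS
    BEGIN
        SET NOCOUNT ON;

        UPDATE d
           SET amount = s.amount
          FROM dbo.daily_sales d
          JOIN dbo.stg_sales s ON s.sale_id = d.sale_id
         WHERE s.sale_date = @load_date;

        INSERT INTO dbo.daily_sales (sale_id, sale_date, amount)
        SELECT s.sale_id, s.sale_date, s.amount
          FROM dbo.stg_sales s
         WHERE s.sale_date = @load_date
           AND NOT EXISTS (SELECT 1 FROM dbo.daily_sales d
                           WHERE d.sale_id = s.sale_id);
    END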

Environment: Informatica Power Center 7.1.1, UNIX, Shell Scripting, SQL Server 7.0/2000, Oracle 8 Enterprise Manager, SQL Profiler, DTS, PL/SQL, Replication
