
ETL Developer Resume

Chicago, IL

SUMMARY

  • 6+ years of hands-on development experience with ETL tools (e.g., Informatica PowerCenter, SSIS), Oracle databases, SQL Server, MySQL, SSRS, and Oracle Forms and Reports.
  • Experience working with MS SQL Server 2012/2014/2015, Teradata, and Oracle 11g/10g/9i/8i.
  • Extensive knowledge of PowerCenter components such as PowerCenter Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Performed data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Thorough knowledge of creating ETL processes to load data from different data sources, and a good understanding of Informatica installation.
  • Experience in integrating business applications with the Informatica hub using batch processes, SIF, and message queues.
  • Extensive ETL tool experience using IBM InfoSphere/WebSphere DataStage to perform ETL and ELT operations on data.
  • Experience in performance tuning of Informatica mappings and sessions to improve performance of the large volume projects.
  • Demonstrated experience with the design and implementation of Informatica Data Quality (IDQ v9.1) applications for business and technology users across the full development life cycle.
  • Understanding of the AWS product and service suite, primarily EC2, S3, VPC, Redshift, Spectrum, and EMR (Hadoop), along with monitoring services, their applicable use cases, best practices, and implementation and support considerations.
  • Strong in SQL, PL/SQL, SQL*LOADER, SQL*PLUS, MS-SQL.
  • Experience with data cleansing, data profiling, and data analysis; UNIX shell scripting, Perl scripting, and SQL and PL/SQL coding.
  • Involved in testing, test plan preparation, and process improvement for ETL development, with good exposure to development, testing, debugging, implementation, documentation, user training, and production support.
  • Experience with Oracle-supplied packages, dynamic SQL, records, and PL/SQL tables.
  • Hands on experience in developing Stored Procedures, Functions, Views and Triggers, SQL queries using SQL Server and Oracle PL/SQL.
  • Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions.
  • Database/ETL performance tuning: broad experience in database development, including effective use of database objects, SQL Trace, Explain Plan, different types of optimizers, hints, indexes, table partitions, sub-partitions, materialized views, global temporary tables, autonomous transactions, bulk binds, and Oracle built-in functions, plus performance tuning of Informatica mappings and workflows.
  • Worked to a great extent on the design and development of Tableau dashboards using calculations, parameters, calculated fields, groups, sets, and hierarchies.
  • Experience in importing/exporting data between different sources like Oracle/Access/Excel etc. using SSIS/DTS utility.
  • Strong data modeling experience using ER diagrams, dimensional data modeling, star schema modeling, and snowflake modeling with tools like Erwin and Embarcadero ER/Studio.
  • Worked directly with non-IT business analysts throughout the development cycle and provided production support for ETL.
  • Proficient in Informatica administration work including installing and configuring Informatica PowerCenter and repository servers on Windows and UNIX platforms, backup and recovery, folder and user account maintenance.
  • Experience with industry Software development methodologies like Waterfall, Agile within the software development life cycle.
  • Involved in understanding client Requirements, Analysis of Functional specification, Technical specification preparation and Review of Technical Specification.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Expertise in working in an Agile (Scrum) environment, with good experience in the Insurance and Healthcare domains.

TECHNICAL SKILLS

Data Warehousing: Informatica PowerCenter 10.4/10.2/9.6/9.1, Power Connect, Power Exchange, Informatica PowerMart 6.2/5.1.2, DataStage 11.5, Informatica 10.1/9.x, SQL*Loader, Flat Files (Fixed, CSV, Tilde Delimited), MS SQL Server Integration Services (SSIS).

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Erwin, Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling

Scheduling Tools: Autosys, Control-M, CA Workstation, Tivoli Workload Scheduler (TWS)

Reporting Tools: SSRS, MS Access, Tableau

Database and Related Tools: Oracle 11g/10g/9i/8i/8/7.x, MS SQL Server 2012/2008, Teradata, PL/SQL, AQT

Languages: SQL, PL/SQL, SQL*Plus, Unix Shell Scripting, Java

Web Technologies: HTML, XHTML and XML

Operating Systems: Microsoft Windows XP/NT/2000/98/95, UNIX.

Cloud Technologies: AWS, Azure, Informatica Cloud.

PROFESSIONAL EXPERIENCE

Confidential, Chicago IL

ETL Developer

Responsibilities:

  • Used Informatica Power Center for extraction, transformation and load (ETL) of data in the data warehouse.
  • Developed mappings using the PowerCenter 10.x tool to load data into Netezza from DB2, flat files, and SQL Server, using various transformations such as Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy, and designed and optimized the mappings.
  • Utilized Informatica IDQ 9.6.1 to complete initial data profiling and matching/removal of duplicate data.
  • Involved with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of data.
  • Used Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, and Transformation Developer; developed Informatica mappings and tuned them for better performance.
  • Tested Informatica mappings and migrated data from one database to another through the ETL process.
  • Worked on optimizing and tuning Teradata views and SQL queries to improve batch performance and data response time for users.
  • Involved in migration projects to move data warehouses from Oracle/DB2 to Netezza.
  • Tuned Performance of mapping and sessions by optimizing source, target bottlenecks and implemented pipeline partitioning.
  • Monitored data loads, automated job runs, and validated the data.
  • Built a reusable staging area in Oracle for loading data from multiple source systems, using template tables for profiling and cleansing in IDQ or QualityStage.
  • Utilized IDQ to profile various sources, generate scorecards, and create and validate rules, and provided data to business analysts for creating the rules.
  • Involved in Analyzing/ building Teradata EDW using Teradata ETL utilities and Informatica.
  • Performed WebMethods installation, clustering, patching, and maintenance of application servers.
  • Analyzed data sources and targets using Informatica Data Profiling option.
  • Applied slowly changing dimensions like Type 1 and 2 effectively to handle the Loads.
  • Utilized Datastage on Parallel jobs to extract, cleanse, transform, integrate and load data into EDW and then to Data Mart.
  • Extensively worked on database performance tuning techniques and modifying the complex join statements.
  • Performance tuning and optimization achieved through the management of indices, table partitioning, and optimizing the SQL scripts.
  • Maintained the table performance by following the tuning tips like normalization, creating clustered and non-clustered indexes and collect statistics.
  • Wrote PL/SQL scripts for pre & post session processes and to automate daily & monthly loads.
  • Wrote complex stored procedures using dynamic SQL to populate data into temp tables from fact and dimension tables for reporting purposes.
  • Implemented optimization techniques for performance tuning of Informatica Workflows by determining bottlenecks in sources, target, mapping and session level.
  • Tuned the performance of mappings by following Informatica best practices and applied several methods to decrease the run time of workflows.
  • Interacted with the business to gather requirements and coordinated with the offshore team to deliver code to the client.
  • Developed stored procedures, functions, views, triggers, and SQL queries using SQL Server and Oracle PL/SQL.
  • Developed database objects like tables, views, indexes, stored procedures, common table expressions.
  • Applied SQL/PL-SQL knowledge to write efficient SQL queries and stored procedures wherever required.
  • Addressed production issues such as performance tuning and enhancements.
  • Kept track of change requests for production deployments and support documentation.
  • Analyzed and resolved data issues identified by business users, IT operations, or partners, and captured the test results.
  • Troubleshot production incidents requiring detailed analysis of issues involving Autosys batch jobs and databases.
  • Participated in daily status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
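
The Type 1/Type 2 slowly changing dimension loads described above follow a standard pattern: on an attribute change, expire the current dimension row and insert a new version. A minimal illustrative sketch in Python with SQLite (the table and column names are hypothetical, not from any project above):

```python
import sqlite3

def scd2_upsert(conn, customer_id, name, city, load_date):
    """Type 2 SCD: expire the current row and insert a new version on change."""
    row = conn.execute(
        "SELECT name, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,)).fetchone()
    if row == (name, city):
        return  # no change: Type 2 only versions on a real attribute change
    if row is not None:
        # Expire the existing current version.
        conn.execute(
            "UPDATE dim_customer SET end_date = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1", (load_date, customer_id))
    # Insert the new current version.
    conn.execute(
        "INSERT INTO dim_customer (customer_id, name, city, eff_date, end_date, is_current) "
        "VALUES (?, ?, ?, ?, NULL, 1)", (customer_id, name, city, load_date))

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, name TEXT, city TEXT,
    eff_date TEXT, end_date TEXT, is_current INTEGER)""")

scd2_upsert(conn, 1, "Acme", "Chicago", "2021-01-01")     # initial load
scd2_upsert(conn, 1, "Acme", "Naperville", "2021-06-01")  # city changed -> new version
history = conn.execute(
    "SELECT city, is_current FROM dim_customer "
    "WHERE customer_id = 1 ORDER BY eff_date").fetchall()
print(history)  # [('Chicago', 0), ('Naperville', 1)]
```

A Type 1 load would instead overwrite the row in place, keeping no history; in PowerCenter both variants are typically built with a Lookup plus an Update Strategy transformation, as in the mappings above.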

Environment: Informatica PowerCenter 10/9.6, Netezza, Teradata 13/14, DataStage 11.5, Teradata SQL Assistant, SQL, DB2, IDQ 9.6.1, Oracle 11g/10g, PL/SQL, Autosys, UNIX, shell scripting, HP Quality Center, AQT.

Confidential, Minnetonka, MN

ETL Developer

Responsibilities:

  • Created complex Stored Procedures, Triggers, Functions, Indexes, Tables, Views, SQL joins and other T-SQL code to implement business rules.
  • Responsible for performance tuning, optimizing the queries, removing redundant and inconsistent data, joins from the database tables and normalizing them.
  • Developed stored procedures to transform the data from enterprise data warehouse to data mart to build dimensions and load data to fact tables.
  • Used Execution Plan and Database Engine Tuning Advisor to optimize queries and enhance the performance of databases.
  • Improved Performance by creating Clustered and Non-clustered Indexes and by optimizing the T-SQL statements using SQL profiler.
  • Tuning Queries and Database performance using tools Database Tuning Advisor, SQL Server Profiler.
  • Created SSIS packages using Lookup, Merge, Sort, Derived Column, Conditional Split, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task, etc., to generate underlying data for the reports and to export cleansed data from Excel, text, and XML files to different data marts.
  • Made use of SQL joins, subqueries, tracing, and performance tuning to improve the running of DTS packages.
  • Designed and deployed rich graphic visualizations with drill-down and drop-down menu options and parameters using Tableau.
  • Utilized Informatica IDQ 8.6.1 to complete initial data profiling and matching/removal of duplicate data.
  • Performed Incremental load with several Dataflow tasks and Control Flow Tasks using SSIS.
  • Performed physical database analysis and feasibility requirements, covering performance, security, archival, maintenance, and recovery.
  • Wrote VB.NET code for Script task to perform functions that are not available in the built-in tasks and transformations that SSIS provides.
  • Migrated and translated legacy reports from the SSRS environment into Power BI.
  • Wrote Parameterized Queries for generating Tabular reports, Formatting report layout, creating reports using Global Variables, Expressions, Functions, Sorting the data, Defining Data Source and Datasets, calculating subtotals and grand totals for the reports using SSRS 2014.
  • Performed partitioning and optimization of cubes to increase cube performance in SSAS.
  • Extracted data from heterogeneous sources and transferred to OLAP using SSIS 2012.
  • Created SSIS packages to load data into Data Warehouse using Various SSIS Tasks like Execute SQL Task, bulk insert task, data flow task, file system task, send mail task, active script task, xml task and various transformations.
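
The incremental loads mentioned above typically track a high-water mark: pull only source rows modified since the last load, upsert them into the target, and advance the watermark. A hedged, generic Python/SQLite sketch of that pattern (all table and column names are invented for illustration, not taken from the SSIS packages above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_orders (order_id INTEGER PRIMARY KEY, amount REAL, modified TEXT);
CREATE TABLE tgt_orders (order_id INTEGER PRIMARY KEY, amount REAL);
CREATE TABLE etl_watermark (table_name TEXT PRIMARY KEY, last_modified TEXT);
INSERT INTO etl_watermark VALUES ('src_orders', '2021-01-01');
INSERT INTO src_orders VALUES (1, 100.0, '2020-12-31'),  -- already loaded
                              (2, 250.0, '2021-02-15'),  -- new since watermark
                              (3, 75.0,  '2021-03-01');  -- new since watermark
""")

def incremental_load(conn):
    # Read the high-water mark recorded by the previous run.
    wm, = conn.execute(
        "SELECT last_modified FROM etl_watermark "
        "WHERE table_name = 'src_orders'").fetchone()
    changed = conn.execute(
        "SELECT order_id, amount, modified FROM src_orders "
        "WHERE modified > ?", (wm,)).fetchall()
    for order_id, amount, _ in changed:
        # Upsert: stands in for the SSIS lookup + conditional split path.
        conn.execute(
            "INSERT OR REPLACE INTO tgt_orders (order_id, amount) VALUES (?, ?)",
            (order_id, amount))
    if changed:
        # Advance the watermark so the next run skips these rows.
        conn.execute(
            "UPDATE etl_watermark SET last_modified = ? "
            "WHERE table_name = 'src_orders'",
            (max(m for _, _, m in changed),))

incremental_load(conn)
print(conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0])  # 2
```

Only the two rows newer than the watermark are moved; a second run would move nothing until the source changes again.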

Environment: SQL Server 2012/2014, BIDS, Visual Studio 2010/2013, T-SQL, Autosys, MS Access, MS Excel, SSIS, SSRS, SSAS, Windows Server 2014, Informatica, Power BI, Tableau.

Confidential, Columbus, Ohio

Informatica Developer

Responsibilities:

  • Used Informatica PowerCenter 10.x for extraction, transformation, and loading (ETL).
  • Developed mappings, workflows, and sessions using ETL tools such as Workflow Manager, Task Developer, and Workflow Designer, and monitored the results.
  • Developed Informatica mappings to load data into Netezza and Eagle from various sources such as Oracle and flat files, using transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy, and designed and optimized the mappings.
  • Developed UNIX scripts to move the source files to Archive Directory using CICS Explorer.
  • Worked in a UNIX environment to extract the required data, and triggered and ran job scheduling in dev, test, and preprod.
  • Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
  • Extracted data from the Oracle database using Advanced Query Tool (AQT).
  • Created packages and jobs using ChangeMan (CICS Explorer) and performed loading using CA Workstation.
  • Scheduled various daily and monthly ETL loads using ESP. Experienced in creating indexed views, complex stored procedures, effective triggers, and useful functions to facilitate efficient data manipulation and consistent data storage.
  • Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions and workflows.
  • Coordinated with QA People to follow up the issues on testing to make the code perfect without any issues before migrating to the production.
  • Experience in debugging mappings, identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Experienced as a short-term tester; tested code in dev, test, and preprod environments.
  • Scheduled workflows, loaded files, and monitored workflows frequently to ensure data loading completed as expected.
  • Performed data validation after the successful End to End tests and appropriate error handling in ETL processes.
  • Prepared SQL Queries to validate the data both in sources and targets.
  • Tested End to End to verify the failures in the mappings.
  • Developed Parameter files for passing values to the mappings for each type of client.
  • Involved in extracting data from the Flat Files and Relational databases into staging area.
  • Worked on design and development of Informatica mappings and workflows to load data into the staging area and data marts in Oracle, Eagle, and Netezza.
  • Tuned Performance of mapping and sessions by optimizing source, target bottlenecks and implemented pipeline partitioning.
  • Involved in unit testing and documentation of the ETL process.
  • Participated in Daily status meetings, and conducting internal and external reviews as well as formal walk through among various teams and documenting the proceedings.
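
The per-client parameter files mentioned above usually follow an INI-like layout: a bracketed heading per workflow/session scope, then `$$NAME=value` lines. A small illustrative Python sketch of parsing such a file (the folder, workflow, and parameter names below are made up):

```python
def parse_param_file(text):
    """Parse an Informatica-style parameter file into {section: {param: value}}."""
    params, section = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or line.startswith(";"):
            continue  # skip blank lines and comments
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]          # e.g. Folder.WF:wf_name.ST:s_name
            params[section] = {}
        elif "=" in line and section is not None:
            key, _, value = line.partition("=")
            params[section][key.strip()] = value.strip()
    return params

sample = """
[Clients.WF:wf_daily_load.ST:s_load_orders]
$$CLIENT_ID=ACME
$$SRC_DIR=/data/in/acme
[Clients.WF:wf_daily_load.ST:s_load_claims]
$$CLIENT_ID=ACME
$$LOAD_TYPE=incremental
"""

parsed = parse_param_file(sample)
print(parsed["Clients.WF:wf_daily_load.ST:s_load_orders"]["$$CLIENT_ID"])  # ACME
```

Keeping one file per client this way lets the same mapping run unchanged against different sources and directories, which is the point of the parameter files above.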

Environment: Informatica PowerCenter 9.6, Netezza, SQL, Oracle 11g/10g, PL/SQL, UNIX, Windows.

Confidential, Charlotte, NC

ETL Developer

Responsibilities:

  • As an ETL Developer on the CISCM Metrics project, was involved in all phases of the project from inception to completion and performed different roles in different phases.
  • Converted specifications to programs and data mappings in an Informatica Cloud ETL environment.
  • Performed design, development, testing, and implementation of ETL processes using Informatica Cloud.
  • Detailed study and data profiling of all the underlying information security application systems and understood the information security data models. Identified and captured the right metadata from source systems.
  • Developed the proto-type and delivered three months of historical data in development environment to the business users for a proof of concept.
  • Identified the field and column level data quality issues of the source systems to drive the data quality, data cleansing and error checking in the ETL packages.
  • Analyzed and performed data mapping, which involved identifying source data fields, target entities, their lookup table IDs, and translation rules.
  • Used Informatica DQ to perform unit testing and create mapplets that are imported and used in PowerCenter Designer.
  • Designed and developed ETL packages using SQL Server Integration Services (SSIS) to load the data from SQL server, XML files to SQL Server database through custom C# script tasks.
  • Filtered bad data from legacy systems using T-SQL, implemented constraints and triggers into new system for data consistency.
  • Created T-SQL stored procedures, functions, triggers, cursors and tables.
  • Developed Drill-through, Drill-down, sub Reports, Charts, Matrix reports, Linked reports using SQL Server Reporting Services (SSRS).
  • Involved with Query Optimization to increase the performance of the Reports.
  • Co-ordinated with business users to perform the User Acceptance Test (UAT). Prepared migration and deployment plans for Production deployment.
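
The field- and column-level data-quality checks and bad-data filtering described above follow one common pattern: validate each incoming row against simple rules and route failures to a reject table instead of the target. An illustrative Python/SQLite sketch (the rules and table names are invented, not from the project above):

```python
import re
import sqlite3

# Hypothetical per-column validation rules.
RULES = {
    "ssn":    lambda v: bool(re.fullmatch(r"\d{3}-\d{2}-\d{4}", v or "")),
    "amount": lambda v: v is not None and v >= 0,
}

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_claims (claim_id INTEGER, ssn TEXT, amount REAL);
CREATE TABLE tgt_claims (claim_id INTEGER, ssn TEXT, amount REAL);
CREATE TABLE rej_claims (claim_id INTEGER, ssn TEXT, amount REAL, reason TEXT);
INSERT INTO stg_claims VALUES (1, '123-45-6789', 50.0),
                              (2, 'bad-ssn', 75.0),
                              (3, '987-65-4321', -10.0);
""")

rows = conn.execute("SELECT claim_id, ssn, amount FROM stg_claims").fetchall()
for claim_id, ssn, amount in rows:
    failed = [col for col, ok in (("ssn", RULES["ssn"](ssn)),
                                  ("amount", RULES["amount"](amount))) if not ok]
    if failed:
        # Route bad rows to the reject table with the reason recorded.
        conn.execute("INSERT INTO rej_claims VALUES (?, ?, ?, ?)",
                     (claim_id, ssn, amount, ",".join(failed)))
    else:
        conn.execute("INSERT INTO tgt_claims VALUES (?, ?, ?)",
                     (claim_id, ssn, amount))

print(conn.execute("SELECT COUNT(*) FROM tgt_claims").fetchone()[0],
      conn.execute("SELECT COUNT(*) FROM rej_claims").fetchone()[0])  # 1 2
```

In the actual project this routing was done with T-SQL constraints and SSIS conditional splits rather than Python; the sketch only shows the shape of the check-and-reject flow.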

Environment: SQL Server, T-SQL, SSIS, SSAS, SSRS, Windows 7, Visual Studio / 2013, C#, VB.Net, Team Foundation Server.
