
Sr. ETL Developer Resume

Reston, VA

CAREER SUMMARY:

  • 10+ years of IT experience in data warehousing and business intelligence, developing ETL solutions with Informatica.
  • Clear understanding of data warehousing concepts, with emphasis on ETL and full life-cycle development including requirement analysis, design, development, and implementation.
  • Seasoned ETL developer with 8+ years of experience in the analysis, design, development, testing, and implementation of business application systems for retail, banking, insurance, and manufacturing.
  • Experienced with Tableau Desktop and Tableau Server, with a good understanding of Tableau architecture.
  • Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
  • Strong experience in Extraction, Transformation and Loading (ETL) of data from various sources into data warehouses and data marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager) and PowerConnect as ETL tools on Oracle, DB2, and SQL Server databases.
  • Extensive working experience with PowerCenter server components and client tools: Designer, Workflow Manager, Repository Manager, and Workflow Monitor.
  • Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, systems integration, and user acceptance testing.
  • Experience with dimensional modeling using star and snowflake schemas.
  • Understand business rules from high-level design specifications and implement the corresponding data transformation logic.
  • Created UNIX shell scripts to run Informatica workflows and control the ETL flow.
  • Extensive experience in designing and developing complex mappings with varied transformation logic, such as Unconnected and Connected Lookup, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.
  • Ability to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities, flexibility in work schedules, and good communication skills.

TECHNICAL SKILLS:

ETL Tools: Informatica 6.x/7.x/8.x/9.x, PowerExchange 8.6.1, DataStage 8.1/8.5

Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling using the Erwin tool.

Testing Tools: HP Quality Center, Silk Test, SCM tools etc.

RDBMS: Oracle 11g/10g/9i/8.x, SQL Server 2008/2005/2000, MySQL, DB2, MS Access

Database Tools: TOAD, SQL*PLUS, SQL Developer, SQL*Loader, Teradata SQL Assistant

Languages: SQL (2012 and 2014), PL/SQL, Shell Scripting, C, C++, Perl, Python.

Operating Systems: MS-DOS, Windows 7/Vista/XP/2003/2000/NT, UNIX, AIX, Sun Solaris

PROFESSIONAL EXPERIENCE:

Confidential, Reston, VA

Sr. ETL Developer

Responsibilities:

  • Involved in all phases of the SDLC, from requirement gathering, design, development, and testing through production deployment, user support, and production support.
  • Created new mapping designs using various tools in Informatica Designer, including Source Analyzer, Warehouse Designer, Mapplet Designer, and Mapping Designer.
  • Developed mappings using the needed transformations in Informatica according to technical specifications.
  • Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis and delivering the transformed data to coordinated data marts.
  • Wrote reports using Tableau Desktop to extract data for analysis using filters based on the business use case.
  • Code reviews of ETL and SQL processes. Worked on upgrading Informatica from version 9.6.1 to 10.1.
  • Developed UNIX Shell scripts to execute the workflows using PMCMD utility and used Autosys scheduler for automation of ETL processes.
  • Scheduled Informatica Cloud Services jobs using the Informatica Cloud task scheduler.
  • Developed Teradata stored procedures, views, and BTEQ scripts, and participated in code review meetings.
  • Involved in Implementation of SCD1 and SCD2 data load strategies.
  • Designed and developed several SQL Server Stored Procedures, Triggers and Views.
  • Created complex mappings that involved implementation of Business Logic to load data in to staging area.
  • Used Informatica reusability at various levels of development.
  • Worked with Informatica team members to design, document, and configure the Informatica MDM Hub to support loading, cleansing, and publication of MDM data.
  • Made use of various PowerCenter Designer transformations: Source Qualifier, Connected and Unconnected Lookup, Expression, Filter, Router, Sorter, Aggregator, Joiner, Rank, Sequence Generator, Union, and Update Strategy.
  • Implemented various loads like daily, weekly, and quarterly loads and on demand load using Incremental loading strategy and concepts of Change Data Capture (CDC).
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Extensively used the Workflow Manager tasks like Session, Event-Wait, Timer, Command, Decision, Control and E-mail while creating worklets/workflows.
  • Worked with “pmcmd” command line program to communicate with the Informatica server, to start, stop and schedule workflows.
  • During the course of the project, participated in meetings with client and data/ETL architects to propose new ideas for performance tuning and gather new requirements.
  • Developed mappings/sessions using Informatica Power Center 8.6 for data loading.
  • Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
  • Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor.
  • Built reports according to user requirements.
  • Extracted data from Oracle and SQL Server, then used Teradata for data warehousing.
  • Implemented slowly changing dimension methodology for accessing the full history of accounts.
  • Wrote shell scripts to run workflows in the UNIX environment.
  • Performed performance tuning at the source, target, mapping, and session levels.
  • Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
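
The workflow-execution bullets above can be sketched as a small UNIX wrapper around Informatica's pmcmd utility. This is a minimal illustration, not the project's actual script: the service, domain, folder, and workflow names are placeholders, and pmcmd option syntax can vary by PowerCenter version. A DRY_RUN switch prints the command instead of invoking it, so the sketch works without an Informatica installation.

```shell
#!/bin/sh
# Hypothetical wrapper that starts an Informatica workflow with pmcmd and
# waits for it to finish. All names below are placeholders.

run_workflow() {
    folder="$1"
    workflow="$2"
    # -pv names an environment variable holding the password, so the
    # password itself never appears on the command line.
    cmd="pmcmd startworkflow \
        -sv ${INFA_SERVICE:-IS_DEV} \
        -d ${INFA_DOMAIN:-Domain_DEV} \
        -u ${INFA_USER:-etl_user} -pv INFA_PASSWD \
        -f $folder -wait $workflow"

    if [ "${DRY_RUN:-0}" = "1" ]; then
        echo "$cmd"                              # show what would run
    else
        $cmd || { echo "workflow $workflow failed" >&2; return 1; }
    fi
}

# Dry-run demonstration:
DRY_RUN=1
out=$(run_workflow SALES_DM wf_daily_sales_load)
echo "$out"
```

In production such a wrapper would typically be invoked by a scheduler such as Autosys, with its exit code used to gate downstream jobs.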

Environment: Informatica PowerCenter 9.6.1/10.1, Informatica Cloud, Informatica Data Quality (IDQ) 9.6.1, Informatica MDM 9.5, Oracle 11g, DB2, Teradata 14, Tableau Desktop 9.3, Teradata SQL Assistant, Teradata Manager, BTEQ, MLOAD, FLOAD, Autosys, Cognos, Erwin Designer, SQL, PL/SQL, UNIX, MS SQL Server 2016.

Confidential, Duluth, GA

Sr. ETL Developer

Responsibilities:

  • Creating new repositories from scratch, backup and restore.
  • Performed Informatica upgrade from 9.0.1 to 9.5
  • Created Groups, roles, privileges and assigned them to each user group.
  • Migrated code changes from Dev to QA and from QA to Production.
  • Worked on SQL queries to query the Repository DB to find the deviations from Company’s ETL Standards for the objects created by users such as Sources, Targets, Transformations, Log Files, Mappings, Sessions and Workflows.
  • Used pre-session and post-session tasks to send e-mail to various business users through the Workflow Manager.
  • Leveraged existing PL/SQL scripts for daily ETL operations.
  • Ensured that all support requests were properly approved, documented, and communicated using the MQC tool; documented common issues and resolution procedures.
  • Extensively used Informatica client tools: PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Extracted data from various heterogeneous sources such as Oracle and flat files.
  • Developed complex mappings using the Informatica PowerCenter tool.
  • Extracted data from Oracle, flat files, and Excel files, and applied Joiner, Expression, Aggregator, Lookup, Stored Procedure, Filter, Router, and Update Strategy transformations to load data into the target systems.
  • Created Sessions, Tasks, Workflows and worklets using Workflow manager.
  • Worked with the data modeler in developing star schemas.
  • Extensively involved in enhancing and managing Unix Shell Scripts.
  • Developed workflow dependencies in Informatica using Event Wait and Command tasks.
  • Involved in analyzing the existence of source feeds in the existing CSDR database.
  • Involved in converting the business requirement into technical design document.
  • Documenting the macro logic and working closely with Business Analyst to prepare BRD.
  • Involved in requirement gathering for procuring new source feeds.
  • Involved in setting up SFTP setup with the internal bank management.
  • Building Unix scripts in cleaning up the source files.
  • Involved in loading all the sample source data using SQL*Loader and scripts.
  • Creating Informatica workflows to load the source data into CSDR.
  • Involved in creating various Unix script used during ETL load process.
  • Handled a high volume of day-to-day Informatica workflow migrations.
  • Periodically cleaned up Informatica repositories.
  • Monitored the daily load and shared the statistics with the QA team.
  • Reviewed Informatica ETL design documents and worked closely with development to ensure correct standards were followed.
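
The SQL*Loader bullet above can be illustrated with a control file plus invocation. This is a hedged sketch: the table, column, and file names are hypothetical, and the Oracle connect string is deliberately left to the environment. The sqlldr call is only printed, not executed.

```shell
#!/bin/sh
# Hypothetical SQL*Loader setup for staging a comma-delimited sample feed.
# Table, column, and file names are illustrative placeholders.

cd "$(mktemp -d)"   # scratch directory for the generated files

cat > stg_customers.ctl <<'EOF'
LOAD DATA
INFILE 'customers.dat'
APPEND
INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(cust_id, cust_name, created_dt DATE 'YYYY-MM-DD')
EOF

# Invocation sketch; the connect string comes from the environment.
# direct=true requests a direct-path (bulk) load.
cmd='sqlldr userid=$ORA_CONN control=stg_customers.ctl log=stg_customers.log direct=true'
echo "$cmd"        # dry run: print rather than execute
```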

Environment: Informatica PowerCenter 9.5, MS SQL Server 2008, Erwin 4.0, Oracle 11g/10g, TOAD 9.x, Quality Center, Teradata 13.10, and UNIX.

Confidential, Tampa, FL

ETL Developer

Responsibilities:

  • Designed the scheduling structure of Informatica jobs to be executed within the operational calendar.
  • Extensively used the Slowly Changing Dimensions-Type II in various data mappings to load dimension tables in Data warehouse.
  • Proactively worked to analyze and resolve all Unit testing and UAT issues.
  • Facilitate/lead reviews (walkthroughs) of technical specifications and program code with other members of the technical team.
  • Involved in extensive performance tuning by determining bottlenecks using Debugger at various points like targets, sources, mappings, sessions or system.
  • Wrote technical documentation and provided routine production ETL process support.
  • Developed new components in Informatica Data Integration Hub (DIH), a recent Informatica tool, and gained a good understanding of DIH components and concepts.
  • Developed Cloud Services tasks (replication/synchronization/mapping configuration) to load data into Salesforce (SFDC) objects.
  • Designed, developed, maintained, and supported data warehouse and OLTP processes via Extract, Transform and Load (ETL) software using Informatica, shell scripts, DB2 UDB, and Autosys.
  • Responsible for user administration and for maintaining the Informatica Cloud Services Secure Agent on a UNIX server for the Dev/QA environments.
  • Developed Informatica mappings, mapping configuration tasks, and taskflows using Informatica Cloud Services (ICS).
  • Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.
  • Understood and performed data analysis, requirement gathering, and design and development of code.
  • Designed ETL high level work flows and documented technical design documentation (TDD) before the development of ETL components to load DB2 from Flat Files, Oracle, DB2 systems to build Type 2 EDW using Change data capture.
  • Ran business workshops for requirement gathering, explained Tableau do's and don'ts to the business, and prepared documentation and end-user rollout.
  • Responsible for developing and maintaining ETL jobs, including ETL implementation and enhancements, testing and quality assurance, troubleshooting issues and ETL/Query performance tuning
  • Participate in design and analysis sessions with business analysts, source-system technical teams, and end users.
  • Involved in Performance tuning for sources, targets, mappings and sessions.
  • Helped IT reduce the cost of maintaining the on-campus Informatica PowerCenter servers by migrating the code to Informatica Cloud Services.
  • Worked with Master Data Management (MDM) team to load data from external source systems to MDM hub.
  • Manage and expand current ETL framework for enhanced functionality and expanded sourcing.
  • Utilized Informatica IDQ to complete initial data profiling and to match and remove duplicate data.
  • Translated business requirements into ETL and report specifications. Performed error handling using session logs.
  • Analyzed data using complex SQL queries, across various databases.
  • Migrated mappings, sessions, and workflows from development to testing and then to Production environments.
  • Involved in creating database objects like tables, views, procedures, triggers, and functions using T-SQL to provide definition, structure and to maintain data efficiently.
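
The Type II slowly-changing-dimension loads mentioned above follow a common two-step pattern: expire the current row when a tracked attribute changes, then insert the new version with an open-ended effective range. The sketch below writes that pattern out as plain SQL with hypothetical table and column names; in the project itself this logic lived inside Informatica mappings (Lookup plus Update Strategy), not hand-written SQL.

```shell
#!/bin/sh
# Generates an illustrative SCD Type II load script. All object names
# (dim_customer, stg_customer, etc.) are placeholders, not project tables.

cd "$(mktemp -d)"   # scratch directory

cat > scd2_customer.sql <<'EOF'
-- Step 1: close out current rows whose tracked attributes changed
UPDATE dim_customer d
   SET d.eff_end_dt  = TRUNC(SYSDATE) - 1,
       d.current_flg = 'N'
 WHERE d.current_flg = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.cust_id = d.cust_id
                  AND (s.cust_name <> d.cust_name
                       OR s.segment <> d.segment));

-- Step 2: insert new versions (covers both new and changed customers,
-- since changed ones no longer have a current row after Step 1)
INSERT INTO dim_customer
      (cust_key, cust_id, cust_name, segment,
       eff_start_dt, eff_end_dt, current_flg)
SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.cust_name, s.segment,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.cust_id = s.cust_id
                      AND d.current_flg = 'Y');
EOF

echo "generated scd2_customer.sql"
```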

Environment: Informatica PowerCenter 9.1.0/9.5, Flat Files, Oracle 11i, Oracle 11, Actimize, Autosys, TOAD, MS Excel macros.

Confidential, NJ

ETL Developer

Responsibilities:

  • Worked with business analysts, analyzed specifications, and identified source data that needed to be moved to the data warehouse; participated in design team and user requirement gathering meetings.
  • Worked on Informatica - Repository Manager, Designer, Workflow Manager & Workflow Monitor.
  • Involved in discussing Requirement Clarifications with multiple technical and Business teams.
  • Performed Informatica upgrade from V9.1 to 9.5.
  • Creation and maintenance of Informatica users and privileges.
  • Migration of Informatica Mappings/Sessions/Workflows from Dev, QA to Prod environments.
  • Documented the LDAP configuration process and worked closely with Informatica Technical support on some of the issues.
  • Fixed all workflow failures in unit testing and system testing.
  • Scheduled all ETL workflows for the parallel-run comparison.
  • Involved in preparing the migration list inventory.
  • Involved in requirement gathering for redesign candidates
  • Worked along with Informatica professionals to resolve Informatica upgrade issues.
  • Monitored disk space and periodically cleaned up unwanted logs.
  • Worked with BA in the QA phase of testing.
  • Worked on Informatica Schedulers to schedule the workflows.
  • Scheduled batch jobs using Autosys to run the workflows.
  • Extensively involved in ETL testing, Created Unit test plan and Integration test plan to test the mappings, created test data. Use of debugging tools to resolve problems.
  • Used workflow monitor to monitor the jobs, reviewed error logs that were generated for each session, and rectified any cause of failure.
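
The log-cleanup work above is the kind of task usually handled by a small cron-driven shell script. Below is a minimal sketch under stated assumptions: the file pattern and the 14-day retention window are invented for illustration, and the demonstration runs against a scratch directory rather than a real Informatica log area.

```shell
#!/bin/sh
# Sketch of a periodic cleanup job: delete log files older than a
# retention window. Paths and the 14-day window are assumptions.

clean_logs() {
    dir="$1"
    days="$2"
    # -mtime +N matches files last modified more than N days ago
    find "$dir" -type f -name '*.log' -mtime +"$days" -exec rm -f {} \;
}

# Self-contained demonstration against a scratch directory:
demo=$(mktemp -d)
touch "$demo/current_session.log"
touch -t 202001010000 "$demo/old_session.log"   # backdated log file
clean_logs "$demo" 14
ls "$demo"
```

A real deployment would point this at the server's session and workflow log directories and schedule it from cron or Autosys.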

Environment: Informatica Power Center 9.1/8.6, Oracle 11g, PL/SQL, Autosys, SQL, Teradata, SQL* LOADER, TOAD, Shell Scripting

Confidential, Vinings, GA

ETL Developer

Responsibilities:

  • Worked with External vendors to understand the marketing needs and identify the source data for mapping the requirement.
  • Set up the SFTP connection between the external vendors and the Honeywell security team to transfer the reports securely.
  • Pulled data for Connected Home reports using both the Informatica and Pentaho ETL tools.
  • Created stored procedures and functions to improve job performance.
  • Troubleshooting the job failure during the daily operation.
  • Converted existing Informatica ETL jobs into Pentaho jobs.
  • Participated in performance tuning at the database, transformation, and job levels.
  • Created Pentaho jobs to load data sequentially and in parallel for initial and incremental loads.
  • Used various PDI steps to cleanse and load the data per business needs.
  • Automated all Excel inputs using Pentaho jobs for dashboard reporting.
  • Gathered data sources for the various business requirements.
  • Integrated data from APIs using Informatica.
  • Created jobs to process JSON data obtained through an API (App Annie).
  • Created jobs to integrate data from the Salesforce API into our reporting database.
  • Worked with external vendors on data needs and provisioned the data as required.
  • Worked in Agile environment, with daily scrum and ticket updates.
  • Worked on complex ETL transformations and jobs, including processing data from Salesforce web services.

Environment: Informatica 10.0.1, Pentaho PDI 6.1, Autosys Scheduler, WinSCP

Confidential, Cleveland, OH

ETL Developer

Responsibilities:

  • Involved in business requirements analysis and design; prepared technical design documents.
  • Used Erwin for logical and Physical database modeling of the staging tables, worked with the Data Modeler and contributed to the Data Warehouse and Data Mart design and specifications.
  • Developed technical design specifications to load the data into the data mart tables, conforming to the business rules.
  • Involved in design and development of complex ETL mappings and stored procedures in an optimized manner.
  • Cleansed the source data, extracted and transformed data with business rules, and built reusable components such as Mapplets, Reusable transformations and sessions etc.
  • Involved in loading data from source tables to ODS (Operational Data Store) tables using transformation and cleansing logic in Informatica.
  • Involved in analysis of source systems, business requirements and identification of business rule and responsible for developing, support and maintenance for the ETL process.
  • Created / updated ETL design documents for all the Informatica components changed.
  • Extracted data from heterogeneous sources such as Oracle, XML, DB2, and flat files, performed data validation in the staging area, and loaded the data into the Oracle 11g data warehouse.
  • Used Informatica B2B Data Transformation to read unstructured and semi-structured data and load it to the target.
  • Performed code walkthroughs to ensure the programs were in accordance with ASA standards.
  • Enhancement, Testing and Documentation of projects.
  • Handled failed jobs in the production environment within SLA.
  • Developed complex Informatica mappings to load data from various sources using different transformations such as Source Qualifier, Connected and Unconnected Lookup, Update Strategy, Expression, Aggregator, Joiner, Filter, Normalizer, Rank, and Router.
  • Developed mapplets and worklets for reusability.
  • Developed workflow tasks like reusable Email, Event wait, Timer, Command, Decision.
  • Implemented partitioning and bulk loads for loading large volume of data.
  • Used Informatica debugging techniques to debug the mappings, and used session log files and bad files to trace errors that occurred while loading.
  • Involved in performance tuning of mappings, transformations, and workflow sessions to optimize session performance.
  • Created Materialized views for summary tables for better query performance.
  • Implemented weekly error tracking and correction process using Informatica.
  • Developed Documentation for all the routines (Mappings, Sessions and Workflows).
  • Creating Test cases and detailed documentation for Unit Test, System, Integration Test and UAT to check the data quality.
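
The summary-table bullet above refers to a standard Oracle pattern: precompute aggregates in a materialized view so reporting queries can be rewritten against it. The sketch below uses hypothetical fact and dimension names; note that REFRESH FAST ON COMMIT would additionally require materialized view logs on the base tables, so this sketch uses an on-demand complete refresh.

```shell
#!/bin/sh
# Writes an illustrative materialized-view DDL script. Object names
# (fct_sales, dim_date, dim_product) are placeholders.

cd "$(mktemp -d)"   # scratch directory

cat > mv_sales_summary.sql <<'EOF'
CREATE MATERIALIZED VIEW mv_sales_summary
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
  ENABLE QUERY REWRITE
AS
SELECT d.fiscal_month,
       p.product_line,
       SUM(f.sales_amt) AS total_sales,
       COUNT(*)         AS txn_count
  FROM fct_sales f
  JOIN dim_date d    ON d.date_key = f.date_key
  JOIN dim_product p ON p.prod_key = f.prod_key
 GROUP BY d.fiscal_month, p.product_line;
EOF

echo "generated mv_sales_summary.sql"
```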

Environment: Informatica PowerCenter 8.6/9.0.1, Oracle 10g, UNIX (AIX), WinSQL, Windows 7, Flat files, MS SQL Server 2008, MS Access, Autosys, UltraEdit.
