Sr. ETL Developer Resume
Milwaukee, WI
SUMMARY
- A seasoned ETL developer with 8+ years of IT experience in the analysis, design, development, testing and implementation of business application systems for Retail, Banking, Insurance and Manufacturing.
- Strong data warehousing ETL experience using Informatica PowerCenter 10.0.1/9.1/8.6.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
- Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing and documentation.
- Strong experience in extracting, transforming and loading (ETL) data from various sources into data warehouses and data marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager) and PowerConnect as the ETL tools on Oracle, DB2 and SQL Server databases.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
- Experience with dimensional modelling using star schema and snowflake models.
- Understands business rules from high-level design specifications and implements the corresponding data transformation logic.
- Created UNIX shell scripts to run Informatica workflows and control the ETL flow.
- Extensive experience designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookup, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner and Update Strategy.
- Able to meet deadlines and handle multiple tasks; decisive, with strong leadership qualities, flexible in work schedules and good communication skills.
PROFESSIONAL EXPERIENCE
Confidential, Milwaukee, WI
Sr. ETL Developer
Responsibilities:
- Responsible for developing and maintaining ETL jobs, including ETL implementation and enhancements, testing and quality assurance, troubleshooting issues and ETL/Query performance tuning
- Participate in design and analysis sessions with business analysts, source-system technical teams, and end users.
- Wrote technical documentation and provided routine production ETL process support.
- Developed new components in Informatica Data Integration Hub (DIH), a recent Informatica tool, with a good understanding of DIH components and concepts.
- Developed Cloud Services tasks (Replication, Synchronization, Mapping Configuration) to load data into Salesforce (SFDC) objects.
- Designed, developed, maintained and supported data warehouse and OLTP processes via Extract, Transform and Load (ETL) software using Informatica, shell scripts, DB2 UDB and Autosys.
- Responsible for user administration and for maintaining the Informatica Cloud Services Secure Agent on the UNIX server for the Dev/QA environments.
- Developed Informatica mappings, mapping configuration tasks and taskflows using Informatica Cloud Services (ICS).
- Automated and scheduled the cloud jobs to run daily, with email notifications for any failures.
- Performed data analysis, requirements gathering, and design and development of code.
- Involved in Performance tuning for sources, targets, mappings and sessions.
- Helped IT reduce the cost of maintaining the on-campus Informatica PowerCenter servers by migrating the code to Informatica Cloud Services.
- Worked with Master Data Management (MDM) team to load data from external source systems to MDM hub.
- Managed and expanded the current ETL framework for enhanced functionality and expanded sourcing.
- Used Informatica IDQ to complete initial data profiling and to match and remove duplicate data.
- Translated business requirements into ETL and report specifications. Performed error handling using session logs.
- Analyzed data using complex SQL queries across various databases.
- Migrated mappings, sessions, and workflows from development to testing and then to Production environments.
- Involved in creating database objects such as tables, views, procedures, triggers and functions using T-SQL to provide definition and structure and to maintain data efficiently.
- Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis and delivering the transformed data to coordinated data marts.
- Wrote reports using Tableau Desktop to extract data for analysis, applying filters based on the business use case.
- Performed code reviews of ETL and SQL processes. Worked on upgrading Informatica from version 9.6.1 to 10.1.
- Developed UNIX shell scripts to execute the workflows using the pmcmd utility and used the Autosys scheduler to automate ETL processes (a sketch follows this list).
- Scheduled the Informatica Cloud Services jobs using the Informatica Cloud task scheduler.
- Developed Teradata stored procedures, views and BTEQ scripts, and participated in code review meetings.
- Involved in Implementation of SCD1 and SCD2 data load strategies.
- Designed and developed several SQL Server Stored Procedures, Triggers and Views.
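A minimal sketch of the kind of UNIX wrapper used to launch a workflow through pmcmd and alert on failure; the integration service, domain, folder, workflow name and mail recipient below are illustrative placeholders, not actual project values:

#!/bin/ksh
# Launch an Informatica workflow through pmcmd and alert on failure.
# Service, domain, folder, workflow and recipient names are placeholders.
WF_NAME=wf_LOAD_SALES_DAILY
pmcmd startworkflow -sv INT_SVC_DEV -d Domain_Dev \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f FLDR_SALES -wait "$WF_NAME"
RC=$?
if [ $RC -ne 0 ]; then
    echo "Workflow $WF_NAME failed with return code $RC" | \
        mailx -s "ETL FAILURE: $WF_NAME" etl_support@example.com
fi
exit $RC   # a non-zero exit lets Autosys mark the job as failed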
Environment: Informatica PowerCenter 9.6.1/10.1, Informatica Cloud, Informatica Data Quality (IDQ) 9.6.1, Informatica MDM 9.5, Oracle 11g, DB2, Teradata 14, Tableau Desktop 9.3, Teradata SQL Assistant, Teradata Manager, BTEQ, MLOAD, FLOAD, Autosys, Cognos, Erwin Designer, SQL, PL/SQL, UNIX, MS SQL Server 2016.
Confidential, NJ
ETL Developer
Responsibilities:
- Worked with external vendors to understand marketing needs and identify the source data for mapping the requirements.
- Set up the SFTP connection between the external vendors and the Honeywell security team to transfer the reports securely (a sketch follows this list).
- Data for Connected Home reports is pulled using both the Informatica and Pentaho ETL tools.
- Created stored procedures and functions to improve job performance.
- Troubleshot job failures during daily operations.
- Converted the existing Informatica ETL jobs into Pentaho jobs.
- Participated in performance tuning at the database, transformation and job levels.
- Created Pentaho jobs for loading data sequentially and in parallel for initial and incremental loads.
- Used various PDI steps to cleanse and load the data per business needs.
- Automated all Excel inputs using Pentaho jobs for dashboard reporting.
- Gathered data sources for the various business requirements.
- Integrated data from APIs using Informatica.
- Created jobs to process JSON data obtained through the App Annie API.
- Created jobs to integrate data from the Salesforce API into the reporting database.
- Worked with external vendors on data needs and provisioned the data accordingly.
- Worked in Agile environment, with daily scrum and ticket updates.
- Worked on complex ETL transformations and jobs, including processing data from Salesforce web services.
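A minimal sketch of a batch-mode SFTP transfer of a vendor report, assuming key-based authentication; the host, user, key and paths are illustrative placeholders:

#!/bin/ksh
# Push the generated report to the vendor drop zone over SFTP in batch mode.
# Host, user, key and paths are illustrative placeholders.
REPORT=/data/reports/connected_home_daily.csv
sftp -i ~/.ssh/vendor_key -b - vendor_user@sftp.vendor.example.com <<EOF
cd /inbound/reports
put $REPORT
bye
EOF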
Environment: Informatica 10.0.1, Pentaho PDI 6.1, Autosys Scheduler, WinSCP.
Confidential, NJ
Informatica Developer
Responsibilities:
- Created new repositories from scratch and performed repository backup and restore.
- Performed Informatica upgrade from 9.0.1 to 9.5
- Created Groups, roles, privileges and assigned them to each user group.
- Code change migration from Dev to QA and QA to Production
- Wrote SQL queries against the repository database to find deviations from the company's ETL standards in user-created objects such as Sources, Targets, Transformations, Log Files, Mappings, Sessions and Workflows.
- Used pre-session and post-session commands to send e-mail to various business users through the Workflow Manager.
- Leveraged the existing PL/SQL scripts for daily ETL operations.
- Ensured that all support requests were properly approved, documented and communicated using the MQC tool; documented common issues and resolution procedures.
- Extensively used the Informatica client tools - PowerCenter Designer, Workflow Manager, Workflow Monitor and Repository Manager.
- Extracted data from various heterogeneous sources such as Oracle and flat files.
- Developed complex mappings using the Informatica PowerCenter tool.
- Extracted data from Oracle, flat files and Excel files, and applied Joiner, Expression, Aggregator, Lookup, Stored Procedure, Filter, Router and Update Strategy transformations to load data into the target systems.
- Created Sessions, Tasks, Workflows and worklets using Workflow manager.
- Worked with Data modeler in developing STAR Schemas
- Extensively involved in enhancing and managing Unix Shell Scripts.
- Developed workflow dependencies in Informatica using the Event Wait and Command tasks.
- Involved in analyzing the existence of the source feeds in the existing CSDR database.
- Involved in converting the business requirements into technical design documents.
- Documented the macro logic and worked closely with the Business Analyst to prepare the BRD.
- Involved in requirement gathering for procuring new source feeds.
- Involved in setting up SFTP with the internal bank management team.
- Built UNIX scripts to clean up the source files.
- Involved in loading all the sample source data using SQL*Loader and scripts (a sketch follows this list).
- Created Informatica workflows to load the source data into CSDR.
- Involved in creating the various UNIX scripts used during the ETL load process.
- Handled a high volume of day-to-day Informatica workflow migrations.
- Periodically cleaned up Informatica repositories.
- Monitored the daily loads and shared the statistics with the QA team.
- Reviewed Informatica ETL design documents and worked closely with development to ensure correct standards were followed.
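A minimal sketch of a SQL*Loader wrapper for loading a cleaned sample feed into a staging table; the connect string, control file and paths are illustrative placeholders:

#!/bin/ksh
# Load a cleaned sample feed into a staging table with SQL*Loader.
# Connect string, control file and paths are illustrative placeholders.
FEED=/data/inbound/sample_feed.dat
sqlldr userid=$ORA_USER/$ORA_PWD@CSDRDEV \
       control=stg_sample_feed.ctl \
       data=$FEED \
       log=stg_sample_feed.log \
       bad=stg_sample_feed.bad \
       errors=100
if [ $? -ne 0 ]; then
    echo "SQL*Loader load failed for $FEED" >&2
    exit 1
fi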
Environment: Informatica PowerCenter 9.1.0/9.5, Flat Files, Oracle 11i, Oracle 11, Actimize, Autosys, Toad, MS Excel Macros.
Confidential, NJ
ETL Developer
Responsibilities:
- Worked with the Business Analyst to analyze specifications and identify the source data to be moved to the data warehouse; participated in design team and user requirement gathering meetings.
- Worked on Informatica - Repository Manager, Designer, Workflow Manager & Workflow Monitor.
- Involved in discussing Requirement Clarifications with multiple technical and Business teams.
- Performed Informatica upgrade from V9.1 to 9.5.
- Creation and maintenance of Informatica users and privileges.
- Migration of Informatica Mappings/Sessions/Workflows from Dev, QA to Prod environments.
- Documented the LDAP configuration process and worked closely with Informatica Technical support on some of the issues.
- Fixed all workflow failures in unit testing and system testing.
- Scheduled all the ETL workflows for the parallel-run comparison.
- Involved in preparing the migration list inventory.
- Involved in requirement gathering for redesign candidates
- Worked alongside the Informatica professionals to resolve Informatica upgrade issues.
- Monitored disk space issues and periodically cleaned up unwanted logs.
- Worked with BA in the QA phase of testing.
- Worked on Informatica Schedulers to schedule the workflows.
- Scheduled batch jobs using Autosys to run the workflows (a sketch follows this list).
- Extensively involved in ETL testing; created unit and integration test plans to test the mappings, created test data, and used debugging tools to resolve problems.
- Used workflow monitor to monitor the jobs, reviewed error logs that were generated for each session, and rectified any cause of failure.
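A minimal sketch of how such an Autosys command job might be registered through the jil utility; the job name, command path, machine, owner and schedule are illustrative placeholders:

#!/bin/ksh
# Register an Autosys command job that runs the workflow wrapper script.
# Job name, command path, machine, owner and schedule are placeholders.
jil <<EOF
insert_job: DW_WF_LOAD_SALES   job_type: c
command: /apps/etl/scripts/run_wf_load_sales.sh
machine: etl_unix_dev
owner: etluser
date_conditions: 1
days_of_week: all
start_times: "02:00"
alarm_if_fail: 1
EOF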
Environment: Informatica PowerCenter 9.1/8.6, Oracle 11g, PL/SQL, Autosys, SQL, Teradata, SQL*Loader, TOAD, Shell Scripting.
Confidential, NJ
ETL Informatica developer
Responsibilities:
- Involved in all phases of the SDLC: requirements gathering, design, development, testing, production deployment, user training and production support.
- Created new mapping designs using various tools in Informatica Designer such as Source Analyzer, Warehouse Designer, Mapplet Designer and Mapping Designer.
- Developed the mappings using the needed transformations in Informatica according to the technical specifications.
- Created complex mappings implementing business logic to load data into the staging area.
- Used Informatica reusability at various levels of development.
- Developed mappings/sessions using Informatica Power Center 8.6 for data loading.
- Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
- Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor.
- Built reports according to user requirements.
- Extracted data from Oracle and SQL Server, then used Teradata for data warehousing.
- Implemented slowly changing dimension methodology for accessing the full history of accounts.
- Wrote shell scripts to run workflows in the UNIX environment.
- Performed performance tuning at the source, target, mapping and session levels.
- Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
Environment: Informatica 8.6.1, Oracle 11g
Confidential
ETL Informatica developer
Responsibilities:
- Involved in business requirements analysis and design; prepared technical design documents.
- Used Erwin for logical and Physical database modeling of the staging tables, worked with the Data Modeler and contributed to the Data Warehouse and Data Mart design and specifications.
- Developed technical design specifications to load the data into the data mart tables conforming to the business rules.
- Involved in design and development of complex ETL mappings and stored procedures in an optimized manner.
- Cleansed the source data, extracted and transformed data with business rules, and built reusable components such as Mapplets, Reusable transformations and sessions etc.
- Involved in loading the data from source tables to ODS (Operational Data Store) tables using transformation and cleansing logic in Informatica.
- Developed complex Informatica mappings to load the data from various sources using different transformations like source qualifier, connected and unconnected look up, update Strategy, expression, aggregator, joiner, filter, normalizer, rank and router
- Developed mapplets and worklets for reusability.
- Developed workflow tasks like reusable Email, Event wait, Timer, Command, Decision.
- Implemented partitioning and bulk loads for loading large volumes of data.
- Used Informatica debugging techniques to debug the mappings, and used session log files and bad files to trace errors that occurred while loading.
- Involved in performance tuning of mappings, transformations and (workflow) sessions to optimize session performance.
- Created Materialized views for summary tables for better query performance.
- Implemented a weekly error tracking and correction process using Informatica (a sketch follows this list).
- Developed Documentation for all the routines (Mappings, Sessions and Workflows).
- Created test cases and detailed documentation for unit, system, integration and UAT testing to check data quality.
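A minimal sketch of the kind of shell script that could drive the weekly error summary from session logs and reject files, assuming a default PowerCenter shared directory layout; the paths and mail recipient are illustrative placeholders:

#!/bin/ksh
# Weekly error summary from session logs and .bad (reject) files.
# Directory layout and mail recipient are illustrative placeholders.
LOG_DIR=/apps/infa/infa_shared/SessLogs
BAD_DIR=/apps/infa/infa_shared/BadFiles
{
  echo "Rejected rows per bad file (last 7 days):"
  find "$BAD_DIR" -name '*.bad' -mtime -7 | while read f; do
      echo "$(basename "$f"): $(wc -l < "$f") rows"
  done
  echo
  echo "Session logs containing errors (last 7 days):"
  find "$LOG_DIR" -name '*.log' -mtime -7 -exec grep -l 'ERROR' {} \;
} | mailx -s "Weekly ETL error summary" etl_support@example.com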
Environment: Informatica PowerCenter 8.6/9.0.1, Oracle 10g, UNIX (AIX), WinSQL, Windows 7, Flat Files, MS SQL Server 2008, MS Access, Autosys, UltraEdit.
Confidential
Support Analyst
Responsibilities:
- Developed standard and re-usable mappings and mapplets with various transformations like expression, aggregator, joiner, source qualifier, filter, lookup, stored procedure and router.
- Involved in enhancing the COBOL code and promoting the code into the production systems.
- Involved in scheduling jobs using scheduling tools.
- Coded COBOL, JCL and DB2 programs.
- Handling Change Requests.
- Handling Ad-hoc requests.
- Involved in automated and manual FTP.
- Review of Programs
- Performed code walkthroughs to ensure the programs were in accordance with ASA standards.
- Enhancement, Testing and Documentation of projects.
Environment: Mainframe, DB2, JCL, Oracle 10g, SQL, Informatica 8.6, Quality Center.