
ETL Developer Resume


Austin, TX

PROFESSIONAL SUMMARY:

  • 8+ years of IT experience in data warehousing, business intelligence, and developing ETL solutions with Informatica.
  • Clear understanding of data warehousing concepts with emphasis on ETL and full life cycle development, including requirement analysis, design, development, and implementation.
  • Very strong knowledge of the Informatica PowerCenter suite, including Mapping Designer, Workflow Manager, Workflow Monitor, Admin Console, and Repository Manager.
  • Extensive working experience in the design and development of data warehouses, data marts, and ODS.
  • Hands-on working experience in data warehouse development covering data migration, data conversion, and extraction/transformation/loading, using Informatica PowerCenter to extract and load data into relational databases such as SQL Server, Oracle, Teradata, and DB2.
  • Experienced with Tableau Desktop and Tableau Server, with a good understanding of Tableau architecture.
  • Extensive knowledge of performance tuning, error handling, and the various index types in Teradata.
  • Familiar with Teradata tools and utilities such as BTEQ, FastLoad, MultiLoad, FastExport, TPump, ARC, Queryman, and PMON.
  • Extensive experience in BI solutions (ETL and reporting) using SSIS, SSAS, SSRS, and T-SQL programming with DDL, DML, and DCL commands for various business applications.
  • Expert at developing ETL interfaces using SSIS, data marts/cubes using SSAS, and enterprise reports using SSRS, utilizing the latest versions … and features.
  • Extensive working experience with Informatica PowerCenter server and client tools: Designer, Workflow Manager, Repository Manager, and Workflow Monitor.
  • Expertise in Oracle Data Integrator (ODI) Extract-Load-Transform (E-LT) for high performance and faster responsiveness.
  • Strong analytics/data warehouse experience using SAP BusinessObjects Data Services (BODS)/Data Integrator, including integration of BODS with SAP modules and non-SAP data sources, and use of BODS as an external ETL system between SAP ECC and SAP BW/BI.
  • Expert in extracting and transforming data (ETL) from heterogeneous sources and creating packages using SSIS/DTS, Import/Export Data, Bulk Insert, and the BCP utility.
  • Expertise in creating and managing Event Handlers, Package Configurations, Logging, and System and User-defined Variables for SSIS packages.
  • Experience in converting packages from legacy systems (DTS to SSIS) and in migrating and deploying packages across Dev/Prod environments.
  • Involved in deploying BODS code into the different environments (DEV, ST, and PROD). Developed programs/scripts to automate common pre- and post-conversion tasks.
  • Proficient in Hive and Impala queries for loading and processing data in the Hadoop Distributed File System (HDFS); see the Hive sketch at the end of this summary.
  • Hands-on experience with SSIS data transformation tasks such as Lookups, Fuzzy Lookups, Conditional Splits, Event Handlers, and Error Handlers.
  • Extensively worked with Informatica mapping variables, mapping parameters and parameter files.
  • Extensively worked with Informatica MDM (Master Data Management) to manage high-volume data.
  • Transformed Tableau into a managed service offering for consumption across Corporate Treasury and Corporate Investments.
  • Hands-on experience running projects using scheduling tools such as TWS and AutoSys.
  • Strong knowledge and understanding of data modeling (star and snowflake schemas), ER diagrams, and data flow/process diagrams.
  • Hands-on experience with ODI administration tasks such as installation, configuration, security, space management, and monitoring.
  • Experience working with MapReduce programs on Hadoop for big data workloads.
  • Demonstrated expertise utilizing ETL tools including Oracle Data Integrator, Informatica, and DataStage, and RDBMSs such as Oracle, Sybase, and SQL Server.
  • Experienced in installation, configuration, upgrade, migration, and administration of ODI 10g to 11g.
  • Extensive knowledge of optimization techniques such as concurrent caching, auto memory calculation, partitioning, and pushdown optimization.
  • Worked with Oracle, Teradata, and SQL Server stored procedures, table partitions, triggers, SQL queries, PL/SQL packages, and IBM SQL procedural language, loading data into data warehouses/data marts using Informatica, the DB2 LOAD utility, and SQL*Loader.
  • Expert in working with DataStage Designer, Administrator, and Director.
  • Designed and developed ETL processes using DataStage Designer to load data from Oracle and flat files into a target Oracle data warehouse database.
  • Experience in developing report types such as drill-down, drill-through, matrix, and parameterized reports using SQL Server Reporting Services (SSRS).
  • Experience in Online Analytical Processing (OLAP); created cubes and dimensions for schemas.
  • Migrated and recreated existing dimensions and cubes using a star schema on SQL Server … to achieve the efficiency of SQL Server Analysis Services (SSAS).
  • Strong experience with star and snowflake schemas, dimensional data modeling, facts, dimensions, and slowly changing dimensions; a worked SCD Type 2 sketch follows this summary.
  • Developed shell scripts for invoking Informatica workflows and running batch processes.
  • Extensive experience in designing error handling mechanisms.
  • Excellent analytical skills in understanding clients' organizational structures.
  • Excellent problem-solving skills with a strong technical background; results-oriented team player with excellent communication and interpersonal skills.
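For context on the slowly-changing-dimension work referenced above, here is a minimal SQL sketch of an SCD Type 2 load. All table and column names (stg_customer, dim_customer, eff_date, end_date, current_flag) are hypothetical; on the projects below this logic was built as Informatica mappings with Update Strategy transformations rather than hand-written SQL.

    -- Step 1: expire the current dimension row when a changed record arrives.
    UPDATE dim_customer d
       SET d.end_date     = SYSDATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.name <> d.name OR s.address <> d.address));

    -- Step 2: insert a fresh "current" version for changed and brand-new customers.
    INSERT INTO dim_customer (customer_key, customer_id, name, address,
                              eff_date, end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.name, s.address,
           SYSDATE, TO_DATE('31-DEC-9999', 'DD-MON-YYYY'), 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');

Because step 1 closes out the changed rows first, step 2 picks up both changed and brand-new business keys with a single NOT EXISTS test.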
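The Hive/HDFS work mentioned above follows the usual external-table pattern; this is a small hypothetical sketch, with all table names, columns, and paths illustrative only.

    -- Map an external table over raw delimited files already landed in HDFS.
    CREATE EXTERNAL TABLE IF NOT EXISTS raw_orders (
        order_id    BIGINT,
        customer_id BIGINT,
        amount      DECIMAL(12,2),
        order_dt    STRING)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/landing/orders';

    -- Load one daily partition of a managed, partitioned table (assumed to exist).
    INSERT OVERWRITE TABLE orders PARTITION (load_dt = '2016-01-31')
    SELECT order_id, customer_id, amount, order_dt
      FROM raw_orders
     WHERE order_dt = '2016-01-31';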

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 9.6.1, 9.5, 9.1.0, 8.x; Informatica MDM; IDQ.

Database: Oracle 9i/10g/11g, SQL Server 2014/2008/2005 and Teradata 14.0

Data Modeling Tools: Erwin 4.1

Scheduling Tools: TWS, AutoSys

Languages: SQL, PL/SQL

Scripting Languages: Unix Shell scripting

Operating Systems: Windows, Red Hat Linux

PROFESSIONAL EXPERIENCE:

ETL Developer

Confidential - Austin, TX

Responsibilities:

  • Designed high-level ETL workflows and wrote the technical design documentation (TDD) before developing the ETL components that load DB2 from flat files, Oracle, and DB2 systems to build a Type 2 EDW using change data capture.
  • Created stored procedures and views based on project needs.
  • Developed and coded the real-time and batch-mode loads.
  • Developed a standard framework to handle restartability, auditing, and notification alerts during the ETL load process.
  • Involved in performance tuning and optimization of mappings to manage very large volumes of data.
  • Worked with the Informatica utilities Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Ran business workshops for requirement gathering, walked the business through Tableau do's and don'ts, and prepared training documentation and the end-user rollout.
  • Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, geographic maps, and Gantt charts via the Show Me functionality.
  • Created mappings using transformations such as Joiner, Filter, Aggregator, Lookup, Router, Sorter, Expression, Normalizer, Sequence Generator, and Update Strategy, along with data standardization and the Address Validator.
  • Developed complex ETL mappings on the Informatica 10.x platform as part of the risk data integration efforts.
  • Implemented SCD Type 1 and SCD Type 2 for loading data into data warehouse dimension tables.
  • Implemented error handling for invalid and rejected rows by loading them into error tables; see the reject-handling sketch after this list.
  • Implemented the change data capture process using Informatica PowerExchange.
  • Extensively worked on the batch framework used to schedule all Informatica jobs.
  • Analyzed the sources, transformed and mapped the data, and loaded it into targets using Informatica PowerCenter Designer.
  • Developed complex mappings such as Slowly Changing Dimension Type 2 with timestamping in the Mapping Designer.
  • Used Informatica Workflow Manager to create workflows, database connections, sessions and batches to run the mappings.
  • Deep experience with the design and development of Tableau visualization solutions.
  • Created users, groups, projects, workbooks, and the appropriate permission sets for Tableau Server logins.
  • Prepared dashboards using calculations and parameters in Tableau.
  • Used Variables and Parameters in the mappings to pass the values between mappings and sessions.
  • Created stored procedures, functions, packages, and triggers using PL/SQL; a small PL/SQL sketch follows this list.
  • Implemented restart strategy and error handling techniques to recover failed sessions.
  • Used Unix Shell Scripts to automate pre-session and post-session processes.
  • Performed performance tuning to improve data extraction, data processing, and load times.
  • Worked with data modelers to understand financial data model and provided suggestions to the logical and physical data model.
  • Designed presentations based on the test cases and obtained UAT signoffs.
  • Documented test scenarios as part of unit testing before requesting migration to higher environments, and handled production deployments.
  • Recorded defects in the defect tracker during SIT and UAT.
  • Identified performance bottlenecks and suggested improvements.
  • Performed unit testing on developed jobs to ensure they met the requirements.
  • Handled major production go-live and user acceptance test activities.
  • Created architecture diagrams for the project based on industry standards.
  • Defined escalation process metrics for any aborts and met the SLA for production support tickets.
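A minimal SQL sketch of the reject handling described above; in the actual mappings this routing was done with a Router transformation, and every name here (stg_account, etl_error_rows, the validation rules) is hypothetical.

    -- Route rows that fail basic validation into an error table with a reason,
    -- so the main load can continue instead of aborting.
    INSERT INTO etl_error_rows (src_table, src_key, error_reason, load_ts)
    SELECT 'STG_ACCOUNT',
           s.account_id,
           CASE
               WHEN s.account_id IS NULL  THEN 'MISSING KEY'
               WHEN s.open_date > SYSDATE THEN 'FUTURE OPEN DATE'
               WHEN s.balance < 0         THEN 'NEGATIVE BALANCE'
           END,
           SYSDATE
      FROM stg_account s
     WHERE s.account_id IS NULL
        OR s.open_date > SYSDATE
        OR s.balance < 0;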
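And a minimal PL/SQL sketch of the kind of helper procedure created around these loads; the audit table and procedure names are hypothetical.

    CREATE OR REPLACE PROCEDURE log_load (
        p_job_name IN VARCHAR2,
        p_rows     IN NUMBER,
        p_status   IN VARCHAR2)
    IS
    BEGIN
        -- Record one audit row per job run for restart and reconciliation checks.
        INSERT INTO etl_audit (job_name, row_count, status, logged_at)
        VALUES (p_job_name, p_rows, p_status, SYSDATE);
        COMMIT;
    END log_load;
    /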

Environment: Informatica 10.1.1/9.5, Oracle, XML, SQL Server 2008, Web services, DB2 Mainframe, Maestro (scheduler)

ETL Developer

Confidential, Jacksonville, FL

Responsibilities:

  • Designed the scheduling structure of Informatica jobs to be executed within the operational calendar.
  • Extensively used the Slowly Changing Dimensions-Type II in various data mappings to load dimension tables in Data warehouse.
  • Proactively worked to analyze and resolve all Unit testing and UAT issues.
  • Facilitated/led reviews (walkthroughs) of technical specifications and program code with other members of the technical team.
  • Involved in extensive performance tuning by determining bottlenecks using Debugger at various points like targets, sources, mappings, sessions or system. This led to better session performance.
  • Worked with database operations and Informatica admin teams to fix performance issues.
  • Proactively worked on analyzing the data and data management activities.
  • Developed and troubleshot stored procedures, functions, cursors, and triggers at the database level using PL/SQL.
  • Designed and developed BI user interfaces such as dashboards and reports using SSRS and Tableau.
  • CIMS is the Customer Level Data Collection System that allows the department to collect and analyze accurate and comprehensive information to meet federal and state reporting requirements and to make decisions.
  • Gathered the requirements for CIF claiming from the vendors and documented them.
  • Customer claiming was done via manual upload of files to a drop box for each customer to claim or unclaim from their district.
  • Troubleshot and fixed data quality issues on both the Oracle (ODS, Operational Data Store) and DB2 (BDW, Brokerage Data Warehouse) platforms using Informatica, AutoSys, and Unix shell scripts; see the validation sketch after this list.
  • Analyzed the data against requirements, wrote the techno-functional documentation, and developed complex mappings using Informatica Data Quality (IDQ) 9.6.1.
  • Provide guidance to software development teams regarding production supportability, coding and deployment standards, adherence to security policies etc.
  • Used SAP BODS built-in functions and BODS scripting.
  • Worked with global variables, substitution parameters, and system configurations.
  • Executed BAT files and VB scripts from SAP BODS scripts.
  • Responsible for granting/revoking the access for the users and deployment of the scripts to the production server.
  • Responsible for installation, patching, and upgrades of the server.
  • Provided technical support for database access control, job execution and other database maintenance tasks.
  • Used DataStage Designer to develop parallel jobs that extract data, apply the necessary transformations, and load target tables or produce output files.
  • Reviewed and managed Hadoop log files from multiple machines using Flume.
  • Processed raw data from Kafka in real time and stored the processed data in Hadoop using Spark Streaming (DStreams).
  • SIF is the project that automates the existing manual upload and claiming process.
  • Formatted SSRS reports using global variables and expressions.
  • Source data came from Microsoft SQL Server, flat files, and Oracle; the target data was finally loaded into Oracle.
  • Developed Tableau workbooks from multiple data sources using Data Blending.
  • Developed Tableau visualizations and dashboards using Tableau Desktop.
  • Developed Tableau workbooks to perform year over year, quarter over quarter, YTD, QTD and MTD type of analysis.
  • Extensively worked on change management to promote code to different environments.
  • Validated data in the reports using MicroStrategy and provided updates on business queries.
  • Created Informatica mappings as part of enhancement to fix production bugs.
  • Wrote UNIX shell scripts to automate manual tasks.
  • Used SQL Server Integration Services (SSIS) and its tools to extract, transform, and load data into the data warehouse and for the OLTP side.
  • Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
  • Built and published customized interactive reports and dashboards, with report scheduling, using Tableau Server.
  • Effectively used data blending feature in Tableau to combine different sources.
  • Coordinated with source system owners, monitored day-to-day ETL progress, and designed and maintained the data warehouse target schema (star schema).
  • Involved in code walkthroughs and documentation.
  • Involved in database migration from DB2 to Netezza and fixing post migration issues.
  • Involved in Informatica uplift activities from version 9.1 to 9.6.1 and fixed post-migration issues.
  • Provided on-call support to resolve critical production issues.
  • Mentor offshore team on new code implementations/migrations.
  • Migrated reports using SQL Server Reporting Services.
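A small hypothetical sketch of the data-quality checks described above; the schema and table names (ods.trades, trade_id) are illustrative, and row counts were reconciled by running the same counts on the Oracle and DB2 sides.

    -- Find duplicate business keys that should be unique.
    SELECT trade_id, COUNT(*) AS dup_count
      FROM ods.trades
     GROUP BY trade_id
    HAVING COUNT(*) > 1;

    -- Count rows that violate basic completeness rules.
    SELECT COUNT(*) AS bad_rows
      FROM ods.trades
     WHERE trade_id IS NULL
        OR trade_dt IS NULL;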

Environment: Informatica PowerCenter 9.1/9.6.1, PowerExchange, UNIX, Oracle, DB2, Netezza, MicroStrategy.

ETL BI Developer

Confidential, Atlanta, GA

Responsibilities:

  • Collaborated with business stakeholders on requirements gathering and participated in translating the business requirements into technical requirements.
  • Involved in effort estimation for the project requirements and prepared mapping documents for Oracle Data Integrator (ODI) based on client requirement specifications.
  • Designed simple and complex mappings using Informatica PowerCenter to load data into the CRDB database (Oracle) using transformations such as Source Qualifier, Lookup, Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Normalizer, SQL, Rank, and Router.
  • Developed packages and scenarios using interfaces and variables, and used them in load plans for loading various facts and dimensions.
  • Involved in writing complex SQL queries to test the data from the DataStage jobs.
  • Used DataStage Director to unlock jobs (resource cleanup), monitor jobs, and manage logs; involved in the ETL design and its documentation.
  • Automated monitoring of all failed jobs in the database using SSIS and SQL Mail.
  • Extensively worked on DataStage jobs that split bulk data into subsets and dynamically distribute them to all available processors to achieve the best job performance.
  • Monitored the execution logs in ODI Operator.
  • Experienced in working with load plans and scenarios.
  • Good knowledge of creating file-to-DB and DB-to-DB interfaces.
  • Working knowledge of ODI Knowledge Modules (LKM, IKM, JKM, and CKM).
  • Worked on Informatica Master Data Management (MDM) in the process of managing high-volume incoming data.
  • Extensively used Informatica functions, parameters, and variables in transformations at the mapping level.
  • Wrote SQL queries for the pre-SQL and post-SQL fields in the Source Qualifier transformation, with the appropriate schema names, to extract only the required rows for optimal performance.
  • Tested the queries in SQL Navigator for integrity and to identify data errors.
  • Used Debugger to fix errors in the mappings and to check the data flow.
  • Extensively used the ODI (Oracle Data Integrator) E-LT tool to load data from various source systems into a target data warehouse.
  • Performed post-load validation procedures after loading the data into SAP HANA via BODS and validated the results with the business.
  • Developed SAP BODS profiling queries/validation programs to cross-check all the data before and after loading into S/4 HANA.
  • Uploaded data from the operational source system (Oracle 8i) to Teradata.
  • Used the Teradata utilities FastLoad (FLOAD), MultiLoad (MLOAD), FastExport (FEXP), and TPump, and created batch jobs using BTEQ; see the BTEQ sketch after this list.
  • Imported metadata from Teradata tables.
  • Good experience using ODI Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse.
  • Experience in using the ODI Designer, Topology, Operator, and Security tools.
  • Combined Work Management System (WMS) data with financial data to report against multiple sources (WMS and CR) through an efficient and consistent reporting solution.
  • Created sessions, event tasks, decision tasks, and workflows using PowerCenter.
  • Partitioned sessions for concurrent loading of data into the target tables.
  • Used Session Parameters to increase the efficiency of the sessions in the Workflow Manager.
  • Developed UNIX Shell Scripts for scheduling the sessions in Informatica.
  • Used pre-session and post-session commands to send e-mail to business users through Workflow Manager.
  • Created parameter files to change the load date and configured that parameter at session level.
  • Used Toad as an advanced SQL and PL/SQL editor; built and tested scripts, PL/SQL packages, procedures, triggers, and functions to implement business rules.
  • Created scripts to migrate data from CRDB database to CCAR staging area (SQL server) and developed mappings and workflow in staging area.
  • Documented ETL test plans, test cases, test scripts, test procedures, assumptions and validations based on design specifications for unit testing and system testing.
  • Worked with the QA team to Trouble shoot and resolve any QA or UAT issues.
  • Migrated mappings from the Development environment to QA, and from QA to the Production environment.
  • Coordinated with the reporting team to develop the MicroStrategy reports.
  • Provided ongoing support for production issues by watching for errors in the daily loads and playing an active role in resolving them.
  • Based on the assigned JIRA tickets, added and modified Informatica mappings and sessions to improve performance, accuracy, and maintainability of existing ETL functionality as requested by the users.
  • Worked on new column additions to existing target and source tables in ETL as per new business requirements.
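A minimal sketch of a Teradata BTEQ batch job of the kind described above: SQL statements wrapped in BTEQ dot commands. The logon details, database names, and staging logic are all hypothetical.

    .LOGON tdprod/etl_user,password

    -- Refresh the staging table with yesterday's rows.
    DELETE FROM work_db.stg_sales;

    INSERT INTO work_db.stg_sales
    SELECT sale_id, store_id, amount, sale_dt
      FROM land_db.sales_daily
     WHERE sale_dt = CURRENT_DATE - 1;

    -- Abort the batch with a non-zero return code if the load failed,
    -- so the scheduler can flag the job.
    .IF ERRORCODE <> 0 THEN .QUIT 8

    .LOGOFF
    .QUIT 0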

Environment: Informatica PowerCenter 9.5, Informatica MDM, SQL Server, Oracle 11g, TOAD, UNIX Shell Scripting, SQL, PL/SQL, MicroStrategy.

ETL Developer/ SQL Developer

Confidential, Jacksonville, FL

Responsibilities:

  • Involved in dimensional data modeling and populating the business rules, via mappings, into the repository for data management.
  • Designed source-to-target mappings, primarily from flat files to Oracle, using Informatica PowerCenter.
  • Created the global repository, groups, and users, and assigned privileges using Repository Manager.
  • Involved in developing source-to-target mappings and scheduling Informatica sessions.
  • Used various kinds of transformations to implement simple and complex business logic, including connected and unconnected Lookups, Router, Expression, Source Qualifier, Aggregator, Filter, and Sequence Generator.
  • Extensively worked in Oracle SQL and PL/SQL, including query performance tuning and DDL scripts, and created database objects such as tables, indexes, synonyms, and sequences; see the DDL sketch after this list.
  • Tuned Informatica mappings and sessions for optimum performance.
  • Wrote UNIX scripts and SQL statements, and interacted with development and production teams to expedite the process.
  • Developed UNIX shell scripts to run pmcmd to start and stop sessions.
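A hypothetical Oracle DDL sketch of the kinds of objects listed above: a range-partitioned table, a local index, and a surrogate-key sequence. All names are illustrative, and TO_DATE literals are used for Oracle 8i-era compatibility.

    CREATE TABLE fact_sales (
        sales_key NUMBER       NOT NULL,
        store_id  NUMBER       NOT NULL,
        sale_dt   DATE         NOT NULL,
        amount    NUMBER(12,2))
    PARTITION BY RANGE (sale_dt) (
        PARTITION p2003 VALUES LESS THAN (TO_DATE('01-JAN-2004', 'DD-MON-YYYY')),
        PARTITION p2004 VALUES LESS THAN (TO_DATE('01-JAN-2005', 'DD-MON-YYYY')),
        PARTITION pmax  VALUES LESS THAN (MAXVALUE));

    -- Local index so each partition's index segment is maintained independently.
    CREATE INDEX fact_sales_store_ix ON fact_sales (store_id) LOCAL;

    -- Surrogate keys for the fact table.
    CREATE SEQUENCE fact_sales_seq START WITH 1 INCREMENT BY 1 CACHE 100;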

Environment: Informatica PowerCenter, Oracle 8i, SQL Server 2000, SQL, PL/SQL.

ETL Informatica Developer

Confidential

Responsibilities:

  • Extracted source systems data from the distributed environment, transformed it, and loaded it into the data warehouse using Informatica.
  • Involved in the design, development, and implementation of mappings using Informatica PowerCenter Designer and in creating design documents.
  • Extracted data from flat files and various relational databases and loaded the data into the warehouse.
  • Worked on different transformations such as Source Qualifier, Joiner, Router, Aggregator, Lookup, Expression, and Update Strategy to load data into target tables.
  • Worked extensively on fixing invalid mappings and performing unit and integration testing.
  • Used transformations such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression, and Lookup (connected and unconnected) while transforming data according to the business logic.
  • Created sessions, reusable worklets, and batches in Workflow Manager and scheduled the batches and sessions at the specified frequency.
  • Developed various Mappings to load data from various sources using different Transformations.
  • Worked on scheduling production jobs in Autosys.
  • Involved in production deployment.

Environment: Informatica PowerCenter 8.1/8.6, UNIX, Oracle, AutoSys
