ETL Developer Resume
OH
PROFESSIONAL SUMMARY:
- Accomplished ETL Developer/Tester/Support professional with over eight years of experience in the Information Technology industry, with emphasis on Data Warehousing and ETL processes using Informatica technology. A skilled technical professional with an earned reputation for meeting demanding deadlines and delivering quality Retail/Banking/Insurance/Healthcare products. Strong knowledge of the Software Development Life Cycle, including Waterfall and Agile Scrum methodologies. Proven human relations skills and a strong work ethic; recognized by prior employers for excellent contributions.
- Around 8 years of work experience with a strong background in the ETL Data Warehouse/Data Mart development life cycle; performed ETL procedures to load data from different sources into the data warehouse using Informatica PowerCenter.
- Sound knowledge and experience in metadata and Star/Snowflake schemas; analysis of source systems, the staging area, and fact and dimension tables in the target data warehouse; and ETL design, including process flow, data flow, source-to-target mappings, physical database designs, and data models.
- Experience in ETL methodology supporting data extraction, transformation, and loading processes in a corporate-wide ETL solution using Informatica PowerCenter, PowerExchange, Data Mining, OLTP, OLAP, performance tuning, and troubleshooting.
- Proficient in defect management, including defect creation, modification, tracking, and reporting, using industry-standard tools such as Quality Center (QC)/ALM, Test Director, JIRA, and ClearQuest.
- Strong exposure as a database developer to SQL, PL/SQL, RDBMS, Oracle 10g/11g/Exadata, SQL*Plus, SQL*Loader, TOAD, and MS SQL Server 2005/2000.
- Worked with Teradata utilities such as MultiLoad (MLOAD), FastLoad (FLOAD), and TPump, and loaded data using scripts.
- Experienced in integrating data from various sources and targets such as relational tables, flat files, CSV files, XML files, and web services.
- Worked on Slowly Changing Dimensions (SCDs) and their implementation to keep track of historical data.
- Optimized solutions using various performance-tuning methods, including SQL tuning and ETL tuning, i.e., optimal configuration of transformations, sources, targets, mappings, and sessions.
- Loaded data using PowerExchange in both real-time CDC and batch mode.
- Good working experience with parameters and variables.
- Worked with QA team to create test cases for unit and integration testing.
- Excellent team player as well as a self-starter, able to work independently and in both the development and maintenance phases of a project; analytical and technical aptitude with the ability to work in a fast-paced, highly flexible environment where in-depth knowledge of technology, hard work, and ingenuity are highly appreciated.
- Excellent interpersonal, presentation, communication, and project management skills; technically competent, result-oriented, and an effective problem solver.
- Developed effective working relationships with client teams to understand support requirements, develop tactical and strategic plans for implementing technology solutions, and effectively manage client expectations.
- Good exposure to the onsite/offshore model and on-call production support schedules.
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 10.1/9.6/9.1/8.6/8.1, PowerExchange 9.6/9.1/8.6, Informatica Analyst
Platforms: Windows (all flavors), UNIX, LINUX (Red hat)
Databases: Oracle Exadata/11g/10g, MS SQL Server 2000/05, MS Access 07, Teradata 14/13, DB2 9/8, Netezza 7.2.1
Languages: C, C++, SQL, PL/SQL, Shell Scripting, VB6.0, ASP, Java
Data Modeling: Visio, Erwin
Tools: DbVisualizer, Toad, SQL Developer, SQL*Plus, Business Objects XI R2, Tidal, OBIEE 11g, TFS, SharePoint, SVN, Autosys, Tivoli 9.3
Web Technologies: HTML, DHTML, VB Script
PROFESSIONAL EXPERIENCE:
Confidential, OH
ETL Developer
Responsibilities:
- Designed and developed mappings, defined workflows and tasks, monitored sessions, exported and imported mappings and workflows, and handled backups and recovery.
- Created mappings involving transformations such as Filter, Router, Connected & Unconnected Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to load transaction, account, and customer data from flat files.
- Used debugger to test the mappings and fixed the bugs.
- Developed Slowly Changing Dimension (SCD) Type 2 logic for loading data into dimensions.
- Involved in the design, development and implementation of the Fact tables and related Dimensions
- Developed common routine mappings. Made use of mapping variables, mapping parameters and variable functions.
- Created and used reusable Mapplets and Worklets to reduce the redundancy.
- Extensively used ETL to load data from flat files, both fixed-width and delimited, as well as from relational databases (Oracle/SQL Server).
- Used Tivoli v9.3 for scheduling and monitoring job streams in TEST and PROD; also created jobs and job streams and added dependencies.
- Implemented a recovery strategy for aborted and failed workflows; also created documents on SharePoint to track the process.
- Attended daily stand-up calls, sprint meetings, PI planning, and demo meetings, and worked closely with Product Owners, architects, the Scrum Master, and the testing team.
- Used UNIX to generate .ksh files and ran .rcp files to promote code to QA and Prod environments.
- Used the Informatica Analyst tool for data profiling.
- Used PuTTY/UNIX to fetch session logs and bad files, develop scripts, run workflows, and manipulate source files during development and production issues, as sketched below.
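A minimal sketch of the kind of pmcmd/UNIX helper used for running workflows and pulling logs during triage; the service, domain, folder, workflow, credentials, and directory paths are illustrative placeholders, not actual project values.

```sh
#!/usr/bin/ksh
# Run a workflow, then pull the latest session log and reject files on failure.
# All names and paths below are hypothetical examples.

SESS_LOG_DIR=/opt/infa/infa_shared/SessLogs   # assumed session log directory ($PMSessLogDir)
BAD_FILE_DIR=/opt/infa/infa_shared/BadFiles   # assumed reject file directory ($PMBadFileDir)

# Start the workflow and block until it completes
pmcmd startworkflow -sv IS_DEV -d DOM_DEV -u etl_user -p '********' \
      -f SALES_DW -wait wf_load_customer
RC=$?

if [ $RC -ne 0 ]; then
    # Tail the most recent session log for the failed session
    LAST_LOG=$(ls -t $SESS_LOG_DIR/s_m_load_customer*.log 2>/dev/null | head -1)
    [ -n "$LAST_LOG" ] && tail -200 "$LAST_LOG"
    # List any reject (.bad) files produced by the load
    ls -lt $BAD_FILE_DIR/*.bad 2>/dev/null | head -5
fi
exit $RC
```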
Environment: Informatica PowerCenter 10.1/9.6, Informatica Analyst, Netezza 7.2.1, Oracle 11g, SQL Server, SharePoint, DbVisualizer, Putty, Unix, Tivoli 9.3.
Confidential, Jessup, PA
Sr. ETL Developer
Responsibilities:
- Responsible for Effort Estimation, tracking work progress, coordination with various teams, Clients and Users.
- Led or participated in JAD sessions; conducted walkthroughs with the project team and the DBA.
- Extracted the data from heterogeneous sources and performed ETL using Informatica.
- Loaded data to heterogeneous targets such as JMS queues, XML files, flat files, .csv files, and relational targets.
- Developed complex mappings as per the requirements using almost all transformations, and effectively used the Debugger to identify and resolve errors.
- Populated the target tables by implementing business rules in mappings.
- Worked on Data Extraction, Data Transformations, Data Loading, Data Conversions and Data Analysis.
- Generated hash codes and file lists using shell scripts.
- Debugged the Informatica mappings and validated the data in the target tables once it was loaded.
- Extensively worked on XML sources, targets, and transformations.
- Involved in developing the SQL for staging tables used to apply all business rules on the data before loading it into the target tables.
- Translated PL/SQL logic into mappings; extensively worked on database stored procedures, functions, packages, and triggers.
- Implemented optimization techniques for performance tuning and wrote necessary Pre & Post session shell scripts.
- Involved in writing BTEQ, FastLoad, and MultiLoad scripts for loading data into the target data warehouse (a minimal sketch follows this list).
- Did error handling and performance tuning in Teradata queries and utilities.
- Used Teradata Parallel Transporter (TPT) to perform ELT.
- Involved in migration projects moving data from Oracle/DB2 data warehouses to Teradata.
- Did data reconciliation in various source systems and in Teradata.
- Involved in end-to-end system testing, performance and regression testing and data validations.
- Implemented the Tidal Scheduler for scheduling Informatica workflows.
- Created ETL design documents, migration documents, etc.
- Facilitated code reviews as a lead developer to ensure code quality.
- Created weekly status reports, project plans, and quality-related documents; used TFS for document management and tracking.
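A minimal sketch of the kind of BTEQ load step referenced above, invoked from a ksh wrapper; the TDPID, credentials, and table names are illustrative assumptions rather than actual project objects.

```sh
#!/usr/bin/ksh
# Invoke BTEQ with an inline script: log on, move validated staging rows into
# the warehouse table, and propagate any error code back to the caller.
# TDPID, user, and table names are hypothetical.

bteq <<EOF
.LOGON tdprod/etl_user,********;

-- Load the day's validated staging rows into the target table
INSERT INTO edw.claim_fact
SELECT *
FROM   stg.claim_stage
WHERE  load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
exit $?
```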
Environment: Informatica Power Center 9.6/8.6, SQL Server 2008 (Enterprise Manager, Query Analyzer), Oracle Exadata, Teradata 14/13, Erwin, Tidal 5.3.1, MS-DOS, PEGA, Eclipse, SharePoint, Daptive, Facets, Business Objects XI R2, XML, XSD, Sybase, T-SQL and DB2
Confidential, Durham, NC
Integration Developer
Responsibilities:
- Developed Detailed Design documents based on business requirements.
- Designed and developed Informatica code to extract, convert and load data for the application as per the business requirements.
- Involved in the full development lifecycle from Requirements gathering through Design, Build, QA, Deployment and Support using Informatica PowerCenter.
- Provided 24x7 production support for jobs running in the production environment.
- Involved in debugging failed mappings to identify root causes and in developing error-handling methods.
- Resolved Source system data issues by performing Data profiling and providing workaround techniques.
- Worked closely with business analysts to gather functional specifications and turn them into technical specifications.
- Worked with Direct and Indirect Flat Files.
- Extensively used Informatica to load data from Flat Files to Teradata, Teradata to Flat Files and Teradata to Teradata
- Identified long running jobs and implemented various performance tuning techniques to mitigate the bottlenecks.
- Worked with DBA’s to tune long running SQL queries.
- Designed and developed complex Informatica mappings, including Slowly Changing Dimensions (Type 1, Type 2).
- Used UNIX commands in sessions to merge multiple target files into one, clean up unused target or error files, and remove special characters from flat files (sketched after this list).
- Developed reusable transformations and mapplets.
- Designed, developed and executed Run Control jobs.
- Worked with different source and target systems like XML, flat file and relational tables.
- Facilitated code reviews as a Lead developer, to ensure code quality.
- Worked on Informatica transformations including but not limited to SQL, XML, SFDC lookup and Stored Procedure to facilitate the business requirements.
- Used Variables and Parameters in Mappings, Workflows and Sessions.
- Created post-session and pre-session shell scripts and mail-notifications. Used various session tasks like Event wait, Timer, Decision and Control.
- Created labels, deployment groups and worked on migrating the code between different environments using deployment groups.
- Developed and maintained best practices and guidelines throughout the data warehouse practice in the organization.
- Coordinated with the offshore team to make sure tables were loaded appropriately.
- Triaged issues, worked on defects, and implemented change requests as and when assigned.
- Coordinated with the GFA application team to modify the existing Informatica ETL code as per the new business requirements.
- Followed Agile methodology and used TFS for creating user stories, tasks, and time estimates.
- Developed dashboards and reports using OBIEE 11g.
- Generated Institutional and Professional claims for the identified providers with ICD-10 diagnosis and procedure codes in order to generate payment advice files for different types of providers.
- Identified different provider groups for setup to receive ERA, PRA, or both.
- Worked on defects logged and tracked in QC.
- Validated the reports and files according to HIPAA X12 enforced standards.
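A minimal sketch of the kind of post-session UNIX file handling described above; directory and file names are illustrative placeholders only.

```sh
#!/usr/bin/ksh
# Post-session sketch: merge per-partition target files into one extract,
# strip non-printable characters, and clean up reject/part files.
# All paths and file names are hypothetical.

TGT_DIR=/data/etl/tgtfiles
OUT_FILE=$TGT_DIR/claims_extract_$(date +%Y%m%d).dat

# Concatenate the individual partition files produced by the session
cat $TGT_DIR/claims_extract_part*.dat > $OUT_FILE

# Keep only tabs, newlines, carriage returns, and printable ASCII
tr -cd '\11\12\15\40-\176' < $OUT_FILE > ${OUT_FILE}.clean && mv ${OUT_FILE}.clean $OUT_FILE

# Remove empty reject (.bad) files and the now-merged part files
find $TGT_DIR -name '*.bad' -size 0 -exec rm -f {} \;
rm -f $TGT_DIR/claims_extract_part*.dat
```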
Environment: Informatica PowerCenter 9.1, XML, SQL Server 2008, Teradata 13, TOAD, Crystal Reports, Oracle 11g, PL/SQL, IBM DB2, Microsoft Visio 2003.
Confidential, Memphis, TN
Informatica Developer
Responsibilities:
- Gathered business requirements by interacting with the Business Analyst team and analyzed them to translate into technical specifications.
- Performed unit testing for migration of clients from one data warehouse to another.
- Developed various ad-hoc technical documents for business needs.
- Optimized and tuned existing ETL scripts (SQL and PL/SQL).
- Maintained, supported, and created ETL scripts for the inbound and outbound systems.
- Managed the load and extraction of data to over seventy internal and external systems.
- Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
- Created complex mappings using Unconnected/Connected Lookup, Aggregator, SQLT, Transaction Control, Normalizer, and Router transformations to populate target tables in an efficient manner.
- Created mapplets and used them in different mappings.
- Used the Sorter transformation and dynamic Lookup caches.
- Created events and tasks in the workflows using Workflow Manager.
- Tuned Informatica mappings for better performance and used PL/SQL procedures/functions to build business rules for loading data.
- Created Schema objects like Indexes, Views, and Sequences.
- Designed and Developed Oracle PL/SQL and UNIX Shell Scripts, Data Import/Export.
- Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects, and hierarchies.
- Developed shell scripts for running batch jobs and scheduling them.
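A minimal sketch of a batch-job wrapper of the kind described above, calling a hypothetical PL/SQL load procedure via SQL*Plus; the schema, procedure, connection, and path names are assumptions, not actual project objects.

```sh
#!/usr/bin/ksh
# Nightly batch wrapper: set the environment, run a PL/SQL load via SQL*Plus,
# and log the outcome. All names and paths are hypothetical.

. /home/etl/.profile                                  # pick up the Oracle environment
LOG=/var/etl/logs/nightly_load_$(date +%Y%m%d).log

echo "Nightly load started: $(date)" >> $LOG

sqlplus -s etl_user/********@ORCL >> $LOG 2>&1 <<EOF
WHENEVER SQLERROR EXIT SQL.SQLCODE
-- Invoke the (hypothetical) PL/SQL package that performs the nightly load
EXEC dw_load_pkg.run_nightly_load(TRUNC(SYSDATE));
EXIT
EOF
RC=$?

echo "Nightly load finished rc=$RC: $(date)" >> $LOG
exit $RC
```

A crontab entry such as "30 2 * * * /home/etl/bin/nightly_load.ksh" would run the wrapper nightly; in the environment listed below, the equivalent scheduling would typically be handled by Autosys.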
Environment: Informatica PowerCenter 8.6, Informatica PowerExchange 8.6, Oracle 11g, SQL plus, PL/SQL, TFS, SharePoint, SVN, Toad, Putty, Unix, Autosys.
Confidential, Birmingham, AL
ETL Tester
Responsibilities:
- Analyzed the source data coming from different sources and worked with business users to develop the OLAP model.
- Used Source Analyzer, Target Designer, Transformation Developer and Mapping Designer to map data sources to targets.
- Involved in designing the Informatica mappings by translating the business requirements, and extensively worked on Informatica Lookup, Update Strategy, and Router transformations to implement the complex rules and business logic.
- Loaded data using Informatica mappings into the Data Warehouse (from Transient to Staging to Data Warehouse).
- Designed and developed complex aggregate, join, lookup transformation rules to generate consolidated (facts/summary) data identified by dimensions using Informatica ETL tool.
- Configured the Informatica repository to connect to the database through ODBC.
- Understood the server architecture and designed the sessions for the mappings using the Task Developer and implemented those sessions.
- Used Workflow Manager for creating, validating, testing and running the sequential and concurrent batches and sessions.
- Monitored the workflows in the workflow monitor and resolved the issues in the process using the session logs.
- Worked with Shell scripts and pmcmd to interact with Informatica server from command mode.
- Scheduled the ETL jobs daily, weekly and monthly based on the business requirement.
- Built Universes and reports using Business Objects Designer.
- Performed drill-down and slice-and-dice operations in Business Objects.
Environment: Informatica PowerCenter 8.1, Oracle 11g, SQL*Plus, PL/SQL Developer, Unix, Flat files, Windows XP Professional, Business Objects 9.