Sr. ETL Developer Resume
Chicago, IL
SUMMARY
- 12 years of IT experience in Data Warehousing, Data Analysis, Application Development and Business Intelligence with ETL tools, including Informatica Power Center 9.1/9.0.1/8.6.1/x.
- Strong technical exposure and a good degree of competence in business domains such as Financial, Banking and Retail.
- Good knowledge of the Ralph Kimball and Bill Inmon methodologies.
- Good understanding of Star and Snowflake Schemas, Dimensional Modeling, Data Marts, Relational Data Modeling, Data Analysis, ERwin and Microsoft Visio.
- Good knowledge of B2B Data Transformation for unstructured and semi-structured formats such as EDI, COBOL and XML, their specifications and other related standards.
- Good experience in using Oracle 10g/9i, SQL, SQL Plus, PL/SQL, SQL Loader, TOAD, Unix Shell Scripts.
- Knowledge of installation, configuration and administration of Informatica Power Center 8.x (server and client) on Windows Server 2003.
- Understanding of XML, SQL, Data Manipulation Language (DML) and Data Definition Language (DDL)
- Experience creating Data Marts and Enterprise Data Warehouses
- Experience with Unified Modeling Language (UML) diagrams, including Entity-Relationship Diagrams (ERD), Sequence Diagrams and Data Flow Diagrams (DFD)
- Extensively worked on Power Center client tools: Repository Manager, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager and Workflow Monitor to extract, transform and load data.
- Designed mappings using SQL overrides where necessary for pre- and post-SQL, and Source Qualifier, Lookup (dynamic and static), Filter, Update Strategy and Expression transformations.
- Developed common logic using reusable transformations, mapplets, and worklets.
- Worked on Slowly Changing Dimensions (SCDs) and implemented Type 1 and Type 2 (flag and timestamp).
- Extensive usage of Sequence Generator, shortcuts and reusable components for sources, targets, transformations, mapplets, worklets and sessions.
- Expertise in documenting the ETL process, Source to Target Mapping specifications, status reports and meeting minutes.
- Employed Change Data Capture (CDC) based on requirements using Power Exchange 9.0.1 HF2.
- Extensively used Informatica to load data into the ODS and Data Warehouse, integrating various sources including relational databases (Oracle 10g/9i, MS SQL Server 2008/2005/2000) and flat files.
- Efficient in troubleshooting mapping bottlenecks, performance tuning and debugging.
- Performed data migration from Development to QA (pre-test process) and from QA to PROD across the SDLC ETL process, in coordination with the supervisor, team members and users.
- Experience in Data Profiling, data analysis, parameterizing DB connections, file paths, error handling, error remediation and impact analysis.
- Well versed in business operations, end-user requirements gathering, developing processes in accordance with business rules, devising data load strategies, and performing unit and integration testing in development.
- Knowledge of coding ETL tasks in UNIX shell scripts and interacting with the Informatica server using the PMCMD command.
- Used UC4, Autosys, Control-M and Informatica Scheduler for scheduling batch cycle jobs.
- Professionally trained in Project Management, Leadership Management, Managerial Finance and Communication.
- An energetic, flexible, well-organized, self-motivated fast learner; an independent, team-oriented, process-focused professional who requires minimal supervision and has performed cross-functional roles in teams of various sizes in challenging environments.
- Excellent analytical, problem solving, communication and interpersonal skills.
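The SCD Type 2 handling mentioned above (flag and timestamp variants) can be sketched in Python roughly as follows; the table layout, column names and function are illustrative only, not taken from any actual project:

```python
from datetime import datetime, timezone

def apply_scd_type2(dimension, incoming, key, tracked_cols):
    """SCD Type 2: when a tracked attribute changes, expire the current
    row (flag it 'N' and stamp eff_end) and insert a new current row."""
    now = datetime.now(timezone.utc).isoformat()
    current = {row[key]: row for row in dimension if row["is_current"] == "Y"}
    for rec in incoming:
        old = current.get(rec[key])
        if old is None:
            # Unseen business key: insert as the first current version.
            dimension.append({**rec, "is_current": "Y",
                              "eff_start": now, "eff_end": None})
        elif any(old[c] != rec[c] for c in tracked_cols):
            # Attribute changed: expire the old row, add a new current one.
            old["is_current"] = "N"
            old["eff_end"] = now
            dimension.append({**rec, "is_current": "Y",
                              "eff_start": now, "eff_end": None})
    return dimension
```

In Power Center the same pattern is typically built with a Lookup on the dimension, an Expression comparing the tracked columns, and an Update Strategy routing the expire/insert rows.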
TECHNICAL SKILLS
ETL Tools: Informatica Power Center 9.1/9.0.1/8.6.1/x (Repository Manager, Designer, Server Manager, Workflow Manager and Workflow Monitor), Power Exchange 9.0.1
Data Modeling: Toad Data Modeler, Relational and Dimensional Modeling (Star Schema, Snow-Flake, Fact, Dimensions), Erwin, MS Visio
Databases: Oracle 11g/10g/9i, MS SQL Server 2000/2005/2008, MySQL, MS Access
Programming: SQL, Unix, SQL Tuning/Optimization, CGI, C++, Python, VB.NET, C#
Tools: TOAD, SSIS, SQL*Plus, SQL*Loader, MS Office, Autosys, UC4, Cognos ICM, Hive
OS: Sun Solaris, Linux, Windows 95/98/NT/2000/XP/2003/Vista/7.
Data Analytics: BO XI-R3, SAP-BW-305
PROFESSIONAL EXPERIENCE
Confidential, Chicago, IL
Sr. ETL Developer
Responsibilities:
- Assisted in developing the data aggregator for margin models for the energy asset class, covering crude oil, natural gas and heating oil derivatives products.
- Gathered requirements from the Team Lead and implemented them in source-to-target mappings across the landing, staging and target layers.
- Implemented daily data-feed reporting with the forward curve and daily % change, along with a 124-day lag correlation.
- Implemented an auto-alerter and reporting process in a CGI script for daily monitoring of Informatica workflows.
- Extensively used BladeLogic package creation and deployment for workflows.
- Interacted with Business Analysts to understand database requirements whenever debugging was required without changing the code.
- Focused on new systems and tools with minimal supervision and trained team members on the project architecture.
- Involved in populating data from flat files into the Oracle database, from Landing to ODS (Staging) and into the EDW, using the common code.
- Effectively worked on Informatica Mapping Designer, Workflow Manager, and Workflow Monitor.
- Extensively used Sequence Generator transformations in all mappings, and fixed production bugs in existing common-folder mappings for new files through versioning (check-in and check-out).
- Applied Slowly Changing Dimensions Type I and Type II based on business requirements.
- Parameterized all DB connections where necessary in the sandbox and common folders, and tested links in DEV, QA and PROD for populating data into the SQL database.
- Involved in data profiling based on requirements taken from the Data Dictionary and applied the data-cleansing process.
- Extensively worked on performance tuning and on isolating header and footer records within a single file.
- Performed code migration of mappings and workflows from Development to Test and Production servers using BladeLogic packages.
- Used the Debugger to test mappings and fixed bugs in DEV, following change procedures and validation.
- Tuned and optimized mappings to reduce ETL run time, ensuring they ran within the designated load window.
- Documented a Run Book for all provider- and eligibility-related mappings in their respective spreadsheets, to run in sequence from Landing to Staging to ODS for ActiveBatch, with data cleansing before the feeds into the EDW.
- Raised change requests, analyzed and coordinated resolution of program flaws, and fixed them in the DEV and Pre-Production environments during subsequent runs and in PROD.
- Precisely documented mappings in the ETL Technical Specification document for all stages for future reference.
- Developed multiple job plans in UC4 for workflow automation.
- Implemented CyberArk security for the C# pricing-model code.
- Employed UC4 as the job-automation tool for daily, weekly and monthly loads, running each workflow in sequence via the UNIX PMCMD command with command and event tasks.
- Assisted with the reporting framework in MS Visio before and after execution of the ETL jobs, and designed the document with an exact representation of each transformation used in the mappings.
- Prepared meeting minutes for status reports, updates and issues arising from change procedures, for the Lead and Manager.
- Effectively played a dynamic, multi-faceted role, facing all challenges while managing work on similar projects in parallel.
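The workflows above were launched in sequence through the UNIX PMCMD command; a minimal Python sketch of how such a pmcmd command line might be assembled from a scheduler wrapper (the service, domain, folder and workflow names are placeholders, not real project values):

```python
def build_pmcmd_start(service, domain, user, pwd_env_var, folder, workflow,
                      wait=True):
    """Assemble a pmcmd startworkflow command line; the password is read
    from an environment variable (-pv) rather than passed in clear text."""
    cmd = ["pmcmd", "startworkflow",
           "-sv", service, "-d", domain,
           "-u", user, "-pv", pwd_env_var,
           "-f", folder]
    if wait:
        cmd.append("-wait")  # block until the workflow finishes
    cmd.append(workflow)
    return cmd
```

The resulting list can be handed to subprocess.run() from a UC4 or cron wrapper script, so each workflow completes before the next starts.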
Environment: Informatica Power Center 9.1, Oracle, Flat Files, UC4, Windows 2003, JIRA (issue assignment), RFC for Change Requests, MS Visio, Git, C#
Confidential, San Jose, CA
Business Intelligence Developer
Responsibilities:
- Involved in the full project life cycle, from application-access issues through the implementation phase.
- Worked on converting/redesigning the Excel sales-and-commission model into the commission system per business needs.
- Migrated the legacy sales-commission model to Cognos ICM.
- Worked closely with Architects, the Lead and the Project Manager on application assessment for the Data Masking team on the proxy server.
- Gathered requirements from Business Analysts to understand the databases and structures required for masking in a star-schema data model.
- Designed, developed and tested mappings, sessions and workflows to move data from source to staging for data load and data cleansing for the TrueComp application.
- Worked with parameterized DB connections, using UNIX scripts to view processed data.
- Identified mapping bottlenecks and improved session performance by adding partition points, modifying the target load order and emphasizing SQL overrides in Lookup, Filter, Expression and Update Strategy transformations.
- Performed performance tuning on Informatica sessions, and mappings.
- Migrated mappings and folders between repositories through import and export, from version 7.6 to 8.6, and into deployment groups.
- Ensured that system testing and change-management procedures were followed, prioritizing and managing multiple tasks with effective time management.
- Precisely documented all data for further reference.
- Designed multiple UNIX scripts for job scheduling of daily, weekly and monthly loads to the Oracle database.
- Designed 22 reports in BO XI R3 for sales reps covering quota credit, run rate and actual commission amount.
Environment: Informatica Power Center 8.6.1, Oracle 10g/9i, MS SQL Server 2000, Flat Files, PL/SQL, MS Visio, Toad 7.0, Windows 2000, UNIX, PuTTY, Power Exchange 9.0.1, Cognos ICM
Confidential
Sr. ETL Developer
Responsibilities:
- Gathered requirements and developed mappings, maintaining metadata in the Repository.
- Extensively worked to load data from Oracle, MS SQL Server, flat files into the target SQL Server database.
- Implemented various transformations: Joiner, Sorter, Aggregator, Expression, Lookup, Filter, Update Strategy and Router.
- Developed PL/SQL stored procedures to process and load data into data warehouse.
- Implemented Type I & II changes in slowly changing dimension tables.
- Modified the existing mappings for the updated logic and better performance.
- Worked on identifying Mapping Bottlenecks and improved session performance through error handling.
- Worked on mapplets and created parameter files wherever necessary to facilitate reusability.
- Used error handling to capture error data in tables for null handling, analysis and the error-remediation process.
- Involved in massive data cleansing and data profiling of the production data load.
- Tuned complex mappings at the source, target, mapping and session levels.
- Avoided duplicate records through SQL overrides using Lookup and Joiner transformations.
- Developed UNIX scripts to automate different tasks involved as part of loading process.
- Performed Unit Testing on the mappings.
- Created workflows using Workflow Manager for tasks such as sending email notifications, timers that trigger when an event occurs, and sessions to run mappings.
- Actively participated in resolving Email tasks working on priority and non-priority issues.
- Developed and scheduled jobs for daily loads that invoked Informatica workflows and sessions associated with the mappings, plus SQL scripts to drop and recreate indexes on source and target tables for the batch process.
- Ensured that integration testing and system testing followed QA and change-management procedures.
- Implemented OLAP cubes for data mining.
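The error-handling pattern above, capturing rows with nulls into error tables for analysis and remediation, can be sketched as a row-level split. The column names and error-reason format here are illustrative only:

```python
def split_error_rows(rows, required_cols):
    """Route rows with missing required values to an error set for
    remediation; clean rows continue on to the warehouse load."""
    clean, errors = [], []
    for row in rows:
        missing = [c for c in required_cols if row.get(c) in (None, "")]
        if missing:
            # Tag the row with the reason so remediation knows what failed.
            errors.append({**row, "error_reason": "NULL:" + ",".join(missing)})
        else:
            clean.append(row)
    return clean, errors
```

In Power Center the equivalent is usually an Expression flagging the nulls plus a Router sending flagged rows to an error target.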
Environment: Informatica Power Center 8.6.1/7.1, Oracle 10g/9i, MS SQL Server 7.0/2000, Flat Files, SQL*Loader, ERwin 3.5, Toad 7.0, UNIX, Windows 2003.
Confidential
ETL Data Developer
Responsibilities:
- Discussed business issues with the Group and Technical Manager to bridge the gap with user needs.
- Developed data-mart reporting for the ZEUS data mart in ASP.NET
- Involved in Data model reviews and validated tables, columns and data types.
- Created shared folders, local and global shortcuts to reuse metadata.
- Configured database and ODBC connectivity for various sources and targets.
- Worked with various transformation types: Lookup (connected and unconnected), Update Strategy, Joiner, Filter, Sorter, Aggregator, Rank and Router, to extract data from multiple source systems such as Oracle, SQL Server and flat files.
- Developed PL/SQL stored procedures to process and load data into data warehouse.
- Extracted data from Oracle, SQL Server and flat-file sources to load into the ODS (staging area) and EDW.
- Applied session-level partitioning for data loads using target lookups and avoided duplicate records.
- Implemented Mapping Variables and Parameters in Transformations.
- Reduced complexity and development time using reusable components: worklets, mapplets and transformations.
- Implemented Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
- Developed Slowly Changing Dimensions for Type 1 SCD and Type 2 SCD.
- Utilized Informatica tasks: Session, Command, Timer, Email, Event-Raise, and Event-Wait.
- Created parameter files for session parameters and referenced them in the sessions.
- Wrote PMCMD commands for FTP of files from a remote server
- Tuned session performance for large data sets by increasing the block size, data cache size, sequence buffer length and target-based commit interval.
- Used the Debugger wizard to remove bottlenecks at the source, transformation and target levels for optimal loads.
- Worked with session logs and error handling, and validated populated data through unit and integration testing of the mappings.
- Worked with QA team and fixed bugs that were reported and validated code in the mappings.
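The target-based commit interval mentioned above groups rows into fixed-size batches, committing once per full batch rather than per row. A simplified sketch of that batching (the function and interval value are illustrative):

```python
def batch_commits(rows, commit_interval):
    """Split rows into batches of at most commit_interval rows, mimicking
    a target-based commit: one database commit per batch."""
    if commit_interval < 1:
        raise ValueError("commit interval must be positive")
    return [rows[i:i + commit_interval]
            for i in range(0, len(rows), commit_interval)]
```

Raising the interval reduces commit overhead on large loads at the cost of a larger rollback window if a session fails mid-batch.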
Environment: Informatica Power Center 8, SQL Server 2000, Flat Files, SQL, TOAD 8.0, Oracle 9i, DB2, Windows 98/2000.
Confidential
Sr. ETL Developer
Responsibilities:
- Served as the leader of a three-person team responsible for pre-audit, pre-migration, post-migration and support.
- Performed the pre-audit to ensure the availability and correctness of data in the old system, and that the branch had the infrastructure required for the FINACLE migration.
- Extracted data from the legacy system and loaded it into the FINACLE test environment for format-compatibility testing.
- Responsible for creating data-compatible flat files in UNIX to export the data into the FINACLE data repository, with proper testing.
- Prepared the balance sheet and cash register after migrating the branch data into FINACLE.
- Provided support to branch users on the new system.