ETL Developer Resume
Dallas, TX
SUMMARY
- 7+ years of experience in the development and implementation of data warehouses and data marts with ETL and OLAP technologies using Informatica Power Center 9.x/8.6, Oracle 10g/9i, SQL, and PL/SQL on both Windows and UNIX, for clients in the insurance, financial, healthcare, and pharmaceutical industries.
- Extensively worked with various components of Informatica Power Center, including Power Center Designer, Repository Manager, Workflow Manager, and Workflow Monitor, to create mappings for the extraction of data from various source systems.
- Experience in debugging and performance tuning of targets, sources, mappings, and sessions.
- Demonstrated expertise with ETL tools, including SQL Server Integration Services (SSIS) and Informatica, with ETL package design, and with RDBMS platforms such as SQL Server and Oracle.
- Extensive experience in designing and developing complex mappings using transformation logic such as Connected and Unconnected Lookup, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.
- Proficient in using Workflow Manager tools such as Task Developer, Workflow Designer, and Worklet Designer; created Mapplets, reusable transformations, sessions, and batches and scheduled them in the Server Manager.
- Performed build, test, and deployment activities for BI and data warehouse implementations with Cognos DataManager.
- Worked on Exception Handling Mappings for Data Quality, Data Profiling, Data Cleansing, and Data Validation.
- Good exposure to Informatica MDM, where data cleansing, de-duplication, and address correction were performed.
- Designed, developed, and deployed well-tuned generic parameterized and custom ETL processes involving complex business logic sourcing data from various heterogeneous sources such as Teradata and Oracle databases and from flat files.
- Delivered IT documentation and procedures associated with support and new solution delivery.
- Identified opportunities to improve/streamline existing processes for continuous improvement.
- Maintained strong, collaborative relationships and detailed working knowledge of the assigned area's systems, organization, and business processes.
- Experience in unit, system, and integration testing of various processes in the data warehouse.
- Thorough knowledge of data warehousing concepts such as Star Schema, Snowflake Schema, Fact Table, Dimension Table, and Dimensional Data Modeling.
- Good Experience in using SQL and PL/SQL to create Stored Procedures, Functions, Packages and Triggers.
- Experience in creating UNIX shell scripts to access data and move it from production to development, and in developing UNIX scripts and scheduling ETL loads.
- Extensive experience in creating complex mappings using various transformation logics.
- Developed Type 1 Slowly Changing Dimension (SCD) mappings and worked on ETL ODI applications.
- Created and customized ETL jobs using Oracle Data Integrator (ODI).
- Extensively used Autosys, Control-M, Tidal and Cognos for scheduling the UNIX shell scripts and Informatica workflows.
- Effective interpersonal and communications skills (verbal and written) for technical as well as non-technical audiences.
- Knowledge and experience with Agile Methodology, Pentaho Data Integration.
- Excellent team player with a short learning curve and the ability to analyze complex problems.
TECHNICAL SKILLS
Data Warehousing: Informatica Power Center 9.x/8.x, Informatica MDM, IDQ, Data Profiling, Data Cleansing, OLAP, OLTP, Pentaho Data Integration 5.x, SSIS
Languages: SQL, PL/SQL, C, C++, XML, HTML, Visual Basic 6.0, JavaScript
Operating Systems: UNIX, Windows XP/2007, LINUX, Sun Solaris
Tools: PL/SQL, SQL, T-SQL, Developer 2000, Oracle Developer Suite, SQL*Plus, Toad 8.x/9.x/10.x/11.x/12.x, SQL*Loader, MultiLoad, Teradata SQL Assistant, Erwin, Oracle APEX, SSPS
Databases: Oracle 11g/10g/9i, SQL Server, DB2, Teradata
Job Scheduling: Autosys, Control-M agent, Tidal, Cognos
BI Tools: OBIEE, Crystal Reports 8, ODI
PROFESSIONAL EXPERIENCE
Confidential, Dallas, TX
ETL Developer
Responsibilities:
- Developed mappings, sessions and workflows in Informatica Power Center.
- Created Mapplets, reusable transformations and used them in different mappings.
- Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the workflow manager.
- Made use of Post-Session Success and Post-Session Failure commands in the Session task to execute scripts needed for cleanup and update purposes (see the cleanup sketch after this list).
- Validated Informatica mappings for source compatibility after version changes at the source.
- Troubleshot long-running sessions and fixed the underlying issues.
- Implemented daily and weekly audit processes for the Claims subject area to ensure the data warehouse matched the source systems for critical reporting metrics.
- Developed shell scripts for daily and weekly loads and scheduled them using the UNIX Maestro utility.
- Involved in writing SQL scripts, stored procedures and functions and debugging them.
- Prepared ETL mapping documents for every mapping and data migration documents for smooth transfer of the project from the development environment to testing and then to production.
- Involved in unit and system testing to verify that the data extracted from different source systems was loaded into the target accurately and according to user requirements.
- Worked with the reporting team to help them understand the user requirements for the reports and their measures, and helped them create canned reports.
- Created Tidal jobs to schedule Informatica workflows and executed the workflows using the Tidal scheduler.
- Prepared and used test data and test cases to verify the accuracy and completeness of the ETL process.
- Actively involved in production support and transferred knowledge to other team members.
- Coordinated between different teams across the organization to resolve release-related issues.
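The post-session cleanup scripts referenced above typically followed the pattern sketched below. This is a minimal sketch only; the directory paths, file pattern, and retention window are illustrative assumptions, not details from the actual project.

```sh
#!/bin/ksh
# Post-session cleanup sketch (paths, file pattern, and retention period are assumptions).
# Archives processed source files after a successful load and purges old archives.

SRC_DIR=/data/informatica/srcfiles/claims     # landing area read by the session
ARCH_DIR=/data/informatica/archive/claims     # archive location
STAMP=$(date +%Y%m%d_%H%M%S)

# Move processed flat files into a time-stamped archive folder
mkdir -p "${ARCH_DIR}/${STAMP}" || exit 1
mv "${SRC_DIR}"/*.dat "${ARCH_DIR}/${STAMP}/" 2>/dev/null

# Purge archives older than 30 days to keep the file system tidy
find "${ARCH_DIR}" -type f -mtime +30 -exec rm -f {} \;

exit 0
```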
Environment: Informatica Power Center 9.6, Toad for Oracle 12.6, Windows, UNIX Maestro, PL/SQL, SQL, Tidal 5.3.1, JavaScript.
Confidential, Grand Rapids, MI
Sr. Application Development Analyst
Responsibilities:
- Worked with business analysts to identify appropriate sources for Data warehouse and to document business needs for decision support data
- Extensively worked with sources extracted from Oracle, Teradata, flat files, and XML files.
- Star Schema was used to design the warehouse.
- Worked extensively on Source Analyzer, Mapping Designer, Warehouse Designer and Transformation Developer.
- Developed several Mappings and Mapplets using corresponding Source, Targets and Transformations.
- Extensively used nearly all transformations, such as Filter, Aggregator, Expression, Router, Lookup, Update Strategy, and Sequence Generator, to develop mappings that apply the business logic.
- Designed, developed, and deployed new data marts and modified existing marts to support additional business requirements.
- Configured and used FTP from the Informatica server to access source and target files.
- Scheduled, ran, and monitored sessions using the Informatica scheduler.
- Used PL/SQL whenever necessary inside and outside the mappings.
- Involved in writing SQL stored procedures and shell scripts to access data from Oracle (see the extract sketch after this list).
- Created, launched, and scheduled sessions; involved in the performance tuning of Informatica mappings.
- Wrote PL/SQL Packages and stored procedures to implement business rules and validations.
- Wrote the UNIX scripts to launch the Informatica workflows, which the scheduling team then scheduled using the Control-M agent.
- Worked with the testing team to create the test environment and introduced them to the Informatica Workflow Manager and Workflow Monitor.
- Closely interacted with the reporting team, which uses Business Objects, to explain the tables and the relationships between them.
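The Oracle-access shell scripts mentioned above generally followed the pattern below. The connection variables, schema, table, and output path are illustrative assumptions rather than details from the actual engagement.

```sh
#!/bin/ksh
# Extract sketch (connection variables, schema/table, and output path are assumptions).
# Spools an Oracle query to a pipe-delimited flat file for downstream loading.

OUT_FILE=/data/extracts/customer_dim_$(date +%Y%m%d).dat

sqlplus -s "${ORA_USER}/${ORA_PWD}@${ORA_SID}" <<EOF
SET PAGESIZE 0 FEEDBACK OFF HEADING OFF TRIMSPOOL ON LINESIZE 400
SPOOL ${OUT_FILE}
SELECT customer_id || '|' || customer_name || '|' || status
  FROM dw_stage.customer_dim;
SPOOL OFF
EXIT
EOF

# Fail the job if the extract file is missing or empty
[ -s "${OUT_FILE}" ] || exit 1
```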
Environment: Informatica 9.6, Toad for Oracle 12.6, Windows, UNIX shell programming, PL/SQL, SQL, XML, Teradata, Control-M agent, Oracle APEX, Git, CSS, JavaScript, HTML.
Confidential, Virginia Beach, VA
Informatica Developer
Responsibilities:
- Involved in Business Analysis and also in designing the technical specifications.
- Analyzed the high-level design specifications and transformed them into straightforward ETL code.
- Used Informatica Power Center extensively to load data from flat files to DB2, flat files to SQL Server, flat files to Oracle and DB2 to XML files, Teradata to Oracle.
- Designed and developed Mappings with various transformations such as Connected and Unconnected Lookup, XML Source Qualifier, Filter, Joiner, Router, Sorter, Expression, Aggregator, and Union transformations.
- Worked extensively with designer tool to develop Mappings, Mapplets, and Reusable transformations to extract, transform and load the data into database.
- Developed complex T-SQL queries and designed SSIS packages to load the data into the warehouse.
- Designed SSIS packages using several transformations to perform data profiling, data cleansing, and data transformation.
- Made use of AUTOSYS for monitoring and scheduling the jobs.
- Extensively used pmcmd commands to process the workflows (see the pmcmd examples after this list).
- Fine-tuned mappings and workflows to optimize workflow execution time.
- Used Erwin to create the physical and logical data models and maintained the relationships between the tables.
- Involved in the Migration process from Development, Test and Production Environments.
- Using Cognos, performed analysis, design, and documentation of report requirements; also performed tuning for ETL ODI applications.
- Made use of breakpoints and various test conditions with the help of the Debugger tool to test the logic and the validity of the data flows through the mappings.
- Performed end-to-end testing to validate new or updated business processes, customization and configuration, reports, and data migration.
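The pmcmd usage mentioned above generally looks like the commands below; the domain, Integration Service, folder, and workflow names are placeholders, not the project's actual objects.

```sh
# pmcmd usage sketch (domain, Integration Service, folder, and workflow names are placeholders).

# Start a workflow and wait for completion; the exit code reflects success or failure
pmcmd startworkflow -sv INT_SVC_DEV -d DOM_DEV -u "${INFA_USER}" -p "${INFA_PWD}" \
    -f CLAIMS_FOLDER -wait wf_load_claims

# Check run details for the same workflow
pmcmd getworkflowdetails -sv INT_SVC_DEV -d DOM_DEV -u "${INFA_USER}" -p "${INFA_PWD}" \
    -f CLAIMS_FOLDER wf_load_claims

# Abort a hung workflow when required
pmcmd abortworkflow -sv INT_SVC_DEV -d DOM_DEV -u "${INFA_USER}" -p "${INFA_PWD}" \
    -f CLAIMS_FOLDER wf_load_claims
```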
Environment: Informatica Power Center 9.6, Oracle 11g, SQL Server Integration Services (SSIS) 2005, LINUX, SQL Developer, Teradata, DB2, XML, flat files, PL/SQL, SQL Server, AUTOSYS, Erwin, Cognos 8, and Windows 7.
Confidential, New Brunswick, NJ
Informatica Developer
Responsibilities:
- Extracted data from various sources such as Oracle, Teradata, SQL Server, Flat files and DB2.
- Developed complex mappings involving various transformations such as Source Qualifier, Sorter transformation, Joiner transformation, Update Strategy, Lookup transformation, Expressions and Sequence Generator for loading the data into the target table.
- Worked extensively with session parameters, mapping parameters, mapping variables for incremental loading.
- Responsible for identifying the missed records in various stages from source to target and resolving the problem.
- Developed, tested, and debugged the mappings in Informatica.
- Resolved load failure issues on a daily basis.
- Designed and developed Cognos DataManager objects and scripts to extract and cleanse data from legacy systems and multiple disparate sources: RDBMS databases, XML, spreadsheets, flat files, and unstructured data files.
- Used SQL Server 2005 Integration Services (SSIS) transformations in the data flow of a package to aggregate, merge, distribute, and modify data.
- Designed SSIS packages to transfer data between servers and load data into the database; scheduled jobs to perform these tasks periodically.
- Created SSIS packages for application that would transfer data among servers and perform other data transformations.
- Created procedures to truncate data in the target before the session (see the truncate sketch after this list).
- Designed and developed UNIX shell scripts and Oracle PL/SQL scripts.
- Provided documentation of the developed ETL processes for future reference.
- Created jobs to automate the Informatica workflows and to copy and move files (DOS) using the Tidal scheduler.
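One common way to wire the pre-session truncate step above is a small pre-session shell command of the kind sketched below; the schema and table names are assumptions, and this is only a sketch of the approach, not the project's actual stored procedures.

```sh
#!/bin/ksh
# Pre-session truncate sketch (schema and table names are assumptions).
# Clears the staging targets before the Informatica session reloads them.

sqlplus -s "${ORA_USER}/${ORA_PWD}@${ORA_SID}" <<EOF
WHENEVER SQLERROR EXIT FAILURE
BEGIN
  -- TRUNCATE is DDL, so it is issued via EXECUTE IMMEDIATE from PL/SQL
  EXECUTE IMMEDIATE 'TRUNCATE TABLE stg.orders_stg';
  EXECUTE IMMEDIATE 'TRUNCATE TABLE stg.order_items_stg';
END;
/
EXIT
EOF
```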
Environment: Informatica Power Center 9.5, SSIS, Oracle 10g, Cognos DataManager, DB2, SQL Server, Flat files, SQL/PLSQL, T-SQL, UNIX, Shell Scripts, TOAD, Tidal.
Confidential, Chicago, IL
Informatica Developer
Responsibilities:
- Worked with the business and data analysts in requirements gathering and to translate business requirements into technical specifications.
- Created a number of complex mappings, Mapplets, reusable transformations, workflows, worklets, and sessions using Informatica Power Center 9.1 to implement the business logic and load the data incrementally.
- Used the transformations like Expression, Lookup, Source Qualifier, Normalizer, Aggregator, Java, Filter, Router, Sorter, etc.
- Extracted the data from the Flat files, Excel sheets, Mainframe COBOL files, SQL server, and Oracle databases into staging area and populated onto Data warehouse.
- Used the Workflow Manager for session management and database connection management.
- Scheduled the sessions to extract, transform, and load data into the warehouse database per business requirements using the ActiveBatch scheduling tool.
- Cleansed the data using MDM techniques.
- Utilized Informatica IDQ 8.6.1 to complete initial data profiling and to match and remove duplicate data.
- Used Debugger to test the mappings and fixed the bugs.
- Ran monthly and daily loads, tracked issues, and resolved them based on priority.
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Monitored and troubleshot batches and sessions for weekly and monthly extracts from various data sources across all platforms to the target database.
- Worked on the UNIX scripts for running the workflows and for threshold checks on the incoming files (see the threshold-check sketch after this list).
- Scheduled the UNIX scripts in AUTOSYS for automation.
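The incoming-file threshold check mentioned above can be sketched as follows; the file path and minimum row count are illustrative assumptions.

```sh
#!/bin/ksh
# Threshold-check sketch (file path and minimum row count are assumptions).
# Verifies that an incoming file exists and has enough records before the load runs.

IN_FILE=/inbound/files/daily_feed.dat
MIN_ROWS=1000

if [ ! -f "${IN_FILE}" ]; then
    echo "ERROR: incoming file ${IN_FILE} not found" >&2
    exit 1
fi

ROW_COUNT=$(wc -l < "${IN_FILE}")
if [ "${ROW_COUNT}" -lt "${MIN_ROWS}" ]; then
    echo "ERROR: ${IN_FILE} has ${ROW_COUNT} rows, below threshold ${MIN_ROWS}" >&2
    exit 1
fi

# Threshold met; the scheduler can proceed to launch the workflow
exit 0
```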
Environment: Informatica Power Center 9.1, MDM, IDQ 8.6.1, Oracle 9i/10g, UNIX, Autosys, Toad 9.5/10, WinSCP, SQL, PL/SQL, SQL Server, flat files, Mainframe COBOL files, UNIX shell scripting, and Windows XP.
Confidential, Boston
ETL Developer
Responsibilities:
- Worked closely with a project team for gathering the business requirements and interacted with business analysts to translate business requirements into technical specifications
- Created extraction and transformation processes and loaded data using Cognos DataManager.
- Extracted large volumes of data from different data sources into staging tables, performing transformations on them before loading into final tables.
- Performance tuning of sources, targets, mappings and SQL queries in the transformations.
- Developed PL/SQL procedures and functions to facilitate specific requirements.
- Updated existing procedures, functions, triggers, and packages to synchronize with changes in the transformations.
- Worked on delimited flat file sources and Extracted data from DB2 and Oracle source systems and loaded into flat files.
- Identified and eliminated duplicates in datasets through IDQ 8.6.1 components such as Edit Distance, Jaro Distance, and Mixed Field Matcher, enabling the creation of a single view of customers and helping control mailing-list costs by preventing duplicate mailings.
- Designed and supported development of dimensional models off transactional systems, and integrated multiple transaction systems in Cognos BI system.
Environment: Informatica Power Center 9.1, Cognos DataManager, IDQ 8.6.1, Oracle 10g, DB2, PL/SQL, SQL, Cognos.