ETL Developer Resume Profile
Danville, PA
Professional Summary
- Good knowledge of Master Data Management concepts.
- Designed, tested, and deployed data quality plans using Informatica Data Quality (IDQ) 8.5.
- Good knowledge of normalizing and de-normalizing tables and maintaining referential integrity using triggers and primary and foreign keys.
- Good knowledge of using system tables such as sysindexes, sysprocesses, sysobjects, and syscomments in various queries, and of using Database Console Commands (DBCC); see the sketch at the end of this summary.
- Experience in analyzing execution plans, managing indexes, and troubleshooting deadlocks.
- Experience in UNIX shell scripting, job scheduling, and communicating with the Informatica server using pmcmd.
- Used PowerExchange to integrate sources such as mainframe MVS, VSAM, GDG, DB2, and XML files.
- Experience in creating and updating clustered and non-clustered indexes to maintain SQL Server performance.
- Experience in upgrading from Informatica PowerCenter 8.6 to Informatica PowerCenter 9.1.
- Worked extensively on ETL process using Talend and SSIS.
- Experience in administering reports and assigning permissions to valid users for executing the reports.
- Extracted data from File Sources, Relational Sources, XML and COBOL sources using Informatica PowerCenter.
- SQL Server 2008/2005 RDBMS database development including T-SQL programming.
- Extensive experience with data modeling techniques, logical and physical database design.
- Hands on Experience in Installing, Configuring, Managing, Monitoring and Troubleshooting SQL Server 2005/2008.
- Good understanding of database and data warehousing concepts (OLTP, OLAP).
- Experience in Database Development, Data Warehousing, Design and Technical Management.
- Expert knowledge of Integration Services (SSIS), Analysis Services (SSAS), and Reporting Services (SSRS).
- SQL database experience in a high transaction and multi-server production environment.
- Worked extensively on ETL process using Informatica Power Center 9.x/8.x/7.x.
- 7 years of IT Experience in Data Warehousing, Database Design and ETL Processes in the Development, Test and Production environments of various business domains.
- Highly proficient in Development, Implementation, Administration and Support of ETL processes for Large-scale Data warehouses using Informatica Power Center.
- Hands-on experience in tuning mappings and identifying and resolving performance bottlenecks at various levels, including sources, targets, mappings, sessions, and the database.
- Worked on OLAP data warehouse modeling, design, and implementation.
- In-depth understanding of data warehousing and business intelligence concepts.
- Designed and developed efficient error-handling methods and implemented them throughout the mappings in various projects.
- Responsible for interacting with business partners to identify information needs and business requirements for Reports.
- Good knowledge of Kimball and Inmon data warehouse design approaches and considerations.
- Good understanding of dimensional models, slowly changing dimensions, and star/snowflake schemas.
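As a minimal illustration of the system-table and DBCC usage noted above, the following SQL Server sketch uses placeholder database and table names; actual queries varied by project.

```sql
-- Hedged sketch: placeholder names, SQL Server 2005/2008 syntax.
-- Approximate row counts for user tables via sysobjects/sysindexes.
SELECT o.name AS table_name, i.rowcnt AS approx_rows
FROM sysobjects o
JOIN sysindexes i ON i.id = o.id AND i.indid IN (0, 1)
WHERE o.xtype = 'U'
ORDER BY i.rowcnt DESC;

-- Sessions currently blocked by another spid.
SELECT spid, blocked, loginame, cmd
FROM sysprocesses
WHERE blocked <> 0;

-- Database Console Commands: integrity check and fragmentation report.
DBCC CHECKDB ('SalesDW');           -- 'SalesDW' is a placeholder database name
DBCC SHOWCONTIG ('dbo.FactSales');  -- placeholder table name
```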
Technical Skills
Professional Experience
Confidential
Role: ETL Developer
Confidential, a not-for-profit health maintenance organization (HMO), serves the health-care needs of members in 43 counties throughout central and northeastern Pennsylvania. Confidential teamed with Confidential, a joint venture between Confidential and Microsoft Corporation, to develop the Confidential application. The application is aimed at presenting a 360°, patient-centric view to the Health Services and Pharmacy stakeholders who manage patients' care. It integrates systems from both the payer and provider sectors of health care to deliver functionality for Case Management, Medical Management, Wellness Programs, Pharmacy, Appeals, and Quality Improvement, as well as a patient's personal health record (PHR), an employer view of that subpopulation, and access for providers of care.
Responsibilities:
- Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Normalizer, Lookup, Filter, Joiner, Rank, Router, Update Strategy and XML.
- Developed ad-hoc mappings for various business needs.
- Acquired substantial knowledge of the different source systems in healthcare, such as Amisys and Milliman.
- Experienced with loading into and reading from XML files in Informatica PowerCenter.
- Responsible for tuning ETL procedures and schemas to optimize load and query performance.
- Interpreted logical and physical data models for Business users to determine common data definitions.
- Involved in data validation, data integrity, database performance, field size validations, check constraints, and data manipulation.
- Implemented slowly changing dimensions to maintain current and historical data in the dimension tables (see the sketch following this list).
- Wrote UNIX shell scripts to FTP files from remote servers.
- Developed mappings to load into staging tables and then to Dimensions and Facts.
- Used Informatica Analyst for Data Profiling.
- Created mapping documents to outline data flow from sources to targets.
- Involved in upgrading from Informatica 9.1 to Informatica 9.5.1.
- Used Tidal scheduler to schedule and run Informatica workflows.
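The slowly changing dimension work above was built in Informatica mappings; the sketch below only illustrates the equivalent Type 2 logic in plain SQL, with a hypothetical member dimension and staging table.

```sql
-- Illustrative Type 2 SCD logic; dim_member/stg_member and their columns are hypothetical.
-- 1) Expire the current dimension row when a tracked attribute changes.
UPDATE d
SET    d.current_flag = 'N',
       d.effective_end_date = GETDATE()
FROM   dim_member d
JOIN   stg_member s ON s.member_id = d.member_id
WHERE  d.current_flag = 'Y'
  AND (d.plan_code <> s.plan_code OR d.address <> s.address);

-- 2) Insert a new current row for new members and for members expired in step 1
--    (neither has a remaining current row, so the anti-join picks them up).
INSERT INTO dim_member (member_id, plan_code, address,
                        effective_start_date, effective_end_date, current_flag)
SELECT s.member_id, s.plan_code, s.address, GETDATE(), NULL, 'Y'
FROM   stg_member s
LEFT JOIN dim_member d
       ON d.member_id = s.member_id AND d.current_flag = 'Y'
WHERE  d.member_id IS NULL;
```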
Environment: Informatica PowerCenter 9.1/9.5, Informatica PowerExchange, Informatica Analyst, UNIX, PL/SQL, MS SQL Server 2008, Oracle 11g, Tidal, PuTTY.
Confidential
Role: ETL Developer
Confidential is a large office supply chain with over 2,000 stores in 26 countries worldwide, headquartered in Confidential. The primary objective of this project is to get data from different sources (DB2, MySQL, SQL Server, Oracle, flat files), perform the necessary operations on the data per user requirements, load it into the data warehouse for analysis, and generate reports from it; the role also included work on some minor in-house projects.
Responsibilities:
- Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Normalizer, Lookup, Filter, Joiner, Rank, Router, Update Strategy and XML.
- Developed ad-hoc mappings for various business needs.
- Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis and delivering the transformed data to coordinated data marts.
- Responsible for tuning ETL procedures and schemas to optimize load and query performance.
- Performed design and analysis of business systems applications, system interfaces, databases, reporting, and business intelligence systems.
- Delivered new system functionality supporting corporate business objectives.
- Implemented the concept of slowly changing dimensions to maintain current and historical data in the dimension.
- Translated requirements and high-level design into detailed functional design specifications.
- Responsible for design, development and maintenance of Data Marts including Sales, Inventory, Customer Reporting and Resource Management leveraging Informatica Power Center ETL tool, Oracle and SQL Server.
- Interpreted logical and physical data models for Business users to determine common data definitions.
- Involved in data validation, data integrity, database performance, field size validations, check constraints, and data manipulation.
- Coordinated with Business Users to understand business needs and implement the same into a functional Data warehouse design.
- Used ERStudio to analyze and optimize database and data warehouse structure.
- Used Tidal scheduler to schedule and run Informatica workflows.
- Designed numerous ad-hoc ETL processes in Talend to load data from DB2 to MySQL for an in-house data quality project.
- Used Informatica Analyst to profile the source systems (see the sketch following this list).
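Profiling here was performed in Informatica Analyst; the SQL below is only a sketch of comparable checks (row counts, null counts, value frequencies) against a hypothetical source table.

```sql
-- Hypothetical source table src_orders; column names are placeholders.
SELECT COUNT(*)                                        AS total_rows,
       COUNT(DISTINCT customer_id)                     AS distinct_customers,
       SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END)  AS null_emails,
       MIN(order_date)                                 AS earliest_order,
       MAX(order_date)                                 AS latest_order
FROM   src_orders;

-- Value-frequency profile for a low-cardinality column.
SELECT order_status, COUNT(*) AS row_count
FROM   src_orders
GROUP BY order_status
ORDER BY row_count DESC;
```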
Environment: Informatica PowerCenter 9.1, Informatica PowerExchange, Informatica Analyst, Oracle 11g, UNIX, PL/SQL, Embarcadero ER/Studio Data Architect, TOAD, Tidal, PuTTY.
Confidential
Role: ETL Developer
Confidential is a global company with one of the most recognized and admired brands in the world. Confidential is the world's largest package delivery company and a leading global provider of specialized transportation and logistics services. The primary objective of this project is to get data from different sources (SQL Server, Oracle, flat files), perform the necessary operations on the data per user requirements, load it into the data warehouse for analysis, and generate reports from it.
Responsibilities:
- Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Lookup, Filter, Joiner, Rank, Router, Update Strategy and XML.
- Developed ad-hoc mappings for various business needs.
- Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis and delivering the transformed data to coordinated data marts.
- Performed design and analysis of business systems applications, system interfaces, databases, reporting, and business intelligence systems.
- Responsible for design, development and maintenance of Data Marts including Sales, Shipment, Customer Reporting and Transportation leveraging Informatica Power Center ETL tool, Oracle and SQL Server.
- Delivered new system functionality supporting corporate business objectives.
- Involved in production support activities with Installation and Configuration of Informatica Power Center 9.1.
- Translated requirements and high-level design into detailed functional design specifications.
- Wrote stored procedures, functions, and database triggers in SQL Server 2008 (see the sketch following this list).
- Responsible for tuning ETL procedures and schemas to optimize load and query performance.
- Interpreted logical and physical data models for Business users to determine common data definitions.
- Involved in data validation, data integrity, database performance, field size validations, check constraints, and data manipulation and updates using SQL single-row functions.
- Coordinated with Business Users to understand business needs and implement the same into a functional Data warehouse design.
- Used ERStudio to analyze and optimize database and data warehouse structure.
- Implemented the concept of slowly changing dimensions to maintain current and historical data in the dimension.
- Used the Normalizer transformation to normalize XML source data.
- Extensively used XML transformation to generate target XML files.
- Developed UNIX shell scripts for scheduling sessions in Informatica.
- Provided SME-level guidance to development and application support teams during the development and deployment phases.
- Facilitated/led design reviews of the functional design with other members of the technical team, communicating design, requirements, feature set, functionality, and limitations of systems/applications.
- Developed ETL technical specs, Visio diagrams of the ETL process flow, the ETL load plan, the ETL execution plan, test cases, test scripts, etc.
- Actively involved in building the system test environment
- Migrated mappings from Development to System Test environment and QA environment.
- Used Informatica Power center workflow manager to create sessions, workflows and Worklets to run with the logic embedded in the mappings.
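A minimal sketch of the SQL Server 2008 procedure and trigger work mentioned above; the object, table, and column names are hypothetical, not the project's actual objects.

```sql
-- Hypothetical load procedure: moves one day of shipments into the fact table.
CREATE PROCEDURE dbo.usp_LoadDailyShipments
    @LoadDate DATE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.FactShipment (shipment_id, customer_key, ship_date, weight_lbs)
    SELECT s.shipment_id, c.customer_key, s.ship_date, s.weight_lbs
    FROM   stg.Shipment s
    JOIN   dbo.DimCustomer c ON c.customer_id = s.customer_id
    WHERE  s.ship_date = @LoadDate;
END;
GO

-- Hypothetical audit trigger recording updates to the shipment fact.
CREATE TRIGGER dbo.trg_FactShipment_Audit
ON dbo.FactShipment
AFTER UPDATE
AS
BEGIN
    INSERT INTO dbo.FactShipment_Audit (shipment_id, changed_at)
    SELECT shipment_id, GETDATE() FROM inserted;
END;
GO
```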
Environment: Informatica PowerCenter 9.1/8.6, SQL Server 2005/2008, PL/SQL, Transact-SQL, Oracle 10g/11g, Embarcadero ER/Studio Data Architect, TOAD, UNIX
Confidential
Role: ETL Developer
Confidential oversees the operation of two chains of supermarkets and superstores in the Confidential, with warehouse facilities and headquarters in the Confidential town of Confidential. The idea of the project is to initially build data marts and then integrate all the marts into one enterprise-wide data warehouse.
Responsibilities:
- Extensively used transformations such as Source Qualifier, Router, Lookup (connected and unconnected), Update Strategy, Joiner, Expression, Aggregator, and Sequence Generator.
- Involved in dimensional modeling to design and develop star schemas, using Erwin 4.0 to identify fact and dimension tables.
- Worked with different Sources such as Oracle, MS SQL Server and Flat file.
- Used Informatica to extract data into Data Warehouse.
- Identified and tracked the slowly changing dimensions, heterogeneous Sources and determined the hierarchies in dimensions.
- Created reusable transformations and Mapplets and used them in mappings.
- Used Informatica Repository Manager to maintain all the repositories of various applications.
- Developed a number of complex Informatica mappings, mapplets, and reusable transformations for the Claim Profitability Systems to facilitate daily, monthly, and yearly data loads.
- Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, Testing of Informatica Sessions, and the Target Data.
- Wrote UNIX scripts and PL/SQL scripts for implementing business rules.
- Worked with SQL tools such as TOAD to run queries to validate the data (see the sketch following this list).
- Designed and developed pre-session, post-session, and batch execution routines to run Informatica sessions using the Informatica Server.
- Created and scheduled sessions and worklets using Workflow Manager to load the data into the target database.
- Wrote PL/SQL stored Procedures and Functions for Stored Procedure Transformations.
- Used the Control-M scheduler to automate the process.
- Participated in Review of Test Plan, Test Cases and Test Scripts prepared by system integration testing team.
- Monitored the performance and identified performance bottlenecks in ETL code.
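The data-validation queries run through TOAD varied by load; the sketch below shows typical checks against hypothetical staging, fact, and dimension tables, not the project's actual schema.

```sql
-- Row-count reconciliation between staging and the loaded fact (placeholder tables).
SELECT 'stg_sales'  AS table_name, COUNT(*) AS row_count FROM stg_sales
UNION ALL
SELECT 'fact_sales' AS table_name, COUNT(*) AS row_count FROM fact_sales;

-- Orphan check: fact rows whose product key has no matching dimension row.
SELECT f.product_key, COUNT(*) AS orphan_rows
FROM   fact_sales f
LEFT JOIN dim_product p ON p.product_key = f.product_key
WHERE  p.product_key IS NULL
GROUP BY f.product_key;
```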
Environment: Informatica PowerCenter 7.1.3, Oracle 9i, MS SQL Server 2000, MS Enterprise Manager, MS Query Analyzer, flat files, XML, Control-M scheduler, Linux, UNIX Korn shell scripting, TOAD, Erwin 4.0, Windows NT, ClearCase.
Confidential
Role: SQL BI Developer
Confidential is the world's single-source leader of automation technology products engineered and manufactured for all industrial sectors. This project was to create and maintain a central data warehouse for all the data and to create reports to give better analysis to the automation and drive department.
Responsibilities:
- Worked on complex data loading; implemented batch data cleansing and data loading.
- Used BCP utility to publish table output to text files.
- Worked on DTS packages and DTS Import/Export for transferring data from heterogeneous databases to SQL Server.
- Created and maintained indexes for a fast and efficient reporting process.
- Configured the server to send automatic mail notifications to the respective people on DTS package failure or success.
- Created new tables and wrote stored procedures and user-defined functions for application developers.
- Maintained a good client relationship by communicating daily status and weekly status of the project.
- Created linked servers between different SQL Server instances, and created linked servers to Access files used across various departments.
- Performance tuning of SQL queries and stored procedures using SQL Profiler and Index Tuning Wizard.
- Developed SQL scripts to Insert/Update and Delete data in MS SQL database tables.
- Developed code that matches the prototype and specification, is maintainable, and, as necessary, is portable to other environments.
- Created Business-Crucial stored procedures and functions to support efficient data storage and manipulation.
- Designed and developed database objects like Tables, Stored Procedures, Triggers, Rules, Defaults, user defined data types and functions for this project.
- Created indexes to achieve high performance and make consumer search queries faster (see the sketch following this list).
- Performed ETL operations to support the data loads and transformations using SSIS.
- Created reports using Microsoft SQL Server 2005 Reporting Services (SSRS), with proficiency in both Report Designer and Report Builder.
- Worked extensively in writing and debugging complex stored procedures, triggers, Inner Joins, Outer Joins, views and user-defined functions
- Used SQL Profiler for troubleshooting, Monitoring, Optimizing SQL Server and T-SQL statements from developers and testers.
- Implemented automated backup and database maintenance / cleanup jobs.
- Created and updated stored procedures to produce tables and flat files from the production database.
- Created various weekly, monthly/quarterly Tabular and Matrix Reports using SSRS to assist in decision making.
- Created and scheduled jobs for various tasks and was responsible for maintaining them.
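A small sketch of the index work referenced above, using SQL Server 2000/2005-compatible syntax; the table, columns, and index name are placeholders.

```sql
-- Hypothetical composite index supporting consumer search queries.
CREATE NONCLUSTERED INDEX IX_Customer_LastName_City
ON dbo.Customer (last_name, city, first_name);

-- Periodic maintenance after large loads: rebuild the index and refresh statistics.
DBCC DBREINDEX ('dbo.Customer', 'IX_Customer_LastName_City');
UPDATE STATISTICS dbo.Customer;
```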
Environment: SQL Server 7.0/2000/2005, Enterprise Manager, SQL Profiler, DTS, T-SQL, Query Analyzer, SSIS, SSRS, SSAS.