Sr. ETL Developer/Lead Resume
Plano, TX
SUMMARY
- 10+ years of IT experience in the analysis, design, development, testing and implementation of business application systems for the Healthcare, Pharmaceutical, Financial and Telecom domains.
- Strong experience in the analysis, design, development, testing and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI and Client/Server applications.
- Hands-on with Power Center client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
- Expertise in Data Warehouse/Data Mart, ODS, OLTP and OLAP implementations, combined with project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation and production support.
- Extensive ETL testing experience using Informatica Power Center 10/9.5.1/9.1/8.6.1 (Designer, Workflow Manager, Workflow Monitor and Server Manager), Teradata and Business Objects.
- Strong experience in Dimensional Modeling using Star and Snowflake schemas, identifying facts and dimensions.
- Expertise in working with relational databases such as Oracle 11g/10g/9i/8.x, SQL Server 2008/2005, DB2 8.0/7.0, MS Access, Teradata and Netezza.
- Expertise in working with CRM platform Salesforce.
- Strong experience in Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), Power Exchange and Power Connect as ETL tools on Oracle, DB2 and SQL Server databases.
- Extensive experience in developing stored procedures, functions, views, triggers and complex SQL queries in SQL Server and Oracle SQL/PL/SQL.
- Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
- Expertise in Installing and managing Informatica Power center, Metadata Manager, Data Explorer and Data Quality.
- Experience in all phases of Data Warehouse development, from requirements gathering through code development, unit testing and documentation.
- Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting.
- Proficient in the integration of various data sources with multiple relational databases like Oracle 11g/10g/9i, MS SQL Server, DB2, Teradata and Flat Files into the staging area, ODS, Data Warehouse and Data Mart.
- Experience in using Automation Scheduling tools JAMS, Autosys and Control-M.
- Worked extensively with slowly changing dimensions.
- Extensive experience in Extraction, Transformation and Loading of data from different heterogeneous source systems like Flat files, Excel, NETEZZA, DB2, Oracle, SQL Server and Teradata.
- Extensively worked on migration of objects in all phases (DEV, QA and PROD) of project.
- Experience in Informatica Data Quality, Data Explorer and Power Exchange.
- Performed Unit testing, integration and System testing.
- Developed, Modified and Tested UNIX Shell scripts and necessary Test Plans to ensure the successful execution of the data loading process.
- SAP ECC connectivity using Informatica BCI & ALE/IDOCS, BAPI/RFC.
- SAP BW connectivity with Informatica.
- Excellent communication and interpersonal skills, ability to learn quickly, good analytical reasoning and high adaptability to new technologies and tools.
TECHNICAL SKILLS
ETL Tools: Informatica Power Center 10.1.0/9.5.1 HF 4/9.1.0/8.6/8.1 & SSIS
Databases: Oracle 11g/10g, Netezza 7.1.0.x, SQL Server, Teradata 13.0
Languages: SQL, PL/SQL, Unix Shell Scripting (ksh)
CRM: Salesforce (API version 36.0)
Environment: UNIX (AIX, Solaris, LINUX), Windows NT/XP.
Data Modeling tools: Erwin, MS Visio
OLAP Tools: Cognos 8.0/8.1/8.2/8.4/7.0, OBIEE 10.1.3.4/10.1.3 & Tableau.
Scheduling Tools: JAMS, Autosys and Control-M
Testing Tools: QTP and ALM (Quality Center 11.0)
Project Tracking Tools: JIRA & ServiceNow (SNOW).
Big Data: Hadoop (HDFS, MapReduce, YARN), Hive, Pig, HBase.
Web Server: Internet Information Server (IIS) 5.1/6.0/7.0
Web Technologies: ASP.NET, HTML, WPF, WCF and MVC
PROFESSIONAL EXPERIENCE
Confidential, Plano, TX
Sr. ETL Developer/ Lead
Responsibilities:
- Collaborated with business analysts to gather functional requirements and prepared technical design documents for the ETL process.
- Used Informatica Power Center to load data from different sources such as flat files, Oracle and DB2 into IR.
- Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).
- Analyzed the source systems to determine the elements needed for ETL designs and performed extensive data profiling for ETL maps.
- Provided JIL scripts to the Autosys team and ensured the workflows were triggered accordingly.
- Developed shell scripts to kick off the workflows and verify file transfers through MFT (a minimal sketch of this pattern follows this list).
- Developed Informatica Mappings and Workflows for Data Integration.
- Organized the Workflows and Sessions for the Initial and Incremental Data load runs.
- Prepared the Schedules for the Initial Load runs and Incremental Load runs.
- Designed test cases for initial and Incremental Load testing.
- Provided updates to the team, participated in code reviews and assigned tasks to team members.
- Ensured the SFDC Bulk API and SFDC error file properties were enabled at the session level when loading data into Salesforce.
- Interacted with the TIBCO and PeopleSoft teams to convert files per their requirements and load the data into IR.
- Validated data and ensured it could be consumed by upstream and downstream systems.
- Created connections for Salesforce, Oracle and Netezza in the standard format per the requirements.
- Converted user defined functions of business process into Informatica defined functions.
- Developed and supported the extraction, transformation and load (ETL) process for a Data Warehouse from legacy systems using Informatica, and provided technical support and hands-on mentoring in the use of Informatica.
- Debugged the Informatica mappings and validated the data in the target tables once it was loaded.
- Prepared and implemented data verification and testing methods for the Data Warehouse, designed and implemented data staging methods, and stress-tested ETL routines to make sure they don't break under heavy loads.
- Imported source and target objects from the databases and the CRM application.
- Executed SOQL queries in Salesforce and validated data in Salesforce.
- Tracked all project issues in SNOW and raised RFCs for CAB approval to promote code from STG to PROD.
- Verified file headers to make sure the client sent the right file layout per the submission guide.
- Worked in an offshore/onsite model.
- Provided knowledge transfer (KT) to the support team on the interfaces and workflows.
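A minimal sketch of the file-check-then-trigger pattern referenced above, assuming hypothetical directory, service, folder and workflow names, with credentials exported by the calling environment:

```sh
#!/usr/bin/ksh
# All directory, service, folder and workflow names below are hypothetical placeholders.
SRC_DIR=/data/mft/inbound
FILE_PATTERN="member_extract_*.dat"
INFA_SVC=INT_SVC_DEV          # Integration Service (placeholder)
INFA_DOMAIN=DOM_DEV           # Informatica domain (placeholder)
INFA_FOLDER=FLDR_MEMBER       # repository folder (placeholder)
WORKFLOW=wf_load_member_ir    # workflow to trigger (placeholder)

# 1. Confirm the MFT transfer delivered a file before kicking off the load.
FILE=$(ls -1t ${SRC_DIR}/${FILE_PATTERN} 2>/dev/null | head -1)
if [ -z "${FILE}" ]; then
    echo "$(date '+%Y-%m-%d %H:%M:%S') no source file found in ${SRC_DIR}" >&2
    exit 1
fi

# 2. Start the Informatica workflow and wait for completion.
#    INFA_USER / INFA_PWD are assumed to be exported by the calling environment.
pmcmd startworkflow -sv ${INFA_SVC} -d ${INFA_DOMAIN} \
      -u "${INFA_USER}" -p "${INFA_PWD}" \
      -f ${INFA_FOLDER} -wait ${WORKFLOW}

# 3. Pass the workflow return code back to the scheduler (Autosys).
exit $?
```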
Environment: Informatica Power Center 10.1.0, Salesforce, Oracle 11g, MS Access, SQL Developer 4.2.1, TOAD, SNOW, MS Excel, IBM MFT, Salesforce accelerator, Informatica Cloud, Aginity Workbench 4.6, Salesforce Workbench API version 36.0, Putty, FileZilla, WinSCP, IBM AS400, TIBCO, PeopleSoft, Autosys Scheduler and Tableau.
Confidential
Sr. ETL Informatica Consultant
Responsibilities:
- Involved in the full project life cycle, from analysis through production implementation and support, with emphasis on identifying sources and validating source data, developing transformation logic per the requirements, and creating mappings to load the data into Oracle, Netezza and Salesforce.
- Analyzed the source systems to determine the elements needed for ETL designs and performed extensive data profiling for ETL maps.
- Developed Informatica Mappings and Workflows for Data Integration.
- Organized the Workflows and Sessions for the Initial and Incremental Data load runs.
- Prepared the Schedules for the Initial Load runs and Incremental Load runs.
- Designed test cases for initial and Incremental Load testing.
- Ensured the SFDC Bulk API and SFDC error file properties were enabled at the session level when loading data into Salesforce.
- Created connections for Salesforce, Oracle and Netezza in the standard format per the requirements.
- Converted user defined functions of business process into Informatica defined functions.
- Developed and supported the extraction, transformation and load (ETL) process for a Data Warehouse from legacy systems using Informatica, and provided technical support and hands-on mentoring in the use of Informatica.
- Debugged the Informatica mappings and validated the data in the target tables once it was loaded.
- Prepared and implemented data verification and testing methods for the Data Warehouse, designed and implemented data staging methods, and stress-tested ETL routines to make sure they don't break under heavy loads.
- Imported source and target objects from the databases and the CRM application.
- Executed SOQL queries in Salesforce and validated data in Salesforce.
- Verified file headers to make sure the client sent the right file layout per the submission guide.
- Executed various queries to make sure the source and target counts matched and to identify any drop-off counts (a minimal count-reconciliation sketch follows this list).
- Sent emails and automated messages across the DIS team in case of emergencies or application maintenance support.
- Checked the data in the sandbox per the requirements and verified in the ETL code that all fields populate the target system appropriately per the S2T mapping.
- Improved performance and fixed bottlenecks for processes already running in production, reducing load time by 3-4 hours for each process.
- Responsible for production support of the system on a rota basis, monitoring critical jobs and escalating issues to upper management in case of emergency.
- Worked on Address Doctor and validated addresses and ZIP codes per the requirements.
- Coordinated with DAs, BAs and PMs on client enhancements and ensured reference data was available in the mart; adhered to HIPAA regulations and completed HIPAA certification per organizational standards.
- Extensively involved in migration of Informatica Objects, Database objects from one environment to another.
- Communicated infrastructure needs and direction to management.
- Configured OS Profiles and LDAP Authentication.
- Migration of Informatica mappings/sessions/workflows from Dev and Training to Prod environments.
- Created groups, roles and privileges and assigned them to each user group.
- Wrote repository queries in support of development and production statistics.
- Ensured proper application of the licensing files.
- Developed shell scripts, auto-run as jobs in the JAMS scheduler, to pick up files from one server to another, validate source files and archive files to the required path.
- Migration of Informatica Mappings/Sessions/Workflows from Sandbox to Prod environments.
- Created Groups, roles, privileges and assigned them to each user group.
- Developed, Modified and Tested UNIX Shell scripts and necessary Test Plans to ensure the successful execution of the data loading process.
- Added and updated ODBC and TNS entries.
- Extensively used workflow variables, mapping parameters and mapping variables.
- Responsible for user and group management and folder security.
- Provided application support to ETL application development teams.
- Created, backed up, restored and performed disaster recovery (DR) for Informatica repositories.
- Handled space issues on UNIX servers to minimize repository unavailability.
- Extensively worked on Informatica code deployments across different repositories.
- Recorded and maintained ETL software procedures.
- Worked in an offshore/onsite model.
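A minimal count-reconciliation sketch of the kind referenced above, assuming hypothetical staging and mart table names and an Oracle connect string supplied via an ORA_CONN environment variable:

```sh
#!/usr/bin/ksh
# Hypothetical table names; ORA_CONN (user/password@tns) is assumed to be exported.
SRC_TAB=STG.CLAIM_STAGE
TGT_TAB=MART.CLAIM_FACT

get_count () {
    # -s keeps sqlplus quiet so only the count comes back
    sqlplus -s "${ORA_CONN}" <<EOF
set heading off feedback off pagesize 0
select count(*) from $1;
exit;
EOF
}

SRC_CNT=$(get_count ${SRC_TAB} | tr -d ' ')
TGT_CNT=$(get_count ${TGT_TAB} | tr -d ' ')

if [ "${SRC_CNT}" -ne "${TGT_CNT}" ]; then
    echo "Count mismatch: source=${SRC_CNT} target=${TGT_CNT} drop-off=$((SRC_CNT - TGT_CNT))" >&2
    exit 1
fi
echo "Counts match: ${SRC_CNT}"
```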
Environment: Informatica Power Center 9.5.1 HF4, IBM Netezza 7.1.0.x, Salesforce, Oracle 11g, MS Access, SQL Developer 4.2.1, HP Quality Center 11.0 (ALM), JIRA, MS Excel, Aginity Workbench 4.6, Salesforce Workbench API version 36.0, Putty, FileZilla, WinSCP, JAMS Scheduler, Tableau and Health Care Solver.
Confidential, Houston, TX
Informatica Developer Lead/Admin
Responsibilities:
- Responsible for Business Analysis and Requirements Collection.
- Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Translated high-level design specifications into simple ETL coding and mapping standards.
- Designed and customized data models for Data warehouse supporting data from multiple sources on real time.
- Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
- Created mapping documents to outline data flow from sources to targets.
- Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
- Extracted the data from flat files and other RDBMS databases into the staging area and populated the Data Warehouse.
- Maintained stored definitions, transformation rules and target definitions using Informatica Repository Manager.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Developed mapping parameters and variables to support SQL override.
- Created Mapplets to use them in different mappings.
- Developed mappings to load into staging tables and then to Dimensions and Facts.
- Used existing ETL standards to develop these mappings.
- Used Crontab for scheduling few of the jobs to auto run as per the requirement.
- Worked on different tasks in Workflows such as sessions, event raise, event wait, decision, email, command, worklets, assignment, timer and scheduling of the workflow.
- Created sessions and configured workflows to extract data from various sources, transform the data and load it into the data warehouse.
- Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
- Extensively used SQL*Loader to load data from flat files into database tables in Oracle.
- Modified existing mappings for enhancements of new business requirements.
- Used Debugger to test the mappings and fixed the bugs.
- Migration of Informatica Mappings/Sessions/Workflows from Dev, Test, UAT to Prod environments.
- Created Groups, roles, privileges and assigned them to each user group.
- Developed, Modified and Tested UNIX Shell scripts and necessary Test Plans to ensure the successful execution of the data loading process.
- Added and updated ODBC and TNS entries.
- Responsible for user and group management and folder security.
- Created, backed up, restored and performed disaster recovery (DR) for Informatica repositories.
- Handled space issues on UNIX servers to minimize repository unavailability.
- Extensively worked on Informatica code deployments across different repositories.
- Performed ETL administration independently with little direct supervision.
- Wrote UNIX shell scripts and pmcmd commands for FTP of files from remote servers and backup of the repository and folders (a minimal backup sketch follows this list).
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Prepared migration document to move the mappings from development to testing and then to production repositories.
- Had played the role of an offshore ETL Technical Lead/Architect on this Project and was responsible for creating Technical Specs & Low Level Design for Offshore development.
- Was completely responsible for offshore delivery and the creation of WBS - Work Breakdown Structure for the offshore team.
- Played the role of a project coordinator between various work streams within the project and created Unit Test and Integration Test Plans.
- Organized daily technical discussions with the onsite team, including the individual offshore work-stream leads, and set expectations for offshore delivery.
- Provided daily and weekly status updates to the Project Manager.
- Was responsible for integrating migrated code, performing integration tests from offshore and publishing test results to the onsite team.
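A minimal sketch of the repository backup and FTP step referenced above; the repository, domain and host names are hypothetical and credentials are assumed to be exported beforehand:

```sh
#!/usr/bin/ksh
# Repository, domain and remote host names are hypothetical placeholders.
REPO=REP_DEV
DOMAIN=DOM_DEV
BKP_NAME=${REPO}_$(date +%Y%m%d).rep
BKP_FILE=/infa/backup/${BKP_NAME}

# Connect to the repository and take a backup with pmrep.
pmrep connect -r ${REPO} -d ${DOMAIN} -n "${REPO_USER}" -x "${REPO_PWD}" || exit 1
pmrep backup -o ${BKP_FILE} || exit 1

# Push the backup to the remote archive server over FTP (placeholder host and path).
ftp -n remote-backup-host <<EOF
user ${FTP_USER} ${FTP_PWD}
binary
put ${BKP_FILE} /archive/${BKP_NAME}
bye
EOF
```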
Environment: Informatica Power Center 9.5.1, Workflow Manager, Workflow Monitor, Putty, FileZilla, Data Analyzer 8.1, PL/SQL, Oracle 11g, Autosys, SQL Server 2005, UNIX, Toad 9.0, Cognos 8, Windows Server 2012, RFC, SAP ECC, SAP BW and Click software.
Confidential
ETL Developer
Responsibilities:
- Performed logical and physical data modeling using Erwin for the data warehouse database in a star schema.
- Using Informatica Power Center Designer, analyzed the source data to extract and transform it from various source systems (Oracle 10g, DB2, SQL Server and flat files), incorporating business rules through the objects and functions the tool supports.
- Using Informatica Power Center created mappings and Mapplets to transform the data according to the business rules.
- Used various transformations like Source Qualifier, Joiner, Lookup, SQL, Router, Filter, Expression and Update Strategy.
- Developed stored procedures and used them in the Stored Procedure transformation for data processing, and used data migration tools.
- Documented Informatica mappings in Excel spreadsheets.
- Tuned the Informatica mappings for optimal load performance.
- Created and Configured Workflows and Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
- Generated reports using OBIEE 10.1.3 for future business use.
- Carried primary responsibility for problem determination and resolution for each SAP application system database server and application server.
- Worked with the UNIX team to write UNIX shell scripts that customize server job scheduling (a hypothetical crontab sketch follows this list).
- Constantly interacted with business users to discuss requirements.
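A hypothetical sketch of the kind of server job scheduling referenced above, expressed as crontab entries; all script and log paths are placeholders:

```sh
# Hypothetical crontab entries (installed with crontab -e); script paths are placeholders.
# minute hour day-of-month month day-of-week  command
30 1 * * *  /home/etl/scripts/run_daily_load.ksh     >> /home/etl/logs/daily_load.log     2>&1
0  3 * * 0  /home/etl/scripts/run_weekly_refresh.ksh >> /home/etl/logs/weekly_refresh.log 2>&1
```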
Environment: Informatica Power Center 9.1.0, Informatica Repository Manager, Oracle 10g/9i, DB2 6.1, Erwin, TOAD, SAP 3.1.H, UNIX, Teradata, PL/SQL, SQL Developer, WinSCP and Putty.
Confidential, Waltham, MA
ETL consultant
Responsibilities:
- Analyzed the requirements and framed the business logic for the ETL process.
- Extracted data from Oracle as one of the source databases.
- Involved in the ETL design and its documentation.
- Interpreted logical and physical data models for business users to determine common data definitions and establish referential integrity of the system using ER-STUDIO.
- Followed Star Schema to design dimension and fact tables.
- Handled slowly changing dimensions.
- Collected and linked metadata from diverse sources, including Oracle relational databases, XML and flat files.
- Responsible for the development, implementation and support of the databases.
- Designed and developed functions, procedures, triggers and packages in PL/SQL.
- Developed mappings in Informatica to load the data including facts and dimensions from various sources into the Data Warehouse, using different transformations like Source Qualifier, JAVA, Expression, Lookup, Aggregate, Update Strategy and Joiner.
- Developed reusable Mapplets and Transformations.
- Used a data integration tool to support batch and real-time integration, and worked on the staging and integration layers.
- Optimized the performance of the mappings through various tests on sources, targets and transformations.
- Designed and developed Informatica mappings and workflows; identified and removed bottlenecks to improve the performance of mappings and workflows.
- Reviewed existing code and led efforts to tweak and tune the performance of existing Informatica processes.
- Scheduled the sessions to extract, transform and load data into the warehouse database per business requirements.
- Scheduled the tasks using Autosys.
- Loaded the flat files data using Informatica to the staging area.
- Created shell scripts for generic use.
- Created high-level design documents and technical specifications, performed coding and unit testing, and resolved defects using Quality Center 10.
- Developed unit/assembly test cases and UNIX shell scripts to run along with daily/weekly/monthly batches to reduce or eliminate manual testing effort.
Environment: Windows XP/NT, Informatica Power Center 8.6, UNIX, Teradata V14, Oracle 11g, Oracle Data Integrator, SQL, PL/SQL, SQL Developer, Erwin, MS Visio, Autosys, Korn Shell, WinSCP, Putty and Quality Center 10.
Confidential
Informatica Administrator
Responsibilities:
- Installed, configured and upgraded Informatica Power Center 9.5.0/9.1.0 HF1/8.6.1 on the IBM AIX platform.
- Assisted the team in defining solutions using Informatica.
- Planned and managed software upgrades and installations.
- Communicated infrastructure needs and direction to management.
- Installed and configured IDQ 9.1 components on the AIX server.
- Configured OS Profiles and LDAP Authentication.
- Migration of Informatica mappings/sessions/workflows from Dev and QA to Prod environments.
- Created groups, roles and privileges and assigned them to each user group.
- Wrote repository queries in support of development and production statistics.
- Ensured proper application of the licensing files.
- Configured Teradata Parallel Transporter with Informatica 9.1 and created MultiLoad, FastLoad and FastExport connections.
- Migration of Informatica IDQ Mappings/Sessions/Workflows from Dev, QA to Prod environments.
- Created Groups, roles, privileges and assigned them to each user group.
- Developed, Modified and Tested UNIX Shell scripts and necessary Test Plans to ensure the successful execution of the data loading process.
- Added and updated ODBC and TNS entries.
- Responsible for user and group management and folder security.
- Provided application support to ETL application development teams.
- Created, backed up, restored and performed disaster recovery (DR) for Informatica repositories.
- Handled space issues on UNIX servers to minimize repository unavailability (a minimal space-monitoring sketch follows this list).
- Extensively worked on Informatica code deployments across different repositories.
- Recorded and maintained ETL software procedures.
- Performed ETL administration independently with little direct supervision.
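A minimal sketch of the space-monitoring approach referenced above, assuming hypothetical filesystem paths, a placeholder alert address and a 14-day log retention:

```sh
#!/usr/bin/ksh
# Hypothetical filesystem, log paths and threshold; illustrates the approach only.
INFA_FS=/infa
THRESHOLD=85
LOG_DIRS="/infa/shared/SessLogs /infa/shared/WorkflowLogs"

# Pull the "% used" figure for the filesystem (first %-field works for both AIX and Linux df output).
USED=$(df -k ${INFA_FS} | tail -1 | \
       awk '{ for (i = 1; i <= NF; i++) if ($i ~ /%/) { gsub("%", "", $i); print $i; exit } }')

if [ "${USED}" -ge "${THRESHOLD}" ]; then
    echo "$(date) ${INFA_FS} is ${USED}% full - purging logs older than 14 days"
    for d in ${LOG_DIRS}; do
        find ${d} -type f -mtime +14 -exec rm -f {} \;
    done
    # Alert the admin group (placeholder address).
    echo "${INFA_FS} reached ${USED}% used" | mailx -s "Informatica filesystem space alert" etl-admins@example.com
fi
```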
Environment: Informatica Power Center 8.6.1, Oracle 10g, IBM AIX, Teradata, Control-M, Sybase.