
Senior Datawarehouse Developer Resume


Woonsocket, RI

SUMMARY

  • More than 11 years of IT experience in business analysis, design, data modeling, coding, implementation and testing of applications using a wide range of technologies, including data warehousing, data masking and databases, for the Energy Resources, Manufacturing and Retail domains.
  • Good expertise in migrating Informatica sessions and mappings from one version to another.
  • Good experience with Informatica IICS (cloud) and other cloud technologies for data and application integration.
  • Good experience with Azure Data Factory mappings, pipelines and data flows, and with Logic Apps.
  • Worked with Salesforce, AWS Redshift and cloud data warehouses such as Snowflake using Informatica Cloud connections.
  • Used SSIS/SSRS to design and develop data loads, cleanse data from SQL-based sources and generate reports.
  • Well versed in Informatica PowerCenter 8.1, 8.6, 9.1, 9.5, 9.6, 10.1 and 10.2 (64-bit).
  • Experience with Informatica MDM/IDQ for cleansing, merging and matching patient/prescriber data in a previous Confidential project and in the current Confidential project.
  • Well versed in Python, PySpark and Python libraries such as pandas, NumPy, seaborn and matplotlib for automation, data conversion and analysis (a pandas sketch follows this list). Expert in various regression machine learning (ML) techniques.
  • Used Power BI/Tableau for drill-down and drill-up dashboards and for analysis of data.
  • Experience in working with all the transformations in Informatica and applying logic according to the requirements.
  • Responsible for developing various data extracts and loading routines (batches) using Informatica PowerCenter 10.2 and earlier versions, Oracle PL/SQL, UNIX and Netezza.
  • Responsible for designing and developing mappings, mapplets, sessions and workflows to load data from source to target databases using Informatica PowerCenter Designer.
  • Used Informatica PowerCenter Workflow Manager to create sessions and batches to run with the logic embedded in the mappings.
  • Create procedures, functions and packages for different RxConnect modules and other functional requirements.
  • Writing UNIX shell scripts for data validation, integration and automation.
  • Loading data from Salesforce to the ODS database using Informatica Cloud, with real-time replication via outbound-message workflows.
  • Loading data into AWS Redshift tables using Informatica Cloud.
  • Analyze and fix defects occurring in the Production environment.
  • Delivered major change requests (CRs) in the project.
  • SPOC for one of the major functional parts of the project, i.e. Datasync and Datamask, which provides better security for the client’s data by masking it.
  • Interacted directly with the client and the team for requirement gathering and analysis.
  • Carrying out impact analysis of CRs and defects.
  • Following up with customers on satisfaction with the solution and conducting User Acceptance Testing.
  • Tuning the performance of batches and Informatica workflows.
  • Experience in integrating various data sources, including databases such as Oracle and SQL Server and file formats such as flat files, CSV files, COBOL files and XML files.
  • Sound knowledge of Oracle 10g/9i/8i/8.0/7.x, MS SQL Server 2000, MS Access 2000, PL/SQL and SQL*Plus.
  • Worked on various database tools such as Toad, SQL Developer, PL/SQL Developer and Rapid SQL.
  • Extensively executed SQL queries against Oracle (using Toad) and SQL Server tables to verify successful movement of data and to validate it.
  • Experience in integrating data sources across multiple relational databases such as Oracle and SQL Server, as well as XML files and flat files (fixed-width and delimited).
  • Expertise in several key areas of enterprise data warehousing, such as Change Data Capture (CDC), data quality, lookup tables and ETL data movement.
  • Experience in identifying Bottlenecks in ETL Processes and Performance tuning of the production applications using Database Tuning, Partitioning, Index Usage, Aggregate Tables, and Normalization/ Denormalization strategies.
  • Experience in database performance tuning.
  • Experience in using Informatica command-line utilities such as pmcmd to execute workflows in non-Windows environments (an illustrative invocation follows this list).
  • Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Good knowledge of Hadoop-Hive and Pig.
  • Experience in ETL testing, involved in QA testing with various teams.
  • Highly motivated to take on independent responsibility, with the ability to contribute as a productive team member.
  • Ability to work in multiple projects simultaneously.
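
As an illustration of the Python/pandas work referenced above, the following is a minimal, hypothetical sketch of a cleansing and summary step; the file name and columns (patient_id, fill_date, qty) are assumptions for illustration, not details from any actual project.

    import pandas as pd

    # Hypothetical extract produced by an upstream job (file and column names are assumptions).
    df = pd.read_csv("prescriber_extract.csv", parse_dates=["fill_date"])

    # Basic cleansing: trim key fields, drop duplicate records, fill missing quantities.
    df["patient_id"] = df["patient_id"].astype(str).str.strip()
    df = df.drop_duplicates(subset=["patient_id", "fill_date"])
    df["qty"] = df["qty"].fillna(0)

    # Simple analysis: monthly fill counts written out for a downstream load or dashboard.
    monthly = (
        df.groupby(df["fill_date"].dt.to_period("M"))["qty"]
          .sum()
          .reset_index()
    )
    monthly.to_csv("monthly_fills.csv", index=False)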
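The pmcmd bullet above refers to Informatica's standard command-line client; below is a minimal sketch of triggering a workflow from a UNIX host, wrapped in Python only so the examples stay in one language. The service, domain, folder, workflow and credential values are placeholders.

    import subprocess

    # Placeholder connection details -- illustrative only, not from any real environment.
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC_DEV",      # integration service
        "-d", "Domain_Dev",        # Informatica domain
        "-u", "etl_user",          # repository user
        "-p", "etl_password",      # password (an environment-variable option is preferable in practice)
        "-f", "SALES_DM",          # repository folder
        "-wait",                   # block until the workflow finishes
        "wf_load_sales_daily",     # workflow name
    ]

    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        raise SystemExit("pmcmd reported a non-zero return code: %d" % result.returncode)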

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.2, Informatica IICS, MDM, IDQ, SQL*Loader, SAP BODI, AWS, Teradata, Azure Data Factory, Azure Databricks, SSIS/SSRS

Databases: Oracle 11i/10g/9i/8i/7.x, SQL Server 2012/2008/2005, MySQL, AWS Redshift, Snowflake, Azure SQL, Netezza

Scripting: Unix Shell Scripting.

GUI: Toad 9.x/7.4, Oracle Designer 2000, PL/SQL Developer, SQL Developer, Salesforce, SQL Workbench

OS: UNIX (Sun Solaris 2.x, HP-UX 11), Win 95/NT/98/2000/XP, MVS, MS-DOS, IBM AIX 4.2.

Scheduling: Control-M, Informatica Scheduler, RunMyJobs

Other: SQL*Plus, SVN, MS Word, Excel, Outlook, PowerPoint, HP Lifecycle, Tortoise, BMC Remedy, Rapid SQL, Mercury Quality Center.

PROFESSIONAL EXPERIENCE

Confidential

Senior Datawarehouse Developer

Responsibilities:

  • Design, develop, test and implement detailed software application and ETL solutions using various tools and technologies, including Informatica PowerCenter, IICS cloud, SQL*Loader, SSIS/SSRS and Azure Data Factory pipelines, under multiple operating systems.
  • Supervise the junior members of the team as module lead for EIM enterprise solutions.
  • Help the team troubleshoot production issues and keep the Jira board updated with the latest status of tickets and ongoing issues.
  • Use AWS S3 as the data lake and load encoded data into AWS Redshift using COPY/UNLOAD CLI commands as well as Informatica PowerCenter (an illustrative COPY/UNLOAD sketch follows this list).
  • Gather and analyze requirements, translate requirements to technical specifications for development and implementations. Design, develop, test and deploy mappings.
  • Prepare high level design documents and design and develop specifications for complex systems.
  • Develop and support custom SQL and Netezza packages, procedures, functions, triggers and views.
  • Migrate current data from Netezza to cloud Snowflake as the EDW data warehouse.
  • Perform code deployment on test, stage and production environments.
  • Create ETL components for DW and other distributed applications. Conduct unit testing for enhancements and new development. Perform debugging and provide issue resolutions.
  • Analyzing performance bottlenecks using OLAP function and query optimization techniques.
  • Analyze existing applications to determine functionality and implement optimal designs and provide enhancements to meet changing requirements.
  • Create charts and dashboards using data profiling and analytics techniques in Power BI, Tableau and Python.
  • Troubleshoot and support ETL systems in production environments and fix bugs. Prepare workflow charts and diagrams to specify detailed operations to be performed by the programs.
  • Work on building the customer relationship and a 360-degree view of data using multiple Informatica tools such as PowerCenter, IDQ and MDM. This process includes landing and staging the data, cleansing it with cleanse functions and mappings before loading it into base object tables, and then applying merge configurations to obtain the golden records. The 360-degree view is then verified in the IDD console, where data domains and business areas are created as per business demand.
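
As a companion to the S3/Redshift bullet above, here is a minimal sketch of the COPY/UNLOAD pattern, assuming psycopg2 for connectivity (Redshift accepts PostgreSQL-protocol clients); the cluster endpoint, credentials, S3 paths, IAM role and table names are all placeholders.

    import psycopg2

    # Placeholder cluster endpoint and credentials -- illustrative only.
    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
        port=5439,
        dbname="dw",
        user="etl_user",
        password="etl_password",
    )
    conn.autocommit = True

    # COPY: bulk-load gzipped CSV files staged in the S3 data lake into a staging table.
    copy_sql = """
        COPY stage.sales_daily
        FROM 's3://example-data-lake/sales/2024-01-15/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
        CSV GZIP IGNOREHEADER 1;
    """

    # UNLOAD: export a query result back to S3 for downstream consumers.
    unload_sql = """
        UNLOAD ('SELECT * FROM mart.sales_summary')
        TO 's3://example-data-lake/exports/sales_summary_'
        IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
        GZIP PARALLEL OFF;
    """

    with conn.cursor() as cur:
        cur.execute(copy_sql)
        cur.execute(unload_sql)
    conn.close()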

Confidential

Senior Informatica Developer

Responsibilities:

  • Analyzing user requirements and defining functional specifications using agile methodologies and JIRA scrum board.
  • Developing and implementing complex data warehouse and data mart solutions using ETL tools, primarily Informatica.
  • Building a cloud data warehouse from scratch using AWS and the SAP Hybris cloud, and loading data from the Hybris DB into the cloud databases.
  • Extracting and loading data between legacy systems using Informatica PowerCenter and PL/SQL packages and routines; loading data via PL/SQL packages into the ODS tables used later for reporting.
  • Studying, analyzing and developing database schemas such as star schema and snowflake schema (a minimal star-schema sketch follows this list).
  • Leading multiple modelling, simulations and analysis efforts to uncover the best data warehouse solutions.
  • Designing logical and physical data models using Erwin.
  • Developing various extracts loading routines using Informatica Power Center.
  • Creating sessions and batches to run with the logic embedded in the mappings using Informatica PowerCenter Workflow Manager.
  • Developing UNIX shell scripts for analyzing and directing data feeds.
  • Creating Informatica IICS Mapping and TASK flow configurations and loading data into AWS redshift.
  • Creating database procedures, packages, triggers and views.
  • Developing and implementing test validations of the data warehouses and data marts.
  • Analyzing test results and recommending modifications to meet project specifications.
  • Analyzing performance bottlenecks using OLAP function and query optimization techniques.
  • Loading from flat files to target data warehouses.
  • Deploying applications in Informatica and migrating the applications to different environments.
  • Serving as a technical resource and direct point of communication for team members during project development, testing and implementation.
  • Documenting modifications and enhancements made to the data warehouses and data marts as required by the project.
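
To make the star-schema bullet above concrete, here is a minimal sketch of a dimensional model; the table and column names are illustrative, and SQLite is used only so the snippet is self-contained and runnable.

    import sqlite3

    # In-memory SQLite keeps the sketch self-contained; the same DDL pattern
    # applies to Oracle, SQL Server or Redshift targets.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        -- Dimension tables hold descriptive attributes.
        CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, calendar_date TEXT, fiscal_month TEXT);
        CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, sku TEXT, category TEXT);
        CREATE TABLE dim_store   (store_key INTEGER PRIMARY KEY, store_name TEXT, region TEXT);

        -- The fact table carries measures plus foreign keys to each dimension:
        -- this hub-and-spoke layout is a star schema. A snowflake schema would
        -- further normalize the dimensions (e.g. category split into its own table).
        CREATE TABLE fact_sales (
            date_key     INTEGER REFERENCES dim_date(date_key),
            product_key  INTEGER REFERENCES dim_product(product_key),
            store_key    INTEGER REFERENCES dim_store(store_key),
            qty_sold     INTEGER,
            sales_amount REAL
        );
    """)
    conn.close()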

Confidential, Woonsocket, RI

Sr. ETL Developer/Lead

Responsibilities:

  • Create test templates used for validation testing with the various tool sets, which include UNIX batches, Informatica, Control-M, SQL Developer, SBM TeamTrack and SharePoint.
  • Create results summaries once tests are completed, containing the required information on performance parameters such as CPU utilization, SBS, runtime and stability of the environment during the test.
  • Worked on various tasks including enhancements, new developments of ETL loads in the CFRX project.
  • Interface with business owners and project team resources for requirement gathering. Various changes have been made to the ETL loads in the RxConnect, CFRX, Omnicar-LTC, Thumbdrive and C2 drugs applications based on business requirements.
  • Responsible for developing various data extracts and loading routine (batches) using Informatica PowerCenter 9.6, Oracle PL/SQL and UNIX.
  • Loading data from Salesforce to the Retail Data Warehouse, matching opportunities and contacts (an illustrative extract sketch follows this list).
  • Responsible for designing and developing mappings, mapplets, sessions and workflows to load data from source to target databases using Informatica PowerCenter Designer.
  • Performed Informatica Admin role by creating new repositories, users and maintaining test environments.
  • Used Informatica PowerCenter Workflow Manager to create sessions, batches to run with the logic embedded in the mappings.
  • Create procedures, functions and packages for different RxConnect modules and other functional requirements.
  • Creation of Retail Datamart using Informatica cloud IICS connections for salesforce CRM and data replication.
  • Used the SQL*Loader process to load images received from the federal databank (FDB) and optimized them for size using Java transformations.
  • Use Informatica MDM for merging and matching patient and prescriber data received from different sources such as pharmacy/retail stores and opportunities.
  • Analyze and fix defects occurring in the Production environment.
  • Delivered major change requests (CRs) in the project.
  • SPOC for one of the major functional parts of the project, i.e. Datasync and Datamask, which provides better security for the client’s data by obscuring patient and prescriber data while supplying production-like data to the test environments for testing.
  • Interacted with client and team for requirement gathering and analysis.
  • Carrying out impact analysis of CRs and defects.
  • Using Python to create data pipelines and perform data cleaning.
  • Using Informatica MDM for matching and merging duplicate patient/prescriber data.
  • Writing Teradata FastLoad (Fload), MultiLoad (Mload), BTEQ and TPT scripts for loading data from Teradata databases (a BTEQ sketch follows this list).
  • Following up with customers on satisfaction with the solution and conducting User Acceptance Testing.
  • Worked on various UAT and PROD incidents raised by users that required a timely response as part of L3 support.
  • Worked on scaling the opportunity process for Confidential when the data volume increased by around 8 times; completely redesigned the existing process with better performance, increasing revenue for the organization.
  • Responsible for Regression testing ETL jobs before test to production migration.
  • Tuning the performance of the batches and Informatica Workflows.
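
For the Salesforce-to-warehouse bullet above, the following is a hypothetical sketch using the simple_salesforce library; the credentials, SOQL fields and the AccountId matching key are assumptions about the actual process, not taken from it.

    import pandas as pd
    from simple_salesforce import Salesforce

    # Placeholder credentials -- illustrative only.
    sf = Salesforce(username="user@example.com",
                    password="password",
                    security_token="token")

    # Pull opportunities and contacts, then match them on AccountId before
    # staging the result for the warehouse load.
    opps = pd.DataFrame(sf.query_all(
        "SELECT Id, Name, AccountId, Amount, CloseDate FROM Opportunity")["records"])
    contacts = pd.DataFrame(sf.query_all(
        "SELECT Id, AccountId, Email FROM Contact")["records"])

    matched = opps.merge(contacts, on="AccountId", suffixes=("_opp", "_contact"))
    matched.to_csv("sf_opportunity_contact_stage.csv", index=False)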
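For the Teradata bullet above, here is a minimal sketch of driving a BTEQ export script from Python via the bteq client on the PATH; the host, credentials, table and output file are placeholders.

    import subprocess

    # Hypothetical BTEQ export script; host, credentials and object names are placeholders.
    bteq_script = """
    .LOGON tdprod/etl_user,etl_password
    .EXPORT REPORT FILE = daily_fill_counts.txt

    SELECT store_id, COUNT(*) AS fills
    FROM   retail_dm.rx_fill
    GROUP  BY store_id
    ORDER  BY store_id;

    .EXPORT RESET
    .LOGOFF
    .QUIT
    """

    # BTEQ reads commands from stdin, equivalent to running `bteq < script.btq`.
    result = subprocess.run(["bteq"], input=bteq_script, text=True, capture_output=True)
    print(result.stdout)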

Environment: Informatica PowerCenter 9.6.1, Data Masking, SQL Developer, Informatica Cloud IICS, Python, Teradata, TeamTrack, UNIX, Putty, Oracle, Tortoise, Control-M 7.0, WinSCP, IBM DB2, Toad for IBM DB2, JIRA.

Confidential

Sr. ETL Developer

Responsibilities:

  • Responsible for creating and executing reports on Oracle’s SIOP (System Integrated Operational Planning) and providing the data to users in the specified format.
  • Data cleansing and loading using Informatica 8.6.1.
  • Responsible for designing and developing mappings, mapplets, sessions and workflows to load data from source to target databases using Informatica PowerCenter Designer, and cleansing the data using transformations.
  • Used Informatica PowerCenter Workflow Manager to create worklets, sessions and workflows to run with the logic embedded in the mappings.
  • Analyze and fix defects and validate and cleanse the data in case of any data issue.
  • Delivered major change requests (CRs) in the project.
  • Interacted with client and onsite team for requirement gathering and analysis.
  • Carrying out impact analysis of CRs and defects.

Environment: PL/SQL Developer 7.1, Putty, WinSCP, Informatica PowerCenter 8.6, SIOP (System Integrated Operational Planning), Mantis bug tracker, HP TeamTrack for tracking slotted defects and projects.

Confidential

Business Objects Developer - SAP (BODI)

Responsibilities:

  • Responsible for designing and developing various jobs that extract data from heterogeneous sources, transform it using built-in transforms and functions to meet business requirements, and then load it into the datastore/database using SAP’s BODI (Business Objects Data Integrator), PL/SQL and UNIX.
  • Extensive testing of the code and Bug fixes during unit testing and system testing.
  • Analyze and fix defects occurring in the Production environment.
  • Delivered major change requests (CRs) in the project.
  • Interacted with client and onsite team for requirement gathering and analysis.
  • Carrying out impact analysis of CRs and defects.

Environment: SAP BODI, PL/SQL Developer 7.1, Oracle 10g/11g, Putty, WinSCP, Control-M.
