- 10+ years of IT experience across the Software Development Life Cycle, with a focus on business analysis, design, data transformations, development, continuous testing, and test data support.
- TDM Consultant with 8+ years of experience in Informatica PowerCenter v8.x/9.x/10.x and hands-on experience with Informatica TDM, CA Test Data Manager, IBM Optim, Tosca Commander, and K2View.
- Proven ability to develop creative ETL solutions independently or as a member of a large team.
- Strong analytical, team-leadership, and interpersonal skills, combined with the ability to quickly learn new technologies and business practices.
- A self-starter: self-motivated and results-oriented, with expertise in resolving a wide range of strategic and tactical IT challenges as well as assisting with day-to-day information management, administration, and project-related tasks.
- Experience in DevOps/CI/CD/iterative models with Agile and Waterfall methodologies.
- Key business understanding of domains such as telecom, insurance, banking, and transportation.
- In-depth knowledge of data-compliance standards aligned to Canada's PIPEDA for data privacy, and implementation of major data-sanitizing (masking) scripts via tools such as Informatica/CA TDM and extensive SQL scripting.
- Test data provisioning for QA and DEV cycles: data subsetting (production), data masking (SPI), and data generation (automation scripts).
- Intensive data-warehousing experience handling terabytes of data in Teradata, Oracle 11g/10g, and DB2, from analysis through implementation and support.
- Rich experience in UNIX shell scripting and design tools such as Visio and Erwin.
- Fine-tuned Informatica and SQL code to achieve performance gains.
- Excellent record of meeting deadlines and ability to handle multiple projects simultaneously.
- Experience in developing and executing project plans, including delegating tasks and mentoring up to 7 project team members.
- AWS Certified Solutions Architect - Associate
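The data-sanitizing work described above usually relies on deterministic, format-preserving substitution so a masked value stays consistent everywhere it appears. A minimal illustrative sketch, not the actual Informatica/CA TDM logic (salt and function names are hypothetical):

```python
import hashlib

def mask_ssn(ssn: str, salt: str = "demo-salt") -> str:
    """Deterministically mask an SSN while preserving the NNN-NN-NNNN
    format. The same input + salt always yields the same masked value,
    so joins across masked tables remain consistent."""
    digits = "".join(ch for ch in ssn if ch.isdigit())
    digest = hashlib.sha256((salt + digits).encode()).hexdigest()
    out = "".join(str(int(ch, 16) % 10) for ch in digest[:9])
    return f"{out[:3]}-{out[3:5]}-{out[5:]}"
```

The same pattern extends to phone numbers and, via lookup substitution, to names and addresses.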
ETL Tools: Informatica PowerCenter/TDM 9.x/10.x, IBM Optim TDM, K2View Studio
TDM Tools: Informatica TDM, CA TDM, IBM Optim, Tosca Commander, and K2View
Development Tools: Informatica, Ab Initio, SSIS, Erwin, Toad, vi, MS Office, MS Excel, SQL*Plus, CA7 job scheduler, MS Visio, Cognos, PowerDesigner
Database: Oracle, Teradata, MS SQL Server 2005, MS Access 97/2000
Languages: SQL, PL/SQL, T-SQL, UNIX shell scripts, Perl, Java, VB .NET
Operating Systems: Windows, UNIX (AIX, Sun), IBM Mainframe, MS DOS
Defect/Batch Tracking: Rational ClearQuest, HP ALM, Jira, and Zephyr
Methodologies: Agile, Waterfall, DevOps, DevSecOps, CI/CD
Senior TDM Consultant
- Worked closely with all application, common-services/middleware, and infrastructure teams throughout the development/engineering lifecycle.
- Recommended approaches to streamline and integrate technological processes and systems in the organization to improve overall efficiency across the bank.
- Implemented CA TDM solution for various P&C Canada applications for the integrated environments.
- Led a team of five resources, onsite and offshore, to accommodate the large volume of data-generation and subset requests across all P&C/RIS applications.
- Coordinated with the STOs and SMEs of each application to build the data subset/generation strategy before implementation and to obtain clearance from the data governance and security teams.
- Used various Testing strategies to test the solution before implementation in the integrated environment.
- Implemented automation of data loads on a frequent or refresh-cycle basis in all non-production environments, including QA (standalone/integrated) and Dev.
- Performed monthly upgrades and patch implementations from CA to bring the latest capabilities into the BMO TDM lifecycle.
- Coordinated with multiple QA teams to deliver the right data at the right time, streamlining the process for the STLC of each project.
- Performed rigorous evaluation of applications to understand them fully before implementing data subsetting/generation.
- Coordinated with the data-masking team to execute TDM jobs after the obfuscation process.
- Maintained the TDM SharePoint page and the TDM newsletter published to various audiences in BMO.
- Implemented CA TDM for mainframes using the IMS integrator and CA File Definition Manager to convert mainframe files to local tables.
- Responsible for building and maintaining the central metadata repository of the various applications in scope for TDM.
- Developed the TDM strategy/document for Confidential, covering features such as automation and integration of various data feeds from source to target platforms, as well as functionality such as subsetting and data generation.
- Produced a TDM framework with a centralized, integrated database platform to provide on-demand data provisioning through a self-service interface.
- Implemented multiple use cases for synthetic data generation, data masking of PII data (SSN, phone number, names, and address), and data subsetting (filtering data based on certain criteria).
- Responsible for building and maintaining all PoCs for new applications in scope and showcasing them to stakeholders within the required timelines.
- Overall, managed the TDM activities alongside other QA leads and provided uninterrupted TDM solutions to address existing data-need issues for both QA and DEV teams.
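Subsetting of the kind described above must honour parent-child relationships so the extracted slice stays referentially intact. A simplified sketch under that assumption (table and column names are hypothetical):

```python
def subset(customers, orders, criteria):
    """Keep parent rows matching the selection criteria, then keep only
    the child rows that reference a kept parent, preserving
    referential integrity in the extracted subset."""
    kept_customers = [c for c in customers if criteria(c)]
    kept_ids = {c["id"] for c in kept_customers}
    kept_orders = [o for o in orders if o["customer_id"] in kept_ids]
    return kept_customers, kept_orders

# Example: subset on a region-based selection criterion.
customers = [{"id": 1, "region": "ON"}, {"id": 2, "region": "QC"}]
orders = [{"order_id": 10, "customer_id": 1},
          {"order_id": 11, "customer_id": 2}]
subset_customers, subset_orders = subset(
    customers, orders, lambda c: c["region"] == "ON")
```

Real TDM tools generalize this walk across the full parent-child graph discovered from the schema.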
Test Data Manager
- Interviewed and evaluated data-masking and service-virtualization product vendors such as CA, Informatica, Solix, IBM, HP, and Voltage for the initial ENABLE 1 project startup.
- Project lead for Test Data Management on the ENABLE phase 1 & 2 projects, from Brampton Intermodal to Moncton Intermodal, covering data subsetting and masking of non-production environments.
- Performed data discovery for various applications on Oracle, DB2, and iSeries using the IBM Discovery tool to identify key relationships and mappings for data masking.
- Analyzed different mainframe and server applications to locate sensitive and restricted data, keeping the test environment fully protected to ensure the security of client information at all times.
- Revised data-masking exits and functions to remediate previous flaws and demonstrate successful data-masking capabilities in the Development region.
- Completed Data Masking for all applications and databases with sensitive customer information across pre-production environments to meet set guidelines and government policies.
- Managed Data Masking and Subsetting Pilot successfully for Siebel-Service First, Power MHS (iSeries/AS/400) and One View in phase 1 of ENABLE project.
- Performed Data Sub-Setting using Informatica TDM to Sub-set One View, Siebel and Power MHS applications used by MARITIME ONTARIO in Pre-production environment.
- Worked with MARITIME ONTARIO Subject matter experts to identify Sub-setting strategy for specific applications.
- Using Tosca Commander, created data-generation scripts that work through the front-end application rather than the backend, as the trains are time-sensitive.
- Held workshops with CA TDM/IBM Optim Test Data Manager for Data Masking on-going solution.
- Automated data-masking validation for source and target databases.
- Discovered and identified data relationships across applications.
- Created new projects, versions, data sets, data groups, extracts, and data pools in Datamaker.
- Documented testing and development procedures and operational task activities.
- Implemented multiple use cases for synthetic data generation, data masking of PII data (SSN, phone number, names, and address), and data subsetting (filtering data based on certain criteria).
- Worked on masking tables for different subsystems such as Eligibility Enrollment, Third-Party Liability (TPL), Providers, Internet, and data warehousing.
- Worked on databases including Oracle, DB2, SQL Server, Cassandra, and MongoDB.
- Produced Mainframe/Salesforce/SAP extracts for QA and Dev needs, masking sensitive data using Informatica TDM and Attunity Gold Client.
- Verified masked data for consistency, including PII/PHI/PCI data, according to the PIPEDA Act.
- Currently working on an RPA implementation using UiPath for activities such as email notification to customers when a product reaches each station.
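The masking-validation automation mentioned above amounts to comparing a source extract against the masked target: sensitive columns must have changed, everything else must be untouched. A hedged sketch of that check (column names are illustrative, and stable row order between extracts is assumed):

```python
def validate_masking(source_rows, target_rows, sensitive_cols):
    """Compare source vs. masked target row by row. Sensitive columns
    that survived unchanged, or non-sensitive columns that changed,
    are reported as failures."""
    failures = []
    for i, (src, tgt) in enumerate(zip(source_rows, target_rows)):
        for col, value in src.items():
            if col in sensitive_cols and tgt[col] == value:
                failures.append((i, col, "not masked"))
            elif col not in sensitive_cols and tgt[col] != value:
                failures.append((i, col, "unexpectedly changed"))
    return failures

# Example: the key column is preserved, the SSN is masked.
src = [{"cust_id": 1, "ssn": "123-45-6789"}]
tgt = [{"cust_id": 1, "ssn": "804-22-1937"}]
issues = validate_masking(src, tgt, {"ssn"})
```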
Test Data Manager-Informatica-TDM
- Led a team of four test data analysts in building the global enterprise Test Data Management approach for Confidential World.
- Developed custom ETL data-provisioning logic integrated with Informatica TDM that performs key functions:
- Subsets data based on specific selection criteria; extracts data in a specific order, honoring parent-child relationships; and extracts data according to table characteristics (reference vs. transactional).
- Utilizes Oracle Data Pump parallel import for performance optimization.
- Developed Auto-Generation tool for Oracle Data Pump Export and Import functions.
- Implemented masking capabilities for PII data using Informatica TDM.
- Configured development and QA environments to meet Data Privacy, Data Masking and Data Sub-setting requirements using Informatica Test Data Management (TDM) services.
- Implemented enterprise data management using Informatica TDM/PC 10.x for Test Data Management (TDM) and data-growth solutions for all the Guidewire applications (PC/CC/BC/CM).
- Gather requirements for Test Data Management and Data Archival from Application testers and application owners.
- Create projects/profiles/data domains and DB replication for Oracle, SQL Server, Teradata databases.
- Work with relational databases installed on Oracle RAC and SQL Server Clusters and Teradata and derive database relations using Data Discovery in Informatica TDM.
- Collaborated with business analyst and QA team for various data requirements selection criteria through interview and discovery sessions.
- Evaluate and implement data-selection criteria, based on the Policies, Claims, Broker Information, and Dollars information provided by the business owner, to be used for data mining and extraction from enterprise data warehouses when producing replicated platforms for Development, ACPT, Pre-prod, QA, and UAT.
- Perform obfuscation (masking/disguising real data) to hide proprietary and private client information using Informatica TDM while remaining compliant with PIPEDA.
- Produced documentation, including process and data-flow diagrams, illustrating the hardware infrastructure and the processes that utilize it within and across network domains.
- Coordinate efforts to establish the infrastructure necessary for the entire testing platform: source and target database connections (including TNS entry updates), the Oracle database account, and the schema privileges required to extract data with Oracle Data Pump.
- Secure the required Oracle and Linux ports so that servers distributed across firewall-restricted networks can communicate.
- Assist in specifying criteria for WebLogic application server configuration, including the client interface URL.
- Manage Oracle database objects for Guidewire application integrity, such as constraints, indexes, sequences, stored procedures, and packages, in addition to loading data while preparing a composite replicated platform.
- Validate the discovered data, subset data, or generated synthetic data against the selection criteria (Policies, Claims, Broker Information, and Dollars) following the mining/extraction process, covering both reference and application tables.
- Verify masked data for consistency, including PII/PHI/PCI data, according to the PIPEDA Act.
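The Data Pump auto-generation tool described earlier in this role boils down to templating expdp/impdp command lines with the right PARALLEL setting. A hypothetical sketch of that idea (the connect string, schema, and directory object names are placeholders, not values from any actual environment):

```python
def build_expdp(schema: str, directory: str, parallel: int = 4) -> str:
    """Render a schema-level expdp command line. The flags used
    (schemas, directory, dumpfile with %U, logfile, parallel) are
    standard Oracle Data Pump export parameters; %U makes Data Pump
    number the dump files so parallel workers can write concurrently."""
    return (
        f"expdp system/***@ORCL schemas={schema} "
        f"directory={directory} dumpfile={schema}_%U.dmp "
        f"logfile={schema}_exp.log parallel={parallel}"
    )

# Example: an 8-way parallel export of a Guidewire PolicyCenter schema.
cmd = build_expdp("GW_PC", "DP_DIR", parallel=8)
```

A matching impdp builder would template the import side with the same directory and dump-file pattern.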
Test Data Specialist-Informatica
- Used Informatica tools (PowerCenter/TDM) to design source definitions, target definitions, mappings, and transformations, and to create/manipulate data.
- Demonstrated experience in effective use and deployment of one or more of Informatica’s Application ILM products. These include Data Masking, Data Archive, and Data Subset.
- Demonstrated experience in effective use and deployment of Power Center, Power Exchange, and/or Data Services for ETL development in an enterprise environment across diverse sources and targets.
- Leads the analysis of enterprise-wide test data requests. Defines the test data characteristics and reviews them with testers/requesters to validate the test data within the Test Data Management framework for all phases of testing. Applies knowledge of data analysis and end-user/business requirements analysis to articulate business needs and incorporate them into technical solutions.
- Builds frameworks for all aspects of test data and strategy needs with respect to data subsetting, data generation, data masking, and testing methodology, aligning with or defining the relevant new Test Data Management strategy.
- Created sessions, configured workflows to extract data from various sources, transformed data, and loaded into data warehouse.
- Coordinate development efforts with the offshore development team, and participate in and conduct code reviews once the code is unit-tested and ready for QA.
- Create project documentation (Detailed design, Source-to-target mappings, Implementation plans, Run books, etc.)
- Assist in QA activities and resolve issues raised by QA team
- Created an error-handling strategy with checks on data integrity and data quality, ensuring high-quality, clean data in the data warehouse.
- Developed and scheduled Workflows using task developer, worklet designer and workflow designer in Workflow manager and monitored the results in Workflow monitor.
- Validated the mappings, sessions, and workflows; generated and loaded the data into the target database.
- Performed performance tuning to remove bottlenecks on source, target, mapping, and session, reducing ETL loading time by 30%, and optimized the mappings to load data more quickly.
- Analyzed business and system requirements, managed development of specifications to create and execute detailed test plans, and verified bug fixes. Corrected testing defects and supported all levels of testing, ensuring high quality of the ETL programs.
- Used ClearQuest for reporting and tracking bugs and for providing updates on resolved bugs.
Environment: Oracle 11g, Informatica PowerCenter 9.0.1/8.6, Informatica TDM 9.7, K2View Studio, SQL, PL/SQL, Windows 2008 Server, UNIX, Toad 12.5, flat files, Windows database management tool, Teradata SQL Assistant, mainframes, PuTTY, Maestro, Rational ClearQuest.
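Test-data generation of the kind referenced above is typically driven by a seeded random generator so a run is reproducible across environments. A small sketch under that assumption (the field set and value pools are illustrative, not the actual generation rules used):

```python
import random

def generate_customers(n: int, seed: int = 42):
    """Generate n synthetic customer records from a seeded RNG, so the
    same seed always reproduces the same test data set."""
    rng = random.Random(seed)
    first = ["Ana", "Ben", "Chen", "Dara"]
    last = ["Lee", "Ng", "Patel", "Silva"]
    return [
        {
            "name": f"{rng.choice(first)} {rng.choice(last)}",
            # 555 prefix keeps the synthetic numbers clearly non-real.
            "phone": f"555-{rng.randint(200, 999)}-{rng.randint(1000, 9999)}",
        }
        for _ in range(n)
    ]
```

Seeding is what lets a QA cycle be re-provisioned with identical data on every refresh.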
- Developed data marts for various products such as Norton 360, Norton Internet Security, and Norton AntiVirus; the Informatica PowerCenter ETL tool was used to extract data from multiple platforms, transform it, and load it into Oracle Database 11.1.
- Analyzed the requirements provided by business users.
- Involved in the analysis of the data model
- Design and implementation of mappings to extract data from various source systems.
- Extensively used the Informatica PowerCenter ETL tool to extract customer data, payment options, and commercial and residential customers' data from flat files and the Oracle database.
- Documented daily updates to the mappings.
- Maintained the issue log and tracked issue status.
- Monitored the data loads and performed performance tuning of the loads.
- Scheduled jobs using Autosys to automate the Informatica sessions.
- Delivered and signed off deliverables pertaining to the Operations data warehouse.
- For the Customer system, developed logical and physical data models, defining a Star Schema strategy with fact and dimension tables along with detail, summary, and lookup tables, indexes, and views.
- Imported data from various sources and loaded it into the target database, creating transformations using the Informatica PowerCenter Designer tools (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer).
- Involved in creating and managing global and local repositories and assigning permissions using Repository Manager. Also migrated repositories between development, testing, QA and production systems.
Environment: Informatica PowerCenter 8.1, Mapping Designer, Workflow Monitor, Sybase, Oracle 9i, UNIX, Autosys, Windows NT, PL/SQL.
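The star-schema design described above, a fact table keyed to dimension tables, can be sketched in a few lines with an in-memory SQLite database (table and column names are illustrative, not the actual Customer model):

```python
import sqlite3

# Minimal star-schema sketch: one fact table joined to one dimension.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        customer_name TEXT
    );
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        amount REAL
    );
""")
con.execute("INSERT INTO dim_customer VALUES (1, 'Acme')")
con.execute("INSERT INTO fact_sales VALUES (100, 1, 250.0)")

# Typical reporting query: aggregate facts grouped by a dimension attribute.
row = con.execute("""
    SELECT d.customer_name, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_key)
    GROUP BY d.customer_name
""").fetchone()
```

Detail, summary, and lookup tables extend the same pattern with additional dimensions around the fact table.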
Junior Technical Consultant
- Reviewed documents and code.
- Tested mappings in the Development environment.
- Worked on performance tuning of mappings.
- Provided knowledge transfer to the application support group.
- Monitored the workflows.
- Familiar with parameter files and their usage.
- Prepared and updated the required documentation for the project.
- Tracked the cause of any recurring issues.
- Coordinated between customers and the team.
- Developed and maintained the database, website and security infrastructure.
- Developed stored procedures, triggers, and views to support up-to-date reports.
- Involved in developing a data mart utilizing a star schema as the data source for reporting.
- Used SQL statements to query the Oracle database and retrieve the required data.
- Designed and developed the tables, views for the system in Oracle.
Environment: Oracle 9i, Informatica PowerCenter 8.6, SQL, PL/SQL, UNIX, PuTTY, SQL Developer, Team Foundation Server.