
QA Lead Resume


Charlotte, NC

PROFESSIONAL SUMMARY:

  • 10 years of IT experience in the analysis, design, testing, and implementation of Business Intelligence solutions using data warehouse/data mart design, ETL, OLAP, and client/server applications.
  • Experienced in all phases of Systems Development Life Cycle (SDLC), including requirements analysis/design, development and testing.
  • Experience with end-to-end testing tasks across Agile, Iterative, and Waterfall SDLC methodologies.
  • Good experience in QA methodology and validation to ensure adherence to industry standards.
  • Extensive experience in ETL testing, manual testing, and Business Intelligence applications.
  • Proficient with databases including Oracle, SQL Server, DB2, and Teradata.
  • Experienced in running SQL queries using Toad, SQL Navigator, and SQL Server.
  • Extensive ETL testing experience using Informatica 8.6/8.1/7.1/6.2/5.x, IBM DataStage 9/8.1, and SSIS.
  • Proficient with SQL Server and MSBI tools such as SSIS, SSRS, and SSAS; processed big data collected in a batch environment.
  • 3 years of proven hands-on experience with Master Data Management (MDM) toolsets; involved in security testing for different LDAP roles.
  • Experience using automation tools: Quick Test Professional, HP ALM/Quality Center.
  • Worked on various Operating Systems which include Windows 95/98/2000/NT/XP and Linux (RedHat), UNIX (Sun Solaris 9.0/8.0/7.0, IBM AIX, HP UX 11i v1.5/1.6/2.0).
  • Created Reports using MicroStrategy, Crystal Reports & Business Objects.
  • Oversaw the test data setup approach for complex ETL and metadata testing.
  • Provided guidance to QA/data analysts on test approaches for complex ETL, metadata, and Cognos report testing.
  • Experienced in Black Box, and White Box Testing of Database and Web applications.
  • Experienced in creating traceability matrices, test reports, test scripts, and automation.
  • Experienced in executing UNIX commands and Korn shell scripts.
  • Good experience in Python, Perl, and UNIX shell scripting.
  • Experience with programming technologies like C, C++, Visual Basic, ASP, .NET, HTML and SQL.
  • Strong working experience with data warehousing applications; directly responsible for the extraction, transformation, and loading of data from multiple sources into the data warehouse.
  • Experienced in Performance Tuning of sources, targets, mappings and sessions.
  • Strong in Troubleshooting/Problem Solving skills.
  • Excellent analytical and communication skills and leadership qualities when working in a team.

TECHNICAL SKILLS:

OPERATING SYSTEMS:  Windows XP/NT/95/2000, Sun Solaris 2.6/2.7, Linux 6.x/7.x/8.x

LANGUAGES KNOWN: C, PL/SQL 8.x/2.x, SQL*Plus, SAS, .NET

RDBMS: Oracle 7.x/8.x/9.x/10g, Teradata, MS SQL Server 6.5/7.0/2000, MS Access 7.0/97/2000, SQL Server 11.x/12.0

SCRIPTING LANGUAGES: VB Script, Java Script

TECHNOLOGIES: ActiveX, OLE DB, and ODBC

WEB SERVERS: Java Web Server 2.0, Netscape Enterprise Server, WebLogic 6.0

DATAMODELLING TOOLS: Erwin 3.5.1/4.x, Designer 2000

DATAWAREHOUSING: Informatica PowerCenter/PowerMart 8.1/7.1/6.1/5.1 ETL, Ab Initio (GDE 1.15, Co>Op 2.14), Data Mining, Data Mart, SQL*Loader, TOAD 7.5.2, DataFlux

BI Tools: Business Objects XIR3, Cognos 8.0 Series  

PROFESSIONAL EXPERIENCE:

Confidential, Charlotte, NC

QA Lead

Responsibilities: 

  • Involved in Business analysis and requirements gathering.
  • Wrote SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
  • Tested various data mapping requirements.
  • Worked in the Capital Market, Equipment Finance, and Wholesale Market divisions. Wrote macros for finding occurrences of a word in Word documents and PDFs.
  • Automated the Daily Defect Status Reporting Process Using VB Macro.
  • Extensively involved in testing the ETL process from data sources (PeopleSoft, Teradata, SQL Server, Oracle, flat files) into the target Teradata and Oracle databases as per the data models.
  • Experienced using database query and integration tools such as SSIS, TOAD, and SQL*Plus.
  • Involved in testing SSIS Packages, and in Data Migration.
  • Tested SSIS packages according to business requirements and data mapping requirement documents.
  • Obtained a detailed understanding of data sources, flat files, and complex data schemas in order to write the best test scripts.
  • Extensively ran the Ab Initio graphs, monitored the log files, and was involved in finding the root cause of failures.
  • Responsible for testing initial and daily loads of ETL jobs.
  • Interacted with design team to decide on the various dimensions and facts to test the application. 
  • Tested and developed the mappings for extracting, cleansing, transforming, integrating, and loading data, scheduled using Autosys.
  • Updated the status of old defects and logged any new defects in HP ALM.
  • Experienced in data analysis using SQL, T-SQL, PL/SQL, and other query-based applications.
  • Involved in extensive data validation using SQL queries and back-end testing (see the sketch after this list).
  • Reported bugs and tracked defects using ALM.
  • Extensively used SQL to verify and validate the data loaded into Target.
  • Tested reports to make sure that the correct data for a specific report was being pulled from the data warehouse.
  • Performed the tests in the SIT, QA, and contingency/backup environments.
  • Used SQL for querying the database for data validation, data verification and data conditioning.
  • Wrote the test scripts, test cases, and test plans for multiple ETL feeds.
  • Submitted weekly bug or issue report updates to the Project Manager in the form of the QA Error Log.
  • Involved in functional testing, back-end testing, end-to-end testing, and regression testing.
  • Involved in preparing and supporting the QA test environments.
  • Loaded new or modified data into the back-end database.
  • Performed defect triage with the development team and was involved in peer review of test cases.
  • Verified HotDocs templates for data validation using the mapping sheet.
  • Supported the automation team in day-to-day activities.
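
A minimal sketch of the kind of back-end validation SQL referenced in the list above; the staging and warehouse table names (STG_POLICY, DW_POLICY) and the policy_id key are hypothetical placeholders:

```sql
-- Reconcile row counts between a staging source and its warehouse target.
SELECT s.src_count,
       t.tgt_count,
       s.src_count - t.tgt_count AS diff
FROM  (SELECT COUNT(*) AS src_count FROM STG_POLICY) s,
      (SELECT COUNT(*) AS tgt_count FROM DW_POLICY) t;

-- Flag target rows whose key never appeared in the source.
SELECT t.policy_id
FROM   DW_POLICY t
LEFT JOIN STG_POLICY s ON s.policy_id = t.policy_id
WHERE  s.policy_id IS NULL;
```

A zero difference and an empty orphan list are the pass conditions for this check.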

Environment: Siebel CRM, ISM, SQL Server 2011, SQL Server Management Studio, HP ALM, VBA macros, Selenium, HotDocs, Siebel Row, Salesforce.

Confidential, Cincinnati, Ohio

Lead QA Engineer

Responsibilities: 

  • Developed mappings in the Aginity management workbench for loading data into corresponding sources.
  • Developed attributes in the attribute workbench for loading the data using mapping rules.
  • Published data using the Aginity Amp publication wizard to test it against tables present in staging.
  • Performed Functional Testing and Back-end Testing using the database comparable results.
  • Created and loaded new or modified data into the back-end Netezza database.
  • Involved in double programming and QC check of the programs written by fellow programmers.
  • Created Tables, Graphs and Listings for clinical study.
  • Extensively used SQL programming in back-end and front-end functions, procedures, and packages to implement business rules and security.
  • Created IBM Unica Campaign flat files to provide to the vendor for their campaign generation.
  • Created Tivoli jobs and job streams and ran them to orchestrate data loads using shell scripts.
  • Developed a JSON file that creates SQL files to load data using the SQL generated by Aginity Amp.
  • Developed shell scripts to load text files present on the Unix box into the corresponding database (a sketch follows this list).
  • Developed Cognos reports per the organization's standards and tested them.
  • Developed Tableau reports to ensure availability of campaign data to meet the needs of IBM Unica Campaign.
  • Developed a shell script that loads data into Netezza tables using publications present in Aginity Amp.
  • Designed and created mapping documents based on the business requirements (also referencing the source-to-target detailed mapping document and transformation rules document).
  • Enhanced the QTP scripts by unit testing them before creating scenario-based tests in the Test Lab module of Quality Center.
  • Developed and executed test scripts to show how many records passed or failed.
  • Loaded data into Netezza tables from various data sources like Netezza, SQL Server, Oracle, flat files by creating Informatica Mappings
  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Created ETL test data for all ETL mapping rules to test the functionality of the data loads.
  • Wrote complex SQL scripts using inner joins, cross joins, outer joins, and subqueries.
  • Created test scripts using T-SQL; programming elements included control-of-flow techniques, local and global variables, functions, and error handling techniques.
  • Participated in the release control process (when the application is transferred from the build team to the test team) to ensure that solutions meet business requirements
  • Participated in the PI Planning quantitative metrics review (to gauge the business value gained during the previous PI).
  • Conducted sprint demos with business users to review everything completed during the sprint.
  • Extensively executed T-SQL queries to view successful transactions of data and to validate data in the SQL Server database.
  • Validated the Cognos reporting objects in the reporter against the design specification document.
  • Strong knowledge of protocols such as SSH and SCP; tested the application in the Unix environment.
  • Tested the dimensional cubes developed in SQL Server Analysis Services (SSAS) and queried the data using SQL Server Management Studio (SSMS).
  • Performed data modeling testing for source identification, data collection, data transformation, rule administration, data consolidation, and data reconciliation.
  • Developed SQL to identify the deltas of daily loads.
  • Identified field and data defects with the required information in the ETL process across various mappings, including one-to-one mappings.
  • Verified all the Data load processes developed for fetching data from Big Data system to the Enterprise Data warehouse using SQL queries.
  • Performed data validation on the flat files generated in the UNIX and Netezza environments using UNIX and SQL*Loader commands as necessary.
  • Defects identified in the testing environment were communicated to the developers using the defect tracking tool Mercury Test Director.
  • Reported the defects/bugs in Jira tickets and documented the report.
  • Checked the reports for any naming inconsistencies and to improve user readability.
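
A hedged sketch of the flat-file load script described in the list above, using nzload, the Netezza bulk-load utility; the host, credentials, database, table, and file path are illustrative placeholders:

```sh
#!/bin/ksh
# Load a pipe-delimited text file from the Unix box into a Netezza
# staging table, failing loudly if the file is missing or the load errors.
DATA_FILE=/data/incoming/campaign_feed.txt   # placeholder path

if [ ! -s "$DATA_FILE" ]; then
    echo "ERROR: $DATA_FILE missing or empty" >&2
    exit 1
fi

nzload -host nzhost01 -u etl_user -pw "$NZ_PASSWORD" \
       -db EDW_STG -t STG_CAMPAIGN_FEED \
       -df "$DATA_FILE" -delim '|'

if [ $? -ne 0 ]; then
    echo "ERROR: nzload failed for $DATA_FILE" >&2
    exit 1
fi

echo "Loaded $(wc -l < "$DATA_FILE") records from $DATA_FILE"
```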

Environment: Aginity Amp Pure Analytics, Netezza Release 7.2.1.1, Unix, Putty, Tivoli, Informatica, Confluence, Jira, Agile, IBM Cognos 10, IBM Unica Campaign, Hadoop Hive, SQL Server, Oracle, MobaXterm, Tableau

Confidential, San Ramon, CA

CCAR QA Engineer

Responsibilities: 

  • Involved in Business analysis and requirements gathering.
  • Wrote SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
  • Tested various ETL data mapping requirements.
  • Worked on the CCAR system.
  • Tested loads from Oracle sources and flat files into the target Teradata and Oracle databases as per the data models.
  • Used database query and integration tools such as TOAD and SQL*Plus.
  • Tested Flat Files according to business requirements and data mapping requirement documents.
  • Obtained a detailed understanding of data sources, flat files, and complex data schemas in order to write the best test scripts.
  • Monitored the log files and was involved in finding the root cause of failures.
  • Interacted with design team to decide on the various dimensions and facts to test the application.
  • Updated the status of old defects and logged any new defects in HP ALM.
  • Involved in extensive data validation using SQL queries and back-end testing (see the sketch after this list).
  • Extensively used SQL to verify and validate the data loaded into Target.
  • Tested reports to make sure that the correct data for a specific report was being pulled from the data warehouse.
  • Performed the tests in the SIT, QA, and contingency/backup environments.
  • Used SQL for querying the database for data validation, data verification and data conditioning.
  • Involved in functional testing, back-end testing, end-to-end testing, and regression testing.
  • Involved in preparing and supporting the QA test environments.
  • Loaded new or modified data into the back-end database.
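
A minimal sketch of the two-way source-versus-target comparison behind the data validation noted above; the table and column names (SRC_CCAR_POSITIONS, TGT_CCAR_POSITIONS) are hypothetical:

```sql
-- Rows present in the source extract but missing or altered in the target.
SELECT acct_id, balance, as_of_date FROM SRC_CCAR_POSITIONS
MINUS
SELECT acct_id, balance, as_of_date FROM TGT_CCAR_POSITIONS;

-- The reverse direction catches rows the target gained unexpectedly.
SELECT acct_id, balance, as_of_date FROM TGT_CCAR_POSITIONS
MINUS
SELECT acct_id, balance, as_of_date FROM SRC_CCAR_POSITIONS;
```

Both queries returning zero rows is the pass condition.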

Environment: SQL Server 2008, Oracle 11g, XML, XSLT, XSD, Flat Files, MS Excel, SQL, T-SQL, PL/SQL, Windows, UNIX, HP ALM 12.

Confidential, San Diego, CA

Sr. BIG Data ETL QA Lead

Responsibilities:  

  • Designed and created mapping documents based on the business requirements (also referencing the source-to-target detailed mapping document and transformation rules document).
  • Developed a Python script to generate a TSV file showing the number of null values per column and the number of duplicate records at the table level (a sketch follows this list).
  • Developed Python scripts to execute database SQL queries and scheduled them in Jenkins jobs to show the status of test cases.
  • Created Jenkins jobs to run scheduled Python scripts containing complex SQL queries for data verification.
  • Developed shell scripts to load text files present on the Unix box into the corresponding database.
  • Developed Tableau reports per the organization's standards and tested them.
  • Developed a shell script, run as a cron job, to report disk space usage of the Unix server.
  • Developed Tableau reports to ensure availability of data in all the Vertica tables to meet UAT needs.
  • Developed a shell script that loads data into the Oracle database on the UNIX box for reconciling source data.
  • Interacted with senior peers or subject matter experts to learn more about the requirements; involved in the preparation of the test strategy and test cases.
  • Actively involved in functional and regression testing.
  • Developed and executed test scripts to show how many records passed or failed.
  • Transformed data from various data sources such as Netezza, DB2, Oracle, and flat files using OLE DB connections by creating various SSIS packages.
  • Created ETL test data for all ETL mapping rules to test the functionality of the data loads.
  • Developed complex SQL scripts using inner joins, outer joins, and subqueries.
  • Created test scripts using T-SQL; programming elements included control-of-flow techniques, local and global variables, functions, and error handling techniques.
  • Participated in the release control process (when the application is transferred from the build team to the test team) to ensure that solutions meet business requirements
  • Extensively executed T-SQL queries to view successful transactions of data and to validate data in the SQL Server database.
  • Strong knowledge of protocols such as SSH and SCP; tested the application in the Unix environment.
  • Used Sqoop to transfer data from Oracle into Hadoop HDFS, as well as to export data from HDFS files to an RDBMS.
  • Identified field and data defects with the required information in the ETL process across various mappings, including one-to-one mappings.
  • Tested all the ETL processes developed for fetching data from Hadoop Hive system to the target Data warehouse Oracle using complex SQL queries.
  • Performed data validation on the flat files generated in the UNIX and Hadoop environments using UNIX and Hadoop fs commands as necessary.
  • Analyzed business requirements, wrote and implemented the test plan, wrote various test cases and test scripts in VBScript to test functionality using the automated tool QTP, and also wrote manual test cases according to functional documents and requirements.
  • Tested Hadoop HBase provisioning on the VMware virtualized platform.
  • Tested beacons generated from Chrome at the enterprise data integration with Hadoop; verified the fill-forward methodology for records that are not part of the daily load.
  • Created ad hoc reports for testing needs against the report database; reported defects/bugs in Jira tickets and documented the report.
  • Wrote test cases for ETL to compare the source and target database systems.
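
A hedged sketch of the null/duplicate TSV report mentioned at the top of this list, assuming the vertica-python client; the connection settings, schema-qualified table list, and output file name are illustrative placeholders:

```python
import csv
import vertica_python  # assumes the vertica-python client library is installed

# Placeholder connection details and tables to profile.
CONN_INFO = {"host": "vertica01", "port": 5433, "user": "qa_user",
             "password": "***", "database": "edw"}
TABLES = ["stg.daily_sales", "stg.daily_customers"]

conn = vertica_python.connect(**CONN_INFO)
try:
    cur = conn.cursor()
    with open("data_quality_report.tsv", "w", newline="") as out:
        writer = csv.writer(out, delimiter="\t")
        writer.writerow(["object", "metric", "value"])
        for table in TABLES:
            # Duplicate records: total rows minus fully-distinct rows.
            cur.execute(f"SELECT COUNT(*) FROM {table}")
            total = cur.fetchone()[0]
            cur.execute(f"SELECT COUNT(*) FROM (SELECT DISTINCT * FROM {table}) t")
            writer.writerow([table, "duplicate_rows", total - cur.fetchone()[0]])
            # Null counts per column, discovered from the Vertica catalog.
            schema, name = table.split(".")
            cur.execute("SELECT column_name FROM v_catalog.columns "
                        f"WHERE table_schema = '{schema}' AND table_name = '{name}'")
            for (col,) in cur.fetchall():
                cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL")
                writer.writerow([f"{table}.{col}", "null_count", cur.fetchone()[0]])
finally:
    conn.close()
```

A Jenkins job can run a script like this on a schedule and archive the TSV as a build artifact, matching the scheduling described above.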

Environment: Agile, WinSCP, Oracle 10g, Flat files, TOAD, Aqua Data Studio, vsql, TFS, Jira, UNIX, SSH, UNIX Shell Script, Python, Vertica, Netezza, SQL Server, Oracle, SourceTree, GitHub, QuerySurge, Beaconing, Informatica, Glitch, Hive, Tableau, Apache Ant, MobaXterm, PyUnit, Beyond Compare, WinMerge, Hadoop Hive, Tidal, Cron, Jenkins, Splunk

Confidential, Brea, CA

Sr. Consultant

Responsibilities: 

  • Designed and created test cases based on the business requirements (also referencing the source-to-target detailed mapping document and transformation rules document).
  • Wrote test cases for ETL to compare source and target database systems.
  • Interacted with senior peers or subject matter experts to learn more about the requirements.
  • Involved in the preparation of the test strategy and test cases in HP ALM.
  • Prepared the traceability matrices to fill the gap between the requirements and the test cases covered.
  • Actively involved in functional and regression testing; executed all the test cases in HP ALM with Pass/Fail/Blocked status.
  • Developed and manually executed test scripts to verify the expected results.
  • Transformed data from various data sources such as Netezza, DB2, Oracle, and flat files using OLE DB connections by creating various SSIS packages.
  • Created ETL test data for all ETL mapping rules to test the functionality of the SSIS packages.
  • Developed complex SQL scripts using inner joins, outer joins, and subqueries.
  • Created test scripts using T-SQL; programming elements included control-of-flow techniques, local and global variables, functions, and error handling techniques (see the sketch after this list).
  • Extensively executed T-SQL queries to view successful transactions of data and to validate data in the SQL Server database.
  • Worked with SSIS system variables, passing variables between packages; executed SSIS packages and verified the results.
  • Verified SSIS package configurations, variables, and the various SSIS transformations developed.
  • Tested complex ETL SSIS packages and sessions based on business user requirements and business rules to load data from different sources.
  • Tested the dimensional cubes developed in SQL Server Analysis Services (SSAS) and queried the data using SQL Server Management Studio (SSMS).
  • Identified field and data defects with the required information in the ETL process across various mappings, including one-to-one mappings.
  • Identified duplicate records in the staging area before data gets processed.
  • Verified the fill-forward methodology for records that are not part of the daily load.
  • Altered SSRS parameters to create ad hoc reports for testing needs against the report database.
  • Validated SSRS reports to ensure they meet UAT needs.
  • Tested the reports generated by SQL Server Reporting Services and verified that they conform to the organization's standards.
  • Reported the defects in TFS & HP ALM and documented the report.
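
A minimal sketch of the T-SQL test-script elements listed above (local variables, control-of-flow, and error handling); the stg.Claims and dw.FactClaims tables are hypothetical:

```sql
-- Compare source and target row counts, reporting PASS/FAIL,
-- with TRY/CATCH around the checks for error handling.
DECLARE @src_count INT, @tgt_count INT;

BEGIN TRY
    SELECT @src_count = COUNT(*) FROM stg.Claims;
    SELECT @tgt_count = COUNT(*) FROM dw.FactClaims;

    IF @src_count = @tgt_count
        PRINT 'PASS: row counts match (' + CAST(@src_count AS VARCHAR(20)) + ')';
    ELSE
        PRINT 'FAIL: source=' + CAST(@src_count AS VARCHAR(20))
            + ' target=' + CAST(@tgt_count AS VARCHAR(20));
END TRY
BEGIN CATCH
    PRINT 'ERROR ' + CAST(ERROR_NUMBER() AS VARCHAR(10)) + ': ' + ERROR_MESSAGE();
END CATCH;
```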

Environment: SQL Server 2008 R2, SSIS, SSRS, SSAS, Guidewire, Talend, Netezza, MicroStrategy 9.2, WinSCP, Oracle 10g, Flat files, TOAD, SQL Server Management Studio (SSMS), T-SQL, TFS, HP ALM, UNIX

Confidential, Burbank, CA

Data warehouse ETL Engineer

Responsibilities: 

  • Involved in Business analysis and requirements gathering.
  • Tested/Found the defects in universes and reports. Used Mingle for tracking the defects.
  • Tested all data reports published to the Web, including dashboards, summarized and master-detail reports, and APIs.
  • Tested graphs for extracting, cleansing, transforming, integrating, and loading data using Data Stage ETL Tool.
  • Worked as ETL Tester responsible for the requirements/ETL Analysis, ETL Testing and designing of the flow and the logic for the Data warehouse project.
  • Responsible for providing discrepancy reports for data reconciliation according to the client's requirements using MS Access.
  • Performed data analysis and data profiling using complex SQL on various sources systems including Oracle and Teradata.
  • Created and customized the Settings.xml file for the local and server configuration builds
  • Created shortcut joins, aliases, and contexts to make the universe loop-free.
  • Established a test reporting environment to consolidate Distribution Work Management information in one place so that combined reports could be generated.
  • Deployed .ear builds onto the QA server from JBoss and Maven for validation of reports through Putty and WinSCP.
  • Added/deleted permissions for users in SQL Server to access builds created locally and on the server side.
  • Created the Maven build locally with the ClearCase version so users could access the reports from this build for data validation.
  • Developed ETL test plans based on the test strategy; created and executed test cases and test scripts based on the test strategy, test plans, and the ETL mapping document.
  • Tested and worked on creating open document reports for business.
  • Used various @Functions such as @Prompt (for user-defined queries), @Where (for creating conditional filters), and @Select for testing business reports with various boundary conditions.
  • Preparation of technical specifications and Source to Target mappings.
  • Extensively used SQL programming in backend and front-end functions, procedures, packages to implement business rules and security
  • Experienced using query tools for SQL Server to validate reports and troubleshoot data quality issues.
  • Solid testing experience in working with SQL Stored Procedures, triggers, views and worked with performance tuning of complex SQL queries for better performance and efficiency.
  • Validated format of the reports and feeds.
  • Effectively communicate Data Analysis and testing activities and findings in oral and written formats.
  • Extracted data from various sources like flat files and SQL Server.
  • Designed and created complex mappings involving transformations such as expression, joiner, aggregator, lookup, update strategy, and filter.
  • Created various PL/SQL stored procedures for dropping and recreating indexes on target tables (a sketch follows this list).
  • Worked on issues with migration from development to testing.
  • Deployed the corresponding builds and reports using mvn commands on local and server machines.
  • Restored databases into SQL Server from backups and created backups.
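
A hedged PL/SQL sketch of the drop-and-recreate index pattern mentioned in the list above; the procedure, index, table, and column names are placeholders:

```sql
CREATE OR REPLACE PROCEDURE rebuild_target_index AS
BEGIN
    -- Drop the index before a bulk load; ORA-01418 means it was already gone.
    BEGIN
        EXECUTE IMMEDIATE 'DROP INDEX idx_fact_sales_cust';
    EXCEPTION
        WHEN OTHERS THEN
            IF SQLCODE != -1418 THEN
                RAISE;
            END IF;
    END;
    -- Recreate the index once the load has finished.
    EXECUTE IMMEDIATE
        'CREATE INDEX idx_fact_sales_cust ON fact_sales (customer_key)';
END rebuild_target_index;
/
```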

Environment: Mingle, JBoss, Maven, Putty, WinSCP, ClearCase, DataStage 8.0.1, MS Access, SQL Server 2008 R2, Agile/Scrum Methodologies, SAP, Eclipse, XML Files, MS Excel, Erwin 4.0, Unix SSH Scripting, Business Objects, Connx, Linux

Confidential, Carlsbad, CA

Sr. QA Lead

Responsibilities:

  • Wrote extensive SQL and PL/SQL scripts to test the ETL flow: data reconciliation, initialization, change data capture, delta processing, and the incremental process of the Policy and Claims systems.
  • Developed a detailed Test Plan, Test strategy, Test Data Management Plan, Test Summary Report based on Business requirements specifications.
  • Tested global sources, global targets, workflows, data profiling, and metadata management.
  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment; developed working documents to support findings and assign specific tasks.
  • Prepared a UAT Plan and set up UAT environment.
  • Prepared the execution procedure document format for preparing test cases based on the mapping document.
  • Analyzed the Data Dictionary developed by the Systems Analyst for developing test scripts to test the Policy and Claims systems.
  • Implemented logical and physical data modeling with star schema using Erwin in data marts.
  • Extensively used Informatica PowerCenter 9.5.0 to load data from source to target databases.
  • Executed ETL Informatica workflows to update test databases from the production servers.
  • Worked rigorously with developers to create test scripts for the Informatica ETL.
  • Tested Address Activities claims data for medical and pharmacy from pharmacy ODS and flat files.
  • Tested workflow tasks such as session, command, decision, and email.
  • Tested Informatica mappings and worked in the staging area to validate the data with SQL queries.
  • Tested various jobs and performed data loads and transformations using different Informatica stages and pre-built routines, functions, and macros.
  • Wrote complex SQL queries to validate EDW data versus EDM source data including identification of duplicate records and quality of data based on Mapping/Transformation rules.
  • Tested UI interfaces such as JSP and Struts on the JBoss server.
  • Used various SSIS tasks such as conditional split and derived column on the retrieved data, performed data validation checks during staging, and then loaded the data.
  • Ran SQL queries to verify the number of records from source to target and validated the referential integrity, time variance, missing records, and nulls/defaults/trim-spaces rules per the design specifications (see the sketch after this list).
  • Processed all of the big data collected in a batch environment.
  • Worked with XML feeds from multiple source systems and loaded them into the enterprise data warehouse.
  • Verified correctness of data after the transformation rules were applied on source data.
  • Coordinated execution of User Acceptance Testing, regression and integration testing with multiple departments.
  • Tested the logos added to the BO Webi reports.
  • Identified appropriate test data in relevant source systems and incorporated this data into test scripts.
  • Updated the QA team on testing status and regularly reported accomplished tasks for the assigned work to the project management team.
  • Performed Regression testing of the fixed issues in the new build until no new issues are identified.
  • Submitted weekly bug or issue report updates to the Project Manager in the form of the QA Error Log.
  • Submitted Final Test Report and required documentation for the entire project within the assigned time frame.
  • Prepared documentation on how to approach and validate the data in the data warehouse.
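
A minimal sketch of checks for the referential-integrity and nulls/defaults/trim-spaces rules described above; all table and column names are hypothetical:

```sql
-- Referential integrity: fact rows pointing at a missing policy dimension row.
SELECT f.claim_id
FROM   fact_claims f
LEFT JOIN dim_policy d ON d.policy_key = f.policy_key
WHERE  d.policy_key IS NULL;

-- Nulls and trim-spaces rules on a mandatory text column.
SELECT claim_id
FROM   fact_claims
WHERE  claim_status IS NULL
   OR  claim_status <> TRIM(claim_status);
```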

Environment: Informatica 9.5.0, OBIEE 11g, HP Quality Center 11, Oracle 10g, Erwin 4.0, XML, XSLT, UNIX, Scripting, SOAPUI, Web Services, WSDL, Automated UI Testing, PL/SQL, TOAD 7.0, Noetix, Perforce, MS Excel, Agile/Scrum Methodologies, SSIS, Pivot Tables, Microsoft Visual Studio 2008, Jira, Eclipse, JDK 1.7.0_21

Confidential, Agoura Hills, CA

Sr. QA Lead 

Responsibilities:

  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Developed test plans based on the test strategy; created and executed test cases based on the test strategy, test plans, and the ETL mapping document.
  • Preparation of technical specifications and source-to-target mappings.
  • Extensively used SQL programming in back-end and front-end functions, procedures, and packages to implement business rules and security.
  • Wrote test cases to test the application manually in Quality Center and automated them as stored procedures using SSMS (a sketch follows this list).
  • Experienced with Excel reports and dynamic dashboards, scorecards, and structured reports for operations and higher management.
  • Performed conditional formatting and threshold-level testing for several reports developed in Cognos and Excel.
  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment; developed working documents to support findings and assign specific tasks.
  • Involved in data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Tested the .NET code used for the web-based reports.
  • Tested SSIS packages used for data extraction, transformation, and loading.
  • Used various SSIS tasks such as conditional split and derived column on the retrieved data, performed data validation checks during staging, and then loaded the data.
  • Prepared several test scenarios for the workflow of the entire ETL cycle.
  • Developed and documented data mappings/transformations and SQL sessions per the business requirements; involved in developing the mappings and tuned them for better performance.
  • Extensive experience in writing SQL and PL/SQL scripts to validate the database systems and for backend database testing.
  • Performed Functional Testing and Back-end Testing using the database comparable results manually.
  • Loaded new or modified data into back-end Oracle and SQL Server databases.
  • Tested scheduled SSIS packages run by SQL Agent, configured with linked servers for data access between new and old servers.
  • Created Tables, Graphs and Listings for User Reports
  • Coordinated with Agile teams and implemented all test plans in accordance with the needs of development projects.
  • Developed automated test scripts from manual test cases for Regression testing based on the requirement documents using Quick Test Professional.
  • Defects identified in the testing environment were communicated to the developers using the defect tracking tool Mercury Test Director.
  • Implemented, gathered, and reported team and project metrics that demonstrate business value and show continuous improvement in an Agile environment.
  • Developed scripts, utilities, simulators, data sets, and other programmatic test tools as required to execute test plans.
  • Effectively communicate testing activities and findings in oral and written formats.
  • Reported bugs and tracked defects using Quality Center 11
  • Extensively used Informatica PowerCenter for the extraction, transformation, and loading process.
  • Worked with the ETL group to understand mappings for dimensions and facts.
  • Extracted data from various sources like Oracle, flat files and SQL Server.
  • Worked on issues with migration from development to testing.
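
A hedged sketch of a manual test case automated as a stored procedure, as described at the top of this list; the qa.TestResults logging table and all other object names are hypothetical:

```sql
CREATE PROCEDURE qa.usp_CheckOrphanFactRows
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @orphans INT;

    -- Count fact rows whose customer key has no matching dimension row.
    SELECT @orphans = COUNT(*)
    FROM   dw.FactSales f
    LEFT JOIN dw.DimCustomer c ON c.CustomerKey = f.CustomerKey
    WHERE  c.CustomerKey IS NULL;

    -- Log a PASS/FAIL result so reruns build an audit trail.
    INSERT INTO qa.TestResults (TestName, RunAt, Status, Detail)
    VALUES ('OrphanFactRows', GETDATE(),
            CASE WHEN @orphans = 0 THEN 'PASS' ELSE 'FAIL' END,
            CAST(@orphans AS VARCHAR(20)) + ' orphan rows');
END;
```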

Environment: SSIS, Siperian, HP Quality Center 11 and 9.2, Oracle 10g, Erwin 4.0, XML, XSLT, UNIX, Shell Scripting, SOAPUI, Web Services, SQL Server 2008, WSDL, Automated UI Testing, SOA Test 5.5.2, SQL, PL/SQL, TOAD 7.0, Cognos, MS Excel, Agile/Scrum Methodologies, Pivot Tables, Microsoft Visual Studio 2008, Microsoft Silverlight.

Confidential, Danbury, CT

ETL- MDM Analyst

Responsibilities:

  • Participated in and performed system integration testing by developing subgraphs and integrating them with extraction and load graphs.
  • Used Test Director to report bugs and data quality issues for each checksum and field validation test completed.
  • Performed 21 CFR Part 11 assessments for existing and new applications.
  • Validated the application against FDA 21 CFR Part 11 rules and tested it.
  • In the pre-initial load phase, validated the data table additions and code tables based on the code table values document and the source-to-MDM data mapping document.
  • Performed independent metadata management services testing by preparing XML transactions.
  • Communicated and Discussed with Business analysts and Developers about the status of each Data Quality Issues.
  • Executed UNIX Shell Scripts for Batch Job Execution.
  • Wrote several UNIX scripts for running test loads for regression testing purposes.
  • Manually ran SQL statements to execute record counts on each table loaded into the system and compared them with the source tables.
  • Performed tests on various features of the Agile development process.
  • Extracted data from Teradata using Informatica PowerCenter ETL and DTS packages into target databases including SQL Server, and used the data for reporting purposes.
  • Involved in the error checking and testing of the ETL procedures and programs using the Informatica session log.
  • Tracked and reported the bugs with Quality center.
  • Extensively worked on Mercury Test Center and ran various scenarios and scheduled them.
  • Created a testing audit table and ran the graphs to store the end results of each test in the SAT audit table (schema).
  • Used Perl to automate all types of modules at once.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).
  • Modified the existing shell scripts with the UltraEdit utility.
  • Wrote various Unix shell scripts for scheduling jobs in production, such as the check-file, load, completion, and DQ scripts (a sketch follows this list).
  • Involved in developing the UNIX scripts for Informatica Workflows using parameter files and monitoring the workflows on testing environment.
  • Involved in requirements gathering and analysis in support of data warehousing efforts and data quality analysis for cleansing and developed ETL specifications.
  • Participated in bug triage meetings with developers to validate the severity of the bug and responsible for tracking of the bug life cycle using Test Director.
  • Designed test data using MS Excel sheet, created data driven test for testing the application with positive and negative inputs.
  • Performed data-driven testing and validated the test results.
  • Created various User Defined Functions for script enhancements and to verify the business logic.
  • Wrote complicated SQL queries in SQL Server 2008 to update dimension tables, export data from target tables, etc.
  • Wrote complex SQL scripts in DB2 for testing data quality and validation.
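
A hedged sketch of the check-file style of production script listed above; the feed path and log file are illustrative placeholders:

```sh
#!/bin/ksh
# Verify today's feed file arrived and is non-empty before the load
# job is released; exit codes let the scheduler hold downstream jobs.
FEED=/prod/inbound/claims_feed_$(date +%Y%m%d).dat
LOG=/prod/logs/dq.log

if [ ! -f "$FEED" ]; then
    echo "$(date): check-file FAILED - $FEED not found" >> "$LOG"
    exit 1
fi

if [ ! -s "$FEED" ]; then
    echo "$(date): check-file FAILED - $FEED is empty" >> "$LOG"
    exit 2
fi

echo "$(date): check-file OK - $(wc -l < "$FEED") records" >> "$LOG"
exit 0
```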

Environment: Informatica 8.6x, Siperian, HP Quality Center, Oracle 10g, 21 CFR Part 11, IBM DB2, Erwin 4.0, Business Objects 5.x, XML, XSLT, IBM AIX 5.3, UNIX, Shell Scripting, SOAP, SOAPUI, Web Services, Parasoft, API, WSDL, Automated UI Testing, MQ Messaging, MMQ, SOA Test 5.5.2, SQL, PL/SQL, TOAD 7.0, Agile Methodology, Test Director 7.6.

Confidential, Portsmouth, NH

ETL Analyst Tester

Responsibilities:

  • Involved in Business analysis and requirements gathering.
  • Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.
  • Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
  • Executed sessions and batches in Informatica and tracked the log file for failed sessions.
  • Checked the naming standards, data integrity, and referential integrity.
  • Responsible for monitoring data for porting to current versions.
  • Performed metadata management testing for source identification, data collection, data transformation, rule administration, data consolidation, and data reconciliation.
  • Checked the reports for any naming inconsistencies and to improve user readability.
  • Extensively used Informatica Power Center for extraction, transformation and loading process.
  • Extensively used Informatica tool to extract, transform and load the data from Oracle to DB2.
  • Validated the reporting objects in the reporter against the design specification document.
  • Validated the data files from the source to make sure the correct data was captured and loaded to the target tables.
  • Tested the application developed with .NET and its components.
  • Tested .NET code for web-based reports.
  • Performed data validation of distributed database application systems using Python, T-SQL, and PL/SQL scripting.
  • Validated the archive process to purge the data that meets the defined business rules.
  • Wrote complex SQL queries in Oracle using CASE logic, INTERSECT, MINUS, subqueries, inline views, and UNION.
  • Initiated the bi-weekly QA status meeting to discuss the intricacies involved in the application(s) being tested.
  • Involved in validating the aggregate table based on the rollup process documented in the data mapping (see the sketch after this list).
  • Wrote several complex SQL queries for validating and verifying the data in Oracle 10g.
  • Prepared test plans/test schedules with inputs from Project manager and development team.
  • Responsible for reporting and escalating data issues arising from project or daily support.
  • Assisted in creating test data and test cases and execute function system tests as needed.
  • Performed black box, regression, and end-to-end testing.
  • Developed/revised training documentation and procedure manuals.
  • Responsible for running and analyzing quality check reports to ensure the system is functioning properly.
  • Tested the application by writing SQL queries and creating pivot views to perform back-end testing.
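
A minimal sketch of the aggregate-rollup validation referenced above: the monthly aggregate should equal the sum of its daily detail rows. The txn_detail and txn_monthly_agg tables and their columns are hypothetical:

```sql
-- Any row returned is a rollup mismatch between detail and aggregate.
SELECT d.acct_id, d.month_id, d.detail_total, a.agg_total
FROM  (SELECT acct_id,
              TO_CHAR(txn_date, 'YYYYMM') AS month_id,
              SUM(txn_amount) AS detail_total
       FROM   txn_detail
       GROUP BY acct_id, TO_CHAR(txn_date, 'YYYYMM')) d
JOIN   txn_monthly_agg a
       ON a.acct_id = d.acct_id
      AND a.month_id = d.month_id
WHERE  a.agg_total <> d.detail_total;
```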

Environment: Informatica 8.4, PERL, SQL, PL/SQL, Autosys, HP Quality Center, Flat Files, Python, Quick Test Pro 9.2, Business Objects XIR3, UNIX, Shell Scripting, Mercury Quality Center, Oracle 10g, SQL Plus, MS Access, Pivot Tables, SOAP, XML, XSD, XML Spy 2008, SQL, Visio 2002 sp2, Windows XP Pro, SQL Navigator 3.5.2.
