QA Analyst Resume


Irving, TX

SUMMARY

  • A self-motivated and goal-oriented data warehousing professional with rigorous industry experience and strong oral and written communication skills, seeking a challenging position in ETL, DWH, and other QA testing roles.
  • Experience in all aspects of the Software Test Life Cycle, including System Analysis, Test Design, Development, Execution, Reporting, and Closure Documentation.
  • Experience with testing methodologies, concepts, phases, and types of testing; developing Test Plans, Test Scenarios, Test Cases, and Test Reports; and documenting test results after analyzing Business Requirements Documents (BRD), Functional Requirement Specifications (FRS), mapping documents, and technical specs.
  • Clear understanding of different ETL tools and the transformations used to move data between different database systems.
  • Experience in writing complex SQL queries as well as HQL to check the integrity of data when performing SIT and E2E testing between source and target (a sketch follows this list).
  • Solid experience with database query tools such as TOAD, SQL Server Management Studio, and Teradata SQL Assistant.
  • Experienced in Functional testing, System testing, Integration testing, Regression testing, GUI testing, Black-box testing, White-box testing, and Boundary testing.
  • Very good understanding of Data Warehousing concepts, Data Analysis, Data Warehouse Architecture, and Design.
  • Understanding of Data Models, Data Schemas, ETL, and SQL queries to perform back-end data warehouse testing.
  • Experience in testing initial loads and delta loads with End-to-End testing.
  • Experienced in generating ad hoc and canned reports per business needs for development and testing purposes.
  • Experienced with Big Data (Hadoop) HBase and Hive databases.
  • Good knowledge of the Java programming language.
  • Good knowledge of Microsoft TFS and the Agile/Scrum models.
  • Self-starter and highly motivated.
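
As an illustration of the source-to-target checks above, a minimal SQL sketch of a typical validation; src_customer and tgt_customer are hypothetical staging/warehouse table names, not taken from any actual project:

    -- Row-count reconciliation between source and target
    SELECT (SELECT COUNT(*) FROM src_customer) AS src_cnt,
           (SELECT COUNT(*) FROM tgt_customer) AS tgt_cnt;

    -- Rows present in the source but missing or different in the target
    -- (use EXCEPT instead of MINUS on SQL Server/Netezza)
    SELECT cust_id, cust_name, cust_status FROM src_customer
    MINUS
    SELECT cust_id, cust_name, cust_status FROM tgt_customer;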

TECHNICAL SKILLS

ETL Tools: Ab Initio, DataStage, Informatica PowerCenter, and SSIS.

Databases: Teradata, Oracle, DB2, MS SQL Server, Netezza, HBase, Hive, HDFS, Pig

Methodologies: Agile/Scrum, Waterfall

Test Management Tools: qTest, HP Quality Center (ALM), JIRA, Trello, MS TFS

Data Access Tools: SQL Assistant, TOAD, SQL Server Management Studio, Oracle SQL Developer, DbVisualizer

Operating Systems: Windows, UNIX

Programming Languages: Core Java Knowledge, Shell Scripting

Other Tools & Technologies: Zena Scheduler, XMLSpy, WinSCP, PuTTY, Notepad++, Excel, XML, Informatica TDM

PROFESSIONAL EXPERIENCE

Confidential, Irving, TX

QA Analyst

Responsibilities:

  • Worked on the Genesis Data Warehouse project, which was a common pool for contracts, balances & positions across all of Citi for use in Finance, Risk, and Compliance analysis.
  • Involved day to day in a data acquisition project for feed processing & validation, where feeds come from different source systems per end-consumer needs; activities included receiving feed files, processing them through to the data mart, validating them with functional & regression testing, and reporting feed availability back to the feed requester.
  • Participated in business requirements and design walkthroughs, and analyzed business requirements and functional requirements.
  • Performed Functional and Regression testing activities on a day-to-day basis.
  • Ran different types of jobs in UAT environments using UNIX commands, as well as manually from Ab Initio and Autosys, to process ETL jobs and load data into target tables/files for testing purposes.
  • Performed homogeneous and heterogeneous testing between files & tables for count and data validations using automated SQL scripts, including loading files into temporary tables and comparing them against target tables (see the sketch after this list).
  • Created test scenarios and test cases based on the test plan, business and data mapping requirements.
  • Performed different validations like DDL, count, and source-to-target checks for the ETL process using SQL queries against Netezza and Hive tables.
  • Reviewed mapping & FRD documents against testing requirements to ensure all necessary information was covered, and tested data flows based on the mapping document.
  • Performed file validations in the UNIX environment for counts and data using commands like comm, awk, grep & cat.
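
A minimal sketch of the file-vs-table comparison described above, assuming the feed file has already been loaded into a temporary table (e.g., via nzload or an external table); feed_stage and gdw_position are hypothetical names used for illustration only:

    -- Count validation: staged file rows vs. target mart rows
    SELECT (SELECT COUNT(*) FROM feed_stage)   AS file_cnt,
           (SELECT COUNT(*) FROM gdw_position) AS tgt_cnt;

    -- Data validation: rows in the staged file that never reached the target
    SELECT acct_id, pos_dt, balance FROM feed_stage
    EXCEPT
    SELECT acct_id, pos_dt, balance FROM gdw_position;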

Environments/Tools: Ab Initio, UNIX, SQL, Big Data Hadoop, Hive, Oracle, Netezza, Excel, Notepad++, Autosys Scheduler, HP ALM, JIRA, Agile.

Confidential

ETL Tester

Responsibilities:

  • Worked in an Agile software development methodology; participated in daily stand-up calls and updated progress to all stakeholders to accomplish the sprint's planned tasks.
  • Understood the user stories/requirements for committed tasks and data flows in JIRA by setting up meetings with business analysts as well as developers.
  • Reviewed and understood system requirements, design documents, and mapping documents, and prepared test strategy and test plan documents.
  • Worked on ACA2RS data, which falls under the Affordable Care Act medical insurance system.
  • Worked on the CMS claim submission process project and validated claims data.
  • Worked on Data Lake Gold Membership data hosted on the Big Data (Hadoop) platform: HDFS, Hive, and HBase.
  • Performed ETL testing & other data flow testing across homogeneous and heterogeneous sources, such as CSV file to table, XML file to table, or table to table.
  • Extensively worked on testing the Big Data Hadoop platform to validate Hive tables in the data lake that were loaded from different data sources.
  • Tested data quality, duplicates & counts for Hive and HBase tables in the Hadoop environment (a HiveQL sketch follows this list).
  • Performed testing between different databases as source and target, such as SQL Server, Hive & Teradata tables, by importing data into a single platform like Teradata and comparing them by writing SQL (see the cross-database compare sketch at the end of this section).
  • Validated organization data between source and target, loaded into different databases depending on the product and line of business, using simple and complex SQL and HQL as necessary.
  • Worked on EDW as well as data lake projects for data quality checks like count validation, data validation, and duplicate checks.
  • Performed SIT and E2E testing from source ingestion to outbound extract for data quality checks using HQL between source and target.
  • Monitored and verified Zena jobs for successful runs and file & data movement.
  • Performed data quality checks and audit checks on the HDFS files received from different sources.
  • Extensively worked on the UNIX platform to test files against tables using different UNIX commands to copy, read, modify, and search files (e.g., cp, cat, sed, grep).
  • Extensively worked on a Teradata EDW project and performed different data validations like table structure, count, duplicate, & data quality checks by writing simple and complex SQL queries.
  • Performed XML data validations against Teradata tables by preparing a DataStage job (using stages such as Sequential File, Transformer, and XML Input) to load the XML file into QA tables and compare them against the developer-loaded tables.
  • Designed and executed test cases and test reports using qTest and HP ALM.
  • Prepared test execution, defects and test status report to track completion of user stories.
  • Tested the audit/control process developed between different data hubs, implemented with the Infogix controls tool.
  • Attended different trainings to get an overview of various test automation utilities.
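
A minimal HiveQL sketch of the duplicate and data quality checks described above; member_gold and member_id are hypothetical table/key names for illustration only:

    -- Duplicate check on the business key
    SELECT member_id, COUNT(*) AS dup_cnt
    FROM   member_gold
    GROUP  BY member_id
    HAVING COUNT(*) > 1;

    -- Basic quality check: rows with a NULL or blank key
    SELECT COUNT(*) AS bad_key_cnt
    FROM   member_gold
    WHERE  member_id IS NULL OR member_id = '';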

Environment/Tools: Claims, EDW, Data Lake, Hadoop, Teradata, Hive, HBase, Pig, Shell Scripting, SQL Assistant, qTest, HP ALM, DataStage, XML files, PuTTY, WinSCP, JIRA, & Agile.
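
And a sketch of the cross-database compare referenced above, after importing the SQL Server data into Teradata; stg_sqlsrv_claim and edw_claim are hypothetical table names, and claim_id/claim_amt are assumed columns for illustration:

    -- Flag rows that are missing on either side or differ in amount
    SELECT COALESCE(s.claim_id, t.claim_id) AS claim_id,
           CASE
             WHEN s.claim_id IS NULL THEN 'missing in source import'
             WHEN t.claim_id IS NULL THEN 'missing in target'
             WHEN s.claim_amt <> t.claim_amt THEN 'amount mismatch'
           END AS compare_result
    FROM stg_sqlsrv_claim s
    FULL OUTER JOIN edw_claim t
      ON s.claim_id = t.claim_id
    WHERE s.claim_id IS NULL
       OR t.claim_id IS NULL
       OR s.claim_amt <> t.claim_amt;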
