
ETL\BI QA Lead Resume


MN

SUMMARY

  • 7.8 years of experience in Information Technology, concentrating on the design, development, and testing of database and data-warehousing systems.
  • Extensive experience in ETL processes using Informatica PowerCenter (7.1/8.5/9.1/9.5/9.6.1) and SSIS.
  • Experience as a database developer, including data modeling tasks.
  • Experience in all phases of the data warehousing life cycle, involving design, development, analysis, and testing of data warehouses using ETL, data modeling, and reporting tools.
  • Good knowledge of RDBMS concepts and proficiency in developing SQL against databases such as Oracle, MS SQL Server, and Amazon Redshift.
  • Good working knowledge of cube processing and analysis services.
  • Practical understanding of data modeling concepts: star schema, snowflake schema, fact and dimension tables, and modeling data at all three levels (conceptual, logical, and physical).
  • Experience in performance tuning techniques, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions.
  • Experience working on multiple projects at a time with different teams, fulfilling their quality requirements as a QA lead.
  • Expertise in analyzing and understanding business requirements and expressing them as functional QA requirements.
  • Experience in creating test plan documents and test strategy documents, designing test scenarios, and scripting test cases to test applications and data.
  • Capable of identifying issues, finding root causes, and recommending possible solutions to the team as a result of detailed testing and analysis.
  • Proficient in different levels of testing, such as functional, regression, and integration testing, as per business requirements.
  • Experience working in HP Application Lifecycle Management (ALM) for QA testing.
  • Capable of overseeing a project from the conceptual phase through development by applying technical and project management skills.
  • Worked extensively with Dimensional modeling, Data migration, Data cleansing, Data profiling, ETL Processes, Data mining, Data audits and Web reporting features for data warehouses.
  • Very strong mix of academic and professional experience in the related area, demonstrated through the implementation of successful projects and diverse roles.
  • Proficient in Ralph Kimball’s dimensional approach to designing large data warehouses.
  • Excellent communication and interpersonal skills; responsible for interacting with business partners to identify information needs and business requirements for reports.

TECHNICAL SKILLS

Operating Systems: Windows Vista/XP/NT/2003/8.1/10, UNIX/LINUX

Databases: Oracle 12c/11g/10g/9i/8i, MS SQL Server 2016/2012/2008/2005, Teradata, AWS Redshift, Hadoop, SQL Server Analysis Services (SSAS)

Modeling Tools: SQL Data Modeler, Erwin, MS Visio

ETL Tools/Languages: Informatica PowerCenter 7.1/8.5/9.1/9.5/9.6, Informatica PowerExchange, SAP BusinessObjects Data Services (BODS) 4.2, SQL Server Integration Services (SSIS), PL/SQL, MySQL

Reporting Tools: MicroStrategy (9.4/10.4), SQL Server Reporting Services, SAP BusinessObjects BI Launch pad, Power BI

PROFESSIONAL EXPERIENCE

Confidential, MN

ETL\BI QA Lead

Responsibilities:

  • Working as a QA lead on multiple projects.
  • Assisting business analysts in documenting requirements based on business processes and existing business rules.
  • Creating test requirement documents, including test scope, test plans, test cases, and scenarios for database and report testing based on business requirements.
  • Writing complex SQL queries to validate data transformations per business logic for the data warehousing components behind the net position report.
  • Performing manual tests to validate test cases, including functional tests, regression tests, incremental load tests, and performance tests.
  • Performing extensive data validation in the warehouse against different source systems.
  • Performing data validation in Hadoop for all data sets from different databases and source systems, such as the data warehouse, JDE, and Power Apps.
  • Validating BI reports in SAP BI Launch pad and Power BI as per the report requirements and sources, such as the data warehouse, Hadoop, and Power Apps.
  • Identifying the issues, finding the root cause and recommending the possible solution to the team.
  • Worked with business analyst to define expectations for UAT.
  • Assisted business users and stakeholders in User Acceptance Testing (UAT).
  • Conducting post-prod validations to ensure everything works as expected in production after migration.
  • Recommending go/no-go for each release depending on test results and the criticality of any bugs.
  • Logging bugs and tracking defects using Azure DevOps, maintaining issue logs, checking updates, and communicating progress, issues, and blockers to the team in the daily scrum meeting.
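For illustration, the kind of source-to-target reconciliation described above can be sketched as follows. This is a minimal sketch only, using an in-memory SQLite database in place of the real warehouse; the table and column names (`stg_positions`, `dw_positions`, `amount`) are hypothetical, not taken from the actual project:

```python
import sqlite3

# Minimal sketch of a source-to-target validation check: compare row counts
# and a column aggregate between a staging (source) table and the loaded
# target table. Table/column names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_positions (acct TEXT, amount REAL);
    CREATE TABLE dw_positions  (acct TEXT, amount REAL);
    INSERT INTO stg_positions VALUES ('A1', 100.0), ('A2', 250.5);
    INSERT INTO dw_positions  VALUES ('A1', 100.0), ('A2', 250.5);
""")

def validate_load(cur, src, tgt):
    """Return (counts_match, sums_match) for a simple reconciliation."""
    src_cnt, src_sum = cur.execute(
        f"SELECT COUNT(*), ROUND(SUM(amount), 2) FROM {src}").fetchone()
    tgt_cnt, tgt_sum = cur.execute(
        f"SELECT COUNT(*), ROUND(SUM(amount), 2) FROM {tgt}").fetchone()
    return src_cnt == tgt_cnt, src_sum == tgt_sum

counts_ok, sums_ok = validate_load(cur, "stg_positions", "dw_positions")
print(counts_ok, sums_ok)
```

In practice the same count/aggregate comparison would be run as SQL directly against the warehouse, with mismatched rows investigated via key-level diffs.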

Environment\Tools: MS SQL Server 2016, SSIS, SSAS, SSRS, Hadoop (Cloudera Hue), SAP BusinessObjects BI Launch pad (reporting), Power BI, Power Apps, Tidal (scheduling), Azure DevOps, SharePoint.

Confidential, Scottsdale, AZ

ETL\BI Quality Analyst

Responsibilities:

  • Creating test plans and preparing test cases per the business requirements and mapping documents for new and existing data sets in the AWS Redshift data warehouse and for BI reports in MicroStrategy.
  • Writing complex SQL queries against the source to validate data transformations by comparing with the data loaded in the target (using Matillion ETL).
  • Performing manual tests based on test cases, business requirements, and use cases, including functional tests, regression tests, incremental load tests, and performance tests.
  • Performing extensive data validation in the warehouse against the legacy system and ODS.
  • Validating BI reports, including metrics created in MicroStrategy, against business requirements and mapping documents using complex SQL queries.
  • Carrying out validations after migration to upper environments (pre-prod and post-prod validation).
  • Identifying issues and bugs and recommending possible solutions.
  • Recommending go/no-go for each component depending on the results of testing.
  • Communicating progress, issues, and blockers to the team in the daily scrum meeting, and logging and tracking defects using Trello.
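An incremental-load test of the kind listed above typically asserts that, after a delta load, the target contains no duplicate business keys and that new and updated rows landed. The sketch below assumes hypothetical names (`dw_orders`, `order_id`) and uses SQLite purely as a stand-in for Redshift:

```python
import sqlite3

# Sketch of an incremental-load check: after a second (delta) load, assert
# no duplicate business keys remain and the expected rows are present.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dw_orders (order_id TEXT, status TEXT);
    -- initial full load
    INSERT INTO dw_orders VALUES ('O1', 'open'), ('O2', 'open');
    -- incremental load: O2 updated (old row replaced), O3 newly arrived
    DELETE FROM dw_orders WHERE order_id = 'O2';
    INSERT INTO dw_orders VALUES ('O2', 'closed'), ('O3', 'open');
""")

# Duplicate-key query: returns no rows if the merge logic is correct.
dupes = cur.execute("""
    SELECT order_id FROM dw_orders
    GROUP BY order_id HAVING COUNT(*) > 1
""").fetchall()
total = cur.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]
print(len(dupes), total)
```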

Environment\Tools: AWS Redshift, SQL Server, Athena, MicroStrategy 10.4, Matillion, SharePoint, Trello.

Confidential, Minneapolis, MN

ETL Quality Analyst

Responsibilities:

  • Analyzing and translating complex business requirements and functional specifications into testing requirements.
  • Planning, writing, preparing, and executing manual test cases based on business requirements, functional specifications, business rules, and use cases.
  • Responsible for verifying business requirements, ETL analysis, ETL testing, and the design of the flow and logic for the data warehouse using Informatica and shell scripting.
  • Carrying out functional testing and regression testing.
  • Performing extensive data validations against the data warehouse.
  • Writing SQL and PL/SQL scripts for back-end database validation.
  • Performing ETL testing based on mapping documents for data movement from source to target.
  • Testing several Informatica mappings to validate business conditions, using complex SQL queries to verify the data transformation rules.
  • Extensively using Informatica to debug, run, and monitor loads for testing.
  • Participating in bug-tracking meetings, scrum meetings, and other test-planning meetings.
  • Performing end-to-end testing across databases to ensure data consistency.
  • Reporting progress, documenting test cases, and logging and tracking defects using Rally.
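Mapping-document-driven testing like the above re-derives each target column from its documented rule and diffs it against what the ETL actually loaded. A minimal sketch, assuming a made-up rule ("target name = trimmed, upper-cased source name") and illustrative table names:

```python
import sqlite3

# Sketch of a transformation-rule check: re-apply the documented rule in SQL
# and list rows where the loaded target value disagrees (expect none).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_cust (cust_id INTEGER, cust_name TEXT);
    CREATE TABLE tgt_cust (cust_id INTEGER, cust_name TEXT);
    INSERT INTO src_cust VALUES (1, '  alice '), (2, 'Bob');
    INSERT INTO tgt_cust VALUES (1, 'ALICE'),    (2, 'BOB');
""")

mismatches = cur.execute("""
    SELECT s.cust_id
    FROM src_cust s JOIN tgt_cust t ON s.cust_id = t.cust_id
    WHERE UPPER(TRIM(s.cust_name)) <> t.cust_name
""").fetchall()
print(mismatches)
```

The same pattern generalizes to any deterministic mapping rule: express the rule in SQL, join source to target on the business key, and flag disagreements.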

Environment\Tools: Informatica PowerCenter 9.5, PL/SQL, Oracle 11g (transitioning to 12c), SQL Developer, UNIX, Beyond Compare, Rally, SVN, GitHub, Quip, SharePoint, Erwin

Confidential, Charlotte, NC

Database Developer\Testing

Responsibilities:

  • Engaged with clients to understand the business requirements and scope of each assignment/story.
  • Created a high-level functional approach document.
  • Created conceptual, logical, and physical data models using Erwin.
  • Created stored procedures, using XQuery to insert data.
  • Handled XML payloads as stored procedure input, using XQuery to parse the data.
  • Worked closely with business analysts to analyze the client's existing denormalized data model and gathered requirements for designing a new normalized data model with a multitenant architecture.
  • Identified the core data elements required to apply the client's business rules and reports.
  • Worked closely with the data architect to identify gaps and to design and develop the new normalized data model.
  • Designed strategies to migrate data from a highly denormalized database to a normalized (third normal form) data model.
  • Designed and developed SSIS packages to migrate data to the new model with the required data transformations for customer data mastering.
  • Created stored procedures using XQuery to load XML files generated by the BizTalk canonical schema into the newly designed normalized data model.
  • Updated the team lead and project manager on the status of assignments/projects in the daily scrum and logged story details in JIRA.
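The XML shredding that the XQuery stored procedures performed can be illustrated as follows. This sketch is in Python rather than T-SQL/XQuery, and the element names (`Customers`, `Customer`, `Name`, `City`) are hypothetical, not the real BizTalk canonical schema:

```python
import xml.etree.ElementTree as ET

# Sketch of shredding an XML payload into relational rows, the same shape of
# work the stored procedures did with XQuery before inserting into the
# normalized model. Element names are illustrative only.
payload = """
<Customers>
  <Customer id="1"><Name>Alice</Name><City>Charlotte</City></Customer>
  <Customer id="2"><Name>Bob</Name><City>Raleigh</City></Customer>
</Customers>
"""

def shred(xml_text):
    """Return (id, name, city) tuples ready for a parameterized INSERT."""
    root = ET.fromstring(xml_text)
    return [
        (c.attrib["id"], c.findtext("Name"), c.findtext("City"))
        for c in root.findall("Customer")
    ]

rows = shred(payload)
print(rows)
```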

Environment\Tools: BizTalk Server 2016, SQL Server 2016, InRule, SAP BODS, Erwin, SQL Server Management Studio 2016, SQL Server Data Tools 2015 (SSIS), XQuery, Windows 10
