Sr. ETL QA Analyst Resume

Reston, VA

SUMMARY:

  • 11+ years of experience in Quality Assurance, including 9 years of ETL and Web Services testing
  • Expert database tester in SQL/Oracle
  • Proficient in Data Analysis, Data Validation, Data Governance, Metadata Management, and Data Loading
  • Involved in ETL process testing using Informatica ETL tool
  • Performed Integration, End-to-End, and System testing
  • Reviewed the ETL mappings (Informatica) to ensure the transformation rules are applied correctly
  • Executed the workflows in the workflow manager, to start the ETL process
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables
  • Worked in different Phases of Project including design, development, testing and administration of varied transactional and analytical Data Marts, Elements and structures
  • Expert in Oracle database testing
  • Experience with Netezza 7.0.2
  • Ability to test CDC and incremental/delta loads, including repeated runs
  • Experience in Autosys and UNIX
  • Exposure to and experience in every phase of the Software Development Life Cycle (SDLC)
  • Expertise in Agile/Scrum Methodology and RUP Environments
  • Hands on Experience on working with RallyDev
  • Good working knowledge of Test Case Design, Test Tool Usage, Test Execution, and Defect Management
  • Experience working on huge and complex data structures and their implications in generating business reports.
  • Works effectively both in a team and individually, with excellent communication and interpersonal skills
  • Proficient with Test Environment Server Restarts, Server Configurations and Server Stats using Putty
  • Proficient in Testing Web services using SOAP UI, Curl Shell and other REST Clients
  • Experience in creating and Validating the XML files based on the defined XSD’s with the help of XML SPY
  • Hands on experience in testing the Rules Services, by firing the Rules with invalid content in the XML messages
  • Validated the exception messages generated, which list all the rules triggered
  • Experience with Jenkins in building the Code, and debugging the code/server issues related to deployments
  • Experience with SVN, PERFORCE and MAVEN Code repository Tools
  • Experience as a system analyst for different phases of the system development or data warehouse lifecycle with demonstrated ability to act in a consultative role to understand functional business processes, related technical needs, and to develop solution proposals.
  • Proven ability to work successfully with technical and non-technical groups, participate effectively on teams, and manage multiple responsibilities.
  • Good Hands-on experience in the Analysis, design, development, testing and Implementation of Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL/ OLAP, Client/Server applications.
  • Hands-on experience with Metadata and data elements, registry and maintaining a data element in a metadata registry.
  • Well versed in test management tools such as Quality Center (QC), Confluence, Test Manager, Rally, and Jira
  • Used MS Excel and MS Access to analyze data and generate reports
  • Familiar with Business Intelligence functions such as reporting, OLAP, analytics, data mining, business performance management, benchmarking, text mining, and predictive analytics.
  • Proficient in researching, planning, coordinating and recommending BI solutions, software and system choices to meet the organization's business objectives.
  • Experienced in interacting with business users and executives to identify their needs, gathering requirements and authoring Business Requirement Documents (BRD), Functional Requirement Document (FRD), Use Case Diagrams, Activity Diagrams and Sequence Diagrams
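
For illustration, the CDC/incremental-load testing mentioned above can be sketched as a snapshot diff. This is a hypothetical, minimal example: the snapshot shape and column names are invented, and in practice the comparison ran as SQL against Oracle/Netezza rather than in Python.

```python
# Hypothetical CDC check: compare two snapshots of a source table, keyed by
# primary key, and classify each row as an insert, update, or delete.
def diff_snapshots(previous, current):
    """Classify changes between two {pk: row_dict} snapshots."""
    inserts = [pk for pk in current if pk not in previous]
    deletes = [pk for pk in previous if pk not in current]
    updates = [pk for pk in current
               if pk in previous and current[pk] != previous[pk]]
    return {"insert": sorted(inserts),
            "update": sorted(updates),
            "delete": sorted(deletes)}

prev = {1: {"name": "Ann", "city": "Reston"},
        2: {"name": "Bob", "city": "Herndon"}}
curr = {1: {"name": "Ann", "city": "Vienna"},   # changed  -> update
        3: {"name": "Eve", "city": "Reston"}}   # new pk 3 -> insert; pk 2 deleted

print(diff_snapshots(prev, curr))  # {'insert': [3], 'update': [1], 'delete': [2]}
```

Running the delta load twice and diffing again is how the "multiple times" repeatability claim above would be verified: the second run should produce an empty change set.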

TECHNICAL SKILLS:

Data Warehousing: Informatica 9.0/8.6, Ab Initio GDE 1.15/1.14x/1

Reporting Tools: Cognos 10.2, MicroStrategy 8.0.2

Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and Dimension Tables, Pivot Tables, Erwin

Testing Tools: Mercury Quality Center, Rational Clear Quest, Doors, Req Pro, IBM RIT, Hermes, XMLSPY, MS Office, Confluence, Toad, Fiddler, Splunk, Jenkins, SVN, Maven

RDBMS: Netezza 7.0.2, Oracle 11g/10g, MS SQL Server 6.5/7.0/2000, Sybase, Teradata

SQL: Teradata V2R6, MS Access 7.0

Database Querying Tools: Toad 9.5/9.7.2, SQL Assistant 4.x, SQL Developer

Programming: C, UNIX Shell Scripting, XML, HTML, SQL, PL/SQL

PROFESSIONAL EXPERIENCE:

Confidential, Reston VA

Sr. ETL QA Analyst

Responsibilities:

  • Customizing the internal testing framework to flatten and load XML files to Oracle tables.
  • Writing complex SQL queries for table-to-table comparison.
  • Preparing test cases based on business functionality and processes.
  • Validating the schema of all source XML files using the latest version of the XSD.
  • Loading files to the actual tables using TIBCO messaging services; TIBCO topics and queues are used for message push.
  • Analyzing production issues and finding root causes based on the actual business process.
  • Writing complex SQL queries to compare Netezza tables.
  • Understanding the architecture model of the project's business capability to create test scenarios.
  • Executing production shakeout after each release based on current business logic.
  • Performing end-to-end testing in UAT environments.
  • Refreshing actual production data in the UAT environment, executing test cases, and performing end-to-end testing
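
The table-to-table comparison described above can be sketched with compound SELECTs. This is a minimal illustration: an in-memory SQLite database stands in for Oracle/Netezza, and the table and column names are invented. EXCEPT returns rows present in one table but not the other, so a clean load yields zero rows in both directions.

```python
# Table-to-table comparison sketch (SQLite standing in for Oracle/Netezza).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE src_customer (id INTEGER, name TEXT);
    CREATE TABLE tgt_customer (id INTEGER, name TEXT);
    INSERT INTO src_customer VALUES (1, 'Ann'), (2, 'Bob');
    INSERT INTO tgt_customer VALUES (1, 'Ann');          -- row 2 not loaded
""")

# Rows dropped by the load, and rows the load invented.
missing_in_target = con.execute(
    "SELECT * FROM src_customer EXCEPT SELECT * FROM tgt_customer").fetchall()
extra_in_target = con.execute(
    "SELECT * FROM tgt_customer EXCEPT SELECT * FROM src_customer").fetchall()

print(missing_in_target)  # [(2, 'Bob')]
print(extra_in_target)    # []
```

On Oracle the same two queries would use MINUS instead of EXCEPT.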

Confidential, Reston VA

Sr. ETL QA Engineer

Responsibilities:

  • Responsible for creating complete test cases, test plans, test data, and reporting status ensuring accurate coverage of requirements and business processes.
  • Analyzing requirements and creating and executing test cases in HP ALM.
  • Involved in ETL process testing using Informatica ETL tool.
  • Test case execution and ad hoc testing.
  • Performed Integration, End-to-End, system testing
  • Verified RESTFUL web-services using Fire-Fox Rest Client and other HTTP Clients
  • Performed Functional (black-box/white-box), Integration, Regression, System, Backend, and User Acceptance testing
  • Involved in writing SQL queries and database checkpoints to verify data quality, calculations, and reviews
  • Performed data validation testing writing SQL queries.
  • Wrote complex queries in Oracle SQL assistant to check the data from Source and Target.
  • Supported the extraction, transformation and load process (ETL) for a Data Warehouse from their legacy systems using Informatica.
  • Reviewed the ETL mappings (Informatica) to ensure the transformation rules are applied correctly.
  • Executed the workflows in the workflow manager, to start the ETL process.
  • Identified field and data defects, with the required supporting information, in the ETL process across various jobs and one-to-one mappings.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Created the test environment for Staging area, loading the Staging area with data from multiple sources.
  • Extraction of test data from tables and loading of data into SQL tables.
  • Worked in Quality Center to create and document Test Plans and Test Cases and to register expected results
  • Experience assessing testing processes, creating and implementing testing strategies in UAT phase.
  • Using Clear Quest for bug tracking and reporting
  • Preparing documentation for some of the recurring defects and resolutions and business comments for those defects.
  • Developing Test Matrix to give a better view of testing effort
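
A minimal sketch of the data-validation queries described above: compare row counts and a column aggregate between source and target. The tables and figures are invented for illustration, with SQLite standing in for the actual Oracle/SQL Server back end.

```python
# Count/aggregate reconciliation sketch between source and target tables.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 4.5);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5), (3, 4.5);
""")

src_cnt, src_sum = con.execute(
    "SELECT COUNT(*), SUM(amount) FROM src_orders").fetchone()
tgt_cnt, tgt_sum = con.execute(
    "SELECT COUNT(*), SUM(amount) FROM tgt_orders").fetchone()

assert src_cnt == tgt_cnt, "row-count mismatch"
assert src_sum == tgt_sum, "amount checksum mismatch"
print("counts and sums match:", src_cnt, src_sum)
```

Counts catch dropped or duplicated rows; the SUM acts as a cheap checksum that catches silently truncated or mistransformed values the counts would miss.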

Confidential, Reston, VA

Data Warehouse Tester (Sr. ETL QA Analyst)

Responsibilities:

  • Attended requirement/design meetings to understand project and testing scope.
  • Developed the test approach and test cases for Informatica ETLs based on technical specifications and mapping documents
  • Implemented automation of test scripts in Netezza 7.0.2.
  • Created temp tables to load flat files source data and validated the ETL loads.
  • Created an ad hoc load process template and loaded XML data into tables
  • Executed Informatica workflows & sessions from the Informatica Workflow Manager and captured the run time.
  • Extensively used TOAD to verify source data and target data after the successful workflow runs using SQL
  • Validated Informatica job dependencies for integration testing
  • Validated meta data for target tables with reference to data dictionary and modeling document
  • Executed sessions and batches in Informatica and tracked the log file for failed sessions
  • Validated Informatica transformation loading process by SQL queries using various aggregate functions, Joins, sub-queries
  • Checked naming standards, data integrity, and referential integrity; prepared test data for incremental loads and base loads.
  • Involved in writing complex SQL queries combining data from multiple tables with various filters
  • Validated Informatica extraction, transformation and loading process by writing SQL against Business rules.
  • Verified record counts, column data types, column values, exception handling, and SCD1 & SCD2 functionality
  • Worked extensively with Database Procedures, Functions, Cursors and Joins to validate data.
  • Validated vending XML messages from various sources
  • Extensively used Oracle, XML, flat files, and Informatica for data set-up, and Informatica and a UNIX box for data analysis and running batch jobs.
  • Documented test cases and results in Quality Center and SharePoint.
  • Worked closely with Business Analyst, System Analyst, and Developers to understand Requirements and Technical Design documents.
  • Worked on Informatica 9.6.1 with development team, to resolve defects in SIT.
  • Validated the Source, Stage and Target (End-to-End) based on the test conditions derived from the business requirements.
  • Conducted the Regression tests after each and every bug fix.
  • Coordinated with upstream and downstream systems and identified the business scenarios.
  • Worked in an agile environment which brings key performers working together in one location - dedicated to one project until its completion.
  • Worked closely with Developers in resolving the issues and discussing the functional dependencies.
  • Worked to minimize risk by identifying, communicating issues/risks in advance.
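
The SCD2 verification mentioned above can be sketched with two queries: every business key should have exactly one current row, and effective-date ranges for a key should not overlap. This is an illustrative sketch only; the dimension table, column names, and the 9999-12-31 open-end convention are assumptions, with SQLite standing in for the warehouse.

```python
# SCD2 (Type-2 slowly changing dimension) validation sketch.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_customer (
        cust_key INTEGER, eff_start TEXT, eff_end TEXT, is_current INTEGER);
    INSERT INTO dim_customer VALUES
        (100, '2020-01-01', '2021-06-30', 0),
        (100, '2021-07-01', '9999-12-31', 1),
        (200, '2020-03-15', '9999-12-31', 1);
""")

# Business keys whose current-row count is anything other than one.
bad_current = con.execute("""
    SELECT cust_key FROM dim_customer
    GROUP BY cust_key HAVING SUM(is_current) <> 1""").fetchall()

# Pairs of rows for the same key whose effective-date ranges overlap.
overlaps = con.execute("""
    SELECT a.cust_key FROM dim_customer a JOIN dim_customer b
      ON a.cust_key = b.cust_key AND a.rowid < b.rowid
     AND a.eff_start <= b.eff_end AND b.eff_start <= a.eff_end""").fetchall()

print(bad_current, overlaps)  # [] []
```

An SCD1 check is simpler: the key must be unique outright, since history is overwritten rather than versioned.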

Environment: Informatica 9.5, Oracle 11g, Sybase, TIBCO, UNIX, Shell Scripts, Flat Files, XML files, SQL, PL/SQL, ALM, ICART, GEMS, XML SPY and TOAD, Netezza 7.0.2

WELLS FARGO, Urbandale, IA

Application Systems Analyst

CORE (Common Opportunities, Results, and Experiences) is the umbrella project that is broken down into three functional tiers with associated delivery teams: Customer Facing, Deal Processing, and Deal Decision. The primary business driver of CORE is to allow the business to meet speed-to-market needs; this pervades almost every aspect of the loan origination process. For Deal Decision specifically, this means delivering processes and systems that provide common practices in functional areas such as credit policy and pricing. Deal Decision will provide an architecture that facilitates consistent, timely delivery of Wells Fargo products and fully enables the business to meet the fast-changing demands of the market through a BRMS (Business Rules Management System) solution.

Responsibilities:

  • Prepared coverage report for every release for Rules testing
  • Created test cases to verify accuracy of business rules implemented by different services and RuleEntryPoints.
  • Tested whether each and every rule fires for every service, using an in-house rules testing tool called Argent. To test this, input XMLs must be prepared for every service, with the input data needed to fire a rule, by consulting the rule repository.
  • Generated Oracle SQL scripts to validate large scale data against the XML Source
  • Executed SQL queries to validate the Data in the back-end using Toad
  • Prepared the ETA and resourcing for rule-scenario testing for every quarterly production release.
  • Conducted orientation program on new product and features for every release.
  • Coordinated with external vendors such as Freddie Mac, Confidential, HUD, and CREDCO
  • Reporting the testing progress to the manager.
  • Used VI Editor to Modify Config and Property Files on the Test Servers
  • Tested rule flow; checked the rules logs for the order in which rule sets fired.
  • Prepared Test plans, Test Scenarios and Test Scripts as per functional, Business requirements
  • Executed Smoke, Functional, Regression, Integration and UAT Tests in Test and Integration Environments
  • Extensively used Jenkins to Deploy the code in to Test Servers and Test Integration Servers
  • Executing regression test suite using in-house developed batch execution automation tool called TestHarness.
  • Closely monitoring the deployments to different environments and validating the environments by running regression tests.
  • Used Quality Center for Test Script executions and Clear Quest for Defect Tracking
  • Reviewed Functional Test Cases based on major functionalities, to be tested.
  • Created and Validated the XML files based on the defined XSD’s with the help of XML SPY
  • Validated the XML Schemas which referenced to Single XSD and Multiple XSD’s
  • Performed Test Environment Server Restarts, Server Configurations and Server Stats using Putty through UNIX Commands
  • Developed a good command over the XSD’s which helped in creating XML files and XML messages for Test Needs
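
The rule testing described above can be sketched in miniature: build an input XML message, then check whether a rule fires against it. This is a hypothetical stand-in, not Argent itself; the Deal/FicoScore element names, the rule, and its 620 floor are all invented for illustration.

```python
# Sketch of firing a rule against a constructed XML input message.
import xml.etree.ElementTree as ET

def build_deal_message(fico=None, ltv=None):
    """Assemble a toy <Deal> message; omit arguments to omit elements."""
    root = ET.Element("Deal")
    if fico is not None:
        ET.SubElement(root, "FicoScore").text = str(fico)
    if ltv is not None:
        ET.SubElement(root, "Ltv").text = str(ltv)
    return root

def rule_min_fico(msg, floor=620):
    """Fires (returns a failure string) when FicoScore is missing or too low."""
    node = msg.find("FicoScore")
    if node is None:
        return "RULE_MIN_FICO: FicoScore element missing"
    if int(node.text) < floor:
        return "RULE_MIN_FICO: score below floor"
    return None  # rule did not fire

print(rule_min_fico(build_deal_message(fico=580)))  # RULE_MIN_FICO: score below floor
print(rule_min_fico(build_deal_message(fico=700)))  # None
```

Preparing one such message per service, with data chosen to trip each rule in the repository, is the pattern the bullet about Argent describes.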

Confidential, Washington DC

Application Quality Analyst

Responsibilities:

  • Prepared Test plans, Test Scenarios and Test Scripts as per functional, Business requirements
  • Communicating with Third Party Clients to prepare integration test plans
  • Executed Smoke, Functional, Regression, Integration and UAT Tests in Test and Integration Environments
  • Generated Oracle SQL scripts to validate large scale data against the XML Source
  • Executed SQL queries to validate the Data in the back-end using Toad
  • Good understanding of Dimensions and Facts, Validated CDC process.
  • Strong ability in developing advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements including identifying the tables and columns from which data is extracted.
  • Implemented Database Checkpoints for Back-end Testing
  • Used SQL for Querying the Oracle database for data validation and data conditioning
  • Extensively used Jenkins to Deploy the code in to Test Servers and Test Integration Servers
  • Debugged the code/server issues related to deployments to fix the deployment issues
  • Generated IBM RIT Automation Test Scripts To Test the Functionality of the Application based on the XML Source Files, and TIBCO Messages
  • Generated Oracle SQL scripts to retrieve data from the CLOBS and other Data Tables of the Data-store (IDS)
  • Performed Test Environment Server Restarts, Server Configurations and Server Stats using Putty through UNIX Commands
  • Used VI Editor to Modify Config and Property Files on the Test Servers
  • Used SVN to check-in and check-out RIT Automation Test Scripts
  • Created and Validated the XML files based on the defined XSD’s with the help of XML SPY
  • Validated the XML Schemas which referenced to Single XSD and Multiple XSD’s
  • Validated the XML messages setting with conditions that will fail rule validation(s) and Listing all the failed rules that were triggered, and published them to the exception topic
  • Identified Rule Failure scenarios and created XML messages with rule failure scenarios
  • Created Rule(s) and Context and linked them in the DataBase
  • Validated the XSD’s developed by the Architecture Team to point any issues in the attribute and data definitions
  • Developed a good command over the XSD’s which helped in creating XML files and XML messages for Test Needs
  • Worked With Lewtan and SBO who are Third Party vendors for the Project for the Functional and Integration Test Plans and Test Scenarios
  • Used Quality Center and Rally for test script execution and defect tracking
  • Generated Shell Scripts to transfer the XML files from one server to the other server
  • Monitored the Logs using Splunk & Putty for issue debugging to help Dev Team in quicker resolutions of the defects
  • Configured the Source Code Config & Property Files to integrate two servers for File transmissions
  • Prepared Test Data for the Team to satisfy the Test Needs
  • Verified RESTFUL web-services using Fire-Fox Rest Client and other HTTP Clients
  • Performed server-level tests during code deployment for server-level QA sign-off
  • Extensively worked on Execution of automated scripts, analyzing the results, enhancing the scripts
  • Coordinating with Developers for Defect analysis and debugging.
  • Responsible for daily status; attended bridge meetings, presenting progress and future testing efforts in Scrum meetings
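
The rule-failure validation described above (run each message through the rules, collect every failed rule, and route the failures to the exception topic) can be sketched as follows. Everything here is a made-up stand-in: the field names, the rules, and the list playing the part of the exception topic.

```python
# Sketch: validate a message against a rule set and collect all failures.
def required(field):
    """Build a rule that fires when the named field is missing or empty."""
    return lambda msg: None if msg.get(field) else f"{field}_REQUIRED"

rules = [required("dealId"), required("cusip")]

def validate(msg):
    """Return the names of every rule that failed for this message."""
    return [failure for rule in rules if (failure := rule(msg))]

exception_topic = []                    # stand-in for the real exception topic
msg = {"dealId": "D-42"}                # cusip deliberately missing
failures = validate(msg)
if failures:
    exception_topic.append({"msg": msg, "failed_rules": failures})

print(failures)  # ['cusip_REQUIRED']
```

Note the validator deliberately keeps going after the first failure, so the exception message can list every rule that was triggered, as the bullets above describe.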

Environment: XML, Java, Web services, UNIX, Oracle, SQL, Windows, LINUX, QTP, SOAP-UI, SVN, TIBCO, Rational Test Bench, XML- SPY, EMS, ETL

Confidential, Herndon VA

Application Quality Analyst

Responsibilities:

  • Compiled SQL statements based on Solution Specs
  • Executed ETL Autosys jobs
  • Validated full volume data
  • Validated BO reports to make sure that the newly added columns exist in those Reports.
  • Validated collateral data (XML messages) during integration testing.
  • Validated UI functionality to Manage Agreement types.
  • Exported TCs from Quality Center to DOORS and generated the RTM

Environment: Java, JSP, JavaScript, XML, Clear Quest, Rational Requisite Pro, Hermes JMS, Autosys, Browser, Quality Center 10.0, UNIX, Shell Scripting, Oracle 9i/10g, MS Word, MS Excel, Windows XP, TOAD 9.0

Confidential, Jersey City, NJ

ETL / SQL Tester- EDWH

Responsibilities:

  • Worked as a database tester, testing the databases against the FRS, BRS, SRS, and other documentation.
  • Tested the databases at various stages including development, UAT and other phases of the project.
  • Extensively involved in Business Analysis and Requirements Gathering.
  • Worked with the Business and Architecture teams to create and understand the mapping documents; involved in internal design review meetings to map the source to the target.
  • Used Microsoft Excel spreadsheet tool for performing calculations, analyzing data and integrating information from different programs.
  • Design and populate test data for ETL/Aggregations, DW & Cognos BI reporting projects.
  • Provided 24/7 support for Teradata, Informatica and Oracle PL/SQL processes that were automated and scheduled through Autosys.
  • Conducted Weekly Onsite-Offshore Status Meetings.
  • Involved in extensive DATA validation using SQL queries and back-end testing.
  • Used HP Quality Center for test management.
  • Interacted with the customers to know their BI/System requirements.
  • Experienced in data analysis using SQL, PL/SQL and many other queries based applications.
  • Worked in Unix environment for file comparison and data checks
  • Extensive experience in writing SQL and PL/SQL scripts to validate the database systems and for backend database testing.
  • Responsible for execution of Batch jobs and population test data for other teams.
  • Executed the Batch job in Autosys and validated the data.
  • Performed the data validation once the Batch Job execution is completed.
  • Created project plans for development of Cognos projects.
  • Extensively involved in Business Analysis and Requirements Gathering.
  • Wrote complex SQL, PL/SQL Testing scripts for Backend Testing of the data warehouse application.
  • Used Teradata SQL to Create reports from RDBMS with an ODBC interface.
  • Extensively used various inbuilt transform functions like string_substring, string_lpad, string_index, lookup functions, date functions, and error functions.
  • Developed UNIX Korn Shell wrappers to initialize variables, run graphs and perform error handling.
  • Logged and resolved defects in the roll out phase.
  • Developed web-based business intelligence metadata search and analysis tools and automated SQL capture of static reports using ASP and Cognos 10.2 web API.
  • Tested both conditional formatting and threshold level testing for several reports developed in Cognos.
  • Improved the performance of viewing the Baseline and Actual data from 20 minutes to less than 2 minutes by moving some functionality from Informatica to Teradata.
  • Performed backend database testing by writing SQL and PL/SQL scripts to verify data integrity
  • Used Teradata SQL to import data from a system file directly into the database, and to create many similar reports (query results or answer sets), such as displaying the DDL (SQL) used to create a list of tables
  • Detailed repeatable end-to-end test cases for the application; interviewed developers to gain detailed information on complex calculations and backend exception processing in order to validate the back end using SQL*Plus
  • Executed and scheduled tasks; batch processing, metadata management and data mapping.
  • Customized complex reports in Excel using intricate formulas
  • Facilitated meetings with business users, Project Managers, and IT personnel.
  • Processed a query to update a field in an Oracle database, to ensure that debt payments are allocated.
  • Used TOAD to perform manual tests on a regular basis; used UNIX and Oracle in this project to write shell scripts and SQL queries
  • Experienced in working with DB2 and Teradata
  • Conducted interviews and facilitate requirements gathering, process analysis, and application design sessions with company staff across all levels and functions.
  • Interview business users and business SMEs to verify business requirements; gather, analyze and compile Business Requirement Document, Visio Diagrams and Functional Requirement Document
  • Extensively used SQL programming in backend and front-end functions, procedures, packages to implement business rules and security
  • Conducted Integration, Performance, Functional, and Load testing to ensure that the application meets the requirements and to ensure usability readiness. Extensively used T-SQL to verify and validate the data loaded to SQL Server 2005
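
A minimal sketch of the backend data-integrity verification described above: find child rows whose foreign key has no parent (orphans). SQLite stands in for the Oracle/Teradata warehouse, and the fact/dimension tables are invented for illustration.

```python
# Referential-integrity (orphan) check between a fact table and its dimension.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_account (acct_id INTEGER PRIMARY KEY);
    CREATE TABLE fact_payment (pay_id INTEGER, acct_id INTEGER);
    INSERT INTO dim_account VALUES (1), (2);
    INSERT INTO fact_payment VALUES (10, 1), (11, 2), (12, 9);  -- 9 is an orphan
""")

# LEFT JOIN + IS NULL keeps only fact rows with no matching dimension row.
orphans = con.execute("""
    SELECT f.pay_id, f.acct_id
      FROM fact_payment f LEFT JOIN dim_account d ON f.acct_id = d.acct_id
     WHERE d.acct_id IS NULL""").fetchall()

print(orphans)  # [(12, 9)]
```

A passing load returns an empty list; any rows returned identify exactly which payments reference a missing account.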

Environment: Informatica 9.0, Data Profiler Shell Scripting, Oracle 11g/10g/9i, Sun Solaris, HP QTP 9.2, Oracle 11g/10g/9.0, Quality Center 10.0, IBM Mainframes, MS Visio, Excel, Cognos 10.2, Clear case, ERWIN 3.5, Clear Quest, Batch controller, and Autosys

Confidential, Monterey Park, CA

ETL Tester

Responsibilities:

  • Extracted data from various sources like Oracle, flat files and SQL Server.
  • Designed reports in Access and Excel using advanced functions including, but not limited to, VLOOKUP, pivot tables, and formulas.
  • Solid testing experience in working with SQL Stored Procedures, triggers, views and worked with performance tuning of complex SQL queries.
  • Responsible for different Data mapping activities from Source systems to Teradata.
  • Used partition components like Partition by Expression and Partition by Key to run the middle-layer processing in parallel.
  • Closely monitored the Autosys batch jobs in ETL/Informatica batch run during System, Integration and Acceptance test runs.
  • Executed sessions and batches in Informatica and tracked the log file for failed sessions.
  • Modified and ran UNIX Scripts for batch jobs.
  • Strong ability in developing advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements including identifying the tables and columns from which data is extracted.
  • Implemented Database Checkpoints for Back-end Testing
  • Used SQL for Querying the Oracle database for data validation and data conditioning
  • Produced variance reports, Excel graphs, charts, presentations in PowerPoint
  • Designed, implemented and maintained code needed for data extraction, transformation and loading.
  • Prepared test cases by understanding the business requirements, Data Mapping documents and technical specifications.
  • Analyzed business requirements, system requirements, data mapping requirement specifications and responsible for documenting functional requirements and supplementary requirements in Quality Center 9.0
  • Customized complex reports in Excel using intricate formulas.
  • Implemented SDLC, QA methodologies and concepts in the Project.
  • Involved in testing of Universes from different data sources like Oracle/SQL Server.
  • Worked on profiling the data for understanding the data and mapping document.
  • Reported bugs and tracked defects using Quality Center 9.0
  • Developed Access databases and associated graphical user interfaces.
  • Tested the Oracle Applications and Portals like Executive Dashboard, CIE Portal built on Oracle. Used TOAD Software for Querying ORACLE.
  • Prepared and ran regression scripts, shell scripts and performed sanity tests and end-end testing.
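
The flat-file extraction and load validation described above can be sketched end to end: parse a delimited source file, load it, and confirm every record landed in the target. The pipe-delimited layout, the table, and the data are invented; csv plus SQLite stand in for the real feed and database.

```python
# Flat-file-to-table load validation sketch.
import csv, io, sqlite3

# io.StringIO stands in for the real source flat file on disk.
flat_file = io.StringIO("id|name\n1|Ann\n2|Bob\n3|Eve\n")
source_rows = [(int(r["id"]), r["name"])
               for r in csv.DictReader(flat_file, delimiter="|")]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tgt (id INTEGER, name TEXT)")
con.executemany("INSERT INTO tgt VALUES (?, ?)", source_rows)  # the 'ETL'

target_rows = con.execute("SELECT id, name FROM tgt ORDER BY id").fetchall()
assert target_rows == sorted(source_rows), "flat file and target differ"
print(len(target_rows), "rows validated")  # 3 rows validated
```

In practice the load step is the Informatica session under test, and the tester only writes the extract-and-compare halves.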

Environment: Informatica 8.6.1, Toad, SQL, PL/SQL, XML, XML Spy 2008, SQL Server 2000/2005, Quality Center, GEMS, Clear Quest

Confidential, New York, NY

Senior Data Warehousing Tester

Responsibilities:

  • Worked with the ETL/Ab Initio GDE group to understand mappings for dimensions and facts.
  • Created various PL/SQL stored procedures for dropping and recreating indexes on target tables.
  • Extensively used Ab Initio GDE for extraction, transformation and loading process.
  • Worked with the ETL group to understand Ab Initio GDE graphs for dimensions and facts.
  • Design and populate test data for ETL/Aggregations, DW & MicroStrategy BI reporting projects.
  • Analyzed business requirements, system requirements, and data mapping requirement specifications interacting with client, developers and QA team.
  • Implemented partition techniques using Partition by Key, Partition by Expression and Round-Robin techniques on the data unloaded from multiple tables before sending the data through data quality checks.
  • Used different Informatica transformations effectively to develop and maintain the database.
  • Performed Extraction, Transformation, and Loading using different transformations and expressions in Informatica to create test data.
  • Tested data migration to ensure that the integrity of the data was not compromised.
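
The partitioning techniques mentioned above differ in what they guarantee: Partition by Key routes all rows sharing a key to the same partition (required before keyed joins or rollups), while Round-Robin simply balances load. A pure-Python stand-in for the Ab Initio partition components, with invented row data:

```python
# Partition-by-key vs round-robin routing sketch.
def partition_by_key(rows, key, n):
    """All rows sharing a key value land in the same partition."""
    parts = [[] for _ in range(n)]
    for row in rows:
        parts[hash(row[key]) % n].append(row)
    return parts

def round_robin(rows, n):
    """Rows are dealt out evenly, regardless of key."""
    parts = [[] for _ in range(n)]
    for i, row in enumerate(rows):
        parts[i % n].append(row)
    return parts

rows = [{"acct": a, "amt": x} for a, x in [(1, 10), (2, 20), (1, 30), (3, 40)]]

by_key = partition_by_key(rows, "acct", 2)
# Both rows for acct 1 must sit in a single partition.
holders = [p for p in by_key if any(r["acct"] == 1 for r in p)]
assert len(holders) == 1

print([len(p) for p in round_robin(rows, 2)])  # [2, 2]
```

Using round-robin before a keyed aggregation is a classic defect to test for, since rows for one key get split across partitions and the rollup undercounts.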

Environment: Ab Initio GDE 1.15, Oracle 9i/10g, SQL, PL/SQL, Stored Procedures, XML, XSD, Unix Shell Scripting, Teradata V2R5

Confidential

Software Engineer

Responsibilities:

  • Involved in Data Validation using SQL queries.
  • Creating and executing SQL queries to perform Data Integrity testing on an Oracle Database to validate and test data using TOAD.
  • Responsible for performing Integration testing, performance testing, white box testing, Functionality testing, Negative testing, Positive testing, Regression testing on the application.
  • Responsible for testing GUI module which is the heart of Patient-Soft application. Understanding existing sales system.
  • Executed test cases before and after bug fixes for each build.
  • Interacted with developers regarding the priority of bugs and updated the status of bugs once they were fixed.
  • Involved in Data Validation with Business Team.
  • Performed negative testing, to make sure validations are done properly.
  • Carried out and oversee test execution for Business scenario, User acceptance and Regression testing.
  • Actively involved in review meetings and walkthroughs; provided updates to management on testing status and other issues.
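
The negative testing described above can be sketched as deliberately inserting invalid data and asserting that the database rejects it. This is an illustrative sketch only: SQLite stands in for the Oracle back end, and the patient table and its constraints are invented.

```python
# Negative-test sketch: invalid data must be rejected by the constraints.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE patient (mrn TEXT NOT NULL UNIQUE, name TEXT)")
con.execute("INSERT INTO patient VALUES ('MRN-1', 'Ann')")

def expect_rejection(sql):
    try:
        con.execute(sql)
        return False            # negative test FAILED: bad data was accepted
    except sqlite3.IntegrityError:
        return True             # negative test passed: constraint fired

print(expect_rejection("INSERT INTO patient VALUES (NULL, 'Bob')"))    # True
print(expect_rejection("INSERT INTO patient VALUES ('MRN-1', 'Eve')")) # True
```

The positive-testing counterpart is simply asserting that valid rows are accepted and that the rejected rows never reached the table.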

Environment: Informatica Power Mart 7.1, Erwin, XML, XML Spy, Toad, Oracle 8i, Windows 2000, and UNIX.
