- 14+ years of experience in the SDLC, including Requirement Gathering, Analysis, Design, Development, Implementation, Testing and Support.
- 13+ years of experience in Software Testing and 5+ years of experience as an ETL developer.
- 5+ years of experience with Cognos. Executed multiple OLAP applications using Cognos (8, 10) to verify/validate data from the EDW that supports analytical reporting for Corporate Business Units.
- Worked in Waterfall, Agile and V-model environments.
- Extensively used JMeter and Java for Test Automation.
- Experience in ETL methodology supporting Data Extraction, Transformation and Loading processes in a corporate-wide ETL solution using Pentaho 4.x/5.x.
- Experience in ETL test automation with QuerySurge, which reduced testing runtime to one third of the manual effort for continuous testing cycles.
- Provided defect analysis reports, with sample records generated through QuerySurge, to the development team to fix issues.
- Created UNIX shell scripts to execute Pentaho kjb and ktr files and Informatica workflows.
- Expertise in Functional, Manual, Regression, User Acceptance (UAT), System, Integration, Black/White Box and Smoke testing.
- Experienced working with Salesforce.
- Strong understanding of the Banking, Confidential Care, Retail and food distribution business domains.
- Experience in writing complex SQL and Oracle PL/SQL queries, including calling stored procedures with parameters and functions, using SQL Developer and Toad as development tools.
- Expertise in QA testing in distributed Unix environments with Oracle and SQL Server databases as the back end; performed end-to-end and back-end testing.
- Experienced in working with clients to gather requirements, understand business needs, give product demonstrations and provide product support.
- Experienced in code migration from Development to Demo, UAT, Pre-Prod and Production using UNIX, WinSCP and FileZilla.
- Experience with HP/IBM ALM tools such as Quality Center, ClearQuest and JIRA.
- Understood business rules based on high-level document specifications and implemented data transformation methodologies.
- Ability to meet deadlines and handle multiple tasks. A good team player: motivated, decisive, with strong leadership qualities, flexible in work schedules and strong communication skills.
ETL Tools: DataStage, Informatica PowerCenter 10.x, Ab Initio, Pentaho 5.x, Salesforce, Tibco
Testing Tools: QuerySurge, Beyond Compare, HP Quality Center 10.x, HP ALM, TFS, JMeter
Databases: Oracle 12c, 11g, 10g, 9i, 8i, SQL Server 2012, MS Access, Netezza
Programming Skills: C, SQL, PL/SQL, SQL*Plus, HSQL, Java, VBScript, JavaScript, DHTML, XML
Scripting Languages: UNIX Shell Scripting, JavaScript, HDFS Scripts
SQL Tools: Toad, SQL Developer, SQL Server Management Studio
OS: Windows XP, NT, 7, 8, AIX, Unix/Linux
- Extensively used JMeter for Test Automation.
- Used SAP, Netezza and Oracle sources for the integration project.
- Used the SAP SE16 t-code to search source tables and create CSV files to load into the QA repository.
- Used JMeter to validate 22 integrations for the project built in Tibco.
- Added the HTTP Header Manager config element to thread groups for JSON/XML Content-Type and Accept headers.
- Used the HTTP Request Sampler to test a new API created for the Product Group interface.
- Used the JDBC Connection Configuration to obtain database connections.
- Used the JDBC Request Sampler to connect to and retrieve data from the EDW (Netezza).
- Used the BeanShell Sampler for Java code that performs data comparisons and validation.
- Validated source/target data values and created STDiff and TSDiff outputs for difference analysis.
- Used Jenkins to automate jmx files and other interface components; used qTest to maintain test cases and execution results.
- Set up and configured the SMTP Sampler to send the result statistics file and PIC document locations.
- The Java testing script validates the status of each test case and writes a Pass or Fail result, along with the source/target counts, to the results statistics file.
- The Java testing script also validates source and target aggregate (SUM) amount and quantity columns and writes them to the results statistics file.
- Used Confluence for project management and documentation.
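The headless JMeter runs automated through Jenkins above could be wrapped in a small shell script along these lines; a minimal sketch, in which the `.jmx`/`.jtl` file names and the failure check are illustrative assumptions, not the project's actual setup.

```shell
#!/bin/sh
# Illustrative wrapper for a non-GUI JMeter run, as a Jenkins job might invoke it.
# The test-plan and result-file names are placeholders.
JMX=${1:-interface_suite.jmx}
JTL=${2:-results.jtl}

run_plan() {
    # -n: non-GUI mode, -t: test plan, -l: results file
    jmeter -n -t "$JMX" -l "$JTL"
}

# Count failed samples in a CSV-format .jtl file; with JMeter's default CSV
# output, column 8 is the "success" flag.
count_failures() {
    awk -F, 'NR > 1 && $8 == "false" { n++ } END { print n + 0 }' "$1"
}
```

A Jenkins build step can call `run_plan` and fail the build whenever `count_failures` reports a non-zero count.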
Environment: Informatica, Tibco, Confluence, JMeter, Webservices, API, Netezza, Java, JIRA, qTest, Jenkins
ETL QA Analyst/Tester
- Created test plans for the in-house developed application “Advantage”.
- Developed reusable QuerySurge query snippets, created new queries and automated them to execute ETL test cases.
- Provided test result analysis to the developers/business to fix issues found during automation.
- Actively participated in decision making and QA meetings and regularly interacted with Business Analysts and the development team to understand business processes, requirements and design.
- Created and executed many test cases to verify data, such as row counts, data types, field max/min lengths, upper/lower case, mock-up data, date ranges, cycle IDs, duplicates, minus queries and sample records.
- Assisted the team in getting up to speed to complete the testing life cycle.
- Developed SQL queries to show the discrepancies between Source and Target.
- Executed Control-M jobs to load data.
- Performed troubleshooting and performance tuning of queries written for data comparison. Analyzed error table entries and resolved related issues.
- Analyzed DataStage mappings to find the root cause of job failures or data discrepancies.
- Performed detailed job log analysis for any job execution failure or missing data in the target.
- Used HP QC/ALM to log and track defects.
- Developed a simulator that creates inbound files based on sent outbound files to complete the process cycle.
- Used Splunk to view logs for web services.
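The source-vs-target discrepancy checks above (SQL MINUS style) can also be sketched over flat extracts; this assumes both sides have already been exported to sorted text files, and the function names are hypothetical.

```shell
#!/bin/sh
# MINUS-style comparison over sorted source/target extracts.
# Note: comm requires both input files to be sorted.

# Rows present in the source extract but missing from the target
# (the equivalent of SELECT ... FROM source MINUS SELECT ... FROM target):
source_minus_target() {
    comm -23 "$1" "$2"
}

# Rows present in the target but not in the source:
target_minus_source() {
    comm -13 "$1" "$2"
}
```

Either function printing any rows signals a discrepancy to report back to the developers.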
Environment: DataStage 11.5, UNIX, Oracle, Toad, Control-M, HP QC - ALM, WinSCP, Splunk, Webservices
ETL QA Analyst
- Performed gap analysis between the Technical STM and Business STM and reported any discrepancies.
- Executed Tivoli jobs/ICS that generate subject-area feeds such as Claims, Member, Provider, LAB, Rx, etc.
- Tested and verified feeds generated from the EDW, which were then ingested into the DAL RAW and STD layers.
- Performed regression testing after defect code fixes.
- Performed negative testing to ensure the system behaves as expected.
- Executed HDFS scripts to ingest feeds into Hive/Impala/Pig tables.
- Verified that HDFS scripts executed successfully and feeds were copied to the S3 server.
- Generated CSV files from Hive/Impala/Pig tables and copied them to the local machine using a mapped drive.
- Executed commands to copy files from one location to another and unzip files to load data.
- Executed HDFS scripts to load data from the RAW layer to the STD layer.
- Performed data validation with all transformation logic applied to the RAW data.
- Used the vi editor to modify par files to avoid inserting hidden special characters into the files.
- Loaded Hive/Impala/Pig CSV data files into temp tables in the EDW and compared the data with feeds generated from the EDW.
- Used TFS for document version maintenance and test management.
- Ran Cognos reports to verify that good data is available for analytics.
- Executed queries on the STD layer and compared the data with Cognos reports using different prompts.
- Extensively used ClearQuest to log defects and followed up with the Dev team for code fixes and migration.
- Attended daily Scrum standups, provided updates to the Scrum Master and flagged any roadblocks.
- Executed ICS jobs that load data from the EDW to Salesforce, setting the parameter dates to run a partition.
- Created temp tables in the EDW and loaded Salesforce data using the Salesforce Data Loader.
- Verified the data model using the Describe Salesforce tool.
- Used JIRA to communicate to Project Management all defects raised during test cycles.
- Used SharePoint extensively for document retrieval and uploads.
- Worked closely with BAs/DAs and developers to obtain requirement docs, speed up code fixes and meet release dates.
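A minimal sketch of the feed-ingestion steps described above (unzip a feed, push it to HDFS, load it into a RAW-layer Hive table); the paths, the table name, and the file-naming convention are assumptions for illustration only.

```shell
#!/bin/sh
# Hypothetical feed-ingestion helper. The HDFS directory and the Hive table
# name (raw.claims_feed) are placeholders, not the project's real objects.

csv_name() {
    # Derive the csv name from the feed zip, e.g. claims_20190101.zip -> claims_20190101.csv
    printf '%s\n' "${1%.zip}.csv"
}

load_feed() {
    feed_zip=$1
    hdfs_dir=$2
    csv=$(csv_name "$feed_zip")
    unzip -o "$feed_zip" || return 1
    hdfs dfs -put -f "$csv" "$hdfs_dir/" || return 1
    hive -e "LOAD DATA INPATH '$hdfs_dir/$(basename "$csv")' INTO TABLE raw.claims_feed"
}
```

Each step returns on failure so a broken feed never reaches the Hive load.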
Environment: Erwin, Apache Hadoop, HDFS, Pig, Hive, Hue, Shell Scripting, TFS, Oracle, LINUX, UNIX, Toad, Salesforce, DataLoader, Clear Quest, Describe, Jira, IPSwitch, SQL Server 2012.
Data Integration Engineer
- Effectively led a 3-member team.
- Generated system flow, data flow and process flow diagrams and assisted the team in accomplishing project goals.
- Interacted with the Business/System Analysts to transform the business requirements into technical requirements.
- Created Technical Mapping documents, Production Implementation Documents and Resource Allocation Documents.
- Worked on PL/SQL coding and created stored procedures to be called from Pentaho kjb and ktr jobs.
- Worked with multiple pipeline mappings.
- Migrated classic Oracle Warehouse jobs to Pentaho kjb jobs.
- Developed a DataMart for Cognos reports and provided support for Cognos Framework Manager.
- Executed multiple Cognos applications to test the data workflow from the DWH to the DM.
- Used the debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
- Performed unit testing at various levels of ETL development and actively involved in team code reviews.
- Fixed invalid mappings and troubleshot technical issues related to the database.
- Scheduled and monitored jobs in Tidal.
- Used Pentaho Kitchen command-line scripts to run transformations and jobs.
- Created UNIX scripts to execute Pentaho kjb (job) and ktr (transformation) files and control the ETL flow.
- Worked with Oracle RDBMS and Flat Files sources and targets.
- Developed mappings to replicate data from SQL Server tables to Oracle tables.
- Developed dimensional models using star and snowflake schemas for Cognos analytics.
- Migrated Code from Development environment to Demo, UAT, Pre-Prod and Production using UNIX and WinSCP.
- Performed performance tuning at the source, target, mapping and session levels.
- Used hints and table-level indexing to achieve optimum performance.
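The UNIX scripts that execute Pentaho kjb files and control the ETL flow could look roughly like this; the PDI install path, log directory and job names are assumptions, and `kitchen.sh -file= -level=` is standard PDI command-line syntax.

```shell
#!/bin/sh
# Rough sketch of a wrapper that runs a Pentaho job via kitchen.sh and gates
# the next ETL step on its exit code. Paths below are placeholders.
PDI_HOME=${PDI_HOME:-/opt/pentaho/data-integration}
LOG_DIR=${LOG_DIR:-/var/log/etl}

run_job() {
    job=$1
    # Capture the job log per kjb file, e.g. load_dm.kjb -> load_dm.log
    "$PDI_HOME/kitchen.sh" -file="$job" -level=Basic \
        > "$LOG_DIR/$(basename "$job" .kjb).log" 2>&1
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "FAILED: $job (rc=$rc)"
        return "$rc"
    fi
    echo "OK: $job"
}

# Chain jobs so a failure stops the flow:
# run_job load_stage.kjb && run_job load_dm.kjb
```

Scheduling tools such as Tidal can then key off the script's exit status.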