ETL Tester Resume
Phoenix, AZ
SUMMARY
- 9+ years of professional experience in Business Intelligence testing as a Software Quality Assurance Engineer on different platforms and environments.
- Experience across the full Software Development Life Cycle (SDLC), with in-depth knowledge of contemporary QA/testing principles and methodologies.
- Tested the ETL process with both pre-load and post-load data validation.
- Involved in designing the extraction of data from different source systems into the target database using Informatica.
- Extensive experience with Order Management systems across the order life cycle: sourcing, payments, invoicing, returns, logistics and distribution, and store management applications.
- Prepared high-level design documents and ETL technical specifications.
- Created ETL test data for all ETL mapping rules to test the functionality of the Informatica mappings.
- Sound knowledge of and experience with metadata and star/snowflake schemas; analyzed source systems, staging areas, and fact and dimension tables in the target data warehouse.
- Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing)
- Involved in performance tuning of ETL: Informatica code and PL/SQL components.
- Extensive experience working with ETL tools such as DataStage, Informatica, Ab Initio, and SSIS, and with the AutoSys scheduler.
- Experienced in testing portlets and custom builders built with WebSphere Portlet Factory.
- Experienced in using WebSphere Dashboard Framework builders.
- Experienced in configuring custom user attributes in LDAP with WMM in WebSphere Portal and VMM in WAS.
- Experience with new Azure features; reproduced and troubleshot Azure end-user issues and provided solutions to mitigate them.
- Strong technical expertise in Integration testing, Functional testing, GUI testing, Backend testing, Security testing
- Efficient in analyzing system requirements, use cases, and other documents to gain an overall understanding of a new application, determine the appropriate level of testing required, and design an end-to-end testing framework that supports both manual and automated testing.
- Patched and updated Azure IaaS VMs.
- Migrated on-premises applications to Azure.
- Implemented continuous integration and continuous delivery for Azure Web Apps
- Migrated on-premises servers to Azure using Azure Site Recovery.
- Replicated on-premises physical servers and virtual machines to Azure using Azure Site Recovery.
- Planned, configured, optimized and deployed Microsoft Azure solutions (IaaS, PaaS, VMs, Azure Active Directory, Automation, Monitor, etc.).
- Integrated on-premises Windows Active Directory with Azure Active Directory, and configured multi-factor authentication and federated single sign on.
- Extensive Quality Assurance work applying both manual and automated testing methods across all phases of the Software Development Life Cycle (SDLC), starting from requirements gathering (FSD/SRS).
- Experience in analysis, design, development, implementation, testing, and production and maintenance using Waterfall, CMM, and Agile/Scrum methodologies.
- Working knowledge of SQL, HTML, XML, and internet browsers.
- Adept at using HP tools such as HP Quality Center, HP ALM, HP QTP, HP UFT, and HP LoadRunner.
- Experienced in developing and maintaining Test Plans, Test Scripts and Test Cases, Defect Tracking, and Report Generation.
- Expertise in Unit Testing, Integration Testing, GUI testing, Functional testing, System testing, Regression Testing, User Acceptance testing, end-to-end testing and Black Box testing methodologies.
- Experience in writing and executing SQL queries to perform data validation and back-end testing of databases to check the integrity of data (a minimal sketch follows this list).
- Produced proactive reports, trend analyses, service-level reporting, process consultation, and application of ITIL best practices.
- Experience with data analysis and with analyzing and documenting business requirements and data specifications.
- Experience in creating the Requirement traceability matrix and performing the corresponding analysis.
- Experience in coordinating testing efforts with the offshore teams.
- Excellent communication skills, strong problem-solving skills, and a good team player.
- Ability to meet deadlines, handle pressure, and coordinate multiple tasks in a work/project environment.
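A minimal sketch of the source-to-target SQL validation described above, written in Python against an in-memory SQLite stand-in; the table names and data are hypothetical, not taken from any project listed here:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Hypothetical source/staging and target tables standing in for the
    # Oracle/Teradata structures described in this resume.
    cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
    cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
    cur.executemany("INSERT INTO src_orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])
    cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])

    # Row-count reconciliation between source and target.
    src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"

    # Minus-style query: rows present in the source but missing from the target.
    missing = cur.execute(
        "SELECT order_id FROM src_orders EXCEPT SELECT order_id FROM tgt_orders"
    ).fetchall()
    assert not missing, f"rows missing in target: {missing}"
    print("validation passed")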
TECHNICAL SKILLS
ETL/BI Testing Tools: Informatica PowerCenter 10.x/9.x/8.x, Informatica IDQ, DVO, DataStage, Ab Initio, SSIS, Cognos 10.x/9.x/8.x, Snowflake (cloud), Tableau, Python, MDM, Teradata, UAT
Big Data Tools: HDFS, MapReduce, Sqoop, Hive, Pig, HBase, ZooKeeper, Redshift, AWS Glue testing
Cloud Services: Azure, Azure SDK, Azure Chatbot
Defect Tracking and Testing Tools: HP Quality Center, JIRA, Rally, QTP, JUnit, TestNG, Firebug, FirePath, LoadRunner, JMeter.
Build Tools: ANT, Maven.
Products: Trizetto Facets proprietary healthcare software system 4.8/5.4
CI/CD Tools: Jenkins, Hudson.
Databases: Oracle 9i/10g/11g/12c, SQL/PL-SQL, DB2, SQL Server, Teradata V2R6 (Teradata SQL Assistant).
Version Control: Subversion, TeamForge, Git, GitHub, SourceTree
Operating Systems: Windows XP/7/8/10, Mac OS, UNIX, Linux, Android
PROFESSIONAL EXPERIENCE
Confidential, Phoenix, AZ
ETL Tester
Responsibilities:
- Involved in creation of test plan, test scenarios, test cases, Observation Summary Report and Test Execution Summary Report.
- Worked in an Agile methodology, sprint to sprint.
- Attended daily Scrum meetings and provided each day's status.
- Researched business data to understand source and meaning, performed business analysis, created ETL mapping and data transformation rules.
- Extensively worked with ETL code developed in the Informatica ETL tool to meet requirements for the extraction, transformation, cleansing, and loading of data from source to target data structures.
- Extensively worked on performance tuning of programs, ETL procedures, and processes.
- Tested the ETL process with both pre-load and post-load data validation.
- Designed and tested UNIX shell scripts used in the ETL process to automate loading and to pull data for testing ETL loads.
- Extensively used Informatica to load data from Flat Files to Teradata, Teradata to Flat Files and Teradata to Teradata
- Supported the extraction, transformation and load process (ETL) for a Data Warehouse from their legacy systems using Informatica.
- Selecting the appropriate AWS service based on compute, data, or security requirements.
- Experience integrating Amazon Web Services (AWS) with other application infrastructure.
- Good experience working with AWS services such as EC2, S3, SimpleDB, RDS, Elastic Load Balancing, SQS, IAM, CloudWatch, EBS, and CloudFront.
- Designed AWS architecture and cloud migration, including DynamoDB and event processing using Lambda functions.
- Experience in managing and securing custom AMIs and AWS account access using IAM.
- Managed storage in AWS using Elastic Block Store and S3; created volumes and configured snapshots.
- Experience configuring AWS S3 buckets and their lifecycle policies to back up files and archive them to Amazon Glacier.
- Experience in creating and maintaining the databases in AWS using RDS.
- Expertise in deployment automation, release management, and provisioning full stacks using AWS CloudFormation and Elastic Beanstalk.
- Experience in creating CloudWatch alarms that send an Amazon Simple Notification Service (SNS) message when the alarm triggers (see the sketch after this list).
- Architected and designed backup and archiving strategies, implemented disaster recovery in the AWS cloud, and developed expertise with the AWS CLI.
- Involved in validating the data using VBScript.
- Created test scripts using VBScript in QTP.
- Worked with data analysts to implement Informatica mappings and workflows, shell scripts and stored procedures to meet business requirements.
- Extensively used Informatica client tools; the objective was to extract data stored in an Oracle database and flat files and load it into a single Oracle data warehouse repository.
- Extracted data from Teradata using Informatica PowerCenter ETL and DTS packages to target databases including SQL Server, and used the data for reporting purposes.
- Worked on MongoDB database concepts such as locking, transactions, indexes, sharding, replication, and schema design.
- Created multiple databases with sharded collections and chose shard keys based on the requirements.
- Experience in managing MongoDB environments from availability, performance, and scalability perspectives.
- Configured high availability using geographically distributed MongoDB replica sets across multiple data centers.
- Tested Python batch processors to consume and produce various feeds.
- Tested entire frontend and backend modules using Python on Django Web Framework.
- Tested Business Logic using Python on Django Web Framework.
- Tested Merge jobs in Python to extract and load data into MySQL database.
- This work included loading historical data onto the Teradata platform, as well as reference data and metadata.
- Involved in designing logical and physical data models using the Erwin tool; responsible for data mapping activities from source systems to Teradata.
- Performed the Back-end Integration Testing to ensure data consistency on front-end by writing and executing SQL statements
- Solid Back End Testing experience by writing and executing SQL Queries.
- Extensively used SQL programming in backend and front-end functions, procedures, packages to implement business rules and security
- Prepared the architecture and implementation of applications in Azure cloud and on-premises environments.
- Responsible for configuring and integrating cloud-based PaaS applications such as Office 365 and Dropbox for SSO functionality using ADFS and Microsoft Azure AD.
- Installed and configured virtual machines, storage accounts, virtual networks, and the Azure load balancer in the Azure cloud.
- Performed cost analyses for data center migration using the Azure pricing calculator and the Total Cost of Ownership (TCO) tool.
- Configured VM scale sets for identical VMs and applications, per business requirements for dev/test environments, using Azure Resource Manager.
- Configured and designed virtual network infrastructure and network connectivity between on-premises and Azure services using Azure Resource Manager, virtual private networks (VNet peering and VNet-to-VNet configuration), UDRs, ExpressRoute, and site-to-site (S2S) and point-to-site (P2S) VPNs.
- Familiar with CMMI and the Rational Unified Process (RUP) methodology within the Software Development Life Cycle (SDLC), along with Agile and Scrum.
- Tested (unit, integration, regression, and UAT) a new .NET application built with Microsoft Visual Studio under the Scrum (Agile) methodology.
- Used data-driven testing and database-access techniques during automation script development.
- These functions are application and platform independent and can be used across multiple projects.
- Developed a comprehensive automation regression suite that is executed at the end of each sprint.
- Involved in sprint review and backlog grooming meetings.
- Involved in giving demos for each sprint.
- Provided daily status reports to the client.
- Used Rally to understand stories and track the hours spent on each task.
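A minimal sketch of the CloudWatch-alarm-to-SNS setup referenced above, using Python and boto3; the instance ID and topic ARN are hypothetical placeholders rather than values from this project:

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # Hypothetical EC2 instance ID and SNS topic ARN.
    INSTANCE_ID = "i-0123456789abcdef0"
    TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:etl-alerts"

    # Alarm fires when average CPU stays above 80% for two consecutive
    # 5-minute periods, publishing a notification to the SNS topic.
    cloudwatch.put_metric_alarm(
        AlarmName="high-cpu-" + INSTANCE_ID,
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": INSTANCE_ID}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=2,
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=[TOPIC_ARN],
    )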
Environment: Informatica 9.0, Informatica IDQ, DVO, Functional Testing, Agile, Jira, Cognos 10.1/10.2, Unix/Linux, SQL/PL-SQL, Quality Center ALM 11.0, UAT, SDLC, MDM, Teradata, MS Azure, Oracle 10g, VB.Net, Web Services, XML, SQL Server
Confidential
ETL QA Tester
Responsibilities:
- Attended review meetings on functional and technical specs with developers and account managers.
- Formulated a test plan containing test scenarios for Functional, System, Integration, and Regression testing.
- Tested mappings for extracting, cleansing, transforming, integrating, and loading data using the Informatica ETL tool.
- Extensively used ETL methodology for testing and supporting data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica.
- Extracted data from Teradata using Informatica PowerCenter ETL and DTS packages to target databases including SQL Server, and used the data for reporting purposes.
- Designed and developed ETL processes in AWS Glue to migrate campaign data from external sources such as S3 (ORC/Parquet/text files) into AWS Redshift.
- Performed data extraction, aggregation, and consolidation of Adobe data within AWS Glue using PySpark.
- In-depth hands-on experience with AWS cloud services such as EC2, S3, Elastic Beanstalk, SNS, SQS, DynamoDB, CloudWatch, Cloud Foundry, and Lambda.
- Full life-cycle experience utilizing Object Oriented Analysis and Design methodologies
- Integrated the current Early Funding system with the AWS cloud platform.
- Helped the enterprise pick the right AWS services for its applications.
- Worked on integrating AWS services such as SQS, SNS, and DynamoDB.
- Backup & Recovery, Database optimization and Security maintenance.
- Created drill maps for end users to drill down and retrieve data.
- Strong testing and Quality Assurance experience within an agile environment.
- Good understanding of the agile software development lifecycle (iterative and incremental).
- Performed tests on various features throughout the agile development process.
- Used Teradata SQL to send queries to the Teradata database.
- Used Teradata SQL to view and sort results by column and save them to a file.
- Used Teradata SQL to save queries to a file so they could be rerun automatically later.
- Experience in creating UNIX scripts for file transfer and file manipulation.
- Used SQL for Querying the DB2 database in UNIX environment
- Performed database integrity check using SQL and Unix Shell Scripts
- Tested SQL Stored Procedures and Queries for Back end testing
- Involved in extensive Data validation using SQL queries and back-end testing
- Extensively involved in back-end testing of data quality by writing complex SQL.
- Wrote test cases for web-based testing.
- Tested functionality and performance of web services.
- Involved in Backend Web services testing.
- Developed test requirements and test plans for comprehensive testing of back-end systems like Web Services.
- Experience in managing virtual instances and disks using Puppet.
- Created configuration for establishing a VPN tunnel between on premise network and AWS VPC.
- Designed stacks using AWS CloudFormation templates to launch AWS infrastructure and resources.
- Developed AWS CloudFormation templates to create custom-sized VPCs, subnets, EC2 instances, ELBs, and security groups.
- Used the AWS CLI to suspend an AWS Lambda function and to automate backups of ephemeral data stores to S3 buckets and EBS (see the sketch after this list).
- Migrated the SQL database to AWS RDS with Multi-AZ deployment.
- Installed and set up web servers (Apache and Tomcat), a MySQL DB server with master and slave replication, and multiple MySQL instances on different ports.
- Created AWS Route 53 records to route traffic between different regions.
- Worked in HP ALM to write and execute test cases.
- Created UI Actions, UI Policies.
- Performed import/export of data sets from Dev to other environments.
- Created custom tables, based on business needs.
- Participated in daily scrum meeting and developer meetings.
- Service Catalog/Change/Incident/Service Level Management Implementation.
- Served as onsite coordinator for the offshore team in India; this role involved extensive daily assignment and management of tasks across offshore and onsite resources.
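A minimal sketch of the S3 backup automation referenced above, shown here with Python and boto3 rather than the AWS CLI used on the project; the bucket name and local directory are hypothetical:

    import os

    import boto3

    s3 = boto3.client("s3")

    # Hypothetical backup target and source directory.
    BUCKET = "my-etl-backups"
    BACKUP_DIR = "/var/tmp/ephemeral"

    # Walk the directory tree and upload every file, keying each object
    # by its path relative to the backup root.
    for root, _dirs, files in os.walk(BACKUP_DIR):
        for name in files:
            path = os.path.join(root, name)
            key = os.path.relpath(path, BACKUP_DIR)
            s3.upload_file(path, BUCKET, key)
            print(f"uploaded {path} -> s3://{BUCKET}/{key}")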
Environment: Informatica 8.6/8.0, Informatica IDQ, DVO, Functional Testing, Cognos 9.3/9.4, Jira, SQL/PL-SQL, Unix/Linux, SDLC, Agile, MDM, Teradata, MS Azure, Quality Center 10.0, DB2, MS Office, Excel, Visio, MS Project, MS PowerPoint, SharePoint, Microsoft CRM 2011, Office 365, SQL, UAT
Confidential, Wakefield, MA
ETL QA Tester
Responsibilities:
- Assisted in gathering the business requirements and in ETL test design of the flow and logic for the data warehouse project.
- Extensively used ETL methodology for testing and supporting data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica.
- Tested ETL Informatica mappings and other ETL processes (Data Warehouse testing).
- Worked with Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping/Mapplet Designer, and Transformation Developer.
- Reviewed Informatica mappings and test cases before delivering to Client.
- Extensively used Informatica power center for extraction, transformation and loading process.
- Used Microsoft Visio to design the flow of jobs.
- Defined a number of test cases using quality data for end-to-end business processes during UAT, and validated the system setup for transactions and user access in UAT.
- Involved in the creation of test plans and reports.
- Worked closely with BAs and developers to better understand the functionalities.
- Attended meetings with BAs and client managers to gather documents for writing test cases.
- Extensively involved in writing complex SQL queries.
- Involved in extensive data validation using SQL queries in Oracle, SQL Server, and Sybase databases.
- Tested the front-end trading application Charles River Development (CRD).
- Involved in validating data for hedge fund clients (such as JP Morgan and Goldman Sachs) by writing complex SQL queries.
- Tested the brokerage and financial services applications like Equities, Fixed Income, Investment Research, Bonds, Stocks, Options, Capital Markets, Investment Deposits, Accounts (Accounts Activity, Balance & Holdings, and Portfolio Management), E-statements, Retail Consumer Lending, Mortgage Modules and Integration Modules.
- Performed white-box and black-box testing.
- Tested and debugged the code during white-box testing.
- Used Bugzilla as a defect tracking tool
- Coded VBScript scripts to clean, normalize, and reformat data for loading into the ERP system (illustrated in the sketch after this list).
- Worked with OOP concepts for automating the functionality of the application.
- Involved in preparing the Requirements Traceability Matrix (RTM) and test sets in HP Quality Center 9.2.
- Executed smoke testing to cover the main features of the application as and when required.
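A minimal sketch of the data cleanup referenced above, illustrated in Python rather than the VBScript used on the project; the feed layout and field names are hypothetical:

    import csv
    import io

    # Hypothetical raw feed with inconsistent casing, whitespace, and date formats.
    raw = io.StringIO(
        "cust_name,amount,order_date\n"
        "  acme corp ,1200.50,07/15/2019\n"
        "GLOBEX INC, 99 ,2019-07-16\n"
    )

    def normalize_date(value):
        """Coerce MM/DD/YYYY or ISO YYYY-MM-DD input into ISO form."""
        value = value.strip()
        if "/" in value:
            month, day, year = value.split("/")
            return f"{year}-{month:0>2}-{day:0>2}"
        return value

    cleaned = []
    for row in csv.DictReader(raw):
        cleaned.append({
            "cust_name": row["cust_name"].strip().title(),
            "amount": f"{float(row['amount']):.2f}",  # fixed 2-decimal format
            "order_date": normalize_date(row["order_date"]),
        })

    print(cleaned)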
Environment: Informatica 7.0, Informatica IDQ, DVO, Functional Testing, Cognos 8.0.3, Jira, SQL/PL-SQL, SDLC, Agile, Quality Center 9.2, DB2, Unix/Linux, MDM, Teradata, MS Azure, SQL Developer, .Net, Excel, Bugzilla, SQL, Oracle 10g, Microsoft Visio, VBScript, CRD, Sybase, UAT
