
ETL BI QA Lead Resume

San Ramon, CA

SUMMARY:

  • Fifteen years of experience, most recently as QA Lead, with strong technical skills and a data warehousing background in data analysis, development, and testing support; seeking a team-oriented environment that fosters professional development.
  • Hands-on BI experience in large data warehousing projects using ETL, schema design, RDBMS concepts, normalization, and data partitioning, slicing and dicing the data warehouse and data repository for multidimensional reporting.
  • Skilled in problem analysis, data model design, programming, and customer support.
  • Extensively worked with T-SQL: stored procedures, triggers, cursors, functions, and DTS packages. Worked closely with the technical team and DBAs on data migration by implementing SSIS packages.
  • Used Selenium for cloud functional testing of a Salesforce application and the Workbench tool for data quality and data mapping in white-box and black-box testing.
  • Defined and verified calculations which involve calculated members, named sets, and other script commands to extend the capabilities of a Microsoft SQL Server 2005 Analysis Services cube.
  • Created Drill-through targets to another report using SQL Server 2005/2008 Reporting Services, SSIS and defined data source, data source views, dimensions, measures, hierarchies, attributes, calculation, actions, perspectives, translation, and security roles.
  • Worked on the Intel RDBMS team across multiple project teams (Pricing, Cognos BI reporting, IDMS) and supported the Payment Tool and ASP.NET applications for continuous sales requirements, providing data from IDMS product data. Supported the Rebate Forecasting project when needed.
  • Skilled in writing dynamic stored procedures to update and validate table values across multiple databases and remote servers; hands-on experience writing and debugging stored procedures and triggers. Experienced in handling large tables and deriving logic from existing code and schemas.
  • Flexible in working with PL/SQL and SQL Server.
  • Ability to work independently or as part of a team to accomplish critical business objectives and to make decisions under pressure.
  • Experience in functional testing of all web applications; technical lead for developing Windows Forms applications using C# and ASP.NET.
  • Designed and verified all reports using VBA macros and pivot tables in the required formats, retrieving data from Sybase and SQL Server database queries.
  • Automated validation using Visual Studio and SQL Server to ensure data quality where huge data volumes cannot be checked in detail manually.
  • Knowledge of Tableau for data analysis, with drag-and-drop features to preview ad hoc data and reports.
  • Created dashboards in JIRA for Bundle project release tracking and analysis of future enhancements, keeping the technical and user teams on the same page.
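The dynamic table-value validation mentioned in the summary can be sketched in miniature. This is a hypothetical illustration (the table, column, and rule are invented), using Python's sqlite3 in place of SQL Server dynamic stored procedures:

```python
import sqlite3

def validate_column(conn, table, column, allowed):
    """Dynamically build and run a query that finds rows whose
    `column` value falls outside the allowed set (a validation rule)."""
    placeholders = ",".join("?" for _ in allowed)
    # Table/column names cannot be bound as parameters, so they are
    # interpolated; in production they would be checked against a whitelist.
    sql = f"SELECT rowid, {column} FROM {table} WHERE {column} NOT IN ({placeholders})"
    return conn.execute(sql, list(allowed)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (status TEXT)")
conn.executemany("INSERT INTO accounts VALUES (?)",
                 [("open",), ("closed",), ("frozen",), ("open",)])
bad = validate_column(conn, "accounts", "status", {"open", "closed"})
print(bad)  # the 'frozen' row fails the rule: [(3, 'frozen')]
```

The same pattern generalizes to validating many tables from one driver procedure, which is what makes the dynamic-SQL approach attractive.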

TECHNICAL SKILLS:

SQL Server Tools: Enterprise Manager, Query Analyzer, Import & Export (DTS), Profiler, Reporting Services, Analysis Services

GUI: ASP.NET 2.0/1.x, ASP 3.0, VBScript, JavaScript, HTML, XML, XSLT, CSS, SOAP, UDDI, WSDL, UML 2.0, VB.NET, AJAX, .NET Framework 3.0, WCF, WPF, VBA

BI Tools: PL/SQL, MS SQL Server 2005 (Integration Services, Reporting Services and Analysis Services), SQL Server MDX, ER Studio 7.0, Crystal Reports 9.0, VSS 6.0, PVCS, Cognos

ETL Tools: Microsoft SQL Server DTS, Microsoft SQL Server 2005/2008 Integration and Reporting Services (SSIS, SSRS, SSAS).

Databases: SQL Server 6.5/7.0/2000/2005/2008, MS Access 97/00/03, MySQL, DB2, Sybase, Oracle

Operating System: Windows 95/98/2000/2003/XP, Windows Server 2000/2003, Windows NT 4.0, UNIX, MS-DOS

Office Productivity: MS Project, MS Excel, MS Word, MS PowerPoint, MS FrontPage, Adobe Acrobat, Adobe Photoshop, MS Visio 2002/2003

Languages: C, C++, C#, SQL, T-SQL, UML, Visual Basic 6.x, HTML, XML, Ajax, FoxPro, Developer 2000

Mainframe: OS/S390- COBOL, CICS, JCL, VSAM

Domain Knowledge: Financial, Insurance, Health Checks, Auditing, and Banking.

PROJECT EXPERIENCE:

Confidential, San Ramon, CA

ETL BI QA Lead

Responsibilities:

  • Coordinated the team and led testing of the Marketing Data Mart to enhance online and mobile banking and improve customer interactions; analyzed the full dependency on geography and time dimensions, interacting regularly with all project team members, NFG, CBG, and RBG business owners, IT/technical owners, QA teams, and all other key stakeholders.
  • Strong in SQL Server and Teradata; created test scripts with pseudocode per the data mapping documents and, coming from a strong technical SQL background, played a major role in closing the gap between the technical and business teams.
  • Interacted with Business teams to collect data for the conceptual design of the system and worked with Development Teams and Managers to analyze business needs and developed technical specifications.
  • Worked with the BIDM team on data mapping and data structure gaps, closing them to deliver quality data with accurate mappings for all transformations from the Data Mart; validated the data flow and financial carrying values, and sliced and diced migrated data to fit the needs of source and destination data access.
  • Participate and contribute ideas in team meetings that address strategic business initiatives.
  • Followed Agile and Scrum processes; training at BOW made it easier to handle hidden scenarios and time constraints.
  • Successfully tested and verified data from all schedules and all sources (EDW, BIDM Data Mart, Cost Center dimensions, etc.) implemented via Teradata, delivering it to users as the source for their reports and dashboard generation from Teradata.
  • Created a data quality solution that runs business rules on data to reject, cleanse, and validate it, checking integrity, completeness, uniqueness, conformity, and accuracy; data stewardship is a community within the overall data governance framework.
  • Defined schemas, transformations, and source and destination criteria, embedding logic for trend-analysis reporting on recent online banking, private banking, and mobile banking customers, their households, and their user profile settings (including access and email facilities on parent accounts), a major objective for bank marketing trend and forecast analysis with time- and location-based dashboards.
  • Analyzed discounts and charges for all accounts based on criteria, successfully improving the business by maximizing user logins and verifying counts of complex banking indicators.
  • Involved the technical and testing teams in data mapping for smooth ETL transformation wherever additional data sources, fields, and updated formulas were needed to meet updated requirements. The calculations use annualized current-month values, which roll up to the party and household levels, facilitating cross-sell marketing to existing customers.
  • Used Oracle and SQL scripts to test data under heavy data-volume and functionality risks.
  • Worked in Teradata and its views, the final source business users rely on to build dashboards and forecast marketing dimensions, improving the business and decision making for future releases.
  • Improved calculation accuracy through better business logic, supporting business continuity within the marketing department and the overall goal of increasing revenue from all sectors of the bank.
  • Designed, in Excel, a data collection process driven by SQL queries to provide marketing data with complex primary/secondary RM relationships and account-, party-, and household-level dependencies in the ETL process and data mapping, resulting in a scoreboard and results in Teradata.
  • Worked in JIRA, CMC tools for Issue tracking and correlate with all Stories / Change requests for constant updates to issue related to address to End Users and Project Management.
  • Worked in HP QC for quality checks with Quality Team and UAT acceptance from Users.
  • Worked in SharePoint for all test plans, project timelines, scheduling meetings approvals and UAT and production support tickets.
  • Tested database relations; strong SQL concepts helped in accounting for all technical complexities and the impact on other projects' data of any newly implemented changes.
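The data quality solution described above (rules for integrity, completeness, uniqueness, conformity, accuracy) can be sketched as a small rule runner. This is an illustrative example only; the field names (`party_id`, `email`) and rules are invented:

```python
import re

def run_dq_rules(rows):
    """Apply simple data-quality rules and split rows into clean/rejected.
    Rules: completeness (no missing party_id), uniqueness (no duplicate
    party_id), conformity (email matches a basic pattern)."""
    email_re = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
    seen, clean, rejected = set(), [], []
    for row in rows:
        pid, email = row.get("party_id"), row.get("email", "")
        if not pid:
            rejected.append((row, "completeness"))
        elif pid in seen:
            rejected.append((row, "uniqueness"))
        elif not email_re.match(email):
            rejected.append((row, "conformity"))
        else:
            seen.add(pid)
            clean.append(row)
    return clean, rejected

rows = [{"party_id": 1, "email": "a@bank.com"},
        {"party_id": 1, "email": "b@bank.com"},    # duplicate party_id
        {"party_id": None, "email": "c@bank.com"}, # missing party_id
        {"party_id": 2, "email": "not-an-email"}]  # bad email format
clean, rejected = run_dq_rules(rows)
print(len(clean), len(rejected))  # 1 3
```

In practice such rules would run as SQL jobs against the mart, with rejected rows routed to a stewardship queue rather than a Python list.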

Confidential, San Ramon, CA

QA Lead

Responsibilities:

  • Terms and Conditions, Contracts, Job Orders, Special Rules, Contacts, and Candidates are the major entities, with data normalized per functionality.
  • Testing data flow to all reports involved in Regression and Functional Testing cycles.
  • Handling team of Salesforce data extract for data ETL process, Data load and Data mapping verifications.
  • Repeated test scenarios for SFDC UI functionality using Selenium, retesting the same flow multiple times with different data relationships for similar scenarios.
  • Handled challenges such as cleaning up source data when direct cleanup was not possible, and migrating cleaned data without interrupting current UI data.
  • Weighed risks in decision making for users and the dev team to close gaps around data complexities (memory, uncleaned formats, logic that must not be interrupted), and kept track of scenarios to deliver, report, and raise with the concerned teams for upcoming plans, risks, and deliverables.
  • Team coordination and leading QAs for testing in Agile lifecycle and on time qualitative delivery. Conducted meetings and issue tracking process within Business team, Technical team to fill gaps and coordinate with resources to help in prioritizing as per project and Business needs for successful Qualitative delivery.
  • Shared information and knowledge with the team on time (heads-up to the UI, Salesforce, and dev teams on all related and upcoming issues), and analyzed data from the DM side, which helped in learning the complete logic cycle of related issues as time permitted.
  • Performed black-box and white-box testing of all web services transformed from Java applications to Salesforce, covering multiple logic paths for email data and attachments transformed to the UI.
  • Serialized huge data volumes, squeezing them to fit Salesforce objectives.
  • Worked in the Adapt and Salesforce Workbench tools to validate and verify data using SQL skills.
  • Rates (complicated data-side logic): worked with MJP rate calculations and data, including the transformation logic for 69 columns from MJP to SFDC as part of DM tasks; as a team effort, periodically correlated the story from the DM and UI sides for rate calculations and updated stories to the BSAs to close gaps on both sides when needed.
  • Salary detail data analysis: periodically involved in data migration, mapping, and UI behavior, comparing migrated salary detail with the newly implemented data from SFDC for all PAF rows displayed, and verifying the underlying logic so US salary data was not impacted.
  • Pay history data analysis: verified gross, total, regular, overtime, double-time, margin, and percent-rate calculations and historical data for all pay records migrated from MJP, without impacting decimal calculations displayed in the SFDC UI.
  • Candidate and contact eligibility logic: handled the criticality of all conditions, acknowledged business requirements, and repeatedly worked with developers, querying and verifying candidate and contact logic for CA and US to ensure all conditions were implemented for all scenarios.
  • Performed data regression testing for all manual amount updates that overwrite MJP calculations for the required legacies, and verified the results.
  • Business knowledge expertise for BE, CA, and US data; flexible to work on any country per business requirements.
  • Creation of Data mapping documents object wise for all source and logic to implement to convert to Destination Salesforce objects
  • Comparative testing, difference in US and CA logic and similar logic and priority of choosing Environments to test functionality.
  • Ensured users received a quality deliverable along with the risks to be handled and change requests.
  • Held regular meetings, prioritized issues from QA, UAT, business, and technical teams, filled gaps, monitored and shared with the team, and left task-priority decisions to management.
  • Can slice, dice, filter, and verify data in any database repository.
  • Can verify the flow of data via either the back end or the front office, which helps in regression, functional, and performance testing of complex logic, and can adapt easily to the complex functionality in the DM team.
  • Worked in JIRA, Selenium, and CMC tools for issue tracking, correlating all stories and change requests for constant updates on issues to be addressed with end users and project management.
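The data load and mapping verifications described above reduce to a source-versus-target comparison by business key. A minimal sketch (record contents and the `id`/`rate` fields are invented for illustration):

```python
def verify_load(source_rows, target_rows, key):
    """Compare source and migrated target record sets by business key,
    reporting keys missing from the target, unexpected extras, and
    rows whose mapped values do not match."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    missing = sorted(src.keys() - tgt.keys())
    extra = sorted(tgt.keys() - src.keys())
    mismatched = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    return missing, extra, mismatched

source = [{"id": 1, "rate": 50.0}, {"id": 2, "rate": 75.5}, {"id": 3, "rate": 60.0}]
target = [{"id": 1, "rate": 50.0}, {"id": 2, "rate": 70.0}, {"id": 4, "rate": 10.0}]
missing, extra, mismatched = verify_load(source, target, "id")
print(missing, extra, mismatched)  # [3] [4] [2]
```

Against real systems the two record sets would come from SQL queries on the legacy database and a Workbench/SOQL extract of the Salesforce objects.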

Confidential, San Ramon, CA

Reporting Data Analyst / QA Lead

Responsibilities:

  • Coordinated the team and worked on successful data testing of the OBIEE Fed Edit Check dashboards and Schedule Recon dashboards for ease of bug tracking and analysis of development across all data migration efforts.
  • Followed Agile and Scrum processes; training at BOW made it easier to handle hidden scenarios and time constraints.
  • Worked on Work Streams 2, 3, 4, and 5 (First Lien, Home Equity, Corp, CRE, Other Consumer, US Auto, Supplemental).
  • Worked on the Fed Edit scoreboard and results mapping for all schedules and related rules to fit all drill-throughs.
  • Worked on UI integration as part of functional and integration testing for Fed Edit checks, addressing data quality issues and gaps between schedule population and Fed Edit check rule output results.
  • Able to fill the gap between schedule population and call report keys for reconciliation.
  • Successfully automated data from all schedules and all sources (AFS, ALS, IM, LP, IL, TSYS, etc.) into one dashboard for Federal submission.
  • Able to articulate hidden scenarios and cross-verification scenarios, reconciling with zero variance for BOW and FHB.
  • Mapped LOBs using the servicing LOB by cost center number at the loan, Segment ID, and Src Instr ID levels for all schedule populations.
  • Strong in Sql skills to have peer review of logic and ability to generate technical mapping document for successful automation for all ETLs consolidated into one single data repository
  • Worked in Automation of all reports involved in Regression and Functional Testing cycles.
  • Validated and verified all non-normalized and normalized data, running SQL jobs for all validation rules and emailing results to the Finance team.
  • Wrote cross-verification SQL scripts and used Visual Studio Premium to verify cube data accuracy.
  • Worked on SQL scripts and procedures that output to SQL/Excel reporting.
  • Involved in implementing submission and resubmission logic using an XML-format interface into Excel, and SQL-to-XML data transformations.
  • Worked with the BIDM team on data mapping and data structure gaps, closing them to deliver quality data with accurate mappings for all transformations from the DataMart; validated the data flow and financial carrying values, and sliced and diced migrated data to fit the needs of source and destination data access.
  • Reported to Business and Technical team when needed to fill gaps and involved in Data modeling with Schema structures
  • Involved in data analysis, requirement gathering, quality data structures and gaps, and data mapping from various sources, including schema-related data, to generate views as one common data repository.
  • Performed data analysis and provided solutions commonly used for development and automation testing.
  • Automated testing using Selenium and Oracle SQL scripts to ensure data quality where huge data volumes cannot be checked in detail manually.
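The zero-variance reconciliation described above can be sketched as a per-schedule comparison of totals. The schedule names and figures below are invented for illustration:

```python
def reconcile(schedule_totals, source_totals, tolerance=0.0):
    """Reconcile per-schedule dashboard totals against source-system
    totals, reporting any schedule whose variance exceeds the tolerance."""
    variances = {}
    for sched in schedule_totals.keys() | source_totals.keys():
        v = schedule_totals.get(sched, 0.0) - source_totals.get(sched, 0.0)
        if abs(v) > tolerance:
            variances[sched] = v
    return variances

schedule = {"First Lien": 1000.0, "Home Equity": 500.0, "CRE": 250.0}
source   = {"First Lien": 1000.0, "Home Equity": 499.0, "CRE": 250.0}
print(reconcile(schedule, source))  # {'Home Equity': 1.0}
```

"Zero variance" then simply means this function returns an empty dict for every schedule when the dashboard and the source systems agree.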

Confidential, San Jose, CA

BI Data Testing Lead

Responsibilities:

  • Team coordination and created Project Dashboards for ease of Bug tracking and analysis of Development for all possible Data migration efforts.
  • Worked on Salesforce data extracts for the data ETL process, data load, and data mapping verifications in a challenging environment where various data sources (SQL Server, Oracle) had to be analyzed in a huge data warehousing setting.
  • Worked in Automation of all reports involved in Regression and Functional Testing cycles.
  • Handled the Jitterbit team for all data transformations from the Data Mart to Salesforce cloud data, validating the data flow functionally and identifying data issues.
  • Worked with the Selenium tool for automation testing of scenarios integrating all application reporting with Salesforce.
  • Sliced and diced migrated data to fit the needs of source and destination data access.
  • Involved in Revenue Management, Sales and Marketing and Finance reports for all production support and User acceptance and support for any latest data flow analysis verifying in all environments.
  • Worked as a troubleshooter for complex data reports, fixing all performance issues and adding indexing and grouping per business requirements.
  • Performed data analysis and provided solutions commonly used for development and automation testing.
  • Automated validation using Visual Studio and SQL Server to ensure data quality where huge data volumes cannot be checked in detail manually.
  • Validated and verified all non-normalized and normalized data, running SQL jobs for all validation rules and emailing results to the Finance team.
  • Strong technical background for analyzing, slicing, and dicing data and presenting it in reporting, easing users' decisions on further enhancements.
  • Worked on data flowing to more than 300 reports per bundle; automation saved major manual effort across the large number of reports.
  • Single-handedly created reporting for trade services, credits, and revenues, which required in-depth knowledge of data levels, hierarchies, and normalization, rolling up to any level when needed.
  • Received appreciation for working in challenging environments where various application teams were reluctant to provide data, and successfully created a roadmap for future projects.

Environment: SSRS reporting tools, SSIS packages, Oracle PL/SQL, MS SQL Server 2000/2005/2008 R2, Windows Server 2003, Enterprise Manager, Query Analyzer, SSIS, SSRS, DTS, QC 10.0, SQL Profiler, Performance Monitor, Query Optimizer, T-SQL, SharePoint Portal Server 2007 (MOSS 2007), IIS 6.0, XML, Microsoft Visio 2003, MS Visual Studio 2008

Confidential, San Francisco, CA

Sr SQL Server Analyst / Technical Project Lead

Responsibilities:

  • Worked in a challenging environment where various sources of data had to be analyzed in a huge data warehousing environment.
  • Wrote data transformation and verification scripts, using data dumps and sampling into Excel via SSIS packages and SQL Server import utilities, and created reports as needed to uniquely identify, analyze, match, compare, and define and implement calculations.
  • Worked on Complex data report to filter the entire project Product filters and Customer filters to check for Revenue Rollups of Confidential Banking Users worldwide using Web based reporting systems.
  • Worked on Sql Scripts and Procedures to output to Sql / Excel reporting and data migration using ETL via SSIS packages.
  • Implemented XML format interface into Excel and Sql to XML data transformations
  • Reported to Business and Technical team when needed to fill gaps and involved in Data modeling with Schema structures
  • Worked on Visual Basic project to capture data and provide reports as per needs.
  • Involved in multiple servers communicating through linked servers, and regularly tracked historical data.
  • Designed MS Visio diagrams for the ER model, object-driven model, and case diagrams for clarity and tracking of development stages.
  • Experience working in Sybase, SQL Server, and Oracle databases to capture data as needed into a common data repository for global banking reports and analytics.
  • Worked on role-based security for more than 3,000 Confidential Commercial, Corporate, and International Banking users worldwide.
  • Single-handedly created reporting for trade services, credits, and revenues, which required in-depth knowledge of data levels, hierarchies, and normalization, rolling up to any level when needed.
  • Rapidly generated PDF conversions and formatted outputs from SSRS tools, impressing users with how quickly reports could be built.
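The SQL-to-XML transformations mentioned above follow a common shape: serialize query result rows into an XML document for a downstream interface. A minimal sketch using the standard library (the `deals`/`deal` tag names and row contents are invented):

```python
import xml.etree.ElementTree as ET

def rows_to_xml(rows, root_tag="deals", row_tag="deal"):
    """Serialize query result rows (dicts of column -> value) into an
    XML document a downstream interface could consume."""
    root = ET.Element(root_tag)
    for row in rows:
        elem = ET.SubElement(root, row_tag)
        for col, val in row.items():
            # One child element per column, column name as the tag.
            ET.SubElement(elem, col).text = str(val)
    return ET.tostring(root, encoding="unicode")

rows = [{"deal_id": 101, "amount": 2500}, {"deal_id": 102, "amount": 400}]
xml = rows_to_xml(rows)
print(xml)
```

In SQL Server the same result is typically produced server-side with `FOR XML`; the Python version just makes the mapping from rows to elements explicit.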

Environment: Informatica, Google Analytics, SSRS reporting tools, Oracle PL/SQL, MS SQL Server 2000/2005/2008 R2/2012, JIRA, HP QC, Tableau 8.1, Windows Server 2003, Enterprise Manager, Query Analyzer, SSIS, SSRS, DTS, QC 10.0, SQL Profiler, Performance Monitor, Query Optimizer, T-SQL, SharePoint Portal Server 2007 (MOSS 2007), IIS 6.0, XML, Microsoft Visio 2003, MS Visual Studio 2008

Confidential

Sr SQL Server Analyst

Responsibilities:

  • Involved in designing, implementing, and supporting all assigned internal projects.
  • Provided handy solutions to dependency issues in data passed through more than one project, thereby avoiding impacts while solving them.
  • Handled the complex interface between the Demand product data flow and the Pricing and Deal Management systems whenever there was a data-flow mismatch, using SSIS packages.
  • Created a new Data Quality (DQ) tool and invented a new DQ technique to create any data quality checks and email the respective users and testers for troubleshooting, resulting in better filtering and saved time.
  • Worked on a complex data report filtering the entire project's product and customer filters to check the BAB and Bill figures and compare them after a deal is fixed; faced and solved major dependency impacts by creating a new functional design engine never considered in prior projects' development.
  • Development, implementation and support of SQL objects along with .NET UI to send the Deal information to SAP by setting batch job timings thrice a day and monitoring the performance and supporting user when needed.
  • Enhancement and implementation of successful iPAR application for NAND products newly implemented for release Q3 2010.
  • Deploying and scheduling Reports with SAP interface to generate all daily, weekly, monthly and quarterly Reports including current status.
  • Scheduled meetings and filled analysis and requirement-gathering gaps before the development cycle, making testing and support more efficient and taking preventive measures beforehand against the impact of new requirement changes.
  • Designed MS Visio diagrams for the ER model, object-driven model, and case diagrams for clarity and tracking of development stages.
  • Created Data pops for all interfaces between applications when needed a flow of data to capture between Demand, Pricing, Deal, Rebate Forecast and Payment tools project data
  • Wrote complex Stored Procedures and triggers to capture updated and deleted data from OLTP systems and data warehouse design and analyzed various approaches for maintaining different dimensions and facts in the process of building a data warehousing application.
  • Developed Triggers to raise events on executing insert, delete and update stored procedures.
  • Generated reports using the Crystal Reports and Reporting Services 2005/2008, designed and developed matrix, tabular and parametric Reports in the SQL Server Reporting Services 2005/2008.
  • Developed XML views with tags to structure data to be sent to SAP and improve the data migration process.
  • Worked in the QC tool for defect prevention and tracking of release enhancements and future requirements, and in TFS for version control of SQL and .NET UI objects.
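The triggers described above, which capture updated and deleted OLTP rows for the warehouse, can be sketched in miniature. This uses SQLite trigger syntax (the resume work used SQL Server T-SQL), and the `orders` table is invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL);
CREATE TABLE orders_audit (order_id INTEGER, action TEXT, old_amount REAL);

-- Capture the pre-change row on every update and delete.
CREATE TRIGGER trg_orders_update AFTER UPDATE ON orders
BEGIN
    INSERT INTO orders_audit VALUES (OLD.id, 'UPDATE', OLD.amount);
END;

CREATE TRIGGER trg_orders_delete AFTER DELETE ON orders
BEGIN
    INSERT INTO orders_audit VALUES (OLD.id, 'DELETE', OLD.amount);
END;
""")
conn.execute("INSERT INTO orders VALUES (1, 100.0)")
conn.execute("UPDATE orders SET amount = 120.0 WHERE id = 1")
conn.execute("DELETE FROM orders WHERE id = 1")
audit = conn.execute("SELECT * FROM orders_audit").fetchall()
print(audit)  # [(1, 'UPDATE', 100.0), (1, 'DELETE', 120.0)]
```

The audit table then serves as the change feed an incremental warehouse load reads from, which is the role the T-SQL triggers played in the project.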

Environment: MS SQL Server 2000/2005/2008, Windows Server 2003, UNIX, Enterprise Manager, Query Analyzer, SSIS, SSAS, SSRS, DTS, QC 10.0, SQL Profiler, Performance Monitor, Query Optimizer, T-SQL, SharePoint Portal Server 2007 (MOSS 2007), IIS 6.0, XML, ETL, Microsoft Visio 2003, MS Visual Studio 2008

SQL Server Analyst

Confidential, Oakland

Responsibilities:

  • Involved in complete data analysis for Confidential, which has 11 branch offices, integrating all the data for ease of lawyers' calculations.
  • Worked on XML data schema structures to clearly analyze the data and existing prototypes, migrating and transforming the data lawyers need when they relocate to different branches.
  • Used SSIS packages to integrate load and transform the data for multidimensional features and involved in designing reporting using SSRS.
  • Developed stored procedures to automate reports and routine data processing tasks.
  • Experienced in extracting data from large databases (15+ million consumer records, upwards of 100 million transactions, 400 million attributes).
  • Worked on Performance issues and tuned many queries to considerably reduce the duration to retrieve records for I/O operations.
  • Worked on existing report formats using pivot tables, displaying SQL Server data and lawyers' calculations for their accounts reports.
  • Developed all reports using VBA macros and pivot tables in the required formats, retrieving data from Sybase and SQL Server database queries.
  • Addressed and resolved numerous technical issues and managed ongoing system support
  • Involved in migration activities: migrated all stored procedures and schemas of Ramco e-apps from SQL Server 6.5/7.0 to SQL Server 2000 and made them compatible with the 65, 70, and 80 compatibility levels.
  • Worked on ad hoc reports that allow business users to promote their functionality within their confidential secured network, distributing and executing business functionality using the Business Query Model.
  • Created the Business Requirements document to capture the existing system and provide a complete understanding of Confidential's current business and technical information, establishing a common business understanding within Confidential and adding value to the currently used Confidential system.
  • Coordinated with team and users to get complete picture for analysis and fixing and confirming the solutions before delivery.
  • Uniquely handled and created Requirement Specification documents to make it easy for all developers and testers to learn these business applications; developed updates for all enhancements, change requests, and new requests, including unit test plans and integration and system testing, following the waterfall model.
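The pivot-table reports described above boil down to a cross-tabulation: one cell per (row label, column label) pair, summing a value. A minimal sketch (the lawyer/branch/hours data is invented for illustration):

```python
from collections import defaultdict

def pivot(rows, row_key, col_key, value_key):
    """Cross-tabulate rows the way an Excel pivot table would:
    one cell per (row_key, col_key) pair, summing value_key."""
    table = defaultdict(lambda: defaultdict(float))
    for r in rows:
        table[r[row_key]][r[col_key]] += r[value_key]
    return {k: dict(v) for k, v in table.items()}

billing = [{"lawyer": "Ann", "branch": "SF", "hours": 5.0},
           {"lawyer": "Ann", "branch": "LA", "hours": 3.0},
           {"lawyer": "Bob", "branch": "SF", "hours": 2.0},
           {"lawyer": "Ann", "branch": "SF", "hours": 1.5}]
print(pivot(billing, "lawyer", "branch", "hours"))
# {'Ann': {'SF': 6.5, 'LA': 3.0}, 'Bob': {'SF': 2.0}}
```

In the VBA reports the same aggregation was done by Excel's PivotTable engine over query results; this just makes the grouping logic explicit.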

Confidential

SQL Developer / Support

Responsibilities:

  • Managing and tracking of all the enhancement tasks and communicating to Client for requirements and delivering fixed changes to Users.
  • Coordinated with team and users to get complete picture for analysis and fixing and confirming the solutions before delivery.
  • Involved in fixing invalid Mappings, testing of Stored Procedures and Functions, Testing of all possible ways of handling data from the application point of view.
  • Performance Tuning database Queries, Stored Procedures, View to improve performance and scalability.
  • Analyzed the Source data and designed Fact tables & Dimension tables STAR Schema. Involved in Designing data model, Development, Testing & implementation of the Data Mart.
  • Used Data Transformation Services (DTS), SQL Server's Extract-Transform-Load (ETL) tool, to populate data from various sources, creating packages for each application's data loading operations; developed DTS packages for initial and incremental loads using Data Driven and Data Pump tasks from source to stage databases.
  • ETL process - Developed manual ETL (stored procedure) for population of target schema from load schema and created and Scheduled Jobs to automate the DTS process.
  • Created and managed schema objects such as tables, views, user defined data types, clusters, indexes, procedures and triggers through Query Analyzer
  • Experience in interacting with customers and clarified the issues functionally and technically.
  • Performed backups and restoration, developed backup strategies, scheduled transfer of backups and tested their viability.
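The star-schema load described above (staging table feeding a dimension and a fact table) can be sketched end to end. This uses SQLite and invented table names purely for illustration; the project work used DTS packages and stored procedures on SQL Server:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stage_sales (product TEXT, sold_on TEXT, amount REAL);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product TEXT UNIQUE);
CREATE TABLE fact_sales (product_key INTEGER, sold_on TEXT, amount REAL);
""")
conn.executemany("INSERT INTO stage_sales VALUES (?,?,?)",
                 [("widget", "2010-01-01", 10.0),
                  ("gadget", "2010-01-01", 20.0),
                  ("widget", "2010-01-02", 15.0)])
# Dimension load: insert only products not yet present (incremental load).
conn.execute("""INSERT INTO dim_product (product)
                SELECT DISTINCT product FROM stage_sales
                WHERE product NOT IN (SELECT product FROM dim_product)""")
# Fact load: resolve each row's surrogate key by joining back to the dimension.
conn.execute("""INSERT INTO fact_sales
                SELECT d.product_key, s.sold_on, s.amount
                FROM stage_sales s JOIN dim_product d USING (product)""")
total = conn.execute("SELECT SUM(amount) FROM fact_sales").fetchone()[0]
print(total)  # 45.0
```

Scheduled as a job, the two INSERT...SELECT statements are the "manual ETL" stored procedure the bullet describes: dimension first, then facts keyed by surrogate.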

Confidential, Cleveland

SQL Developer / Support

Responsibilities:

  • Designed and implemented stored procedures for complex business logic using SQL Server for data manipulation of Groove data and XML-formatted data.
  • Extensively handled the back-end data of MS Groove Virtual Office 3.1, migrating data into MS Access/MS SQL Server and creating all related stored procedures, triggers, and cursors required to improve and test performance and to manipulate the production data needed for Confidential.
  • Worked mostly on XML data transfer of Microsoft Groove data to SQL Server for data analysis, fixing irrelevant data and providing corrective solutions for end users.
  • Extensively worked with ADO and ADO.NET to connect to SQL Server using Connection, Command, Adapter, DataSet, DataView, DataRow (current/original version), DataColumn, DataReader, DataAdapter, and other database objects.
  • Acquired knowledge in MS Groove Virtual office 3.1 implemented for data storage in Confidential application.
  • Automatically created groupings of attribute members based on the distribution of the members within the attribute hierarchy.
  • Worked in data analysis of Microsoft Groove Data through XML / XSLT formats and ADO.NET
  • Developed tools to migrate data into Data Grid using Data binding to view, edit and delete data with permission access for solving immediate production related data issues of data from Microsoft Groove Virtual office 3.1
  • Worked using WSDL to transform data by running service executables.
  • Debugged, analyzed and solved many production issues and changed the source code with confirmation of development and delivered to Auditors when in need.
  • Managed complete Tier3 team to prioritize, distinguish and organize and solve user related issues by working with Tier 4 and development team.
  • Took ownership and responsibility for the entire Confidential application's Tier 3 production support issues, and worked in the GAAIT (Global Accounting & Auditing Information Tool) application to solve language-related issues when integrated with Confidential.
  • Played a major role with the Six Sigma team in problem solving, eradicating paper and rework, and provided technical solutions, adding an emailing option that lets users drag and drop emails for related claims to store and retrieve the files in drives and databases.
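The XML-to-SQL transfer described above has the opposite shape of a `FOR XML` export: parse an XML document and load its elements into a relational table. A minimal sketch (the workspace XML structure is invented; real Groove exports differ):

```python
import sqlite3
import xml.etree.ElementTree as ET

# An illustrative XML export to be loaded into SQL for analysis.
doc = """<workspaces>
  <workspace><name>audit-2005</name><members>4</members></workspace>
  <workspace><name>claims</name><members>7</members></workspace>
</workspaces>"""

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE workspaces (name TEXT, members INTEGER)")
for ws in ET.fromstring(doc).findall("workspace"):
    # One row per <workspace> element, child elements as columns.
    conn.execute("INSERT INTO workspaces VALUES (?, ?)",
                 (ws.findtext("name"), int(ws.findtext("members"))))
rows = conn.execute("SELECT * FROM workspaces ORDER BY name").fetchall()
print(rows)  # [('audit-2005', 4), ('claims', 7)]
```

Once in a table, the "irrelevant data" fixes the bullet mentions become ordinary SQL updates rather than XML edits.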
