- IT professional with 9.5 years of experience in development, testing, and implementation of data warehousing/BI solutions.
- She has expertise in Teradata, Informatica PowerCenter, PL/SQL, GIT, Jenkins, and UNIX shell scripting.
- Experience in leading and driving DevOps teams, release management, and coordinating platform outages.
- Experience in SDLC models (Waterfall, Incremental, and Agile).
- Experience in project management, performing tasks related to planning, tracking, estimation, quality control, and risk management.
- Experience in using Informatica PowerCenter 9.6.1 (Workflow Manager, Workflow Monitor).
- Strong experience in dealing with different data sources: Oracle, Teradata, SQL Server, MySQL, DB2, flat files, XML, WSDLs, CSV files, and COBOL sources.
- Experience in Oracle 10g, SQL Server, Teradata, DB2, and VSAM.
- Good experience with setting up Control-M job scheduling for Informatica/data warehouse workflows.
- Experience with high-volume datasets.
- Experience working with tools such as Quality Center, Jira, qTest, JMeter, SoapUI, and REST services.
- Experience in business requirements gathering and documentation, and in converting requirements into technical documents such as a Research & Solution Document (RSD) or Business Requirement Document (BRD), Detail Design Document (DDD), mapping specification document, and unit test document.
- Responsible for working closely with business partners and business specialists to help gather requirements and translate them into system work.
- Experience working with teams across multiple geographical locations and leading offshore teams in a global delivery model.
- An excellent track record of successfully managing small to large projects from conception to completion, with strong leadership, managerial, analytical, and technical skills.
ETL Tools: Teradata, Informatica PowerCenter 9.6.1
Programming Languages: UNIX shell scripts, SQL, PL/SQL (stored procedures), Java, C++
Databases: Teradata, Oracle 11g/10g, MS SQL Server 2008/2005, MySQL, DB2, VSAM
BI Tools: MicroStrategy 8.1
Tools/Utilities: Toad 10.6.1, SQL Developer 1.2, Teradata SQL Assistant, GIT, Jenkins
Testing Tools: SoapUI, JMeter, REST services, HP Quality Center, Jira, qTest, TDM
Data modelling tools: MS Visio
Scheduling Tools: Control-M
- Responsibilities included development, testing, and implementation of ETLs using BTEQ and FastLoad utilities in Teradata.
- Part of the DevOps team that implemented this project and provided production support.
- She supported the CI/CD pipeline implementation for this project using GIT and Jenkins, and was involved in performance tuning and identifying bottlenecks.
- Involved in designing the job flow for batch jobs and their restart/recovery based on business function.
- Her responsibilities included setting up the test environment, and deployment and support of Teradata procedures and database components.
- She also implemented UNIX shell scripts and automated validation and verification of business function points using a Java framework.
- She was involved in enhancements, post-production support, analysis, and fixes for data issues.
- Setting up Control-M jobs and their dependencies across multiple applications.
- Estimation and resource planning.
- Researched, resolved, and responded to highly complex questions relating to services supported by the Analytics team.
- Created Understanding documentation and Release Testing documentation on the tickets handled, and documented the issues found in a central repository.
- Performed unit testing, system integration testing, and supported user acceptance testing.
- Created an automation suite of relational database SQL queries for data validation and verification.
- Cross-team integration and testing support.
- Worked on UNIX shell scripts and stored procedures to perform tasks like scheduling workflows, error handling, and email notification for load failures.
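The data-validation automation mentioned above can be sketched as a small reconciliation harness. This is a hedged Python illustration, not the actual suite: sqlite3 stands in for the real Teradata/Oracle connections, and every table and check name here is hypothetical.

```python
import sqlite3

# Hypothetical reconciliation checks: each pairs a source query with a
# target query whose single-value results are expected to match.
CHECKS = [
    ("row_count", "SELECT COUNT(*) FROM src_orders", "SELECT COUNT(*) FROM tgt_orders"),
    ("amount_sum", "SELECT SUM(amount) FROM src_orders", "SELECT SUM(amount) FROM tgt_orders"),
]

def run_checks(src_conn, tgt_conn, checks):
    """Run each source/target query pair and collect any mismatches."""
    failures = []
    for name, src_sql, tgt_sql in checks:
        src_val = src_conn.execute(src_sql).fetchone()[0]
        tgt_val = tgt_conn.execute(tgt_sql).fetchone()[0]
        if src_val != tgt_val:
            failures.append((name, src_val, tgt_val))
    return failures

# In-memory databases stand in for the real source and target systems.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
src.execute("CREATE TABLE src_orders (amount REAL)")
tgt.execute("CREATE TABLE tgt_orders (amount REAL)")
src.executemany("INSERT INTO src_orders VALUES (?)", [(10.0,), (20.0,)])
tgt.executemany("INSERT INTO tgt_orders VALUES (?)", [(10.0,), (20.0,)])
print(run_checks(src, tgt, CHECKS))  # → [] when source and target agree
```

A real suite would load the check definitions from a config file and fail the batch when the returned list is non-empty.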
- Responsibilities included development, testing, and implementation of ETLs using BTEQs in Teradata.
- Analyzed business and system requirements, including impact analysis of existing systems, and created detailed requirements in consultation with business users.
- Set up different regions (Integration, Performance, Break-fix, and Production environments) with wrapper scripts, and set up the folder structure for Teradata scripts to run in a UNIX environment.
- Created tables and views in Teradata according to the requirements.
- Performance tuning of sources, targets, mappings, and SQL queries in transformations.
- Cross-team integration and testing support.
- Worked on UNIX shell scripts and stored procedures.
- Set up scheduler jobs in Control-M to perform tasks like scheduling workflows, error handling, and email notification for load failures.
- Defect fixes and production maintenance.
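The wrapper behavior described above, error handling plus email notification when a load fails, can be sketched roughly as below. This is a minimal Python illustration under assumptions: the job name and command are hypothetical, and the mail send is left as a composed message rather than an actual smtplib call, so only the control flow is shown.

```python
import subprocess

def notify_failure(job_name, returncode, log_tail):
    """Compose the failure notification text; a real wrapper would hand
    this to smtplib or sendmail with the on-call distribution list."""
    return (f"Subject: Load FAILED: {job_name}\n\n"
            f"Return code: {returncode}\nLast output:\n{log_tail}")

def run_load(job_name, command):
    """Run a load command; on non-zero exit, build a failure notification."""
    result = subprocess.run(command, capture_output=True, text=True)
    if result.returncode != 0:
        tail = (result.stdout + result.stderr)[-500:]
        return False, notify_failure(job_name, result.returncode, tail)
    return True, None

# 'false' is a stand-in for a real load script that exits non-zero.
ok, message = run_load("daily_orders_load", ["false"])
print(ok, message is not None)  # → False True
```

In a scheduled setup, Control-M would invoke the wrapper and react to its exit status; the email step only fires on the failure path.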
- She set up Control-M jobs, developed Teradata macros, and used various Teradata analytic functions.
- She coordinated platform outages and release management during the implementation/cutover for this project.
- She was involved in enhancements, post-production support, and analysis.
- Real-time load validation of the GNIP stream to the Hadoop server.
- Replay: data missed during a disconnect in real-time streaming is recovered using Replay. The time of the disconnect is captured in HBase tables, and once the stream is re-established, the data from the disconnect window is loaded to the Hadoop server.
- A Flume job runs to bring data from the GNIP data stream to the Hadoop server. The data is received as JSON objects and loaded to the Hadoop servers as JSON objects.
- Using the JSON Diff or Pretty Diff tool, we compared the data loaded to the Hadoop server.
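The comparison step can be illustrated with a small recursive diff of the kind the JSON Diff tool performs. This is a hedged Python sketch, not the actual tool used; the sample records are hypothetical.

```python
import json

def json_diff(a, b, path=""):
    """Recursively compare two parsed JSON values and return the paths
    where they differ, together with the differing values."""
    diffs = []
    if isinstance(a, dict) and isinstance(b, dict):
        for key in sorted(set(a) | set(b)):
            if key not in a:
                diffs.append((f"{path}/{key}", "<missing>", b[key]))
            elif key not in b:
                diffs.append((f"{path}/{key}", a[key], "<missing>"))
            else:
                diffs.extend(json_diff(a[key], b[key], f"{path}/{key}"))
    elif isinstance(a, list) and isinstance(b, list):
        if len(a) != len(b):
            diffs.append((f"{path}/length", len(a), len(b)))
        for i, (x, y) in enumerate(zip(a, b)):
            diffs.extend(json_diff(x, y, f"{path}/{i}"))
    elif a != b:
        diffs.append((path, a, b))
    return diffs

# Hypothetical source record vs. the record read back from the target.
source = json.loads('{"id": 1, "tags": ["a", "b"]}')
loaded = json.loads('{"id": 1, "tags": ["a", "c"]}')
print(json_diff(source, loaded))  # → [('/tags/1', 'b', 'c')]
```

An empty diff list means the record loaded to the target matches the source stream, which is the validation condition being checked.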